Joseph Lizier -- Research


Information dynamics in complex systems

My primary research interest lies in the information dynamics of distributed computation in complex systems. The physics, or nature, of distributed computation has long been of interest in complex systems, artificial life, bioinformatics and computational neuroscience, where systems are often described in terms of memory, communication or signalling, and processing. The hypothesis I am following is that if we can describe and quantify distributed computation in these terms, with particular attention to their dynamics, then we will be better able to understand computation in nature and its sources of complexity. Such quantification would also let us answer meaningful questions about computation in complex systems, e.g. when, and how much, information is transferred between two brain regions. The approach should also provide insights into how to better design distributed computing systems.

I have produced a framework that quantitatively defines each of these distributed operations on information during computation. Formally, these are information storage, transfer and modification, collectively referred to as information dynamics. They are measured information-theoretically, and are called dynamics because they are studied on a local scale in space and time. I have introduced new measures, including the active information storage and the separable information (for modification), localised existing measures including the transfer entropy and the excess entropy, and compared these to other related measures such as causal information flow.
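As a rough, self-contained illustration of these local measures (a minimal sketch rather than the toolkit itself, and restricted to history length k = 1 for brevity), the Java code below estimates local active information storage and local transfer entropy for binary time series from simple plug-in probability estimates; averaging each local profile gives back the corresponding average measure.

// Illustrative sketch (not the published toolkit): plug-in estimates of
// local active information storage and local transfer entropy for binary
// time series, using history length k = 1 for brevity.
import java.util.Arrays;

public class InfoDynamicsSketch {

    // Local active information storage (k = 1):
    //   a(n) = log2 [ p(x_{n-1}, x_n) / (p(x_{n-1}) p(x_n)) ]
    public static double[] localActiveInfoStorage(int[] x) {
        int n = x.length;
        double[][] joint = new double[2][2];
        double[] margPrev = new double[2];
        double[] margNext = new double[2];
        for (int t = 1; t < n; t++) {
            joint[x[t - 1]][x[t]] += 1.0 / (n - 1);
            margPrev[x[t - 1]] += 1.0 / (n - 1);
            margNext[x[t]] += 1.0 / (n - 1);
        }
        double[] local = new double[n];
        for (int t = 1; t < n; t++) {
            local[t] = log2(joint[x[t - 1]][x[t]] / (margPrev[x[t - 1]] * margNext[x[t]]));
        }
        return local;
    }

    // Local transfer entropy (k = l = 1):
    //   t(n) = log2 [ p(x_n | x_{n-1}, y_{n-1}) / p(x_n | x_{n-1}) ]
    public static double[] localTransferEntropy(int[] source, int[] dest) {
        int n = dest.length;
        double[][][] pYXX = new double[2][2][2]; // p(y_{n-1}, x_{n-1}, x_n)
        double[][] pXX = new double[2][2];       // p(x_{n-1}, x_n)
        for (int t = 1; t < n; t++) {
            pYXX[source[t - 1]][dest[t - 1]][dest[t]] += 1.0 / (n - 1);
            pXX[dest[t - 1]][dest[t]] += 1.0 / (n - 1);
        }
        double[] local = new double[n];
        for (int t = 1; t < n; t++) {
            int y = source[t - 1], xp = dest[t - 1], xn = dest[t];
            double pYX = pYXX[y][xp][0] + pYXX[y][xp][1]; // p(y_{n-1}, x_{n-1})
            double pX = pXX[xp][0] + pXX[xp][1];          // p(x_{n-1})
            local[t] = log2((pYXX[y][xp][xn] / pYX) / (pXX[xp][xn] / pX));
        }
        return local;
    }

    private static double log2(double v) {
        return Math.log(v) / Math.log(2.0);
    }

    public static void main(String[] args) {
        // Toy example: dest copies source with one step of delay, so the
        // transfer entropy from source to dest should be positive on average.
        int[] source = {0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0};
        int[] dest = new int[source.length];
        for (int t = 1; t < source.length; t++) dest[t] = source[t - 1];
        System.out.println(Arrays.toString(localTransferEntropy(source, dest)));
        System.out.println(Arrays.toString(localActiveInfoStorage(dest)));
    }
}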

Importantly, the framework has provided quantitative evidence for several long-held conjectures regarding distributed computation in theoretical systems, such as the roles of emergent structures in cellular automata (e.g. gliders as the dominant agents of information transfer). I have also studied whether these computational properties are maximised at order-chaos phase transitions. Further, I have produced a Java toolkit implementing the measures, and applied them to study computation in models of gene regulatory networks, in artificial life systems, and in computational neuroscience, with promising results in each domain. This work shows that the approach is theoretically sound, and that it has strong potential to carry complex systems science into many other disciplines.
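Continuing the illustration above, the sketch below runs rule 110 (an elementary cellular automaton) and profiles the local active information storage of one cell's time series using the routine from the previous sketch. It is only indicative of the workflow: serious cellular-automata analyses use longer history lengths and profile every cell, so that domains, blinkers and gliders stand out in the space-time profiles.

// Illustrative sketch: run an elementary cellular automaton (rule 110 here)
// and profile the local active information storage of a single cell's time
// series, reusing the InfoDynamicsSketch class above.
import java.util.Random;

public class CaProfileSketch {

    // One synchronous update of an elementary CA with periodic boundaries.
    static int[] step(int[] cells, int rule) {
        int n = cells.length;
        int[] next = new int[n];
        for (int i = 0; i < n; i++) {
            int neighbourhood = (cells[(i - 1 + n) % n] << 2)
                              | (cells[i] << 1)
                              |  cells[(i + 1) % n];
            next[i] = (rule >> neighbourhood) & 1; // Wolfram rule encoding
        }
        return next;
    }

    public static void main(String[] args) {
        int width = 200, steps = 600, rule = 110;
        Random rng = new Random(42);
        int[] cells = new int[width];
        for (int i = 0; i < width; i++) cells[i] = rng.nextInt(2);

        // Record the time series of one cell while evolving the CA.
        int[] cellSeries = new int[steps];
        for (int t = 0; t < steps; t++) {
            cellSeries[t] = cells[width / 2];
            cells = step(cells, rule);
        }

        // Local storage values are high inside regular domains and drop
        // (they can go negative) when a glider passes through the cell.
        double[] localStorage = InfoDynamicsSketch.localActiveInfoStorage(cellSeries);
        for (int t = 1; t < 20; t++) {
            System.out.printf("t=%d  x=%d  local AIS=%.3f bits%n",
                              t, cellSeries[t], localStorage[t]);
        }
    }
}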

Currently I am studying how the physical structure of complex networks relates to their computational capabilities. For example, I have found that ordered networks tend to be biased towards information storage and random networks towards information transfer, while small-world networks exhibit something of a balance between the two operations. I am also concentrating on applications to complex networks in computational neuroscience, e.g. examining task-based differences in spatiotemporal patterns of information transfer in brain imaging data.
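The ordered / small-world / random spectrum referred to above can be generated, for illustration, with the standard Watts-Strogatz construction (used here as an assumed stand-in, not necessarily the exact network model in those studies): start from a ring lattice and rewire each edge with probability p, so that p = 0 gives an ordered network, small p a small-world network, and p = 1 an essentially random one.

// Illustrative sketch: Watts-Strogatz networks across the ordered /
// small-world / random spectrum, parameterised by rewiring probability p.
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class SmallWorldSketch {

    // Returns an undirected adjacency list for a Watts-Strogatz network of
    // n nodes, each initially linked to k nearest neighbours on each side.
    static List<List<Integer>> wattsStrogatz(int n, int k, double p, long seed) {
        Random rng = new Random(seed);
        List<List<Integer>> adj = new ArrayList<>();
        for (int i = 0; i < n; i++) adj.add(new ArrayList<>());
        // Ring lattice: connect each node to its k clockwise neighbours.
        for (int i = 0; i < n; i++) {
            for (int j = 1; j <= k; j++) {
                int target = (i + j) % n;
                adj.get(i).add(target);
                adj.get(target).add(i);
            }
        }
        // Rewire each original clockwise edge with probability p,
        // avoiding self-loops and duplicate links.
        for (int i = 0; i < n; i++) {
            for (int j = 1; j <= k; j++) {
                if (rng.nextDouble() >= p) continue;
                int oldTarget = (i + j) % n;
                int newTarget;
                do {
                    newTarget = rng.nextInt(n);
                } while (newTarget == i || adj.get(i).contains(newTarget));
                adj.get(i).remove(Integer.valueOf(oldTarget));
                adj.get(oldTarget).remove(Integer.valueOf(i));
                adj.get(i).add(newTarget);
                adj.get(newTarget).add(i);
            }
        }
        return adj;
    }

    public static void main(String[] args) {
        // Sweep the rewiring probability; dynamics (e.g. random Boolean
        // networks) run on each topology could then be profiled with the
        // storage and transfer measures sketched earlier.
        for (double p : new double[] {0.0, 0.1, 1.0}) {
            List<List<Integer>> net = wattsStrogatz(100, 2, p, 1L);
            System.out.println("p=" + p + ", node 0 neighbours: " + net.get(0));
        }
    }
}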

I have successfully collaborated on this work with researchers from the Max Planck Institute for Mathematics in the Sciences, CSIRO ICT Centre, The University of Sydney, Indiana University, Bernstein Center for Computational Neuroscience, Osaka University, University of Delaware, and Doshisha University.