
Integrated Information in Discrete Dynamical Systems: Motivation and Theoretical Framework

Overview of attention for article published in PLoS Computational Biology, June 2008

Mentioned by

3 news outlets
8 X users
1 Facebook page
2 Wikipedia pages
4 Google+ users

Readers on

546 Mendeley
9 CiteULike
1 Connotea
Title: Integrated Information in Discrete Dynamical Systems: Motivation and Theoretical Framework
Published in: PLoS Computational Biology, June 2008
DOI: 10.1371/journal.pcbi.1000091
Pubmed ID:
Authors: David Balduzzi, Giulio Tononi

Abstract

This paper introduces a time- and state-dependent measure of integrated information, phi, which captures the repertoire of causal states available to a system as a whole. Specifically, phi quantifies how much information is generated (uncertainty is reduced) when a system enters a particular state through causal interactions among its elements, above and beyond the information generated independently by its parts. Such mathematical characterization is motivated by the observation that integrated information captures two key phenomenological properties of consciousness: (i) there is a large repertoire of conscious experiences so that, when one particular experience occurs, it generates a large amount of information by ruling out all the others; and (ii) this information is integrated, in that each experience appears as a whole that cannot be decomposed into independent parts. This paper extends previous work on stationary systems and applies integrated information to discrete networks as a function of their dynamics and causal architecture. An analysis of basic examples indicates the following: (i) phi varies depending on the state entered by a network, being higher if active and inactive elements are balanced and lower if the network is inactive or hyperactive. (ii) phi varies for systems with identical or similar surface dynamics depending on the underlying causal architecture, being low for systems that merely copy or replay activity states. (iii) phi varies as a function of network architecture. High phi values can be obtained by architectures that conjoin functional specialization with functional integration. Strictly modular and homogeneous systems cannot generate high phi because the former lack integration, whereas the latter lack information. Feedforward and lattice architectures are capable of generating high phi but are inefficient. (iv) In Hopfield networks, phi is low for attractor states and neutral states, but increases if the networks are optimized to achieve tension between local and global interactions. These basic examples appear to match well against neurobiological evidence concerning the neural substrates of consciousness. More generally, phi appears to be a useful metric to characterize the capacity of any physical system to integrate information.
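As an illustration of the a priori versus a posteriori repertoire idea described in the abstract, the sketch below computes effective information for a hypothetical three-element deterministic Boolean network, assuming a uniform (maximum-entropy) prior over previous states. The mechanisms and function names are invented for illustration and are not taken from the paper; computing phi proper would further require searching bipartitions for the minimum-information partition, which is omitted here.

# Minimal sketch (Python), not the paper's full formalism: effective
# information ei(X -> x1) for a toy deterministic Boolean network, taken as
# the uncertainty about the previous state that is removed when the network
# is observed in state x1, relative to a uniform (maximum-entropy) prior.
from itertools import product
import math

# Hypothetical mechanisms: each element's next state is a Boolean function
# of the full current state (s0, s1, s2).
MECHANISMS = [
    lambda s: s[1] and s[2],  # element 0: AND of elements 1 and 2
    lambda s: s[0] or s[2],   # element 1: OR of elements 0 and 2
    lambda s: s[0] ^ s[1],    # element 2: XOR of elements 0 and 1
]

def update(state):
    """Apply every element's mechanism to the current state."""
    return tuple(int(f(state)) for f in MECHANISMS)

def effective_information(observed_state):
    """Bits of uncertainty about the previous state ruled out by observing
    `observed_state`, under a uniform prior over all previous states."""
    n = len(MECHANISMS)
    all_states = list(product((0, 1), repeat=n))
    # A posteriori repertoire: previous states whose update yields observed_state.
    preimage = [s for s in all_states if update(s) == observed_state]
    if not preimage:
        return None  # unreachable state: no causal repertoire to compare
    # With a uniform posterior over the preimage and a uniform prior over all
    # states, the relative entropy is log2(|all states| / |preimage|).
    return math.log2(len(all_states) / len(preimage))

if __name__ == "__main__":
    for state in product((0, 1), repeat=3):
        ei = effective_information(state)
        print(state, f"{ei:.2f} bits" if ei is not None else "unreachable")

Running the sketch shows the state dependence the abstract emphasizes: states with a unique possible cause rule out every other previous state (3 bits here), states compatible with several previous states generate less information, and unreachable states generate none.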

X Demographics

The data shown below were collected from the profiles of 8 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 546 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
United States 22 4%
Germany 8 1%
Spain 7 1%
United Kingdom 6 1%
Italy 5 <1%
Japan 5 <1%
France 4 <1%
Estonia 2 <1%
Netherlands 2 <1%
Other 15 3%
Unknown 470 86%

Demographic breakdown

Readers by professional status Count As %
Student > Ph.D. Student 131 24%
Researcher 115 21%
Student > Master 65 12%
Professor > Associate Professor 37 7%
Student > Bachelor 35 6%
Other 106 19%
Unknown 57 10%
Readers by discipline Count As %
Agricultural and Biological Sciences 98 18%
Computer Science 73 13%
Neuroscience 60 11%
Psychology 59 11%
Physics and Astronomy 40 7%
Other 147 27%
Unknown 69 13%