     


Integrated information in discrete dynamical systems: motivation and theoretical framework
Authors: David Balduzzi, Giulio Tononi
Affiliation:Department of Psychiatry, University of Wisconsin, Madison, Wisconsin, USA.
Abstract: This paper introduces a time- and state-dependent measure of integrated information, φ, which captures the repertoire of causal states available to a system as a whole. Specifically, φ quantifies how much information is generated (uncertainty is reduced) when a system enters a particular state through causal interactions among its elements, above and beyond the information generated independently by its parts. Such mathematical characterization is motivated by the observation that integrated information captures two key phenomenological properties of consciousness: (i) there is a large repertoire of conscious experiences so that, when one particular experience occurs, it generates a large amount of information by ruling out all the others; and (ii) this information is integrated, in that each experience appears as a whole that cannot be decomposed into independent parts. This paper extends previous work on stationary systems and applies integrated information to discrete networks as a function of their dynamics and causal architecture. An analysis of basic examples indicates the following: (i) φ varies depending on the state entered by a network, being higher if active and inactive elements are balanced and lower if the network is inactive or hyperactive. (ii) φ varies for systems with identical or similar surface dynamics depending on the underlying causal architecture, being low for systems that merely copy or replay activity states. (iii) φ varies as a function of network architecture. High φ values can be obtained by architectures that conjoin functional specialization with functional integration. Strictly modular and homogeneous systems cannot generate high φ because the former lack integration, whereas the latter lack information. Feedforward and lattice architectures are capable of generating high φ but are inefficient. (iv) In Hopfield networks, φ is low for attractor states and neutral states, but increases if the networks are optimized to achieve tension between local and global interactions. These basic examples appear to match well against neurobiological evidence concerning the neural substrates of consciousness. More generally, φ appears to be a useful metric to characterize the capacity of any physical system to integrate information.
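The quantity underlying φ is effective information: the reduction in uncertainty about a network's prior state once its current state is observed, starting from a maximum-entropy (uniform) perturbation of the system. The following is a minimal sketch of that step alone for a toy two-element deterministic network (an AND/COPY circuit of my own choosing, not an example from the paper); φ itself would additionally subtract the information generated across the minimum-information bipartition, which is omitted here.

```python
import itertools
import math

# Hypothetical toy network of n = 2 binary elements:
# element 0 computes AND of both current values, element 1 copies element 0.
def step(state):
    a, b = state
    return (a & b, a)

n = 2
states = list(itertools.product([0, 1], repeat=n))

def effective_information(x1):
    """ei(X0 -> X1 = x1): bits of uncertainty about the prior state X0
    removed by observing the system in state x1, assuming a uniform
    (maximum entropy) prior over all 2^n prior states."""
    # A posteriori repertoire: the uniform prior restricted to those
    # prior states that the dynamics map onto x1.
    causes = [s for s in states if step(s) == x1]
    if not causes:
        # x1 is unreachable under these dynamics; no causes to inform about.
        return 0.0
    p = 1.0 / len(causes)
    posterior_entropy = -len(causes) * p * math.log2(p)
    prior_entropy = n  # entropy of the uniform prior over 2^n states
    return prior_entropy - posterior_entropy
```

For this circuit, the state (1, 1) has a single possible cause and so generates the full 2 bits, while (0, 0) is compatible with two priors and generates only 1 bit, illustrating the abstract's point that φ-type measures are state-dependent.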
This article is indexed in PubMed and other databases.
