Characterizing Motif Dynamics of Electric Brain Activity Using Symbolic Analysis

M. Zanin and D. Papo

Entropy, 16:5655-5667 (2014).

Motifs are small recurring circuits of interactions that constitute the backbone of networked systems. Characterizing motif dynamics is therefore key to understanding the functioning of such systems. Here we propose a method to define and quantify the temporal variability and time scales of electroencephalogram (EEG) motifs of resting brain activity. Given a triplet of EEG sensors, links between them are calculated by means of linear correlation; each pattern of links (i.e., each motif) is then associated with a symbol, and its appearance frequency is analyzed by means of Shannon entropy. Our results show that each motif becomes observable at a different coupling threshold and evolves at its own time scale, with fronto-temporal sensors emerging at high thresholds and changing at fast time scales, and parietal ones emerging at low thresholds and changing at slower rates. Finally, while motif dynamics differed across individuals, for each subject it was robust across experimental conditions, indicating that it could represent an individual dynamical signature.
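
For readers who want to experiment with the idea, here is a minimal Python sketch of the symbolic pipeline described above: the three pairwise correlations within a sensor triplet are thresholded in sliding windows, each resulting link pattern is mapped to one of eight motif symbols, and the symbol frequencies are summarized with Shannon entropy. The window length, step and threshold below are illustrative choices, not the values used in the paper.

    # Minimal sketch (not the authors' exact pipeline): for one triplet of EEG
    # channels, binarize the three pairwise Pearson correlations in sliding
    # windows into a 3-bit motif symbol, then score the variability of the
    # resulting symbol sequence with Shannon entropy.
    import numpy as np
    from itertools import combinations

    def motif_entropy(x, y, z, win=128, step=32, threshold=0.5):
        """x, y, z: 1-D EEG signals from a sensor triplet (same length).
        win, step and threshold are illustrative, not values from the paper."""
        channels = [x, y, z]
        symbols = []
        for start in range(0, len(x) - win + 1, step):
            bits = 0
            for k, (i, j) in enumerate(combinations(range(3), 2)):
                r = np.corrcoef(channels[i][start:start + win],
                                channels[j][start:start + win])[0, 1]
                if abs(r) > threshold:      # link present in this window
                    bits |= 1 << k
            symbols.append(bits)            # one of 8 possible motifs
        counts = np.bincount(symbols, minlength=8).astype(float)
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log2(p))      # Shannon entropy of motif frequencies

    # Example with surrogate data:
    # rng = np.random.default_rng(0)
    # sig = rng.standard_normal((3, 4096))
    # print(motif_entropy(*sig))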

[Read more in Entropy]

Complex network theory and the brain

D. Papo, J.M. Buldú, S. Boccaletti and E.T. Bullmore

Philosophical Transactions of the Royal Society B 369:20130520 (2014).

Complex network theory is a statistical physics understanding of graph theory, itself a much older branch of pure mathematics. The statistical physics approach aims at explaining observable macroscopic behaviour of a given system as emerging in a non-trivial way from the interactions of a vast number of microscopic units or agents. Complex network theory can be thought of as a subfield of statistical physics for structurally disordered, dynamically heterogeneous systems with non-trivial topology; and as an extension of graph theory to systems with high structural heterogeneity and inherently dynamical properties, two key properties of the vast majority of real-life systems, including brains.

Can this approach be useful when studying brain anatomy and function?

[Read more in Philosophical Transactions]    [Read interview in Phil. Trans Blog]    [Listen to podcast in Nature]

Functional brain networks: great expectations, hard times, and the big leap forward

D. Papo, M. Zanin, J.A. Pineda-Pardo, S. Boccaletti, and J.M. Buldú

Philosophical Transactions of the Royal Society B 369:20130525 (2014).

Many physical and biological systems can be studied using complex network theory, a new statistical physics understanding of graph theory. The recent application of complex network theory to the study of functional brain networks generated great enthusiasm as it allows addressing hitherto non-standard issues in the field, such as efficiency of brain functioning or vulnerability to damage. However, in spite of its high degree of generality, the theory was originally designed to describe systems profoundly different from the brain. We discuss some important caveats in the wholesale application of existing tools and concepts to a field they were not originally designed to describe. At the same time, we argue that complex network theory has not yet been taken full advantage of, as many of its important aspects are yet to make their appearance in the neuroscience literature. Finally, we propose that, rather than simply borrowing from an existing theory, functional neural networks can inspire a fundamental reformulation of complex network theory, to account for its exquisitely complex functioning mode.

[Read more in Philosophical Transactions]    [Read more in ArXiv]       [Read interview in Phil. Trans Blog]    [Listen to podcast in Nature]

Parenclitic networks: uncovering new functions in biological data

M. Zanin, J. Medina Alcazar, J. Vicente Carbajosa, M. Gomez Paez, D. Papo, P. Sousa, E. Menasalvas, and S. Boccaletti

Scientific Reports, 4:5112 (2014).

We introduce a novel method to represent time-independent, scalar data sets as complex networks. We apply our method to investigate gene expression in the response to osmotic stress of Arabidopsis thaliana. In the proposed network representation, the most important genes for the plant response turn out to be the nodes with highest centrality in appropriately reconstructed networks. We also performed a targeted experiment, in which the predicted genes were artificially induced one by one, and the growth of the corresponding phenotypes was compared with that of the wild type. The joint application of the network reconstruction method and of the in vivo experiments allowed the identification of 15 previously unknown key genes, and provided models of their mutual relationships. This novel representation extends the use of graph theory to data sets hitherto considered outside the realm of its application, vastly simplifying the characterization of their underlying structure.
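
As a rough illustration of the general parenclitic idea (a sketch, not the exact reconstruction used in the paper), the Python snippet below fits a linear relationship for every pair of features on reference samples, links the pairs for which a new sample deviates strongly from that relationship, and then ranks nodes by centrality. The linear pairwise models and the z-score cut-off are simplifying assumptions.

    # A rough sketch of the parenclitic idea under simplifying assumptions
    # (linear pairwise models, an arbitrary z-score cut-off); it is not the
    # paper's exact reconstruction procedure.
    import numpy as np
    import networkx as nx

    def parenclitic_network(reference, sample, z_cut=2.0):
        """reference: (n_samples, n_features) control data;
        sample: (n_features,) one new observation.
        Returns a graph linking feature pairs whose relationship in `sample`
        deviates strongly from the linear trend fitted on `reference`."""
        n_feat = reference.shape[1]
        g = nx.Graph()
        g.add_nodes_from(range(n_feat))
        for i in range(n_feat):
            for j in range(i + 1, n_feat):
                slope, intercept = np.polyfit(reference[:, i], reference[:, j], 1)
                resid = reference[:, j] - (slope * reference[:, i] + intercept)
                sigma = resid.std() or 1.0
                dev = abs(sample[j] - (slope * sample[i] + intercept)) / sigma
                if dev > z_cut:             # unusually deviant pair -> edge
                    g.add_edge(i, j, weight=dev)
        return g

    # Rank features (e.g. genes) by centrality in the reconstructed network:
    # centrality = nx.degree_centrality(parenclitic_network(ref_data, new_sample))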

[Read more in Scientific Reports]        [Read more in ArXiv]

Functional significance of complex fluctuations in brain activity: from resting state to cognitive neuroscience

D. Papo

Frontiers in Systems Neuroscience, 8:112 (2014).

Behavioural studies have shown that human cognition is characterized by properties such as temporal scale invariance, heavy-tailed non-Gaussian distributions, and long-range correlations at long time scales, suggesting models of how (non-observable) components of cognition interact. On the other hand, results from functional neuroimaging studies show that complex scaling and intermittency may be generic spatio-temporal properties of the brain at rest. Somewhat surprisingly, though, the neural correlates of cognition have hardly ever been studied at time scales comparable to those at which cognition shows scaling properties. Here, we analyze the meaning of scaling properties and the significance of their task-related modulations for cognitive neuroscience. It is proposed that cognitive processes can be framed in terms of complex generic properties of brain activity at rest and, ultimately, of functional equations, limiting distributions, symmetries, and possibly universality classes characterizing them.
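
As a concrete illustration of the long-range correlations mentioned above, the snippet below estimates a detrended fluctuation analysis (DFA) exponent, one standard way of quantifying long-range temporal correlations in a time series. It is a generic sketch, not an analysis performed in the paper; the scales are arbitrary examples.

    # Generic DFA sketch: alpha ~ 0.5 for white noise, alpha > 0.5 indicates
    # persistent long-range correlations in the series.
    import numpy as np

    def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
        """Return the DFA scaling exponent alpha of a 1-D series x."""
        y = np.cumsum(x - np.mean(x))       # integrated profile
        flucts = []
        for s in scales:
            n_seg = len(y) // s
            rms = []
            for k in range(n_seg):
                seg = y[k * s:(k + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, 1), t)
                rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
            flucts.append(np.mean(rms))     # mean fluctuation at this scale
        alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
        return alpha

    # Example: dfa_exponent(np.random.randn(4096)) should be close to 0.5.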

[Read more in Frontiers in Systems Neuroscience]

Measuring brain temperature without a thermometer

D. Papo

Frontiers in Physiology, 5:24 (2014).

Temperature has profound effects on a wide range of parameters of neural activity at various scales [1]. At the cell level, ionic currents, membrane potential, input resistance, action potential amplitude, duration and propagation, and synaptic transmission have all been shown to be affected by temperature variations [1-5]. At mesoscopic scales of neural activity, temperature changes can steer network activity toward different functional regimes [6], affecting the duration, frequency and firing rate of activated states during slow frequency oscillations, and the ability to end these states [7]. Temperature also has a substantial effect on chemical reaction rates [8], and affects the blood oxygen saturation level by changing haemoglobin affinity for oxygen [9]. Furthermore, cooling reduces metabolic processes [10], and has been used to silence cortical areas to study their function [11].
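
The temperature dependence of reaction and firing rates referred to above is commonly summarized with the Q10 coefficient. The toy Python helper below illustrates this scaling; it is not a calculation from the paper, and the reference temperature, target temperature and Q10 value in the example are arbitrary.

    # Illustrative Q10 scaling of a biological rate with temperature
    # (assumed example, not taken from the paper).
    def q10_scaled_rate(rate_ref, temp_ref_c, temp_c, q10=2.5):
        """Scale a rate measured at temp_ref_c (deg C) to temp_c, assuming a
        temperature coefficient q10 (values of 2-3 are typical for many
        biological processes)."""
        return rate_ref * q10 ** ((temp_c - temp_ref_c) / 10.0)

    # Example: a 10 Hz rate at 37 C scales to 10 / 2.5 = 4 Hz at 27 C.
    # print(q10_scaled_rate(10.0, 37.0, 27.0))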

[Read more in Frontiers in Fractal Physiology]

Time scales in cognitive neuroscience

D. Papo

Frontiers in Physiology, 4:86 (2013).

Cognitive neuroscience boils down to describing the ways in which cognitive function results from brain activity. In turn, brain activity shows complex fluctuations, with structure at many spatio-temporal scales. Exactly how cognitive function inherits the physical dimensions of neural activity, though, is highly non-trivial, and so, generally, are the corresponding dimensions of cognitive phenomena. As for any physical phenomenon, when studying cognitive function, the first conceptual step should be that of establishing its dimensions. Here, we provide a systematic presentation of the temporal aspects of task-related brain activity, from the smallest scale of the brain imaging technique’s resolution, to the observation time of a given experiment, through the characteristic time scales of the process under study. We first review some standard assumptions on the temporal scales of cognitive function. Although generally used, these assumptions hold to a good approximation for many cognitive (viz. fast perceptual) processes, but show their limitations for others (e.g., thinking or reasoning). We define in a rigorous way the temporal quantifiers of cognition at all scales, and illustrate how they qualitatively vary as a function of the properties of the cognitive process under study. We propose that each phenomenon should be approached with its own set of theoretical, methodological and analytical tools. In particular, we show that when treating cognitive processes such as thinking or reasoning, complex properties of ongoing brain activity, which can be drastically simplified when considering fast (e.g., perceptual) processes, start playing a major role, and not only characterize the temporal properties of task-related brain activity, but also determine the conditions for proper observation of the phenomena. Finally, some implications for the design of experiments, data analyses, and the choice of recording parameters are discussed.

[Read more in Frontiers in Fractal Physiology]