How can we study reasoning in the brain?

D. Papo

Frontiers in Human Neuroscience, 10:423 (2016).

The brain did not develop a dedicated device for reasoning. This fact has dramatic consequences. While for perceptuo-motor functions neural activity is shaped by the input’s statistical properties, and processing is carried out at high speed in hardwired, spatially segregated modules, in reasoning neural activity is driven by internal dynamics, and processing times, stages, and functional brain geometry are largely unconstrained a priori. Here, it is shown that the complex properties of spontaneous activity, which can be ignored in a short-lived event-related world, become prominent at the long time scales of certain forms of reasoning. It is argued that the neural correlates of reasoning should in fact be defined in terms of non-trivial generic properties of spontaneous brain activity, and that this implies resorting to concepts, analytical tools, and ways of designing experiments that are as yet non-standard in cognitive neuroscience. The implications in terms of models of brain activity, shape of the neural correlates, methods of data analysis, observability of the phenomenon, and experimental designs are discussed.

Characterizing Motif Dynamics of Electric Brain Activity Using Symbolic Analysis

M. Zanin and D. Papo

Entropy, 16:5655-5667 (2014).

Motifs are small recurring circuits of interactions that constitute the backbone of networked systems. Characterizing motif dynamics is therefore key to understanding the functioning of such systems. Here we propose a method to define and quantify the temporal variability and time scales of electroencephalogram (EEG) motifs of resting brain activity. Given a triplet of EEG sensors, links between them are calculated by means of linear correlation; each pattern of links (i.e., each motif) is then associated with a symbol, and its appearance frequency is analyzed by means of Shannon entropy. Our results show that each motif becomes observable with different coupling thresholds and evolves at its own time scale, with fronto-temporal sensors emerging at high thresholds and changing at fast time scales, and parietal ones at low thresholds and changing at slower rates. Finally, while motif dynamics differed across individuals, for each subject they were robust across experimental conditions, indicating that they could represent an individual dynamical signature.
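
The pipeline lends itself to a compact illustration. The sketch below is a hypothetical Python rendering of the general scheme (sliding-window correlations over a sensor triplet, thresholding into link patterns, Shannon entropy of the resulting symbol sequence); the window length, step, and correlation threshold are arbitrary placeholders, not the values used in the study.

```python
import numpy as np
from itertools import combinations

def motif_symbols(x, y, z, win=200, step=50, thr=0.5):
    """Map each sliding window of a three-sensor EEG triplet to a motif symbol.

    A link between two sensors is present when the absolute Pearson
    correlation within the window exceeds `thr`; the three link states are
    packed into a 3-bit symbol (0-7). Window, step and threshold are
    illustrative placeholders.
    """
    signals = np.vstack([x, y, z])
    symbols = []
    for start in range(0, signals.shape[1] - win + 1, step):
        seg = signals[:, start:start + win]
        r = np.corrcoef(seg)                  # 3x3 correlation matrix
        links = [abs(r[i, j]) > thr for i, j in combinations(range(3), 2)]
        symbols.append(sum(int(b) << k for k, b in enumerate(links)))
    return np.array(symbols)

def motif_entropy(symbols, n_states=8):
    """Shannon entropy (in bits) of the observed motif distribution."""
    counts = np.bincount(symbols, minlength=n_states)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Surrogate data standing in for three EEG channels
rng = np.random.default_rng(0)
eeg = rng.standard_normal((3, 5000))
print(motif_entropy(motif_symbols(*eeg)))
```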

[Read more in Entropy]

Complex network theory and the brain

D. Papo, J.M. Buldú, S. Boccaletti and E.T. Bullmore

Philosophical Transactions of the Royal Society B, 369:20130520 (2014).

Complex network theory is a statistical physics understanding of graph theory, itself a much older branch of pure mathematics. The statistical physics approach aims at explaining observable macroscopic behaviour of a given system as emerging in a non-trivial way from the interactions of a vast number of microscopic units or agents. Complex network theory can be thought of as a subfield of statistical physics for structurally disordered, dynamically heterogeneous systems with non-trivial topology; and as an extension of graph theory to systems with high structural heterogeneity and inherently dynamical properties, two key properties of the vast majority of real-life systems, including brains.

Can this approach be useful when studying brain anatomy and function?
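
As a toy illustration of the macroscopic descriptors the theory works with (not taken from the paper), the snippet below computes mean degree, clustering, and characteristic path length for a standard small-world null model; in brain studies the graph would instead be reconstructed from anatomical or functional connectivity data.

```python
import networkx as nx

# Small-world graph as a stand-in for a brain network (illustrative only);
# the "connected" variant guarantees a single component so that path
# lengths are defined.
G = nx.connected_watts_strogatz_graph(n=100, k=6, p=0.1, seed=42)

mean_degree = sum(d for _, d in G.degree()) / G.number_of_nodes()
print("mean degree:", mean_degree)
print("clustering coefficient:", nx.average_clustering(G))
print("characteristic path length:", nx.average_shortest_path_length(G))
```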

[Read more in Philosophical Transactions]   [Read interview in Phil. Trans Blog]   [Listen to podcast in Nature]

Functional brain networks: great expectations, hard times, and the big leap forward

D. Papo, M. Zanin, J.A. Pineda-Pardo, S. Boccaletti, and J.M. Buldú

Philosophical Transactions of the Royal Society B, 369:20130525 (2014).

Many physical and biological systems can be studied using complex network theory, a new statistical physics understanding of graph theory. The recent application of complex network theory to the study of functional brain networks generated great enthusiasm as it allows addressing hitherto non-standard issues in the field, such as efficiency of brain functioning or vulnerability to damage. However, in spite of its high degree of generality, the theory was originally designed to describe systems profoundly different from the brain. We discuss some important caveats in the wholesale application of existing tools and concepts to a field they were not originally designed to describe. At the same time, we argue that complex network theory has not yet been taken full advantage of, as many of its important aspects are yet to make their appearance in the neuroscience literature. Finally, we propose that, rather than simply borrowing from an existing theory, functional neural networks can inspire a fundamental reformulation of complex network theory, to account for its exquisitely complex functioning mode.

[Read more in Philosophical Transactions]   [Read more in ArXiv]   [Read interview in Phil. Trans Blog]   [Listen to podcast in Nature]

Parenclitic networks: uncovering new functions in biological data

M. Zanin, J. Medina Alcazar, J. Vicente Carbajosa, M. Gomez Paez, D. Papo, P. Sousa, E. Menasalvas, and S. Boccaletti

Scientific Reports, 4:5112 (2014).

We introduce a novel method to represent time-independent, scalar data sets as complex networks. We apply our method to investigate gene expression in the response to osmotic stress of Arabidopsis thaliana. In the proposed network representation, the most important genes for the plant response turn out to be the nodes with highest centrality in appropriately reconstructed networks. We also performed a targeted experiment, in which the predicted genes were artificially induced one by one, and the growth of the corresponding phenotypes was compared to that of the wild type. The joint application of the network reconstruction method and of the in vivo experiments allowed the identification of 15 previously unknown key genes, and provided models of their mutual relationships. This novel representation extends the use of graph theory to data sets hitherto considered outside of the realm of its application, vastly simplifying the characterization of their underlying structure.
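
The exact construction is detailed in the paper; as a rough, hypothetical sketch of the parenclitic idea (link weights measuring how far a single observation deviates from pairwise relations fitted on reference data), the code below builds such a network and ranks nodes by weighted degree. The function name, the linear-fit choice, and all parameters are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
import networkx as nx
from itertools import combinations

def parenclitic_network(reference, sample):
    """Build a parenclitic-style network for one sample.

    `reference` is an (n_samples, n_features) array for a reference
    population; `sample` is a single feature vector. For every feature pair
    a linear fit is estimated on the reference data, and the link weight is
    how strongly the sample deviates from that fit, in units of the
    reference residual spread. Schematic sketch, not the published method.
    """
    n_features = reference.shape[1]
    G = nx.Graph()
    G.add_nodes_from(range(n_features))
    for i, j in combinations(range(n_features), 2):
        slope, intercept = np.polyfit(reference[:, i], reference[:, j], 1)
        residuals = reference[:, j] - (slope * reference[:, i] + intercept)
        spread = residuals.std() or 1.0
        deviation = abs(sample[j] - (slope * sample[i] + intercept)) / spread
        G.add_edge(i, j, weight=deviation)
    return G

# Toy usage: rank features (e.g. genes) by weighted degree (node strength)
rng = np.random.default_rng(1)
ref = rng.standard_normal((50, 10))
obs = rng.standard_normal(10)
net = parenclitic_network(ref, obs)
strength = dict(net.degree(weight="weight"))
print(sorted(strength, key=strength.get, reverse=True)[:3])
```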

[Read more in Scientific Reports]   [Read more in ArXiv]

Functional significance of complex fluctuations in brain activity: from resting state to cognitive neuroscience

D. Papo

Frontiers in Systems Neuroscience, 8:112 (2014).

Behavioural studies have shown that human cognition is characterized by properties such as temporal scale invariance, heavy-tailed non-Gaussian distributions, and long-range correlations at long time scales, suggesting models of how (non-observable) components of cognition interact. On the other hand, results from functional neuroimaging studies show that complex scaling and intermittency may be generic spatio-temporal properties of the brain at rest. Somewhat surprisingly, though, the neural correlates of cognition have hardly ever been studied at time scales comparable to those at which cognition shows scaling properties. Here, we analyze the meaning of scaling properties and the significance of their task-related modulations for cognitive neuroscience. It is proposed that cognitive processes can be framed in terms of complex generic properties of brain activity at rest and, ultimately, of functional equations, limiting distributions, symmetries, and possibly universality classes characterizing them.
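
As a generic illustration (not the specific analyses reviewed in the paper), one standard estimator of such long-range temporal correlations is detrended fluctuation analysis, sketched below; the scale choices are arbitrary, and a scaling exponent alpha near 0.5 indicates uncorrelated noise, while alpha > 0.5 indicates long-range positive correlations.

```python
import numpy as np

def dfa(signal, scales=None):
    """Detrended fluctuation analysis: estimate the scaling exponent alpha.

    Standard first-order DFA: integrate the mean-subtracted signal, detrend
    it linearly in non-overlapping windows of each scale, and fit the
    log-log relation between window size and RMS fluctuation.
    """
    x = np.cumsum(signal - np.mean(signal))          # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(2, np.log10(len(x) // 4), 20).astype(int))
    fluctuations = []
    for s in scales:
        n_windows = len(x) // s
        segments = x[:n_windows * s].reshape(n_windows, s)
        t = np.arange(s)
        rms = []
        for seg in segments:
            coeffs = np.polyfit(t, seg, 1)           # local linear detrending
            rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
        fluctuations.append(np.mean(rms))
    alpha = np.polyfit(np.log(scales), np.log(fluctuations), 1)[0]
    return alpha

# Example: uncorrelated noise should yield alpha close to 0.5
rng = np.random.default_rng(0)
print(dfa(rng.standard_normal(10000)))
```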

[Read more in Frontiers in Systems Neuroscience]

Measuring brain temperature without a thermometer

D. Papo

Frontiers in Physiology, 5:24 (2014).

Temperature has profound effects on a wide range of parameters of neural activity at various scales [1]. At the cell level, ionic currents, membrane potential, input resistance, action potential amplitude, duration and propagation, and synaptic transmission have all been shown to be affected by temperature variations [1-5]. At mesoscopic scales of neural activity, temperature changes can steer network activity toward different functional regimes [6], affecting the duration, frequency and firing rate of activated states during slow oscillations, and the ability to end these states [7]. Temperature also has a substantial effect on chemical reaction rates [8], and affects the blood oxygen saturation level by changing haemoglobin affinity for oxygen [9]. Furthermore, cooling reduces metabolic processes [10], and has been used to silence cortical areas to study their function [11].

[Read more in Frontiers in Fractal Physiology]