
A Novel Quantitative Metric Based on a Complete and Unique Characterization of Neural Network Activity: 4D Shannon's Entropy

Deshpande, S. S.; van Drongelen, W.

2023-09-15 neuroscience
10.1101/2023.09.15.557974 bioRxiv

The human brain comprises an intricate web of connections that generates complex neural networks capable of storing and processing information. This information depends on multiple factors, including the underlying network structure, connectivity, and interactions; thus, methods to characterize neural networks typically aim to unravel and interpret a combination of these factors. Here, we present four-dimensional (4D) Shannon's entropy, a novel quantitative metric of network activity based on the Triple Correlation Uniqueness (TCU) theorem. Triple correlation, which provides a complete and unique characterization of the network, relates three nodes separated by up to four spatiotemporal lags (two spatial and two temporal). Here, we evaluate the 4D entropy from the spatiotemporal lag probability distribution function (PDF) of the network activity's triple correlation. Given a spike raster, we compute triple correlation by iterating over time and space. Summing the contributions to the triple correlation over each of the spatial and temporal lag combinations generates a unique 4D spatiotemporal lag distribution, from which we estimate a PDF and compute Shannon's entropy. To outline our approach, we first compute 4D Shannon's entropy from feedforward motif-class patterns in a simulated spike raster. We then apply this methodology to spiking activity recorded from rat cortical cultures and compare our results to previously published results of pairwise (2D) correlated spectral entropy over time. We find that while first- and second-order metrics of activity (spike rate and cross-correlation) agree with previously published results, our 4D entropy computation (which also includes third-order interactions) reveals a greater depth of underlying network organization than the published pairwise entropy. Ultimately, because our approach is based on the TCU theorem, we propose that 4D Shannon's entropy is a more complete tool for neural network characterization.
Author Summary

Here, we present a novel entropy metric for neural network characterization, 4D Shannon's entropy, based on triple correlation, which measures interactions among up to three neurons in time and space. Per the Triple Correlation Uniqueness (TCU) theorem, our 4D entropy approach is based on a complete and unique characterization of network activity. We first outline the method for obtaining 4D Shannon's entropy using a simulated spike raster of feedforward three-neuron configurations. We then apply this metric to an open-source experimental dataset of rat cortical cultures over time to show that while first- and second-order interactions (spike rate and cross-correlation) follow trends similar to published results, the TCU-based 4D Shannon's entropy metric provides greater insight into later-stage network activity than the published pairwise entropy. As this metric is computed from a 4D distribution unique to the network, we propose that 4D entropy offers a clear advantage over currently used pairwise entropy metrics for neural network analyses. For this reason, neuroscientific and clinical applications abound; these may include analysis of distinct dynamical states, characterization of responses to medication, and identification of pathological brain networks, such as those underlying seizures.
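The pipeline the abstract outlines (triple correlation of a spike raster over two spatial and two temporal lags, normalized into a 4D lag PDF whose Shannon entropy is then taken) can be sketched as below. This is an illustrative reconstruction under stated assumptions, not the authors' code: the function names, the brute-force lag loops, the zero-padded shift, and the lag-window parameters `max_s`/`max_t` are all assumptions of this sketch.

```python
import numpy as np

def shift(x, ds, dt):
    """Shift a 2D raster by spatial lag ds (rows) and temporal lag dt
    (columns), zero-padding at the edges: out[n, t] = x[n+ds, t+dt]."""
    out = np.zeros_like(x)
    N, T = x.shape
    n0, n1 = max(0, -ds), min(N, N - ds)
    t0, t1 = max(0, -dt), min(T, T - dt)
    if n0 < n1 and t0 < t1:
        out[n0:n1, t0:t1] = x[n0 + ds:n1 + ds, t0 + dt:t1 + dt]
    return out

def entropy_4d(raster, max_s, max_t):
    """Estimate 4D Shannon entropy (bits) from the triple-correlation
    lag distribution of a binary spike raster (neurons x time bins)."""
    lags_s = range(-max_s, max_s + 1)
    lags_t = range(-max_t, max_t + 1)
    tc = np.zeros((len(lags_s), len(lags_t), len(lags_s), len(lags_t)))
    # Brute-force iteration over the two (spatial, temporal) lag pairs:
    # tc[i,j,k,l] = sum_{n,t} x[n,t] * x[n+s1,t+t1] * x[n+s2,t+t2]
    for i, s1 in enumerate(lags_s):
        for j, t1 in enumerate(lags_t):
            a = raster * shift(raster, s1, t1)
            for k, s2 in enumerate(lags_s):
                for l, t2 in enumerate(lags_t):
                    tc[i, j, k, l] = np.sum(a * shift(raster, s2, t2))
    p = tc / tc.sum()   # normalize the 4D lag histogram into a PDF
    p = p[p > 0]        # convention: 0 * log(0) contributes nothing
    return float(-np.sum(p * np.log2(p)))
```

As a sanity check of the sketch, a raster containing a single spike concentrates all triple-correlation mass at zero lag, so its 4D entropy is 0 bits; spreading mass over more lag combinations raises the entropy.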

Matching journals

The top 6 journals account for 50% of the predicted probability mass.

1. Neural Computation: 18.7% (36 papers in training set, Top 0.1%)
2. Scientific Reports: 10.1% (3102 papers in training set, Top 6%)
3. PLOS Computational Biology: 7.2% (1633 papers in training set, Top 5%)
4. Network Neuroscience: 6.3% (116 papers in training set, Top 0.1%)
5. Chaos, Solitons & Fractals: 4.9% (32 papers in training set, Top 0.4%)
6. Frontiers in Neuroscience: 4.0% (223 papers in training set, Top 1%)
50% of probability mass above
7. Neuroinformatics: 3.7% (40 papers in training set, Top 0.2%)
8. eNeuro: 3.3% (389 papers in training set, Top 3%)
9. PLOS ONE: 2.6% (4510 papers in training set, Top 45%)
10. Entropy: 2.1% (20 papers in training set, Top 0.1%)
11. Journal of Computational Neuroscience: 2.1% (23 papers in training set, Top 0.2%)
12. Journal of Neural Engineering: 1.9% (197 papers in training set, Top 1%)
13. Neural Networks: 1.9% (32 papers in training set, Top 0.3%)
14. Neuroscience: 1.8% (88 papers in training set, Top 1%)
15. Cognitive Neurodynamics: 1.8% (15 papers in training set, Top 0.1%)
16. Frontiers in Neural Circuits: 1.8% (36 papers in training set, Top 0.2%)
17. Frontiers in Computational Neuroscience: 1.7% (53 papers in training set, Top 1%)
18. Physical Review E: 1.3% (95 papers in training set, Top 0.8%)
19. Neurocomputing: 1.3% (13 papers in training set, Top 0.3%)
20. Frontiers in Cellular Neuroscience: 1.2% (79 papers in training set, Top 0.7%)
21. Journal of Neuroscience Methods: 1.0% (106 papers in training set, Top 1%)
22. Frontiers in Systems Neuroscience: 1.0% (19 papers in training set, Top 0.3%)
23. BMC Bioinformatics: 0.9% (383 papers in training set, Top 6%)
24. Communications Biology: 0.8% (886 papers in training set, Top 21%)
25. NeuroImage: 0.7% (813 papers in training set, Top 6%)
26. Biomedical Signal Processing and Control: 0.7% (18 papers in training set, Top 0.5%)
27. Frontiers in Neuroinformatics: 0.7% (38 papers in training set, Top 0.8%)
28. Chaos: An Interdisciplinary Journal of Nonlinear Science: 0.7% (16 papers in training set, Top 0.3%)
29. Proceedings of the National Academy of Sciences: 0.7% (2130 papers in training set, Top 45%)
30. iScience: 0.6% (1063 papers in training set, Top 37%)