Efficient Inference on a Network of Spiking Neurons using Deep Learning

Baldy, N.; Breyton, M.; Woodman, M. M.; Jirsa, V. K.; Hashemi, M.

bioRxiv preprint (neuroscience), 2024-01-26. DOI: 10.1101/2024.01.26.577077
Abstract

The process of making inference on networks of spiking neurons is crucial to decipher the underlying mechanisms of neural computation. Mean-field theory simplifies the interactions between neurons to produce macroscopic network behavior, facilitating the study of information processing and computation within the brain. In this study, we perform inference on a mean-field model of spiking neurons to gain insight into likely parameter values, uniqueness and degeneracies, and also to explore how well the statistical relationship between parameters is maintained by traversing across scales. We benchmark against state-of-the-art optimization and Bayesian estimation algorithms to identify their strengths and weaknesses in our analysis. We show that when confronted with dynamical noise or in the case of missing data in the presence of bistability, generating probability distributions using deep neural density estimators outperforms other algorithms, such as adaptive Monte Carlo sampling. However, this class of deep generative models may result in an overestimation of uncertainty and correlation between parameters. Nevertheless, this issue can be improved by incorporating time-delay embedding. Moreover, we show that training deep Neural ODEs on spiking neurons enables the inference of system dynamics from microscopic states. In summary, this work demonstrates the enhanced accuracy and efficiency of inference on networks of spiking neurons when deep learning is harnessed to solve inverse problems in neural computation.
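The abstract notes that overestimated uncertainty in the deep density estimators can be mitigated by time-delay embedding, i.e. augmenting each observation with lagged copies of the signal so the summary features capture temporal structure. As a rough illustration only (the function name and parameter choices below are illustrative, not taken from the paper):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding of a 1-D signal x.

    Each row of the result is [x[t], x[t+tau], ..., x[t+(dim-1)*tau]],
    turning a scalar time series into dim-dimensional state vectors.
    """
    n = len(x) - (dim - 1) * tau  # number of complete embedding vectors
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Example: embed a noiseless oscillatory trace in 3 dimensions with lag 5.
x = np.sin(np.linspace(0, 10 * np.pi, 500))
emb = delay_embed(x, dim=3, tau=5)
print(emb.shape)  # (490, 3)
```

In an inference pipeline, rows of `emb` (or summary statistics computed from them) would replace the raw scalar series as input to the density estimator; the embedding dimension and lag are tuning choices.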

Matching journals

The top 4 journals account for 50% of the predicted probability mass.

Rank  Journal                                  Papers in training set  Percentile   Probability
 1    PLOS Computational Biology               1633                    Top 0.5%     23.5%
 2    Neural Computation                       36                      Top 0.1%     15.3%
 3    Frontiers in Computational Neuroscience  53                      Top 0.3%      7.5%
 4    Journal of Computational Neuroscience    23                      Top 0.1%      5.1%
      (50% of probability mass above this line)
 5    Scientific Reports                       3102                    Top 21%       5.1%
 6    Neural Networks                          32                      Top 0.1%      3.8%
 7    Bulletin of Mathematical Biology         84                      Top 0.8%      2.2%
 8    Physical Review E                        95                      Top 0.5%      2.0%
 9    Neurocomputing                           13                      Top 0.2%      1.9%
10    Chaos, Solitons & Fractals               32                      Top 0.9%      1.8%
11    NeuroImage                               813                     Top 4%        1.8%
12    Frontiers in Neuroscience                223                     Top 4%        1.7%
13    Network Neuroscience                     116                     Top 0.7%      1.5%
14    Journal of Neural Engineering            197                     Top 1%        1.4%
15    Biological Cybernetics                   12                      Top 0.1%      1.3%
16    Entropy                                  20                      Top 0.3%      1.0%
17    PLOS ONE                                 4510                    Top 62%       1.0%
18    Frontiers in Neural Circuits             36                      Top 0.6%      0.8%
19    Cognitive Neurodynamics                  15                      Top 0.4%      0.8%
20    eNeuro                                   389                     Top 9%        0.7%
21    Journal of The Royal Society Interface   189                     Top 5%        0.7%
22    Frontiers in Neuroinformatics            38                      Top 0.8%      0.7%
23    BMC Bioinformatics                       383                     Top 7%        0.7%
24    eLife                                    5422                    Top 60%       0.7%
25    Mathematics                              11                      Top 0.5%      0.7%