
Toward One-Shot Learning in Neuroscience-Inspired Deep Spiking Neural Networks

Faghihi, F.; Molhem, H.; Moustafa, A.

2019-11-04 · neuroscience
DOI: 10.1101/829556 · bioRxiv

Conventional deep neural networks capture essential information processing stages in perception, but they often require very large numbers of training examples, whereas children can learn concepts such as hand-written digits from only a few examples. The goal of this project is to develop a deep spiking neural network that can learn from few training trials. Using known neuronal mechanisms, a spiking neural network model is developed and trained to recognize hand-written digits, with one to four training examples per digit taken from the MNIST database. The model detects and learns geometric features of the MNIST images. In this work, a novel biologically inspired back-propagation learning rule is developed and used to train the network to detect basic features of the different digits. For this purpose, randomly initialized synaptic weights between the layers are updated during training. Using a neuroscience-inspired mechanism known as synaptic pruning, synapses whose weights fall below a predefined threshold are deleted during training. This constructs information channels, highly specific to each digit, in the form of matrices of synaptic connections between two layers of the spiking neural network. These connection matrices, named information channels, are used in the test phase to assign a digit class to each test image. Similar to the human ability to learn from few training trials, the developed spiking neural network needs a very small training dataset compared to conventional deep learning methods evaluated on the MNIST dataset.
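The pruning-and-channel idea in the abstract can be sketched in a few lines. The dimensions, the threshold value, and the test-phase scoring rule below are illustrative assumptions, not values or procedures published in the paper:

```python
import numpy as np

# Minimal sketch of threshold-based synaptic pruning. All concrete values
# (layer sizes, threshold 0.2) are hypothetical placeholders.
rng = np.random.default_rng(0)
n_input, n_hidden = 64, 16
prune_threshold = 0.2

# Randomly initialized synaptic weights between two layers.
weights = rng.uniform(0.0, 1.0, size=(n_input, n_hidden))

# Synaptic pruning: delete (zero out) synapses whose weight falls below the
# predefined threshold, leaving a sparse connection matrix -- the
# "information channel" for one digit class.
channel = np.where(weights >= prune_threshold, weights, 0.0)

def channel_score(features: np.ndarray, channel: np.ndarray) -> float:
    """Illustrative test-phase score: total activation a feature vector
    drives through a pruned channel. A test image would be assigned to the
    class whose channel produces the highest score."""
    return float((features @ channel).sum())
```

In the full model one such channel would be trained per digit class, and classification reduces to an argmax over the per-channel scores.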

Matching journals

The top 6 journals account for 50% of the predicted probability mass.

Rank  Journal                                             Papers in training set  Percentile  Probability
 1    PLOS ONE                                            4510                    Top 15%     12.6%
 2    Frontiers in Computational Neuroscience             53                      Top 0.2%    10.2%
 3    Chaos, Solitons & Fractals                          32                      Top 0.1%    10.2%
 4    Neurocomputing                                      13                      Top 0.1%     8.3%
 5    Neural Networks                                     32                      Top 0.1%     6.4%
 6    Scientific Reports                                  3102                    Top 23%      4.9%
 7    Neuroinformatics                                    40                      Top 0.2%     4.0%
 8    Neural Computation                                  36                      Top 0.2%     3.6%
 9    PLOS Computational Biology                          1633                    Top 12%      2.6%
10    Frontiers in Neuroscience                           223                     Top 3%       2.1%
11    Frontiers in Neuroinformatics                       38                      Top 0.2%     2.1%
12    Cognitive Neurodynamics                             15                      Top 0.1%     2.1%
13    Frontiers in Neural Circuits                        36                      Top 0.2%     1.9%
14    Frontiers in Aging Neuroscience                     67                      Top 2%       1.2%
15    Entropy                                             20                      Top 0.2%     1.2%
16    Bioengineering                                      24                      Top 0.7%     1.2%
17    Biology                                             43                      Top 1%       1.2%
18    Neuroscience                                        88                      Top 2%       1.2%
19    Biomedical Signal Processing and Control            18                      Top 0.3%     1.2%
20    Heliyon                                             146                     Top 6%       0.8%
21    IEEE Journal of Biomedical and Health Informatics   34                      Top 2%       0.7%
22    Sensors                                             39                      Top 2%       0.7%
23    Journal of Computational Neuroscience               23                      Top 0.4%     0.7%
24    eneuro                                              389                     Top 9%       0.7%
25    Nonlinear Dynamics                                  10                      Top 0.5%     0.7%
26    Frontiers in Human Neuroscience                     67                      Top 3%       0.6%
27    Frontiers in Physiology                             93                      Top 8%       0.5%
28    Brain Sciences                                      52                      Top 3%       0.5%
29    Journal of Neuroscience Methods                     106                     Top 2%       0.5%
30    Computer Methods and Programs in Biomedicine        27                      Top 1%       0.5%