
Visualizing and sonifying neurodata (ViSoND) for enhanced observation

Blankenship, L.; Sterrett, S. C.; Martins, D. M.; Findley, T. M.; Abe, E. T. T.; Parker, P. R. L.; Niell, C.; Smear, M. C.

2026-03-24 · neuroscience
bioRxiv · DOI: 10.64898/2026.03.21.713430

Neuroscience needs observation. Observation lets us evaluate data quality, judge whether models are biologically realistic, and generate new hypotheses. However, high-dimensional behavioral and neural data are too complex to be easily displayed and inspected by eye. Computational methods can reduce the dimensionality of data and reveal statistically robust dynamical structure, but often yield results that are difficult to relate back to the underlying biology. In addition, the choice of which parameters to quantify may not capture unexpectedly relevant aspects of the data. To supplement quantification with enhanced qualitative observation, we developed Visualization and Sonification of NeuroData (ViSoND), an open-source approach for displaying multiple data streams using video and sonification. Sonification is nothing new to neuroscience: scientists have sonified their physiological preparations since Lord Adrian's earliest recordings. We extend this tradition by mapping multiple physiological data streams to musical notes using MIDI. Synchronizing MIDI to video provides an opportunity to watch an animal's movement while listening to physiological signals such as action potentials. Here we provide two demonstrations of this approach. First, we used ViSoND to interpret behavioral structure revealed by a computational model trained on the breathing rhythms of freely behaving mice. Second, ViSoND revealed patterns of neural activity in mouse visual cortex corresponding to eye blinks, events that were previously filtered out of analysis. These use cases show that ViSoND can supplement quantitative rigor with observational interpretability. Additionally, ViSoND provides an accessible way to display data that may broaden the audience for communication of neuroscientific findings.
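The core idea of mapping a physiological data stream to MIDI notes can be sketched as follows. This is an illustrative assumption, not the authors' implementation: the pitch range, the pentatonic-scale quantization, and the `signal_to_midi` function are all hypothetical choices made for the example.

```python
# Illustrative sketch (not the authors' code): map a sampled physiological
# signal to MIDI note numbers, quantized to a major-pentatonic scale so that
# several simultaneous data streams remain musically consonant.

PENTATONIC = [0, 2, 4, 7, 9]  # scale degrees as semitone offsets within an octave

def signal_to_midi(signal, low_note=48, high_note=84):
    """Scale each sample into [low_note, high_note] and snap it to the scale."""
    lo, hi = min(signal), max(signal)
    span = (hi - lo) or 1.0  # avoid division by zero for a flat signal
    notes = []
    for x in signal:
        # linear map of the sample into the target pitch range
        pitch = low_note + (x - lo) / span * (high_note - low_note)
        octave, degree = divmod(round(pitch), 12)
        # snap the semitone offset to the nearest pentatonic scale degree
        nearest = min(PENTATONic if False else PENTATONIC, key=lambda d: abs(d - degree))
        notes.append(octave * 12 + nearest)
    return notes

breath = [0.1, 0.5, 0.9, 0.4, 0.1]  # toy breathing-amplitude trace
print(signal_to_midi(breath))
```

The resulting note numbers could then be emitted as `note_on`/`note_off` events on a MIDI track synchronized to the video frame clock, one track per data stream.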

Matching journals

The top 5 journals account for 50% of the predicted probability mass.

Rank  Journal                                          Training papers  Percentile  Probability
   1  eneuro                                                       389  Top 0.2%          17.0%
   2  PLOS Computational Biology                                  1633  Top 2%            13.9%
   3  eLife                                                       5422  Top 7%             9.8%
   4  Patterns                                                      70  Top 0.1%           8.2%
   5  Scientific Reports                                          3102  Top 32%            3.8%
      -- 50% of predicted probability mass above this line --
   6  Frontiers in Neuroinformatics                                 38  Top 0.2%           3.5%
   7  Nature Communications                                       4913  Top 41%            3.5%
   8  Neuroinformatics                                              40  Top 0.3%           2.7%
   9  PLOS ONE                                                    4510  Top 45%            2.7%
  10  GigaScience                                                  172  Top 1%             2.0%
  11  iScience                                                    1063  Top 11%            2.0%
  12  Frontiers in Behavioral Neuroscience                          46  Top 0.4%           1.7%
  13  Proceedings of the National Academy of Sciences             2130  Top 31%            1.7%
  14  PLOS Biology                                                 408  Top 10%            1.6%
  15  Nature Methods                                               336  Top 5%             1.4%
  16  Bioinformatics                                              1061  Top 8%             1.3%
  17  Nature Computational Science                                  50  Top 0.9%           1.3%
  18  Neuron                                                       282  Top 7%             0.9%
  19  Communications Biology                                       886  Top 18%            0.9%
  20  BMC Bioinformatics                                           383  Top 6%             0.9%
  21  Frontiers in Neuroscience                                    223  Top 6%             0.9%
  22  Journal of Neuroscience Methods                              106  Top 1%             0.9%
  23  Frontiers in Cellular Neuroscience                            79  Top 1%             0.9%
  24  Frontiers in Neural Circuits                                  36  Top 0.7%           0.7%
  25  Cell Reports Methods                                         141  Top 6%             0.7%
  26  Nature                                                       575  Top 17%            0.6%
  27  Frontiers in Psychiatry                                       83  Top 4%             0.6%
  28  NeuroImage                                                   813  Top 7%             0.6%