
Plug-and-Play automated behavioral tracking of zebrafish larvae with DeepLabCut and SLEAP: pre-trained networks and datasets of annotated poses

Scholz, L. A.; Mancienne, T. G.; Stednitz, S. J.; Scott, E. K.; Lee, C. C. Y.

2025-06-06 | animal behavior and cognition
doi:10.1101/2025.06.04.657938 | bioRxiv

Zebrafish are an important model system in behavioral neuroscience due to their rapid development and suite of distinct, innate behaviors. Quantifying many of these larval behaviors requires detailed tracking of eye and tail kinematics, which in turn demands imaging at high spatial and temporal resolution, ideally using semi- or fully automated tracking methods for throughput efficiency. However, creating and validating accurate tracking models is time-consuming and labor-intensive, with many research groups duplicating efforts on similar images. With the goal of developing a useful community resource, we trained pose estimation models using a diverse array of video parameters and a 15-keypoint pose model. We deliver an annotated dataset of free-swimming and head-embedded behavioral videos of larval zebrafish, along with four pose estimation networks from DeepLabCut and SLEAP (two variants of each). We also evaluated model performance across varying imaging conditions to guide users in optimizing their imaging setups. This resource will allow other researchers to skip the tedious and laborious training steps for setting up behavioral analyses, guide model selection for specific research needs, and provide ground truth data for benchmarking new tracking methods.

SIGNIFICANCE STATEMENT
Larval zebrafish are an emerging model in systems neuroscience, offering unique advantages for linking brain activity to behavior. However, detailed behavioral tracking, essential for such studies, requires time- and labor-intensive annotation and model training. To eliminate this bottleneck, we provide a high-quality, annotated dataset of zebrafish behaviors for both free-swimming and head-embedded preparations, alongside four pre-trained pose estimation models using DeepLabCut and SLEAP. We benchmark model performance across diverse imaging conditions to guide optimal setup choices. This community resource will allow researchers to bypass the most time-consuming stages of data annotation and training, enabling immediate behavioral analysis. By removing this key hurdle, this work will accelerate project initiation, support reproducibility, and provide a foundation for future tracking method development.

Matching journals

The top 2 journals account for 50% of the predicted probability mass.

Rank  Journal                                           Papers in training set  Percentile  Probability
1     eNeuro                                            389                     Top 0.1%    40.0%
2     eLife                                             5422                    Top 3%      12.9%
      (50% of probability mass above this line)
3     Proceedings of the National Academy of Sciences   2130                    Top 13%     4.9%
4     Neuron                                            282                     Top 3%      4.2%
5     PLOS Biology                                      408                     Top 2%      4.0%
6     PLOS Computational Biology                        1633                    Top 9%      3.9%
7     Nature Communications                             4913                    Top 48%     1.9%
8     Nature Machine Intelligence                       61                      Top 2%      1.7%
9     Communications Biology                            886                     Top 9%      1.7%
10    PLOS ONE                                          4510                    Top 54%     1.7%
11    Scientific Reports                                3102                    Top 62%     1.5%
12    BMC Biology                                       248                     Top 1%      1.5%
13    Methods in Ecology and Evolution                  160                     Top 2%      1.2%
14    Nature Methods                                    336                     Top 5%      1.2%
15    Epilepsia                                         49                      Top 0.6%    1.0%
16    Human Brain Mapping                               295                     Top 4%      0.8%
17    The Journal of Neuroscience                       928                     Top 8%      0.8%
18    Journal of Visualized Experiments                 30                      Top 0.8%    0.7%
19    Nature Human Behaviour                            85                      Top 4%      0.7%
20    Nature Neuroscience                               216                     Top 6%      0.7%
21    Nature                                            575                     Top 16%     0.7%
22    Cell Reports Methods                              141                     Top 6%      0.7%
23    Frontiers in Neuroscience                         223                     Top 9%      0.5%
24    iScience                                          1063                    Top 40%     0.5%
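The "top 2 journals account for 50% of the predicted probability mass" claim above can be checked directly from the listed per-journal probabilities. A minimal sketch (probabilities copied from the table; the function name is illustrative, not part of any published tool):

```python
from itertools import accumulate

# Per-journal predicted probabilities (%), in rank order, from the table above.
probs = [40.0, 12.9, 4.9, 4.2, 4.0, 3.9, 1.9, 1.7, 1.7, 1.7,
         1.5, 1.5, 1.2, 1.2, 1.0, 0.8, 0.8, 0.7, 0.7, 0.7,
         0.7, 0.7, 0.5, 0.5]

def journals_to_half_mass(probabilities, threshold=50.0):
    """Return how many top-ranked journals are needed to reach the threshold."""
    for k, running_total in enumerate(accumulate(probabilities), start=1):
        if running_total >= threshold:
            return k
    return len(probabilities)

print(journals_to_half_mass(probs))  # → 2 (40.0% + 12.9% = 52.9% >= 50%)
```

Note that the first two entries alone sum to 52.9%, so the 50% cutoff falls between ranks 2 and 3, matching the marker row in the table.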