Fast animal pose estimation using deep neural networks

Pereira, Talmo D.; Aldarondo, Diego E.; Willmore, Lindsay; Kislin, Mikhail; Wang, Samuel S.-H.; Murthy, Mala; Shaevitz, Joshua W.
Issue date: 30 May 2018
Cite as:
Pereira, Talmo D., Aldarondo, Diego E., Willmore, Lindsay, Kislin, Mikhail, Wang, Samuel S.-H., Murthy, Mala, & Shaevitz, Joshua W. (2018). Fast animal pose estimation using deep neural networks [Data set]. https://doi.org/10.34770/2jce-gm62
@electronic{pereira_talmo_d_2018,
  author      = {Pereira, Talmo D. and
                Aldarondo, Diego E. and
                Willmore, Lindsay and
                Kislin, Mikhail and
                Wang, Samuel S.-H. and
                Murthy, Mala and
                Shaevitz, Joshua W.},
  title       = {{Fast animal pose estimation using deep neural networks}},
  year        = 2018,
  url         = {https://doi.org/10.34770/2jce-gm62}
}
Abstract:

Recent work quantifying postural dynamics has attempted to define the repertoire of behaviors performed by an animal. However, a major drawback of these techniques has been their reliance on dimensionality reduction of images, which destroys information about which parts of the body are used in each behavior. To address this issue, we introduce a deep learning-based method for pose estimation, LEAP (LEAP Estimates Animal Pose). LEAP automatically predicts the positions of animal body parts using a deep convolutional neural network trained with as few as 10 frames of labeled data. This framework consists of a graphical interface for interactive labeling of body parts and software for training the network and fast prediction on new data (1 hr to train, 185 Hz predictions). We validate LEAP using videos of freely behaving fruit flies (Drosophila melanogaster) and track 32 distinct points on the body to fully describe the pose of the head, body, wings, and legs with an error rate of <3% of the animal's body length. We recapitulate a number of reported findings on insect gait dynamics and show LEAP's applicability as the first step in unsupervised behavioral classification. Finally, we extend the method to more challenging imaging situations (pairs of flies moving on a mesh-like background) and to movies of freely moving mice (Mus musculus), in which we track the full conformation of the head, body, and limbs.
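As a generic illustration of the prediction step sketched in the abstract (not code taken from this record or the LEAP repository), networks of this kind commonly output one confidence map per tracked body part, and coordinates are read off at each map's peak. A minimal NumPy sketch, with all array names and shapes assumed for illustration:

import numpy as np

def peaks_from_confidence_maps(conf_maps):
    # Return one (x, y) peak per body part from a stack of confidence maps.
    # conf_maps: assumed shape (n_parts, height, width), one map per body part.
    n_parts, height, width = conf_maps.shape
    coords = np.zeros((n_parts, 2))
    for part in range(n_parts):
        # Index of the maximum response, unraveled to (row, col).
        row, col = np.unravel_index(np.argmax(conf_maps[part]), (height, width))
        coords[part] = (col, row)  # (x, y) ordering
    return coords

# Random maps standing in for network output: 32 body parts on a 192x192 frame.
print(peaks_from_confidence_maps(np.random.rand(32, 192, 192)).shape)  # (32, 2)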

Description:

This dataset contains videos of freely moving fruit flies, as well as trained networks and body position estimates for all ~21 million frames. Download the README.txt file for a detailed description of this dataset's content. See the code repository (https://github.com/talmo/leap) for usage examples of these files.
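The HDF5 files in the listing below can be opened with standard tools. Because the internal layout of each file is documented in README.txt rather than here, the short Python sketch below (using h5py, with one filename from the listing as a placeholder) only walks a file and prints the name, shape, and dtype of every dataset it contains:

import h5py

# Placeholder: one of the HDF5 files from the listing below. The internal
# dataset names are described in README.txt, so this sketch only inspects
# the file rather than assuming a particular layout.
path = "dsets_2018-05-03_cluster-sampled.k=10,n=150.h5"

def describe(name, obj):
    # Print the name, shape, and dtype of every dataset in the file.
    if isinstance(obj, h5py.Dataset):
        print(f"{name}: shape={obj.shape}, dtype={obj.dtype}")

with h5py.File(path, "r") as f:
    f.visititems(describe)

The .labels.mat file may be readable the same way if it was saved in MATLAB's v7.3 (HDF5-based) format; older .mat versions can instead be loaded with scipy.io.loadmat.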

#   Filename                                                        Filesize
1   README.txt                                                      821 Bytes
2   dsets_2018-05-03_cluster-sampled.k=10,n=150.h5                  72.1 MB
3   dsets_2018-05-03_cluster-sampled.k=10,n=150.labels.mat          299 KB
4   expts_01.tar                                                    13.5 GB
5   expts_02.tar                                                    14.3 GB
6   expts_03.tar                                                    12.6 GB
7   expts_04.tar                                                    12.2 GB
8   expts_05.tar                                                    13.3 GB
9   expts_06.tar                                                    13.4 GB
10  expts_07.tar                                                    13.6 GB
11  expts_072212_163153.h5                                          2.84 GB
12  expts_08.tar                                                    14.1 GB
13  expts_09.tar                                                    14.6 GB
14  expts_10.tar                                                    14.2 GB
15  expts_11.tar                                                    14 GB
16  expts_12.tar                                                    7.97 GB
17  models_FlyAging-DiegoCNN_v1.0_filters=64_rot=15_lrfactor=...    1.48 GB
18  models_hourglass.tar.gz                                         1.71 GB
19  models_rotate_angle_sweep.tar.gz                                1.51 GB
20  models_sample_size_sweep.tar.gz                                 2.27 GB
21  models_stacked_hourglass.tar.gz                                 3.3 GB
22  preds_FlyAging-DiegoCNN_v1.0_filters=64_rot=15_lrfactor=0...    1.47 GB