Pereira, Talmo D.; Aldarondo, Diego E.; Willmore, Lindsay; Kislin, Mikhail; Wang, Samuel S.-H.; Murthy, Mala; Shaevitz, Joshua W.
Abstract:
Recent work quantifying postural dynamics has attempted to define the repertoire of behaviors performed by an animal. However, a major drawback to these techniques has been their reliance on dimensionality reduction of images which destroys information about which parts of the body are used in each behavior. To address this issue, we introduce a deep learning-based method for pose estimation, LEAP (LEAP Estimates Animal Pose). LEAP automatically predicts the positions of animal body parts using a deep convolutional neural network with as little as 10 frames of labeled data for training. This framework consists of a graphical interface for interactive labeling of body parts and software for training the network and fast prediction on new data (1 hr to train, 185 Hz predictions). We validate LEAP using videos of freely behaving fruit flies (Drosophila melanogaster) and track 32 distinct points on the body to fully describe the pose of the head, body, wings, and legs with an error rate of <3% of the animal's body length. We recapitulate a number of reported findings on insect gait dynamics and show LEAP's applicability as the first step in unsupervised behavioral classification. Finally, we extend the method to more challenging imaging situations (pairs of flies moving on a mesh-like background) and movies from freely moving mice (Mus musculus) where we track the full conformation of the head, body, and limbs.
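The abstract reports tracking error as a percentage of body length. A minimal sketch of that metric is below; the array shapes, names, and normalization are illustrative assumptions, not LEAP's actual code.

```python
import numpy as np

def pose_error_pct(pred, truth, body_length):
    """Mean Euclidean error over body parts, as % of body length.

    pred, truth: (n_parts, 2) arrays of (x, y) coordinates in pixels.
    body_length: animal body length in the same units.
    """
    dists = np.linalg.norm(pred - truth, axis=1)  # per-part error
    return 100.0 * dists.mean() / body_length

# Toy example: two tracked points, each off by 1 px, body length 100 px.
pred = np.array([[10.0, 12.0], [30.0, 31.0]])
truth = np.array([[10.0, 13.0], [31.0, 31.0]])
print(pose_error_pct(pred, truth, body_length=100.0))  # 1.0
```

Averaging over parts (rather than reporting the worst part) is one common convention; a per-part breakdown is often reported alongside it.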
Kim, Donghoon; Tracy, Sally J; Smith, Raymond F; Gleason, Arianna E; Bolme, Cindy A; Prakapenka, Vitali B; Appel, Karen; Speziale, Sergio; Wicks, June K; Berryman, Eleanor J; Han, Sirus K; Schoelmerich, Markus O; Lee, Hae Ja; Nagler, Bob; Cunningham, Eric F; Akin, Minta C; Asimow, Paul D; Eggert, Jon H; Duffy, Thomas S
We provide all the test data and corresponding predictions for our paper, “Practical Fluorescence Reconstruction Microscopy for High-Content Imaging”. Please refer to the Methods section of the paper for experimental details. For each experimental condition, we provide the input transmitted-light images (either phase contrast or DIC), the ground-truth fluorescence images, and the output predicted fluorescence images, which should reconstruct the ground-truth fluorescence images.
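One simple way to compare a predicted fluorescence image against its ground truth is the pixelwise Pearson correlation, a common reconstruction metric. This is a generic sketch, not the metric or code from the paper itself:

```python
import numpy as np

def pearson_corr(pred, truth):
    """Pearson correlation between predicted and ground-truth images."""
    p = pred.ravel().astype(float)
    t = truth.ravel().astype(float)
    p = p - p.mean()
    t = t - t.mean()
    return float(p @ t / np.sqrt((p @ p) * (t @ t)))

# Toy example: a prediction that is an affine rescaling of the ground
# truth still has correlation 1, since Pearson is shift/scale invariant.
truth = np.arange(16, dtype=float).reshape(4, 4)
pred = 2.0 * truth + 5.0
print(round(pearson_corr(pred, truth), 6))  # 1.0
```

Because of that shift/scale invariance, Pearson correlation is often paired with an absolute-error metric when judging reconstructions.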
Sharma, A. Y.; Cole, M. D. J.; Görler, T.; Chen, Y.; Hatch, D. R.; Guttenfelder, W.; Hager, R.; Sturdevant, B. J.; Ku, S.; Mishchenko, A.; Chang, C. S.
The materials include code and example input/output files for Monte Carlo simulations of lattice chains in the grand canonical ensemble, used to determine phase behavior, critical points, and the formation of aggregates.
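The core move in a grand canonical Monte Carlo simulation is a particle insertion or deletion accepted with a Metropolis rule. The sketch below shows that rule for a non-interacting lattice gas, a deliberately simplified stand-in for the lattice-chain simulations in the dataset; the function and variable names are assumptions, not the authors' code.

```python
import math
import random

def gcmc_step(occ, mu, beta=1.0):
    """One grand canonical MC move on a non-interacting lattice gas.

    occ: list of 0/1 site occupancies; mu: chemical potential;
    beta: inverse temperature. Picks a random site and attempts an
    insertion (if empty) or deletion (if occupied), accepted with the
    Metropolis probability min(1, exp(+/- beta*mu)).
    """
    i = random.randrange(len(occ))
    if occ[i] == 0:
        # Insertion: energy change is -mu, so accept with exp(+beta*mu).
        if random.random() < min(1.0, math.exp(beta * mu)):
            occ[i] = 1
    else:
        # Deletion: energy change is +mu, so accept with exp(-beta*mu).
        if random.random() < min(1.0, math.exp(-beta * mu)):
            occ[i] = 0
    return occ

# Usage: at mu = 0 the equilibrium occupancy fraction is
# exp(beta*mu) / (1 + exp(beta*mu)) = 0.5.
random.seed(0)
occ = [0] * 100
for _ in range(10_000):
    gcmc_step(occ, mu=0.0)
```

Real chain simulations add conformational moves and interaction energies to the acceptance rule, but the detailed-balance structure is the same.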
Force-driven parallel shear flow in a spatially periodic domain is shown to be linearly unstable, with a stability threshold that depends on both the Reynolds number and the domain aspect ratio. This finding is confirmed by computer simulations, and a simple expression is derived to determine stable flow conditions. Periodic extensions of Couette and Poiseuille flows are unstable at Reynolds numbers two orders of magnitude smaller than those of their aperiodic equivalents because the periodic boundaries impose fundamentally different constraints. This instability has important implications for designing computational models of nonlinear dynamic processes with periodicity.