
FIMTrack / FTIR-based Imaging Method Track

A tracking program primarily designed for acquiring locomotion trajectories of Drosophila melanogaster larvae, although it can also analyze the behavior of other small model organisms. FIMTrack is optimized for FIM (FTIR-based Imaging Method) images and Drosophila larvae; its segmentation algorithms detect the animals only when the background is darker than the foreground. It offers different tracking strategies to work with a wide range of model organisms.
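The dark-background constraint amounts to simple intensity thresholding: pixels brighter than the background are treated as animal. A minimal sketch of this general technique in Python (not FIMTrack's actual implementation; the threshold value and toy frame are assumptions):

```python
def segment_bright_foreground(frame, threshold):
    """Label pixels brighter than `threshold` as foreground (animal).

    Works only when the background is darker than the animals,
    as in FTIR-based (FIM) imaging.
    """
    return [[1 if px > threshold else 0 for px in row] for row in frame]

# Toy 4x4 grayscale frame: dark background (~10), bright larva (~200).
frame = [
    [10, 12, 11, 10],
    [10, 200, 210, 11],
    [12, 205, 198, 10],
    [11, 10, 12, 10],
]
mask = segment_bright_foreground(frame, threshold=100)
```

Real pipelines refine such a mask with filtering and contour extraction, but the brighter-than-background assumption is what makes the simple threshold sufficient.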


idTracker

A multitracking algorithm that extracts a characteristic fingerprint from each animal in a video recording of a group. idTracker then uses these fingerprints to identify every individual throughout the video. Tracking by identification prevents propagation of errors, and the correct identities can be maintained indefinitely. idTracker distinguishes animals even when humans cannot, such as for size-matched siblings, and reidentifies animals after they temporarily disappear from view or across different videos. We tested it on fish (Danio rerio and Oryzias latipes), flies (Drosophila melanogaster), ants (Messor structor) and mice (Mus musculus).

JAABA / Janelia Automatic Animal Behavior Annotator

Enables researchers to automatically compute interpretable, quantitative statistics describing video of behaving animals. Through the JAABA system, users encode their intuition about the structure of behavior by labeling the behavior of the animal in a small set of video frames. The software uses machine learning techniques to convert these manual labels into behavior detectors that can then be used to automatically classify the behaviors of animals in large data sets with high throughput. JAABA combines an intuitive graphical user interface, a fast and powerful machine learning algorithm, and visualizations of the classifier into an interactive, usable system for creating automatic behavior detectors.
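The label-a-few-frames-then-classify-the-rest workflow can be illustrated with a deliberately simple stand-in classifier. This sketch uses 1-nearest-neighbour matching on hypothetical per-frame features; it is not JAABA's actual algorithm (JAABA uses boosting over window features), and the feature names are invented for illustration:

```python
import math

def train_and_classify(labeled, unlabeled):
    """1-nearest-neighbour stand-in for a learned behavior detector:
    each unlabeled frame takes the label of the closest manually
    labeled frame in feature space."""
    def predict(feat):
        return min(labeled, key=lambda lf: math.dist(feat, lf[0]))[1]
    return [predict(f) for f in unlabeled]

# Hypothetical per-frame features: (speed, turn_rate).
labeled = [((0.1, 0.0), "stop"), ((5.0, 0.2), "walk")]
predictions = train_and_classify(labeled, [(0.2, 0.1), (4.5, 0.3)])
```

The point is the workflow: a handful of labeled frames generalizes to every remaining frame automatically.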


MiceProfiler

Provides information concerning the position, orientation, distance and speed of each mouse’s body part. MiceProfiler can resolve interacting mice that are in close contact. Using this low-level geometrical and positional data, we categorized behavioral states and monitored their temporal evolution. With this information, we built hierarchical spatiotemporal data representations in the form of chronograms and motivation graphs, and merged behavioral data with data from the field of view of the mice. This approach can help uncover which elements of the distal and proximal space mice take into account when making behavioral choices.


Ctrax

A machine vision program for estimating the positions and orientations of many walking flies while maintaining their individual identities over long periods of time. Ctrax was designed to allow high-throughput, quantitative analysis of behavior in freely moving flies. The primary goal of this project is to provide quantitative behavior analysis tools to the neuroethology community; thus, the system is adaptable to other labs' setups. To compensate for identity and other tracking errors, we provide the FixErrors Matlab GUI, which identifies suspicious sequences of frames and allows users to correct tracking errors. We also distribute the Behavioral Microarray Matlab Toolbox for defining and detecting a broad palette of individual and social behaviors. This software takes the trajectories output by Ctrax as input and computes descriptive statistics of the behavior of each individual fly.
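Maintaining identities across frames boils down to matching each tracked animal to a detection in the next frame. A greedy nearest-neighbour sketch of that general idea (not Ctrax's actual assignment algorithm, which is considerably more robust):

```python
import math

def assign_identities(prev_positions, detections):
    """Greedy nearest-neighbour matching: each tracked fly keeps its
    identity by claiming the closest unclaimed detection in the next
    frame. Returns {identity: new_position}."""
    unused = list(range(len(detections)))
    assignment = {}
    for ident, pos in prev_positions.items():
        best = min(unused, key=lambda j: math.dist(pos, detections[j]))
        assignment[ident] = detections[best]
        unused.remove(best)
    return assignment

prev = {0: (0.0, 0.0), 1: (10.0, 10.0)}
detections = [(9.0, 11.0), (1.0, 0.0)]
tracked = assign_identities(prev, detections)
```

Greedy matching fails when animals cross paths or occlude each other, which is exactly why tools like FixErrors exist for correcting suspicious frame sequences.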


M-Track

Allows automated detection of labelled forepaws in freely behaving mice. M-Track uses color detection and back projection algorithms to locate the position of color-labelled paws in videos of multiple freely behaving mice. Through an intuitive graphical user interface, this software provides a tool for obtaining quantitative information on fine aspects of spontaneous grooming behaviors, improving the current understanding of the functional properties of brain neuronal circuits in biomedical research studies.
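Histogram back projection scores every pixel by how common its colour is in a sample of the target (here, the painted paw). A minimal sketch of the general technique on hue values (not M-Track's code; the hue numbers are made up):

```python
from collections import Counter

def back_projection(frame_hues, label_patch_hues):
    """Histogram back projection: each pixel gets the relative
    frequency of its hue in the colour-label sample patch, so
    label-coloured pixels score high and everything else scores low."""
    hist = Counter(label_patch_hues)
    total = sum(hist.values())
    return [[hist[h] / total for h in row] for row in frame_hues]

patch = [30, 30, 31, 30]          # hues sampled from the painted paw
frame = [[0, 30, 31],
         [90, 30, 0]]
scores = back_projection(frame, patch)
```

Thresholding the score map then localizes the labelled paw in each frame.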


Ethoscope

Combines three of the most revolutionary innovations of the last decades - 3D printing, small single-board computers and machine learning - into a novel paradigm for behavioural researchers. Ethoscopes present four unique features: (i) a software and hardware solution that is reproducible and easily scalable, (ii) faithful real-time profiling of behaviour using a supervised machine learning algorithm, not just real-time tracking, (iii) stimulation of flies in a feedback-loop mode, and (iv) a highly customisable and open-source design. Ethoscopes can be easily built using 3D printing technology and rely on Raspberry Pi and Arduino to provide affordable and flexible hardware.


Whisk

Permits fully automated tracking of a single row of whiskers. Whisk is a cross-platform package for fully automated tracking of single rows of whiskers in high-speed video. It consists of a set of command-line utilities and a graphical interface for semi-automated tracking; Python and Matlab interfaces are also provided. This approach uses statistics gleaned from the video itself to estimate the most likely identity of each traced object that maintains the expected order of whiskers along the face.

BEEtag / BEhavioral Ecology tag

An image-based tracking system in Matlab for tracking uniquely identifiable visual markers. The primary advantages of BEEtag are that it (i) independently identifies animals or marked points in each frame of a video, limiting error propagation, (ii) performs well in images with complex backgrounds, and (iii) is low-cost. To validate the use of this tracking system in animal behavior, we mark and track individual bumblebees and recover individual patterns of space use and activity within the nest. Each BEEtag marker can be uniquely identified in a still image or movie frame without prior knowledge of its position.
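Frame-independent identification works because each marker carries a machine-readable binary code. A toy sketch of decoding such a code matrix into an integer ID (the real BEEtag code matrix also includes error-checking bits, and this 3x3 layout is an assumption for illustration):

```python
def decode_tag(bits):
    """Read a binary code matrix row by row into an integer ID.
    Because the ID is recovered from the image alone, identification
    in one frame never depends on tracking history."""
    value = 0
    for row in bits:
        for b in row:
            value = (value << 1) | b
    return value

# Hypothetical 3x3 marker extracted from one video frame.
tag = [[1, 0, 1],
       [0, 0, 1],
       [1, 1, 0]]
tag_id = decode_tag(tag)
```

Since every frame yields an absolute ID, a misread in one frame cannot propagate to the next, which is the error-limiting property the description highlights.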

BUNS / Barnes-maze UNbiased Strategy

A support vector machine-based classifier for analyzing memory acquisition, reversal training and probe trials in the Barnes maze. The BUNS algorithm provides a standardized method of strategy classification and a cognitive scoring scale, which cannot be derived from typical Barnes maze data analysis. It uses generic features of the mouse path to enable unbiased analysis of spatial learning strategies, and presents a cognitive scoring scale based on the spatial learning strategy used by the mice. The BUNStool enables this analysis to be performed in different variants of the Barnes maze, such as during acquisition, probe trials, reversal, and memory extinction, enabling a complete picture of the spatial cognition phenotype of the mice.

BEMOVI / BEhavior and MOrphology from VIdeos

A digital video processing and analysis workflow to extract abundance, behavior and morphology of individual organisms from video sequences. BEMOVI identifies individuals present in a video, reconstructs their movement trajectories through space and time and merges this information into a single database. BEMOVI is a modular set of functions, which can be customized to allow for peculiarities of the videos to be analyzed, in terms of organism features and how they can be distinguished from the background. We tested its utility on microbes, but BEMOVI is likely to be useful for analyzing any objects moving against a relatively stationary background.


Rethomics

Analyses and visualizes high-throughput animal behavioural data. Rethomics is an R package that focuses on analysing data from the Ethoscope platform as well as TriKinetics' Drosophila Activity Monitor. It provides (i) consistent data import (data from different acquisition systems share the same internal structure), (ii) publication-quality graphics, (iii) a modern web interface to control multiple devices effectively, (iv) high-throughput, detailed post-hoc analysis, (v) a modular design (it is straightforward to modify both devices and software and create new experimental paradigms) and (vi) high scalability.


pySolo

A complete suite for analysis of sleep and locomotor activity in Drosophila melanogaster. pySolo has been developed with the specific aim of being accessible, portable, fast and easily expandable through an intuitive plug-in structure. It provides a user-friendly graphic interface and includes a powerful video recording solution and a versatile software for analysis of video as well as of traditional (infrared based) data. Support for development of additional plug-ins is provided through a community website.

MAPLE Control Software / Modular Automated Platform for Large-scale Experiments Control Software

Offers a platform dedicated to the automation of various fly handling tasks. MAPLE Control Software provides a graphical user interface with the aim of facilitating large-scale experiments and expanding automated fly-handling experimental capabilities. The robot includes: (i) three independent Z axes; (ii) a high-resolution camera; (iii) an LED ring; and (iv) two manipulators (fly and small part).


Phobos

Records rodents’ behavior during the elevated plus-maze and the open-field test. Phobos generates all basic locomotor-related behavioral results at once, immediately after a simple manual recording of the rodent’s position, along with simultaneous analysis of the experiment in 5-min periods. It spares the experimenter laborious work by providing self-explanatory features and a convenient way to record the behavior of the animal, while quickly calculating all basic locomotor-related parameters.


UMATracker

An open-source quantitative analysis toolkit for animal experiments. UMATracker provides video preprocessing through visual programming and estimates the position and/or orientation of each animal. In addition, it can analyze the output data: the distance from a designated position, region-of-interest occupancy, and the interactions between animals. With this tool, it is possible to perform preprocessing, individual tracking, result correction, and analysis of the resulting trajectories.

Nemo / Nematode Movement

Allows users to analyze nematode locomotion. Nemo is designed to track deformable objects from a video sequence in high resolution. It assists in: (1) extracting morphological features, (2) proceeding with segmentation of the animal body, and (3) retrieving information related to the position of the center of mass of each body section separately. Moreover, it permits users to select regions of interest and compute specific locomotion features related to these regions.
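Retrieving the centre of mass of each body section separately can be sketched as splitting the ordered midline coordinates into segments and averaging each. A minimal illustration of that idea (not Nemo's implementation; the sampled worm coordinates are made up):

```python
def segment_centers(body_points, n_segments):
    """Split ordered body coordinates (head to tail) into equal
    segments and return the centre of mass of each segment."""
    size = len(body_points) // n_segments
    centers = []
    for i in range(n_segments):
        seg = body_points[i * size:(i + 1) * size]
        xs = [p[0] for p in seg]
        ys = [p[1] for p in seg]
        centers.append((sum(xs) / len(seg), sum(ys) / len(seg)))
    return centers

# Hypothetical worm midline sampled at 6 points along a straight posture.
worm = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0), (5, 0)]
centers = segment_centers(worm, 3)
```

Per-segment centres are what make region-specific locomotion features (e.g. head versus tail bending) computable.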


Tracker

Records the exact location of a fly in real time. Tracker is a video-based tracking system that allows the detection of very small movements at any location within the tube. In addition to circadian locomotor activity, this tool is able to collect other information, such as distance, speed, food proximity, place preference, and multiple additional parameters that relate to sleep structure. This technique promises to provide valuable data to researchers conducting long-term behavioral studies.


MouseTRACS

Enables the tracking and management of hundreds of thousands of animals from birth to death. MouseTRACS is a tool that provides the advantages of electronic records management, experimental data access and analysis, regulatory compliance, and inventory cost control. It identifies mutants, visualizes data, and maps mutations, and also displays and integrates phenotype and genotype data using likelihood odds ratio (LOD) plots of genetic linkage between genotype and phenotype.


EthoVision

Tracks and analyzes the behavior, movement, and activity of any animal. EthoVision is a versatile image-processing system designed to automate behavioural observation and movement tracking of multiple animals simultaneously against a variety of complex backgrounds. It is used in a wide range of fields, mostly related to neuroscience, such as toxicology, safety pharmacology, psychopharmacology, drug discovery, molecular biology, genetics and behavioral neuroscience, but also in applied ethology and animal welfare studies.


ilastik

A simple, user-friendly tool for interactive image classification, segmentation and analysis. It is built as a modular software framework, which currently has workflows for automated (supervised) pixel- and object-level classification, automated and semi-automated object tracking, semi-automated segmentation and object counting without detection. Most analysis operations are performed lazily, which enables targeted interactive processing of data subvolumes, followed by complete volume analysis in offline batch mode.


K-Track

Tracks animal behavior for ethological studies. K-Track adopts linear-movement prediction or regional matching to identify individuals when they overlap. K-Track implements three main processes: (i) each bee's region is detected by simple threshold processing on grayscale images, (ii) individuals are identified by size, shape and spatiotemporal positional changes, and (iii) the centers of mass of identified individuals are connected across all movie frames to yield individual behavioral trajectories. The proposed algorithm showed better performance than Ctrax in tracking multiple bees, in terms of both robustness (fewer tracking errors and losses in movies showing complex motion patterns) and richness (number of identified behavioral states) of the behavioral classifier.
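Steps (i) and (iii) of such a pipeline can be sketched in miniature: threshold each grayscale frame, take the centre of mass of the foreground pixels, and string the per-frame centroids into a trajectory. This toy version assumes a single bee per frame and is not K-Track's multi-individual implementation:

```python
def detect_centroid(frame, threshold):
    """Step (i): threshold a grayscale frame, then return the centre
    of mass of all foreground pixels (single animal assumed)."""
    pts = [(x, y) for y, row in enumerate(frame)
                  for x, px in enumerate(row) if px > threshold]
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def link_trajectory(frames, threshold):
    """Step (iii): connect per-frame centroids into a trajectory."""
    return [detect_centroid(f, threshold) for f in frames]

# Toy 2x2 frames: one bright blob moving down the right column.
frames = [[[0, 200], [0, 0]],
          [[0, 0], [0, 200]]]
trajectory = link_trajectory(frames, threshold=100)
```

Step (ii), identifying which centroid belongs to which individual by size, shape and motion, is what separates a real multi-animal tracker from this single-animal sketch.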