One of the most powerful approaches to understanding the development of multicellular organisms is the reconstruction of cell lineages. Over the past decades, numerous methods have been devised to follow cells and their descendants in the living embryo (Stern and Fraser, 2001). Of these, the preferred ones rely on noninvasive, direct observation of cell behavior, as they interfere the least with normal development. Owing to its optical clarity, rapid development, and the variety of cell labeling techniques available, the zebrafish serves as a good vertebrate model organism for noninvasive lineage analysis.
Lineage tracing in living embryos requires specialized “tools”. On the biological side, a reliable, stable, and nontoxic marker is essential. The histone H2A–green fluorescent protein (GFP) transgenic zebrafish line generated by the Campos-Ortega laboratory (Pauls et al., 2001) represents such a tool, as it provides a sharply defined, robust marker of nucleus (and, therefore, cell) position.
On the instrument side, laser scanning confocal microscopy (LSM) has become the method of choice for live imaging (Megason and Fraser, 2003). With modern confocal microscopes, cell movement can be followed for extended time periods without deleterious effects on cells. Two-photon microscopy (Denk et al., 1990) has the added benefits of better tissue penetration and lower phototoxicity compared with single-photon LSM. For example, it has recently been used to analyze cell lineage, cell movement, and cell shape changes during gastrulation and early nervous system development in the zebrafish embryo (Ulrich et al., 2003; Langenberg and Brand, 2005).
In Figure 1, we illustrate the dataflow of confocal time-lapse image acquisition and processing. Usually, live specimens are imaged in multiple focal planes over minutes to hours at intervals of several minutes. Depending on the number of confocal channels used and the size and number of images acquired per time point, the resulting four-dimensional (4D) data set can easily accumulate to several gigabytes. In many cases, processing steps are interdependent: for example, object reconstruction by surface or volume rendering requires prior thresholding and, thus, image processing. Currently available software usually offers several of the listed functions (Megason and Fraser, 2003). Commercial packages perform surface and volume rendering, image processing, and analysis, and their 4D data analysis functions have greatly improved over the past 2 years. Of the commercial software suites, “Imaris” (Bitplane) and “Volocity” (Improvision) are, to our knowledge, the most widely used, and both offer several tracking and analysis functions. However, tracking and data visualization tools, i.e., simple, interactive, graphical representations of cell movement (tracks) and of derived features such as speed, velocity, or consistency, are not implemented in the basic packages; they are offered as (costly) add-ons designed mainly for automatic object tracking and some subsequent analysis (see the Discussion section).
To streamline cell tracking data analysis, we devised TracePilot, a Java-based open-source software that allows interactive and intuitive tracking data manipulation and visualization. Here, we demonstrate the utility of TracePilot by visualizing the dynamics of zebrafish neuroepithelial cell division and movement.
The midbrain–hindbrain boundary (mhb) region of the developing neural tube of vertebrate embryos contains a group of cells, the mhb organizer, that patterns the surrounding neuroepithelium (for review, see Raible and Brand, 2004). Controlling the position and movement of these cells is of key importance for the embryo. We used the histone H2A-GFP transgenic line to follow the movement of neuroepithelial cells in the mhb region between early somitogenesis stages and approximately 24 hr postfertilization (hpf; Fig. 2 and Langenberg and Brand, 2005). To assign a status to the tracked nuclei at the end of the time lapse, we stained the embryos for the expression of a midbrain marker gene and matched nuclei positions in the last live image stack with the antibody staining (Fig. 2A–C). Nuclei were then manually tracked backward through the time lapse, and their positions recorded at intervals of approximately 1 hr (Fig. 2D,E). In addition, nuclei positions at the start and end of the time lapse were recorded as distances, in cell rows, from the Otx(+)/Otx(−) interface (Fig. 2F).
Through this “continuous fate mapping,” we were able to show that Otx(+) and Otx(−) cells derive from separate, lineage-restricted populations in early somitogenesis. This finding can be visualized using plots exported from three-dimensionally rendered scenes in Matlab (Fig. 2G–J). This type of data plotting is sufficient to demonstrate that the two cell populations are separated from each other; however, it is very difficult to visualize the dynamics of cell movement with this type of static plot.
To facilitate the dynamic analysis and visualization of tracking data, we devised a software program, TracePilot, that takes tracking coordinates in two or three dimensions over time as input data. The tracked objects (i.e., cells) can be sorted into user-defined groups, either in the input data sheet or within the program. At the coordinates defined by the input data, the program places objects in a 3D, fully zoomable and rotatable scene, whereby the appearance (shape and color) of objects in the scene is determined by the group they belong to (Fig. 3). The lineage, which is stored as an inherent feature independent of grouping, can be highlighted for individual cell families or groups of cells. With a time slider, the movement of cells can be animated between selected time points. TracePilot's color-code view allows visualization of dynamic features of cell movement. In this first release, the color-code view has been programmed to visualize cell movement consistency, whereby consistency is the ratio between the distance covered by a moving cell and the shortest distance between the start and end point of the movement. One of TracePilot's unique features is its ability to display derived features such as consistency and the sorting of cells into groups at the same time in different display windows, if needed from different viewing angles (e.g., lateral and dorsal, Fig. 3). Finally, TracePilot allows movie export in the QuickTime format (Supplementary Movie, which can be viewed at http://www.interscience.wiley.com/jpages/1058-8388/suppmat), which should be very useful for data presentation and publication. TracePilot is Java-based, open-source software and is available, together with detailed documentation, at http://www.mpi-cbg.de/tracepilot. Our aim is to include more analysis tools in future releases, and we encourage users to suggest other types of tools to be implemented.
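The consistency measure described above can be stated compactly in code. The following Java sketch computes it exactly as defined in the text, i.e., the total path length of a track divided by the straight-line distance between its start and end points; the class and method names are illustrative and do not reflect TracePilot's actual API.

```java
public class Consistency {

    // Euclidean distance between two 3D points given as [x, y, z]
    static double dist(double[] a, double[] b) {
        double dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
        return Math.sqrt(dx * dx + dy * dy + dz * dz);
    }

    // track: array of [x, y, z] positions ordered by time point.
    // Returns path length divided by net displacement; 1.0 for a
    // perfectly straight movement, larger values for meandering tracks.
    public static double consistency(double[][] track) {
        double pathLength = 0.0;
        for (int i = 1; i < track.length; i++) {
            pathLength += dist(track[i - 1], track[i]);
        }
        double netDisplacement = dist(track[0], track[track.length - 1]);
        return pathLength / netDisplacement;
    }
}
```

For example, a cell moving one unit along x and then one unit along y covers a path of length 2 against a net displacement of √2, giving a consistency of √2.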
For successful lineage tracing, it is of key importance to be able to follow cells through divisions. Figure 4 illustrates that we were able to record nuclei positions for all three types of cell division in the developing zebrafish brain (pseudocolored nuclei) (Concha and Adams, 1998; Geldmacher-Voss et al., 2003): bilateral, anterior–posterior (a–p), and dorsal–ventral (d–v). Importantly, by acquiring multiple z-sections at close time intervals (3–4 min), we were able to track divisions even when several cells divided next to each other (Fig. 4, first column) and when cells divided in d–v orientation (Fig. 4, columns 3–5) at the late keel to neural plate stage. This latter type of division, with a strong d–v component, can only be followed with high temporal and spatial resolution time-lapse imaging. Figure 4 demonstrates how this type of division can be visualized within TracePilot: the movement of daughter cells deriving from an oblique division (yellow) can be compared with that of cells dividing bilaterally (red and green) in both a dorsal (column 4) and a pseudoaxial view (column 5). The latter view would normally require object rendering, which is not always possible, as it demands very high axial resolution that can usually be achieved only at the cost of temporal resolution. Figure 4 demonstrates that a simplified view of the actual imaging data in TracePilot can give a better impression of cell movement dynamics than sophisticated three-dimensional reconstruction, with the added benefit of showing color-coded cell grouping.
Figure 5 (and Supplementary Movie 1) further illustrates how the TracePilot software can be used to analyze the movement of tracked cells. Tracking data from one of our time-lapse movies (Langenberg and Brand, 2005) were imported into the program, and cells were sorted into four different groups at the beginning of the time lapse: posterior midbrain cells are red; immediately neighboring, anterior hindbrain cells are blue; and cells farther away from the interface are color-coded yellow and green, respectively. The positions of cells and their descendants are shown at intervals of approximately 25 time points (4 min each). Notice that cells of the red and blue groups mix with their other neighbors but not with each other, showing clearly that midbrain and hindbrain cells are kept separate, while cell mixing in general is not restricted. The Supplementary Movie shows how TracePilot creates an animation by interpolating cell positions between time points with known coordinates.
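The interpolation underlying such an animation is, conceptually, straight linear interpolation between recorded positions. A minimal Java sketch of the idea follows; the names are illustrative and TracePilot's internals may differ.

```java
public class TrackInterpolator {

    // Linearly interpolate a cell's position at time t between two recorded
    // time points (t0, p0) and (t1, p1), where positions are [x, y, z] arrays.
    public static double[] interpolate(double t0, double[] p0,
                                       double t1, double[] p1, double t) {
        double f = (t - t0) / (t1 - t0); // fraction of the interval elapsed
        double[] p = new double[p0.length];
        for (int i = 0; i < p0.length; i++) {
            p[i] = p0[i] + f * (p1[i] - p0[i]);
        }
        return p;
    }
}
```

Rendering a few such intermediate positions per recorded interval is enough to turn discrete tracking coordinates into smooth apparent motion.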
In the course of imaging and subsequent cell tracking, very large amounts of data are generated: even when using only one microscope channel at 8-bit depth and a resolution of 512 × 512 pixels, a several-hour time lapse can easily amount to gigabytes of image data. The analysis and interpretation of such amounts of data is only possible with the right software tools. However, although handling gigabytes of imaging information has become a problem for scientists, there is, to our knowledge, no standardized, widely used software available that is capable of managing, processing, analyzing, and visualizing imaging data at this scale. To date, progress in image acquisition hardware has outpaced the development of appropriate software. Several academic initiatives have been launched that promise to improve this situation (for an overview, see Megason and Fraser, 2003). Of these, the Open Microscopy Environment (OME; Swedlow et al., 2003) is of particular interest, as it proposes standards for storing images and their associated metadata in a database format (Fig. 1). Owing to its open architecture, OME allows analysis and visualization software to be integrated into its environment.
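A back-of-envelope calculation makes these data volumes concrete. The slice and time point counts used in the example below are assumed for illustration only, not taken from our recordings.

```java
public class DataVolume {

    // Raw data size in bytes for an 8-bit recording: width x height pixels
    // (1 byte per pixel) times z-slices, time points, and channels.
    public static long bytes(int width, int height, int zSlices,
                             int timePoints, int channels) {
        return (long) width * height * zSlices * timePoints * channels;
    }
}
```

One 512 × 512 8-bit slice is 256 KB; with, say, 40 z-slices per stack and 300 time points on a single channel, the raw data already exceed 3 GB.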
Imaris and Volocity serve as good examples of commercial image processing and visualization software. Both take 3D or 4D image information as input and create volume- and/or surface-rendered scenes that can be animated over time. During the development of TracePilot, both companies added or improved tracking and tracking data analysis functions in their software, which can be bought as add-on packages.
Imaris' (version 4.1) tracking module, which requires the “measurement pro” add-on, features the automatic detection or manual placement of so-called spots or isosurfaces and the generation of tracks by connecting objects over time. Objects and tracks can be grouped and color-coded according to the grouping, and tracked objects can be overlaid in 3D with the volume data. Although many statistics of spots and surface-rendered objects can be generated, track measurement functions are limited to track length, speed, and the number of splits and merges. Generated tracks can be visualized as lines or cylinders, with time optionally encoded as a changing color, and movies can be exported from the generated scenes. An interesting recent extension, ImarisXT, integrates user-created extensions by means of a COM interface, allowing crosstalk between Imaris and programming languages such as Java or Matlab.
Volocity's measurement tools come as a stand-alone application that works independently of the rendering module. Like Imaris, Volocity (version 3.5) features automatic tracking through the application of a “classifier” to the image data. The classifier identifies objects that can be highlighted in the rendered scene (this feature requires the rendering module) and connected to build tracks, which can also be grouped. Subsequently, track statistics and several different chart types can be generated and exported from the program. Track length, velocity, displacement, displacement rate, and a “meandering index” (directionality) can be measured. Branching and fusion of objects are not implemented, although tracks can be linearly merged to build longer tracks. Manual tracking is not directly supported and requires the user to add coordinates one by one to a spreadsheet with the “magic wand” tool. AVI and QuickTime movies can be exported from rendered sequences.
Despite the availability of sophisticated rendering and (automatic) object tracking software, we believed that an easy-to-use, interactive method for visualizing and displaying tracked cells was lacking. To fill this gap, we created TracePilot.
TracePilot is a Java-based, open-source program that takes tracking coordinates as input data and processes them in a user-interactive way. The program's various tools enable the user to quickly sort cells into groups, identify individual cells on the screen, hide or highlight cells or groups of cells, and to display their movement over time. Importantly, the cell lineage is preserved as an inherent characteristic of the tracked cells, allowing the identification of related cells at any time, independent of grouping.
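The separation between lineage and grouping described above can be sketched as a simple data structure: each tracked cell carries an immutable link to its parent, while its group label remains freely editable. The following Java fragment illustrates this idea only; it is not TracePilot's internal data model.

```java
public class TrackedCell {
    final String id;
    final TrackedCell parent;   // fixed lineage link (null for a founder cell)
    String group;               // user-defined group, can be changed freely

    TrackedCell(String id, TrackedCell parent, String group) {
        this.id = id;
        this.parent = parent;
        this.group = group;
    }

    // Walk up the lineage to the founding cell of this cell family
    TrackedCell founder() {
        TrackedCell c = this;
        while (c.parent != null) c = c.parent;
        return c;
    }

    // Two cells are related if they share the same founder,
    // regardless of how they are currently grouped
    boolean relatedTo(TrackedCell other) {
        return this.founder() == other.founder();
    }
}
```

Because the parent reference is final, regrouping cells (reassigning `group`) can never corrupt the recorded lineage.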
The success of open-source and freeware programs such as NIH Image and its derivatives convinced us to make the TracePilot software available free of charge. Our goal is to enhance the functionality of TracePilot by integrating algorithms for cell movement analysis, and we would greatly appreciate and acknowledge user feedback and suggestions for improvements and functions to be incorporated into the program.
TracePilot executes on the Java platform (version 1.4 and higher). It uses the Java3D library to perform visualizations and the Java Media Framework library to export animations in QuickTime movie format. The program was developed using NetBeans IDE 4.0 (www.netbeans.org). To eliminate potential bugs in critical parts of the program at the earliest stage of its development, several testing classes were created that use JUnit to perform automatic testing of the source code.
Nuclei Tracking, Plots, and Movie Export
The 4D volumes were generated with the WCIF ImageJ distribution (http://www.uhnresearch.ca/facilities/wcif/imagej) using the hypervolume opener (Joachim Walter) and hypervolume browser (Patrick Pirrotte and Jerome Mutterer) plugins. Coordinates were imported into Excel, converted to TracePilot's format, and exported as tab-delimited text files. Screenshots and the Supplementary Movie were exported from TracePilot. Plots (Fig. 2) were exported from three-dimensionally rendered scenes in Matlab using custom routines.
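Producing such tab-delimited input files is straightforward from any environment, not only Excel. The Java sketch below formats one tracked position per line; the column layout (cell id, time point, x, y, z) is hypothetical and chosen purely for illustration; the actual column order expected by TracePilot is described in its documentation.

```java
import java.util.Locale;

public class TrackExport {

    // Format one tracked position as a tab-delimited line.
    // Column order (cell id, time point, x, y, z) is a hypothetical
    // layout for illustration, not TracePilot's documented format.
    public static String toLine(String cellId, int timePoint,
                                double x, double y, double z) {
        return String.format(Locale.US, "%s\t%d\t%.2f\t%.2f\t%.2f",
                             cellId, timePoint, x, y, z);
    }
}
```

Locale.US is used so that decimal points are written as periods regardless of the system locale, which matters when the file is later parsed.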
This paper is dedicated to the memory of the late José Campos-Ortega, a pioneer in the field of developmental neurobiology. We thank members of the Brand laboratory as well as Dieter Göhlmann (Bitplane) and Olaf Michalski (Improvision) for helpful discussions and Ilona Hübner for secretarial assistance.