Automated cell tracking in populations is important for research and discovery in biology and medicine. In this paper, we propose a cell tracking method based on global spatiotemporal data association, which considers hypotheses of initialization, termination, translation, division, and false positives in an integrated formulation. First, reliable tracklets (i.e., short trajectories) are generated by linking detection responses based on frame-by-frame association. Next, these tracklets are globally associated over time to obtain final cell trajectories and lineage trees. During global association, tracklets form tree structures in which a mother cell divides into two daughter cells. We formulate the global association for tree structures as a maximum-a-posteriori (MAP) problem and solve it by linear programming. This approach is quantitatively evaluated on sequences with thousands of cells captured over several days.
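The idea of casting tracklet association as a linear program can be illustrated with a toy example. The sketch below is not the paper's formulation (which also models initialization, termination, division, and false positives as explicit hypotheses); it only shows the core mechanism, assuming hypothetical link costs and flat per-tracklet initialization/termination costs, and solving the LP relaxation with `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical pairwise link costs (e.g., negative log-likelihoods)
# between the ends of 3 earlier tracklets and the starts of 3 later ones.
link_cost = np.array([
    [0.2, 2.0, 3.0],
    [2.5, 0.3, 2.8],
    [3.0, 2.2, 0.4],
])
term_cost = 1.0   # assumed cost of terminating a tracklet
init_cost = 1.0   # assumed cost of starting a new trajectory

n = link_cost.shape[0]
# Decision variables: x[i, j] = 1 if tracklet i is linked to tracklet j.
# Linking i->j saves the termination cost of i and the initialization
# cost of j, so the net objective coefficient is:
c = (link_cost - term_cost - init_cost).ravel()

# Each tracklet end links to at most one start, and vice versa.
A_ub, b_ub = [], []
for i in range(n):           # row constraints: sum_j x[i, j] <= 1
    row = np.zeros(n * n)
    row[i * n:(i + 1) * n] = 1
    A_ub.append(row); b_ub.append(1)
for j in range(n):           # column constraints: sum_i x[i, j] <= 1
    col = np.zeros(n * n)
    col[j::n] = 1
    A_ub.append(col); b_ub.append(1)

res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=(0, 1))
links = res.x.reshape(n, n).round().astype(int)
print(links)  # recovered 0/1 assignment matrix
```

Because the constraint matrix of this assignment polytope is totally unimodular, the LP relaxation yields an integral solution, which is what makes linear programming a practical solver for this class of association problems.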
We present several algorithms for cell image analysis, including microscopy image restoration, cell event detection, and cell tracking in a large population. The algorithms are integrated into an automated system capable of quantifying cell proliferation metrics in vitro in real time. This offers unique opportunities for biological applications, such as efficient cell behavior discovery in response to different cell culturing conditions and adaptive experiment control. We quantitatively evaluated our system's performance on 16 microscopy image sequences, with accuracy satisfactory for biologists' needs. We have also developed a public website compatible with the system's local user interface, allowing biologists to conveniently check their experiment progress online. The website will serve as a community resource that allows other research groups to upload their cell images for analysis and comparison.
Automated digital histopathology image segmentation is an important task for helping pathologists diagnose tumors and cancer subtypes. For pathological diagnosis of cancer subtypes, pathologists usually change the magnification of whole-slide image (WSI) viewers. A key assumption is that the importance of each magnification depends on the characteristics of the input image, such as the cancer subtype. In this paper, we propose a novel semantic segmentation method, called Adaptive-Weighting-Multi-Field-of-View-CNN (AWMF-CNN), that can adaptively use image features from images at different magnifications to segment multiple cancer subtype regions in the input image. The proposed method aggregates several expert CNNs, each trained on images at a different magnification, by adaptively changing the weight of each expert depending on the input image. It thereby leverages information across magnifications that might be useful for identifying the subtypes. It outperformed other state-of-the-art methods in our experiments.
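The aggregation step described above (a weighted combination of expert predictions) can be sketched in a few lines. This is a minimal illustration, not the AWMF-CNN architecture: the expert probabilities are random stand-ins for CNN outputs, and the weighting logits, which in the actual method would be predicted from the input image by an aggregating network, are fixed here for demonstration:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D array of logits."""
    e = np.exp(z - z.max())
    return e / e.sum()

# Stand-in per-pixel class probabilities from 3 expert CNNs trained at
# different magnifications: shape (experts, H, W, classes).
rng = np.random.default_rng(0)
expert_probs = rng.dirichlet(np.ones(4), size=(3, 8, 8))  # (3, 8, 8, 4)

# In the real method, an aggregating network predicts one weight per
# expert from the input image; here we use fixed illustrative logits.
weight_logits = np.array([1.5, 0.2, -0.5])
w = softmax(weight_logits)                      # weights sum to 1

# Weighted sum of expert probability maps, then argmax for the label map.
fused = np.tensordot(w, expert_probs, axes=1)   # (8, 8, 4)
label_map = fused.argmax(axis=-1)               # (8, 8) predicted subtypes
```

Because the weights form a convex combination, the fused output remains a valid per-pixel probability distribution over the subtype classes.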
Cell shape analysis is important in biomedical research. Deep learning methods can segment individual cells given sufficient training data in which the boundary of each cell is annotated. However, preparing such detailed annotations for many cell culture conditions is very time-consuming. In this paper, we propose a weakly supervised method that can segment individual cell regions that touch each other with unclear boundaries in dense conditions, without training data for cell regions. We demonstrated the efficacy of our method on several datasets including multiple cell types captured by several types of microscopy. Our method achieved the highest accuracy compared with several conventional methods. In addition, we demonstrated that our method can perform without any manual annotation by using fluorescence images in which cell nuclei were stained as training data. Code is publicly available at https://github.com/naivete5656/WSISPDR