Summary

Advances in highly multiplexed tissue imaging are transforming our understanding of human biology by enabling detection and localization of 10-100 proteins at subcellular resolution (Bodenmiller, 2016). Efforts are now underway to create public atlases of multiplexed images of normal and diseased tissues (Rozenblatt-Rosen et al., 2020). Both research and clinical applications of tissue imaging benefit from recording data from complete specimens so that data on cell state and composition can be studied in the context of overall tissue architecture. As a practical matter, specimen size is limited by the dimensions of microscopy slides (2.5 × 7.5 cm, or ~2-8 cm² of tissue depending on shape). With current microscopy technology, specimens of this size can be imaged at sub-micron resolution across ~60 spectral channels and ~10⁶ cells, resulting in image files of terabyte size. However, the rich detail and multiscale properties of these images pose a substantial computational challenge (Rashid et al., 2020); see Rashid et al. (2020) for a comparison of existing visualization tools targeting multiplexed tissue images.
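As a rough check on the terabyte figure above, the following back-of-envelope sketch in Python estimates raw image size; the pixel size, bit depth, and channel count are illustrative assumptions chosen to match the ranges quoted in the text, not measurements from any specific instrument.

```python
# Back-of-envelope estimate of raw file size for a whole-slide,
# highly multiplexed image. All parameters are illustrative
# assumptions, not measurements from any specific instrument.

tissue_area_cm2 = 8           # upper end of the ~2-8 cm^2 range
pixel_size_um = 0.5           # "sub-micron resolution"
channels = 60                 # "~60 spectral channels"
bytes_per_pixel = 2           # 16-bit grayscale per channel

area_um2 = tissue_area_cm2 * 1e8          # 1 cm^2 = 1e8 um^2
pixels = area_um2 / (pixel_size_um ** 2)  # pixels per channel
total_bytes = pixels * channels * bytes_per_pixel

print(f"{pixels:.2e} pixels/channel, {total_bytes / 1e12:.2f} TB raw")
# ~3.2e9 pixels/channel and ~0.38 TB raw; with image pyramids and
# pixels finer than 0.5 um, sizes reach the terabyte scale cited above.
```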
The imminent release of tissue atlases combining multichannel microscopy with single-cell sequencing and other omics data from normal and diseased specimens creates an urgent need for data and metadata standards to guide data deposition, curation and release. We describe a Minimum Information about Highly Multiplexed Tissue Imaging (MITI) standard that applies best practices developed for genomics and for other microscopy data to highly multiplexed tissue images and traditional histology.
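To make the idea of a minimum-information record concrete, here is a hypothetical sketch in Python. The field names below are illustrative assumptions only and are not the actual MITI schema, which defines its own fields and controlled vocabularies.

```python
# Hypothetical minimum-information record for a multiplexed tissue
# image. Field names are illustrative assumptions only; they are
# NOT the actual MITI schema.

record = {
    "specimen": {"id": "HTA-0001", "organ": "tonsil", "disease": "normal"},
    "assay": {"method": "CyCIF", "cycles": 10, "channels": 40},
    "antibodies": [
        {"target": "CD45", "clone": "HI30", "fluorophore": "AF647"},
    ],
    "image": {"pixel_size_um": 0.65, "bit_depth": 16, "format": "OME-TIFF"},
}

REQUIRED = {"specimen", "assay", "antibodies", "image"}

def validate(rec: dict) -> list[str]:
    """Return the top-level sections missing from a record."""
    return sorted(REQUIRED - rec.keys())

print(validate(record))  # [] -> all required sections present
```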
Connectomics has recently begun to image brain tissue at nanometer resolution, producing petabytes of data. These data must be aligned, labeled, proofread, and formed into graphs, and each step of this process requires visualization for human verification. We therefore present BUTTERFLY, a scalable middleware platform that handles massive data for interactive visualization in connectomics. Our platform outputs image and geometry data suitable for hardware-accelerated rendering and abstracts low-level data wrangling to enable faster development of new visualizations. We demonstrate scalability and extensibility with a series of open-source Web-based applications for every step of the typical connectomics workflow: data management and storage, informative queries, 2D and 3D visualization, interactive editing, and graph-based analysis. We report design choices for all developed applications and describe typical scenarios of isolated and combined use in everyday connectomics research. In addition, we measure and optimize rendering throughput, from storage to display, in quantitative experiments. Finally, we share insights, experiences, and recommendations for creating an open-source data management and interactive visualization platform for connectomics.
The recent introduction of highly multiplexed imaging of human tissues and tumors promises to fundamentally advance research in tissue biology and human disease. At the same time, histopathology in the clinical setting is undergoing a rapid transition to digital methods. Thus, repositories of imaging data from research and clinical specimens will soon join genomic databases as a means to systematically explore the molecular basis of disease. Even with recent advances in machine learning, experience in anatomic pathology has shown that there is no substitute for expert visual review, annotation, and description of image data. We review the ecosystem of software available for atlas and histopathology images and introduce a new Web-based software tool, Minerva Story, that addresses a critical unmet need. Minerva is an interpretative and interactive guide to complex images organized around guided analysis. We discuss how Minerva and similar software will be integrated into multi-omic browsers for data dissemination of future atlases.
New highly multiplexed imaging technologies have enabled the study of tissues in unprecedented detail. These methods are increasingly being applied to understand how cancer cells and the immune response change during tumor development, progression, and metastasis, as well as following treatment. Yet existing analysis approaches focus on investigating small tissue samples on a per-cell basis, without taking into account the spatial proximity of cells, which indicates cell-cell interactions and specific biological processes in the larger cancer microenvironment. We present Visinity, a scalable visual analytics system to analyze cell interaction patterns across cohorts of whole-slide multiplexed tissue images. Our approach is based on a fast regional neighborhood computation, leveraging unsupervised learning to quantify, compare, and group cells by their surrounding cellular neighborhood. These neighborhoods can be visually analyzed in an exploratory and confirmatory workflow. Users can explore spatial patterns present across tissues through a scalable image viewer and coordinated views highlighting the neighborhood composition and spatial arrangements of cells. To verify or refine existing hypotheses, users can query for specific patterns to determine their presence and statistical significance. Findings can be interactively annotated, ranked, and compared in the form of small multiples. In two case studies with biomedical experts, we demonstrate that Visinity can identify common biological processes within a human tonsil and uncover novel white-blood cell networks and immune-tumor interactions.
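The core idea of quantifying cells by their surrounding neighborhood can be sketched in a few lines of Python. This is an illustrative outline using scikit-learn, not Visinity's actual implementation; the query radius, number of cell types, and cluster count are arbitrary assumptions.

```python
# Illustrative sketch of per-cell neighborhood quantification,
# not Visinity's actual implementation. Each cell gets a vector
# describing the cell-type composition within a fixed radius,
# and cells are then grouped by clustering those vectors.
import numpy as np
from sklearn.neighbors import KDTree
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
xy = rng.uniform(0, 1000, size=(5000, 2))      # cell centroids (um)
cell_type = rng.integers(0, 4, size=5000)      # 4 toy cell types

tree = KDTree(xy)
neighbors = tree.query_radius(xy, r=50.0)      # 50 um radius: assumption

# Neighborhood vector = normalized cell-type counts around each cell.
profiles = np.zeros((len(xy), 4))
for i, idx in enumerate(neighbors):
    counts = np.bincount(cell_type[idx], minlength=4)
    profiles[i] = counts / counts.sum()

# Group cells into recurrent neighborhood patterns.
labels = KMeans(n_clusters=6, n_init="auto").fit_predict(profiles)
print(np.bincount(labels))                     # cells per neighborhood
```

Grouping cells by these composition vectors, rather than by marker expression alone, is what lets recurring spatial arrangements be compared across whole slides and cohorts.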