Single plane wave transmissions are promising for automated imaging tasks requiring high ultrasound frame rates over an extended field of view. However, a single plane wave insonification typically produces suboptimal image quality. To address this limitation, we are exploring the use of deep neural networks (DNNs) as an alternative to delay-and-sum beamforming. The objectives of this work are to obtain information directly from raw channel data and to simultaneously generate both a segmentation map for automated ultrasound tasks and a corresponding ultrasound B-mode image for interpretable supervision of the automation. We focus on visualizing and segmenting anechoic targets surrounded by tissue and ignoring or de-emphasizing less important surrounding structures. DNNs trained with Field II simulations were tested with simulated, experimental phantom, and in vivo datasets that were not included during training. With unfocused input channel data (i.e., prior to the application of receive time delays), simulated, experimental phantom, and in vivo test datasets achieved mean ± standard deviation Dice similarity coefficients of 0.92 ± 0.13, 0.92 ± 0.03, and 0.77 ± 0.07, respectively, and generalized contrast-to-noise ratios (gCNR) of 0.95 ± 0.08, 0.93 ± 0.08, and 0.75 ± 0.14, respectively. With subaperture beamformed channel data and a modification to the input layer of the DNN architecture to accept these data, the fidelity of image reconstruction increased (e.g., mean gCNR of multiple acquisitions of two in vivo breast cysts ranged from 0.89 to 0.96), but DNN display frame rates were reduced from 395 Hz to 287 Hz. Overall, the DNNs successfully translated feature representations learned from simulated data to phantom and in vivo data, which is promising for this novel approach to simultaneous ultrasound image formation and segmentation.
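The two metrics reported above have standard definitions: the Dice similarity coefficient is 2|A ∩ B| / (|A| + |B|) for binary masks A and B, and the gCNR is 1 − OVL, where OVL is the overlap of the pixel-intensity distributions of the target and background regions. A minimal NumPy sketch of both (function names and the histogram bin count are illustrative choices, not from the paper):

```python
import numpy as np

def dice_coefficient(pred, target):
    """Dice similarity coefficient between two binary masks:
    DSC = 2|A ∩ B| / (|A| + |B|)."""
    pred = np.asarray(pred).astype(bool)
    target = np.asarray(target).astype(bool)
    denom = pred.sum() + target.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, target).sum() / denom

def gcnr(inside, outside, bins=256):
    """Generalized contrast-to-noise ratio: gCNR = 1 - OVL, where OVL is
    the overlap of the normalized intensity histograms of two regions."""
    lo = min(inside.min(), outside.min())
    hi = max(inside.max(), outside.max())
    p, _ = np.histogram(inside, bins=bins, range=(lo, hi))
    q, _ = np.histogram(outside, bins=bins, range=(lo, hi))
    p = p / p.sum()  # normalize counts to probability mass
    q = q / q.sum()
    return 1.0 - np.minimum(p, q).sum()
```

A gCNR of 1 means the target and background intensity distributions are perfectly separable; a gCNR of 0 means they overlap completely, which is why it is a convenient lesion-detectability measure for comparing beamformers.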
High‐throughput tissue barrier models can yield critical insights on how barrier function responds to therapeutics, pathogens, and toxins. However, such models often emphasize multiplexing capability at the expense of physiologic relevance. In particular, the distal lung's air–blood barrier is typically modeled with epithelial cell monoculture, neglecting the substantial contribution of endothelial cell feedback in the coordination of barrier function. An obstacle to establishing high‐throughput coculture models relevant to the epithelium/endothelium interface is the requirement for underside cell seeding, which is difficult to miniaturize and automate. Therefore, this paper describes a scalable, low‐cost seeding method that eliminates inversion by optimizing medium density to float cells so they attach under the membrane. This method generates a 96‐well model of the distal lung epithelium–endothelium barrier with serum‐free, glucocorticoid‐free air–liquid differentiation. The polarized epithelial–endothelial coculture exhibits mature barrier function, appropriate intercellular junction staining, and epithelial‐to‐endothelial transmission of inflammatory stimuli such as polyinosinic:polycytidylic acid (poly(I:C)). Further, exposure to influenza A virus PR8 and human beta‐coronavirus OC43 initiates a dose‐dependent inflammatory response that propagates from the epithelium to endothelium. While this model focuses on the air–blood barrier, the underside seeding method is generalizable to various coculture tissue models for scalable, physiologic screening.
Blinking, the transient occlusion of the eye by one or more membranes, serves several functions including wetting, protecting, and cleaning the eye. This behavior is seen in nearly all living tetrapods and absent in other extant sarcopterygian lineages, suggesting that it might have arisen during the water-to-land transition. Unfortunately, our understanding of the origin of blinking has been limited by a lack of known anatomical correlates of the behavior in the fossil record and a paucity of comparative functional studies. To understand how and why blinking originates, we leverage mudskippers (Oxudercinae), a clade of amphibious fishes that have convergently evolved blinking. Using microcomputed tomography and histology, we analyzed two mudskipper species, Periophthalmus barbarus and Periophthalmodon septemradiatus, and compared them to the fully aquatic round goby, Neogobius melanostomus. Study of gross anatomy and epithelial microstructure shows that mudskippers have not evolved novel musculature or glands to blink. Behavioral analyses show the blinks of mudskippers are functionally convergent with those of tetrapods: P. barbarus blinks more often under high-evaporation conditions to wet the eye, a blink reflex protects the eye from physical insult, and a single blink can fully clean the cornea of particulates. Thus, eye retraction in concert with a passive occlusal membrane can achieve functions associated with life on land. Osteological correlates of eye retraction are present in the earliest limbed vertebrates, suggesting blinking capability. In both mudskippers and tetrapods, therefore, the origin of this multifunctional innovation is likely explained by selection for increasingly terrestrial lifestyles.
The mistakes and corrections are as follows:
• In the Figure 3A caption, "n = 3 technical replicates per MOI." should read "n = 3 technical replicates per MOI (24h); n = 2 technical replicates per MOI (48h and 72h)."
• In Figure 3A, the y-axis units "μg/cm²" should read "pg/cm²".
• In the Statistical Analysis section, it was originally reported that Figure 3A data were analyzed with a 1-way ANOVA. This should read "2-way ANOVA".
The overall results of the study and implications of the paper remain the same.