Fusion transcripts are used as biomarkers in companion diagnostics. Although more than 15,000 fusion RNAs have been identified across diverse cancer types, few common features have been reported. Here, we compared 16,410 fusion transcripts detected in cancer (from a published cohort of 9,966 tumor samples spanning 33 cancer types) with genome-wide RNA–DNA interactions mapped in two normal, noncancerous cell types [using iMARGI, an enhanced version of the mapping of RNA–genome interactions (MARGI) assay]. Among the top 10 most significant RNA–DNA interactions in normal cells, 5 colocalized with the gene pairs that formed fusion RNAs in cancer. Furthermore, throughout the genome, the frequency with which a gene pair exhibits RNA–DNA interactions is positively correlated with the probability that this gene pair presents documented fusion transcripts in cancer. To test whether RNA–DNA interactions in normal cells are predictive of fusion RNAs, we analyzed a validation cohort of 96 lung cancer samples using RNA sequencing (RNA-seq). Thirty-seven of 42 fusion transcripts in the validation cohort were found to exhibit RNA–DNA interactions in normal cells. Finally, by combining RNA-seq, single-molecule RNA FISH, and DNA FISH, we detected a cancer sample expressing EML4-ALK fusion RNA without formation of the EML4-ALK fusion gene. Collectively, these data suggest an RNA-poise model, in which spatial proximity of RNA and DNA could poise for the creation of fusion transcripts.
Multiple familial trichoepithelioma (MFT) and familial cylindromatosis are two clinically distinct cancer syndromes. MFT patients develop trichoepitheliomas mostly on the face, whereas cylindromatosis patients develop cylindromas predominantly (approximately 90%) on the head and neck. However, MFT is occasionally associated with familial cylindromatosis, and cylindromatosis patients can also develop trichoepitheliomas. This has led to the speculation that the two types of dermatoses may be caused by dysfunction of a common pathway. Previously, a candidate MFT locus was mapped to 9p21, and the disease gene for familial cylindromatosis, the CYLD gene located on 16q21-13, has been identified. Here, we show that mutations in the CYLD gene are also the genetic basis of MFT in three different Chinese families. Sequence analysis revealed a single-nucleotide deletion, c.1462delA (p.Ile488fsX9) in exon 9; a nonsense mutation, c.2128C>T (p.Gln710X) in exon 17; and a missense mutation, c.2822A>T (p.Asp941Val) in exon 21, in each of the three families, respectively. This provides direct evidence that mutations in CYLD can cause two clinically distinct cancer syndromes.
Summary

Background: Pioneering efforts have been made to facilitate the recognition of pathology in malignancies based on whole-slide images (WSIs) through deep learning approaches. It remains unclear whether we can accurately detect and locate basal cell carcinoma (BCC) using smartphone-captured images.

Objectives: To develop deep neural network frameworks for accurate BCC recognition and segmentation based on smartphone-captured microscopic ocular images (MOIs).

Methods: We collected a total of 8046 MOIs, 6610 of which had binary classification labels and the remaining 1436 of which had pixelwise annotations. In addition, 128 WSIs were collected for comparison. Two deep learning frameworks were created. The 'cascade' framework used a classification model to identify hard cases (images with low prediction confidence) and a segmentation model for further in-depth analysis of those hard cases. The 'segmentation' framework directly segmented and classified all images. Sensitivity, specificity and area under the curve (AUC) were used to evaluate the overall performance of BCC recognition.

Results: The MOI- and WSI-based models achieved comparable AUCs of around 0·95. The 'cascade' framework achieved 0·93 sensitivity and 0·91 specificity. The 'segmentation' framework was more accurate but required more computational resources, achieving 0·97 sensitivity, 0·94 specificity and 0·987 AUC. The runtime of the 'segmentation' framework was 15·3 ± 3·9 s per image, whereas the 'cascade' framework took 4·1 ± 1·4 s. Additionally, the 'segmentation' framework achieved a mean intersection over union of 0·863.

Conclusions: Based on MOIs readily obtained via smartphone photography, we developed two deep learning frameworks for recognizing BCC pathology with high sensitivity and specificity. This work opens a new avenue for automatic BCC diagnosis in different clinical scenarios.

What's already known about this topic?
- The diagnosis of basal cell carcinoma (BCC) is labour intensive owing to the large number of images to be examined, especially when consecutive slide reading is needed in Mohs surgery.
- Deep learning approaches have demonstrated promising results on pathological image-related diagnostic tasks.
- Previous studies have focused on whole-slide images (WSIs) and leveraged classification on image patches for detecting and localizing breast cancer metastases.

What does this study add?

- Instead of WSIs, microscopic ocular images (MOIs) photographed from microscope eyepieces using smartphone cameras were used to develop neural network models for recognizing BCC automatically.
- The MOI- and WSI-based models achieved comparable areas under the curve of around 0·95.
- Two deep learning frameworks for recognizing BCC pathology were developed with high sensitivity and specificity.
- Recognizing BCC through a smartphone could be considered a future clinical choice.
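The 'cascade' framework described above routes each image through a fast classifier and escalates only low-confidence ("hard") cases to the slower segmentation model. A minimal sketch of that routing logic is shown below; the function name `cascade_predict`, the confidence thresholds, and the stub models are all illustrative assumptions, not the study's actual implementation.

```python
# Hypothetical sketch of the 'cascade' inference logic: a classifier
# handles confident cases, and images whose predicted probability falls
# in an uncertain middle band are escalated to a segmentation model.
# Thresholds and model interfaces are assumptions for illustration.

def cascade_predict(image, classifier, segmenter, low=0.2, high=0.8):
    """Return (label, route) for one image.

    classifier(image) -> probability that the image contains BCC.
    segmenter(image)  -> fraction of pixels predicted as BCC.
    Probabilities between `low` and `high` are treated as hard cases
    and re-examined with the (slower) pixelwise segmenter.
    """
    p = classifier(image)
    if p >= high:
        return "BCC", "classifier"       # confident positive
    if p <= low:
        return "benign", "classifier"    # confident negative
    # Hard case: fall back to in-depth segmentation.
    seg_fraction = segmenter(image)
    label = "BCC" if seg_fraction > 0.0 else "benign"
    return label, "segmenter"


# Stub models standing in for the trained networks.
print(cascade_predict("img1", lambda im: 0.95, lambda im: 0.0))  # easy case
print(cascade_predict("img2", lambda im: 0.50, lambda im: 0.1))  # hard case
```

This routing explains the runtime difference reported in the abstract: most images exit after the cheap classification step, so the cascade averages well under the per-image cost of running segmentation on everything.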