Snakes possess a unique sensory system for detecting infrared radiation, enabling them to generate a ‘thermal image’ of predators or prey. Infrared signals are initially received by the pit organ, a highly specialized facial structure that is innervated by nerve fibers of the somatosensory system. How this organ detects and transduces infrared signals into nerve impulses is not known. Here we use an unbiased transcriptional profiling approach to identify TRPA1 channels as infrared receptors on sensory nerve fibers that innervate the pit organ. TRPA1 orthologues from pit-bearing snakes (vipers, pythons, and boas) are the most heat-sensitive vertebrate ion channels thus far identified, consistent with their role as primary transducers of infrared stimuli. Thus, snakes detect infrared signals through a mechanism involving radiant heating of the pit organ, rather than photochemical transduction. These findings illustrate the broad evolutionary tuning of TRP channels as thermosensors in the vertebrate nervous system.
Context: The Igls criteria were developed to provide a consensus definition for outcomes of β-cell replacement therapy in the treatment of diabetes during a January 2017 workshop sponsored by the International Pancreas & Islet Transplant Association (IPITA) and the European Pancreas & Islet Transplant Association (EPITA). In July 2019, a symposium at the 17th IPITA World Congress was held to examine the Igls criteria after two years in clinical practice, including validation against continuous glucose monitoring (CGM)-derived glucose targets, and to propose future refinements that would allow for comparison of outcomes with artificial pancreas system approaches. Evidence acquisition: Utilization of the criteria in various clinical and research settings was illustrated by population as well as individual outcome data from four islet and/or pancreas transplant centers. Validation against CGM metrics was conducted in 55 islet transplant recipients followed for up to 10 years at a fifth center. Evidence synthesis: The Igls criteria provided meaningful clinical assessment at the individual patient and treatment group level, allowing for comparison both within and between different β-cell replacement modalities. Important limitations include the need to account for changes in insulin requirements and C-peptide levels relative to baseline. In islet transplant recipients, CGM glucose time-in-range improved with each category of increasing β-cell graft function. Conclusions: Future Igls 2.0 criteria should consider absolute rather than relative levels of insulin use and C-peptide as qualifiers, with treatment success based on glucose assessment using CGM metrics on par with assessment of HbA1c and severe hypoglycemia events.
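The CGM validation above hinges on time-in-range, the fraction of sensor glucose readings falling within a target band (commonly 70–180 mg/dL). A minimal sketch of that calculation in Python; the function name, the band defaults, and the sample readings are illustrative assumptions, not values from the study:

```python
from typing import Sequence

def time_in_range(glucose_mg_dl: Sequence[float],
                  low: float = 70.0,
                  high: float = 180.0) -> float:
    """Return the fraction of CGM readings inside the target band [low, high]."""
    if len(glucose_mg_dl) == 0:
        raise ValueError("no CGM readings supplied")
    in_band = sum(low <= g <= high for g in glucose_mg_dl)
    return in_band / len(glucose_mg_dl)

# Hypothetical readings in mg/dL
readings = [95, 110, 152, 201, 88, 176, 64, 143]
print(f"time in range: {time_in_range(readings):.0%}")  # 75%
```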
Background: Scoring systems have been proposed to select donation after circulatory death (DCD) donors and recipients for liver transplantation (LT). We hypothesized that complex scoring systems derived in large datasets might not predict outcomes locally. Methods: Based on 1-year DCD-LT graft survival predictors in multivariate logistic regression models, we designed, validated, and compared a simple index using the University of California, San Francisco (UCSF) cohort (n = 136) and a universal-comprehensive (UC)-DCD score using the United Network for Organ Sharing (UNOS) cohort (n = 5,792) against previously published DCD scoring systems. Results: The total warm ischemia time (WIT) index included donor WIT (dWIT) and hepatectomy time (dHep). The UC-DCD score included dWIT, dHep, recipient on mechanical ventilation, transjugular intrahepatic portosystemic shunt, cause of liver disease, model for end-stage liver disease, body mass index, donor/recipient age, and cold ischemia time. In the UNOS cohort, the UC-DCD score outperformed all previously published scores in predicting DCD-LT graft survival (AUC: 0.635 vs. ≤0.562). In the UCSF cohort, the total WIT index successfully stratified survival and biliary complications, whereas other scores did not. Conclusion: DCD risk scores generated in large cohorts provide general guidance for safe recipient/donor selection, but they must be tailored based on non- or partially-modifiable local circumstances to expand DCD utilization.
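The workflow behind such a score, fitting a multivariate logistic regression for 1-year graft loss on the listed predictors and comparing candidate scores by AUC, can be sketched as below. This is not the authors' code; the file name, column names, and train/test split are assumptions for illustration only:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical columns mirroring the predictors named in the abstract.
predictors = ["dwit_min", "dhep_min", "recipient_ventilated", "tips",
              "liver_disease_cause", "meld", "bmi",
              "donor_age", "recipient_age", "cit_hours"]

df = pd.read_csv("dcd_lt_cohort.csv")          # assumed file layout
X = pd.get_dummies(df[predictors], drop_first=True)
y = df["graft_loss_1yr"]                       # 1 = graft lost within a year

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
risk = model.predict_proba(X_te)[:, 1]         # continuous risk score

print(f"AUC of the fitted score: {roc_auc_score(y_te, risk):.3f}")
```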
Donation after circulatory death (DCD), donation after brain death (DBD), and living donation (LD) are the three possible options for liver transplantation (LT), each with unique benefits and complication rates. We aimed to compare DCD-, DBD-, and LD-LT-specific graft survival and biliary complications (BC). We collected data on 138 DCD-, 3,027 DBD-, and 318 LD-LT adult recipients from a single center and analyzed patient/graft survival. BC (leaks and anastomotic/non-anastomotic strictures (AS/NAS)) were analyzed in a subset of 414 patients. One-/five-year graft survival was 88.6%/70.0% for DCD-LT, 92.6%/79.9% for DBD-LT, and 91.7%/82.9% for LD-LT. DCD-LTs had a 1.7-/1.3-fold adjusted risk of losing their graft compared to DBD-LT and LD-LT, respectively (p < 0.010/0.403). Bile leaks were present in 10.1% of DCD-LTs, 7.2% of DBD-LTs, and 36.2% of LD-LTs (ORs, DBD/LD vs. DCD: 0.7/4.2, p = 0.402/<0.001). AS developed in 28.3% of DCD-LTs, 18.1% of DBD-LTs, and 43.5% of LD-LTs (ORs, DBD/LD vs. DCD: 0.5/1.8, p = 0.018/0.006). NAS were present in 15.2% of DCD-LTs, 1.4% of DBD-LTs, and 4.3% of LD-LTs (ORs, DBD/LD vs. DCD: 0.1/0.3, p = 0.001/0.005). LTs without BC had better graft survival than any group with BC. DCD-LT and LD-LT had excellent graft survival despite significantly higher BC rates compared to DBD-LT. DCD-LT represents a valid alternative whose importance should increase further with machine perfusion systems.
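The "adjusted risk" of graft loss quoted above is the kind of estimate that typically comes from a proportional-hazards model adjusted for recipient covariates. A minimal sketch under that assumption, using the lifelines library; the file name, column names, and adjustment covariates are hypothetical, not taken from the study:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical per-recipient table: follow-up time, graft-loss event indicator,
# donor-type indicators (DBD as the reference group), and adjustment covariates.
df = pd.read_csv("lt_graft_survival.csv")
cols = ["followup_years", "graft_lost", "is_dcd", "is_ld",
        "recipient_age", "meld"]

cph = CoxPHFitter()
cph.fit(df[cols], duration_col="followup_years", event_col="graft_lost")
cph.print_summary()  # exp(coef) on is_dcd ~ adjusted hazard of graft loss vs. DBD
```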
Objective: We aimed to compare general surgery emergency (GSE) volume, demographics, and disease severity before and during COVID-19. Background: Presentations to the emergency department (ED) for GSEs fell during the early COVID-19 pandemic. Barriers to accessing care may have been heightened, especially for vulnerable populations, and delays in seeking care raise public health concerns. Methods: We included adult patients with ED presentations for potential GSEs at a single quaternary-care hospital from January 2018 to August 2020. To compare GSE volumes in total and by subgroup, an interrupted time-series analysis was performed using the March shelter-in-place order as the start of the COVID-19 period. Bivariate analysis was used to compare demographics and disease severity. Results: 3255 patients (28/week) presented with potential GSEs before COVID-19, while 546 (23/week) presented during COVID-19. When shelter-in-place started, presentations fell by 8.7/week (31%) from the previous week (p<0.001), driven by decreases in peritonitis (β=−2.76, p=0.017) and gallbladder disease (β=−2.91, p=0.016). During COVID-19, patients were younger (54 vs 57 years, p=0.001), more often privately insured (44% vs 38%, p=0.044), and fewer required interpreters (12% vs 15%, p<0.001). Fewer patients presented with sepsis during the pandemic (15% vs 20%, p=0.009) and the average severity of illness decreased (p<0.001). Length of stay was shorter during the COVID-19 period (3.91 vs 5.50 days, p<0.001). Conclusions: GSE volumes and severity fell during the pandemic. Patients presenting during the pandemic were less likely to be elderly, publicly insured, or have limited English proficiency, potentially exacerbating underlying health disparities and highlighting the need to improve care access for these patients. Level of evidence: III.
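The interrupted time-series analysis described above is, in essence, a segmented regression on weekly counts with terms for the pre-existing trend, a level change at the shelter-in-place order, and a post-order slope change. A minimal sketch of that model in Python with statsmodels; the file name, column names, and the week index of the order are assumptions, not study data:

```python
import pandas as pd
import statsmodels.formula.api as smf

SHELTER_IN_PLACE_WEEK = 115   # assumed index of the week the order took effect

# Hypothetical weekly counts of potential GSE presentations (columns: week, cases).
df = pd.read_csv("weekly_gse_counts.csv")
df["post"] = (df["week"] >= SHELTER_IN_PLACE_WEEK).astype(int)
df["weeks_post"] = (df["week"] - SHELTER_IN_PLACE_WEEK).clip(lower=0)

# Segmented regression: 'post' captures the level change at the order,
# 'weeks_post' captures the change in slope afterwards.
model = smf.ols("cases ~ week + post + weeks_post", data=df).fit()
print(model.params[["post", "weeks_post"]])
```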