Diagnostic point-of-care (POC) testing is intended to minimize the time to obtain a test result, thereby allowing clinicians and patients to make an expeditious clinical decision. As POC tests expand into resource-limited settings (RLS), the benefits must outweigh the costs. To optimize POC testing in RLS, diagnostic POC tests need rigorous evaluations focused on relevant clinical outcomes and operational costs, which differ from evaluations of conventional diagnostic tests. Here, we reviewed published studies on POC testing in RLS, and found no clearly defined metric for the clinical utility of POC testing. Therefore, we propose a framework for evaluating POC tests, and suggest and define the term “test efficacy” to describe a diagnostic test’s capacity to support a clinical decision within its operational context. We also propose revised criteria for an ideal diagnostic POC test in resource-limited settings. Through systematic evaluations, comparisons between centralized diagnostic testing and novel POC technologies can be more formalized, and health officials can better determine which POC technologies represent valuable additions to their clinical programs.
Individuals with latent tuberculosis had a 79% lower risk of progressive tuberculosis after reinfection than uninfected individuals. The risk reduction estimated in this study is greater than most previous estimates made through population models.
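For reference, a figure such as "79% lower risk" is the complement of an estimated risk (or rate) ratio; the relation below is a generic algebraic illustration, not the study's actual estimation procedure:

```latex
% Generic relation between a percentage risk reduction and a risk/rate ratio
\text{risk reduction} \;=\; 1 - \mathrm{RR}
\quad\Longrightarrow\quad
\mathrm{RR} \;=\; 1 - 0.79 \;=\; 0.21
```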
In developed nations, monitoring for drug-induced liver injury via serial measurements of serum transaminases (aspartate aminotransferase (AST) and alanine aminotransferase (ALT)) in at-risk individuals is the standard of care. In resource-limited settings, however, monitoring for drug-related hepatotoxicity is often constrained by expense and logistics, even for patients at highest risk. This manuscript describes the development and clinical testing of a paper-based, multiplexed microfluidic assay designed for rapid, semi-quantitative measurement of AST and ALT in a fingerstick specimen. Using 223 clinical specimens obtained by venipuncture and 10 fingerstick specimens from healthy volunteers, we have shown that our assay can, in 15 minutes, provide visual measurements of AST and ALT in whole blood or serum that allow the user to place those values into one of three readout “bins” (<3x upper limit of normal (ULN), 3-5x ULN, and >5x ULN, corresponding to tuberculosis/HIV treatment guidelines) with >90% accuracy. These data suggest that a fully developed point-of-care fingerstick device could have a high impact on patient care in low-resource settings.
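The three-bin readout described above lends itself to a simple illustration. The following sketch assumes nominal ULN values and an illustrative function name; it is not the authors' device readout or analysis code:

```python
# Illustrative binning of a transaminase result relative to its upper limit
# of normal (ULN), mirroring the <3x / 3-5x / >5x ULN readout bins described
# in the abstract. The ULN values are assumptions for this example only.

AST_ULN = 40  # U/L, assumed upper limit of normal
ALT_ULN = 40  # U/L, assumed upper limit of normal

def uln_bin(value_u_per_l: float, uln: float) -> str:
    """Place a transaminase value into one of the three readout bins."""
    ratio = value_u_per_l / uln
    if ratio < 3:
        return "<3x ULN"
    elif ratio <= 5:
        return "3-5x ULN"
    else:
        return ">5x ULN"

# Example: an ALT of 170 U/L with an assumed ULN of 40 U/L falls in the 3-5x bin.
print(uln_bin(170, ALT_ULN))  # "3-5x ULN"
```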
Background: HIV counseling and testing may serve as an entry point for non-communicable disease screening. Objectives: To determine the yield of newly diagnosed HIV, tuberculosis (TB) symptoms, diabetes, and hypertension, and to assess CD4 count testing, linkage to care, and correlates of linkage and barriers to care from a mobile testing unit. Methods: A mobile unit provided screening for HIV, TB symptoms, diabetes, and hypertension in Cape Town, South Africa between March 2010 and September 2011. The yield of newly diagnosed cases of these conditions was measured, and clients were followed up between January and November 2011 to assess linkage. Linkage to care was defined as accessing care within one, three, or six months post-HIV diagnosis (depending on CD4 count) and within one month post-diagnosis for the other conditions. Clinical and socio-demographic correlates of linkage to care were evaluated using Poisson regression, and barriers to care were determined. Results: Of 9,806 clients screened, the yield of new diagnoses was: HIV (5.5%), TB suspects (10.1%), diabetes (0.8%), and hypertension (58.1%). Linkage to care for HIV-infected clients, TB suspects, diabetics, and hypertensives was 51.3%, 56.7%, 74.1%, and 50.0%, respectively. Only disclosure of HIV-positive status to family members or partners (RR=2.6, 95% CI: 1.04-6.3, p=0.04) was independently associated with linkage to HIV care. The main barrier to care reported by all groups was lack of time to access a clinic. Conclusion: Screening for HIV, TB symptoms, and hypertension at mobile units in South Africa has a high yield but inadequate linkage to care. After-hours and weekend clinics may address a major barrier to accessing care.
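Two of the quantities reported above, screening yield and relative risks of linkage estimated by Poisson regression, can be sketched as follows. The toy data, variable names, and robust-variance choice are illustrative assumptions and do not reproduce the study's dataset or exact model:

```python
# Sketch: screening yield and a modified Poisson regression (Poisson GLM with
# robust standard errors) to estimate relative risks of linkage to care.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def screening_yield(new_diagnoses: int, clients_screened: int) -> float:
    """Yield = newly diagnosed cases / clients screened."""
    return new_diagnoses / clients_screened

print(f"Example yield: {screening_yield(55, 1000):.1%}")  # illustrative counts

# Toy cohort: outcome is linkage to HIV care; exposure is disclosure of status.
df = pd.DataFrame({
    "linked":    [1, 0, 1, 1, 0, 1, 0, 1],
    "disclosed": [1, 0, 1, 1, 0, 0, 0, 1],
    "age":       [34, 28, 45, 39, 22, 31, 27, 50],
})
X = sm.add_constant(df[["disclosed", "age"]])
fit = sm.GLM(df["linked"], X, family=sm.families.Poisson()).fit(cov_type="HC0")
rr_disclosed = np.exp(fit.params["disclosed"])  # log link: exp(coef) is a relative risk
print(f"RR of linkage for disclosure: {rr_disclosed:.2f}")
```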
BACKGROUND The cost-effectiveness of early antiretroviral therapy (ART) in persons infected with human immunodeficiency virus (HIV) in serodiscordant couples is not known. Using a computer simulation of the progression of HIV infection and data from the HIV Prevention Trials Network 052 study, we projected the cost-effectiveness of early ART for such persons. METHODS For HIV-infected partners in serodiscordant couples in South Africa and India, we compared the early initiation of ART with delayed ART. Five-year and lifetime outcomes included cumulative HIV transmissions, life-years, costs, and cost-effectiveness. We classified early ART as very cost-effective if its incremental cost-effectiveness ratio was less than the annual per capita gross domestic product (GDP; $8,100 in South Africa and $1,500 in India), as cost-effective if the ratio was less than three times the GDP, and as cost-saving if it resulted in a decrease in total costs and an increase in life-years, as compared with delayed ART. RESULTS In South Africa, early ART prevented opportunistic diseases and was cost-saving over a 5-year period; over a lifetime, it was very cost-effective ($590 per life-year saved). In India, early ART was cost-effective ($1,800 per life-year saved) over a 5-year period and very cost-effective ($530 per life-year saved) over a lifetime. In both countries, early ART prevented HIV transmission over short periods, but longer survival attenuated this effect; the main driver of life-years saved was a clinical benefit for treated patients. Early ART remained very cost-effective over a lifetime under most modeled assumptions in the two countries. CONCLUSIONS In South Africa, early ART was cost-saving over a 5-year period. In both South Africa and India, early ART was projected to be very cost-effective over a lifetime. With individual, public health, and economic benefits, there is a compelling case for early ART for serodiscordant couples in resource-limited settings. (Funded by the National Institute of Allergy and Infectious Diseases and others.)
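The classification rule stated in the abstract (cost-saving, very cost-effective below per capita GDP, cost-effective below three times per capita GDP) can be expressed as a short decision function. This is a hedged sketch of the threshold logic only, with illustrative names and inputs, not the study's simulation model:

```python
# Classify an intervention against GDP-based cost-effectiveness thresholds,
# as described in the abstract. Inputs are increments versus the comparator.

def classify_strategy(delta_cost: float, delta_life_years: float,
                      gdp_per_capita: float) -> str:
    if delta_life_years <= 0:
        return "not cost-effective (no life-year gain)"
    if delta_cost < 0:
        return "cost-saving"                       # lower cost, more life-years
    icer = delta_cost / delta_life_years           # incremental cost-effectiveness ratio
    if icer < gdp_per_capita:
        return "very cost-effective"
    if icer < 3 * gdp_per_capita:
        return "cost-effective"
    return "not cost-effective"

# Example with assumed increments: an ICER of $530 per life-year saved against
# India's per capita GDP of $1,500 classifies as very cost-effective.
print(classify_strategy(delta_cost=530.0, delta_life_years=1.0,
                        gdp_per_capita=1500.0))
```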
Background: No published systematic reviews have assessed the natural history of colonization with methicillin-resistant Staphylococcus aureus (MRSA) or vancomycin-resistant Enterococcus (VRE). Time to clearance of colonization has important implications for patient care and infection control policy. Methods: We performed parallel searches in OVID Medline for studies that reported the time to documented clearance of MRSA and VRE colonization in the absence of treatment, published between January 1990 and July 2012. Results: For MRSA, we screened 982 articles and identified 16 eligible studies (13 observational studies and 3 randomized controlled trials), for a total of 1,804 non-duplicated subjects. For VRE, we screened 284 articles and identified 13 eligible studies (12 observational studies and 1 randomized controlled trial), for a total of 1,936 non-duplicated subjects. Studies reported varying definitions of clearance of colonization; no study reported the time of initial colonization. Studies also varied in the frequency of sampling, the assays used, and the follow-up period. The median duration of total follow-up was 38 weeks for MRSA and 25 weeks for VRE. Based on pooled analyses, the model-estimated median time to clearance was 88 weeks after documented colonization for MRSA-colonized patients and 26 weeks for VRE-colonized patients. In a secondary analysis, clearance rates for MRSA and VRE were compared by restricting the duration of follow-up for the MRSA studies to the maximum observed time point for the VRE studies (43 weeks). With this restriction, the model-fitted median time to documented clearance for MRSA was 41 weeks after documented colonization, demonstrating the sensitivity of the pooled estimate to the length of study follow-up. Conclusions: Few available studies report the natural history of MRSA and VRE colonization. The lack of a consistent definition of clearance, uncertainty regarding the time of initial colonization, and variation in the frequency of sampling for persistent colonization, the assays employed, and the duration of follow-up limit the existing published literature. This heterogeneity of study characteristics limits interpretation of pooled estimates of time to clearance; however, the studies included in this review suggest an increase in documented clearance over time, a result that is sensitive to the duration of follow-up.
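For intuition about a "model-estimated median time to clearance", here is a minimal sketch under a constant-hazard (exponential) clearance assumption. The review's actual pooled model is not specified here, and the input counts below are assumptions chosen only to illustrate the scale of the MRSA estimate:

```python
# Median time to clearance under an assumed exponential (constant-hazard) model.
import math

def median_time_to_clearance(clearance_events: int, person_weeks: float) -> float:
    """Median of an exponential time-to-clearance distribution: ln(2) / rate."""
    rate_per_week = clearance_events / person_weeks
    return math.log(2) / rate_per_week

# Example: 50 documented clearances over 6,350 person-weeks of follow-up
# gives a median of roughly 88 weeks.
print(f"{median_time_to_clearance(50, 6350):.0f} weeks")
```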
Developing cortical GABAergic interneurons rely on genetic programs, neuronal activity, and environmental cues to construct inhibitory circuits during early postnatal development. Disruption of these events can cause long-term changes in cortical inhibition and may be involved in neurological disorders associated with inhibitory circuit dysfunction. We hypothesized that tonic glutamate signaling in the neonatal cortex contributes to, and is necessary for, the maturation of cortical interneurons. To test this hypothesis, we used mice of both sexes to quantify extracellular glutamate concentrations in the cortex during development, measure ambient glutamate-mediated activation of developing cortical interneurons, and manipulate tonic glutamate signaling using subtype-specific NMDA receptor antagonists in vitro and in vivo. We report that ambient glutamate levels are high (≈100 nM) in the neonatal cortex and decrease (to ≈50 nM) during the first weeks of life, coincident with increases in astrocytic glutamate uptake. Consistent with elevated ambient glutamate, putative parvalbumin-positive interneurons in the cortex (identified using G42:GAD1-eGFP reporter mice) exhibit a transient, tonic NMDA current at the end of the first postnatal week. GluN2C/GluN2D-containing NMDA receptors mediate the majority of this current and contribute to the resting membrane potential and intrinsic properties of developing putative parvalbumin interneurons. Pharmacological blockade of GluN2C/GluN2D-containing NMDA receptors in vivo during the period of tonic interneuron activation, but not later, leads to lasting decreases in interneuron morphological complexity and causes deficits in cortical inhibition later in life. These results demonstrate that dynamic ambient glutamate signaling contributes to cortical interneuron maturation via tonic activation of GluN2C/GluN2D-containing NMDA receptors. Inhibitory GABAergic interneurons make up 20% of cortical neurons and are critical to controlling cortical network activity. Dysfunction of cortical inhibition is associated with multiple neurological disorders, including epilepsy. Establishing inhibitory cortical networks requires in utero proliferation, differentiation, and migration of immature GABAergic interneurons, and subsequent postnatal morphological maturation and circuit integration. Here, we demonstrate that ambient glutamate provides tonic activation of immature, putative parvalbumin-positive GABAergic interneurons in the neonatal cortex via high-affinity NMDA receptors. When this activation is blocked, GABAergic interneuron maturation is disrupted, and cortical networks exhibit lasting abnormal hyperexcitability. We conclude that temporally precise activation of developing cortical interneurons by ambient glutamate is critically important for establishing normal cortical inhibition.