Background During the COVID-19 pandemic, the scarcity of resources has necessitated triage of critical care for patients with the disease. In patients aged 65 years and older, triage decisions are regularly based on degree of frailty measured by the Clinical Frailty Scale (CFS). However, the CFS could also be useful in patients younger than 65 years. We aimed to examine the association between CFS score and hospital mortality and between CFS score and admission to intensive care in adult patients of all ages with COVID-19 across Europe. Methods This analysis was part of the COVID Medication (COMET) study, an international, multicentre, retrospective observational cohort study in 63 hospitals in 11 countries in Europe. Eligible patients were aged 18 years and older, had been admitted to hospital, and either tested positive by PCR for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) or were judged to have a high clinical likelihood of having SARS-CoV-2 infection by the local COVID-19 expert team. CFS was used to assess level of frailty: fit (CFS 1-3), mildly frail (CFS 4-5), or frail (CFS 6-9). The primary outcome was hospital mortality. The secondary outcome was admission to intensive care. Data were analysed using a multivariable binary logistic regression model adjusted for covariates (age, sex, number of drugs prescribed, and type of drug class as a proxy for comorbidities). Findings Between March 30 and July 15, 2020, 2434 patients (median age 68 years [IQR 55-77]; 1480 [61%] men, 954 [39%] women) had CFS scores available and were included in the analyses.
In the total sample and in patients aged 65 years and older, frail patients and mildly frail patients had a significantly higher risk of hospital mortality than fit patients (total sample: CFS 6-9 vs CFS 1-3 odds ratio [OR] 2·71 [95% CI 2·04-3·60], p<0·0001 and CFS 4-5 vs CFS 1-3 OR 1·54 [1·16-2·06], p=0·0030; age ≥65 years: CFS 6-9 vs CFS 1-3 OR 2·90 [2·12-3·97], p<0·0001 and CFS 4-5 vs CFS 1-3 OR 1·64 [1·20-2·25], p=0·0020). In patients younger than 65 years, an increased hospital mortality risk was only observed in frail patients (CFS 6-9 vs CFS 1-3 OR 2·22 [1·08-4·57], p=0·030; CFS 4-5 vs CFS 1-3 OR 1·08 [0·48-2·39], p=0·86). Frail patients had a higher incidence of admission to intensive care than fit patients (CFS 6-9 vs CFS 1-3 OR 1·54 [1·21-1·97], p=0·0010), whereas mildly frail patients had a lower incidence than fit patients (CFS 4-5 vs CFS 1-3 OR 0·71 [0·55-0·92], p=0·0090). Among patients younger than 65 years, frail patients had an increased incidence of admission to intensive care (CFS 6-9 vs CFS 1-3 OR 2·96 [1·98-4·43], p<0·0001), whereas mildly frail patients had no significant difference in incidence compared with fit patients (CFS 4-5 vs CFS 1-3 OR 0·93 [0·63-1·38], p=0·72). Among patients aged 65 years and older, frail patients had no significant difference in the incidence of admission to intensive care compared with fit patients (CFS 6-9 vs CFS 1-3 OR 1·27 [0·92-1·75], p=0·14), whereas mildly frail patients had a lower incide...
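The adjusted odds ratios above come from a multivariable logistic regression. As a minimal illustration of how an odds ratio and its 95% CI are derived, the sketch below computes an unadjusted OR from a hypothetical 2x2 table; the counts, the function name, and the omission of covariate adjustment are all assumptions for illustration, not study data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    # standard error of ln(OR) via the Woolf method
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (not study data): deaths among frail vs fit patients
or_, lo, hi = odds_ratio_ci(120, 180, 150, 610)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR 2.71 (95% CI 2.02-3.63)
```

Only the adjusted model used in the study accounts for age, sex and drug burden; this sketch shows the arithmetic behind a single crude OR.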
Background Supporting health care sector decisions using time-dependent endpoints (TDEs) such as time to progression (TTP), progression-free survival (PFS), and event-free survival (EFS) remains controversial. This study estimated the quantitative relationship between median TDE and median overall survival (OS) in multiple myeloma (MM) patients. Methods Studies (excluding allogeneic transplantation) published from 1970 to 2011 were systematically searched (PubMed). The nonparametric Spearman's rank correlation coefficient measured the association between median TDE and OS. The quantitative relationship between TDEs and OS was estimated with a two-step approach to a simultaneous Tobit model. Results We identified 153 studies: 230 treatment arms, 22,696 patients and a mean study duration of 3.8 years. The mean of median TDEs was 22.5 months and median OS was 39.1 months. Correlation coefficients of median TTP, PFS, and EFS with median OS were 0.51 (P = 0.003), 0.75 (P < 0.0001), and 0.84 (P < 0.0001), respectively. We estimate a 2.5 month (95% confidence interval, 1.7-3.2) increase in median OS for each additional month reported for median TDEs. There was no evidence that this relationship differed by type of surrogate. Conclusion TDEs predict OS in MM patients; this relationship may be valuable in clinical trial design, drug comparisons, and economic evaluation.
Objectives Since the outbreak of severe acute respiratory syndrome coronavirus-2 (SARS-CoV-2), the pressure to minimise its impact on public health has led to the implementation of different therapeutic strategies, the efficacy of which for the treatment of coronavirus disease 2019 (COVID-19) was unknown at the time. Remdesivir (REM) was granted its first conditional marketing authorisation in the EU in June 2020. The European Medicines Agency (EMA) and local health authorities all across the EU have since strongly recommended the implementation of pharmacovigilance activities aimed at further evaluating the safety of this new drug. The objective of this study was to evaluate adverse drug reactions (ADRs) attributed to either REM or hydroxychloroquine (HCQ) in patients hospitalised for COVID-19 in Centro Hospitalar de Lisboa Ocidental, a Portuguese hospital centre based in Lisbon. We present the preliminary results reporting plausible adverse effects of either HCQ or REM. Methods An observational cohort study was carried out between 16 March and 15 August 2020. Participants were divided into two cohorts: those prescribed an HCQ regimen, and those prescribed REM. Suspected ADRs were identified using an active monitoring model and reported to the Portuguese Pharmacovigilance System through its online notification tool. The ADR cumulative incidence was compared between the two cohorts. Results The study included 149 patients, of whom 101 were treated with HCQ and the remaining 48 with REM. The baseline characteristics were similar between the two cohorts. A total of 102 ADRs were identified during the study period, with a greater incidence in the HCQ cohort compared with the REM cohort (47.5% vs 12.5%; p<0.001). Causality was assessed in 81 ADRs, all of which were considered possible. Conclusions Real-world data are crucial to further establish the safety profile for REM. HCQ is no longer recommended for the treatment of COVID-19.
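The incidence comparison above (47.5% vs 12.5%; p<0.001) can be reproduced in outline with a Pearson chi-square test on the 2x2 table implied by the reported percentages (48/101 HCQ vs 6/48 REM patients with at least one ADR). The sketch below is an illustration in Python, not the authors' analysis code, and the function name is mine:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for a 2x2
    table with cells a, b (row 1) and c, d (row 2)."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Counts implied by the reported incidences: 48/101 (HCQ) vs 6/48 (REM)
stat = chi_square_2x2(48, 53, 6, 42)
print(f"chi-square = {stat:.2f}")  # well above 10.83, the p=0.001 critical value at 1 df
```

A statistic this far past the 1-degree-of-freedom critical value is consistent with the reported p<0.001.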
Introduction Low-level viraemia (LLV) is observed in some patients with HIV-1 infection on stable antiretroviral therapy (ART). The significance of this finding remains controversial, as it conflicts with the traditionally expected optimal clinical outcome. This study aims to evaluate the effect of LLV on the development of virological failure (VF) and immune deterioration. Methods Retrospective observational study of a cohort of HIV-1-infected patients of an Infectious Diseases Clinic who presented an HIV-1 viral load of 20 to 200 cp/mL during the year 2012. Patients who were not on ART or were non-adherent in the previous 6 months were excluded. Compliance was quantified from clinical and pharmacy records. Adherence was defined as a compliance rate of ≥95%. Demographic, clinical, immunological and therapeutic data were collected from clinical records. LLV was defined as a viral load of 20-200 cp/mL and stratified as transient (T-LLV): only one measurement; persistent (P-LLV): 2 consecutive measurements with an interval ≥3 months; and recurrent (R-LLV): ≥1 T-LLV during an 18-month follow-up. Statistical analysis was performed with Microsoft Office® Excel 2012. The Kolmogorov-Smirnov test, t-test and chi-square test were performed, with significance defined as p<0.05. Results During 2012, 2161 HIV-1-infected patients were evaluated at our Clinic, 93% of whom were on ART. LLV was documented in 378 (19%), and adherence was verified in 151 (52%). The analysis of this cohort (n=151) revealed: 77 (51%) T-LLV, 13 (8.6%) R-LLV and 61 (40%) P-LLV. Mean viral load was 46 cp/mL. Mean CD4+ T-cell count was 665 cells/µL, with a variation of +63 cells/µL during the study period. No VF was documented. ART regimens were switched in 16 (11%) patients. Gastrointestinal disturbance was found in 13 (9%).
Analysis showed no statistically significant differences between groups (T-LLV, R-LLV, P-LLV) for the analysed variables (CD4 variation, time since diagnosis and on treatment, duration of LLV persistence (less than vs more than one year), number of ART regimens, ART regimen, and type of NRTI backbone), except for mean viral load, which was significantly lower in T-LLV (38 cp/mL) and R-LLV (36 cp/mL) than in P-LLV (58 cp/mL) (p=0.01 and p<0.01, respectively). Conclusions The absence of significant differences in immunological and virological outcomes in this cohort and the absence of VF in all groups suggest a limited impact of LLV on patients' prognosis. Prospective studies with longer follow-up could provide more accurate information.
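As a sketch of how the T-LLV/P-LLV/R-LLV stratification defined above might be operationalised, the following Python function classifies a patient's series of (month, viral load) measurements. Treating "recurrent" as more than one separate run of in-range measurements is only one possible interpretation of the definitions, and the function and parameter names are assumptions:

```python
def classify_llv(series, lo=20, hi=200):
    """Classify a patient's low-level viraemia (LLV) pattern from a list of
    (month, copies_per_mL) measurements. One interpretation (an assumption)
    of the definitions above:
      'persistent' - two consecutive LLV measurements >= 3 months apart
      'recurrent'  - more than one separate run of LLV measurements
      'transient'  - a single LLV episode
      'none'       - no measurement in the 20-200 cp/mL range
    """
    pts = sorted(series)
    months = [m for m, _ in pts]
    flags = [lo <= vl <= hi for _, vl in pts]
    # persistent: consecutive in-range measurements spanning >= 3 months
    for i in range(len(flags) - 1):
        if flags[i] and flags[i + 1] and months[i + 1] - months[i] >= 3:
            return "persistent"
    # count separate episodes (runs of consecutive in-range measurements)
    episodes, prev = 0, False
    for f in flags:
        if f and not prev:
            episodes += 1
        prev = f
    if episodes == 0:
        return "none"
    return "recurrent" if episodes > 1 else "transient"

print(classify_llv([(0, 150), (4, 180)]))          # persistent
print(classify_llv([(0, 150), (2, 15), (8, 90)]))  # recurrent
print(classify_llv([(0, 60), (6, 10)]))            # transient
```

Real study definitions would also need to handle the 18-month follow-up window and suppressed measurements between episodes, which this sketch deliberately leaves out.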
Despite the increasing optimisation of combined antiretroviral therapy (cART) regimens in recent decades, a significant percentage of patients still do not achieve control of viral replication. We present a retrospective analysis of a human immunodeficiency virus (HIV)-infected population on cART, followed at our ambulatory care clinic between 1st January and 31st December 2011, aimed at identifying the causes of virological failure. Of the 1895 patients in our population, 1854 were included in the study. Ten percent (187) of the included patients had detectable HIV RNA (≥40 cp/mL) at the time of the last laboratory evaluation: 70.1% were male, mean age was 46 years and 72.7% were Portuguese. Patients with detectable HIV RNA were divided into group A (HIV RNA <200 cp/mL), 78 (41.7%) patients, and group B (HIV RNA ≥200 cp/mL), 109 (58.3%) patients. Comparison of the two groups revealed a higher mean CD4+ T-cell count in group A (568 vs 334 cells/mm3; p<0.001), despite similar mean CD4+ T-cell counts at the time of cART initiation (276 vs 262 cells/mm3; p=0.412). Group A patients had experienced longer exposure to cART (10 vs 8 years; p<0.05) and had undergone, on average, 3 previous regimens (p<0.05). With regard to the current cART regimen, 32.1% of patients in group A and 30.3% in group B were prescribed regimens based on non-nucleoside reverse transcriptase inhibitors, and 51.3% of patients in group A and 59.6% in group B were on protease inhibitor-based regimens. The identified causes of virological failure in patients with detectable HIV RNA were: poor adherence (54%); unsuccessful retention in care (14.4%); sporadic detectable HIV RNA (40≤viral load<200 cp/mL), "blips" (14.4%); resistance mutations to antiretrovirals (13.4%); intolerance to the current regimen (2.1%); and pharmacokinetic drug interactions (1.6%). The estimated rate of virological failure in this population was 10.1%.
Insufficient adherence and unsuccessful retention in care were identified as the main causes of virological failure, accounting for 68.4% of patients with treatment failure. Failure of therapy due to intolerance or adverse effects was reported in 2.1% of cases, reflecting the better safety and tolerability profile of recently prescribed regimens. Early identification of the causes of virological failure, timely adjustment of therapeutic regimens, and the adoption of measures to promote adherence and retention in care are key factors for the successful treatment of HIV-infected patients.
Background: Time to progression (TTP), progression-free survival (PFS) and event-free survival (EFS) are common surrogates in clinical cancer investigation and acceptable endpoints informing decisions about the approval and financing of new drugs. Our aim was to estimate a quantitative relationship between median TTP, PFS and EFS and median overall survival (mOS) in multiple myeloma (MM) from data of prospective (experimental or observational) studies published in the literature. Methods: Studies published in English, Spanish or Portuguese between 1970 and 2007 were systematically searched on PubMed using the keywords: progression, event-free, survival, multiple myeloma, clinical trial and observational study. All types of treatments were considered with the exception of allogeneic transplantation. The non-parametric Spearman's rank correlation coefficient was used as a measure of correlation between median values of the surrogates and median values of OS. The quantitative relationship between surrogates and OS was estimated with a two-step approach to a simultaneous Tobit model. Study arms not reaching median OS were included as censored observations. First, the endogenous variable (TTP/PFS/EFS) was regressed on the instrumental variable (overall survival at 12 months) and on the exogenous variables (median age, percentage of females, year of publication, type of surrogate, and the type of patients included in the trials, naïve vs non-naïve) using the Generalized Method of Moments (Cragg estimator) to obtain a consistent estimate of the residuals. Second, the censored normal model was estimated by maximum likelihood, including the estimated residuals as an additional regressor. Estimation was weighted by the number of patients enrolled in each study arm, and estimators' variances were corrected for both endogeneity and heteroskedasticity.
Results: Of the 845 studies reviewed, 128 were included, containing a total of 190 arms: 34 reported TTP, 71 reported EFS, 85 reported PFS, and 142 reported mOS. The mean duration of these studies was 4.1 years. Overall, the sample comprised 17,163 patients, 56% male; mean age (mean of medians) was 60 years, with 52% of arms including only naïve patients. The mean of median TTP/PFS/EFS was 23 (SD=16) months and mOS was 39 (SD=19) months. The correlation coefficients of median TTP, PFS and EFS with mOS were 0.48 (p=0.01), 0.74 (p<0.0001), and 0.84 (p<0.0001), respectively. The model estimates a 1.9 month (95% CI 1.5-2.2) increase in median overall survival for each additional month reported for the median time-dependent surrogate. There was no evidence that this relationship differs by the type of surrogate when controlling for age, gender, type of patients included in the trials (naïve vs non-naïve) and year of publication. Conclusions: The analysis confirms the value of time-dependent surrogates (TTP/PFS/EFS) in predicting overall survival in patients with multiple myeloma. The quantitative relationship presented is of particular value to inform clinical trial design and to support decisions on the approval and financing of new multiple myeloma drugs.
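Spearman's rank correlation, used above to relate the median surrogates to mOS, is the Pearson correlation computed on ranks. The sketch below is a plain-Python illustration with made-up data; the function names are mine, and the two-step Tobit model itself is not reproduced here:

```python
def ranks(xs):
    """Average 1-based ranks, with tied values sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1  # extend over a run of ties
        avg = (i + j) / 2 + 1  # mean of positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical per-arm medians (months), not study data
pfs = [12, 18, 24, 30, 36]
mos = [20, 34, 30, 50, 58]
print(f"rho = {spearman_rho(pfs, mos):.2f}")  # rho = 0.90
```

Because only ranks matter, rho is robust to the skewed survival-time distributions typical of such pooled study arms, which is presumably why it was preferred to Pearson's r here.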
Since the introduction of more tolerable and less toxic combined antiretroviral (ARV) therapy, most drug-naïve HIV patients achieve viral suppression and immunological recovery, together with fewer AIDS-related events. Nevertheless, drug switches are still frequent, both as a means of managing adherence and toxicity and as a response to virological or immunological failure. The aim of the study was to analyse the number, timing and causes of modifications of the first ARV regimen, in order to shed light on adverse drug reactions under current HIV ARV therapy guidelines, indirect signs of adherence, and premature virological failure. Methods Non-controlled, observational, retrospective study, based on clinical files and on a national questionnaire audit of all treatment-naïve patients who began cART between January 2007 and March 2010, followed in an Infectious Diseases Clinic in a central hospital in Lisbon. SPSS 15.0 was used for statistical analysis. Results During the study period, 69 of the 285 naïve patients who started ARV therapy changed their regimen; 64% were male, with a median age of 43 years. A significant group was born in Portuguese-speaking African countries (30%). Most switches occurred in the first 6 months (n=42), 22% in the first month and just 11% after one year of treatment, with more than one modification in 15% of patients. The drug regimen prior to modification included an NNRTI in 62% of the patients. The backbone regimen included TDF/FTC in 66%, ABC/3TC in 12% and AZT/3TC in the remaining 22%. Adverse drug reactions were the most frequent cause of therapy modification (59%), comprising toxicity in 19 cases and intolerability in 22, reflecting the known side effects of the drugs. Other causes of switching were evidence of virological failure (15%), simplification of the regimen (10%) and adjustment during pregnancy (5%). About a fifth of the patients had adherence irregularities.
The rate of viral suppression at week 24 of ARV therapy was significantly lower in the group of patients who switched ARV (39% vs 60%; p=0.006), a group that also showed worse immunological recovery at 24 and 48 weeks. Conclusions Toxicity and intolerability remain the main reasons for changing the first ARV regimen, most frequently during the first six months of therapy, which reinforces the need to evaluate early events that can compromise adherence, favour the emergence of resistance, and affect the long-term toxicity outcomes and longevity of these patients.