Objective To estimate population health outcomes with a delayed second dose versus the standard schedule of SARS-CoV-2 mRNA vaccination.
Design Agent based modeling simulation study.
Setting Simulated population based on a real world US county.
Participants The simulation included 100 000 agents with a representative distribution of demographics and occupations. Networks of contacts were established to simulate potentially infectious interactions through occupational, household, and random contacts.
Interventions Simulation of standard covid-19 vaccination versus delayed second dose vaccination prioritizing the first dose. The simulation runs were replicated 10 times. Sensitivity analyses included first dose vaccine efficacy of 50%, 60%, 70%, 80%, and 90% after day 12 post-vaccination; vaccination rates of 0.1%, 0.3%, and 1% of the population per day; the assumption that the vaccine prevents only symptoms but not asymptomatic spread (that is, a non-sterilizing vaccine); and an alternative vaccination strategy that implements the delayed second dose for people under 65 years of age, but only after all those above this age have been vaccinated.
Main outcome measures Cumulative covid-19 mortality, cumulative SARS-CoV-2 infections, and cumulative hospital admissions due to covid-19 over 180 days.
Results Across all simulation replications, the median cumulative mortality per 100 000 for standard dosing versus delayed second dose was 226 v 179, 233 v 207, and 235 v 236 for first dose efficacies of 90%, 80%, and 70%, respectively. The delayed second dose strategy was optimal for vaccine efficacies at or above 80% and vaccination rates at or below 0.3% of the population per day, under both sterilizing and non-sterilizing vaccine assumptions, resulting in absolute reductions in cumulative mortality of between 26 and 47 per 100 000. The delayed second dose strategy for people under 65 performed consistently well under all vaccination rates tested.
Conclusions A delayed second dose vaccination strategy, at least for people aged under 65, could reduce cumulative mortality under certain conditions.
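The strategy comparison above can be made concrete with a small simulation. Below is a minimal, illustrative Python sketch of such an agent-based comparison, not the authors' model: the population is scaled down to 10 000 agents for speed, the occupational/household contact network is reduced to random mixing, the 12-day efficacy onset lag is omitted, and all rates (seed infections, transmissibility, infection fatality rate) are assumed values chosen only for demonstration.

```python
import random

POP = 10_000            # scaled down from the study's 100 000 agents for speed
DAYS = 180              # horizon used in the study
DOSES_PER_DAY = 30      # 0.3% of the population per day (one study setting)

def run(strategy, first_eff=0.8, second_eff=0.95, seed=0):
    """Compare 'standard' (second dose 21 days after the first) with
    'delayed' (every available dose goes to an unvaccinated agent first).
    Protection reduces susceptibility, i.e. the sterilizing-vaccine arm."""
    rng = random.Random(seed)
    state = [0] * POP            # 0 susceptible, 1 infectious, 2 removed
    protection = [0.0] * POP     # current per-agent vaccine efficacy
    doses = [0] * POP
    dose_day = [0] * POP
    for i in rng.sample(range(POP), 20):   # assumed number of seed infections
        state[i] = 1
    deaths = 0
    for day in range(DAYS):
        # --- build the vaccination queue according to the strategy ---
        if strategy == "standard":
            queue = [i for i in range(POP)
                     if doses[i] == 1 and day - dose_day[i] >= 21]
            queue += [i for i in range(POP) if doses[i] == 0]
        else:  # delayed second dose: first doses take absolute priority
            queue = [i for i in range(POP) if doses[i] == 0]
            queue += [i for i in range(POP) if doses[i] == 1]
        for i in queue[:DOSES_PER_DAY]:
            doses[i] += 1
            dose_day[i] = day
            protection[i] = first_eff if doses[i] == 1 else second_eff
        # --- transmission: random mixing, ~8 contacts per infectious agent ---
        infectious = [i for i in range(POP) if state[i] == 1]
        for i in infectious:
            for _ in range(8):
                j = rng.randrange(POP)
                if state[j] == 0 and rng.random() < 0.02 * (1 - protection[j]):
                    state[j] = 1
        # --- resolution: mean 10-day infectious period, assumed 0.5% IFR ---
        for i in infectious:
            if rng.random() < 0.1:
                if rng.random() < 0.005:
                    deaths += 1
                state[i] = 2
    return deaths

# One replication per arm (the study ran 10 and took the median):
print(run("standard"), run("delayed"))
```

Even this toy version reproduces the qualitative trade-off the study examines: with a high first dose efficacy and a slow rollout, spreading partial protection widely can outperform concentrating full protection in fewer people.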
Thrombocytopenia is the most common hematological abnormality encountered in patients with chronic liver disease (CLD). In addition to being an indicator of advanced disease and poor prognosis, it frequently prevents crucial interventions. Historically, thrombocytopenia has been attributed to hypersplenism: the increased pooling of platelets in a spleen enlarged by congestive splenomegaly secondary to portal hypertension. Over the past decade, however, significant advances in the understanding of thrombopoiesis have led to an improved understanding of thrombocytopenia in cirrhosis. Multiple factors contribute to the development of thrombocytopenia, and these can broadly be divided into those that cause decreased production, splenic sequestration, and increased destruction. Thrombopoietin, which regulates both platelet production and maturation, is depressed in CLD; together with direct bone marrow suppression, this results in a reduced rate of platelet production. Bone marrow suppression can be caused by viruses, alcohol, iron overload, and medications. Splenic sequestration results from hypersplenism. The increased rate of platelet destruction in cirrhosis also occurs through a number of pathways: increased shear stress, increased fibrinolysis, bacterial translocation, and infection result in an increased rate of platelet aggregation, while autoimmune disease and raised titers of antiplatelet immunoglobulin result in the immunologic destruction of platelets. An in-depth understanding of the complex pathophysiology of the thrombocytopenia of CLD is crucial when considering treatment strategies. This review outlines recent advances in our understanding of thrombocytopenia in cirrhosis and CLD.
Infection with hepatitis C virus (HCV) is a common cause of chronic liver disease, and HCV-related cirrhosis and hepatocellular carcinoma are the leading indications for liver transplantation in the Western world. Recurrent infection of the transplanted liver allograft is universal in patients with detectable HCV viremia at the time of transplant and can cause a spectrum of disease, ranging from asymptomatic chronic infection to an aggressive fibrosing cholestatic hepatitis. Recurrent HCV is more aggressive in the post-transplant population and is a leading cause of allograft loss, morbidity, and mortality. Historically, treatment of recurrent HCV has been limited by low rates of treatment success and high side effect profiles. Over the past few years, promising new therapies have emerged for the treatment of HCV that achieve high rates of sustained virological response without the need for interferon-based regimens. In addition to being highly effective, these treatments have higher rates of adherence and lower side effect profiles. The purpose of this review is to summarize current therapies for recurrent HCV infection, to review recent advances in therapy, and to highlight areas of ongoing research.
The left ventricular outflow tract (LVOT) velocity time integral (VTI) is an easily measured echocardiographic analog of the stroke volume index. Low values predict adverse outcomes in left ventricular failure. We postulated that a low LVOT VTI may signal right ventricular dysfunction in acute pulmonary embolism and therefore predict poor outcomes. We retrospectively reviewed echocardiograms obtained at the time of pulmonary embolism diagnosis for all Pulmonary Embolism Response Team activations at our institution. Low LVOT VTI was defined as ⩽ 15 cm. We examined two composite outcomes: (1) in-hospital death or cardiac arrest; and (2) shock or need for primary reperfusion therapies. Sixty-one of 188 patients (32%) had an LVOT VTI of ⩽ 15 cm. Low VTI was associated with in-hospital death or cardiac arrest (odds ratio (OR) 6, 95% CI 2, 17.9; p = 0.0014) and with shock or need for reperfusion (OR 23.3, 95% CI 6.6, 82.1; p < 0.0001). In a multivariable model, LVOT VTI ⩽ 15 cm remained significant for death or cardiac arrest (OR 3.48, 95% CI 1.02, 11.9; p = 0.047) and for shock or need for reperfusion (OR 8.12, 95% CI 1.62, 40.66; p = 0.011). Among intermediate–high-risk patients, low VTI was the only variable associated with the composite outcome of death, cardiac arrest, shock, or need for reperfusion (OR 14, 95% CI 1.7, 118.4; p = 0.015). LVOT VTI is associated with adverse short-term outcomes in acute pulmonary embolism. The VTI may help risk stratify patients with intermediate–high-risk pulmonary embolism.
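For readers wanting to reproduce this kind of estimate, the following is a minimal Python sketch of how an odds ratio and its Wald 95% confidence interval are computed from a 2×2 table. The counts in the usage line are hypothetical, invented for illustration; the abstract does not report the underlying contingency tables.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Wald 95% CI from a 2x2 table:
    a/b = events/non-events with low VTI (exposed),
    c/d = events/non-events with normal VTI (unexposed)."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # standard error of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical counts for illustration only:
print(odds_ratio_ci(12, 49, 6, 121))   # -> OR ~4.9 with its 95% CI bounds
```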
Objectives To characterise the variation in composition, leadership, and activation criteria of rapid response and cardiac arrest teams in five north-eastern states of the USA.
Design Cross-sectional study consisting of a voluntary 46-question survey of acute care hospitals in north-eastern USA.
Setting Acute care hospitals in New York, New Jersey, Rhode Island, Vermont, and Pennsylvania.
Participants Surveys were completed by any member of the rapid response team (RRT) with a working knowledge of team composition and function. Participating hospitals were all Medicare-participating acute care hospitals, including teaching and community hospitals as well as hospitals from rural, urban and suburban areas.
Results Out of 378 hospitals, contacts were identified for 303, and 107 surveys were completed. All but two hospitals had an RRT, 70% of which changed members daily. The most common activation criteria were clinical concern (95%), single vital sign abnormalities (77%) and early warning score (59%). Eighty-one per cent of hospitals had a dedicated cardiac arrest team. RRT composition varied widely, with respiratory therapists, critical care nurses, physicians and nurse managers being the most likely to attend (89%, 78%, 64% and 51%, respectively). Consistent presence of critical care physicians was uncommon, and both cardiac arrest teams and RRTs were frequently led by trainee physicians, often without senior supervision.
Conclusions In the largest study to date in the USA, we demonstrated wide heterogeneity, rapid team turnover and a lack of senior supervision of RRTs and cardiac arrest teams. These factors likely contribute to the mixed results seen in studies of RRTs.
Although uncommon, acute pancreatitis is a well-recognized and generally serious complication of liver transplantation. In addition to being more prevalent in liver transplant recipients than in the general population, it follows a more aggressive course and can be responsible for significant morbidity and mortality. The post-liver transplant population has altered anatomy and increased comorbidities, and requires a myriad of drugs; these characteristics distinguish it from the pre-transplant population. Despite their retrospective nature, prior studies have identified numerous etiological factors associated with an increased risk of acute pancreatitis following liver transplantation. These can be broadly classified into four categories: surgical and anatomical factors, infections, post-transplant management, and post-transplant complications. The aim of this systematic review is to assimilate the available information on acute pancreatitis following adult liver transplantation, to describe the risk factors and natural history of the disease, and to highlight possible areas for further investigation.