IMPORTANCE Hospitalized patients with COVID-19 are at risk for venous and arterial thromboembolism and death. Optimal thromboprophylaxis dosing in high-risk patients is unknown.

OBJECTIVE To evaluate the effects of therapeutic-dose low-molecular-weight heparin (LMWH) vs institutional standard prophylactic or intermediate-dose heparins for thromboprophylaxis in high-risk hospitalized patients with COVID-19.

DESIGN, SETTING, AND PARTICIPANTS The HEP-COVID multicenter randomized clinical trial recruited hospitalized adult patients with COVID-19 with D-dimer levels more than 4 times the upper limit of normal or a sepsis-induced coagulopathy score of 4 or greater from May 8, 2020, through May 14, 2021, at 12 academic centers in the US.

INTERVENTIONS Patients were randomized to institutional standard prophylactic or intermediate-dose LMWH or unfractionated heparin vs therapeutic-dose enoxaparin, 1 mg/kg subcutaneous, twice daily if creatinine clearance was 30 mL/min/1.73 m² or greater (0.5 mg/kg twice daily if creatinine clearance was 15-29 mL/min/1.73 m²) throughout hospitalization. Patients were stratified at the time of randomization based on intensive care unit (ICU) or non-ICU status.

MAIN OUTCOMES AND MEASURES The primary efficacy outcome was venous thromboembolism (VTE), arterial thromboembolism (ATE), or death from any cause, and the principal safety outcome was major bleeding at 30 ± 2 days. Data were collected and adjudicated locally by blinded investigators via imaging, laboratory, and health record data.

RESULTS Of 257 patients randomized, 253 were included in the analysis (mean [SD] age, 66.7 [14.0] years; men, 136 [53.8%]; women, 117 [46.2%]); 249 patients (98.4%) met inclusion criteria based on D-dimer elevation, and 83 patients (32.8%) were stratified as receiving ICU-level care. There were 124 patients (49%) in the standard-dose group vs 129 patients (51%) in the therapeutic-dose group.
The primary efficacy outcome was met in 52 of 124 patients (41.9%) (28.2% VTE, 3.2% ATE, 25.0% death) with standard-dose heparins vs 37 of 129 patients (28.7%) (11.7% VTE, 3.2% ATE, 19.4% death) with therapeutic-dose LMWH (relative risk [RR], 0.68; 95% CI, 0.49-0.96; P = .03), including a reduction in thromboembolism (29.0% vs 10.9%; RR, 0.37; 95% CI, 0.21-0.66; P < .001). The incidence of major bleeding was 1.6% with standard-dose vs 4.7% with therapeutic-dose heparins (RR, 2.88; 95% CI, 0.59-14.02; P = .17). The primary efficacy outcome was reduced in non-ICU patients (36.1% vs 16.7%; RR, 0.46; 95% CI, 0.27-0.81; P = .004) but not in ICU patients (55.3% vs 51.1%; RR, 0.92; 95% CI, 0.62-1.39; P = .71).

CONCLUSIONS AND RELEVANCE In this randomized clinical trial, therapeutic-dose LMWH reduced major thromboembolism and death compared with institutional standard heparin thromboprophylaxis among inpatients with COVID-19 with very elevated D-dimer levels. The treatment effect was not seen in ICU patients.
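The headline relative risk and confidence interval can be reproduced from the counts reported above (37/129 therapeutic-dose vs 52/124 standard-dose). A minimal sketch using the standard Wald interval on the log-RR scale; this is an illustration, not the trial's actual statistical code:

```python
from math import log, exp, sqrt

def relative_risk(events_trt, n_trt, events_ctl, n_ctl, z=1.96):
    """Relative risk of treatment vs control with a 95% Wald CI on the log scale."""
    rr = (events_trt / n_trt) / (events_ctl / n_ctl)
    # Standard error of log(RR) for two independent binomial proportions
    se = sqrt(1 / events_trt - 1 / n_trt + 1 / events_ctl - 1 / n_ctl)
    lo, hi = exp(log(rr) - z * se), exp(log(rr) + z * se)
    return rr, lo, hi

# Primary outcome counts reported in the abstract
rr, lo, hi = relative_risk(37, 129, 52, 124)
print(f"RR {rr:.2f} (95% CI, {lo:.2f}-{hi:.2f})")  # RR 0.68 (95% CI, 0.49-0.96)
```

The rounded values match the published RR of 0.68 (95% CI, 0.49-0.96).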
We discovered that a shift between the states of tumorigenicity and dormancy in human carcinoma (HEp3) is attained through regulation of the balance between two classical mitogen-activated protein kinase (MAPK) signaling pathways, the mitogenic extracellular regulated kinase (ERK) and the apoptotic/growth-suppressive stress-activated protein kinase 2 (p38MAPK), and that urokinase plasminogen activator receptor (uPAR) is an important regulator of these events. This is a novel function for uPAR whereby, when expressed at high levels, it enters into frequent, activating interactions with the α5β1-integrin, which facilitates the formation of insoluble fibronectin (FN) fibrils. Activation of α5β1-integrin by uPAR generates the persistently high level of active ERK necessary for tumor growth in vivo. Our results show that ERK activation is generated through a convergence of two pathways: a positive signal through uPAR-activated α5β1, which activates ERK, and a signal generated by the presence of FN fibrils that suppresses p38 activity. When fibrils are removed or their assembly is blocked, p38 activity increases. Low-uPAR derivatives of HEp3 cells, which are growth arrested (dormant) in vivo, have a high p38/ERK activity ratio, but in spite of a similar level of α5β1-integrin, they do not assemble FN fibrils. However, when p38 activity is inhibited by pharmacological (SB203580) or genetic (dominant-negative p38) approaches, their ERK becomes activated, uPAR is overexpressed, α5β1-integrins are activated, and dormancy is interrupted. Restoration of these properties in dormant cells can be mimicked by direct re-expression of uPAR through transfection with a uPAR-coding plasmid. We conclude that overexpression of uPAR and its interaction with the integrin are responsible for generating two feedback loops: one increases the ERK activity, which feeds back by increasing the expression of uPAR.
The second loop, through the presence of FN fibrils, suppresses p38 activity, further increasing ERK activity. Together these results indicate that uPAR and its interaction with the integrin should be considered important targets for induction of tumor dormancy.
Global climate change is predicted to alter the distribution and dynamics of soil-transmitted helminth infections, and yet host immunity can also influence the impact of warming on host-parasite interactions and mitigate the long-term effects. We used time-series data from two helminth species of a natural herbivore and investigated the contribution of climate change and immunity to the long-term and seasonal dynamics of infection. We provide evidence that climate warming increases the availability of infective stages of both helminth species and proportionally increases the intensity of infection for the helminth not regulated by immunity. In contrast, there is no significant long-term positive trend in intensity for the immune-controlled helminth, as immunity reduces the net outcome of climate on parasite dynamics. Even so, hosts experienced higher infections of this helminth at an earlier age during critical months in the warmer years. Immunity can alleviate the expected long-term effect of climate on parasite infections but can also shift the seasonal peak of infection toward younger individuals.

Keywords: long-term climate warming | seasonality | host-parasite interaction | gastrointestinal helminths | European rabbit

The marked progression of climate warming and the intensification of extreme climatic events have been implicated in the increased prevalence of infections, epidemic outbreaks, and the geographical shifting of endemic foci of infections (1-3). Experimental manipulations of vectors and infective stages have shown that warming, coupled with increased variability in temperature, can influence the vital rates of the parasite and the immunophysiological characteristics of the host (4-9).
Predictive models of infection dynamics have reinforced the importance of these findings by showing that the risk of infection and transmission is likely to increase with the projected temperature changes (10-15) and synchronous, unpredicted weather events (16), although exceptions have been reported (17-19). Despite these findings, the extent to which climate disruption is in fact affecting parasite infections in natural systems is far from clear, as nonlinear effects in the parasite-host relationship and confounding variables can be difficult to disentangle. The way climate change modifies the development and survival of vectors and infective stages, and the consequences for the contact rate between infective and susceptible individuals, has been the central focus of many of the predictions and trials. However, this approach ignores the contribution of variability in the response of the hosts, which we expect to be critical in regulating parasite abundance and changes in the transmission rate. Given that immunity to infections is an important source of variation among individuals (20-23), if climate increases exposure to parasites, do we expect a proportional increase in the intensity of infection in hosts that mount an immune response against these parasites? In other words, how does variatio...
When rate-responsive pacing using a CRT device is achieved in patients with advanced CHF and severe chronotropic incompetence, parameters of aerobic exercise performance improve acutely. Routine exercise testing to ensure successful restoration of heart rate response may be beneficial to optimize CRT settings in this patient population.
Past analysis has shown that the population dynamics of Alpine ibex Capra ibex ibex are regulated by both population density and winter snow accumulation. However, recent time series of the ibex counts in the Gran Paradiso National Park, Italy, show interesting trends in comparison with historical snow data: while the winter snow depth has steadily decreased since the beginning of the 1980s, the ibex population experienced rapid growth during the 1980s and the early 1990s, followed by a strong decrease. To explain these dynamics, we built novel age-structured population models in which demographic parameters depended on density and snow depth. They included a non-monotonic effect of snow depth and density on the vital rates, the age and sex structure of the population, and spatial segregation between females and males. Using information criteria (AICc, BIC, and SRM), we selected the best models and found that: 1) snow and density interacted in determining the demography of all population sex and age classes, thus confirming that unfavourable climatic conditions intensified the density dependence of the population, 2) the effect of snow was non-monotonic on weaning success and on the rate of demographic variation of kids, which were maximal for intermediate snow depths, and 3) accounting for spatial segregation between sexes improved the fitting of the models, which suggests that the different use of space made by males and females influenced intraspecific competition. When the selected models were recalibrated using past data and used to simulate recent trends, they underestimated both the rapid growth of the 1980s-1990s and the recent decline of the population. Using the novel sex- and age-structured models, we found that the underestimation of the peak was mainly due to deficiencies of the adult demography models, while the overestimation of the recent population abundance was caused by shortcomings in the models of recruitment.
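The model-selection step described above compares candidate demographic models with small-sample information criteria. A minimal sketch of AICc-based comparison; the log-likelihoods, parameter counts, and model names below are hypothetical illustrations, not the paper's fitted values:

```python
def aicc(log_lik, k, n):
    """Corrected Akaike information criterion: AIC plus a small-sample penalty
    that grows as the parameter count k approaches the sample size n."""
    aic = 2 * k - 2 * log_lik
    return aic + (2 * k * (k + 1)) / (n - k - 1)

# Hypothetical comparison: a snow-by-density interaction model (k=6 parameters)
# vs an additive model (k=4) fitted to n=40 annual counts; lower AICc is preferred.
models = {
    "snow x density": aicc(-85.2, 6, 40),
    "snow + density": aicc(-89.0, 4, 40),
}
best = min(models, key=models.get)
```

With these illustrative numbers the interaction model wins despite its extra parameters, mirroring the paper's finding that snow and density interact.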
Coronavirus disease-2019 (COVID-19) has been associated with significant risk of venous thromboembolism (VTE), arterial thromboembolism (ATE), and mortality, particularly among hospitalized patients with critical illness and elevated D-dimer (Dd) levels. Conflicting data have yet to elucidate optimal thromboprophylaxis dosing. HEP-COVID (NCT04401293) is a Phase 3, multicenter, pragmatic, prospective, randomized, pseudo-blinded, active control trial to evaluate the efficacy and safety of therapeutic-dose low molecular weight heparin (LMWH) versus prophylactic-/intermediate-dose LMWH or unfractionated heparin (UFH) for prevention of a primary efficacy composite outcome of VTE, ATE, and all-cause mortality (ACM) 30 ± 2 days post-enrollment. Eligible patients have a COVID-19 diagnosis by nasal swab or serologic testing, a requirement for supplemental oxygen per investigator judgment, and Dd >4x upper limit of normal (ULN) or sepsis-induced coagulopathy (SIC) score ≥4. Subjects are randomized to enoxaparin 1 mg/kg SQ/BID (CrCl ≥30 mL/min) or 0.5 mg/kg (CrCl 15-30 mL/min) vs local institutional prophylactic regimens including: a) UFH up to 22,500 IU daily (divided BID or TID), b) enoxaparin 30 mg and 40 mg SQ QD or BID, or c) dalteparin 2500 IU or 5000 IU QD. The principal safety outcome is major bleeding. Events are adjudicated locally. Based on an expected 40% relative risk reduction (RRR) with therapeutic-dose compared with prophylactic-dose regimens, 308 subjects will be enrolled (assuming 20% drop-out) to achieve 80% power. Distinguishing design features include an enriched population for the composite endpoint anchored on Dd >4x ULN, stratification by ICU vs non-ICU status, and the ability to capture asymptomatic proximal deep venous thrombosis via screening ultrasonography prior to discharge.
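The sample-size reasoning above (40% RRR, 80% power, 20% drop-out) follows the standard two-proportion formula. A sketch under an assumed 40% control event rate; that rate is an illustration only, the excerpt does not state the protocol's actual assumptions, so the result will not exactly reproduce the 308-subject target:

```python
from math import ceil
from statistics import NormalDist

def n_per_arm(p_ctl, rrr, alpha=0.05, power=0.80):
    """Per-arm sample size for comparing two proportions (normal approximation)."""
    p_trt = p_ctl * (1 - rrr)  # treatment rate after the relative risk reduction
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    var = p_ctl * (1 - p_ctl) + p_trt * (1 - p_trt)
    return ceil(z ** 2 * var / (p_ctl - p_trt) ** 2)

# Assumed 40% control event rate (hypothetical input, not from the protocol)
n = n_per_arm(p_ctl=0.40, rrr=0.40)   # 130 per arm
total = ceil(2 * n / (1 - 0.20))      # inflate both arms for 20% drop-out -> 325
```

Under these illustrative inputs the formula gives roughly 325 subjects, the same order as the protocol's 308; the difference reflects whatever event-rate and variance assumptions the investigators actually used.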
Background- Chronotropic incompetence (CI) is defined as the inability to reach 80% of heart rate (HR) reserve or 80% of the maximally predicted HR during exercise. The presence of chronotropic incompetence is associated with reduced peak oxygen consumption, and rate-responsive pacing therapy is under investigation to improve exercise capacity in heart failure (HF). However, uncertainty exists about whether chronotropic incompetence and reduced exercise tolerance in HF are attributable to β-blockade.

Methods and Results- Subjects with HF receiving long-term β-blocker therapy underwent cardiopulmonary exercise tolerance testing under 2 conditions in random sequence: (1) after a 27-hour washout period (Off-BB) and (2) 3 hours after β-blocker ingestion (On-BB). Norepinephrine levels were drawn at rest and at peak exercise. β1-response to norepinephrine was assessed using the chronotropic responsiveness index: ΔHR/Δlog norepinephrine. Nineteen patients with systolic HF (left ventricular ejection fraction, 22.8±7.7%) were enrolled. Mean age was 49.4±12.3 years. Average carvedilol-equivalent dose was 29.1±17.0 mg daily. Peak HR off/on β-blockers was 62.7±18.7% and 51.4±18.2% of HR reserve (P<0.01) and 79.1±11.0% and 70.3±12.3% of maximally predicted HR (P<0.01). For the Off-BB and On-BB conditions, the respiratory exchange ratios were 1.05±0.06 and 1.05±0.10 (P=0.77), respectively, confirming maximal and near-identical effort in both conditions. Peak oxygen consumption was 16.6±3.34 and 15.9±3.31 mL/kg/min (P=0.03), and the chronotropic responsiveness index was 19.3±7.2 and 16.2±7.1 (P=0.18).

Conclusions- Acute β-blocker cessation does not normalize the chronotropic response to exercise in patients with advanced HF and chronotropic incompetence. (Hirsh et al. Circ Heart Fail. 2012;5:560-565. © 2012 American Heart Association, Inc.)

Key Words: heart failure ◼ chronotropic incompetence ◼ β-blocker ◼ heart rate ◼ exercise

Methods (excerpt): All patients with systolic HF, New York Heart Association class II to IV, receiving β-blocker therapy (bisoprolol, carvedilol, or metoprolol) for ≥3 months and referred for cardiopulmonary exercise tolerance testing (CPETT) were screened. All patients were followed at the Columbia University Medical Center Heart Failure Center by HF specialists. β-Blockers were routinely uptitrated by physicians or nurse practitioners unless limited by blood pressure, bradycardia, fatigue, or other symptoms possibly attributable to β-blockers. Patients were excluded if they had any of the following: atrial fibrillation or atrial flutter, inability to exercise, hospital admission for HF or acute coronary syndrome in the past 90 days or symptoms of myocardial ischemia, or inability to undergo a treadmill exercise test (eg, severe obstructive pulmonary disease or severe osteoarthritis). CI was not a factor in enrollment. Carvedilol-equivalent dose was calculated for patients treated with bisoprolol and metoprolol based on the equivalence ratios established by the Metoprolol ...
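The two exercise metrics used in this study, percent of HR reserve and the chronotropic responsiveness index (ΔHR/Δlog norepinephrine), are simple ratios. A sketch with hypothetical patient values, assuming the common "220 - age" convention for maximally predicted HR (the study's exact convention is not stated in this excerpt):

```python
from math import log10

def pct_hr_reserve(hr_peak, hr_rest, age):
    """Percent of heart-rate reserve achieved; max predicted HR taken as 220 - age."""
    return 100 * (hr_peak - hr_rest) / ((220 - age) - hr_rest)

def chronotropic_responsiveness(hr_rest, hr_peak, ne_rest, ne_peak):
    """Chronotropic responsiveness index: delta HR / delta log10(norepinephrine)."""
    return (hr_peak - hr_rest) / (log10(ne_peak) - log10(ne_rest))

# Hypothetical patient (illustrative values, not from the study):
reserve = pct_hr_reserve(hr_peak=130, hr_rest=70, age=50)  # 60.0% -> below the 80% CI cutoff
cri = chronotropic_responsiveness(hr_rest=70, hr_peak=130, ne_rest=300, ne_peak=3000)
```

By the 80%-of-HR-reserve criterion defined above, this hypothetical patient (60% of reserve) would be classified as chronotropically incompetent.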