T. G. K. Bentley, R. M. Effros, K. Palar, and E. B. Keeler
Context: Health care costs in the United States are much higher than those in industrialized countries with similar or better health system performance. Wasteful spending has many undesirable consequences that could be alleviated through waste reduction. This article proposes a conceptual framework to guide researchers and policymakers in evaluating waste, implementing waste-reduction strategies, and reducing the burden of unnecessary health care spending.
Methods: This article divides health care waste into administrative, operational, and clinical waste and provides an overview of each. It explains how researchers have used both high-level and sector- or procedure-specific comparisons to quantify such waste, and it discusses examples and challenges in both waste measurement and waste reduction.
Findings: Waste is caused by factors such as health insurance and medical uncertainties that encourage the production of inefficient and low-value services. Various efforts to reduce such waste have encountered challenges, such as the high costs of initial investment, unintended administrative complexities, and trade-offs among patients', payers', and providers' interests. While categorizing waste may help identify and measure general types and sources of waste, successful reduction strategies must integrate the administrative, operational, and clinical components of care and proceed by identifying goals, changing systemic incentives, and making specific process improvements.
Conclusions: Classifying, identifying, and measuring waste elucidate its causes, clarify systemic goals, and specify potential health care reforms that, by improving the market for health insurance and health care, will generate incentives for better efficiency and thus ultimately decrease waste in the U.S. health care system.
Keywords: Health care waste; health care inefficiency; quality of care; health care reform; administrative, operational, and clinical waste.
Population-Level Changes in Folate Intake by Age, Gender, and Race/Ethnicity after Folic Acid Fortification | Tanya G. K. Bentley, PhD, Walter C. Willett, MD, DrPH, Milton C. Weinstein, PhD, Karen M. Kuntz, ScD (Am J Public Health. 2006;96:2040-2047. doi:10.2105/AJPH.2005…)
Objectives. We sought to quantify the impact of the 1998 US Food and Drug Administration (FDA) folic acid fortification policy by estimating folate intake at the population level.
Methods. We analyzed total folate intake levels (from food and supplements) according to gender, age, and race/ethnicity, using data from 2 National Health and Nutrition Examination Surveys. We measured pre- and post-fortification folate intake distributions, adjusted for measurement error, and examined the proportions of the population who reached certain thresholds of daily total folate intake.
Results. Mean daily food and total folate intake increased by approximately 100 µg/day after fortification. The proportion of women aged 15-44 years who consume more than 400 µg/day of folate has increased since fortification but has not yet reached the FDA's 50% target and varies by race/ethnicity from 23% to 33%. Among persons aged 65 years and older, who may be at risk for masking of a vitamin B12 deficiency, the percentage who consume more than 1000 µg/day (the "tolerable upper intake level") has at least doubled among Whites and Black men but has remained less than 5% for all groups.
Conclusions. Since fortification, folic acid intake among the US population has increased, and there are substantial variations by age, gender, and race/ethnicity.
…dietary intake. Without such correction, it is not possible to accurately estimate the number of women who reach the FDA's 400 µg/day threshold. In addition, no study has quantified national and population-based intakes by age, gender, and racial/ethnic subgroups.
Our analysis provides national, population-based estimates of folate consumption levels by age, gender, and racial/ethnic subgroups and accounts for food and supplement intake, corrected for measurement error due to within-person variation.
METHODS
Data. We analyzed food and dietary supplement data from 2 periods of the National Health and Nutrition Examination Survey (NHANES III, 1988-1994, and NHANES 1999-2000). The NHANES are surveys conducted by the National Center for Health Statistics of the Centers for Disease Control and Prevention and are designed to monitor trends in risk behaviors, environmental exposures, diet, nutrition, and health. Data are collected from personal interviews and physical health exams. Nutrient intake data are based primarily on one 24-hour dietary recall measure from the interview component. Data on supplement use are collected during the physical examination component and entail detailed information on specific types and amounts of supplement use over the previous month. Nutrient intake values are calculated by coding the survey data with the US Department of Agriculture's Survey Nutrient Database, which incorporates folate value...
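The measurement-error correction described above can be illustrated with a minimal sketch, assuming a simple variance-shrinkage model: a single 24-hour recall overstates the spread of usual intake, so each observed value is pulled toward the group mean by a factor reflecting the ratio of between-person to total variance. The function names and numeric values below are hypothetical; the published analysis used a more elaborate statistical method.

```python
import statistics

def usual_intake_distribution(observed, within_person_var):
    """Shrink single-recall intakes toward the group mean so the adjusted
    distribution reflects between-person variation only (sketch)."""
    mean = statistics.mean(observed)
    total_var = statistics.variance(observed)
    between_var = max(total_var - within_person_var, 0.0)
    shrink = (between_var / total_var) ** 0.5 if total_var > 0 else 0.0
    return [mean + shrink * (x - mean) for x in observed]

def fraction_above(values, threshold):
    """Share of the (adjusted) population at or above an intake threshold."""
    return sum(v >= threshold for v in values) / len(values)

# Hypothetical intakes in ug/day; shrinkage narrows the distribution,
# which changes the estimated share reaching the 400 ug/day threshold.
adjusted = usual_intake_distribution([200, 300, 400, 500, 600],
                                     within_person_var=18_750)
share = fraction_above(adjusted, 400)
```

Note how the uncorrected data would place 3 of 5 people at or above 400 µg/day by coincidence here, but in general the correction can move the estimate in either direction depending on where the threshold sits relative to the mean.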
Objective To quantify the health and economic outcomes associated with changes in folic acid consumption following fortification of enriched grain products in the United States. Design Cost-effectiveness analysis. Setting Annual burden of disease, quality-adjusted life-years (QALYs), and costs were projected for four steady-state strategies: no fortification or fortifying with 140, 350, or 700 micrograms (mcg) folic acid per 100 grams (g) enriched grain. The analysis considered four health outcomes: neural tube defects (NTDs); myocardial infarctions (MIs); colon cancers; and B-12 deficiency maskings. Subjects U.S. adult population subgroups defined by age, gender, and race/ethnicity, with folate intake distributions from the National Health and Nutrition Examination Surveys (1988-1992 and 1999-2000), and reference sources for disease incidence, utility, and economic estimates. Results The greatest benefits from fortification were predicted in MI prevention, with 16,862 and 88,172 cases averted per year in steady state for the 140-mcg and 700-mcg fortification levels, respectively. These projections were 6,261 and 38,805 for colon cancer and 182 and 1,423 for NTDs, while 15 to 820 additional B-12 masking cases were predicted. Compared with no fortification, all post-fortification strategies provided QALY gains and cost savings for all subgroups, with predicted population benefits of 266,649 QALYs gained and $3.6 billion saved in the long run by changing the fortification level from 140-mcg/100-g enriched grain to 700-mcg/100-g. Conclusions This study indicates that the health and economic gains of folic acid fortification far outweigh the losses for the U.S. population, and that increasing the level of fortification deserves further consideration to maximize net gains.
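The strategy comparison in this kind of cost-effectiveness analysis is often summarized as net monetary benefit, where QALY gains are valued at a willingness-to-pay threshold and incremental costs are subtracted. The sketch below uses illustrative inputs, not the paper's actual model values; the $50,000-per-QALY threshold is a conventional assumption.

```python
def net_monetary_benefit(qalys_gained, cost_change, wtp_per_qaly=50_000):
    """NMB = QALYs gained x willingness to pay - incremental cost.
    A cost-saving strategy has negative cost_change, which adds to NMB."""
    return qalys_gained * wtp_per_qaly - cost_change

# Illustrative long-run totals vs. a no-fortification baseline
# (hypothetical numbers, not the published projections).
strategies = {
    "140 mcg/100 g": {"qalys": 100_000, "cost": -1.0e9},
    "700 mcg/100 g": {"qalys": 366_649, "cost": -4.6e9},
}
best = max(strategies,
           key=lambda s: net_monetary_benefit(strategies[s]["qalys"],
                                              strategies[s]["cost"]))
```

When every strategy both gains QALYs and saves money relative to baseline, as the abstract reports, the strategy with the largest gains dominates at any positive willingness-to-pay threshold.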
Six commonly used HRQoL indexes and two of three health status summary measures indicated lower HRQoL with obesity and overweight than with normal BMI, but the degree of decrement varied by index. The association appeared to be driven primarily by physical health, although mental health also played a role among women. Counter to hypotheses, Blacks may have the highest HRQoL when overweight.
AOM consistently provided favorable clinical benefits. Under various dosing scenarios, AOM results indicated fewer relapses at lower overall costs, or at a cost per relapse avoided below a reasonable threshold (i.e., less than the cost of a relapse-related hospitalization), versus PLAI. Given the heterogeneous nature of schizophrenia and variability in treatment response, health plans may consider open access for treatments like AOM. Since model inputs were based on data from separate placebo-controlled trials, generalization of results to the real-world setting is limited.
The ASCO, ESMO, ICER, and NCCN frameworks demonstrated convergent validity, despite differences in conceptual approaches used. The ASCO inter-rater reliability was high, although potentially at the cost of user burden. The ICER inter-rater reliability was poor, possibly because of its failure to distinguish differential value among the sample of drugs tested. Refinements of all frameworks should continue on the basis of further testing and stakeholder feedback.
Downy mildew is a destructive disease of spinach worldwide. There have been 10 races described since 1824, six of which have been identified in the past 10 years. Race identification is based on qualitative disease reactions on a set of diverse host differentials, which include open-pollinated cultivars, contemporary hybrid cultivars, and older hybrid cultivars that are no longer produced. The development of a set of near-isogenic open-pollinated spinach lines (NILs), having different resistance loci in a susceptible and otherwise common genetic background, would facilitate identification of races of the downy mildew pathogen, provide a tool to better understand the genetics of resistance, and expedite the development of molecular markers linked to these disease resistance loci. To achieve this objective, the spinach cv. Viroflay, susceptible to race 6 of Peronospora farinosa f. sp. spinaciae, was used as the recurrent susceptible parent in crosses with the hybrid spinach cv. Lion, resistant to race 6. Resistant F1 progeny were subsequently backcrossed to Viroflay four times, with selection for race 6 resistance each time. Analysis of the segregation data showed that resistance was controlled by a single dominant gene, and the resistance locus was designated Pfs-1. By bulked segregant analysis, an amplified fragment length polymorphism (AFLP) marker (E-ACT/M-CTG) linked to Pfs-1 was identified and used to develop a co-dominant sequence-characterized amplified region (SCAR) marker. This SCAR marker, designated Dm-1, was closely linked (approximately 1.7 cM) to the Pfs-1 locus and could discriminate among spinach genotypes that were homozygous resistant (Pfs-1Pfs-1), heterozygous resistant (Pfs-1pfs-1), or homozygous susceptible (pfs-1pfs-1) to race 6 within the original mapping population.
Evaluation of a wide range of commercial spinach lines outside of the mapping population indicated that Dm-1 could effectively identify Pfs-1 resistant genotypes; the Dm-1 marker correctly predicted the disease resistance phenotype in 120 out of 123 lines tested. In addition, the NIL containing the Pfs-1 locus (Pfs-1Pfs-1) was resistant to multiple races of the downy mildew pathogen, indicating that the Pfs-1 locus may contain a cluster of resistance genes.
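The value of a co-dominant marker like Dm-1 is that heterozygotes are distinguishable, since fragments from both alleles amplify. A minimal sketch of that genotype-calling logic, together with the reported validation accuracy on the 123-line panel, is below; the band-to-genotype mapping is illustrative, not the actual assay protocol.

```python
def genotype_from_bands(has_resistant_band, has_susceptible_band):
    """Call a genotype from a co-dominant marker: heterozygotes show
    both allele-specific bands (illustrative mapping, not assay logic)."""
    if has_resistant_band and has_susceptible_band:
        return "Pfs-1/pfs-1"
    return "Pfs-1/Pfs-1" if has_resistant_band else "pfs-1/pfs-1"

def prediction_accuracy(correct, total):
    """Fraction of lines whose marker call matched the disease phenotype."""
    return correct / total

# 120 of 123 commercial lines were correctly predicted by Dm-1.
accuracy = prediction_accuracy(120, 123)
```

A dominant marker, by contrast, would collapse the first two calls into one, since the heterozygote and homozygous-resistant band patterns would be indistinguishable.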
BT is a cost-effective treatment option for patients with poorly controlled, severe, persistent asthma.