processes. Biological or land-based forms of CO2 utilization can generate economic value in the form of, for example, wood products for buildings, increased plant yields from enhanced soil carbon uptake, and even the production of biofuel and bio-derived chemicals. We use this broader definition deliberately; by thinking functionally, rather than narrowly about specific processes, we hope to promote dialogue across scientific fields, compare costs and benefits across pathways, and consider common techno-economic characteristics that could help identify routes towards the mitigation of climate change.

In this Perspective, we consider a non-exhaustive selection of ten CO2 utilization pathways and provide a transparent assessment of the potential scale and cost of each. The ten pathways are as follows: (1) CO2-based chemical products, including polymers; (2) CO2-based fuels; (3) microalgae fuels and other microalgae products; (4) concrete building materials; (5) CO2-enhanced oil recovery (CO2-EOR); (6) bioenergy with carbon capture and storage (BECCS); (7) enhanced weathering; (8) forestry techniques, including afforestation/reforestation, forest management and wood products; (9) land management via soil carbon sequestration techniques; and (10) biochar.

These ten CO2 utilization pathways can also be characterized as 'cycling', 'closed' and 'open' utilization pathways (Fig. 1, Table 1, Supplementary Materials). For instance, many (but not all) conventional industrial utilization pathways, such as CO2-based fuels and chemicals, tend to be 'cycling': they move carbon through industrial systems over timescales of days, weeks or months. Such pathways do not provide net CO2 removal from the atmosphere, but they can reduce emissions via industrial CO2 capture that displaces fossil fuel use.
By contrast, 'closed' pathways involve utilization and near-permanent CO2 storage, such as in the lithosphere (via CO2-EOR or BECCS), in the deep ocean (via terrestrial enhanced weathering) or in mineralized carbon in the built and natural environments. Finally, 'open' pathways tend to be based in biological systems,
Exposure to ionizing radiation is ubiquitous, and it is well established that moderate and high doses cause ill-health and can be lethal. The health effects of low doses or low dose-rates of ionizing radiation are less clear. This paper describes a project that sets out to summarize, as a restatement, the natural science evidence base concerning the human health effects of exposure to low-level ionizing radiation. A novel feature, compared with other reviews, is that a series of statements are listed and categorized according to the nature and strength of the evidence that underpins them. The purpose of this restatement is to provide a concise entrée into this vibrant field, pointing the interested reader deeper into the literature when more detail is needed. It is not our purpose to reach conclusions on whether the legal limits on radiation exposure are too high, too low or just right. Our aim is to provide an introduction so that readers who are not specialists in this area (be they policy-makers, disputers of policy, health professionals or students) have a straightforward place to start. The summary restatement of the evidence and an extensively annotated bibliography are provided as appendices in the electronic supplementary material.
Food poisoning caused by Campylobacter (campylobacteriosis) is the most prevalent bacterial disease associated with the consumption of poultry, beef, lamb and pork meat and unpasteurized dairy products. A variety of livestock industry, food chain and public health interventions have been implemented or proposed to reduce disease prevalence, some of which entail costs for producers and retailers. This paper describes a project that set out to summarize the natural science evidence base relevant to campylobacteriosis control in terms that are as policy-neutral as possible. A series of evidence statements are listed and categorized according to the nature of the underlying information. The evidence summary forms the appendix to this paper, and an annotated bibliography is provided in the electronic supplementary material.
Introduction: We have previously shown that combining a polygenic risk score (PRS) for cardiovascular disease (CVD), a numerical summary of an individual's genetic predisposition to CVD, with standard clinical risk calculators such as ASCVD-PCE and QRISK results in improved estimates of CVD risk. Implementation of such a cardiovascular integrated risk tool (CVD IRT) into real-world clinical practice is a key focus for further study. Hypothesis: We assessed the hypothesis that a CVD IRT can be incorporated into routine primary care. Methods: The Healthcare Evaluation of Absolute Risk Testing Study (NCT05294419) is a prospective trial recruiting up to 1,000 healthy participants undergoing health checks across 12 UK NHS general practices. Both QRISK2 and CVD IRT scores were generated and returned to clinicians, who then communicated the results to participants. The primary outcome of this study is operational success, together with feedback from health care providers (HCPs) and participants. The study also measures the impact of the CVD IRT on clinical decision-making. Results: We report interim analyses. As of April 2022, 624 eligible participants (62% female, mean age 55) had been recruited. A total of 371 CVD IRT reports had been generated, with 100% of blood samples yielding scores, all returned within the designated time frame. Among the primary care HCPs, 89% (8/9) agreed that the CVD IRT could be incorporated into routine care in a straightforward manner. Among the participants who had completed a survey to date, 93% (125/135) would be likely or very likely to recommend the CVD IRT to friends and family.
Average QRISK2 (6.3%) and CVD IRT (6.6%) risk scores did not differ significantly, but there were broad changes in risk among individual patients: 5% (19/371) of patients crossed above the NICE treatment threshold (10-year risk ≥ 10%), and 3% (11/371) were reclassified as very high risk (10-year risk ≥ 20%). Conclusions: The rollout of an integrated risk tool combining polygenic risk with a standard CVD risk calculator within primary care is feasible and well accepted by clinicians and participants. The CVD IRT results suggest clinically actionable changes in risk classification for a substantial proportion of this population.
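The reclassification counting described above (checking, for each patient, whether the integrated-risk score crosses a threshold that the QRISK2 score did not reach) can be sketched as follows. This is a minimal illustration, not the study's analysis code: the function name, the sample scores and the exact threshold handling are assumptions made for the example.

```python
# Sketch of risk reclassification against the thresholds quoted in the
# abstract: 10-year risk >= 10% (consider treatment) and >= 20% (very high).
# The function and the sample data below are hypothetical illustrations.

def count_reclassifications(qrisk2, irt, treat=0.10, very_high=0.20):
    """Count patients whose integrated-risk (IRT) score crosses a
    threshold that their QRISK2 score stayed below."""
    crossed_treat = sum(1 for q, i in zip(qrisk2, irt)
                        if q < treat <= i)
    newly_very_high = sum(1 for q, i in zip(qrisk2, irt)
                          if q < very_high <= i)
    return crossed_treat, newly_very_high

# Hypothetical 10-year risk scores for five patients (fractions, not %).
qrisk2_scores = [0.06, 0.09, 0.12, 0.19, 0.05]
irt_scores    = [0.07, 0.11, 0.12, 0.21, 0.04]

crossed, very_high = count_reclassifications(qrisk2_scores, irt_scores)
print(crossed, very_high)  # → 1 1
```

Applied to the study's 371 reports, this kind of count gives the 19 patients newly above the treatment threshold and the 11 newly classified as very high risk.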