COVID-19 is characterized by an infectious pre-symptomatic period, when newly infected individuals can unwittingly infect others. We are interested in what benefits facemasks could offer as a non-pharmaceutical intervention, especially in settings where high-technology interventions, such as contact tracing using mobile apps or rapid case detection via molecular tests, are not sustainable. Here, we report the results of two mathematical models and show that facemask use by the public could make a major contribution to reducing the impact of the COVID-19 pandemic. Our intention is to provide a simple modelling framework to examine the dynamics of COVID-19 epidemics when facemasks are worn by the public, with or without imposed ‘lock-down’ periods. Our results are illustrated for a number of plausible parameter ranges describing epidemiological processes and the mechanistic properties of facemasks, in the absence of current measurements for these values. We show that, when facemasks are used by the public all the time (not just from when symptoms first appear), the effective reproduction number, R_e, can be decreased below 1, leading to the mitigation of epidemic spread. Under certain conditions, when lock-down periods are implemented in combination with 100% facemask use, there is vastly less disease spread, secondary and tertiary waves are flattened and the epidemic is brought under control. The effect occurs even when it is assumed that facemasks are only 50% effective at capturing exhaled virus inoculum, with an equal or lower efficiency on inhalation. Facemask use by the public has been suggested to be ineffective because wearers may touch their faces more often, thus increasing the probability of contracting COVID-19. For completeness, our models show that facemask adoption provides population-level benefits even in circumstances where wearers are placed at increased risk.
At the time of writing, facemask use by the public has not been recommended in many countries, but a recommendation for wearing face-coverings has just been announced for Scotland. Even if facemask use began after the start of the first lock-down period, our results show that benefits could still accrue by reducing the risk of the occurrence of further COVID-19 waves. We examine the effects of different rates of facemask adoption without lock-down periods and show that, even at lower levels of adoption, benefits accrue to the facemask wearers. These analyses may explain why some countries, where adoption of facemask use by the public is around 100%, have experienced significantly lower rates of COVID-19 spread and associated deaths. We conclude that facemask use by the public, when used in combination with physical distancing or periods of lock-down, may provide an acceptable way of managing the COVID-19 pandemic and re-opening economic activity. These results are relevant to the developed as well as the developing world, where large numbers of people are resource-poor but fabrication of effective, home-made facemasks is possible. A key message from our analyses to aid the widespread adoption of facemasks would be: ‘my mask protects you, your mask protects me’.
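The headline result above can be illustrated with a minimal calculation. The multiplicative form below, and all parameter names and values, are illustrative assumptions rather than the paper's exact model: transmission between a pair of individuals is reduced once by the exhalation efficiency if the infector is masked, and again by the inhalation efficiency if the susceptible person is masked.

```python
# Sketch: effect of population-wide facemask adoption on the effective
# reproduction number. The multiplicative form and the parameter values
# are illustrative assumptions, not the paper's fitted model.

def effective_R(R0, coverage, eff_exhale, eff_inhale):
    """R_e when a fraction `coverage` of the population wears masks.

    Each transmission is reduced by the exhalation efficiency when the
    infector is masked and by the inhalation efficiency when the
    susceptible person is masked, so each factor below is a
    coverage-weighted average over masked and unmasked contacts.
    """
    source = 1.0 - coverage * eff_exhale  # mean reduction on exhalation
    sink = 1.0 - coverage * eff_inhale    # mean reduction on inhalation
    return R0 * source * sink

# 100% adoption of masks that are only 50% effective on both exhalation
# and inhalation brings an assumed R0 = 2.4 below the threshold of 1:
print(effective_R(2.4, 1.0, 0.5, 0.5))  # → 0.6
```

Under these illustrative numbers, even imperfect masks push R_e well below 1 at full adoption, which is the qualitative behaviour the abstract describes.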
Although local eradication is routinely attempted following introduction of disease into a new region, failure is commonplace. Epidemiological principles governing the design of successful control are not well-understood. We analyse factors underlying the effectiveness of reactive eradication of localised outbreaks of invading plant disease, using citrus canker in Florida as a case study, although our results are largely generic, and apply to other plant pathogens (as we show via our second case study, citrus greening). We demonstrate how to optimise control via removal of hosts surrounding detected infection (i.e. localised culling) using a spatially-explicit, stochastic epidemiological model. We show how to define optimal culling strategies that take account of stochasticity in disease spread, and how the effectiveness of disease control depends on epidemiological parameters determining pathogen infectivity, symptom emergence and spread, the initial level of infection, and the logistics and implementation of detection and control. We also consider how optimal culling strategies are conditioned on the levels of risk acceptance/aversion of decision makers, and show how to extend the analyses to account for potential larger-scale impacts of a small-scale outbreak. Control of local outbreaks by culling can be very effective, particularly when started quickly, but the optimum strategy and its performance are strongly dependent on epidemiological parameters (particularly those controlling dispersal and the extent of any cryptic infection, i.e. infectious hosts prior to symptoms), the logistics of detection and control, and the level of local and global risk that is deemed to be acceptable. A version of the model we developed to illustrate our methodology and results to an audience of stakeholders, including policy makers, regulators and growers, is available online as an interactive, user-friendly interface at http://www.webidemics.com/. 
This version of our model allows the complex epidemiological principles that underlie our results to be communicated to a non-specialist audience.
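The class of model described above can be sketched in a few dozen lines. The following is a minimal toy version of reactive culling in a spatially explicit stochastic epidemic; the exponential dispersal kernel, the symptom delay, and every parameter value are illustrative assumptions, not the fitted citrus canker model.

```python
import math
import random

# Toy sketch: stochastic spatial spread with cryptic (pre-symptomatic)
# infection, followed by reactive culling of all hosts within a fixed
# radius of each detected host. All parameters are illustrative.

def simulate(n_hosts=200, beta=0.02, alpha=0.5, symptom_delay=3,
             cull_radius=1.0, steps=30, seed=0):
    rng = random.Random(seed)
    pos = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(n_hosts)]
    state = ["S"] * n_hosts          # "S" susceptible, "I" cryptically infected
    t_infected = [None] * n_hosts
    state[0] = "I"                   # one initial introduction
    t_infected[0] = 0
    removed = set()                  # culled hosts

    def dist(i, j):
        return math.hypot(pos[i][0] - pos[j][0], pos[i][1] - pos[j][1])

    for t in range(1, steps + 1):
        # cryptic infecteds transmit with a distance-decaying kernel
        for i in range(n_hosts):
            if state[i] != "I" or i in removed:
                continue
            for j in range(n_hosts):
                if state[j] == "S" and j not in removed:
                    if rng.random() < beta * math.exp(-alpha * dist(i, j)):
                        state[j] = "I"
                        t_infected[j] = t
        # hosts become symptomatic after a delay; cull everything nearby
        for i in range(n_hosts):
            if (state[i] == "I" and i not in removed
                    and t - t_infected[i] >= symptom_delay):
                for j in range(n_hosts):
                    if j not in removed and dist(i, j) <= cull_radius:
                        removed.add(j)

    still_cryptic = sum(1 for i in range(n_hosts)
                        if state[i] == "I" and i not in removed)
    return still_cryptic, len(removed)
```

Sweeping `cull_radius` over replicate runs of such a simulation is the basic operation behind optimising the removal radius: too small a radius misses cryptic infection near detected hosts, too large a radius destroys healthy hosts unnecessarily.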
Summary: Effective control of plant disease remains a key challenge. Eradication attempts often involve removal of host plants within a certain radius of detection, targeting asymptomatic infection. Here we develop and test potentially more effective, epidemiologically motivated, control strategies, using a mathematical model previously fitted to the spread of citrus canker in Florida. We test risk-based control, which preferentially removes hosts expected to cause a high number of infections in the remaining host population. Removals then depend on past patterns of pathogen spread and host removal, which might be nontransparent to affected stakeholders. This motivates a variable radius strategy, which approximates risk-based control via removal radii that vary by location, but which are fixed in advance of any epidemic. Risk-based control outperforms variable radius control, which in turn outperforms constant radius removal. This result is robust to changes in disease spread parameters and initial patterns of susceptible host plants. However, efficiency degrades if epidemiological parameters are incorrectly characterised. Risk-based control including additional epidemiology can be used to improve disease management, but it requires good prior knowledge for optimal performance. This focuses attention on gaining maximal information from past epidemics, on understanding model transferability between locations and on adaptive management strategies that change over time.
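The core of the risk-based idea is a ranking step. The sketch below is a hypothetical, simplified version: score each infected host by its expected number of onward infections under a dispersal kernel, then remove the highest-scoring hosts first. The exponential kernel, the `budget` parameter, and the function names are illustrative assumptions.

```python
import math

# Sketch of risk-based removal: preferentially cull the infected hosts
# expected to cause the most new infections among remaining susceptibles.
# The exponential kernel and removal budget are illustrative assumptions.

def onward_risk(host, susceptibles, alpha=0.5):
    """Relative expected number of onward infections from `host`,
    summing a distance-decaying kernel over all susceptible hosts."""
    x, y = host
    return sum(math.exp(-alpha * math.hypot(x - sx, y - sy))
               for sx, sy in susceptibles)

def risk_based_removals(infected, susceptibles, budget):
    """Return the `budget` infected hosts ranked highest by onward risk."""
    ranked = sorted(infected,
                    key=lambda h: onward_risk(h, susceptibles),
                    reverse=True)
    return ranked[:budget]
```

An infected host sitting inside a dense cluster of susceptibles scores higher than an isolated one at the same distance from the nearest detection, which is exactly how risk-based control departs from a fixed removal radius.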
Compartmental models have become the dominant theoretical paradigm in mechanistic modeling of plant disease and offer well-known advantages in terms of analytic tractability, ease of simulation, and extensibility. However, underlying assumptions of constant rates of infection and of exponentially distributed latent and infectious periods are difficult to justify. Although alternative approaches, including van der Plank's seminal discrete time model and models based on the integro-differential formulation of Kermack and McKendrick's model, have been suggested for plant disease and relax these unrealistic assumptions, they are challenging to implement and to analyze. Here, we propose an extension to the susceptible, exposed, infected, and removed (SEIR) compartmental model, splitting the latent and infection compartments and thereby allowing time-varying infection rates and more realistic distributions of latent and infectious periods to be represented. Although the model is, in fact, more general, we specifically target plant disease by demonstrating how it can represent both the van der Plank model and the most commonly used variant of the Kermack and McKendrick (K & M) model (in which the infectivity response is delay Gamma distributed). We show how our reformulation retains the numeric and analytic tractability of SEIR models, and how it can be used to replicate earlier analyses of the van der Plank and K & M models. Our reformulation has the advantage of using elementary mathematical techniques, making implementation easier for the nonspecialist. We show a practical implication of these results for disease control. By taking advantage of the easy extensibility characteristic of compartmental models, we also investigate the effects of including additional biological realism. 
As an example, we show how the more realistic infection responses we consider interact with host demography and lead to divergent invasion thresholds when compared with the "standard" SEIR model. An ever-increasing number of analyses purportedly extract more biologically realistic invasion thresholds by adding additional biological detail to the SEIR model framework; we contend that our results demonstrate that extending a model that has such a simplistic representation of the infection dynamics may not, in fact, lead to more accurate results. Therefore, we suggest that modelers should carefully consider the underlying assumptions of the simplest compartmental models in their future work.
Wheat rust diseases pose one of the greatest threats to global food security, including for subsistence farmers in Ethiopia. The fungal spores transmitting wheat rust are dispersed by wind and can remain infectious after dispersal over long distances. The emergence of new strains of wheat rust has exacerbated the risks of severe crop loss. We describe the construction and deployment of a near real-time early warning system (EWS) for two major wind-dispersed diseases of wheat crops in Ethiopia that combines existing environmental research infrastructures, newly developed tools and scientific expertise across multiple organisations in Ethiopia and the UK. The EWS encompasses a sophisticated framework that integrates field and mobile phone surveillance data, spore dispersal and disease environmental suitability forecasting, as well as communication to policy-makers, advisors and smallholder farmers. The system involves daily automated data flow between two continents during the wheat season in Ethiopia. The framework utilises expertise and environmental research infrastructures from within the cross-disciplinary spectrum of biology, agronomy, meteorology, computer science and telecommunications. The EWS successfully provided timely information to assist policy-makers formulate decisions about allocation of a limited stock of fungicide during the 2017 and 2018 wheat seasons. Wheat rust alerts and advisories were sent by short message service and reports to 10 000 development agents and approximately 275 000 smallholder farmers in Ethiopia who rely on wheat for subsistence and livelihood security. The framework represents one of the first advanced crop disease EWSs implemented in a developing country. It provides policy-makers, extension agents and farmers with timely, actionable information on priority diseases affecting a staple food crop.
The framework, together with the underpinning technologies, is transferable to forecasting wheat rusts in other regions and can be readily adapted to other wind-dispersed pests and diseases of major agricultural crops.