Soil-transmitted helminth (STH) infections are among the most prevalent of chronic human infections worldwide. Based on the demonstrable impact on child development, there is a global commitment to finance and implement control strategies with a focus on school-based chemotherapy programmes. The major obstacle to the implementation of cost-effective control is the lack of accurate descriptions of the geographical distribution of infection. In recent years considerable progress has been made in the use of geographical information systems (GIS) and remote sensing (RS) to better understand helminth ecology and epidemiology, and to develop low-cost ways to identify target populations for treatment. This chapter explores how this information has been used practically to guide large-scale control programmes. The use of satellite-derived environmental data has yielded new insights into the ecology of infection at a geographical scale that has proven impossible to address using more traditional approaches, and has in turn allowed spatial distributions of infection prevalence to be predicted robustly by statistical approaches. GIS/RS have increasingly been used in the context of large-scale helminth control programmes, including not only those targeting STH infections but also those focusing on schistosomiasis, filariasis and onchocerciasis. Experience indicates that GIS/RS provides a cost-effective approach to designing and monitoring programmes at a realistic scale. Importantly, the use of this approach has begun to transition from being a specialist tool of international vertical programmes to become a routine tool in developing public-sector control programmes. GIS/RS is used here to describe the global distribution of STH infections and to estimate the number of infections in school-age children in sub-Saharan Africa (89.9 million) and the annual cost of providing a single anthelmintic treatment using a school-based approach (US$5.0-7.6 million).
These are the first estimates at a continental scale to explicitly include the fine spatial distribution of infection prevalence and population, and they suggest that traditional methods have overestimated the extent of infection. The results suggest that continent-wide control of these parasites is, from a financial perspective, an attainable goal.
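The headline figures above imply a strikingly low per-child cost, which is worth making explicit. The short sketch below simply divides the abstract's annual cost range by its estimate of infected school-age children; the per-child values are derived here, not stated in the source.

```python
# Implied per-child cost of a single school-based anthelmintic treatment,
# using the abstract's figures: 89.9 million school-age children in
# sub-Saharan Africa and an annual cost of US$5.0-7.6 million.
children = 89.9e6
cost_low, cost_high = 5.0e6, 7.6e6

per_child_low = cost_low / children    # roughly US$0.06 per child per year
per_child_high = cost_high / children  # roughly US$0.08 per child per year

print(f"Per-child annual cost: US${per_child_low:.3f}-{per_child_high:.3f}")
```

Costs of a few US cents per child per year are what make the "attainable goal" claim plausible.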
Helminth parasites are highly prevalent in human communities in developing countries. In an endemic area an infected individual may harbour parasitic worms for most of his or her life, and the ability of these infections to survive immunological attack has long been a puzzle. But new techniques are starting to expose the diverse mechanisms by which these agents modulate or evade their hosts' defences, creating a dynamic interaction between the human immune system and the parasite population.
This paper estimates the global burden of lymphatic filariasis based on a review of the published literature on infection and disease surveys. A method for aggregating and projecting prevalence data from individual studies to national, regional and global levels, which also facilitates the estimation of gender- and age-specific burdens, is presented. The method weights in favour of the larger, and hence presumably more reliable, studies and relies on estimated empirical relationships between gender, age, infection and disease in order to correct studies with incomplete data. The results presented here suggest that although the overall global prevalence of filariasis is 2.0% (approximately 119 million cases), the disease continues to be of considerable local importance, particularly in India and sub-Saharan Africa. Estimates by age and gender clearly show that, unlike other helminth infections, filariasis is mainly a disease of the adult and older age classes and appears to be more prevalent in males. This work suggests that the derivation of more accurate estimates of the burden of filariasis will require a better understanding of both the epidemiology and the spatial aspects of infection and disease. It also suggests that filariasis is preventable through a geographically targeted strategy for control.
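The core of the aggregation method described above is weighting in favour of larger studies. A minimal sketch of that idea is a sample-size-weighted pooled prevalence; the survey figures below are invented illustrations, not data from the paper.

```python
# Hypothetical sketch of size-weighted aggregation: pooling prevalence
# estimates from surveys of different sizes so that larger (presumably
# more reliable) studies dominate the pooled figure.
# The surveys below are invented examples, not data from the paper.
surveys = [
    {"n": 1200, "prevalence": 0.031},
    {"n": 450,  "prevalence": 0.052},
    {"n": 3000, "prevalence": 0.018},
]

total_n = sum(s["n"] for s in surveys)
pooled = sum(s["n"] * s["prevalence"] for s in surveys) / total_n

print(f"Pooled prevalence: {pooled:.3%} (n = {total_n})")
```

Note how the large low-prevalence survey pulls the pooled estimate well below a simple unweighted mean of the three prevalences; the paper's actual method additionally corrects incomplete studies using empirical age/gender/infection/disease relationships.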
SUMMARY: Mathematical models of the transmission dynamics of infectious diseases provide a useful tool for investigating the impact of community-based control measures. Previously, we used a dynamic (constant force-of-infection) model for lymphatic filariasis to describe observed patterns of infection and disease in endemic communities. In this paper, we expand the model to examine the effects of control options against filariasis by incorporating the impact of age structure of the human community and by addressing explicitly the dynamics of parasite transmission from and to the vector population. This model is tested using data for Wuchereria bancrofti transmitted by Culex quinquefasciatus in Pondicherry, South India. The results show that chemotherapy has a larger short-term impact than vector control but that the effects of vector control can last beyond the treatment period. In addition we compare rates of recrudescence for drugs with different macrofilaricidal effects.
Despite the increasing number of models to predict infection risk for a range of diseases, the assessment of their spatial limits, predictive performance and practical application are not widely undertaken. Using the example of Schistosoma haematobium in Africa, this article illustrates how ecozonation and receiver-operator characteristic analysis can help to assess the usefulness of available models objectively. The resources targeted at parasite control are finite and often limited. Consequently, when designing control programmes, it is essential to know the distribution and abundance of a disease, to devise and target intervention strategies and to optimize the use of available resources. In many African countries, the paucity of epidemiological data hinders the quantification of disease burden for basic planning. In an effort to overcome this problem, environmental data (often derived from satellite sensors) are increasingly used to predict infection risk [1][2][3]. There is a need, in such approaches, to evaluate objectively the predictive accuracy of the models used to generate risk maps and to consider the spatial extent to which models can be reliably extrapolated [4].
Developing predictive models
Reliable maps of infectious diseases require an understanding of whether models developed for one location can be applied to another because the environmental factors that influence disease transmission are unlikely to be uniform over large geographical areas [5]. Political boundaries are routinely used to define the spatial extent of risk maps, but the ecological heterogeneity, which is usually independent of political boundaries, is ignored.
Alternatively, remotely sensed (RS) environmental data can be used to develop ecological zone maps that identify areas of ecological similarity (Box 1), and these are better at defining where existing predictive models can and cannot be applied [6]. The statistical methods commonly used to predict the occurrence or distribution of disease in relation to environmental variables are logistic regression and discriminant analysis, and these approaches have been used to map filariasis [3] and malaria [6,7]. The predictive performance of these models, which is a prerequisite for their refinement [9], is evaluated by examining the agreement between predictions and observations [10] using data collected from sites other than those used in model development. In logistic regression models, predictions are based on model outputs that measure the probability of infection occurrence. To assess the predictive performance of a model, a probability threshold needs to be identified that can differentiate locations of relative risk. Often, however, the researcher is confronted with the question of which threshold to select when discriminating between these different populations. A commonly used method in medical diagnostics, and more recently in ecological studies, is receiver-operator characteristic (ROC) analysis, which can provide an overall measure of model accuracy and c...
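The threshold-selection problem described above can be made concrete with a small worked example. The sketch below sweeps every candidate probability cut-off over a set of model outputs, computes the sensitivity/false-positive trade-off at each, and picks the threshold maximising Youden's J (sensitivity + specificity − 1), one common criterion in ROC analysis. The predicted probabilities and infection labels are invented examples, not data from the article.

```python
# Sketch of ROC-based threshold selection for a logistic-regression risk
# model: each candidate cut-off yields a (false-positive rate, true-positive
# rate) point, and Youden's J picks the cut-off that best separates
# infected from uninfected sites. Probabilities/labels are invented.
def roc_points(probs, labels):
    """Return (fpr, tpr, threshold) for every candidate cut-off."""
    pos = sum(labels)
    neg = len(labels) - pos
    pts = []
    for thr in sorted(set(probs), reverse=True):
        tp = sum(1 for p, y in zip(probs, labels) if p >= thr and y == 1)
        fp = sum(1 for p, y in zip(probs, labels) if p >= thr and y == 0)
        pts.append((fp / neg, tp / pos, thr))
    return pts

probs  = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2]  # model outputs per site
labels = [1,   1,   0,   1,   1,    0,   0,   0]    # observed infection

pts = roc_points(probs, labels)
# Youden's J = sensitivity + specificity - 1 = tpr - fpr; maximise it.
best = max(pts, key=lambda p: p[1] - p[0])
print(f"best threshold: {best[2]} (tpr={best[1]}, fpr={best[0]})")
```

Crucially, as the text notes, such an evaluation is only meaningful when the probabilities come from sites other than those used to fit the model.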
In order to explore the relationship between acute and chronic disease, age-specific data on the frequency and duration of episodic adenolymphangitis (ADL) in patients with 3 defined grades of lymphoedema in bancroftian filariasis were examined. The age distribution of grades I and II exhibited a convex age profile, but that of grade III showed a monotonic increase. The mean duration of oedema increased with its grade (grade I, 0.3 years; grade III, 9.9 years). The mean number of ADL episodes in the previous year for all cases was 4.2 and it increased with grade (grade I, 2.4 and grade III, 6.2). The mean duration of each ADL episode for all cases was 4.1 d and it was independent of grade and age. The mean period lost to ADL episodes in the previous year was 17.5 d; it increased from 9.4 d with grade I to 28.5 d with grade III. The results imply that there is a dynamic progression through the grades of lymphoedema and that the frequency of ADL episodes is positively associated with this progression. However, the study design could not separate cause from effect.