SUMMARY: Zoonotic visceral leishmaniasis (ZVL) caused by Leishmania infantum is an important disease of humans and dogs. Here we review aspects of the transmission and control of ZVL. Whilst there is clear evidence that ZVL is maintained by sandfly transmission, transmission may also occur by non-sandfly routes, such as congenital and sexual transmission. Dogs are the only confirmed primary reservoir of infection. Meta-analysis of dog studies confirms that infectiousness is higher in symptomatic infection; infectiousness is also higher in European than South American studies. A high prevalence of infection has been reported from an increasing number of domestic and wild mammals; updated host ranges are provided. The crab-eating fox Cerdocyon thous, opossums Didelphis spp., domestic cat Felis catus, black rat Rattus rattus and humans can infect sandflies, but confirmation of these hosts as primary or secondary reservoirs requires further xenodiagnosis studies at the population level. Thus the putative sylvatic reservoir(s) of ZVL remains unknown. Review of intervention studies examining the effectiveness of current control methods highlights the lack of randomized controlled trials of both dog culling and residual insecticide spraying. Topical insecticides (deltamethrin-impregnated collars and pour-ons) have been shown to provide a high level of individual protection to treated dogs, but further community-level studies are needed.
Vector-borne diseases (VBDs) such as malaria, dengue, and leishmaniasis exert a huge burden of morbidity and mortality worldwide, particularly affecting the poorest of the poor. The principal method by which these diseases are controlled is through vector control, which has a long and distinguished history. Vector control, to a greater extent than drugs or vaccines, has been responsible for shrinking the map of many VBDs. Here, we describe the history of vector control programmes worldwide from the late 1800s to date. Pre-1940, vector control relied on a thorough understanding of vector ecology and epidemiology, and implementation of environmental management tailored to the ecology and behaviour of local vector species. This complex understanding was replaced by a simplified dependency on a handful of insecticide-based tools, particularly for malaria control, without an adequate understanding of entomology and epidemiology and without proper monitoring and evaluation. With the rising threat from insecticide-resistant vectors, global environmental change, and the need to incorporate more vector control interventions to eliminate these diseases, we advocate for continued investment in evidence-based vector control. There is a need to return to vector control approaches based on a thorough knowledge of the determinants of pathogen transmission, which utilise a range of insecticide and non-insecticide-based approaches in a locally tailored manner for more effective and sustainable vector control.
Author summary: Vector-borne diseases (VBDs) such as dengue, Chagas disease, human African trypanosomiasis (HAT), leishmaniasis, and malaria exert a huge burden of morbidity and mortality worldwide. The principal method by which these diseases are controlled is through vector control. The authors chart the history of vector control through time from elucidation of the transmission route of VBDs to the present day.
The elimination of seropositive dogs in Brazil has been used to control zoonotic visceral leishmaniasis but with little success. To elucidate the reasons for this, the infectiousness of 50 sentinel dogs exposed to natural Leishmania chagasi infection was assessed through time by xenodiagnosis with the sandfly vector, Lutzomyia longipalpis. Eighteen (43%) of 42 infected dogs became infectious after a median of 333 days in the field (105 days after seroconversion). Seven highly infectious dogs (17%) accounted for >80% of sandfly infections. There were positive correlations between infectiousness and anti-Leishmania immunoglobulin G, parasite detection by polymerase chain reaction, and clinical disease (logistic regression, r2 = 0.08-0.18). The sensitivity of enzyme-linked immunosorbent assay to detect currently infectious dogs was high (96%) but lower in the latent period (<63%), and specificity was low (24%). Mathematical modeling suggests that culling programs fail because of high incidence of infection and infectiousness, the insensitivity of the diagnostic test to detect infectious dogs, and time delays between diagnosis and culling.
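The consequences of the reported test characteristics for a culling programme are easy to illustrate with standard predictive-value arithmetic. A minimal sketch: only the 96% sensitivity and 24% specificity come from the study; the 20% prevalence of infectiousness is an assumed value for illustration.

```python
# Predictive values of a cull-decision test, using the sensitivity and
# specificity reported in the abstract for detecting currently
# infectious dogs. The prevalence value is illustrative only.

def predictive_values(sens, spec, prev):
    """Return (PPV, NPV) for a test with the given sensitivity,
    specificity, and prevalence of the target state."""
    tp = sens * prev                # true positives
    fp = (1 - spec) * (1 - prev)   # false positives
    fn = (1 - sens) * prev         # false negatives
    tn = spec * (1 - prev)         # true negatives
    return tp / (tp + fp), tn / (tn + fn)

sens, spec = 0.96, 0.24            # values reported in the abstract
prev = 0.20                        # assumed prevalence of infectious dogs
ppv, npv = predictive_values(sens, spec, prev)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")
```

Under these assumptions the positive predictive value is only 0.24: roughly three of every four seropositive dogs removed would not be infectious, which is consistent with the abstract's explanation of why culling programmes underperform.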
Background: Studies performed over the past decade have identified fairly consistent epidemiological patterns of risk factors for visceral leishmaniasis (VL) in the Indian subcontinent.
Methods and Principal Findings: To inform the current regional VL elimination effort and identify key gaps in knowledge, we performed a systematic review of the literature, with a special emphasis on data regarding the role of cattle, because primary risk factor studies have yielded apparently contradictory results. Because humans form the sole infection reservoir, clustering of kala-azar cases is a prominent epidemiological feature, both at the household level and on a larger scale. Subclinical infection also tends to show clustering around kala-azar cases. Within villages, areas become saturated over a period of several years; kala-azar incidence then decreases while neighboring areas see increases. More recently, post kala-azar dermal leishmaniasis (PKDL) cases have followed kala-azar peaks. Mud walls, palpable dampness in houses, and peri-domestic vegetation may increase infection risk through enhanced density and prolonged survival of the sand fly vector. Bed net use, sleeping on a cot and indoor residual spraying are generally associated with decreased risk. Poor micronutrient status increases the risk of progression to kala-azar. The presence of cattle is associated with increased risk in some studies and decreased risk in others, reflecting the complexity of the effect of bovines on sand fly abundance, aggregation, feeding behavior and leishmanial infection rates. Poverty is an overarching theme, interacting with individual risk factors on multiple levels.
Conclusions: Carefully designed demonstration projects, taking into account the complex web of interconnected risk factors, are needed to provide direct proof of principle for elimination and to identify the most effective maintenance activities to prevent a rapid resurgence when interventions are scaled back.
More effective, short-course treatment regimens for PKDL are urgently needed to enable the elimination initiative to succeed.
Background: The relationships between heterogeneities in host infection and infectiousness (transmission to arthropod vectors) can provide important insights for disease management. Here, we quantify heterogeneities in Leishmania infantum parasite numbers in reservoir and non-reservoir host populations, and relate these to their infectiousness during natural infection. Tissue parasite number was evaluated as a potential surrogate marker of host transmission potential.
Methods: Parasite numbers were measured by qPCR in bone marrow and ear skin biopsies of 82 dogs and 34 crab-eating foxes collected during a longitudinal study in Amazonian Brazil, for which previous data were available on infectiousness (by xenodiagnosis) and severity of infection.
Results: Parasite numbers were highly aggregated both between samples and between individuals. In dogs, total parasite abundance and relative numbers in ear skin compared to bone marrow increased with the duration and severity of infection. Infectiousness to the sandfly vector was associated with high parasite numbers; parasite number in skin was the best predictor of being infectious. Crab-eating foxes, which typically present asymptomatic infection and are non-infectious, had parasite numbers comparable to those of non-infectious dogs.
Conclusions: Skin parasite number provides an indirect marker of infectiousness, and could allow targeted control, particularly of highly infectious dogs.
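Aggregation of the kind described above is conventionally summarised by the negative binomial parameter k, where smaller k means stronger aggregation. A minimal sketch using the standard moment estimator k = m²/(s² − m); the counts below are hypothetical, chosen only to mimic the reported pattern in which a few heavily infected hosts dominate, and neither the data nor this estimator are taken from the study itself.

```python
from statistics import mean, variance

def aggregation_k(counts):
    """Moment estimate of the negative binomial k from parasite counts.
    Smaller k means stronger aggregation; requires variance > mean."""
    m = mean(counts)
    s2 = variance(counts)  # sample variance
    if s2 <= m:
        raise ValueError("no overdispersion: variance <= mean")
    return m * m / (s2 - m)

# Hypothetical qPCR parasite counts across individual hosts: most carry
# few or no parasites while a handful carry very many.
counts = [0, 0, 1, 2, 3, 5, 8, 40, 150, 900]
print(f"k = {aggregation_k(counts):.3f}")
```

For these illustrative counts k is well below 1, the usual signature of strongly aggregated ("20/80-rule") parasite distributions.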
The sensitivity and specificity of PCR, serology (ELISA) and lymphoproliferative response to Leishmania antigen for the detection of Leishmania infantum infection were evaluated in a cohort of 126 dogs exposed to natural infection in Brazil. For PCR, Leishmania DNA from bone marrow was amplified with both minicircle and ribosomal primers. The infection status and time of infection of each dog were estimated from longitudinal data. The sensitivity of PCR in parasite-positive samples was 98%. However, the overall sensitivity of PCR in post-infection samples, from dogs with confirmed infection, was only 68%. The sensitivity of PCR varied during the course of infection, being highest (78-88%) 0-135 days post-infection and declining to around 50% after 300 days. The sensitivity of PCR also varied between dogs, and was highest in sick dogs. The sensitivity of serology was similar in parasite-positive (84%), PCR-positive (86%) and post-infection (88%) samples. The sensitivity of serology varied during the course of infection, being lowest at the time of infection and high (93-100%) thereafter. Problems in determining the specificity of serology are discussed. The sensitivity and specificity of cellular responsiveness were low. These data suggest that PCR is most useful in detecting active or symptomatic infection, and that serology can be a more sensitive technique for the detection of all infected dogs.
Article: Quinnell, R.J., Courtenay, O., Garcez, L.M. et al. (1997) The epidemiology of canine leishmaniasis: transmission rates estimated from a cohort study in Amazonian Brazil. Parasitology, 115(2). (Received 13 June 1996; revised 4 February 1997; accepted 7 February 1997.) We estimate the incidence rate, serological conversion rate and basic case reproduction number (R0) of Leishmania infantum from a cohort study of 126 domestic dogs exposed to natural infection rates over 2 years on Marajó Island, Pará State, Brazil. The analysis includes new methods for (1) determining the number of seropositives in cross-sectional serological data, (2) identifying seroconversions in longitudinal studies, based on both the number of antibody units and their rate of change through time, (3) estimating incidence and serological pre-patent periods and (4) calculating R0 for a potentially fatal, vector-borne disease under seasonal transmission. Longitudinal and cross-sectional serological (ELISA) analyses gave similar estimates of the proportion of dogs positive. However, longitudinal analysis allowed the calculation of pre-patent periods, and hence the more accurate estimation of incidence: an infection-conversion model fitted by maximum likelihood to serological data yielded seasonally varying per capita incidence rates with a mean of 8.66 × 10^-3 per day (mean time to infection 115 days, 95% CI 107-126 days), and a median pre-patent period of 94 (95% CI 82-111) days. These results were used in conjunction with theory and dog demographic data to estimate the basic reproduction number, R0, as 5.9 (95% CI 4.4-7.4). R0 is a determinant of the scale of the leishmaniasis control problem, and we comment on the options for control.
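The headline numbers above can be connected with simple arithmetic: the mean per-capita incidence is the reciprocal of the mean time to infection, and the textbook endemic-equilibrium approximation R0 ≈ 1 + λL (with λ the force of infection and L the host life expectancy) shows how incidence and demography combine. A hedged sketch only: the study's own R0 calculation for a fatal infection under seasonal transmission is more elaborate, and the dog life expectancy below is an assumed value for illustration.

```python
# Force of infection from the mean time to infection reported in the
# abstract, plus a textbook endemic-equilibrium approximation of R0.
# The 1.5-year dog life expectancy is an assumption for illustration;
# the study's actual estimate (R0 = 5.9) comes from a fuller seasonal model.

mean_time_to_infection = 115.0           # days (from the abstract)
foi = 1.0 / mean_time_to_infection       # force of infection, ~8.7e-3 per day

life_expectancy = 1.5 * 365.0            # assumed dog life expectancy, in days
r0_approx = 1.0 + foi * life_expectancy  # R0 ~ 1 + lambda*L at endemic equilibrium

print(f"force of infection = {foi:.2e} per day")
print(f"approximate R0 = {r0_approx:.1f}")
```

Even this crude approximation lands in the same range as the published estimate, which illustrates why a per-capita incidence of roughly 8.7 × 10^-3 per day implies a substantial control problem.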