Journal Article > Research | Full Text
Int J Epidemiol. 1996 August 1
Seaman J, Mercer A, Sondorp H
BACKGROUND: Although endemic in parts of southern Sudan, visceral leishmaniasis (VL) had not been reported in Western Upper Nile (WUN) until an epidemic was confirmed in 1989. A combination of circumstances created conditions for transmission among a population of mainly Nuer and Dinka people who had no immunity. The civil war which restarted in 1983 has been a major contributing cause and continues to hinder provision of treatment, data collection and control measures. METHODS: Since the first of three clinics to treat VL was established in WUN in 1989, data on the epidemic and mortality have been collected in seven retrospective surveys of villages and among patients. Adults were interviewed about surviving family members and those who had died since the epidemic began. Survey death rates are used here to estimate mortality from VL and 'excess mortality' above expected levels. RESULTS: The surveys found high mortality at all ages and suggest an overall death rate of 38-57% since the epidemic started in 1984, and up to 70% in the most affected areas. Both methods of estimation suggest that around 100,000 deaths, among about 280,000 people in the epidemic area, might be attributable to VL. CONCLUSIONS: This continuing epidemic has shown that VL can cause high mortality in an outbreak with astonishingly high infection rates. Population movement has been a major factor in transmission and poor nutritional status has probably contributed to the risk of clinical infection. Although over 17,000 people have been successfully treated for VL at the clinics in WUN, the disease is likely to become endemic there.
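A minimal sketch of the 'excess mortality' arithmetic described above, assuming a hypothetical baseline crude death rate and illustrative survey figures; none of the inputs below are the study's data.

    # Hedged sketch: estimating excess (epidemic-attributable) deaths from
    # survey death rates, in the spirit of the two approaches described above.
    # All inputs below are illustrative assumptions, not the study's data.

    population = 280_000            # approximate population of the epidemic area
    years_of_epidemic = 6           # illustrative duration since 1984
    baseline_cdr = 0.02             # assumed pre-epidemic crude death rate per person-year

    # Method 1: apply a survey-derived cumulative death proportion directly.
    survey_death_proportion = 0.45  # within the reported 38-57% range, illustrative
    deaths_observed = population * survey_death_proportion

    # Method 2: 'excess mortality' = observed deaths minus deaths expected
    # at the baseline rate over the same period.
    deaths_expected = population * baseline_cdr * years_of_epidemic
    excess_deaths = deaths_observed - deaths_expected

    print(f"observed ~{deaths_observed:,.0f}, expected ~{deaths_expected:,.0f}, "
          f"excess ~{excess_deaths:,.0f}")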
Journal Article > Review | Full Text
Int J Epidemiol. 2016 May 20; Volume 46 (Issue 2); e21; DOI:10.1093/ije/dyw057
Stinson K, Goemaere E, Coetzee D, van Cutsem G, Hilderbrand K, et al.
Journal Article > ResearchFull Text
Int J Epidemiol. 2016 June 24; DOI:10.1093/ije/dyw097
Schomaker M, Leroy V, Wolfs T, Technau KG, Renner L, et al.
Background: There is limited knowledge about the optimal timing of antiretroviral treatment initiation in older children and adolescents. Methods: A total of 20 576 antiretroviral treatment (ART)-naïve patients, aged 1-16 years at enrolment, from 19 cohorts in Europe, Southern Africa and West Africa, were included. We compared mortality and growth outcomes for different ART initiation criteria, aligned with previous and recent World Health Organization criteria, for 5 years of follow-up, adjusting for all measured baseline and time-dependent confounders using the g-formula. Results: Median (1st; 3rd quartile) CD4 count at baseline was 676 cells/mm3 (394; 1037) (children aged ≥ 1 and < 5 years), 373 (172; 630) (≥ 5 and < 10 years) and 238 (88; 425) (≥ 10 and < 16 years). There was a general trend towards lower mortality and better growth with earlier treatment initiation. In children < 10 years old at enrolment, by 5 years of follow-up there was lower mortality and a higher mean height-for-age z-score with immediate ART initiation versus delaying until CD4 count < 350 cells/mm3 (or CD4% < 15% or weight-for-age z-score < -2), with absolute differences in mortality and height-for-age z-score of 0.3% (95% confidence interval: 0.1%; 0.6%) and -0.08 (-0.09; -0.06) (≥ 1 and < 5 years), and 0.3% (0.04%; 0.5%) and -0.07 (-0.08; -0.05) (≥ 5 and < 10 years). In those aged ≥ 10 years at enrolment we did not find any difference in mortality or growth with immediate ART initiation, with estimated differences of -0.1% (-0.2%; 0.6%) and -0.03 (-0.05; 0.00), respectively. Growth differences in children aged < 10 years persisted for treatment thresholds using higher CD4 values. Regular follow-up led to better height and mortality outcomes. Conclusions: Immediate ART is associated with lower mortality and better growth for up to 5 years in children < 10 years old. Our results on adolescents were inconclusive.
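The strategy comparison above rests on the g-formula to adjust for measured confounding. Below is a minimal single-time-point sketch (standardisation); the study itself used a longitudinal g-formula handling time-dependent confounders over 5 years, and all variable names and the simulated data here are assumptions for illustration.

    # Hedged sketch of the g-formula (standardisation) for a point exposure.
    # The study used a longitudinal g-formula with time-dependent confounders;
    # this simplification and the simulated data are illustrative only.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 5_000
    cd4 = rng.normal(500, 200, n).clip(10, 1500)          # baseline CD4 (assumed)
    immediate = rng.binomial(1, 0.5, n)                   # 1 = immediate ART (assumed)
    p_death = 0.02 + 0.03 * (cd4 < 350) - 0.01 * immediate
    death = rng.binomial(1, p_death.clip(0.001, 0.999))
    df = pd.DataFrame({"death": death, "immediate": immediate, "cd4": cd4})

    # 1) Fit an outcome model conditional on treatment and confounders.
    m = smf.logit("death ~ immediate + cd4", data=df).fit(disp=0)

    # 2) Predict everyone's risk under each strategy and average (standardise).
    risk_immediate = m.predict(df.assign(immediate=1)).mean()
    risk_deferred = m.predict(df.assign(immediate=0)).mean()
    print(f"risk difference: {risk_immediate - risk_deferred:.4f}")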
Journal Article > ResearchFull Text
Int J Epidemiol. 2004 October 1; Volume 33 (Issue 5); DOI:10.1093/ije/dyh253
Zurovac D
BACKGROUND: When replacing failing drugs for malaria with more effective drugs, an important step towards reducing the malaria burden is that health workers (HW) prescribe drugs according to evidence-based guidelines. Past studies have shown that HW commonly do not follow guidelines, yet few studies have explored with appropriate methods why such practices occur. METHODS: We analysed data from a survey of government health facilities in four Kenyan districts in which HW consultations were observed, caretakers and HW were interviewed, and health facility assessments were performed. The analysis was limited to children 2-59 months old with uncomplicated malaria. Treatment was defined as recommended (antimalarial recommended by national guidelines), a minor error (effective, but non-recommended antimalarial), or inappropriate (no effective antimalarial). RESULTS: We evaluated 1006 consultations performed by 135 HW at 81 facilities: 567 children received recommended treatment, 314 had minor errors, and 125 received inappropriate treatment (weighted percentages: 56.9%, 30.4%, and 12.7%). Multivariate logistic regression analysis revealed that programmatic interventions such as in-service malaria training, provision of guidelines and wall charts, and more frequent supervision were significantly associated with better treatment quality. However, neither in-service training nor possession of the guideline document showed an effect by itself. More qualified HW made more errors: both major and minor errors (but generally more minor errors) when second-line drugs were in stock, and more major errors when second-line drugs were not in stock. Child factors such as age and a main complaint of fever were also associated with treatment quality. CONCLUSIONS: Our results support the use of several programmatic strategies that can redress HW deficiencies in malaria treatment. Targeted cost-effectiveness trials would help refine these strategies and provide more precise guidance on affordable and effective ways to strengthen and maintain HW practices.
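A minimal sketch of the multivariable logistic regression described in the methods above, modelling the odds of recommended treatment against programme factors; all variable names and data are assumptions, and the study's survey weighting and clustering are omitted here.

    # Hedged sketch: logistic regression of recommended malaria treatment on
    # programme factors. Variable names and simulated data are illustrative;
    # the study additionally accounted for survey weights and clustering.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 1_000
    df = pd.DataFrame({
        "trained":     rng.binomial(1, 0.5, n),   # in-service malaria training
        "guidelines":  rng.binomial(1, 0.6, n),   # guideline document available
        "wall_chart":  rng.binomial(1, 0.4, n),   # treatment wall chart displayed
        "supervision": rng.binomial(1, 0.3, n),   # supervised recently
    })
    logit_p = -0.5 + 0.4 * df.trained + 0.3 * df.wall_chart + 0.3 * df.supervision
    df["recommended"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    model = smf.logit("recommended ~ trained + guidelines + wall_chart + supervision",
                      data=df).fit(disp=0)
    print(np.exp(model.params))   # odds ratios for each programme factor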
Journal Article > ResearchFull Text
Int J Epidemiol. 1990 December 1
Porter JDH, Gastellu-Etchegorry M, Navarre I, Lungu G, Moren A
Between November 1988 and January 1989, measles outbreaks occurred in 11 Mozambican refugee camps in Malawi with five camps principally affected. A total of 1214 cases were reported. Despite the reduction of the age of measles vaccination to six months in 1987, attack rates were highest in children aged 6-9 months (10-26%); rates were also high in the 0-5 month age group (3-21%). The case-fatality rate was high among children less than five years old (15-21%). Children were being inappropriately vaccinated, either being vaccinated at less than six months of age (2-29%) or failing to receive a second dose if vaccinated at six months (0-25%). With vaccine coverage between 66-87%, vaccine efficacy in children less than five years old was estimated to be more than 90% in the camps principally affected. Reduction of the age of vaccination leads to logistical problems in vaccine delivery in refugee situations. These outbreaks again indicate the need to improve vaccine coverage with the existing Schwarz vaccine, and also highlight the urgent need for an effective single dose measles vaccine for children less than nine months of age.
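The efficacy figures above are derived from attack rates and vaccine coverage. Two standard field estimators consistent with those inputs are sketched below; the abstract does not state which estimator the authors used, and the numbers are illustrative.

    # Hedged sketch: two standard field estimators of measles vaccine efficacy.
    # Input values are illustrative, not the camps' data.

    def ve_from_attack_rates(ar_vaccinated: float, ar_unvaccinated: float) -> float:
        """Cohort estimator: VE = 1 - (attack rate in vaccinated / unvaccinated)."""
        return 1 - ar_vaccinated / ar_unvaccinated

    def ve_screening(ppv: float, pcv: float) -> float:
        """Screening method: VE from population vaccine coverage (PPV) and the
        proportion of cases vaccinated (PCV)."""
        return 1 - (pcv / (1 - pcv)) * ((1 - ppv) / ppv)

    print(ve_from_attack_rates(0.02, 0.25))   # illustrative: 92% efficacy
    print(ve_screening(ppv=0.80, pcv=0.25))   # illustrative: ~92% efficacy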
Journal Article > Review
Int J Epidemiol. 2009 February 2; Volume 38 (Issue 1); DOI:10.1093/ije/dyn224
Wolfson LJ, Grais RF, Luquero FJ, Birmingham ME, Strebel PM
BACKGROUND: Global deaths from measles have decreased notably in past decades, due to both increases in immunization rates and decreases in measles case fatality ratios (CFRs). While some aspects of the reduction in measles mortality can be monitored through increases in immunization coverage, estimating the level of measles deaths (in absolute terms) is problematic, particularly since incidence-based methods of estimation rely on accurate measures of measles CFRs. These ratios vary widely by geographic and epidemiologic context and even within the same community from year to year. METHODS: To understand better the variations in CFRs, we reviewed community-based studies published between 1980 and 2008 reporting age-specific measles CFRs. RESULTS: The results of the search consistently document that measles CFRs are highest in unvaccinated children under age 5 years and in outbreaks; the lowest CFRs occur in vaccinated children regardless of setting. The broad range of case and death definitions, study populations and geography highlights the complexities in extrapolating results for global public health planning. CONCLUSIONS: Values for measles CFRs remain imprecise, resulting in continued uncertainty about the actual toll measles exacts.
Journal Article > ResearchFull Text
Int J Epidemiol. 2000 October 1
Kaninda AV, Belanger F, Lewis R, Batchassi E, Aplogan A, et al.
BACKGROUND: Early outbreak detection is necessary for control of meningococcal meningitis epidemics. A weekly incidence of 15 cases per 100,000 inhabitants averaged over 2 consecutive weeks is recommended by the World Health Organization (WHO) for detection of meningitis epidemics in Africa. This and other thresholds are tested for ability to predict outbreaks and timeliness for control measures. METHODS: Meningitis cases recorded for 1990-1997 in health centres of northern Togo were reviewed. Weekly and annual incidences were determined for each district. Ability of different weekly incidence thresholds to detect outbreaks was assessed according to sensitivity, specificity, and positive and negative predictive values. The number of cases potentially prevented by reactive vaccination in 1997 was calculated for each threshold. RESULTS: Outbreaks occurred in 1995-1996 and in 1996-1997. The WHO-recommended threshold had good specificity but low sensitivity. Thresholds of 10 and 7 cases per 100,000 inhabitants in one week had sensitivity and specificity of 100% and increased the time available for intervention by more than one or two weeks, respectively. A maximum of 65% of cases could have been prevented during the 1997 epidemic, with up to 8% fewer cases prevented for each week of delay in achieving vaccine coverage. CONCLUSIONS: In northern Togo, thresholds of 7 or 10 cases per 100,000 inhabitants per week were excellent predictors of meningitis epidemics and allowed more time for a reactive vaccination strategy than current recommendations.
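A minimal sketch of scoring an alert threshold against retrospectively classified epidemic periods using sensitivity, specificity and predictive values; the weekly incidence series below is simulated, not the Togo surveillance data, and the week-level classification is an illustrative simplification.

    # Hedged sketch: evaluating an epidemic-alert threshold (cases per 100,000
    # per week) against weeks retrospectively classified as epidemic.
    # The weekly series below is simulated, not the Togo surveillance data.
    import numpy as np

    rng = np.random.default_rng(2)
    weeks = 8 * 52
    incidence = rng.gamma(shape=1.0, scale=2.0, size=weeks)       # endemic noise
    epidemic = np.zeros(weeks, dtype=bool)
    epidemic[260:290] = True                                      # one epidemic period
    incidence[epidemic] += rng.gamma(5.0, 6.0, epidemic.sum())    # epidemic signal

    def evaluate(threshold: float):
        alarm = incidence >= threshold
        tp = np.sum(alarm & epidemic)
        fp = np.sum(alarm & ~epidemic)
        fn = np.sum(~alarm & epidemic)
        tn = np.sum(~alarm & ~epidemic)
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp) if (tp + fp) else float("nan"),
            "npv": tn / (tn + fn) if (tn + fn) else float("nan"),
        }

    for t in (7, 10, 15):
        print(t, evaluate(t))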
Journal Article > ResearchFull Text
Int J Epidemiol. 2008 January 9; Volume 37 (Issue 2); DOI:10.1093/ije/dym275
Kolaczinski JH, Reithinger R, Worku DT, Ocheng A, Kasimiro J, et al.
BACKGROUND: In East Africa, visceral leishmaniasis (VL) is endemic in parts of Sudan, Ethiopia, Somalia, Kenya and Uganda. It is caused by Leishmania donovani and transmitted by the sandfly vector Phlebotomus martini. In the Pokot focus, reaching from western Kenya into eastern Uganda, formulation of a prevention strategy has been hindered by the lack of knowledge on VL risk factors as well as by lack of support from health sector donors. The present study was conducted to establish the necessary evidence base and to stimulate interest in supporting the control of this neglected tropical disease in Uganda and Kenya. METHODS: A case-control study was carried out from June to December 2006. Cases were recruited at Amudat hospital, Nakapiripirit district, Uganda, after clinical and parasitological confirmation of symptomatic VL infection. Controls were individuals who tested negative on an rK39 antigen-based dipstick and were recruited at random from the same communities as the cases. Data were analysed using conditional logistic regression. RESULTS: Ninety-three cases and 226 controls were recruited into the study. Multivariate analysis identified low socio-economic status and treating livestock with insecticide as risk factors for VL. Sleeping near animals, owning a mosquito net and knowing about VL symptoms were associated with a reduced risk of VL. CONCLUSIONS: VL affects the poorest of the poor of the Pokot tribe. Distribution of insecticide-treated mosquito nets combined with dissemination of culturally appropriate behaviour-change education is likely to be an effective prevention strategy.
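A minimal sketch of the conditional logistic regression named in the methods above, with cases and community controls grouped into matched strata; all variable names and the simulated data are assumptions for illustration.

    # Hedged sketch: conditional logistic regression for matched case-control
    # data. Variable names and simulated data are illustrative only.
    import numpy as np
    import pandas as pd
    from statsmodels.discrete.conditional_models import ConditionalLogit

    rng = np.random.default_rng(3)
    n_strata, per_stratum = 90, 4                 # 1 case + 3 controls per matched set
    n = n_strata * per_stratum
    case = (np.arange(n) % per_stratum == 0).astype(int)
    # Exposure prevalence differs between cases and controls (illustrative effects).
    low_ses = rng.binomial(1, np.where(case == 1, 0.7, 0.4))    # low socio-economic status
    owns_net = rng.binomial(1, np.where(case == 1, 0.3, 0.5))   # owns a mosquito net
    df = pd.DataFrame({
        "stratum": np.repeat(np.arange(n_strata), per_stratum),
        "case": case,
        "low_ses": low_ses,
        "owns_net": owns_net,
    })

    model = ConditionalLogit(df["case"], df[["low_ses", "owns_net"]],
                             groups=df["stratum"]).fit()
    print(np.exp(model.params))   # within-stratum odds ratios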
Journal Article > Commentary | Full Text
Int J Epidemiol. 2012 April 1; Volume 41 (Issue 2); DOI:10.1093/ije/dys032
Ford NP