Journal Article > Research | Full Text
Advances in Medical Education and Practice. 2022 June 6; Volume 13; 595-607.; DOI: 10.2147/AMEP.S358702
Owolabi JO, Ojiambo R, Seifu D, Nishimwe A, Masimbi O, et al.
BACKGROUND
This article presents a qualitative study of African anatomists and anatomy teachers on the Anatomage Table, a modern medical education technology, as an indicator of African medical and anatomy educators' acceptance of EdTech. The Anatomage Table is used for digital dissection, prosection, functional anatomy demonstration, virtual simulation of certain functions, and as an interactive digital teaching aid.
MATERIALS AND METHODS
Anatomy teachers [n=79] from 11 representative African countries (Ghana, Nigeria [West Africa], Ethiopia, Kenya, Rwanda [East Africa], Namibia [South Africa], Zambia [Southern Africa], Egypt [North Africa], and Sudan [Central Africa]) participated in this study. Focus group discussions [FGDs] were set up to obtain qualitative information from stakeholders from representative institutions. In addition, based on the set criteria, selected education leaders and stakeholders in representative institutions participated in in-depth interviews [IDIs]. The interviews explored critical issues concerning their perceptions about the acceptance, adoption, and integration of educational technology, specifically the Anatomage Table, into the teaching of anatomy and related medical sciences on the African continent. Recorded interviews were transcribed and analyzed using the Dedoose software.
RESULTS
African anatomists are generally technology inclined and in favor of EdTech. The most recurring opinion was that the Anatomage Table could only be a "complementary teaching tool to cadavers" and that it "can't replace the real-life experience of cadavers." In particular, respondents from user institutions opined that it "complements the traditional cadaver-based approaches" to anatomy learning and inquiry, including being a good "complement for cadaveric skill lab" sessions. Compared with traditional cadaveric dissection, a majority also considered it less problematic regarding cultural acceptability and health and safety concerns. The lifelikeness of the 3D representation is a major factor driving acceptability.
Journal Article > Research | Full Text
Int J Infect Dis. 2022 September 1; Volume 122; 215-221.; DOI:10.1016/j.ijid.2022.05.039
Zheng Q, Luquero FJ, Ciglenecki I, Wamala JF, Abubakar A, et al.
BACKGROUND
Cholera remains a public health threat but is inequitably distributed across sub-Saharan Africa. Lack of standardized reporting and inconsistent outbreak definitions limit our understanding of cholera outbreak epidemiology.
METHODS
From a database of cholera incidence and mortality, we extracted data from sub-Saharan Africa and reconstructed suspected cholera outbreaks occurring between January 2010 and December 2019, based on location-specific average weekly incidence rate thresholds. We then described the distribution of key outbreak metrics.
RESULTS
We identified 999 suspected cholera outbreaks in 744 regions across 25 sub-Saharan African countries. The outbreak periods accounted for 1.8 billion person-months (2% of the total during this period) from January 2010 to January 2020. Among 692 outbreaks reported from second-level administrative units (e.g., districts), the median attack rate was 0.8 per 1000 people (interquartile range (IQR), 0.3-2.4 per 1000), the median epidemic duration was 13 weeks (IQR, 8-19), and the median early outbreak reproductive number was 1.8 (range, 1.1-3.5). Larger attack rates were associated with longer times to outbreak peak, longer epidemic durations, and lower case fatality risks.
CONCLUSIONS
This study provides a baseline from which the progress toward cholera control and essential statistics to inform outbreak management in sub-Saharan Africa can be monitored.
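The reconstruction step described in the Methods (grouping consecutive weeks whose incidence rate exceeds a location-specific threshold into outbreak episodes) can be sketched as follows. The weekly counts, population, and threshold here are hypothetical, and the paper's exact rules (for example, how gaps between above-threshold weeks are handled) may differ:

```python
# Sketch of threshold-based outbreak reconstruction from weekly case counts.
# All inputs are hypothetical; this illustrates the general approach only.

def reconstruct_outbreaks(weekly_cases, population, threshold_per_1000):
    """Group consecutive weeks whose weekly incidence rate exceeds the
    location-specific threshold into outbreak episodes."""
    outbreaks, current = [], None
    for week, cases in enumerate(weekly_cases):
        rate = cases / population * 1000  # weekly incidence per 1000 people
        if rate > threshold_per_1000:
            if current is None:
                current = {"start": week, "cases": 0}
            current["cases"] += cases
        elif current is not None:
            # Episode ends at the first below-threshold week.
            current["duration_weeks"] = week - current["start"]
            current["attack_rate_per_1000"] = current["cases"] / population * 1000
            outbreaks.append(current)
            current = None
    if current is not None:  # episode still open at end of series
        current["duration_weeks"] = len(weekly_cases) - current["start"]
        current["attack_rate_per_1000"] = current["cases"] / population * 1000
        outbreaks.append(current)
    return outbreaks

weekly = [0, 1, 0, 12, 30, 25, 9, 2, 0, 0]  # hypothetical district counts
episodes = reconstruct_outbreaks(weekly, population=100_000,
                                 threshold_per_1000=0.05)
```

With these made-up numbers, the four consecutive above-threshold weeks form a single episode with an attack rate of 0.76 per 1000, illustrating how the paper's per-outbreak metrics (attack rate, duration) fall out of the reconstruction.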
Journal Article > Research | Full Text
Epidemiol Infect. 2020 March 13; Volume 148; DOI:10.1017/S095026882000062X
Ferreras E, Blake A, Chewe O, Mwaba J, Zulu G, et al.
We conducted a matched case-control (MCC), test-negative case-control (TNCC) and case-cohort study in 2016 in Lusaka, Zambia, following a mass vaccination campaign. Confirmed cholera cases served as cases in all three study designs. In the TNCC, control subjects were suspected cases with negative cholera culture and polymerase chain reaction results. Controls matched by age and sex were selected among neighbours of the confirmed cases in the MCC study. For the case-cohort study, we recruited a cohort of randomly selected individuals living in areas considered at risk of cholera. We recruited 211 suspected cases (66 confirmed cholera cases and 145 non-cholera diarrhoea cases), 1055 matched controls and a cohort of 921. Adjusted vaccine effectiveness of one dose of oral cholera vaccine (OCV) was 88.9% (95% confidence interval (CI) 42.7–97.8) in the MCC study, 80.2% (95% CI: 16.9–95.3) in the TNCC design and 89.4% (95% CI: 64.6–96.9) in the case-cohort study. All three study designs confirmed the short-term effectiveness of single-dose OCV. Major healthcare-seeking behaviour bias did not appear to affect our estimates. Most of the protection among vaccinated individuals could be attributed to the direct effect of the vaccine.
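In case-control designs like these, vaccine effectiveness is conventionally estimated from the odds ratio as VE = (1 − OR) × 100. A minimal sketch, using illustrative counts that are not the study's data:

```python
# Vaccine effectiveness from a case-control odds ratio: VE = (1 - OR) * 100.
# The counts below are illustrative only, not taken from the Lusaka study.

def odds_ratio(cases_vacc, cases_unvacc, controls_vacc, controls_unvacc):
    """Unadjusted odds ratio from a 2x2 vaccinated/unvaccinated table."""
    return (cases_vacc / cases_unvacc) / (controls_vacc / controls_unvacc)

def vaccine_effectiveness(or_):
    """Percent vaccine effectiveness implied by an odds ratio."""
    return (1 - or_) * 100

or_ = odds_ratio(cases_vacc=5, cases_unvacc=45,
                 controls_vacc=250, controls_unvacc=250)
ve = vaccine_effectiveness(or_)  # ~88.9% with these illustrative counts
```

The study's adjusted estimates additionally account for matching and covariates (e.g., via conditional logistic regression), which this unadjusted sketch omits.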
Journal Article > Research | Full Text
PLOS One. 2013 February 28; Volume 8 (Issue 2); e57611.; DOI:10.1371/journal.pone.0057611
Estill J, Egger M, Johnson LF, Gsponer T, Wandeler G, et al.
OBJECTIVES
Mortality in patients starting antiretroviral therapy (ART) is higher in Malawi and Zambia than in South Africa. We examined whether different monitoring of ART (viral load [VL] in South Africa and CD4 count in Malawi and Zambia) could explain this mortality difference.
DESIGN
Mathematical modelling study based on data from ART programmes.
METHODS
We used a stochastic simulation model to study the effect of VL monitoring on mortality over 5 years. In baseline scenario A all parameters were identical between strategies except for more timely and complete detection of treatment failure with VL monitoring. Additional scenarios introduced delays in switching to second-line ART (scenario B) or higher virologic failure rates (due to worse adherence) when monitoring was based on CD4 counts only (scenario C). Results are presented as relative risks (RR) with 95% prediction intervals and percent of observed mortality difference explained.
RESULTS
RRs comparing VL with CD4 cell count monitoring were 0.94 (0.74-1.03) in scenario A, 0.94 (0.77-1.02) with delayed switching (scenario B) and 0.80 (0.44-1.07) when assuming a 3-times higher rate of failure (scenario C). The observed mortality at 3 years was 10.9% in Malawi and Zambia and 8.6% in South Africa (absolute difference 2.3%). The percentage of the mortality difference explained by VL monitoring ranged from 4% (scenario A) to 32% (scenarios B and C combined, assuming a 3-times higher failure rate). Eleven percent was explained by non-HIV related mortality.
CONCLUSIONS
VL monitoring reduces mortality moderately when assuming improved adherence and decreased failure rates.
Journal Article > Meta-Analysis | Full Text
PLOS One. 2013 July 22; Volume 8 (Issue 7); e68995.; DOI:10.1371/journal.pone.0068995
Pillay P, Ford NP, Shubber Z, Ferrand RA
INTRODUCTION
There is conflicting evidence and practice regarding the use of the non-nucleoside reverse transcriptase inhibitors (NNRTI) efavirenz (EFV) and nevirapine (NVP) in first-line antiretroviral therapy (ART).
METHODS
We systematically reviewed virological outcomes in HIV-1-infected, treatment-naive patients on regimens containing EFV versus NVP from randomised trials and observational cohort studies. Data sources included PubMed, Embase, the Cochrane Central Register of Controlled Trials, and conference proceedings of the International AIDS Society and the Conference on Retroviruses and Opportunistic Infections, from 1996 to May 2013. Relative risks (RR) and 95% confidence intervals were synthesized using random-effects meta-analysis. Heterogeneity was assessed using the I² statistic, and subgroup analyses were performed to assess the potential influence of study design, duration of follow-up, location, and tuberculosis treatment. Sensitivity analyses explored the potential influence of different dosages of NVP and different viral load thresholds.
RESULTS
Of 5011 citations retrieved, 38 reports of studies comprising 114 391 patients were included for review. EFV was significantly less likely than NVP to lead to virologic failure in both trials (RR 0.85 [0.73-0.99], I² = 0%) and observational studies (RR 0.65 [0.59-0.71], I² = 54%). EFV was also more likely than NVP to achieve virologic success, though only marginally significantly, in both randomised controlled trials (RR 1.04 [1.00-1.08], I² = 0%) and observational studies (RR 1.06 [1.00-1.12], I² = 68%).
CONCLUSION
EFV-based first-line ART is significantly less likely to lead to virologic failure than NVP-based ART. This finding supports the use of EFV as the preferred NNRTI in first-line treatment regimens for HIV, particularly in resource-limited settings.
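The pooling step named in the Methods, random-effects meta-analysis of relative risks with an I² heterogeneity statistic, is commonly implemented with the DerSimonian-Laird estimator. A self-contained sketch with synthetic study values (not the included studies' data):

```python
import math

# DerSimonian-Laird random-effects pooling of log relative risks.
# Study inputs below are synthetic, for illustration only.

def dersimonian_laird(log_rr, se):
    """Pool log-RRs; return (pooled RR, 95% CI, I-squared percent)."""
    w = [1 / s**2 for s in se]                         # inverse-variance weights
    sw = sum(w)
    mu_fe = sum(wi * y for wi, y in zip(w, log_rr)) / sw
    q = sum(wi * (y - mu_fe)**2 for wi, y in zip(w, log_rr))  # Cochran's Q
    df = len(log_rr) - 1
    c = sw - sum(wi**2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)                      # between-study variance
    w_re = [1 / (s**2 + tau2) for s in se]             # random-effects weights
    mu = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
    se_mu = math.sqrt(1 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    rr = math.exp(mu)
    ci = (math.exp(mu - 1.96 * se_mu), math.exp(mu + 1.96 * se_mu))
    return rr, ci, i2

rr, (lo, hi), i2 = dersimonian_laird(
    log_rr=[math.log(0.80), math.log(0.90), math.log(0.85)],  # synthetic
    se=[0.10, 0.08, 0.12],
)
```

When between-study heterogeneity is negligible (I² near 0%, as in the trials stratum above), the random-effects estimate collapses to the fixed-effect one.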
Journal Article > Research | Full Text
BMJ Open. 2018 January 11; Volume 8 (Issue 1); DOI:10.1136/bmjopen-2017-017405
Ballif M, Zurcher K, Reid SE, Boulle AM, Fox MP, et al.
Seasonal variations in tuberculosis diagnoses have been attributed to seasonal climatic changes and indoor crowding during colder winter months. We investigated trends in pulmonary tuberculosis (PTB) diagnosis at antiretroviral therapy (ART) programmes in Southern Africa.
Journal Article > Research | Full Text
AIDS. 2019 August 1; Volume 33 (Issue 10); 1635-1644.; DOI:10.1097/QAD.0000000000002234
Shroufi A, van Cutsem G, Cambiano V, Bansi-Matharu L, Duncan K, et al.
BACKGROUND
Many individuals failing first-line antiretroviral therapy (ART) in sub-Saharan Africa never initiate second-line ART or do so after significant delay. For people on ART with a viral load more than 1000 copies/ml, the WHO recommends a second viral load measurement 3 months after the first viral load and enhanced adherence support. Switch to a second-line regimen is contingent upon a persistently elevated viral load more than 1000 copies/ml. Delayed second-line switch places patients at increased risk for opportunistic infections and mortality.
METHODS
To assess the potential benefits of a simplified second-line ART switch strategy, we used an individual-based model of HIV transmission, progression and the effect of ART, incorporating adherence and drug resistance, to compare the predicted outcomes of two policies defining first-line regimen failure for patients on efavirenz-based ART: either two consecutive viral load values more than 1000 copies/ml, with the second after an enhanced adherence intervention (implemented as per current WHO guidelines), or a single viral load value more than 1000 copies/ml. We simulated a range of setting-scenarios reflecting the breadth of the sub-Saharan African HIV epidemic, taking into account potential delays in defining failure and switching to second-line ART.
FINDINGS
The use of a single viral load more than 1000 copies/ml to define ART failure would lead to a higher proportion of persons with nonnucleoside reverse-transcriptase inhibitor resistance switched to second-line ART [65 vs. 48%; difference 17% (90% range 14-20%)], resulting in a median 18% reduction in the rate of AIDS-related death over setting scenarios (90% range 6-30%; from a median of 3.1 to 2.5 per 100 person-years) over 3 years. The simplified strategy also is predicted to reduce the rate of AIDS conditions by a median of 31% (90% range 8-49%) among people on first-line ART with a viral load more than 1000 copies/ml in the past 6 months. For a country of 10 million adults (and a median of 880 000 people with HIV), we estimate that this approach would lead to a median of 1322 (90% range 67-3513) AIDS deaths averted per year over 3 years. For South Africa this would represent around 10 215 deaths averted annually.
INTERPRETATION
As a step towards reducing unnecessary mortality associated with delayed second-line ART switch, defining failure of first-line efavirenz-based regimens as a single viral load more than 1000 copies/ml should be considered.
Journal Article > Research | Full Text
Trop Med Int Health. 2018 May 31; Volume 23 (Issue 8); 834-840.; DOI:10.1111/tmi.13084
Mwaba J, Ferreras E, Chizema Kawesha E, Mwimbe D, Tafirenyika F, et al.
OBJECTIVE
To assess the performance of the SD Bioline Cholera Ag O1/O139 rapid diagnostic test (RDT) compared to a reference standard combining culture and PCR for the diagnosis of cholera cases during an outbreak.
METHODS
RDT and bacterial culture were performed on site using fresh stools collected from suspected cholera cases, and from stools enriched in alkaline peptone water. Dried stool samples on filter paper were tested for V. cholerae by PCR in Lusaka (as part of a laboratory technology transfer project) and at a reference laboratory in Paris, France. A sample was considered positive for cholera by the reference standard if any of the culture or PCR tests was positive for V. cholerae O1 or O139.
RESULTS
Among the 170 samples tested with SD Bioline and compared to the reference standard, the RDT showed a sensitivity of 90.9% (95% CI: 81.3-96.6) and specificity of 95.2% (95% CI: 89.1-98.4). After enrichment, the sensitivity was 95.5% (95% CI: 87.3-99.1) and specificity 100% (95% CI: 96.5-100).
CONCLUSION
The observed sensitivity and specificity were within recommendations set by the Global Task Force for Cholera Control on the use of cholera RDT (sensitivity = 90%; specificity = 85%). Although the sample size was small, our findings suggest that the SD Bioline RDT could be used in the field to rapidly alert public health officials to the likely presence of cholera cases when an outbreak is suspected.
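Sensitivity and specificity here are simple proportions from the 2x2 table against the reference standard. The counts below are a plausible reconstruction consistent with the reported percentages (66 reference-positive and 104 reference-negative samples of the 170), not figures taken from the paper:

```python
# Sensitivity and specificity from a 2x2 diagnostic table.
# tp/fn/tn/fp counts are a reconstruction for illustration, not study data.

def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)  # true positives / all reference-positive
    specificity = tn / (tn + fp)  # true negatives / all reference-negative
    return sensitivity, specificity

sens, spec = sens_spec(tp=60, fn=6, tn=99, fp=5)
# round(100 * sens, 1) -> 90.9 ; round(100 * spec, 1) -> 95.2
```

The confidence intervals quoted in the abstract would come from a binomial interval (e.g., Clopper-Pearson) around these proportions.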
Journal Article > Research | Full Text
Pediatrics. 2012 September 17; Volume 130 (Issue 4); e966-e977.; DOI:10.1542/peds.2011-3020
Gsponer T, Weigel R, Davies MA, Bolton C, Moultrie H, et al.
BACKGROUND
Poor growth is an indication for antiretroviral therapy (ART) and a criterion for treatment failure. We examined variability in growth response to ART in 12 programs in Malawi, Zambia, Zimbabwe, Mozambique, and South Africa.
METHODS
Treatment-naïve children aged <10 years were included. We calculated weight-for-age z scores (WAZs), height-for-age z scores (HAZs), and weight-for-height z scores (WHZs) up to 3 years after starting ART, using the World Health Organization standards. Multilevel regression models were used.
RESULTS
A total of 17 990 children (range, 238–8975) were followed for 36 181 person-years. At ART initiation, most children were underweight (50%) and stunted (66%). Lower baseline WAZ, HAZ, and WHZ were the most important determinants of faster catch-up growth on ART. WAZ and WHZ increased rapidly in the first year and stagnated or reversed thereafter, whereas HAZ increased continuously over time. Three years after starting ART, WAZ ranged from −2.80 (95% confidence interval [CI]: −3.66 to −2.02) to −1.98 (95% CI: −2.41 to −1.48) in children with a baseline z score < −3 and from −0.79 (95% CI: −1.62 to 0.02) to 0.05 (95% CI: −0.42 to 0.51) in children with a baseline WAZ ≥ −1. For HAZ, the corresponding range was −2.33 (95% CI: −2.62 to −2.02) to −1.27 (95% CI: −1.58 to −1.00) for baseline HAZ < −3 and −0.24 (95% CI: −0.56 to 0.15) to 0.84 (95% CI: 0.53 to 1.16) for HAZ ≥ −1.
CONCLUSIONS
Despite a sustained growth response and catch-up growth in children with advanced HIV disease treated with ART, normal weights and heights are not achieved over 3 years of ART.
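The WHO standards referenced in the Methods compute these z scores with the LMS method: z = ((X/M)^L − 1)/(L·S), or ln(X/M)/S when L = 0. A minimal sketch using hypothetical L, M, S reference values (real values come from the WHO tables, and WHO applies an extra adjustment for |z| > 3 that is omitted here):

```python
import math

# Anthropometric z score via the LMS method used by the WHO growth standards.
# The L, M, S values below are hypothetical, NOT taken from WHO reference
# tables; WHO's adjustment for extreme z scores (|z| > 3) is omitted.

def lms_zscore(x, L, M, S):
    """z = ((x/M)**L - 1) / (L*S), with the log form when L == 0."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# Hypothetical child weight (kg) against hypothetical reference parameters.
z = lms_zscore(x=7.2, L=0.25, M=9.0, S=0.11)
```

With these made-up parameters the child sits roughly two standard deviations below the reference median, the kind of deficit (WAZ near −2) common at ART initiation in this cohort.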
Journal Article > Research | Full Text
Bull World Health Organ. 2017 October 19; Volume 96 (Issue 2); 86-93.; DOI:10.2471/BLT.16.189241
Poncin M, Zulu G, Voûte C, Ferreras E, Muleya CM, et al.
OBJECTIVE
To describe the implementation and feasibility of an innovative mass vaccination strategy - based on single-dose oral cholera vaccine - to curb a cholera epidemic in a large urban setting.
METHOD
In April 2016, in the early stages of a cholera outbreak in Lusaka, Zambia, the health ministry collaborated with Médecins Sans Frontières and the World Health Organization in organizing a mass vaccination campaign, based on single-dose oral cholera vaccine. Over a period of 17 days, partners mobilized 1700 health ministry staff and community volunteers for community sensitization, social mobilization and vaccination activities in 10 townships. On each day, doses of vaccine were delivered to vaccination sites and administrative coverage was estimated.
FINDINGS
Overall, vaccination teams administered 424 100 doses of vaccine to an estimated target population of 578 043, resulting in an estimated administrative coverage of 73.4%. After the campaign, few cholera cases were reported and there was no evidence of the disease spreading within the vaccinated areas. The total cost of the campaign - 2.31 United States dollars (US$) per dose - included the relatively low cost of local delivery - US$ 0.41 per dose.
CONCLUSION
We found that an early, large-scale, targeted reactive campaign using a single-dose oral vaccine, organized in response to a cholera epidemic within a large city, was feasible and appeared effective. While cholera vaccines remain in short supply, maximizing the number of people vaccinated in response to a cholera epidemic, by using just one dose per member of an at-risk community, should be considered.
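Administrative coverage is simply doses administered divided by the estimated target population; a quick check of the figures reported above:

```python
# Administrative coverage check using the figures from the abstract.
doses_administered = 424_100
target_population = 578_043
coverage = doses_administered / target_population * 100  # ~73.4%

# Total campaign cost implied by the reported per-dose cost (derived here
# for illustration; the paper reports the per-dose figure, not this total).
cost_per_dose_usd = 2.31
total_cost_usd = doses_administered * cost_per_dose_usd
```

Note that administrative coverage counts doses delivered rather than individuals surveyed, so it can differ from survey-based coverage estimates.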