Tropical Medicine publications 2009
BACKGROUND: Control measures which reduce individual exposure to malaria are expected to reduce disease, but also, eventually, to reduce immunity. Reassuringly, long-term data following community-wide ITN distribution show sustained benefits at a population level. However, the more common practice in sub-Saharan Africa is to target ITN distribution to young children. There are few data on the long-term outcomes of this practice. METHODOLOGY/PRINCIPAL FINDINGS: Episodes of febrile malaria were identified by active surveillance in 383 children over 18 months of follow-up. In order to compare the short- and long-term outcomes of ITN use, we examined interactions between ITN use and age (12-42 months of age versus 42-80 months) in determining the risk of febrile malaria. ITN use and older age protected against the first or only episode of malaria (Hazard Ratio [HR] = 0.33, 95%CI 0.17-0.65 and HR = 0.30, 95%CI 0.17-0.51, respectively). The interaction term between ITN use and older age was HR = 2.91, 95%CI 1.02-8.3, p = 0.045, indicating that ITNs did not protect older children. When multiple episodes were included in the analysis, ITN use and older age were again protective against malaria episodes (Incidence Rate Ratio [IRR] = 0.43, 95%CI 0.27-0.7 and IRR = 0.23, 95%CI 0.13-0.42, respectively), and the interaction term again indicated that ITNs did not protect older children (IRR = 2.71, 95%CI 1.3-5.7, p = 0.008). CONCLUSIONS/SIGNIFICANCE: These data on age interactions with ITN use suggest that larger-scale studies of long-term individual outcomes should be undertaken if the policy of targeting ITNs at vulnerable groups is to continue.
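As a quick arithmetic check on the interaction reported above, the net hazard ratio for ITN use among older children is the product of the ITN main effect and the interaction term. A minimal sketch in Python, using the point estimates quoted in the abstract; this illustrates how interaction terms combine and is not part of the study's analysis:

```python
# Net effect of ITN use in older children: the main-effect hazard ratio (HR)
# for ITN use multiplied by the ITN-by-older-age interaction HR.
hr_itn = 0.33          # ITN use (main effect, younger age stratum)
hr_interaction = 2.91  # ITN use x older age interaction term

hr_itn_older = hr_itn * hr_interaction
print(round(hr_itn_older, 2))  # ~0.96, i.e. essentially no protection in older children
```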
BACKGROUND: The T-cell mediated immune response plays a central role in the control of malaria after natural infection or vaccination. There is increasing evidence that T-cell responses are heterogeneous and that both the quality of the immune response and the balance between pro-inflammatory and regulatory T-cells determine the outcome of an infection. As malaria parasites have been shown to induce immunosuppressive responses to the parasite and to unrelated antigens, this study examined T-cell mediated pro-inflammatory and regulatory immune responses induced by malaria vaccination in children in an endemic area, to determine whether these responses were associated with vaccine immunogenicity. METHODS: Using real-time RT-PCR, we profiled the expression of a panel of key markers of immunogenicity at different time points after vaccination with two viral vector vaccines expressing the malaria TRAP antigen (FP9-TRAP and MVA-TRAP) or following rabies vaccination as a control. PRINCIPAL FINDINGS: The vaccine induced modest levels of IFN-gamma mRNA one week after vaccination. There was also an increase in FoxP3 mRNA expression in both TRAP-stimulated and media-stimulated cells in the FFM ME-TRAP vaccine group; however, this may have been driven by natural exposure to the parasite rather than by vaccination. CONCLUSION: Quantitative PCR is a useful method for evaluating vaccine-induced cell mediated immune responses in frozen PBMC from children in a malaria-endemic country. Future studies should seek to use vaccine vectors that increase the magnitude and quality of the IFN-gamma immune response in naturally exposed populations, and should monitor the induction of a regulatory T-cell response.
Plasmodium falciparum erythrocyte membrane protein 1 (PfEMP1) molecules are a potentially important family of immune targets, which play a central role in the host-parasite interaction by binding to various host molecules. They are encoded by a diverse family of genes called var, of which there are approximately 60 copies in each parasite genome. In sub-Saharan Africa, although P. falciparum infection occurs throughout life, severe malarial disease tends to occur only in childhood. This could potentially be explained if (i) PfEMP1 variants differ in their capacity to support pathogenesis of severe malaria and (ii) this capacity is linked to the likelihood of each molecule being recognized and cleared by naturally acquired antibodies. Here, in a study of 217 Kenyan children with malaria, we show that expression of a group of var genes, "cys2," containing a distinct pattern of cysteine residues, is associated with low host immunity. Expression of cys2 genes was associated with parasites from young children, those with severe malaria, and those with a poorly developed antibody response to parasite-infected erythrocyte surface antigens. cys2 var genes form a minor component of all genomic var repertoires analyzed to date. Therefore, the results are compatible with the hypothesis that the genomic var gene repertoire is organized such that the PfEMP1 molecules that confer the most virulence on the parasite tend also to be those that are most susceptible to the development of host immunity. This may help the parasite to adapt effectively to the development of host antibodies through modification of the host-parasite relationship.
MODS (microscopic-observation drug-susceptibility) is a novel liquid culture-based technique that has been shown to be effective and rapid for early diagnosis of tuberculosis (TB). We evaluated the MODS assay for diagnosis of TB in children in Viet Nam. 217 consecutive samples, including sputum (n = 132), gastric fluid (n = 50), CSF (n = 32) and pleural fluid (n = 3), collected from 96 children with suspected TB, were tested by smear, MODS and MGIT. When test results were aggregated by patient, the sensitivity and specificity of smear, MGIT and MODS against "clinical diagnosis" (confirmed and probable groups) as the gold standard were 28.2% and 100%, 42.3% and 100%, and 39.7% and 94.4%, respectively. The sensitivity of MGIT and MODS was not significantly different in this analysis (P = 0.5), but MGIT was more sensitive than MODS when analysed at the sample level using a marginal model (P = 0.03). The median times to detection for MODS and MGIT were 8 days and 13 days, respectively, and the time to detection was significantly shorter for MODS in samples where both tests were positive (P < 0.001). An analysis of time-dependent sensitivity showed that the detection rates were significantly higher for MODS than for MGIT by day 7 and by day 14 (P < 0.001 and P = 0.04, respectively). MODS is a rapid and sensitive alternative method for the isolation of M. tuberculosis from children.
Dengue hemorrhagic fever can occur in primary dengue virus (DENV) infection of infants. The decay of maternally derived DENV immunoglobulin (Ig) G and the incidence of DENV infection were determined in a prospectively studied cohort of 1244 Vietnamese infants. Higher concentrations of total IgG and DENV-reactive IgG were found in cord plasma relative to maternal plasma. Maternally derived DENV-neutralizing and E protein-reactive IgG titers declined to below measurable levels in >90% of infants by 6 months of age. In contrast, IgG reactive with whole DENV virions persisted until 12 months of age in 20% of infants. Serological surveillance identified 10 infants with asymptomatic DENV infection, for an incidence of 1.7 cases per 100 person-years. DENV-neutralizing antibodies remained measurable for ≥1 year after infection. These results suggest that whereas DENV infection in infants is frequently subclinical, there is a window between 4 and 12 months of age where virion-binding but nonneutralizing IgG could facilitate antibody-dependent enhancement.
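The incidence figure above is simply cases divided by person-time at risk. A minimal sketch of the calculation; the person-years value below is a hypothetical round number chosen to reproduce roughly 1.7 per 100 person-years, since the abstract does not report person-time directly:

```python
def incidence_per_100py(cases: int, person_years: float) -> float:
    """Incidence rate expressed per 100 person-years of follow-up."""
    return 100.0 * cases / person_years

# Hypothetical illustration: 10 asymptomatic infections over ~588 person-years
# of infant follow-up gives roughly the 1.7 per 100 person-years quoted above.
print(round(incidence_per_100py(10, 588), 1))
```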
BACKGROUND: Shigellosis remains a considerable public health problem in some developing countries. The nature of Shigellae suggests that they are highly adaptable when placed under selective pressure in a human population. This is demonstrated by variation and fluctuations in the serotypes and antimicrobial resistance profiles of organisms circulating in differing settings in endemic locations. Antimicrobial resistance in the genus Shigella is a constant threat, with reports of organisms in Asia being resistant to multiple antimicrobials and new-generation therapies. METHODS: Here we compare microbiological, clinical and epidemiological data from patients with shigellosis over three different periods in southern Vietnam spanning 14 years. RESULTS: Our data demonstrate a shift in the dominant infecting species (S. flexneri to S. sonnei) and in the resistance profile of the organisms circulating in southern Vietnam. We find that there was no significant variation in the syndromes associated with either S. sonnei or S. flexneri, yet the clinical features of the disease were more severe in the later observations. CONCLUSIONS: Our findings show a change in the clinical presentation of shigellosis in this setting, with disease that may now be more pronounced, concurrent with a change in antimicrobial resistance profile. These data highlight the socio-economic development of southern Vietnam and should guide future vaccine development and deployment strategies. TRIAL REGISTRATION: Current Controlled Trials ISRCTN55945881.
We used microarrays and transcriptional profiling of peripheral blood to investigate the host response of 29 individuals who contracted typhoid fever in the Mekong Delta region of Vietnam. Samples were taken over a nine-month period encompassing acute disease, convalescence, and recovery. We found that typhoid fever induced a distinct and highly reproducible signature in the peripheral blood that changed during treatment and convalescence, returning in the majority of cases to the "normal" profile measured in healthy uninfected controls. Unexpectedly, there was a strong, distinct signature of convalescence present at day 9 after infection that remained virtually unchanged one month after acute infection and in some cases persisted for as long as nine months, despite a complete clinical recovery in all patients. Patients who retain the convalescent signature may be genetically or temporarily incapable of developing an effective immune response and may be more susceptible to reinfection, relapse, or the establishment of a carrier state.
The PCR primers commonly used to detect Plasmodium knowlesi infections in humans were found to cross-react stochastically with P. vivax genomic DNA. A nested primer set that targets one of the P. knowlesi small-subunit rRNA genes was validated for specificity and for sensitivity of detection of <10 parasite genomes.
Antimicrobial-resistant pathogenic members of the Enterobacteriaceae are a well-defined global problem. We hypothesized that one of the main reservoirs of dissemination of antimicrobial resistance genes in Vietnam is non-pathogenic intestinal flora, and sought to isolate antimicrobial-resistant organisms from hospitalized patients and non-hospitalized healthy individuals in Ho Chi Minh City. The results identified substantial faecal carriage of gentamicin-, ceftazidime- and nalidixic acid-resistant members of the Enterobacteriaceae in both hospitalized patients and non-hospitalized healthy individuals. A high prevalence of quinolone resistance determinants was identified, particularly the qnrS gene, in both community- and hospital-associated strains. Furthermore, the results demonstrated that a combination of quinolone resistance determinants can confer resistance to nalidixic acid and ciprofloxacin, even in the apparent absence of additional chromosomal resistance mutations in wild-type strains and laboratory strains with transferred plasmids. These data suggest that intestinal commensal organisms are a significant reservoir for the dissemination of plasmid-mediated quinolone resistance in Ho Chi Minh City.
Ann Trop Paediatr, 29 (4), pp. 251-252 (2009). Severe and fatal vivax malaria challenges 'benign tertian malaria' dogma.
Japanese encephalitis virus (JEV) is estimated to cause 30,000–50,000 cases of encephalitis every year. The disease occurs mainly in rural Asia and is transmitted to humans from birds and pigs by mosquitoes of the genus Culex. JE is diagnosed by antibody testing of serum and CSF, but this is not available in many hospitals. Neuroimaging abnormalities, particularly thalamic hypodensity on computed tomography (CT) and hyperintensity on T2-weighted magnetic resonance imaging (MRI), have been described in case studies, but their usefulness for diagnosing JE is not known. We have therefore evaluated the usefulness of neuroimaging (CT and MRI) for the diagnosis of JE. The findings of thalamic lesions were compared with the final serological diagnosis in a cohort of 75 patients (children and adults) with suspected CNS infections in southern Vietnam, a JEV-endemic area. Thalamic lesions on CT and/or MRI combined had a sensitivity of 23% (95% confidence interval 12.9–33.1%), specificity of 100%, positive predictive value of 100% and negative predictive value of 42.1% (95% confidence interval 30.2–53.8%) for a diagnosis of JE in this cohort. Over time, the thalamic lesions resolved in some patients. One patient showed disappearance of lesions on CT followed by reappearance of the lesions some time later, known as the fogging effect. In this setting, the presence of thalamic abnormalities suggested the diagnosis of JE, but their absence did not exclude it.
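Sensitivity, specificity, and predictive values such as those reported above are ratios over a 2×2 table of test results against the reference diagnosis. A generic sketch; the counts below are invented for illustration and are not the study's data:

```python
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard diagnostic accuracy measures from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Invented counts: with zero false positives, specificity and PPV are 100%,
# but low sensitivity drags the NPV down, the pattern reported for
# thalamic lesions in this cohort.
m = diagnostic_metrics(tp=9, fp=0, tn=24, fn=30)
print({k: round(v, 2) for k, v in m.items()})
```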
We have analyzed the in vitro chemosensitivity profiles of 115 Kenyan isolates for chloroquine (CQ), piperaquine, lumefantrine (LM), and dihydroartemisinin, in association with polymorphisms in pfcrt at codon 76 and pfmdr1 at codon 86, as well as with variation in pfmdr1 copy number. The median drug concentrations inhibiting 50% of parasite growth (IC50s) were 41 nM (interquartile range [IQR], 18 to 73 nM), 50 nM (IQR, 29 to 96 nM), 32 nM (IQR, 17 to 46 nM), and 2 nM (IQR, 1 to 3 nM) for CQ, LM, piperaquine, and dihydroartemisinin, respectively. The activity of CQ correlated inversely with that of LM (r = -0.26; P = 0.02). Interestingly, parasites with higher LM IC50s were wild type at pfcrt-76 and pfmdr1-86. All isolates carried a single pfmdr1 copy. Thus, the decrease in LM activity is associated with the selection of wild-type pfcrt-76 and pfmdr1-86 parasites, a feature that accounts for the inverse relationship between CQ and LM. Therefore, the use of LM-artemether is likely to lead to the selection of more CQ-susceptible parasites.
Clin Microbiol Infect, 15 Suppl 2, pp. 95-97 (2009). Molecular detection of Bartonella species in rodents from the Lao PDR.
BACKGROUND: Tarantula spiders are widely kept and bred in captivity by both adults and children. Their bites are generally considered harmless. AIM: To explore the effects of envenoming by Old World tarantulas. DESIGN AND METHODS: Clinical studies and review of the conventional literature and hobbyist websites. RESULTS: Two men bitten on their index fingers by pet Old World tarantula spiders, Lampropelma nigerrimum (Ornithoctoninae) and Pterinochilus murinus (Harpactirinae), in England developed intense local pain, swelling and episodic, agonising, generalised muscle cramps. In one of them, cramps persisted for 7 days and the serum creatine kinase concentration was mildly elevated. A third man, bitten on a finger by Poecilotheria regalis (Poecilotheriinae), suffered persistent local cramps in the affected hand. Reports since 1803, including recent ones on hobbyist websites, have been largely overlooked. They mention muscle spasms after bites by these and other genera of Old World tarantulas, including Eumenophorus, Selenocosmia and Stromatopelma. The severe muscle spasms seen in two of our patients were a challenge to medical treatment and might, under some circumstances, have been life-threatening. They demand a toxinological explanation. CONCLUSION: Bites by several genera of African, Asian and Australasian tarantulas can cause systemic neurotoxic envenoming. In the absence of an available antivenom, severe persistent muscle spasms, reminiscent of latrodectism, pose a serious therapeutic challenge. Discovery of the toxin responsible would be of scientific and potential clinical benefit. Tarantula keepers should be warned of the danger of handling these animals incautiously.
Clin Infect Dis, 49 (11), pp. 1638-1640 (2009). Artemisinin combination therapy for malaria: beyond good efficacy.
PURPOSE OF REVIEW: Effective surveillance for, and rapid identification of, evolved antimalarial resistance ensures that all patients are treated with efficacious drugs. This review summarizes the current status of and challenges to effective surveillance, and suggests approaches for improvement. RECENT FINDINGS: The replacement of older drugs by artemisinin combination therapies (ACTs) as the recommended treatment for malaria has dramatically improved treatment outcomes wherever ACTs have been deployed effectively. Moreover, there has been considerable technical and organizational progress, and support for the health professionals needed to carry out this work is also increasing. As a result, the prospects for more effective surveillance of antimalarial resistance and other vital health information are improving. However, resistance to the artemisinin component of ACTs is already suspected in Cambodia, and the methods needed to track this resistance are not yet in place. Identification of efficient markers of ACT efficacy is a crucial challenge. SUMMARY: Technical advances alone are not sufficient. Detection of decreased drug efficacy is only the first step in producing accessible and useful information for decision makers. The translation of increased access to data on health outcomes into usable evidence for rational policy and planning requires a global coordination and communication effort.
Whereas most nontyphoidal Salmonella (NTS) are associated with gastroenteritis, there has been a dramatic increase in reports of NTS-associated invasive disease in sub-Saharan Africa. Salmonella enterica serovar Typhimurium isolates are responsible for a significant proportion of the reported invasive NTS in this region. Multilocus sequence analysis of invasive S. Typhimurium from Malawi and Kenya identified a dominant type, designated ST313, which currently is rarely reported outside of Africa. Whole-genome sequencing of a multiple drug resistant (MDR) ST313 NTS isolate, D23580, identified a distinct prophage repertoire and a composite genetic element encoding MDR genes located on a virulence-associated plasmid. Further, there was evidence of genome degradation, including pseudogene formation and chromosomal deletions, when compared with other S. Typhimurium genome sequences. Some of this genome degradation involved genes previously implicated in virulence of S. Typhimurium or genes for which the orthologs in S. Typhi are either pseudogenes or are absent. Genome analysis of other epidemic ST313 isolates from Malawi and Kenya provided evidence for microevolution and clonal replacement in the field.
Cardiovascular disease (CVD) is a leading cause of death throughout the world. In high income countries, the greatest burden of disease is seen in those from lower socio-economic groups. It is therefore likely that CVD is an important issue for prisoners in the UK, the majority of whom were either unemployed or in non-skilled employment prior to imprisonment. However, there is little research examining this issue. The aim of this study was to examine the prevalence of five modifiable cardiovascular risk factors (smoking, physical activity, diet, body mass index and hypertension) in women prisoners on entry to prison and then 1 month after imprisonment. This was a prospective longitudinal study involving 505 women prisoners in England. Participants completed a questionnaire containing questions about health-related behaviours within 72 h of entering prison. The researchers measured their blood pressure, height and weight. They followed up all participants who were still imprisoned 1 month later and invited them to participate again. The results showed that women prisoners were at high risk of CVD in the future; 85% smoked cigarettes, 87% were insufficiently active to benefit their health, 86% did not eat at least five portions of fruit and vegetables each day and 30% were overweight or obese. After 1 month, there were few improvements in risk factors. This may in part reflect the fact that, unlike prisons in other high income countries, there are currently no systematic approaches which address these health issues within UK women's prisons.
PLoS Medicine, 6 (12), e1000207 (2009). The Severity of Pandemic H1N1 Influenza in the United States, from April to July 2009: A Bayesian Analysis.
Tuberculous meningitis is the most dangerous form of tuberculosis, yet our understanding of disease pathogenesis is based upon studies performed in the 1920s, our diagnostic methods are dependent upon those developed in the 1880s, and our treatment has advanced little since the introduction of isoniazid in the 1950s. The authors focus this review on three important questions. First, how does Mycobacterium tuberculosis reach the brain? Second, what is the best way of identifying patients who require early empiric antituberculosis therapy? Third, what is the best way of managing tuberculous hydrocephalus?
BACKGROUND: To design an effective strategy for the control of malaria requires a map of infection and disease risks to select appropriate suites of interventions. Advances in model-based geostatistics and malaria parasite prevalence data assemblies provide unique opportunities to redefine national Plasmodium falciparum risk distributions. Here we present a new map of malaria risk for Kenya in 2009. METHODS: Plasmodium falciparum parasite rate data were assembled from cross-sectional community-based surveys undertaken from 1975 to 2009. Details recorded for each survey included the month and year of the survey, sample size, positivity and the age ranges of the sampled population. Data were corrected to a standard age range of two to less than 10 years (PfPR2-10), and each survey location was geo-positioned using national and online digital settlement maps. Ecological and climate covariates were matched to each PfPR2-10 survey location and examined separately and in combination for relationships to PfPR2-10. Significant covariates were then included in a Bayesian geostatistical spatial-temporal framework to predict continuous and categorical maps of mean PfPR2-10 at a 1 × 1 km resolution across Kenya for the year 2009. Model hold-out data were used to test the predictive accuracy of the mapped surfaces, and distributions of the posterior uncertainty were mapped. RESULTS: A total of 2,682 estimates of PfPR2-10 from surveys undertaken at 2,095 sites between 1975 and 2009 were selected for inclusion in the geostatistical modeling. The covariates selected for prediction were urbanization, maximum temperature, precipitation, enhanced vegetation index, and distance to main water bodies. The final Bayesian geostatistical model had high predictive accuracy, with a mean error of -0.15% PfPR2-10, a mean absolute error of 0.38% PfPR2-10, and a linear correlation between observed and predicted PfPR2-10 of 0.81. The majority of Kenya's 2009 population (35.2 million, 86.3%) reside in areas where predicted PfPR2-10 is less than 5%; conversely, in 2009 only 4.3 million people (10.6%) lived in areas where PfPR2-10 was predicted to be ≥40%, largely located around the shores of Lake Victoria. CONCLUSION: Model-based geostatistical methods can be used to interpolate malaria risks in Kenya with precision, and our model shows that the majority of Kenyans live in areas of very low P. falciparum risk. As malaria interventions go to scale, effectively tracking epidemiological changes of risk demands a rigorous effort to document infection prevalence in time and space, to remodel risks and redefine intervention priorities over the next 10-15 years.
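The hold-out accuracy figures above (mean error, mean absolute error, linear correlation) are standard point-prediction metrics. A minimal sketch of how they are computed, using invented toy values rather than the study's hold-out data:

```python
import math

def accuracy_metrics(observed, predicted):
    """Mean error, mean absolute error, and Pearson correlation."""
    n = len(observed)
    errors = [p - o for o, p in zip(observed, predicted)]
    me = sum(errors) / n                       # mean error (bias)
    mae = sum(abs(e) for e in errors) / n      # mean absolute error
    mo = sum(observed) / n
    mp = sum(predicted) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(observed, predicted))
    so = math.sqrt(sum((o - mo) ** 2 for o in observed))
    sp = math.sqrt(sum((p - mp) ** 2 for p in predicted))
    r = cov / (so * sp)                        # Pearson correlation
    return me, mae, r

# Toy hold-out PfPR2-10 values (percent); illustration only.
obs = [1.0, 2.0, 5.0, 10.0, 40.0]
pred = [1.2, 1.8, 4.5, 12.0, 38.0]
me, mae, r = accuracy_metrics(obs, pred)
print(round(me, 2), round(mae, 2), round(r, 3))
```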
BACKGROUND: Hemolysis causes anemia in falciparum malaria, but its contribution to microvascular pathology in severe malaria (SM) is not well characterized. In other hemolytic diseases, release of cell-free hemoglobin causes nitric oxide (NO) quenching, endothelial activation, and vascular complications. We examined the relationship of plasma hemoglobin and myoglobin to endothelial dysfunction and disease severity in malaria. METHODS: Cell-free hemoglobin (a potent NO quencher), reactive hyperemia peripheral arterial tonometry (RH-PAT) (a measure of endothelial NO bioavailability), and measures of perfusion and endothelial activation were quantified in adults with moderately severe (n = 78) or severe (n = 49) malaria and control subjects (n = 16) from Papua, Indonesia. RESULTS: Cell-free hemoglobin concentrations in patients with SM (median, 5.4 micromol/L; interquartile range [IQR], 3.2-7.4 micromol/L) were significantly higher than in those with moderately severe malaria (2.6 micromol/L; IQR, 1.3-4.5 micromol/L) or controls (1.2 micromol/L; IQR, 0.9-2.4 micromol/L; P < .001). Multivariable regression analysis revealed that cell-free hemoglobin remained inversely associated with RH-PAT, and in patients with SM, there was a significant longitudinal association between improvement in RH-PAT index and decreasing levels of cell-free hemoglobin (P = .047). Cell-free hemoglobin levels were also independently associated with lactate, endothelial activation, and proinflammatory cytokinemia. CONCLUSIONS: Hemolysis in falciparum malaria results in NO quenching by cell-free hemoglobin, and may exacerbate endothelial dysfunction, adhesion receptor expression and impaired tissue perfusion. Treatments that increase NO bioavailability may have potential as adjunctive therapies in SM.
Adjunctive treatment to improve outcome from bacterial meningitis has centered on dexamethasone. Among Vietnamese patients with bacterial meningitis, cerebrospinal fluid (CSF) opening pressure and CSF:plasma glucose ratios were significantly improved, and levels of the CSF cytokines interleukin (IL)-6, IL-8, and IL-10 were all statistically significantly lower, after treatment in patients who were randomized to dexamethasone, compared with levels in patients who received placebo.
The Roll Back Malaria (RBM) partnership has established goals for protecting vulnerable populations with locally appropriate vector control. In many places, these goals will be achieved by the mass distribution of insecticide treated bednets (ITNs). Mathematical models can forecast an ITN-driven realignment of malaria endemicity, defined by the Plasmodium falciparum parasite rate (PfPR) in children, to predict PfPR endpoints and appropriate program timelines for this change in Africa. The relative ease of measuring PfPR and its widespread use make it particularly suitable for monitoring and evaluation. This theory provides a method for context-dependent evaluation of ITN programs and a basis for setting rational ITN coverage targets over the next decade.
An otherwise healthy 20-year-old woman in Goa, India, received antibiotics after a diagnosis of upper respiratory tract infection. One week later, vivax malaria was diagnosed at a health center, but the patient developed respiratory distress and lost consciousness. She arrived at the emergency department in shock, breathless, and comatose. She died within minutes. Two independent laboratories later confirmed Plasmodium vivax by microscopy (140,000 parasites/microL) and by nested and real-time polymerase chain reaction methods. Post-mortem examination showed congestion of alveolar capillaries by heavy monocytic infiltrates, along with diffuse damage to alveolar membranes consistent with acute respiratory distress syndrome. Parasites seen in lung tissue were roughly proportionate to both the peripheral hyperparasitemia and the parasites seen in other organs without lesions. In this patient, vivax malaria caused rapidly fatal respiratory distress.
OBJECTIVE: Serious morbidity and mortality following snakebite injuries are common in tropical regions of the world. Although antivenom administration is clinically effective, it carries an important risk of early anaphylactic reactions, ranging from relatively benign nausea, vomiting, and urticaria to life-threatening angioedema, bronchospasm and hypotension. Currently, no adequately powered study has demonstrated significant benefit from the use of any prophylactic drug. A high rate of anaphylactic reactions observed during a trial of three different antivenoms in Ecuador prompted adoption of premedication with intravenous (i.v.) hydrocortisone and diphenhydramine together with dilution and slower administration of antivenom. DESIGN: In a rural mission hospital in Eastern Ecuador, 53 consecutive snakebite victims received a new antivenom regimen in 2004-2006, comprising prophylactic drugs and i.v. infusion of diluted antivenom over 60 min. They were compared to an historical control cohort of 76 patients treated in 1997-2002 without prophylactic drugs and with i.v. "push" injection of undiluted antivenom over 10 min. All these patients had incoagulable blood on admission and all were treated with Brazilian Instituto Butantan polyspecific antivenom. RESULTS: Baseline characteristics of the historical control and premedicated groups were broadly similar. In the historical group, early reaction rates were as follows: 51% of patients had no reaction; 35% had mild reactions; 6% moderate; and 6% severe. In the premedicated/slow i.v. group, 98% of patients had no reaction; 0 mild; 0 moderate; and 2% severe. The difference in reaction rates was statistically significant (p<0.001). CONCLUSIONS: Premedication with intravenous hydrocortisone and diphenhydramine together with dilution of antivenom and its administration by i.v. infusion over 60 min appeared to reduce both the frequency and severity of anaphylactic reactions. A randomized blinded controlled trial is needed to confirm these encouraging preliminary findings.
More than 20% of adults are persistently colonized with Staphylococcus aureus. When hospitalized, these carriers have increased risks of infection with their own strains. However, a recent study demonstrated a lower incidence of bacteremia-related death among carriers than among noncarriers, raising the question whether the adaptive immune system plays a protective role. In fact, S. aureus carriers mount a highly specific neutralizing antibody response against superantigens of their colonizing strains. We now used 2-dimensional immunoblotting to investigate the profiles of antibodies from healthy individuals against S. aureus extracellular proteins. Moreover, we tested whether symptom-free experimental colonization of these individuals with an S. aureus strain of low virulence, 8325-4, is sufficient to induce an antibody response. Sera obtained before and 4 weeks after colonization were screened for immunoglobulin G (IgG) antibody binding to extracellular staphylococcal proteins. At baseline, most volunteers harbored IgG directed against conserved virulence factors, including alpha-hemolysin (Hla), beta-hemolysin (Hlb), phospholipase C (Plc), staphylococcal serine protease (SspA), and cysteine protease (SspB). However, the variability of spot patterns and intensities was striking and could be important in case of infection. Experimental nasal colonization with S. aureus 8325-4 did not elicit new antibodies or boost the humoral response. Thus, the high antibody prevalence in humans is likely not induced by short-term nasal colonization, and presumably minor infections are required to trigger anti-S. aureus antibody responses.
2009. Breathing new life into pneumonia diagnostics. J Clin Microbiol, 47(11), pp. 3405-3408.
BACKGROUND: Although necessary for developing a rationale for vaccination, the burden of severe respiratory syncytial virus (RSV) disease in children in resource-poor settings remains poorly defined. METHODS: We conducted prospective surveillance of severe and very severe pneumonia in children aged <5 years admitted from 2002 through 2007 to Kilifi district hospital in coastal Kenya. Nasal specimens were screened for RSV antigen by immunofluorescence. Incidence rates were estimated for the well-defined population. RESULTS: Of 25,149 hospital admissions, 7359 patients (29%) had severe or very severe pneumonia, of whom 6026 (82%) were enrolled. RSV prevalence was 15% (20% among infants) and 27% during epidemics (32% among infants). The proportion of case patients aged 3 months or older was 65%, and the proportion aged 6 months or older was 43%. Average annual hospitalization rates were 293 hospitalizations per 100,000 children aged <5 years (95% confidence interval, 271-371 hospitalizations per 100,000 children aged <5 years) and 1107 hospitalizations per 100,000 infants (95% confidence interval, 1012-1211 hospitalizations per 100,000 infants). Hospital admission rates were double in the region close to the hospital. Few patients with RSV infection had life-threatening clinical features or concurrent serious illnesses, and the associated mortality was 2.2%. CONCLUSIONS: In this low-income setting, rates of hospital admission with RSV-associated pneumonia are substantial; they are comparable to estimates from the United States but considerably underestimate the burden in the full community. An effective vaccine for children aged >2 months (outside the age group of poor responders) could prevent a large portion of RSV disease. Severity data suggest that the justification for RSV vaccination will be based on the prevention of morbidity, not mortality.
Malaria pigment is an intracellular inclusion body that appears in blood and tissue specimens on microscopic examination and can help in establishing the diagnosis of malaria. In simple light microscopy, it can be difficult to discern from cellular background and artifacts. It has long been known that if polarized light microscopy is used, malaria pigment can be much easier to distinguish. However, this technique is rarely used because of the need for a relatively costly polarization microscope. We describe a simple and economical technique to convert any standard light microscope suitable for examination of malaria films into a polarization microscope.
Health-care-associated methicillin-resistant Staphylococcus aureus (MRSA) infection may cause increased hospital stay, or sometimes death. Quantifying this effect is complicated because the exposure is time dependent: infection may prolong hospital stay, while longer stays increase infection risk. In this paper, the authors overcome these problems by using a multinomial longitudinal model to estimate the daily probability of death and discharge. They then extend the basic model to estimate how the effect of MRSA infection varies over time and to quantify number of excess days in the intensive care unit due to infection. They found that infection decreased the relative risk of discharge (relative risk ratio = 0.68, 95% credible interval: 0.54, 0.82). Infection on the first day of admission resulted in a mean extra stay of 0.3 days (95% credible interval: 0.1, 0.5) for a patient with an Acute Physiology and Chronic Health Evaluation II score of 10 and 1.2 days (95% credible interval: 0.5, 2.0) for a patient with a score of 30. The decrease in the relative risk of discharge remained fairly constant with day of MRSA infection but was slightly stronger closer to the start of infection. Results confirm the importance of MRSA infection in increasing stay in an intensive care unit but suggest that previous work may have systematically overestimated the effect size.
We tracked the effective reproductive number (Rt) over time to assess the impact of important public health control measures in the five most SARS-affected geographic areas in mainland China. As soon as the Chinese authorities gained full control of all activities to combat SARS, Rt decreased dramatically and fell consistently below one. Many control measures that seriously affected public life were implemented afterwards, i.e., when the epidemic was already dying down.
OBJECTIVE: To quantify the transmissibility of severe acute respiratory syndrome (SARS) in hospitals in mainland China and to assess the effectiveness of control measures. METHODS: We report key epidemiological details of three major hospital outbreaks of SARS in mainland China, and estimate the evolution of the effective reproduction number in each of the three hospitals during the course of the outbreaks. RESULTS: The three successive hospital outbreaks infected 41, 99 and 91 people of whom 37%, 60% and 70% were hospital staff. These cases resulted in 33 deaths, five of which occurred in hospital staff. In a multivariate logistic regression, age and whether or not the case was a healthcare worker (HCW) were found to be significant predictors of mortality. The estimated effective reproduction numbers (95% CI) for the three epidemics peaked at 8 (5, 11), 9 (4, 14) and 12 (7, 17). In all three hospitals the epidemics were rapidly controlled, bringing the reproduction number below one within 25, 10 and 5 days respectively. CONCLUSIONS: This work shows that in three major hospital epidemics in Beijing and Tianjin substantially higher rates of transmission were initially observed than those seen in the community. In all three cases the hospital epidemics were rapidly brought under control, with the time to successful control becoming shorter in each successive outbreak.
OBJECTIVE: To assess technical and operational performance of a dried blood spot (DBS)-based HIV-1 RNA service for remote healthcare facilities in a low-income country. DESIGN: A method comparison and operational evaluation of DBS RNA against conventional tests for early infant diagnosis of HIV and HIV RNA quantitation under field conditions in Tanzania. METHODS: DBSs were prepared and plasma was frozen at -80 degrees C. DBSs were mailed and plasma couriered to a central laboratory for testing using the Abbott m2000 system. Infant diagnosis DBSs were also tested for HIV-1 DNA by ROCHE COBAS AmpliPrep/COBAS TaqMan System. Results of DBS RNA were compared with conventional tests; program performance was described. RESULTS: Among 176 infant diagnosis participants, using a threshold of at least 1000 copies/ml, sensitivity and specificity of DBS versus plasma RNA were 1.00 and 0.99, and of DBS RNA versus DBS DNA were 0.97 and 1.00. Among 137 viral load monitoring participants, when plasma and DBS RNA were compared, r value was 0.9709; r value was 0.9675 for at least 5000 copies/ml but was 0.7301 for less than 5000 copies/ml. The highest plasma RNA value at which DBS RNA was not detected was 2084 copies/ml. Median (range) turnaround time from sample collection to result receipt at sites was 23 (4-69) days. The Tanzania mail service successfully transmitted all DBS and results between sites and the central laboratory. CONCLUSION: Under program conditions in Tanzania, DBS provided HIV-1 RNA results comparable to conventional methods to remote healthcare facilities. DBS RNA testing is an alternative to liquid plasma for HIV-1 RNA services in remote areas.
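Test-performance figures like those reported for DBS versus plasma RNA come from a standard 2x2 comparison of the index test against the reference method. A minimal sketch of that calculation, using hypothetical counts rather than the study data:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity and specificity from a 2x2 table comparing an
    index test (e.g. DBS RNA) against a reference (e.g. plasma RNA)."""
    sensitivity = tp / (tp + fn)  # reference-positives correctly detected
    specificity = tn / (tn + fp)  # reference-negatives correctly ruled out
    return sensitivity, specificity

# Hypothetical counts for illustration only -- not taken from the study.
sens, spec = diagnostic_accuracy(tp=97, fp=1, fn=3, tn=99)
print(sens, spec)  # 0.97 0.99
```

The same arithmetic applies at any positivity threshold; the abstract's figures use at least 1000 copies/ml to define a positive result.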
OBJECTIVES: Shigellosis remains a major public health problem in developing countries. Antimicrobial resistance has complicated the empirical treatment. Knowledge of serotypes is crucial in vaccine development, as cross-protection between various serotypes is limited. Therefore we conducted a prospective study to determine the frequency of isolation of Shigella serotypes and antimicrobial resistance. METHODS: Stool samples from 8155 individuals, collected through a surveillance study conducted in four slums of Karachi from January 2002 to March 2004, were cultured. RESULTS: Shigella was isolated in 394 (4.8%) of 8155 patients presenting with diarrhea. Two hundred and forty-two (62%) isolates were Shigella flexneri, 72 (18%) were Shigella sonnei, 43 (11%) were Shigella boydii, and 37 (9%) were Shigella dysenteriae. Thirteen S. flexneri serotypes were identified, of which the most frequent were 2a (38), 6 (37), and 1b (25), followed by 2b (23). Only 22 (5.6%) Shigella isolates were found to be pan-susceptible. Large proportions of isolates were resistant to co-trimoxazole (89% S. flexneri, 81% S. dysenteriae, 80% S. sonnei, and 56% S. boydii) and ampicillin (87% S. flexneri, 68% S. dysenteriae, 35% S. boydii, and 4% S. sonnei). CONCLUSIONS: Concurrent circulation of multiple strains with high resistance is worrying and mandates surveillance at the national level to facilitate the control of shigellosis.
2009. Intermittent preventive treatment of malaria in infancy. Lancet, 374(9700), pp. 1478-1480.
2009. Artemisinin resistance in Plasmodium falciparum malaria (reply). New England Journal of Medicine, 361(18), p. 1808.
BACKGROUND: A major handicap in developing a malaria vaccine is the difficulty in pinpointing the immune responses that protect against malaria. The protective efficacy of natural or vaccine-induced immune responses against malaria is normally assessed by relating the level of the responses in an individual at the beginning of a follow-up period to the individual's experience of malaria infection or disease during the follow-up. This approach has identified a number of important responses against malaria, but their protective efficacies vary considerably between studies. HYPOTHESIS: It is likely that, apart from differences in study methodologies, differences in exposure among study subjects within each study and the brevity of antibody responses to malaria antigens are important sources of the variation in protective efficacy of anti-malaria immune responses mentioned above. Since malaria immunity is not complete, anyone in an area of stable malaria transmission who does not become asymptomatically or symptomatically infected during follow-up subsequent to treatment is most likely unexposed rather than immune. TESTING THE HYPOTHESIS: It is proposed that individuals involved in a longitudinal study of malaria immunity should be treated for malaria prior to the start of the study and that only those who present with at least an asymptomatic infection during the follow-up should be included in the analysis. In addition, it is proposed that more frequently repeated serological surveys should be carried out during follow-up in order to obtain a better picture of an individual's serological status. IMPLICATIONS OF THE HYPOTHESIS: Failure to distinguish between individuals who do not get a clinical episode during follow-up because they were unexposed and those who are genuinely immune undermines our ability to assign a protective role to immune responses against malaria.
The brevity of antibody responses makes it difficult to assign the true serological status of an individual at any given time: those positive at one survey may be negative by the time they encounter the next infection.
2009. Artemisinin resistance in Plasmodium falciparum malaria (erratum for vol. 361, p. 455, 2009). New England Journal of Medicine, 361(17), p. 1714.
Effective malaria control requires information on both the geographical distribution of malaria risk and the effectiveness of malaria interventions. Household cluster surveys are the current standard for estimating malaria infection and impact indicators, but their complexity and expense preclude frequent and decentralized monitoring. This paper reviews the historical experience and current rationale for the use of schools and school children as a complementary, inexpensive framework for planning, monitoring and evaluating malaria control in Africa. Consideration is given to (i) the selection of schools; (ii) diagnosis of infection in schools; (iii) the representativeness of schools as a proxy of the communities they serve; and (iv) the increasing need to evaluate interventions delivered through schools. Finally, areas requiring further investigation are highlighted.
BACKGROUND: In sub-Saharan Africa, more than 90% of children with sickle-cell anaemia die before the diagnosis can be made. The causes of death are poorly documented, but bacterial sepsis is probably important. We examined the risk of invasive bacterial diseases in children with sickle-cell anaemia. METHODS: This study was undertaken in a rural area on the coast of Kenya, with a case-control approach. We undertook blood cultures on all children younger than 14 years who were admitted from within a defined study area to Kilifi District Hospital between Aug 1, 1998, and March 31, 2008; those with bacteraemia were defined as cases. We used two sets of controls: children recruited by random sampling in the same area into several studies undertaken between Sept 1, 1998, and Nov 30, 2005; and those born consecutively within the area between May 1, 2006, and April 30, 2008. Cases and controls were tested for sickle-cell anaemia retrospectively. FINDINGS: We detected 2157 episodes of bacteraemia in 38 441 admissions (6%). 1749 of these children with bacteraemia (81%) were typed for sickle-cell anaemia, of whom 108 (6%) were positive as were 89 of 13 492 controls (1%). The organisms most commonly isolated from children with sickle-cell anaemia were Streptococcus pneumoniae (44/108 isolates; 41%), non-typhi Salmonella species (19/108; 18%), Haemophilus influenzae type b (13/108; 12%), Acinetobacter species (7/108; 7%), and Escherichia coli (7/108; 7%). The age-adjusted odds ratio for bacteraemia in children with sickle-cell anaemia was 26.3 (95% CI 14.5-47.6), with the strongest associations for S pneumoniae (33.0, 17.4-62.8), non-typhi Salmonella species (35.5, 16.4-76.8), and H influenzae type b (28.1, 12.0-65.9). INTERPRETATION: The organisms causing bacteraemia in African children with sickle-cell anaemia are the same as those in developed countries.
Introduction of conjugate vaccines against S pneumoniae and H influenzae into the childhood immunisation schedules of African countries could substantially affect survival of children with sickle-cell anaemia. FUNDING: Wellcome Trust, UK.
BACKGROUND: To assess the incidence and economic burden of rotavirus diarrhea and the potential cost-effectiveness of a rotavirus immunization program in rural Zhengding County in Hebei Province, China. METHODS: Population-based surveillance was conducted during the peak season for diarrhea among children who were <5 years of age in Zhengding County from 14 October 2004 through 19 January 2005. The cost of illness was measured from the perspectives of both patient and society. A decision-analytic model was applied to the cost-effectiveness analysis using real data derived from surveillance and from a cost-of-illness study. RESULTS: During the surveillance period, 500 episodes of diarrhea were registered. Of these 500 episodes, 125 (25%) occurred in patients who were positive for rotavirus. Of these 125 episodes, 63 (50%) occurred in patients who were hospitalized. The overall incidence rate of rotavirus infection was 61.4 cases per 1000 children per year during the 14-week epidemic season. For a Chinese cohort of 5000 newborns, a universal rotavirus immunization program would prevent 1764 cases of rotavirus diarrhea, averting 882 hospitalizations of patients <or=5 years of age. At 2004 prices, the net savings due to the immunization program would be US$14,112 from a societal perspective and US$34,751 from a patient perspective. CONCLUSION: Rotavirus was a leading cause of severe diarrhea among children <5 years of age and an economic burden for farmers in rural Zhengding County. Rotavirus vaccination should be considered as a potential cost-effective measure against rotavirus infection in China.
Melioidosis is an infectious disease with a propensity for relapse, despite prolonged antibiotic eradication therapy for 12 to 20 weeks. A pharmacokinetic (PK) simulation study was performed to determine the optimal dosing of cotrimoxazole (trimethoprim-sulfamethoxazole [TMP-SMX]) used in current eradication regimens in Thailand and Australia. Data for bioavailability, protein binding, and coefficients of absorption and elimination were taken from published literature. Apparent volumes of distribution were correlated with body mass and were estimated separately for Thai and Australian populations. In vitro experiments demonstrated concentration-dependent killing. In Australia, the currently used eradication regimen (320 [TMP]/1,600 [SMX] mg every 12 h [q12h]) was predicted to achieve the PK-pharmacodynamic (PD) target (an area under the concentration-time curve from 0 to 24 h/MIC ratio of >25 for both TMP and SMX) for strains with the MIC90 of Australian strains (< or = 1/19 mg/liter). In Thailand, the former regimen of 160/800 mg q12h would not be expected to attain the target for strains with an MIC of > or = 1/19 mg/liter, but the recently implemented weight-based regimen (<40 kg [body weight], 160/800 mg q12h; 40 to 60 kg, 240/1,200 mg q12h; >60 kg, 320/1,600 mg q12h) would be expected to achieve adequate concentrations for strains with an MIC of < or = 1/19 mg/liter. The results were sensitive to the variance of the PK parameters. Prospective PK-PD studies of Asian populations are needed to optimize TMP-SMX dosing in melioidosis.
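The PK-PD target used in the simulation above (AUC0-24/MIC ratio of >25) can be checked numerically from a simulated concentration-time profile. A sketch under an assumed one-compartment oral model with trapezoidal integration; all parameter values below are hypothetical illustrations, not the study's population estimates:

```python
import math

def auc_0_24(dose_mg, ka, ke, vd_l, f=1.0, n=2401):
    """AUC from 0 to 24 h (mg*h/L) after a single oral dose in a
    one-compartment model with first-order absorption (ka) and
    elimination (ke), integrated by the trapezoidal rule."""
    def conc(t):
        # Bateman equation for first-order oral absorption
        return (f * dose_mg * ka / (vd_l * (ka - ke))) * (
            math.exp(-ke * t) - math.exp(-ka * t))
    ts = [24 * i / (n - 1) for i in range(n)]
    cs = [conc(t) for t in ts]
    return sum((cs[i] + cs[i + 1]) / 2 * (ts[i + 1] - ts[i])
               for i in range(n - 1))

# Hypothetical trimethoprim-like parameters; q12h dosing approximated
# as twice the single-dose AUC0-24 (superposition ignored for brevity).
auc = 2 * auc_0_24(dose_mg=320, ka=1.5, ke=0.07, vd_l=100)
mic = 1.0  # mg/L, cf. the MIC90 of <= 1/19 mg/liter cited above
print(auc / mic > 25)  # True for these assumed parameters
```

A full PK simulation study would also sample interindividual variability in the parameters, which is why the authors note the results are sensitive to the variance of the PK estimates.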
A prospective study in Thailand identified 106 patients with culture-proven leptospirosis. The accuracy of the microscopic agglutination test (MAT) in predicting the infecting serovar was evaluated in 78/106 (74%) patients with a diagnostic titer. MAT correctly determined the infecting serovar in 26 cases (33%), indicating that this assay is a poor predictor of infecting serovar in our setting.
BACKGROUND: Castor oil is one of the most popular drugs for induction of labour in a non-medical setting; however, published data on the safety and effectiveness of this compound to induce labour remain sparse. AIM: To assess the safety and effectiveness of castor oil for induction of labour in pregnancies with an ultrasound-estimated gestational age at birth of more than 40 weeks. METHODS: Data were extracted from hospital-based records of all pregnant women who attended antenatal clinics on the Thai-Burmese border and who were more than 40 weeks pregnant. The effectiveness of castor oil to induce labour was expressed as time to birth and analysed with a Cox proportional hazards regression model. Measures associated with safety were fetal distress, meconium-stained amniotic fluid, tachysystole of the uterus, uterine rupture, abnormal maternal blood pressure during labour, Apgar scores, neonatal resuscitation, stillbirth, post-partum haemorrhage, severe diarrhoea and maternal death. Proportions were compared using Fisher's exact test. RESULTS: Of 612 women with a gestation of more than 40 weeks, 205 received castor oil for induction and 407 did not. The time to birth was not significantly different between the two groups (hazard ratio 0.99 (95% confidence interval: 0.81 to 1.20; n = 509)). Castor oil use was not associated with any harmful effects on the mother or fetus. CONCLUSIONS: Castor oil for induction of labour had no effect on time to birth, nor were there any harmful effects observed in this large series. Our findings leave no justification for recommending castor oil for this purpose.
OBJECTIVES: Ultrasound examination of the fetus is a powerful tool for assessing gestational age and detecting obstetric problems but is rarely available in developing countries. The aim of this study was to assess the intraobserver and interobserver agreement of fetal biometry by locally trained health workers in a refugee camp on the Thai-Burmese border. METHODS: One expatriate doctor and four local health workers participated in the study, in which examinations were performed on every fifth pregnant woman attending the antenatal clinic in Maela refugee camp who had a singleton pregnancy of between 16 and 40 weeks' gestation and had undergone an early dating ultrasound scan. At each examination, two examiners independently measured biparietal diameter (BPD), head circumference (HC), abdominal circumference (AC) and femur length (FL), with one of the examiners obtaining duplicate measurements of each parameter. Intraobserver measurement error was assessed using the intraclass correlation coefficient (ICC) and interobserver error was assessed by the Bland and Altman 95% limits of agreement method. RESULTS: A total of 4188 ultrasound measurements (12 per woman) were obtained in 349 pregnancies at a median gestational age of 27 (range, 16-40) weeks in 2008. The ICC for BPD, HC, AC and FL was greater than 0.99 for all four trainees and the doctor (range, 0.996-0.998). For gestational ages between 18 and 24 weeks, interobserver 95% limits of agreement corresponding to differences in estimated gestational age of less than +/- 1 week were calculated for BPD, HC, AC and FL. Measurements by local health workers showed high levels of agreement with those of the expatriate doctor. CONCLUSIONS: Locally trained health workers working in a well organized unit with ongoing quality control can obtain accurate fetal biometry measurements for gestational age estimation.
This experience suggests that training of local health workers in developing countries is possible and could allow effective use of obstetric ultrasound imaging.
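The interobserver comparison above relies on the Bland-Altman method, whose 95% limits of agreement are simply the mean of the paired differences plus or minus 1.96 standard deviations. A minimal sketch with made-up measurement pairs (not the study data):

```python
from statistics import mean, stdev

def bland_altman_limits(a, b):
    """95% limits of agreement between two raters' paired measurements:
    bias +/- 1.96 * sample SD of the pairwise differences."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    spread = 1.96 * stdev(diffs)
    return bias - spread, bias + spread

# Hypothetical paired femur-length measurements in mm, for illustration.
rater1 = [30.1, 42.5, 55.0, 61.2, 48.3, 36.7]
rater2 = [30.4, 42.0, 55.3, 60.8, 48.9, 36.5]
lo, hi = bland_altman_limits(rater1, rater2)
print(round(lo, 2), round(hi, 2))
```

In the study, limits of this kind were translated into the corresponding difference in estimated gestational age to judge whether disagreement between examiners was clinically important.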
The mechanisms underlying the bleeding manifestations and coagulopathy associated with dengue remain unclear, in part because of the focus of much previous work on severe disease without an appropriate comparison group. We describe detailed clinical and laboratory profiles for a large group of children with dengue of all severities, and a group with similar non-dengue febrile illnesses, all followed prospectively from early presentation through to recovery. Among the dengue-infected patients but not the controls, thrombocytopenia, increased partial thromboplastin times and reduced fibrinogen concentrations were apparent from an early stage, and these abnormalities correlated strongly with the severity and timing of vascular leakage but not bleeding. There was little evidence of procoagulant activation. The findings do not support a primary diagnosis of disseminated intravascular coagulation to explain the intrinsic coagulopathy. An alternative biologically plausible hypothesis is discussed.
2009. [Malaria in pregnancy: a therapeutic dilemma]. Med Sci (Paris), 25(10), pp. 867-869.
Whether the number of concurrent clones in asymptomatic Plasmodium falciparum infections reflects the degree of host protection was investigated in children living in areas with different levels of transmission on the coast of Kenya. The number of concurrent clones was determined on the basis of polymorphism in msp2, which encodes the vaccine candidate antigen merozoite surface protein 2. In a low-transmission area, most children had monoclonal infections, and diversity did not predict a risk of clinical malaria. In an area of moderate transmission, asymptomatic infections with 2 clones were, compared with 1 clone, associated with an increased risk of subsequent malaria. In a comparative assessment in a high-transmission area in Tanzania, multiclonal infections conferred a reduced risk. The different nonlinear associations between the number of clones and malaria morbidity suggest that levels of tolerance to multiclonal infections are transmission dependent as a result of cumulative exposure to antigenically diverse P. falciparum infections.
Of 860 snakes brought to 10 hospitals in Sri Lanka with the patients they had bitten, 762 (89%) were venomous. Russell's vipers (Daboia russelii) and hump-nosed pit vipers (Hypnale hypnale) were the most numerous and H. hypnale was the most widely distributed. Fifty-one (6%) were misidentified by hospital staff, causing inappropriate antivenom treatment of 13 patients. Distinctive clinical syndromes were identified to aid species diagnosis in most cases of snake bite in Sri Lanka where the biting species is unknown. Diagnostic sensitivities and specificities of these syndromes were 78% and 96% for envenoming by Naja naja, 66% and 100% for Bungarus caeruleus, 14% and 100% for Daboia russelii, and 10% and 97% for Hypnale hypnale, respectively. Although only polyspecific antivenoms are used in Sri Lanka, species diagnosis remains important to anticipate life-threatening complications such as local necrosis, hemorrhage and renal and respiratory failure, and to identify likely victims of envenoming by H. hypnale who will not benefit from existing antivenoms. The technique of hospital-based collection, labeling and preservation of dead snakes brought in by bitten patients is recommended for rapid assessment of a country's medically important herpetofauna.
PURPOSE OF REVIEW: Unlike Plasmodium falciparum, Plasmodium vivax rarely causes severe disease in healthy travellers or in temperate endemic regions and has been regarded as readily treatable with chloroquine. However, in tropical areas, recent reports have highlighted severe and fatal disease associated with P. vivax infection. We review the evidence for severe disease and the spread of drug-resistant P. vivax and speculate on how these may be related. RECENT FINDINGS: Studies from Indonesia, Papua New Guinea, Thailand and India have shown that 21-27% of patients with severe malaria have P. vivax monoinfection. The clinical spectrum of these cases is broad with an overall mortality of 0.8-1.6%. Major manifestations include severe anaemia and respiratory distress, with infants being particularly vulnerable. Most reports of severe and fatal vivax malaria come from endemic regions where populations have limited access to healthcare, a high prevalence of comorbidity and where drug-resistant P. vivax strains and partially effective primaquine regimens significantly undermine the radical cure and control of this relapsing infection. The mechanisms underlying severe disease in vivax malaria remain poorly defined. SUMMARY: Severe, fatal and multidrug-resistant vivax malaria challenge our perception of P. vivax as a benign disease. Strategies to understand and address these phenomena are needed urgently if the global elimination of malaria is to succeed.
Trichinellosis outbreaks occur occasionally in Vietnam following the consumption of undercooked pork. Diagnosing trichinellosis can be problematic because fever and myalgia are nonspecific, and the diagnosis may therefore be delayed. We describe 5 Vietnamese patients in whom trichinellosis was diagnosed after several weeks of illness.
Mechanisms for differential regulation of gene expression may underlie much of the phenotypic variation and adaptability of malaria parasites. Here we describe transcriptional variation among culture-adapted field isolates of Plasmodium falciparum, the species responsible for most malarial disease. Genes coding for parasite protein export into the red cell cytosol and onto its surface, and genes coding for sexual-stage proteins involved in parasite transmission, were up-regulated in field isolates compared with long-term laboratory isolates. Much of this variability was associated with the loss of small or large chromosomal segments, or with other forms of gene copy number variation that are prevalent in the P. falciparum genome (copy number variants, CNVs). Expression levels of genes inside these segments were correlated with those of genes outside and adjacent to the segment boundaries, and this association declined with distance from the CNV boundary. This observation could not be explained by copy number variation in these adjacent genes. This suggests a local-acting regulatory role for CNVs in the transcription of neighboring genes and helps explain the chromosomal clustering that we observed here. Transcriptional co-regulation of physical clusters of adaptive genes may provide a way for the parasite to readily adapt to its highly heterogeneous and strongly selective environment.
2009. SDS-PAGE analysis of whole cell protein and outer membrane protein patterns of clinical isolates of Burkholderia pseudomallei. Asian Pacific Journal of Tropical Medicine, 2(5), pp. 14-19.
OBJECTIVE: To assess the availability of resources that support the provision of basic neonatal care in eight first-referral level (district) hospitals in Kenya. METHODS: We selected two hospitals each from four of Kenya's eight provinces with the aim of representing the diversity of this part of the health system in Kenya. We created a checklist of 53 indicator items necessary for providing essential basic care to newborns and assessed their availability at each of the eight hospitals by direct observation. We then compared our observations with health workers' reports of the recent availability of some items, collected using a self-administered structured questionnaire. RESULTS: The hospitals surveyed were often unable to maintain a safe hygienic environment for patients and health care workers; staffing was insufficient and sometimes poorly organised to support the provision of care; some key equipment, laboratory tests, drugs and consumables were not available, while patient management guidelines were missing in all sites. CONCLUSION: Hospitals appear relatively poorly prepared to fill their proposed role in ensuring newborn survival. More effective interventions are needed to improve them so that they can meet the special needs of this at-risk group.
2009. Medicine and the media: online video sharing and patients' privacy. British Medical Journal, 339, b3991.
2009. Dried fluid spots for HIV type-1 viral load and resistance genotyping: a systematic review. Antivir Ther, 14(5), pp. 619-629.
BACKGROUND: Dried spots on filter paper made of whole blood (dried blood spots; DBS), plasma (dried plasma spots; DPS) or serum (dried serum spots) hold promise as an affordable and practical alternative specimen source to liquid plasma for HIV type-1 (HIV-1) viral load determination and drug resistance genotyping in the context of the rapidly expanding access to antiretroviral therapy (ART) for HIV-1-infected individuals in low- and middle-income countries. This report reviews the current evidence for their utility. METHODS: We systematically searched the English language literature published before 2009 on Medline, the websites of the World Health Organization and US Centers for Disease Control and Prevention, abstracts presented at relevant international conferences and references from relevant articles. RESULTS: Data indicate that HIV-1 viral load determination and resistance genotyping from DBS and DPS is feasible, yielding comparable test performances, even after storage. Limitations include reduced analytical sensitivity resulting from small analyte volumes (approximately 3.5 log(10) copies/ml at a 50 microl sample volume), nucleic acid degradation under extreme environmental conditions, impaired efficiency of nucleic acid extraction, potential interference of archived proviral DNA in genotypes obtained from DBS, and the need to excise spots from the filters in high-volume testing. CONCLUSIONS: This technology offers the advantages of a stable specimen matrix and ease of sample collection and shipment. The current sensitivity in drug resistance testing is appropriate for public health surveillance among pretreatment populations. However, consistently improved analytical sensitivity is needed for their routine application in the therapeutic monitoring of individuals receiving ART, particularly at the onset of treatment failure.
Artemether-lumefantrine has become one of the most widely used antimalarial drugs in the world. The objective of this study was to determine the population pharmacokinetic properties of lumefantrine in pregnant women with uncomplicated multidrug-resistant Plasmodium falciparum malaria on the northwestern border of Thailand. Burmese and Karen women (n = 103) with P. falciparum malaria and in the second and third trimesters of pregnancy were treated with artemether-lumefantrine (80/480 mg) twice daily for 3 days. All patients provided five capillary plasma samples for drug quantification, and the collection times were randomly distributed over 14 days. The concentration-time profiles of lumefantrine were assessed by nonlinear mixed-effects modeling. The treatment failure rate (PCR-confirmed recrudescent infections at delivery) was high, at 16.5% (95% confidence interval, 9.9 to 25.1%). The population pharmacokinetics of lumefantrine were described well by a two-compartment open model with first-order absorption and elimination. The final model included interindividual variability in all pharmacokinetic parameters and a linear covariate relationship between the estimated gestational age and the central volume of distribution. A high proportion of all women (40%, 41/103) had day 7 capillary plasma concentrations of <355 ng/ml (which corresponds to approximately <280 ng/ml in venous plasma), a threshold previously associated with an increased risk of therapeutic failure in nonpregnant patients in this area. Predictive modeling suggests that a twice-daily regimen given for 5 days would be preferable in later pregnancy. In conclusion, altered pharmacokinetic properties of lumefantrine contribute to the high rates of failure of artemether-lumefantrine treatment in later pregnancy. Dose optimization is urgently needed.
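The two-compartment open model with first-order absorption and elimination named above can be sketched numerically. The following is a minimal Euler-integration toy; every parameter value (ka, CL, Vc, Q, Vp) is a hypothetical placeholder, not one of the study's fitted estimates:

```python
# Toy simulation of a two-compartment oral PK model with first-order
# absorption and elimination, the model class fitted to lumefantrine in
# the study. All parameter values are HYPOTHETICAL illustrations.

def simulate(doses, t_end, dt=0.01,
             ka=0.1, CL=4.0, Vc=50.0, Q=2.0, Vp=100.0):
    """Euler integration. doses = [(time_h, amount_mg)]; returns a
    list of (time_h, central-compartment concentration in mg/L)."""
    a_gut = a_c = a_p = 0.0
    out = []
    dose_list = sorted(doses)
    j = 0
    n = int(t_end / dt)
    for i in range(n + 1):
        t = i * dt
        while j < len(dose_list) and dose_list[j][0] <= t:
            a_gut += dose_list[j][1]   # oral dose enters the gut depot
            j += 1
        out.append((t, a_c / Vc))
        # first-order absorption from gut, elimination from central,
        # distribution between central and peripheral compartments
        d_gut = -ka * a_gut
        d_c = ka * a_gut - (CL / Vc) * a_c - (Q / Vc) * a_c + (Q / Vp) * a_p
        d_p = (Q / Vc) * a_c - (Q / Vp) * a_p
        a_gut += d_gut * dt
        a_c += d_c * dt
        a_p += d_p * dt
    return out

# 480 mg of lumefantrine twice daily for 3 days, as in the study regimen
doses = [(12.0 * k, 480.0) for k in range(6)]
profile = simulate(doses, t_end=7 * 24.0)
day7 = profile[-1][1]  # concentration at day 7, the study's key threshold time
```

With fitted parameters in place of the placeholders, the same loop would yield a day 7 concentration for comparison against the threshold discussed above.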
Globally, sickle cell disease (SCD) has its highest prevalence and worst prognosis in sub-Saharan Africa. Nevertheless, relatively few studies describe the clinical characteristics of children with SCD in this region. We conducted a prospective observational study of children with SCD attending a specialist out-patient clinic in Kilifi, Kenya. A total of 124 children (median age 6.3 years) were included in the study. Splenomegaly was present in 41 (33%) subjects and hepatomegaly in 25 (20%), both being common in all age groups. A positive malaria slide was found at 6% of clinic visits. The mean haemoglobin concentration was 73 g/l, compared to 107 g/l in non-SCD controls (P < 0.001). Liver function tests were elevated; plasma bilirubin concentrations were 46 micromol/l and aspartate aminotransferase was 124 iu/l. Forty-eight (39%) children were admitted to hospital and two died. Children with SCD in Kilifi have a similar degree of anaemia and liver function derangement to patients living in developed countries, but splenomegaly persists into later childhood. The prevalence of malaria was lower than expected given the prevalence in the local community. This study provides valuable data regarding the clinical characteristics of children living with SCD in a rural setting in East Africa.
PLoS Med, 6 (9), pp. e1000129. | Citations: 17 (Scopus) | 2009. Changing patterns of dengue epidemiology and implications for clinical management and vaccines.
The genus Plasmodium includes many species that naturally cause malaria among apes and monkeys. The 2004 discovery of people infected by Plasmodium knowlesi in Malaysian Borneo drew attention to the potential for non-human species of plasmodia to cause human morbidity and mortality. Subsequent work revealed what appears to be a surprisingly high risk of infection and relatively severe disease, including among travelers to Southeast Asia. The biology and medicine of this zoonosis are reviewed here, along with an examination of the spectrum of Plasmodium species that may cause infection of humans.
Plasmodium vivax is geographically the most widely distributed cause of malaria in people, with up to 2.5 billion people at risk and an estimated 80 million to 300 million clinical cases every year--including severe disease and death. Despite this large burden of disease, P vivax is overlooked and left in the shadow of the enormous problem caused by Plasmodium falciparum in sub-Saharan Africa. The technological advances enabling the sequencing of the P vivax genome and a recent call for worldwide malaria eradication have together placed new emphasis on the importance of addressing P vivax as a major public health problem. However, because of this parasite's biology, it is especially difficult to interrupt the transmission of P vivax, and experts agree that the available methods for preventing and treating infections with P vivax are inadequate. It is thus imperative that the development of new methods and strategies become a priority. Advancing the development of such methods needs renewed emphasis on understanding the biology, pathogenesis, and epidemiology of P vivax. This Review critically examines what is known about P vivax, focusing on identifying the crucial gaps that create obstacles to the elimination of this parasite in human populations.
TROPICAL MEDICINE & INTERNATIONAL HEALTH, 14, pp. 39-40. | 2009. Evidence for a revised dengue classification: a multi-centre prospective study across Southeast Asia and Latin America
A recent working group convened by the World Health Organization recommended that time to first or only episode of clinical malaria should be used to evaluate vaccine efficacy in phase III trials. However, calculating vaccine efficacy based on this endpoint misses important aspects of malaria disease and transmission. Here, we discuss the gaps that this approach leaves in predicting the potential public health impact of a vaccine and the challenges faced by vaccine trial designers. We examine the implications of current vaccine trial design on effectiveness studies and the next generation of malaria vaccines.
Our environment hosts a vast diversity of venomous and poisonous animals and plants. Clinical toxinology is devoted to understanding, preventing and treating their effects in humans and domestic animals. In Sri Lanka, yellow oleander (Thevetia peruviana, Sinhala 'kaneru'), a widespread and accessible ornamental shrub, is a popular means of self-harm. Its toxic glycosides resemble those of foxglove, against which therapeutic antibodies have been raised. A randomised placebo-controlled trial proved that this treatment effectively reversed kaneru cardiotoxicity. There are strong scientific grounds for the use of activated charcoal, but encouraging results with multiple-dose activated charcoal were not confirmed by a recent more powerful study. Venom of Russell's viper (Daboia siamensis) in Burma (Myanmar) produces lethal effects in human victims. The case of a 17-year-old rice farmer is described with pathophysiological interpretations. During the first 9 days of hospital admission he suffered episodes of shock, coagulopathy, bleeding, acute renal failure, local tissue necrosis, generally increased capillary permeability and acute symptomatic hypoglycaemia with evidence of acute pituitary/adrenal insufficiency. Antivenom rapidly restored haemostatic function but failed to correct other effects of venom toxins incurred during the 3h before he could be treated.
Exotic (foreign or non-native) snakes, including venomous species, are becoming increasingly popular pets in Western countries. Some of them are kept illegally (as defined by the UK Dangerous Wild Animals Act of 1976). There is a large international market for such animals, with contraventions of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). In the UK, several other European countries and the USA, the reported numbers of bites by venomous exotic snakes, although small, are increasing but still underestimate the occurrence of these occasionally fatal events because of the victims' reluctance to seek medical care. Victims are predominantly young men who have been drinking alcohol. Bites may be intentionally provoked. In Europe, the species most often involved are cobras, green mambas, American pit vipers (particularly rattlesnakes), African adders, vipers and Asian green pit vipers. To illustrate the special problems involved, case histories are presented of bites by exotic species in the UK and of bites abroad, where patients were repatriated for treatment. In view of the relative rarity and diversity of these cases, expert advice must usually be sought. These requests should include information about the species thought to have been responsible and the history and timing of the evolution of envenoming. Sources of advice and antivenom are discussed together with recommendations for appropriate first aid and emergency treatment while this is being awaited. Respiratory and cardiovascular resuscitation may be required and when systemic or severe local envenoming develops, specific (equine or ovine) antivenom is indicated.
TRANSACTIONS OF THE ROYAL SOCIETY OF TROPICAL MEDICINE AND HYGIENE, 103 (9), pp. 965-966. | Citations: 6 (Scopus) | 2009. Response to comment on: Failure of a new antivenom to treat Echis ocellatus snake bite in rural Ghana: the importance of quality surveillance
Two new rotavirus vaccines have recently been licensed in many countries. However, their efficacy has only been shown against certain serotypes commonly circulating in Europe, North America, and Latin America, but thought to be globally important. To assess the potential impact of these vaccines in sub-Saharan Africa, where rotavirus mortality is high, knowledge of prevalent types is essential because an effective rotavirus vaccine is needed to protect against prevailing serotypes in the community. We did two systematic reviews and two meta-analyses of the most recent published data on the burden of rotavirus disease in children aged under 5 years and rotavirus serotypes circulating in countries in sub-Saharan Africa. Eligible studies were selected from PubMed/Medline, Cochrane Library, EmBase, LILACS, Academic Search Premier, Biological Abstracts, ISI Web of Science, and the African Index Medicus. Depending on the heterogeneity, DerSimonian-Laird random-effects or fixed-effects models were used for meta-analyses. Geographical variability in rotavirus burden within countries in sub-Saharan Africa is substantial, and most countries lack information on rotavirus epidemiology. We estimated that annual mortality for this region was 243.3 (95% CI 187.6-301.7) deaths per 100,000 under 5 years (ie, a total of 300,000 children die of rotavirus infection in this region each year). The most common G type detected was G1 (34.9%), followed by G2 (9.1%), and G3 (8.6%). The most common P types detected were P (35.5%) and P (27.5%). Accurate information should be collected from surveillance based on standardised methods in these countries to obtain comparable data on the burden of disease and the circulating strains to assess the potential impact of vaccine introduction.
Int J Tuberc Lung Dis, 13 (9), pp. 1112-1118. | Citations: 7 (Web of Science Lite) | 2009. Evaluation of FASTPlaqueTB to diagnose smear-negative tuberculosis in a peripheral clinic in Kenya.
OBJECTIVE: To evaluate the performance and feasibility of FASTPlaqueTB in smear-negative tuberculosis (TB) suspects in a peripheral clinic after laboratory upgrading. DESIGN: Patients with cough ≥2 weeks, two sputum smear-negative results, no response to 1 week of amoxicillin and abnormal chest X-ray were defined as smear-negative suspects. One sputum sample was collected, decontaminated and divided into two: half was tested with FASTPlaqueTB in the clinic laboratory and the other half was cultured on Löwenstein-Jensen medium in the Kenyan Medical Research Institute. Test sensitivity and specificity were evaluated in all patients and in human immunodeficiency virus (HIV) infected patients. Feasibility was assessed by the contamination rate and the resources required to upgrade the laboratory. RESULTS: Of 208 patients included in the study, 56.2% were HIV-infected. Of 203 FASTPlaqueTB tests, 95 (46.8%) were contaminated, which interfered with result interpretation and led to the interruption of the study. Sensitivity and specificity were respectively 31.2% (95%CI 12.1-58.5) and 94.9% (95%CI 86.8-98.4) in all patients and 33.3% (95%CI 9.9-65.1) and 93.9% (95%CI 83.1-98.7) in HIV-infected patients. Upgrading the laboratory cost €20,000. CONCLUSION: FASTPlaqueTB did not perform satisfactorily in this setting. If contamination can be reduced, in addition to laboratory upgrading, its introduction in peripheral clinics would require further assessment in smear-negative and HIV co-infected patients and adaptation of the test for easier use.
INFECTION, 37, pp. 36-36. | 2009. The human antibody response to S. aureus colonization and sepsis
TROPICAL MEDICINE & INTERNATIONAL HEALTH, 14, pp. 38-39. | 2009. Mechanisms of vascular leakage and bleeding during dengue infections
It is well understood that sociocultural practices strongly influence Taenia solium transmission; however, the extent to which interspecific parasite competition moderates Taenia transmission has yet to be determined. This is certainly the case in Southeast Asia where T. solium faces competition in both the definitive host (people) and the intermediate host (pigs). In people, adult worms of T. solium, T. saginata and T. asiatica compete through density-dependent crowding mechanisms. In pigs, metacestodes of T. solium, T. hydatigena and T. asiatica compete through density-dependent immune-mediated interactions. Here, we describe the biological and epidemiological implications of Taenia competition and propose that interspecific competition has a moderating effect on the transmission dynamics of T. solium in the region. Furthermore, we argue that this competitive ecological scenario should be considered in future research and surveillance activities examining T. solium cysticercosis and taeniasis in Southeast Asia.
BACKGROUND: Progress in therapy for cryptococcal meningitis has been slow because of the lack of a suitable marker of treatment response. Previously, we demonstrated the statistical power of a novel endpoint, the rate of clearance of infection, based on serial quantitative cultures of cerebrospinal fluid, to differentiate the fungicidal activity of alternative antifungal drug regimens. We hypothesized that the rate of clearance of infection should also be a clinically meaningful endpoint. METHODS: We combined data from cohorts of patients with human immunodeficiency virus-associated cryptococcal meningitis from Thailand, South Africa, and Uganda, for whom the rate of clearance of infection was determined, and clinical and laboratory data prospectively collected, and explored the association between the rate of clearance of infection and mortality by Cox survival analyses. RESULTS: The combined cohort comprised 262 subjects. Altered mental status at presentation, a high baseline organism load, and a slow rate of clearance of infection were independently associated with increased mortality at 2 and 10 weeks. Rate of clearance of infection was associated with antifungal drug regimen and baseline cerebrospinal fluid interferon-gamma levels. CONCLUSIONS: The results support the use of the rate of clearance of infection or early fungicidal activity as a means to explore antifungal drug dosages and combinations in phase II studies. An increased understanding of how the factors determining outcome interrelate may help clarify opportunities for intervention.
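The endpoint described above, the rate of clearance of infection (early fungicidal activity), is summarised per patient as the slope of log10 CFU/ml of cerebrospinal fluid against time, estimated from the serial quantitative cultures. A minimal least-squares sketch; the example culture counts are invented for illustration:

```python
# Rate of clearance of infection as the ordinary least-squares slope of
# log10(CFU/ml of CSF) against time. Example counts are hypothetical.
import math

def clearance_rate(days, cfu_per_ml):
    """OLS slope of log10(CFU/ml) vs day; a more negative slope
    means faster fungicidal activity."""
    logs = [math.log10(c) for c in cfu_per_ml]
    n = len(days)
    mx = sum(days) / n
    my = sum(logs) / n
    sxx = sum((x - mx) ** 2 for x in days)
    sxy = sum((x - mx) * (y - my) for x, y in zip(days, logs))
    return sxy / sxx

# hypothetical serial quantitative CSF cultures on days 0, 3, 7 and 14
rate = clearance_rate([0, 3, 7, 14], [1e5, 3e4, 5e3, 1e2])
```

A slope like this, computed per patient, is the quantity the study relates to 2- and 10-week mortality in the Cox models.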
TROPICAL MEDICINE & INTERNATIONAL HEALTH, 14, pp. 83-84. | 2009. Multi-country evaluation of the sensitivity and specificity of two commercially available NS1 ELISA assays for dengue diagnosis
BACKGROUND: Women prisoners tend to suffer poor health on a range of indicators. This study sought to explore women prisoners' perceptions of the impact of imprisonment on their health. METHODS: This qualitative study involved adult women prisoners in two closed local prisons. Focus groups and individual interviews were conducted. RESULTS: Women prisoners reported that imprisonment impacted negatively upon their health. The initial shock of imprisonment, separation from families and enforced living with other women suffering drug withdrawal and serious mental health problems affected their own mental health. Over the longer term, women complained of detention in unhygienic facilities by regimes that operated to disempower them, including in the management of their own health. Women described responses to imprisonment that were also health-negating, such as increased smoking, eating poorly and seeking psychotropic medication. However, imprisonment could also offer a respite from lives characterised by poverty, social exclusion, substance misuse and violence, with perceived improvements in health. CONCLUSION: The impact of imprisonment on women's health was mixed but was largely perceived to be negative. Despite policy initiatives to introduce health promotion in prisons, there is little evidence of the extent to which this has been effective. The current policy climate in the UK makes it especially timely to examine the reported experience of women prisoners themselves about the impact of imprisonment on their health and to re-evaluate health promotion in women's prisons.
J Infect, 59 (3), pp. 219-222. | Citations: 3 (Web of Science Lite) | 2009. Apolipoprotein E-epsilon2 confers risk of pulmonary tuberculosis in women from the Indian subcontinent--a preliminary study.
SUMMARY AND KEY RECOMMENDATIONS: The aim of these guidelines is to describe a practical but evidence-based approach to the diagnosis and treatment of central nervous system tuberculosis in children and adults. We have presented guidance on tuberculous meningitis (TBM), intra-cerebral tuberculoma without meningitis, and tuberculosis affecting the spinal cord. Our key recommendations are as follows: 1. TBM is a medical emergency. Treatment delay is strongly associated with death and empirical anti-tuberculosis therapy should be started promptly in all patients in whom the diagnosis of TBM is suspected. Do not wait for microbiological or molecular diagnostic confirmation. 2. The diagnosis of TBM is best made with lumbar puncture and examination of the cerebrospinal fluid (CSF). Suspect TBM if there is a CSF leucocytosis (predominantly lymphocytes), the CSF protein is raised, and the CSF:plasma glucose is <50%. The diagnostic yield of CSF microscopy and culture for Mycobacterium tuberculosis increases with the volume of CSF submitted; repeat the lumbar puncture if the diagnosis remains uncertain. 3. Imaging is essential for the diagnosis of cerebral tuberculoma and tuberculosis involving the spinal cord, although the radiological appearances do not confirm the diagnosis. A tissue diagnosis (by histopathology and mycobacterial culture) should be attempted whenever possible, either by biopsy of the lesion itself, or through diagnostic sampling from extra-neural sites of disease e.g. lung, gastric fluid, lymph nodes, liver, bone marrow. 4. Treatment for all forms of CNS tuberculosis should consist of 4 drugs (isoniazid, rifampicin, pyrazinamide, ethambutol) for 2 months followed by 2 drugs (isoniazid, rifampicin) for at least 10 months. Adjunctive corticosteroids (either dexamethasone or prednisolone) should be given to all patients with TBM, regardless of disease severity. 5. 
Children with CNS tuberculosis should ideally be managed by a paediatrician with familiarity and expertise in paediatric tuberculosis or otherwise with input from a paediatric infectious diseases unit. The Children's HIV Association of UK and Ireland (CHIVA) provide further guidance on the management of HIV-infected children (www.chiva.org.uk). 6. All patients with suspected or proven tuberculosis should be offered testing for HIV infection. The principles of CNS tuberculosis diagnosis and treatment are the same for HIV infected and uninfected individuals, although HIV infection broadens the differential diagnosis and anti-retroviral treatment complicates management. Tuberculosis in HIV infected patients should be managed either within specialist units by physicians with expertise in both HIV and tuberculosis, or in a combined approach between HIV and tuberculosis experts. The co-administration of anti-retroviral and anti-tuberculosis drugs should follow guidance issued by the British HIV association (www.bhiva.org).
OBJECTIVES: To evaluate the prevalence of flavivirus infection in Vientiane city (Lao PDR), to describe the spatial distribution of infection within this city, and to explore the link between flavivirus seroprevalence and urbanization levels of residential neighbourhoods. METHODS: A seroprevalence survey was carried out in 2006 including 1990 adults (≥35 years) and 1568 children (≥6 months and <6 years) randomly selected. RESULTS: The prevalence of individuals with previous flavivirus infection (i.e. negative for both DEN and JE IgM but positive for DEN IgG) was 57.7%, with a significantly (P < 0.001) higher prevalence among adults (84.6%; 95% confidence interval (CI) = 82.4-86.8) than children (9.4%; 95% CI = 7.2-11.6). The prevalence of individuals with recent flavivirus infection (i.e. positive for DEN and/or JE IgM) was 6.5% and also significantly (P < 0.001) higher among adults (10.0%; 95% CI = 8.3-11.7) than children (2.5%; 95% CI = 1.5-3.5). In terms of spatial distribution, IgG prevalence was significantly (P < 0.001) higher among individuals living in the central city (60.1%; 95% CI = 56.2-64.1) than among those living in the periphery (44.3%; 95% CI = 41.5-47.2). In contrast, seroprevalence of recent flavivirus infections was significantly (P < 0.001) higher among individuals living in the periphery (8.8%; 95% CI = 6.9-10.7) than in the central city (4.0%; 95% CI = 2.9-5.2). This association was also statistically consistent (P < 0.01) in multivariate logistic regression after controlling for individual risk factors. CONCLUSIONS: Our findings indicate that the level of urbanization of residential neighbourhoods influences the risk of flavivirus infection. The spatial distribution of flavivirus infection varies, even within a small city of less than 300,000 inhabitants such as Vientiane.
Ultrasound in Obstetrics & Gynecology, 34 (S1), pp. 13-13. | 2009. OC08.01: Quality assessment of fetal biometry in locally trained sonographers in a developing country setting
Progression of a chronic disease can lead to the development of secondary illnesses. An example is the development of active tuberculosis (TB) in HIV-infected individuals. HIV disease progression, as indicated by declining CD4+ T-cell count (CD4), increases both the risk of TB and the risk of AIDS-related mortality. This means that CD4 is a time-dependent confounder for the effect of TB on AIDS-related mortality. Part of the effect of TB on AIDS-related mortality may be indirect, by causing a drop in CD4. Estimating the total causal effect of TB on AIDS-related mortality using standard statistical techniques, conditioning on CD4 to adjust for confounding, then gives an underestimate of the true effect. Marginal structural models (MSMs) can be used to obtain an unbiased estimate. We describe an easily implemented algorithm that uses G-computation to fit an MSM, as an alternative to inverse probability weighting (IPW). Our algorithm is simplified by utilizing individual baseline parameters that describe CD4 development. Simulation confirms that the algorithm can produce an unbiased estimate of the effect of a secondary illness when a marker for primary disease progression is both a confounder and an intermediary for the effect of the secondary illness. We used the algorithm to estimate the total causal effect of TB on AIDS-related mortality in HIV-infected individuals, and found a hazard ratio of 3.5 (95 per cent confidence interval 1.2-9.1).
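The study's algorithm handles time-dependent confounding, which is beyond a short sketch, but the core of G-computation, standardising conditional risks over the confounder distribution (the g-formula), can be illustrated for a single time point. All probabilities below are invented for illustration:

```python
# Simplified point-exposure illustration of the g-formula (G-computation):
# the marginal risk under "expose everyone" is the confounder-standardised
# average of conditional risks. The paper's algorithm extends this idea to
# the time-dependent confounder CD4; all numbers here are hypothetical.

# P(baseline confounder level), e.g. low vs high CD4 count
p_conf = {"low_cd4": 0.3, "high_cd4": 0.7}

# conditional risks P(death | exposure, confounder) - invented values
risk = {
    ("tb", "low_cd4"): 0.40, ("tb", "high_cd4"): 0.10,
    ("no_tb", "low_cd4"): 0.20, ("no_tb", "high_cd4"): 0.05,
}

def g_formula(exposure):
    """Standardise the conditional risks over the confounder distribution."""
    return sum(risk[(exposure, c)] * p for c, p in p_conf.items())

risk_tb = g_formula("tb")        # 0.40*0.3 + 0.10*0.7 = 0.19
risk_no_tb = g_formula("no_tb")  # 0.20*0.3 + 0.05*0.7 = 0.095
causal_risk_ratio = risk_tb / risk_no_tb  # 2.0
```

Naively conditioning on the confounder instead of standardising over it is what, in the time-dependent setting the abstract describes, blocks part of the effect mediated through CD4 and biases the estimate.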
BACKGROUND: The incidence of community-associated methicillin-resistant Staphylococcus aureus (CA-MRSA) infection is rising in the developed world but appears to be rare in developing countries. One explanation for this difference is that resource poor countries lack the diagnostic microbiology facilities necessary to detect the presence of CA-MRSA carriage and infection. METHODOLOGY AND PRINCIPAL FINDINGS: We developed diagnostic microbiology capabilities at the Angkor Hospital for Children, Siem Reap, western Cambodia in January 2006 and in the same month identified a child with severe community-acquired impetigo caused by CA-MRSA. A study was undertaken to identify and describe additional cases presenting between January 2006 and December 2007. Bacterial isolates underwent molecular characterization using multilocus sequence typing, staphylococcal cassette chromosome mec (SCCmec) typing, and PCR for the presence of the genes encoding Panton-Valentine Leukocidin (PVL). Seventeen children were identified with CA-MRSA infection, of which 11 had skin and soft tissue infection and 6 had invasive disease. The majority of cases were unrelated in time or place. Molecular characterization identified two independent MRSA clones; fifteen isolates were sequence type (ST) 834, SCCmec type IV, PVL gene-negative, and two isolates were ST 121, SCCmec type V, PVL gene-positive. CONCLUSIONS: This represents the first ever report of MRSA in Cambodia, spread of which would pose a significant threat to public health. The finding that cases were mostly unrelated in time or place suggests that these were sporadic infections in persons who were CA-MRSA carriers or contacts of carriers, rather than arising in the context of an outbreak.
BACKGROUND: Clinical management of malaria is a major health issue in sub-Saharan Africa. New strategies based on intermittent preventive treatment (IPT) can tackle disease burden by simultaneously reducing frequency of infections and life-threatening illness in infants (IPTi) and children (IPTc), while allowing for immunity to build up. However, concerns as to whether immunity develops efficiently in treated individuals, and whether there is a rebound effect after treatment is halted, have made it imperative to define the effects that IPTi and IPTc exert on the clinical malaria scenario. METHODS AND FINDINGS: Here, we simulate several schemes of intervention under different transmission settings, while varying immunity build-up assumptions. Our model predicts that infection risk and the effectiveness of acquisition of clinical immunity under prophylaxis are associated with intervention impact during the treatment and follow-up periods. These effects vary across regions of different endemicity and are highly correlated with the interplay between the timing of interventions in age and the age-dependent risk of acquiring an infection. However, even when significant rebound effects are predicted to occur, the overall intervention impact is positive. CONCLUSIONS: IPTi is predicted to have minimal impact on the acquisition of clinical immunity, since it does not interfere with the occurrence of mild infections, thus failing to reduce the underlying force of infection. On the contrary, IPTc has a significant potential to reduce transmission, specifically in areas where it is already low to moderate.
BACKGROUND: Clinical malaria has proven an elusive burden to enumerate. Many cases go undetected by routine disease recording systems. Epidemiologists have, therefore, frequently defaulted to actively measuring malaria in population cohorts through time. Measuring the clinical incidence of malaria longitudinally is labour-intensive and impossible to undertake universally. There is a need, therefore, to define a relationship between clinical incidence and the easier and more commonly measured index of infection prevalence: the "parasite rate". This relationship can help provide an informed basis to define malaria burdens in areas where health statistics are inadequate. METHODS: Formal literature searches were conducted for Plasmodium falciparum malaria incidence surveys undertaken prospectively through active case detection at least every 14 days. The data were abstracted, standardized and geo-referenced. Incidence surveys were time-space matched with modelled estimates of infection prevalence derived from a larger database of parasite prevalence surveys and modelling procedures developed for a global malaria endemicity map. Several potential relationships between clinical incidence and infection prevalence were then specified in a non-parametric Gaussian process model with minimal, biologically informed, prior constraints. Bayesian inference was then used to choose between the candidate models. RESULTS: The suggested relationships, with credible intervals, are shown for Africa and for a combined America and Central and South East Asia region. In both regions clinical incidence increased slowly and smoothly as a function of infection prevalence. In Africa, when infection prevalence exceeded 40%, clinical incidence reached a plateau of 500 cases per thousand of the population per annum. In the combined America and Central and South East Asia region, this plateau was reached at 250 cases per thousand of the population per annum.
A temporal volatility model was also incorporated to facilitate a closer description of the variance in the observed data. CONCLUSION: It was possible to model a relationship between clinical incidence and P. falciparum infection prevalence but the best-fit models were very noisy reflecting the large variance within the observed opportunistic data sample. This continuous quantification allows for estimates of the clinical burden of P. falciparum of known confidence from wherever an estimate of P. falciparum prevalence is available.
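The fitted relationship is a non-parametric Gaussian process, but its reported Africa summary (incidence rising with prevalence and plateauing at roughly 500 cases per thousand per annum above about 40% prevalence) can be caricatured by a piecewise-linear toy function; the linear rise below the knee is purely an illustrative assumption, not the fitted curve:

```python
# Purely illustrative piecewise-linear stand-in for the reported Africa
# relationship (the study itself fits a non-parametric Gaussian process):
# clinical incidence rises with parasite prevalence and plateaus at about
# 500 cases per thousand per annum once prevalence exceeds roughly 40%.

def toy_incidence_africa(prevalence, plateau=500.0, knee=0.40):
    """Clinical incidence (cases per thousand per annum) as a toy
    saturating function of P. falciparum parasite prevalence (0-1)."""
    if not 0.0 <= prevalence <= 1.0:
        raise ValueError("prevalence must be a proportion in [0, 1]")
    return plateau * min(prevalence / knee, 1.0)

# below the knee incidence scales linearly; above it, the curve is flat
low = toy_incidence_africa(0.20)
high = toy_incidence_africa(0.60)
```

A continuous mapping of this kind is what allows a clinical burden estimate to be read off wherever a parasite prevalence estimate is available, as the conclusion notes.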
Protective immunity generated following malaria infection may comprise antibodies (Ab) or T cells against malaria antigens (Ag) of different stages; however, the short-lived immunity that is observed suggests deficiency in immune memory or regulatory activity. In this study, cellular immune responses were investigated in individuals receiving Plasmodium falciparum sporozoite challenge by the natural (mosquito bite) route as part of a malaria vaccine efficacy trial. Parasitemia, monitored by blood film microscopy and PCR, was subsequently cleared with drugs. All individuals demonstrated stable IFN-gamma, IL-2 and IL-4 ex vivo ELISPOT effector responses against P. falciparum-infected RBC (iRBC) Ag, 28 and 90 days after challenge. However, iRBC-specific central memory responses, as measured by IFN-gamma cultured ELISPOT, were low and unstable over time, despite CD4(+) T cells being highly proliferative by CFSE dilution, and showed an inverse relationship to parasite density. In support of the observation of poor memory, co-culture experiments showed reduced responses to common recall Ag, indicating malaria-specific regulatory activity. This activity could not be accounted for by the expression of IL-10, TGF-beta, FOXP3 or CTLA-4, but proliferating T cells expressed high levels of CD95, indicating a pro-apoptotic phenotype. Lastly, there was an inverse relationship between FOXP3 expression, when measured 10 days after challenge, and ex vivo IFN-gamma measured more than 100 days later. This study shows that malaria infection elicits specific Th1 and Th2 effector cells, but concomitant weak central memory and regulatory activity, which may help to explain the short-lived immunity observed.
AMERICAN JOURNAL OF TROPICAL MEDICINE AND HYGIENE, 81 (2), pp. 335-337. | Citations: 15 (Scopus) | 2009. Short Report: Patterns of Organ Involvement in Recurrent Melioidosis
Recurrent melioidosis can be caused by two different mechanisms: relapse or re-infection. We examined the pattern of organ involvement in the first and second episodes in individual patients. Evaluation of 140 patients with recurrence showed that similar patterns of disease occurred during the first and second episode, independent of whether this was caused by relapse or re-infection.
Eur J Clin Pharmacol, 65 (8), p. 847. | 2009. The pharmacokinetics of artemether and lumefantrine in pregnant women with uncomplicated falciparum malaria.
Am J Trop Med Hyg, 81 (2), pp. 190-194. | Citations: 20 (Scopus) | 2009. Sennetsu neorickettsiosis: a probable fish-borne cause of fever rediscovered in Laos.
Neorickettsia sennetsu has been described from Japan and Malaysia, causing a largely forgotten infectious mononucleosis-like disease. Because it is believed to be contracted from eating raw fish, frequently consumed in the Lao PDR, we looked for evidence of N. sennetsu among Lao patients and fish. A buffy coat from 1 of 91 patients with undifferentiated fever was positive by 16S rRNA amplification and sequencing and real-time polymerase chain reactions (PCR) targeting two N. sennetsu genes. Lao blood donors and patients with fever, hepatitis, or jaundice (N = 1,132) had a high prevalence (17%) of immunofluorescence assay IgG anti-N. sennetsu antibodies compared with 4% and 0% from febrile patients (N = 848) in Thailand and Malaysia, respectively. We found N. sennetsu DNA by PCR, for the first time, in a fish (Anabas testudineus). These data suggest that sennetsu may be an under-recognized cause of fever and are consistent with the hypothesis that it may be contracted from eating raw fish.
Am J Trop Med Hyg, 81 (2), pp. 338-342. | Citations: 20 (Scopus) | 2009. Resistance to chloroquine by Plasmodium vivax at Alor in the Lesser Sundas Archipelago in eastern Indonesia.
The therapeutic response to standard chloroquine therapy against Plasmodium vivax was evaluated in 36 subjects living at Alor in the Lesser Sundas Archipelago of eastern Indonesia. Chloroquine levels were measured in 32 individuals and showed evidence of adequate absorption of standard chloroquine therapy. Three subjects failed treatment by Day 2 or 3, with evidence of rising asexual parasitemia, and two others had stable parasitemia to Day 7. Ten more subjects had recurrent parasitemia by Day 14, two by Day 21, and another one by Day 28. Three subjects had recurrent parasitemia on Days 14 and 28, but with chloroquine < 100 ng/mL. Eleven subjects cleared parasitemia by Day 3 and had no recurrences up to Day 28. In summary, the 28-day cumulative incidence of confirmed resistance to chloroquine was 56% of infections evaluated. Chloroquine should not be considered adequate for treatment of acute vivax malaria acquired in this region.
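The 56% figure follows directly from the counts reported in the abstract. As a sketch of the arithmetic: of the 32 evaluable subjects with measured chloroquine levels, 18 had confirmed resistance, while the 3 recurrences with chloroquine < 100 ng/mL and the 11 subjects who cleared without recurrence do not count toward the numerator.

```python
# Reproducing the 28-day cumulative incidence of confirmed
# chloroquine resistance from the counts given in the abstract.
confirmed_failures = {
    "Day 2-3 (rising asexual parasitemia)": 3,
    "Day 7 (stable parasitemia)": 2,
    "Day 14 recurrence": 10,
    "Day 21 recurrence": 2,
    "Day 28 recurrence": 1,
}
evaluable = 32  # subjects with measured chloroquine levels
# Not counted: 3 recurrences with chloroquine < 100 ng/mL (not confirmed
# resistance) and 11 subjects who cleared with no recurrence (18+3+11 = 32).
total_failures = sum(confirmed_failures.values())  # 18
cumulative_incidence = total_failures / evaluable
print(f"{cumulative_incidence:.0%}")  # → 56%
```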
BACKGROUND: Human immunodeficiency virus (HIV) infection, malnutrition, and invasive bacterial infection (IBI) are reported among children with severe malaria. However, it is unclear whether their co-occurrence with falciparum parasitization and severe disease happens by chance or by association among children in areas where malaria is endemic. METHODS: We examined 3068 consecutive children admitted to a Kenyan district hospital with clinical features of severe malaria and 592 control subjects from the community. We performed multivariable regression analysis, with each case weighted for its probability of being due to falciparum malaria, using estimates of the fraction of severe disease attributable to malaria at different parasite densities derived from cross-sectional parasitological surveys of healthy children from the same community. RESULTS: HIV infection was present in 133 (12%) of 1071 consecutive parasitemic admitted children (95% confidence interval [CI], 11%-15%). Parasite densities were higher in HIV-infected children. The odds ratio associated with HIV infection for admission with true severe falciparum malaria was 9.6 (95% CI, 4.9-19); however, this effect was restricted to children aged 1 year. Malnutrition was present in 507 (25%) of 2048 consecutive parasitemic admitted children (95% CI, 23%-27%). The odds ratio associated with malnutrition for admission with true severe falciparum malaria was 4.0 (95% CI, 2.9-5.5). IBI was detected in 127 (6%) of 2048 consecutive parasitemic admitted children (95% CI, 5.2%-7.3%). All 3 comorbidities were associated with increased case fatality. CONCLUSIONS: HIV, malnutrition, and IBI are biologically associated with severe disease due to falciparum malaria rather than being simply alternative diagnoses in coincidentally parasitized children in an endemic area.
BACKGROUND: In developing countries, the study of cytomegalovirus (CMV) coinfection in HIV-infected patients remains neglected. Quantitative CMV polymerase chain reaction (PCR) is the gold standard diagnostic tool for analyzing serum CMV replication and for predicting CMV disease. We estimated the prevalence of replicating CMV in sera of newly diagnosed HIV-infected Cambodian patients and examined its impact on mortality. METHODS: This cohort study was based on 2 highly active antiretroviral therapy treatment programs in Cambodia between 2004 and 2007. Quantitative CMV PCR was performed on baseline serum samples of 377 HIV-infected patients. RESULTS: The prevalence of serum CMV DNA was 55.2% (150 of 272) in patients with CD4 count <100/mm³. In multivariate analysis, hemoglobin <9 g/dL, CD4 count <100/mm³, and Karnofsky index <50 were independently associated with positive serum CMV DNA at baseline. During a 3-year follow-up period, CMV viral load ≥3.1 log10 copies per milliliter was significantly associated with death independently of CD4 count, other opportunistic infections, and highly active antiretroviral therapy. CONCLUSIONS: As in industrialized countries, serum CMV replication is highly prevalent among HIV-infected Cambodian patients and is associated with increased mortality. This underscores the importance of diagnosing CMV infection by PCR in sera of HIV-infected patients with CD4 count <100/mm³ and treating this opportunistic infection to reduce its associated mortality.
Globally, men who have sex with men (MSM) continue to bear a high burden of HIV infection. In sub-Saharan Africa, same-sex behaviours have been largely neglected by HIV research up to now. The results from recent studies, however, indicate the widespread existence of MSM groups across Africa, and high rates of HIV infection, HIV risk behaviour, and evidence of behavioural links between MSM and heterosexual networks have been reported. Yet most African MSM have no safe access to relevant HIV/AIDS information and services, and many African states have not begun to recognise or address the needs of these men in the context of national HIV/AIDS prevention and control programmes. The HIV/AIDS community now has considerable challenges in clarifying and addressing the needs of MSM in sub-Saharan Africa; homosexuality is illegal in most countries, and political and social hostility are endemic. An effective response to HIV/AIDS requires improved strategic information about all risk groups, including MSM. The belated response to MSM with HIV infection needs rapid and sustained national and international commitment to the development of appropriate interventions and action to reduce structural and social barriers to make these accessible.
Classical swine fever (CSF) is a highly contagious and severe viral disease of swine resulting in substantial production losses in different farming systems in many regions of the world. The accurate and rapid detection of CSF outbreaks is reliant on sensitive and specific laboratory testing and is a key component of disease control. Specific detection of CSF virus can be achieved by virus isolation in tissue culture, antigen capture or the detection of viral RNA using molecular techniques. In order to reduce the time taken to achieve a diagnostic result and simplify testing methods, an antigen capture ELISA using immunomagnetic beads (IMB) as the solid phase was developed and compared to a microplate-based antigen capture (AC)-ELISA. The IMB-ELISA has up to 64-fold greater analytical sensitivity than the AC-ELISA and initial estimates of diagnostic sensitivity and specificity are 100%. The IMB-ELISA has a highly robust, rapid and stable test format and is simpler to perform than the AC-ELISA. The IMB-ELISA has the added advantage that a result can be sensitively and specifically determined by eye, lending it to the possibility of adaptation to a near-to-field test with minimal equipment or expertise needed.
Trends Parasitol, 25 (8), pp. 350-351. | Citations: 3 (Web of Science Lite) | 2009. Fluorescein angiography findings strengthen the theoretical basis for trialling neuroprotective agents in cerebral malaria.
Studies and trials in the field are key to the development of a vaccine for malaria. Our limited knowledge of naturally acquired immunity and of transmission dynamics and disease causation in the field imposes limitations on our ability to predict the efficacy of candidate vaccines, and the eventual outcome of deploying an efficacious vaccine at a population level.
BMJ, 339 (7715), p. b2066. | Citations: 8 (Scopus) | 2009. Oral quinine for the treatment of uncomplicated malaria.
Recently, two randomized controlled phase II studies showed that acute initiation of statin treatment directly after aneurysmal subarachnoid hemorrhage (SAH) decreases the incidence of radiologic vasospasm and clinical signs of delayed cerebral ischemia (DCI), and even reduces mortality. It was hypothesized that the beneficial effect resulted from pleiotropic effects of statins. The purpose of this study was to investigate the biologic effects of acute statin treatment in patients with SAH. We performed an exploratory single-center, prospective, randomized, double-blind, placebo-controlled trial. Patients were randomized to simvastatin 80 mg or placebo once daily. A total of 32 patients were included. There were no statistically significant differences in clinical baseline characteristics. With regard to primary outcomes, there were significant differences by treatment group for total cholesterol and low-density lipoprotein (LDL) cholesterol (P<0.0001), but not for parameters of coagulation, fibrinolysis, endothelium function, and inflammation. With regard to secondary outcomes, no differences were observed in the incidence of transcranial Doppler vasospasm, clinical signs of DCI, and poor outcome. We conclude that both the primary and secondary outcome results of this study do not support a beneficial effect of simvastatin in patients with SAH.
European Journal of Wildlife Research, 55 (4), pp. 439-442. | 2009. Anaplasma phagocytophilum infection in a multi-species deer community in the New Forest, England.
JAIDS Journal of Acquired Immune Deficiency Syndromes, 51 (5), pp. 648-649. | Citations: 1 (Scopus) | 2009. Oral Fluid HIV Rapid Testing in Southern Africa Is Suitable for Population-Level Surveillance But May Not Be So for Individual Diagnosis (Invited Response to Scott et al: Can Oral Fluid Testing Be Used to Replace Blood-Based HIV Rapid Testing to Improve Access to Diagnosis in South Africa?)
BACKGROUND: Understanding spatio-temporal variation in malaria incidence provides a basis for effective disease control planning and monitoring. METHODS: Monthly surveillance data between 1991 and 2006 for Plasmodium vivax and Plasmodium falciparum malaria across 128 counties were assembled for Yunnan, a province of China with one of the highest burdens of malaria. County-level Bayesian Poisson regression models of incidence were constructed, with effects for rainfall, maximum temperature and temporal trend. The model also allowed for spatial variation in county-level incidence and temporal trend, and dependence between incidence in June-September and the preceding January-February. RESULTS: Models revealed strong associations between malaria incidence and both rainfall and maximum temperature. There was a significant association between incidence in June-September and the preceding January-February. Raw standardised morbidity ratios showed a high incidence in some counties bordering Myanmar, Laos and Vietnam, and counties in the Red River valley. Clusters of counties in south-western and northern Yunnan were identified that had high incidence not explained by climate. The overall trend in incidence decreased, but there was significant variation between counties. CONCLUSION: Dependence between incidence in summer and the preceding January-February suggests a role of intrinsic host-pathogen dynamics. Incidence during the summer peak might be predictable based on incidence in January-February, facilitating malaria control planning, scaled months in advance to the magnitude of the summer malaria burden. Heterogeneities in county-level temporal trends suggest that reductions in the burden of malaria have been unevenly distributed throughout the province.
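The core of such a model is a log-linear Poisson regression of monthly case counts on climate covariates. The sketch below shows that structure only; the coefficients and covariate values are purely illustrative, and it omits the spatial and temporal random effects of the authors' full Bayesian model.

```python
import numpy as np
from math import lgamma

# Minimal sketch of the log-linear Poisson structure underlying a
# county-level, climate-driven incidence model. Coefficient values
# (b0, b_rain, b_temp) and covariates are illustrative assumptions.
def expected_cases(rainfall_mm, max_temp_c, b0=-2.0, b_rain=0.004, b_temp=0.08):
    """Expected monthly count: lambda = exp(b0 + b_rain*rain + b_temp*temp)."""
    return np.exp(b0 + b_rain * rainfall_mm + b_temp * max_temp_c)

def poisson_loglik(observed, expected):
    """Log-likelihood of observed monthly counts under Poisson(expected)."""
    return float(sum(k * np.log(lam) - lam - lgamma(k + 1)
                     for k, lam in zip(observed, expected)))

rain = np.array([120.0, 40.0])  # mm; hypothetical wet vs. dry month
temp = np.array([28.0, 22.0])   # deg C; hypothetical
lam = expected_cases(rain, temp)
# The wetter, warmer month has the higher expected incidence:
print(lam[0] > lam[1])  # → True
ll = poisson_loglik([3, 1], lam)
```

In the published model this likelihood would be embedded in a Bayesian hierarchy, with county-specific intercepts and trends given spatially structured priors.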
BACKGROUND: Artemisinin-based combination therapies are the recommended first-line treatments of falciparum malaria in all countries with endemic disease. There are recent concerns that the efficacy of such therapies has declined on the Thai-Cambodian border, historically a site of emerging antimalarial-drug resistance. METHODS: In two open-label, randomized trials, we compared the efficacies of two treatments for uncomplicated falciparum malaria in Pailin, western Cambodia, and Wang Pha, northwestern Thailand: oral artesunate given at a dose of 2 mg per kilogram of body weight per day, for 7 days, and artesunate given at a dose of 4 mg per kilogram per day, for 3 days, followed by mefloquine at two doses totaling 25 mg per kilogram. We assessed in vitro and in vivo Plasmodium falciparum susceptibility, artesunate pharmacokinetics, and molecular markers of resistance. RESULTS: We studied 40 patients in each of the two locations. The overall median parasite clearance times were 84 hours (interquartile range, 60 to 96) in Pailin and 48 hours (interquartile range, 36 to 66) in Wang Pha (P<0.001). Recrudescence confirmed by means of polymerase-chain-reaction assay occurred in 6 of 20 patients (30%) receiving artesunate monotherapy and 1 of 20 (5%) receiving artesunate-mefloquine therapy in Pailin, as compared with 2 of 20 (10%) and 1 of 20 (5%), respectively, in Wang Pha (P=0.31). These markedly different parasitologic responses were not explained by differences in age, artesunate or dihydroartemisinin pharmacokinetics, results of isotopic in vitro sensitivity tests, or putative molecular correlates of P. falciparum drug resistance (mutations or amplifications of the gene encoding a multidrug resistance protein [PfMDR1] or mutations in the gene encoding sarco-endoplasmic reticulum calcium ATPase6 [PfSERCA]). Adverse events were mild and did not differ significantly between the two treatment groups. CONCLUSIONS: P. 
falciparum has reduced in vivo susceptibility to artesunate in western Cambodia as compared with northwestern Thailand. Resistance is characterized by slow parasite clearance in vivo without corresponding reductions on conventional in vitro susceptibility testing. Containment measures are urgently needed. (ClinicalTrials.gov number, NCT00493363, and Current Controlled Trials number, ISRCTN64835265.)
Vaccine, 27 (35), pp. 4745-4746. | Citations: 2 (Web of Science Lite) | 2009. Response to "Poor control vaccines in two randomised trials of malaria vaccine?".
BACKGROUND: Counterfeit oral artesunate has been a major public health problem in mainland SE Asia, impeding malaria control. A countrywide stratified random survey was performed to determine the availability and quality of oral artesunate in pharmacies and outlets (shops selling medicines) in the Lao PDR (Laos). METHODS: In 2003, 'mystery' shoppers were asked to buy artesunate tablets from 180 outlets in 12 of the 18 Lao provinces. Outlets were selected using stratified random sampling by investigators not involved in sampling. Samples were analysed for packaging characteristics, by the Fast Red Dye test, high-performance liquid chromatography (HPLC), mass spectrometry (MS), X-ray diffractometry and pollen analysis. RESULTS: Of 180 outlets sampled, 25 (13.9%) sold oral artesunate. Outlets selling artesunate were more commonly found in the more malarious southern Laos. Of the 25 outlets, 22 (88%; 95%CI 68-97%) sold counterfeit artesunate, as defined by packaging and chemistry. No artesunate was detected in the counterfeits by any of the chemical analysis techniques and analysis of the packaging demonstrated seven different counterfeit types. There was complete agreement between the Fast Red dye test, HPLC and MS analysis. A wide variety of wrong active ingredients were found by MS. Of great concern, 4/27 (14.8%) fakes contained detectable amounts of artemisinin (0.26-115.7 mg/tablet). CONCLUSION: This random survey confirms results from previous convenience surveys that counterfeit artesunate is a severe public health problem. The presence of artemisinin in counterfeits may encourage malaria resistance to artemisinin derivatives. With increasing accessibility of artemisinin-derivative combination therapy (ACT) in Laos, the removal of artesunate monotherapy from pharmacies may be an effective intervention.
BACKGROUND: It is increasingly appreciated that the interpretation of health systems research studies is greatly facilitated by detailed descriptions of study context and the process of intervention. We have undertaken an 18-month hospital-based intervention study in Kenya aiming to improve care for admitted children and newborn infants. Here we describe the baseline characteristics of the eight hospitals as environments receiving the intervention, as well as the general and local health system context and its evolution over the 18 months. METHODS: Hospital characteristics were assessed using previously developed tools assessing the broad structure, process, and outcome of health service provision for children and newborns. Major health system or policy developments over the period of the intervention at a national level were documented prospectively by monitoring government policy announcements, the media, and through informal contacts with policy makers. At the hospital level, a structured, open questionnaire was used in face-to-face meetings with senior hospital staff every six months to identify major local developments that might influence implementation. These data provide an essential background for those seeking to understand the generalisability of reports describing the intervention's effects, and whether the intervention plausibly resulted in these effects. RESULTS: Hospitals had only modest capacity, in terms of infrastructure, equipment, supplies, and human resources available to provide high-quality care at baseline. For example, hospitals were lacking between 30 to 56% of items considered necessary for the provision of care to the seriously ill child or newborn. An increase in spending on hospital renovations, attempts to introduce performance contracts for health workers, and post-election violence were recorded as examples of national level factors that might influence implementation success generally. 
Examples of factors that might influence success locally included frequent and sometimes numerous staff changes, movements of senior departmental or administrative staff, and the presence of local 'donor' partners with alternative priorities. CONCLUSION: The effectiveness of interventions delivered at hospital level over periods realistically required to achieve change may be influenced by a wide variety of factors at national and local levels. We have demonstrated how dynamic such contexts are, and therefore the need to consider context when interpreting an intervention's effectiveness.
BACKGROUND: Organizational factors are considered to be an important influence on health workers' uptake of interventions that improve their practices. These are additionally influenced by factors operating at individual and broader health system levels. We sought to explore contextual influences on worker motivation, a factor that may modify the effect of an intervention aimed at changing clinical practices in Kenyan hospitals. METHODS: Franco LM, et al's (Health sector reform and public sector health worker motivation: a conceptual framework. Soc Sci Med. 2002, 54: 1255-66) model of motivational influences was used to frame the study. Qualitative methods including individual in-depth interviews, small-group interviews and focus group discussions were used to gather data from 185 health workers during one-week visits to each of eight district hospitals. Data were collected prior to a planned intervention aiming to implement new practice guidelines and improve quality of care. Additionally, on-site observations of routine health worker behaviour in the study sites were used to inform analyses. RESULTS: Study settings are likely to have important influences on worker motivation. Effective management at hospital level may create an enabling working environment modifying the impact of resource shortfalls. Supportive leadership may foster good working relationships between cadres, improve motivation through provision of local incentives and appropriately handle workers' expectations in terms of promotions, performance appraisal processes, and good communication. Such organisational attributes may counteract de-motivating factors at a national level, such as poor schemes of service, and enhance personally motivating factors such as the desire to maintain professional standards. CONCLUSION: Motivation is likely to influence powerfully any attempts to change or improve health worker and hospital practices. 
Some factors influencing motivation may themselves be influenced by the processes chosen to implement change.
BACKGROUND: Although considerable efforts are directed at developing international guidelines to improve clinical management in low-income settings they appear to influence practice rarely. This study aimed to explore barriers to guideline implementation in the early phase of an intervention study in four district hospitals in Kenya. METHODS: We developed a simple interview guide based on a simple characterisation of the intervention informed by review of major theories on barriers to uptake of guidelines. In-depth interviews, non-participatory observation, and informal discussions were then used to explore perceived barriers to guideline introduction and general improvements in paediatric and newborn care. Data were collected four to five months after in-service training in the hospitals. Data were transcribed, themes explored, and revised in two rounds of coding and analysis using NVivo 7 software, subjected to a layered analysis, reviewed, and revised after discussion with four hospital staff who acted as within-hospital facilitators. RESULTS: A total of 29 health workers were interviewed. Ten major themes preventing guideline uptake were identified: incomplete training coverage; inadequacies in local standard setting and leadership; lack of recognition and appreciation of good work; poor communication and teamwork; organizational constraints and limited resources; counterproductive health worker norms; absence of perceived benefits linked to adoption of new practices; difficulties accepting change; lack of motivation; and conflicting attitudes and beliefs. CONCLUSION: While the barriers identified are broadly similar in theme to those reported from high-income settings, their specific nature often differs. For example, at an institutional level there is an almost complete lack of systems to introduce or reinforce guidelines, poor teamwork across different cadres of health worker, and failure to confront poor practice. 
At an individual level, lack of interest in the evidence supporting guidelines, feelings that they erode professionalism, and expectations that people should be paid to change practice threaten successful implementation.
BACKGROUND: We have conducted an intervention study aiming to improve hospital care for children and newborns in Kenya. In judging whether an intervention achieves its aims, an understanding of how it is delivered is essential. Here, we describe how the implementation team delivered the intervention over 18 months and provide some insight into how health workers, the primary targets of the intervention, received it. METHODS: We used two approaches. First, a description of the intervention is based on an analysis of records of training, supervisory and feedback visits to hospitals, and brief logs of key topics discussed during telephone calls with local hospital facilitators. Record keeping was established at the start of the study for this purpose with analyses conducted at the end of the intervention period. Second, we planned a qualitative study nested within the intervention project and used in-depth interviews and small group discussions to explore health worker and facilitators' perceptions of implementation. After thematic analysis of all interview data, findings were presented, discussed, and revised with the help of hospital facilitators. RESULTS: Four hospitals received the full intervention including guidelines, training and two to three monthly support supervision and six monthly performance feedback visits. Supervisor visits, as well as providing an opportunity for interaction with administrators, health workers, and facilitators, were often used for impromptu, limited refresher training or orientation of new staff. The personal links that evolved with senior staff seemed to encourage local commitment to the aims of the intervention. Feedback seemed best provided as open meetings and discussions with administrators and staff. 
Supervision, although sometimes perceived as fault finding, helped local facilitators become the focal point of much activity including key roles in liaison, local monitoring and feedback, problem solving, and orientation of new staff to guidelines. In four control hospitals receiving a minimal intervention, local supervision and leadership to implement new guidelines, despite their official introduction, were largely absent. CONCLUSION: The actual content of an intervention and how it is implemented and received may be critical determinants of whether it achieves its aims. We have carefully described our intervention approach to facilitate appraisal of the quantitative results of the intervention's effect on quality of care. Our findings suggest ongoing training, external supportive supervision, open feedback, and local facilitation may be valuable additions to more typical in-service training approaches, and may be feasible.
Long-term epidemiological data reveal multi-annual fluctuations in the incidence of dengue fever and dengue haemorrhagic fever, as well as complex cyclical behaviour in the dynamics of the four serotypes of the dengue virus. It has previously been proposed that these patterns are due to the phenomenon of the so-called antibody-dependent enhancement (ADE) among dengue serotypes, whereby viral replication is increased during secondary infection with a heterologous serotype; however, recent studies have implied that this positive reinforcement cannot account for the temporal patterns of dengue and that some form of cross-immunity or external forcing is necessary. Here, we show that ADE alone can produce the observed periodicities and desynchronized oscillations of individual serotypes if its effects are decomposed into its two possible manifestations: enhancement of susceptibility to secondary infections and increased transmissibility from individuals suffering from secondary infections. This decomposition not only lowers the level of enhancement necessary for realistic disease patterns but also reduces the risk of stochastic extinction. Furthermore, our analyses reveal a time-lagged correlation between serotype dynamics and disease incidence rates, which could have important implications for understanding the irregular pattern of dengue epidemics.
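The decomposition described above can be illustrated with a symmetric two-serotype compartmental model in which one parameter (gamma) scales susceptibility to secondary infection and another (phi) scales transmissibility from secondarily infected individuals. The sketch below is a minimal toy version with illustrative parameter values, not the authors' fitted model.

```python
# Toy two-serotype model separating the two ADE mechanisms:
#   gamma: enhancement of susceptibility to secondary infection
#   phi:   enhanced transmissibility from secondary infections
# Simple forward-Euler integration; all parameters are illustrative.
def simulate(beta=0.4, sigma=0.2, gamma=1.0, phi=2.0, days=200, dt=0.1):
    N = 1.0
    S, I1, I2 = 0.98, 0.01, 0.01        # naive; primary infections
    R1, R2, J1, J2, R = 0, 0, 0, 0, 0   # one-serotype immune; secondary; full
    for _ in range(int(days / dt)):
        lam1 = beta * (I1 + phi * J1) / N  # force of infection, serotype 1
        lam2 = beta * (I2 + phi * J2) / N
        dS  = -(lam1 + lam2) * S
        dI1 = lam1 * S - sigma * I1
        dI2 = lam2 * S - sigma * I2
        dR1 = sigma * I1 - gamma * lam2 * R1  # R1 exposed to serotype 2
        dR2 = sigma * I2 - gamma * lam1 * R2
        dJ1 = gamma * lam1 * R2 - sigma * J1  # secondary infection with 1
        dJ2 = gamma * lam2 * R1 - sigma * J2
        dR  = sigma * (J1 + J2)
        S += dS*dt; I1 += dI1*dt; I2 += dI2*dt
        R1 += dR1*dt; R2 += dR2*dt; J1 += dJ1*dt; J2 += dJ2*dt; R += dR*dt
    return S, R

s_no_ade, r_no_ade = simulate(phi=1.0)  # no transmissibility enhancement
s_ade, r_ade = simulate(phi=2.0)        # enhanced secondary transmissibility
# Enhanced transmissibility from secondary infections deepens the epidemic,
# producing more completed secondary infections:
print(r_ade > r_no_ade)
```

In the paper's full analysis, it is the interplay of these two enhancement parameters that yields realistic multi-annual periodicities and desynchronized serotype oscillations while lowering the risk of stochastic extinction.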
The prevalence of CD36 deficiency in East Asian and African populations suggests that the causal variants are under selection by severe malaria. Previous analysis of data from the International HapMap Project indicated that a CD36 haplotype bearing a nonsense mutation (T1264G; rs3211938) had undergone recent positive selection in the Yoruba of Nigeria. To investigate the global distribution of this putative selection event, we genotyped T1264G in 3420 individuals from 66 populations. We confirmed the high frequency of 1264G in the Yoruba (26%). However, the 1264G allele is less common in other African populations and absent from all non-African populations without recent African admixture. Using long-range linkage disequilibrium, we studied two West African groups in depth. Evidence for recent positive selection at the locus was demonstrable in the Yoruba, although not in Gambians. We screened 70 variants from across CD36 for an association with severe malaria phenotypes, employing a case-control study of 1350 subjects and a family study of 1288 parent-offspring trios. No marker was significantly associated with severe malaria. We focused on T1264G, genotyping 10,922 samples from four African populations. The nonsense allele was not associated with severe malaria (pooled allelic odds ratio 1.0; 95% confidence interval 0.89-1.12; P = 0.98). These results suggest a range of possible explanations including the existence of alternative selection pressures on CD36, co-evolution between host and parasite or confounding caused by allelic heterogeneity of CD36 deficiency.
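The headline result above is a pooled allelic odds ratio (OR 1.0, 95% CI 0.89-1.12). For readers unfamiliar with the quantity, a minimal sketch of how an allelic OR and its Wald confidence interval are computed from a 2x2 table of allele counts follows; the counts used are hypothetical, purely to illustrate the calculation.

```python
from math import exp, log, sqrt

# Allelic odds ratio with a Wald 95% CI from a 2x2 table of allele counts.
# (This illustrates the reported statistic; it is not the study's pooled
# multi-population analysis.)
def allelic_or(case_alt, case_ref, ctrl_alt, ctrl_ref, z=1.96):
    or_ = (case_alt * ctrl_ref) / (case_ref * ctrl_alt)
    se = sqrt(1/case_alt + 1/case_ref + 1/ctrl_alt + 1/ctrl_ref)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 1264G vs. wild-type alleles in cases and controls.
or_, lo, hi = allelic_or(case_alt=200, case_ref=1800,
                         ctrl_alt=210, ctrl_ref=1890)
print(f"OR={or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An OR whose confidence interval straddles 1, as here and in the study, is consistent with no association between the allele and severe malaria.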
BACKGROUND: To date, it has been widely assumed that malaria is a common cause of morbidity and mortality in children with sickle cell disease (SCD) in malaria-endemic countries, and as a result, malarial prophylaxis is commonly recommended. Nevertheless, few data are available that support this practice. METHODS: We conducted a retrospective analysis of the data collected prospectively from children aged 0-13 years who were admitted to Kilifi District Hospital during the period from July 1998 through June 2005. We studied the prevalence, clinical features, and outcome of malarial infections in these children, stratified by SCD status. RESULTS: Although we estimated the prevalence of SCD in children to be only 0.8% (71 of 8531 children) during the period from August 2006 through September 2008 in the community surrounding the hospital, 555 (1.6%) of 34,529 children admitted to the hospital during the study period (i.e., from July 1998 through June 2005) were children with SCD; in fact, a total of 309 children with SCD were admitted 555 times. The prevalence of Plasmodium falciparum parasitemia was lower among children with SCD than it was among children without SCD (86 [15.6%] of 551 children vs. 13,835 [41.3%] of 33,500 children; P < .001). Similarly, among those infected with P. falciparum parasites, the mean parasite density was significantly lower among children with SCD than it was among children without SCD (2205 vs. 23,878 parasites/microL; P < .001). Fourteen (16.3%) of 86 parasitemic patients with SCD had features consistent with severe malaria, compared with 3424 (24.7%) of 13,835 parasitemic patients without SCD (odds ratio, 0.59; P < .07). We found no association between malarial parasitemia and death. CONCLUSIONS: We found no evidence to support the conclusion that the risk of malaria is higher among children with SCD than it is among children without SCD in a rural area on the coast of Kenya. 
Further studies should be undertaken to help policy makers develop appropriate guidelines regarding malarial prophylaxis for patients with SCD in malaria-endemic regions.
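The headline comparison in the abstract above is an unadjusted odds ratio from a 2x2 table. As an illustrative sketch (not the authors' analysis code), the reported odds ratio of 0.59 for severe malaria among parasitemic children with versus without SCD can be reproduced from the counts given:

```python
# Illustrative sketch, not the authors' code: unadjusted odds ratio for
# severe malaria among parasitemic children, using counts from the abstract.

def odds_ratio(events_exposed, total_exposed, events_unexposed, total_unexposed):
    """OR = odds of the event in the exposed group / odds in the unexposed group."""
    odds_exposed = events_exposed / (total_exposed - events_exposed)
    odds_unexposed = events_unexposed / (total_unexposed - events_unexposed)
    return odds_exposed / odds_unexposed

# SCD group: 14 of 86 parasitemic children had severe malaria;
# non-SCD group: 3,424 of 13,835 parasitemic children had severe malaria.
or_scd = odds_ratio(14, 86, 3424, 13835)
print(round(or_scd, 2))  # 0.59, matching the reported odds ratio
```

The abstract's pooled P value reflects the authors' own test; the sketch only reproduces the point estimate.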
BACKGROUND: In sub-Saharan Africa, knowledge of malaria transmission across rapidly proliferating urban centres and recommendations for its prevention or management remain poorly defined. This paper presents the results of an investigation into infection prevalence and treatment of recent febrile events among a slum population in Nairobi, Kenya. METHODS: In July 2008, a community-based malaria parasite prevalence survey was conducted in Korogocho slum, which forms part of the Nairobi Urban Health and Demographic Surveillance System. Interviewers visited 1,069 participants at home and collected data on reported fevers experienced over the preceding 14 days and details on the treatment of these episodes. Each participant was tested for malaria parasites by rapid diagnostic test (RDT) and microscopy. Descriptive analyses were performed to assess the period prevalence of reported fever episodes and treatment behaviour. RESULTS: Of the 1,069 participants visited, 983 (92%) consented to be tested. Three were positive for Plasmodium falciparum using RDT; however, all were confirmed negative on microscopy. Microscopic examination of all 953 readable slides showed zero prevalence. Overall, among the 1,004 participants with data on fever, 170 fever episodes were reported, giving a relatively high period prevalence (16.9%, 95% CI: 13.9%-20.5%) that was higher still among children below five years of age (20.1%, 95% CI: 13.8%-27.8%). Of the fever episodes with treatment information, 54.3% (95% CI: 46.3%-62.2%) were treated as malaria using mainly sulphadoxine-pyrimethamine or amodiaquine, including those managed at a formal health facility. Only four episodes were managed using the nationally recommended first-line treatment, artemether-lumefantrine. CONCLUSION: The study could not demonstrate any evidence of malaria in Korogocho, a slum in the centre of Nairobi. Fever was a common complaint and was often treated as malaria with anti-malarial drugs.
Strategies to reduce the inappropriate exposure of poor communities to expensive anti-malarial drugs provided by clinical services and drug vendors, including testing for malaria parasites, should be a priority for district planners.
BACKGROUND: Reliable and updated maps of helminth (worm) infection distributions are essential to target control strategies to those populations in greatest need. Although many surveys have been conducted in endemic countries, the data are rarely available in a form that is accessible to policy makers and the managers of public health programmes. This is especially true in sub-Saharan Africa, where empirical data are seldom in the public domain. In an attempt to address the paucity of geographical information on helminth risk, this article describes the development of an updated global atlas of human helminth infection, showing the example of East Africa. METHODS: Empirical, cross-sectional estimates of infection prevalence conducted since 1980 were identified using electronic and manual search strategies of published and unpublished sources. A number of inclusion criteria were imposed for identified information, which was extracted into a standardized database. Details of survey population, diagnostic methods, sample size and numbers infected with schistosomes and soil-transmitted helminths were recorded. A unique identifier linked each record to an electronic copy of the source document, in portable document format. An attempt was made to identify the geographical location of each record using standardized geolocation procedures and the assembled data were incorporated into a geographical information system. RESULTS: At the time of writing, 2,748 prevalence surveys had been identified through multiple search strategies. Of these, 2,612 could be geolocated and mapped. More than half (58%) of included surveys were from grey literature or unpublished sources, underlining the importance of reviewing in-country sources. Two-thirds (66%) of all surveys were conducted since 2000. Comprehensive, countrywide data are available for Burundi, Rwanda and Uganda.
In contrast, information for Kenya and Tanzania is typically clustered in specific regions of the country, with few records from areas with very low population density and/or environmental conditions which are unfavourable for helminth transmission. Information is presented on the prevalence and geographical distribution for the major helminth species. CONCLUSION: For all five countries, the information assembled in the current atlas provides the most reliable, up-to-date and comprehensive source of data on the distribution of common helminth infections to guide the rational implementation of control efforts.
A specific retinopathy has been described in African children with cerebral malaria, but in adults this has not been extensively studied. Since the structure and function of the retinal vasculature greatly resemble those of the cerebral vasculature, study of retinal changes can reveal insights into the pathophysiology of cerebral malaria. A detailed observational study of malarial retinopathy in Bangladeshi adults was performed using high-definition portable retinal photography. Retinopathy was present in 17/27 adults (63%) with severe malaria and 14/20 adults (70%) with cerebral malaria. Moderate or severe retinopathy was more frequent in cerebral malaria (11/20, 55%) than in uncomplicated malaria (3/15, 20%; P=0.039), bacterial sepsis (0/5, 0%; P=0.038) or healthy controls (0/18, 0%; P<0.001). The spectrum of malarial retinopathy was similar to that previously described in African children, but no vessel discolouration was observed. The severity of retinal whitening correlated with admission venous plasma lactate (P=0.046), suggesting that retinal ischaemia reflects systemic ischaemia. In conclusion, retinal changes related to microvascular obstruction were common in adults with severe falciparum malaria and correlated with disease severity and coma, suggesting that a compromised microcirculation has important pathophysiological significance in severe and cerebral malaria. Portable retinal photography has potential as a valuable tool to study malarial retinopathy.
The pathophysiology of coma in cerebral malaria (CM) is not well understood. Obstruction of microcirculatory flow is thought to play a central role, but other hypotheses include roles for parasite- and host-derived factors such as immune mediators, and for increased blood-brain barrier permeability leading to raised intracranial pressure. The retinal vasculature is a direct extension of the cerebral vasculature. It is the only vascular bed easily accessible for visualisation and provides a unique opportunity to observe vascular pathology and its effect on neurological tissue. A specific retinopathy has been well described in African children with CM and its severity correlates with outcome. This retinopathy has been less well described in adults. The central mechanism causing malarial retinopathy appears to be microvascular obstruction, which has been demonstrated in affected retinas by fluorescein angiography. The presence in a central nervous system tissue of microvascular obstruction strongly supports the hypothesis that the sequestration of erythrocytes in small blood vessels and consequent obstruction of microcirculatory flow is an important mechanism causing coma and death in CM. Despite advances in the antimalarial treatment of severe malaria, its mortality remains approximately 15-20%. Adjunctive treatment targeting sequestration is a promising strategy to further lower mortality.
The gravity of the threat posed by vivax malaria to public health has been poorly appreciated. The widely held misperception of Plasmodium vivax as being relatively infrequent, benign, and easily treated explains its nearly complete neglect across the range of biological and clinical research. Recent evidence suggests a far higher and more-severe disease burden imposed by increasingly drug-resistant parasites. The two frontline therapies against vivax malaria, chloroquine and primaquine, may be failing. Despite 60 years of nearly continuous use of these drugs, their respective mechanisms of activity, resistance, and toxicity remain unknown. Although standardized means of assessing therapeutic efficacy against blood and liver stages have not been developed, this review examines the provisional in vivo, ex vivo, and animal model systems for doing so. The rationale, design, and interpretation of clinical trials of therapies for vivax malaria are discussed in the context of the nuance and ambiguity imposed by the hypnozoite. Fielding new drug therapies against real-world vivax malaria may require a reworking of the strategic framework of drug development, namely, the conception, testing, and evaluation of sets of drugs designed for the cure of both blood and liver asexual stages as well as the sexual blood stages within a single therapeutic regimen.
Journal of Infection, 59 (1), pp. 73-73, 2009. UK malaria treatment guidelines (vol 54, pg 111, 2007).
Maternal and child health are high priorities for international development. Through a review of published work, we show substantial gaps in current knowledge on incidence (cases per live births), aetiology, and risk factors for both maternal and early onset neonatal bacterial sepsis in sub-Saharan Africa. Although existing published data suggest that sepsis causes about 10% of all maternal deaths and 26% of neonatal deaths, these are likely to be considerable underestimates because of methodological limitations. Successful intervention strategies in resource-rich settings and early studies in sub-Saharan Africa suggest that the burden of maternal and early onset neonatal bacterial sepsis could be reduced through simple interventions, including antiseptic and antibiotic treatment. An effective way to expedite evidence to guide interventions and determine the incidence, aetiology, and risk factors for sepsis in sub-Saharan Africa would be through a multiarmed factorial intervention trial aimed at reducing both maternal and early onset neonatal bacterial sepsis in sub-Saharan Africa.
Wild bird surveillance for highly pathogenic avian influenza (HPAI) H5N1 virus from 2004 to 2007 in Thailand indicated that the prevalence of infection with avian influenza H5N1 virus in wild birds was low (1.0%, 95% confidence interval [CI]: 0.7-1.2, 60/6,263 pooled samples). However, the annual prevalence varied considerably over this period, with a peak of 2.7% (95% CI: 1.4, 4.1) in 2004. Prevalence dropped to 0.5% (95% CI: 0.3, 0.8) and 0.6% (95% CI: 0.3, 1.0) in 2005 and 2006, respectively, and then increased to 1.8% (95% CI: 1.0, 2.6) in 2007. During this period, 16 species from 12 families of wild birds tested positive for H5N1 virus infection. All samples from juvenile birds were negative for H5N1 virus, whereas 0.6% (95% CI: 0.4, 0.9) of pooled samples from adult birds were positive. Most positive samples originated from peridomestic resident species. Infected wild bird samples were only found in provinces where poultry outbreaks had occurred. Detection of H5N1 virus infection in wild birds was reported up to 3 yr after eradication of the poultry outbreaks in those provinces. As observed with outbreaks in poultry, the frequencies of H5N1 outbreaks in wild birds were significantly higher in winter. Further understanding of the mechanisms of persistence and ongoing HPAI H5N1 transmission between wild birds and domestic poultry is needed.
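The prevalence figures quoted above are consistent with a standard normal-approximation (Wald) interval for a binomial proportion. A minimal sketch, assuming the pooled samples are treated as independent observations (an assumption of this illustration, not a statement of the authors' exact method):

```python
import math

def wald_ci(positives, n, z=1.96):
    """95% Wald (normal-approximation) confidence interval for a binomial proportion."""
    p = positives / n
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

# Overall H5N1 prevalence: 60 positive of 6,263 pooled samples.
lo, hi = wald_ci(60, 6263)
print(f"{100 * lo:.1f}-{100 * hi:.1f}")  # 0.7-1.2, matching the reported 95% CI
```

For the rarer strata (e.g. 0.5% in 2005) an exact or score interval would be preferable, but the Wald form reproduces the headline figures.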
Neth J Med, 67 (7), pp. 291-294, 2009. Turning green with shock.
Southeast Asian J Trop Med Public Health, 40 (4), pp. 781-784, 2009. Lack of correlation of Burkholderia pseudomallei quantities in blood, urine, sputum and pus.
We evaluated the correlation of Burkholderia pseudomallei quantities in blood versus urine, sputum or pus. Correlations between bacterial counts in blood and other samples were not found. It is likely that an initial seeding event to extracellular organs is followed by independent growth of B. pseudomallei, and that bacteria in the urine were not passively filtered from the bloodstream.
A deterministic state-transition model for mastitis transmission was developed to explore population level effects of antibiotic treatment regimens targeting chronic subclinical mastitis caused by major gram-positive pathogens in lactating dairy cows. Behavior and sensitivity of model outputs to changes in key parameters were explored. Outcomes included the size of the state variables describing proportions of infected quarters and basic and effective reproductive numbers. Treatment effects were estimated by calculating proportional reductions in state variables at equilibrium for populations implementing a treatment program relative to populations with no intervention. In general the relationships between parameters were complex and non-linear, although the model outputs were especially sensitive to changes in the value of the transmission rate parameter. Interaction between the parameters resulted in large variations in treatment effect estimates. Effect estimates calculated from model outputs showed a quadratic curve with a clear optimum at low, but not the lowest, transmission rates. These results indicated that overall positive population level effects of lactation therapy would be realized for herds that have successfully implemented practices that reduce the transmission rate of pathogens. A key finding is that in herds with high transmission rates, treatment of chronically infected quarters was predicted to have little impact on the proportion of infected quarters and no positive population level effect in reducing the force of infection and new infection rates. Results of this study suggest that field trials to evaluate efficacy of antimicrobial treatment should include estimates of indirect treatment effects.
The published genomic sequences of the two major host-transforming Theileria species of cattle represent a rich resource of information that has allowed novel bioinformatic and experimental studies into these important apicomplexan parasites. Since their publication in 2005, the genomes of T. annulata and T. parva have been utilised for a diverse range of applications, ranging from candidate antigen discovery to the identification of genetic markers for population analysis. This has led to advancements in the quest for a sub-unit vaccine, while providing a greater understanding of variation among parasite populations in the field. The unique ability of these Theileria species to induce host cell transformation is the subject of considerable scientific interest and the availability of full genomic sequences has provided new insights into this area of research. This article reviews the data underlying published comparative analyses, focussing on the general features of gene expression, the major Tpr/Tar multi-copy gene family and a re-examination of the predicted macroschizont secretome. Codon usage between the Theileria species is reviewed in detail, as this underpins ongoing comparative studies investigating selection at the intra- and inter-species level. The TashAT/TpshAT family of genes, conserved between T. annulata and T. parva, encodes products targeted to the host nucleus and has been implicated in contributing to the transformed bovine phenotype. Species-specific expansion and diversification at this critical locus is discussed with reference to the availability, in the near future, of genomic datasets which are based on non-transforming Theileria species.
Neurotoxicology and Teratology, 31 (4), pp. 246-246, 2009. Treating severe malaria with pre-referral artesunate saves lives and prevents CNS damage.
BACKGROUND: The evasion of host immune response by the human malaria parasite Plasmodium falciparum has been linked to expression of a range of variable antigens on the infected erythrocyte surface. Several genes are potentially involved in this process, with the var, rif and stevor multigene families being the most likely candidates and coding for rapidly evolving proteins. The high sequence diversity of proteins encoded by these gene families may have evolved as an immune evasion strategy that enables the parasite to establish long-lasting chronic infections. Previous findings have shown that the hypervariable region (HVR) of STEVOR has significant sequence diversity both within as well as across different P. falciparum lines. However, these studies did not address whether or not there are ancestral stevor that can be found in different parasites. METHODS: DNA and RNA sequence analysis as well as phylogenetic approaches were used to analyse the stevor sequence repertoire and diversity in laboratory lines and in fresh isolates from Kilifi, Kenya. RESULTS: Conserved stevor genes were identified in different P. falciparum isolates from different global locations. Consistent with previous studies, the HVR of the stevor gene family was found to be highly divergent both within and between isolates. Importantly, phylogenetic analysis shows some clustering of stevor sequences both within a single parasite clone as well as across different parasite isolates. CONCLUSION: This indicates that the ancestral P. falciparum parasite genome already contained multiple stevor genes that have subsequently diversified further within the different P. falciparum populations. It also confirms that STEVOR is under strong selection pressure.
BACKGROUND TO THE DEBATE: In a 2007 article in PLoS Medicine, Holger J. Schünemann and colleagues described a new process used by the World Health Organization for rapidly developing clinical management guidelines in emergency situations. These situations include outbreaks of emerging infectious diseases. The authors discussed how they developed such a "rapid advice" guideline for the pharmacological management of avian influenza A (H5N1) virus infection. The guideline recommends giving the antiviral drug oseltamivir at a dose of 75 mg twice daily for five days. In this Debate, Nicholas White argues that such dosing is inadequate, Robert Webster and Elena Govorkova say that combination antiviral therapy should be used, and Tim Uyeki reminds us that clinical care of patients with H5N1 entails much more than antiviral treatment. These issues may also apply to therapy of patients hospitalized with severe disease due to novel swine-origin influenza A (H1N1) virus infection.
Lancet, 373 (9681), pp. 2085-2086, 2009. Patient-oriented pandemic influenza research.
BACKGROUND: In areas where malaria is endemic, infants aged <3 months appear to be relatively protected from symptomatic and severe Plasmodium falciparum malaria, but less is known about the effect of Plasmodium vivax infection in this age group. METHODS: To define malaria morbidity in the first year of life in an area where both multidrug-resistant P. falciparum and P. vivax are highly prevalent, data were gathered on all infants attending a referral hospital in Papua, Indonesia, using systematic data forms and hospital computerized records. Additional clinical and laboratory data were prospectively collected from inpatients aged <3 months. RESULTS: From April 2004 through April 2008, 4976 infants were admitted to the hospital, of whom 1560 (31%) had malaria, with infection equally attributable to P. falciparum and P. vivax. The case-fatality rate was similar for inpatients with P. falciparum malaria (13 [2.2%] of 599 inpatients died) and P. vivax malaria (6 [1.0%] of 603 died; P= .161), whereas severe malarial anemia was more prevalent among those with P. vivax malaria (193 [32%] of 605 vs. 144 [24%] of 601; P= .025). Of the 187 infants aged <3 months, 102 (56%) had P. vivax malaria, and 55 (30%) had P. falciparum malaria. In these young infants, infection with P. vivax was associated with a greater risk of severe anemia (odds ratio, 2.4; 95% confidence interval, 1.03-5.91; P= .041) and severe thrombocytopenia (odds ratio, 3.3; 95% confidence interval, 1.07-10.6; P= .036) compared with P. falciparum infection. CONCLUSIONS: P. vivax malaria is a major cause of morbidity in early infancy. Preventive strategies, early diagnosis, and prompt treatment should be initiated in the perinatal period.
BACKGROUND: The first cases of avian influenza A (H5N1) in humans in Vietnam were detected in early 2004, and Vietnam has reported the second highest number of cases globally. METHODS: We obtained retrospective clinical data through review of medical records for laboratory confirmed cases of influenza A (H5N1) infection diagnosed in Vietnam from January 2004 through December 2006. Standard data were abstracted regarding clinical and laboratory features, treatment, and outcome. RESULTS: Data were obtained for 67 (72%) of 93 cases diagnosed in Vietnam over the study period. Patients presented to the hospital after a median duration of illness of 6 days with fever (75%), cough (89%), and dyspnea (81%). Diarrhea and mucosal bleeding at presentation were more common in fatal than in nonfatal cases. Common findings were bilateral pulmonary infiltrates on chest radiograph (72%), lymphopenia (73%), and increased serum transaminase levels (aspartate aminotransferase, 69%; alanine aminotransferase, 61%). Twenty-six patients died (case fatality rate, 39%; 95% confidence interval, 27%-51%) and the most reliable predictor of a fatal outcome was the presence of both neutropenia and raised alanine aminotransferase level at admission, which correctly predicted 91% of deaths and 82% of survivors. The risk of death was higher among persons aged < or =16 years, compared with older persons (P < .001), and the risk of death was higher among patients who did not receive oseltamivir treatment (P = .048). The benefit of oseltamivir treatment remained after controlling for potential confounding by 1 measure of severity (odds ratio, 0.15; 95% confidence interval, 0.026-0.893; P = .034). CONCLUSION: In cases of infection with influenza A (H5N1), the presence of both neutropenia and raised serum transaminase levels predicts a poor outcome. Oseltamivir treatment shows benefit, but treatment with corticosteroids is associated with an increased risk of death.
BACKGROUND: Persistent nasal carriers have an increased risk of Staphylococcus aureus infection, whereas intermittent carriers and noncarriers share the same low risk. This study was performed to provide additional insight into staphylococcal carriage types. METHODS: Fifty-one volunteers who had been decolonized with mupirocin treatment and whose carriage state was known were colonized artificially with a mixture of S. aureus strains, and intranasal survival of S. aureus was compared between carriage groups. Antistaphylococcal antibody levels were also compared among 83 carriage-classified volunteers. RESULTS: Persistent carriers preferentially reselected their autologous strain from the inoculum mixture (P=.02). They could be distinguished from intermittent carriers and noncarriers on the basis of the duration of postinoculation carriage (154 vs. 14 and 4 days, respectively; P=.017, by log-rank test). Cultures of swab samples from persistent carriers contained significantly more colony-forming units per sample than did cultures of swab samples from intermittent carriers and noncarriers (P=.004). Analysis of serum samples showed that levels of immunoglobulin G and immunoglobulin A to 17 S. aureus antigens were equal in intermittent carriers and noncarriers but not in persistent carriers. CONCLUSIONS: Along with the previously described low risk of infection, intermittent carriers and noncarriers share similar S. aureus nasal elimination kinetics and antistaphylococcal antibody profiles. This implies a paradigm shift; apparently, there are only 2 types of nasal carriers: persistent carriers and others. This knowledge may increase our understanding of susceptibility to S. aureus infection.
BACKGROUND: The Plasmodium purine salvage enzyme, hypoxanthine guanine xanthine phosphoribosyl transferase (HGXPRT), can protect mice against Plasmodium yoelii parasitized red blood cell (pRBC) challenge in a T cell-dependent manner and has, therefore, been proposed as a novel vaccine candidate. It is not known whether natural exposure to Plasmodium falciparum stimulates HGXPRT T cell reactivity in humans. METHODS: PBMC and plasma collected from malaria-exposed Indonesians during infection and 7-28 days after anti-malarial therapy, were assessed for HGXPRT recognition using CFSE proliferation, IFN-gamma ELISPOT assay and ELISA. RESULTS: HGXPRT-specific T cell proliferation was found in 44% of patients during acute infection; in 80% of responders both CD4+ and CD8+ T cell subsets proliferated. Antigen-specific T cell proliferation was largely lost within 28 days of parasite clearance. HGXPRT-specific IFN-gamma production was more frequent 28 days after treatment than during acute infection. HGXPRT-specific plasma IgG was undetectable even in individuals exposed to malaria for at least two years. CONCLUSION: The prevalence of acute proliferative and convalescent IFN-gamma responses to HGXPRT demonstrates cellular immunogenicity in humans. Further studies to determine minimal HGXPRT epitopes, the specificity of responses for Plasmodia and associations with protection are required. Frequent and robust T cell proliferation, high sequence conservation among Plasmodium species and absent IgG responses distinguish HGXPRT from other malaria antigens.
Vaccination against Plasmodium falciparum malaria could reduce the worldwide burden of this disease, and decrease its high mortality in children. Replication-defective recombinant adenovirus vectors carrying P. falciparum epitopes may be useful as part of a vaccine that raises cellular immunity to the pre-erythrocytic stage of malaria infection. However, existing immunity to the adenovirus vector results in antibody-mediated neutralization of the vaccine vector, and reduced vaccine immunogenicity. Our aim was to examine a population of children who are at risk from P. falciparum malaria for neutralizing immunity to replication-deficient recombinant chimpanzee adenovirus 63 vector (AdC63), compared to human adenovirus 5 vector (AdHu5). We measured 50% and 90% vector neutralization titers in 200 individual sera, taken from a cohort of children from Kenya, using a secreted alkaline phosphatase neutralization assay. We found that 23% of the children (aged 1-6 years) had high-titer neutralizing antibodies to AdHu5, and 4% had high-titer neutralizing antibodies to AdC63. Immunity to both vectors was age-dependent. Low-level neutralization of AdC63 was significantly less frequent than AdHu5 neutralization at the 90% neutralization level. We conclude that AdC63 may be a useful vector as part of a prime-boost malaria vaccine in children.
Trans R Soc Trop Med Hyg, 103 (6), pp. 643-644, 2009. The role of mathematical modelling in malaria elimination and eradication (Comment on: Can malaria be eliminated?).
Am J Trop Med Hyg, 80 (6), pp. 902-904, 2009. Antimalarial drug susceptibility of Plasmodium vivax in the Republic of Korea.
The antimalarial susceptibility of ring stage (> 80%) Plasmodium vivax from the Republic of Korea, where long incubation-period strains are prevalent, was evaluated using the schizont maturation inhibition technique. During 2005-2007, susceptibility to seven antimalarial drugs was evaluated with 24 fresh isolates. The geometric mean (95% confidence interval) 50% inhibitory concentrations (IC50) were: quinine 60 (54-75) ng/mL, chloroquine 39 (22-282) ng/mL, piperaquine 27 (17-58) ng/mL, mefloquine 39 (35-67) ng/mL, pyrimethamine 138 (89-280) ng/mL, artesunate 0.6 (0.5-0.8) ng/mL, and primaquine 122 (98-232) ng/mL. Positive correlations were found between quinine and mefloquine (r = 0.6, P = 0.004), piperaquine and chloroquine (r = 0.6, P = 0.008), and piperaquine and primaquine IC50 values (r = 0.5, P = 0.01). Compared with P. vivax in Thailand, P. vivax in the Republic of Korea was more sensitive to quinine and mefloquine, but equally sensitive to chloroquine and artesunate.
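Geometric mean IC50s of the kind reported above are computed as the exponential of the mean log IC50 across isolates, which down-weights the influence of occasional high-IC50 outliers. A minimal sketch with hypothetical per-isolate values (illustrative only, not the study's raw data):

```python
import math

def geometric_mean(values):
    """Geometric mean: exp of the arithmetic mean of the natural logs."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical per-isolate chloroquine IC50s in ng/mL (illustrative only)
ic50s = [20.0, 30.0, 45.0, 60.0, 80.0]
gm_ic50 = geometric_mean(ic50s)  # ~41.9 ng/mL for these example values
```

The arithmetic mean of the same values would be 47 ng/mL; the gap widens as the spread of IC50s grows, which is why geometric means are conventional for drug-susceptibility data.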
Clinical Toxicology, 47 (5), pp. 461-461, 2009. Venomous Exotic Snakes: Global Overview. Warrell DA.
OBJECTIVES: We describe treatment failure rates by antibiotic duration for prosthetic joint infection (PJI) managed with debridement, antibiotics and implant retention (DAIR). METHODS: We retrospectively collected data from all the cases of PJI that were managed with DAIR over a 5 year period. Surgical debridement, microbiological sampling, early intravenous antibiotics and prolonged oral follow-on antibiotics were used. RESULTS: One hundred and twelve cases of PJI were identified. Twenty infections (18%) recurred during a mean follow-up of 2.3 years. The mean duration of antibiotic use was 1.5 years. Failure was more common after arthroscopic debridement, for previously revised joints and for Staphylococcus aureus infection. There were 12 failures after stopping antibiotics and 8 while on antibiotics [hazard ratio (HR) = 4.3, 95% confidence interval (CI) 1.4-12.8, P = 0.01]. However, during the first 3 months of follow-up, there were eight failures after stopping antibiotics and two while on antibiotics (HR = 7.0, 95% CI 1.5-33, P = 0.015). The duration of antibiotic therapy prior to stopping did not predict outcome. CONCLUSIONS: PJI may be managed by DAIR. The risk of failure with this strategy rises after stopping oral antibiotics, but lengthening antibiotic therapy may simply postpone, rather than prevent, failure.
Am J Trop Med Hyg, 80 (6), pp. 905-913, 2009. Impact of ministry of health interventions on private medicine retailer knowledge and practices on anti-malarial treatment in Kenya.
Small-scale interventions that train medicine retailers in malaria treatment improve over-the-counter medicine use, but there is little evidence of their effectiveness when scaled up. This study evaluated the impact of Ministry of Health (MoH) training programs on the knowledge and practices of medicine retailers in three districts in Kenya. A cluster randomized trial was planned across 10 administrative divisions. Findings indicated that 30.7% (95% confidence interval [CI]: 23.3, 39.0) and 5.2% (95% CI: 2.1, 10.3) of program and control retailers, respectively, sold MoH amodiaquine with correct advice on use to surrogate clients (OR = 8.8; 95% CI: 2.9, 26.9; P < 0.001). Similarly, 61.8% (95% CI: 54.2, 69.1) and 6.3% (95% CI: 2.7, 12.1) of program and control retailers, respectively, reported correct knowledge on dosing with amodiaquine (OR = 29.8; 95% CI: 8.2, 108.8). Large-scale retailer training programs within the national malaria control framework led to significant improvements in retailers' practices across three districts.
Biological samples collected in refugee camps during an outbreak of hepatitis E were used to compare the accuracy of hepatitis E virus RNA amplification by real-time reverse transcription-PCR (RT-PCR) for sera and dried blood spots (concordance of 90.6%). Biological profiles (RT-PCR and serology) of asymptomatic individuals were also analyzed.
Am J Trop Med Hyg, 80 (6), pp. 919-926, 2009. Effect of malaria rapid diagnostic tests on the management of uncomplicated malaria with artemether-lumefantrine in Kenya: a cluster randomized trial.
Shortly after Kenya introduced artemether-lumefantrine (AL) for first-line treatment of uncomplicated malaria, we conducted a pre-post cluster randomized controlled trial to assess the effect of providing malaria rapid diagnostic tests (RDTs) on recommended treatment (patients with malaria prescribed AL) and overtreatment (patients without malaria prescribed AL) in outpatients ≥5 years old. Sixty health facilities were randomized to receive either RDTs plus training, guidelines, and supervision (TGS) or TGS alone. Of 1,540 patients included in the analysis, 7% had uncomplicated malaria. The provision of RDTs coupled with TGS emphasizing AL use only after laboratory confirmation of malaria reduced recommended treatment by 63%-points (P = 0.04), because diagnostic test use did not change (-2%-points), but health workers significantly reduced presumptive treatment with AL for patients with a clinical diagnosis of malaria who did not undergo testing (-36%-points; P = 0.03). Health workers generally adhered to RDT results when prescribing AL: 88% of RDT-positive and 9% of RDT-negative patients were treated with AL. Overtreatment was low in both arms and was not significantly reduced by the provision of RDTs (-12%-points, P = 0.30). RDTs could potentially improve malaria case management, but we urgently need to develop more effective strategies for implementing guidelines before large scale implementation.
Am J Trop Med Hyg, 80 (6), pp. 881. 2009. Severe retinal whitening in an adult with cerebral malaria.
BACKGROUND: A severe and challenging complication in the treatment of hemophilia A is the development of inhibiting antibodies (inhibitors) directed towards factor VIII (FVIII). Inhibitors aggravate bleeding complications, disabilities and costs. The etiology of inhibitor development is incompletely understood. OBJECTIVES: In a large cohort study in patients with mild/moderate hemophilia A we evaluated the role of genotype and intensive FVIII exposure in inhibitor development. PATIENTS/METHODS: Longitudinal clinical data from 138 mild/moderate hemophilia A patients were retrospectively collected from 1 January 1980 to 1 January 2008 and analyzed by multivariate analysis using Poisson regression. RESULTS: Genotyping demonstrated the Arg593Cys missense mutation in 52 (38%) patients; the remaining 86 patients had 26 other missense mutations. Sixty-three (46%) patients received intensive FVIII concentrate administration, 41 of them for surgery. Ten patients (7%) developed inhibitors, eight of them carrying the Arg593Cys mutation. Compared with the other patients, those with the Arg593Cys mutation had a 10-fold increased risk of developing inhibitors (RR 10; 95% CI, 0.9-119). The other two inhibitor patients had the newly detected mutations Pro1761Gln and Glu2228Asp. In both these patients and in five patients with genotype Arg593Cys, inhibitors developed after intensive peri-operative use of FVIII concentrate (RR 186; 95% CI, 25-1403). In five of the 10 inhibitor patients FVIII was administered by continuous infusion during surgery (RR 13; 95% CI, 1.9-86). CONCLUSION: The Arg593Cys genotype and intensive peri-operative use of FVIII, especially when administered by continuous infusion, are associated with an increased risk for inhibitor development in mild/moderate hemophilia A.
OBJECTIVES: We sought to identify risk factors for recurrence of Staphylococcus aureus bacteraemia (SAB) by auditing compliance with guidelines on its treatment in our hospital. METHODS: We retrospectively identified patients over the preceding 8 years whose SAB had recurred, matching each to a control patient with non-recurrent SAB. RESULTS: 40/1870 patients with SAB had suffered recurrent disease (2.1%), 33 of whom were available for study. Where 2, 4 and 6 weeks of intravenous therapy were recommended, 78%, 29% and 25% of patients received it, and there was no association with recurrence. Glycopeptide use in patients with methicillin-sensitive S. aureus (MSSA) bacteraemia was significantly associated with recurrence (p=0.015). Where the source of the bacteraemia was a peripheral venous catheter, the odds of recurrence were lower than where an SAB originated at another site (p=0.047). All patients with SAB in whom a central venous catheter was not removed suffered recurrence. CONCLUSIONS: We found the recurrence rate after SAB was low despite poor compliance with guidelines on treatment duration. Glycopeptide therapy for MSSA bacteraemia was more likely to result in recurrent SAB than beta-lactam therapy. Recurrence was significantly less likely in patients where the source of the SAB was a peripheral line than in those with another source.
BACKGROUND: HIV serosurveys have become important sources of HIV prevalence estimates, but these estimates may be biased because of refusals and other forms of non-response. We investigate the effect of the post-test counseling study protocol on bias due to the refusal to be tested. METHODS: Data come from a nine-month prospective study of hospital admissions in Addis Ababa during which patients were approached for an HIV test. Patients had the choice between three consent levels: testing and post-test counseling (including the return of HIV test results), testing without post-test counseling, and total refusal. For all patients, information was collected on basic sociodemographic background characteristics as well as admission diagnosis. The three consent levels are used to mimic refusal bias in serosurveys with different post-test counseling study protocols. We first investigate the covariates of consent for testing. Second, we quantify refusal bias in HIV prevalence estimates using Heckman regression models that account for sample selection. RESULTS: Refusal to be tested positively correlates with admission diagnosis (and thus HIV status), but the magnitude of refusal bias in HIV prevalence surveys depends on the study protocol. Bias is larger when post-test counseling and the return of HIV test results is a prerequisite of study participation (compared to a protocol where test results are not returned to study participants, or, where there is an explicit provision for respondents to forego post-test counseling). We also find that consent for testing increased following the introduction of antiretroviral therapy in Ethiopia. Other covariates of refusal are age (non-linear effect), gender (higher refusal rates in men), marital status (lowest refusal rates in singles), educational status (refusal rate increases with educational attainment), and counselor. 
CONCLUSION: The protocol for post-test counseling and the return of HIV test results to study participants is an important consideration in HIV prevalence surveys that wish to minimize refusal bias. The availability of ART is likely to reduce refusal rates.
Foot-and-mouth disease (FMD) causes sporadic disease outbreaks in the Lao People's Democratic Republic (Lao PDR) and appears to be endemic within a livestock population largely susceptible to infection. As Lao PDR is a major thoroughfare for transboundary animal movement, regular FMD outbreaks occur causing economic hardship for farmers and their families. The dominant serotype causing outbreaks between 1998 and 2006 was type O. Using phylogenetic analysis, type O isolated viruses were divided into two topotypes: South East Asia (SEA) and the Middle East-South Asia (ME-SA). Type A virus was reported only in 2003 and 2006 and type Asia 1 only in 1996 and 1998.
We report a genome-wide association (GWA) study of severe malaria in The Gambia. The initial GWA scan included 2,500 children genotyped on the Affymetrix 500K GeneChip, and a replication study included 3,400 children. We used this to examine the performance of GWA methods in Africa. We found considerable population stratification, and also that signals of association at known malaria resistance loci were greatly attenuated owing to weak linkage disequilibrium (LD). To investigate possible solutions to the problem of low LD, we focused on the HbS locus, sequencing this region of the genome in 62 Gambian individuals and then using these data to conduct multipoint imputation in the GWA samples. This increased the signal of association, from P = 4 × 10(-7) to P = 4 × 10(-14), with the peak of the signal located precisely at the HbS causal variant. Our findings provide proof of principle that fine-resolution multipoint imputation, based on population-specific sequencing data, can substantially boost authentic GWA signals and enable fine mapping of causal variants in African populations.
BACKGROUND: We wanted to account for worker motivation as a key factor that might affect the success of an intervention to improve implementation of health worker practices in eight district hospitals in Kenya. In the absence of available tools, we therefore aimed to develop a tool that could enable a rapid measurement of motivation at baseline and at subsequent points during the 18-month intervention study. METHODS: After a literature review, a self-administered questionnaire was developed to assess the outcomes and determinants of motivation of Kenyan government hospital staff. The initial questionnaire included 23 questions (from seven underlying constructs) related to motivational outcomes that were then used to construct a simpler tool to measure motivation. Parallel qualitative work was undertaken to assess the relevance of the questions chosen and the face validity of the tool. RESULTS: Six hundred eighty-four health workers completed the questionnaires at baseline. Reliability analysis and factor analysis were used to produce the simplified motivational index, which consisted of 10 equally-weighted items from three underlying factors. Scores on the 10-item index were closely correlated with scores for the 23-item index, indicating that future rapid assessments might be based on the 10 questions alone. The 10-item motivation index was also able to identify statistically significant differences in mean health worker motivation scores between the study hospitals (p<0.001). The parallel qualitative work in general supported these conclusions and contributed to our understanding of the three identified components of motivation. CONCLUSION: The 10-item score developed may be useful to monitor changes in motivation over time within our study or be used for more extensive rapid assessments of health worker motivation in Kenya.
BACKGROUND: The spread of resistance to chloroquine (CQ) led to its withdrawal from use in most countries in sub-Saharan Africa in the 1990s. In Malawi, this withdrawal was followed by a rapid reduction in the frequency of resistance to the point where the drug is now considered to be effective once again, just nine years after its withdrawal. In this report, polymorphisms in markers associated with CQ resistance were investigated in Plasmodium falciparum isolates from coastal Kenya (Kilifi), from 1993, prior to the withdrawal of CQ, to 2006, seven years after its withdrawal. These changes were compared with those occurring over the same period in the dihydrofolate reductase gene (dhfr), which confers resistance to the replacement drug, pyrimethamine/sulphadoxine. METHODS: Mutations associated with CQ resistance, at codon 76 of pfcrt, at codon 86 of pfmdr1, and at codons 51, 59 and 164 of dhfr, were analysed using PCR-restriction enzyme methods. In total, 406, 240 and 323 isolates were genotyped for pfcrt-76, pfmdr1-86 and dhfr, respectively. RESULTS: From 1993 to 2006, the frequency of the pfcrt-76 mutant significantly decreased from around 95% to 60%, while the frequency of pfmdr1-86 did not decline, remaining around 75%. Though the frequency of dhfr mutants was already high (around 80%) at the start of the study, it increased to above 95% during the study period. Mutation at codon 164 of dhfr was analysed in the 2006 samples, and none of them carried this mutation. CONCLUSION: In accord with the study in Malawi, a reduction in resistance to CQ following its official withdrawal in 1999 was found, but unlike in Malawi, the decline of resistance to CQ in Kilifi was much slower. It is estimated that, at current rates of decline, it will take 13 more years for the clinical efficacy of CQ to be restored in Kilifi.
In addition, CQ resistance was declining before the drug's official withdrawal, suggesting that, prior to the official ban, the use of CQ had decreased, probably due to its poor clinical effectiveness.
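The 13-year projection above can be reproduced as a back-of-envelope linear extrapolation of the pfcrt-76 mutant frequency. The sketch below is illustrative only: the authors' actual estimation method is not described in the abstract, and the 25% threshold at which clinical efficacy is assumed to be restored is a hypothetical value chosen for illustration.

```python
# Back-of-envelope sketch, assuming a linear decline in the frequency
# of the pfcrt-76 resistance marker (approximate figures from the abstract).
f_1993, f_2006 = 0.95, 0.60                 # observed mutant frequencies
rate = (f_1993 - f_2006) / (2006 - 1993)    # decline per year (~2.7 %-points)

# ASSUMED threshold frequency at which clinical efficacy returns:
target = 0.25
years_more = (f_2006 - target) / rate
print(f"~{years_more:.0f} more years")      # -> ~13 more years
```

With these assumptions the arithmetic matches the abstract's estimate, but a different assumed threshold or a non-linear decline would give a different answer.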
Science, 324 (5929), pp. 885. 2009. Public health. The cholera crisis in Africa.
BACKGROUND: Studies have highlighted the inadequacies of the public health sector in sub-Saharan African countries in providing appropriate malaria case management. The readiness of the public health sector to provide malaria case-management in Somalia, a country where there has been no functioning central government for almost two decades, was investigated. METHODS: Three districts were purposively sampled in each of the two self-declared states of Puntland and Somaliland and the south-central region of Somalia, in April-November 2007. A survey and mapping of all public and private health service providers was undertaken. Information was recorded on services provided, types of anti-malarial drugs used and stock, numbers and qualifications of staff, sources of financial support and presence of malaria diagnostic services, new treatment guidelines and job aides for malaria case-management. All settlements were mapped and a semi-quantitative approach was used to estimate their population size. Distances from settlements to public health services were computed. RESULTS: There were 45 public health facilities, 227 public health professionals, and 194 private pharmacies for approximately 0.6 million people in the three districts. The median distance to public health facilities was 6 km. 62.3% of public health facilities prescribed the nationally recommended anti-malarial drug and 37.7% prescribed chloroquine as first-line therapy. 66.7% of public facilities did not have in stock the recommended first-line malaria therapy. Diagnosis of malaria using rapid diagnostic tests (RDT) or microscopy was performed routinely in over 90% of the recommended public facilities but only 50% of these had RDT in stock at the time of survey. National treatment guidelines were available in 31.3% of public health facilities recommended by the national strategy. 
Only 8.8% of the private pharmacies prescribed artesunate plus sulphadoxine/pyrimethamine, while 53.1% prescribed chloroquine as first-line therapy. 31.4% of private pharmacies also provided malaria diagnosis using RDT or microscopy. CONCLUSION: Geographic access to the public health sector was relatively low, and there were major shortages of the appropriate guidelines, anti-malarials and diagnostic tests required for appropriate malaria case management. Efforts to strengthen the readiness of the health sector in Somalia to provide malaria case management should improve the availability of drugs and diagnostic kits; provide appropriate information and training; and engage and regulate the private sector to scale up malaria control.
BACKGROUND: Chronic Obstructive Pulmonary Disease (COPD) is a systemic disease; morbidity and mortality due to COPD are on the increase, and it has a great impact on patients' lives. Most COPD patients are managed by general practitioners (GPs). Too often, GPs base their initial assessment of a patient's disease severity mainly on lung function. However, lung function correlates poorly with COPD-specific health-related quality of life and exacerbation frequency. A validated COPD disease risk index that better represents the clinical manifestations of COPD and is feasible in primary care would therefore be useful. The objective of this study is to develop and validate a practical COPD disease risk index that predicts the clinical course of COPD in primary care patients with GOLD stages 2-4. METHODS/DESIGN: We will conduct two linked prospective cohort studies with COPD patients from GPs in Switzerland and the Netherlands. We will perform a baseline assessment including detailed patient history, questionnaires, lung function, history of exacerbations, measurement of exercise capacity and blood sampling. During the follow-up of at least 2 years, we will update the patients' profiles by registering exacerbations, health-related quality of life and any changes in the use of medication. The primary outcome will be health-related quality of life. Secondary outcomes will be exacerbation frequency and mortality. Using multivariable regression analysis, we will identify the best combination of variables predicting these outcomes over one and two years and, depending on funding, even more years. DISCUSSION: Despite the diversity of clinical manifestations and available treatments, assessment and management today do not reflect the multifaceted character of the disease. This is in contrast to preventive cardiology where, nowadays, treatment in primary care is based on a patient-specific and fairly refined cardiovascular risk profile corresponding to differences in prognosis.
After completion of this study, we will have a practical COPD-disease risk index that predicts the clinical course of COPD in primary care patients with GOLD stages 2-4. In a second step we will incorporate evidence-based treatment effects into this model, such that the instrument may guide physicians in selecting treatment based on the individual patients' prognosis. TRIAL REGISTRATION: ClinicalTrials.gov Archive NCT00706602.
BACKGROUND: An assessment of the correlation between anti-malarial treatment outcome and molecular markers would improve the early detection and monitoring of drug resistance in Plasmodium falciparum. The purpose of this systematic review was to determine the risk of treatment failure associated with specific polymorphisms in the parasite genome or with gene copy number. METHODS: Clinical studies of non-severe malaria reporting on target genetic markers (SNPs for pfmdr1, pfcrt, dhfr and dhps; gene copy number for pfmdr1) and providing complete information on inclusion criteria, outcome, follow-up and genotyping were included. Three investigators independently extracted data from the articles. Results were stratified by gene, codon, drug and duration of follow-up. For each study and for aggregate data, the random-effects odds ratio (OR) with 95% CI was estimated and presented as Forest plots. An OR whose lower 95% confidence limit exceeded 1 was considered consistent with failure being associated with a given gene mutation. RESULTS: Of the studies identified by computerized search, 92 were eligible, with information on pfcrt (25/159 studies), pfmdr1 (29/236 studies), dhfr (18/373 studies) and dhps (20/195 studies). The risk of therapeutic failure after chloroquine was increased by the presence of pfcrt K76T (Day 28, OR = 7.2 [95%CI: 4.5-11.5]); pfmdr1 N86Y was associated with both chloroquine (Day 28, OR = 1.8 [95%CI: 1.3-2.4]) and amodiaquine failures (OR = 5.4 [95%CI: 2.6-11.3, p < 0.001]). For sulphadoxine-pyrimethamine, the dhfr single (S108N) (Day 28, OR = 3.5 [95%CI: 1.9-6.3]) and triple mutants (S108N, N51I, C59R) (Day 28, OR = 3.1 [95%CI: 2.0-4.9]) and dhfr-dhps quintuple mutants (Day 28, OR = 5.2 [95%CI: 3.2-8.8]) also increased the risk of treatment failure. Increased pfmdr1 copy number was correlated with treatment failure following mefloquine (OR = 8.6 [95%CI: 3.3-22.9]).
CONCLUSION: When applying the selection procedure for comparative analysis, few studies fulfilled all inclusion criteria compared to the large number of papers identified, but heterogeneity was limited. Genetic molecular markers were related to an increased risk of therapeutic failure. Guidelines are discussed and a checklist for further studies is proposed.
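The review's decision rule (a mutation is flagged when the lower bound of the OR's 95% CI exceeds 1) can be sketched for a single study's 2x2 table using the standard Woolf log-OR approximation. The counts below are made up for illustration and are not taken from the review.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI for a 2x2 table:
    a = failures with mutation,  b = successes with mutation,
    c = failures with wild type, d = successes with wild type.
    Uses the Woolf (log-OR normal approximation) method."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only:
or_, lo, hi = odds_ratio_ci(30, 20, 10, 40)
print(f"OR = {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")  # -> OR = 6.0 (95% CI 2.5-14.7)
# Flagged as associated with failure only if lo > 1, as in the review.
```

A random-effects meta-analysis then pools such per-study log-ORs with between-study variance added to each weight; the single-study calculation above is the building block.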
Am J Trop Med Hyg, 80 (5), pp. 837-840. 2009. Comparison of indirect immunofluorescence assays for diagnosis of scrub typhus and murine typhus using venous blood and finger prick filter paper blood spots.
We performed indirect immunofluorescence assays (IFAs) to compare levels of IgM and IgG antibodies to Orientia tsutsugamushi and Rickettsia typhi in admission-phase serum samples and filter paper blood spots (assayed immediately and after storage at 5.4 degrees C and 29 degrees C for 30 days) collected on the same day from 53 adults with suspected scrub typhus and murine typhus admitted to Mahosot Hospital, Vientiane, Lao People's Democratic Republic. The sensitivities and specificities of admission-phase filter paper blood spots in comparison to paired sera were 91-95% and 87-100%, respectively, for the diagnosis of scrub typhus and murine typhus. The classification of patients as having or not having typhus did not differ significantly after storage of the blood spots for 30 days (P > 0.4) at 5.4 degrees C and 29 degrees C. Because filter paper blood samples do not require sophisticated and expensive storage and transport, they may be an appropriate specimen collection technique for the diagnosis of rickettsial disease in the rural tropics.
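Sensitivity and specificity figures like those reported above come from a 2x2 comparison of the index test against the reference standard. A minimal sketch, with hypothetical counts that are not the study's data:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity and predictive values from a 2x2 table
    comparing an index test (e.g. blood-spot IFA) with a reference
    standard (e.g. paired sera)."""
    sens = tp / (tp + fn)   # proportion of true cases detected
    spec = tn / (tn + fp)   # proportion of non-cases correctly ruled out
    ppv = tp / (tp + fp)    # probability a positive result is a true case
    npv = tn / (tn + fn)    # probability a negative result is a true non-case
    return sens, spec, ppv, npv

# Hypothetical counts for illustration only:
sens, spec, ppv, npv = diagnostic_accuracy(tp=20, fp=2, fn=1, tn=30)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")
# -> sensitivity 95%, specificity 94%
```

Note that predictive values, unlike sensitivity and specificity, depend on disease prevalence in the population tested.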
High cerebral blood flow velocity (CBFv) and low haemoglobin oxygen saturation (SpO(2)) predict neurological complications in sickle cell anaemia (SCA) but any association is unclear. In a cross-sectional study of 105 Kenyan children, mean CBFv was 120 +/- 34.9 cm/s; 3 had conditional CBFv (170-199 cm/s) but none had abnormal CBFv (>200 cm/s). After adjustment for age and haematocrit, CBFv > or =150 cm/s was predicted by SpO(2) < or = 95% and history of fever. Four years later, 10 children were lost to follow-up, none had suffered neurological events and 11/95 (12%) had died, predicted by history of fever but not low SpO(2). Natural history of SCA in Africa may be different from North America and Europe.
Vaccine research is a combinatorial science requiring computational analysis of vaccine components, formulations and optimization. We have developed a framework that combines computational tools for the study of immune function and vaccine development. This framework, named ImmunoGrid, combines conceptual models of the immune system, models of antigen processing and presentation, system-level models of the immune system, Grid computing, and database technology to facilitate the discovery, formulation and optimization of vaccines. ImmunoGrid modules share common conceptual models and ontologies. The ImmunoGrid portal offers access to educational simulators, where previously defined cases can be displayed, and to research simulators that allow the development of new, or the tuning of existing, computational models. The portal is accessible at <igrid-ext.cryst.bbk.ac.uk/immunogrid>.
In studies of immunity to malaria, the absence of febrile malaria is commonly considered evidence of "protection." However, apparent "protection" may be due to a lack of exposure to infective mosquito bites or due to immunity. We studied a cohort that was given curative antimalarials before monitoring began and documented newly acquired asymptomatic parasitemia and febrile malaria episodes during 3 months of surveillance. With increasing age, there was a shift away from febrile malaria to acquiring asymptomatic parasitemia, with no change in the overall incidence of infection. Antibodies to the infected red cell surface were associated with acquiring asymptomatic infection rather than febrile malaria or remaining uninfected. Bed net use was associated with remaining uninfected rather than acquiring asymptomatic infection or febrile malaria. These observations suggest that most uninfected children were unexposed rather than "immune." Had they been immune, we would have expected the proportion of uninfected children to rise with age and that the uninfected children would have been distinguished from children with febrile malaria by the protective antibody response. We show that removing the less exposed children from conventional analyses clarifies the effects of immunity, transmission intensity, bed nets, and age. Observational studies and vaccine trials will have increased power if they differentiate between unexposed and immune children.
Am J Trop Med Hyg, 80 (5), pp. 737-738. 2009. Malaria drug shortages in Kenya: a major failure to provide access to effective treatment.
A key benchmark of successful therapeutic policy implementation, and thus effectiveness, is that the recommended drugs are available at the point of care. Two years after artemether-lumefantrine (AL) was introduced for the management of uncomplicated malaria in Kenya, we carried out a cross-sectional survey to investigate AL availability in government facilities in seven malaria-endemic districts. One in four of the surveyed facilities had none of the four AL weight-specific treatment packs in stock; three in four facilities were out of stock of at least one weight-specific AL pack, leading health workers to prescribe a range of inappropriate alternatives. The shortage was in large part caused by a delayed procurement process. National ministries of health and the international community must address the current shortcomings in antimalarial drug supply to the public sector.
Partial nucleotide sequences (459 bp) of the groEL gene (encoding the 60-kDa heat shock protein, HSP60) from 23 contemporary isolates of Orientia tsutsugamushi obtained from patients with acute scrub typhus in Thailand were compared with 16 reference strain sequences to evaluate the potential of groEL as a conserved and representative target for molecular diagnostics. Overall nucleotide identity within all available O. tsutsugamushi isolates (n = 39) was 98.8% (range: 95.0-100%), reflecting a high degree of conservation; nucleotide identities were 67.5% and 65.6%, respectively, when typhus and spotted fever group rickettsiae were included. A highly sensitive and quantitative real-time PCR assay was designed and evaluated using 61 samples, including buffy coats from patients in Thailand and Laos. Reliable and accurate quantitation of bacterial loads allows further investigation of other diagnostic methods and may lead to an improved understanding of the pathophysiology of acute scrub typhus, an important but under-recognized disease.
Long considered a benign infection, Plasmodium vivax is now recognized as a cause of severe and fatal malaria, despite its low parasite biomass, the increased deformability of vivax-infected red blood cells and an apparent paucity of parasite sequestration. Severe anemia is associated with recurrent bouts of hemolysis of predominantly uninfected erythrocytes with increased fragility, and lung injury is associated with inflammatory increases in alveolar-capillary membrane permeability. Although rare, vivax-associated coma challenges our understanding of pathobiology caused by Plasmodium spp. Host and parasite factors contribute to the risk of severe disease, and comorbidities might contribute to vivax mortality. In this review, we discuss potential mechanisms underlying the syndromes of uncomplicated and severe vivax malaria, identifying key areas for future research.
Int J Tuberc Lung Dis, 13 (5), pp. 613-619. 2009. Sputum, sex and scanty smears: new case definition may reduce sex disparities in smear-positive tuberculosis.
SETTING: Urban clinic, Nairobi. OBJECTIVES: To evaluate the impact of specimen quality and different smear-positive tuberculosis (TB) case (SPC) definitions on SPC detection by sex. DESIGN: Prospective study among TB suspects. RESULTS: A total of 695 patients were recruited: 644 produced > or =1 specimen for microscopy. The male/female sex ratio was 0.8. There were no significant differences in the numbers of men and women submitting three specimens (274/314 vs. 339/380, P = 0.43). Significantly more men than women produced a set of three 'good' quality specimens (175/274 vs. 182/339, P = 0.01). Lowering the thresholds of the definitions to include scanty smears resulted in increases in SPC detection in both sexes; the increase was significantly higher for women. The revised World Health Organization (WHO) case definition was associated with the highest detection rates in women. When analysis was restricted to patients submitting 'good' quality specimen sets, the difference in detection between the sexes was on the threshold of significance (P = 0.05). CONCLUSIONS: Higher SPC notification rates in men are commonly reported by TB control programmes. The revised WHO SPC definition may reduce sex disparities in notification, and this should be considered when evaluating other interventions aimed at reducing them. Further study is required on the effects of human immunodeficiency virus infection and of instructed specimen collection on the sex-specific impact of the new SPC definition.
Am J Trop Med Hyg, 80 (5), pp. 827-831. 2009. Microbiologic characterization and antimicrobial susceptibility of Clostridium tetani isolated from wounds of patients with clinically diagnosed tetanus.
Clostridium tetani is the etiologic agent of tetanus, a disease characterized by muscle spasms. Despite an effective vaccine, tetanus remains an ongoing problem in some developing countries. Diagnosis by bacterial culture is not routinely done because it is generally unnecessary and the route of entry of the bacteria can be inapparent. We attempted to isolate and evaluate C. tetani from the wounds of 84 patients with tetanus and successfully isolated it from 45 patients. All strains tested positive by polymerase chain reaction for the gene encoding tetanus neurotoxin. Antimicrobial susceptibilities were determined by disc diffusion and E-test. All C. tetani isolates were susceptible to penicillin and metronidazole but resistant to co-trimoxazole. Despite treatment with high doses of penicillin, C. tetani was isolated after 16 days of intravenous penicillin in two cases. These data show that the intravenous route for penicillin may be inadequate for clearing the infection and emphasize the importance of wound debridement in the treatment of tetanus.
PLoS Medicine, 6 (5), pp. e1000085. 2009. Hedging against Antiviral Resistance during the Next Influenza Pandemic Using Small Stockpiles of an Alternative Chemotherapy.
Plasmodium belongs to the phylum Apicomplexa. Within the Apicomplexa, Plasmodium, Toxoplasma and Cryptosporidium are parasites of considerable medical importance, while Theileria and Eimeria are animal pathogens. P. falciparum is particularly important as it causes malaria, resulting in more than 1 million deaths each year. The malaria parasite actively invades the host cell in which it propagates, and several proteins associated with the apical organelles have been implicated as crucial in the invasion process. The biogenesis of the apical organelles is not well understood, but several studies indicate that microtubule-based vesicular transport is involved. Vesicular transport proteins are also present in Plasmodium and are presumed to be involved in transcellular transport in infected erythrocytes. Dynein is a multi-subunit motor protein involved in microtubule-based vesicular transport. In this study, we analyzed the cytoplasmic dynein light chains (Dlcs) of P. falciparum, since they provide the adaptor surface for cargoes and are likely to be involved in differential transport. Dlcs consist of three different families: TcTex1/2, LC8 and LC7/roadblock. The data presented demonstrate that P. falciparum Dlc sequences and functional domains show high sequence similarity within the species, but that only the Dlc group 1 (LC8) has high similarity to human orthologues. TcTex1 and LC7/roadblock have low similarity to human orthologues. This sequence variation could be targeted for vaccine or drug development.
Huisarts en Wetenschap, 52 (5), pp. 218-224. 2009. Antibiotics for exacerbations of chronic obstructive pulmonary disease seem to reduce relapses.
Ann Intern Med, 150 (8), pp. 567-568. 2009. Gallbladder carriage of Salmonella paratyphi A may be an important factor in the increasing incidence of this infection in South Asia.
BACKGROUND: The CD4 cell count at which combination antiretroviral therapy should be started is a central, unresolved issue in the care of HIV-1-infected patients. In the absence of randomised trials, we examined this question in prospective cohort studies. METHODS: We analysed data from 18 cohort studies of patients with HIV. Antiretroviral-naive patients from 15 of these studies were eligible for inclusion if they had started combination antiretroviral therapy (while AIDS-free, with a CD4 cell count less than 550 cells per microL, and with no history of injecting drug use) on or after Jan 1, 1998. We used data from patients followed up in seven of the cohorts in the era before the introduction of combination therapy (1989-95) to estimate distributions of lead times (from the first CD4 cell count measurement in an upper range to the upper threshold of a lower range) and unseen AIDS and death events (occurring before the upper threshold of a lower CD4 cell count range is reached) in the absence of treatment. These estimations were used to impute completed datasets in which lead times and unseen AIDS and death events were added to data for treated patients in deferred therapy groups. We compared the effect of deferred initiation of combination therapy with immediate initiation on rates of AIDS and death, and on death alone, in adjacent CD4 cell count ranges of width 100 cells per microL. FINDINGS: Data were obtained for 21 247 patients who were followed up during the era before the introduction of combination therapy and 24 444 patients who were followed up from the start of treatment. Deferring combination therapy until a CD4 cell count of 251-350 cells per microL was associated with higher rates of AIDS and death than starting therapy in the range 351-450 cells per microL (hazard ratio [HR] 1.28, 95% CI 1.04-1.57). The adverse effect of deferring treatment increased with decreasing CD4 cell count threshold. 
Deferred initiation of combination therapy was also associated with higher mortality rates, although effects on mortality were less marked than effects on AIDS and death (HR 1.13, 0.80-1.60, for deferred initiation of treatment at CD4 cell count 251-350 cells per microL compared with initiation at 351-450 cells per microL). INTERPRETATION: Our results suggest that 350 cells per microL should be the minimum threshold for initiation of antiretroviral therapy, and should help to guide physicians and patients in deciding when to start treatment.
BACKGROUND: In Uganda, as in many other countries traditionally viewed as harbouring very high malaria transmission, the norm has been to recommend that febrile episodes be diagnosed as malaria. In this study, the policy implications of such recommendations are revisited. METHODS: A cross-sectional survey was undertaken at outpatient departments of all health facilities in four Ugandan districts. The routine diagnostic practices were assessed for all patients during exit interviews and a research slide was obtained for later reading. Primary outcome measures were the accuracy of national recommendations and routine malaria diagnosis in comparison with the study definition of malaria (any parasitaemia on expert slide examination in a patient with fever), stratified by age and intensity of malaria transmission. Secondary outcome measures were the use, interpretation and accuracy of routine malaria microscopy. RESULTS: 1,763 consultations undertaken by 233 health workers at 188 facilities were evaluated. The prevalence of malaria was 24.2% and ranged from 13.9% in patients ≥5 years in medium-to-high transmission areas to 50.5% in children <5 years in very high transmission areas. Overall, the sensitivity and negative predictive value (NPV) of routine malaria diagnosis were high (89.7% and 91.6% respectively) while the specificity and positive predictive value (PPV) were low (35.6% and 30.8% respectively). However, malaria was under-diagnosed in 39.9% of children less than five years of age in the very high transmission area. At 48 facilities with functional microscopy, the use of malaria slide examination was low (34.5%), without significant differences between age groups or between patients for whom microscopy is recommended or not. 96.2% of patients with a routine positive slide result were treated for malaria, as were 47.6% of those with a negative result.
CONCLUSION: Current recommendations and associated clinical practices result in massive malaria over-diagnosis across all age groups and transmission areas in Uganda. Yet, under-diagnosis is also common in children <5 years. The potential benefits of malaria microscopy are not realized. To address malaria misdiagnosis, Uganda's policy shift from presumptive to parasitological diagnosis should encompass introduction of malaria rapid diagnostic tests and substantial strengthening of malaria microscopy.
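The accuracy measures reported above (sensitivity, specificity, PPV, NPV) follow directly from a 2×2 table of routine diagnosis against the expert-slide reference standard. A minimal sketch with hypothetical counts, chosen only to approximate the reported percentages, not the study's actual data:

```python
# Diagnostic accuracy from a 2x2 table: routine malaria diagnosis
# versus the expert-slide reference standard.
# These counts are hypothetical, picked to roughly mimic the reported
# pattern (high sensitivity/NPV, low specificity/PPV); they are NOT
# the study's real data.
tp, fn = 90, 10    # reference-positive: correctly / incorrectly classified
fp, tn = 180, 100  # reference-negative: incorrectly / correctly classified

sensitivity = tp / (tp + fn)           # P(routine+ | malaria)
specificity = tn / (tn + fp)           # P(routine- | no malaria)
ppv = tp / (tp + fp)                   # P(malaria | routine+)
npv = tn / (tn + fn)                   # P(no malaria | routine-)
prevalence = (tp + fn) / (tp + fn + fp + tn)

print(f"sens={sensitivity:.1%} spec={specificity:.1%} "
      f"ppv={ppv:.1%} npv={npv:.1%} prev={prevalence:.1%}")
```

With these counts the pattern in the abstract reproduces itself: a presumptive strategy catches nearly all true cases but treats many uninfected patients.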
BACKGROUND: Asthma is a difficult diagnosis to establish in preschool children. A few years ago, our group presented a prediction rule for young children at risk of asthma in general practice. Before this prediction rule can safely be used in practice, cross-validation is required. In addition, general practitioners face many therapeutic management decisions in children at risk of asthma. The objectives of the study are: (1) identification of predictors of asthma in preschool children at risk of asthma, with the aim of cross-validating an earlier derived prediction rule; (2) comparison of the effects of different treatment strategies in preschool children. DESIGN: In this prospective cohort study, one to five year old children at risk of developing asthma were selected from general practices. At risk was defined as 'visited the general practitioner with recurrent coughing (≥2 visits), wheezing (≥1) or shortness of breath (≥1) in the previous 12 months'. All children in this prospective cohort study will be followed until the age of six. For our prediction rule, demographic data, data with respect to clinical history, and additional tests (specific immunoglobulin E (IgE), fractional exhaled nitric oxide (FENO), peak expiratory flow (PEF)) are collected. History of airway-specific medication use, symptom severity and health-related quality of life (QoL) are collected to estimate the effect of different treatment intensities (as expressed in GINA levels) using recently developed statistical techniques. In total, 1,938 children at risk of asthma were selected from general practice and 771 children (40%) were enrolled. At the time of writing, follow-up for all 5-year-olds and the majority of the 4-year-olds is complete. The total and specific IgE measurements at baseline were carried out in 87% of the children.
Response rates to the repeated questionnaires varied from 93% at baseline to 73% after 18 months of follow-up; 89% and 87% performed PEF and FENO measurements, respectively. DISCUSSION: In this study a prediction rule for asthma in young children, to be used in (general) practice, will be cross-validated. Our study will also provide more insight into the effect of treatment of asthma in preschool children.
Wilderness & Environmental Medicine, 20 (1), pp. 81-82. 2009. Clinical Images.
A liquid chromatographic tandem mass spectrometry method for the quantification of artemisinin in human heparinised plasma has been developed and validated. The method uses Oasis HLB mu-elution solid phase extraction 96-well plates to facilitate a high throughput of 192 samples a day. Artesunate (internal standard) in a plasma-water solution was added to plasma (50 microL) before solid phase extraction. Artemisinin and its internal standard artesunate were analysed by liquid chromatography with MS/MS detection on a Hypersil Gold C18 (100 mm x 2.1 mm, 5 microm) column using a mobile phase containing acetonitrile-ammonium acetate 10 mM pH 3.5 (50:50, v/v) at a flow rate of 0.5 mL/min. The method has been validated according to published FDA guidelines and showed excellent performance. The within-day, between-day and total precisions, expressed as R.S.D., were lower than 8% at all tested quality control levels, including the upper and lower limits of quantification. The limit of detection was 0.257 ng/mL for artemisinin and the calibration range was 1.03-762 ng/mL using 50 microL plasma. The method was free from matrix effects, as demonstrated both graphically and quantitatively.
BACKGROUND: Infection with the Gram-negative bacterium Burkholderia pseudomallei is an important cause of community-acquired lethal sepsis in endemic regions in southeast Asia and northern Australia and is increasingly reported in other tropical areas. In animal models, production of interferon-gamma (IFN-gamma) is critical for resistance, but in humans the characteristics of IFN-gamma production and the bacterial antigens that are recognized by the cell-mediated immune response have not been defined. METHODS: Peripheral blood from 133 healthy individuals who lived in the endemic area and had no history of melioidosis, 60 patients who had recovered from melioidosis, and 31 other patient control subjects were stimulated by whole bacteria or purified bacterial proteins in vitro, and IFN-gamma responses were analyzed by ELISPOT and flow cytometry. FINDINGS: B. pseudomallei was a potent activator of human peripheral blood NK cells for innate production of IFN-gamma. In addition, healthy individuals with serological evidence of exposure to B. pseudomallei and patients recovered from active melioidosis developed CD4(+) (and CD8(+)) T cells that recognized whole bacteria and purified proteins LolC, OppA, and PotF, members of the B. pseudomallei ABC transporter family. This response was primarily mediated by terminally differentiated T cells of the effector-memory (T(EMRA)) phenotype and correlated with the titer of anti-B. pseudomallei antibodies in the serum. CONCLUSIONS: Individuals living in a melioidosis-endemic region show clear evidence of T cell priming for the ability to make IFN-gamma that correlates with their serological status. The ability to detect T cell responses to defined B. pseudomallei proteins in large numbers of individuals now provides the opportunity to screen candidate antigens for inclusion in protein or polysaccharide-conjugate subunit vaccines against this important but neglected disease.
Mefloquine is widely used in combination with artemisinin derivatives for the treatment of falciparum malaria. Mefloquine resistance in Plasmodium falciparum has been related to increased copy numbers of the multidrug resistance gene 1 (pfmdr1). We studied the ex vivo dynamics of pfmdr1 gene amplification in culture-adapted P. falciparum in relation to mefloquine resistance and parasite fitness. A Thai P. falciparum isolate (isolate TM036) was assessed by the use of multiple genetic markers as a single genotype. Resistance was selected by exposure to stepwise increasing concentrations of mefloquine up to 30 ng/ml in continuous culture. The pfmdr1 gene copy numbers increased as susceptibility to mefloquine declined (P = 0.03). No codon mutations at positions 86, 184, 1034, 1042, and 1246 in the pfmdr1 gene were detected. Two subclones of selected parasites (average copy numbers, 2.3 and 3.1, respectively) showed a fitness disadvantage when they were grown together with the original parasites containing a single pfmdr1 gene copy in the absence of mefloquine; the multiplication rates were 6.3% and 8.7% lower, respectively (P < 0.01). Modeling of the dynamics of the pfmdr1 copy numbers over time in relation to the relative fitness of the parasites suggested that net pfmdr1 gene amplification from one to two copies occurs once in every 10(8) parasites and that amplification from two to three copies occurs once in every 10(3) parasites. pfmdr1 gene amplification in P. falciparum is a frequent event and confers mefloquine resistance. Parasites with multiple copies of the pfmdr1 gene have decreased survival fitness in the absence of drug pressure.
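The fitness cost reported above can be illustrated with a toy head-to-head growth model: two clones multiplying in the same drug-free culture, with the multi-copy clone growing a few percent slower per cycle. The 8.7% per-cycle disadvantage is taken from the abstract; the starting frequencies, multiplication factor, and cycle count are illustrative assumptions:

```python
# Toy competition model: single-copy (wild-type) vs multi-copy pfmdr1
# clone in mixed culture WITHOUT mefloquine pressure.
# The 8.7% multiplication disadvantage comes from the abstract; the
# base rate, starting mix, and number of cycles are illustrative.
wt_rate = 8.0                      # multiplication factor per cycle (assumed)
mut_rate = wt_rate * (1 - 0.087)   # 8.7% lower, as measured ex vivo

wt, mut = 0.5, 0.5                 # start at equal frequencies
freqs = []
for cycle in range(20):
    wt, mut = wt * wt_rate, mut * mut_rate
    total = wt + mut
    wt, mut = wt / total, mut / total   # renormalise to frequencies
    freqs.append(mut)

print(f"multi-copy frequency after 20 cycles: {freqs[-1]:.3f}")
```

Under this sketch the multi-copy clone falls from half the population to well under a fifth within 20 cycles, consistent with the selective disadvantage described in the absence of drug pressure.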
Severe Plasmodium falciparum malaria is a major cause of global mortality, yet the immunological factors underlying progression to severe disease remain unclear. CD4(+)CD25(+) regulatory T cells (Treg cells) are associated with impaired T cell control of Plasmodium spp infection. We investigated the relationship between Treg cells, parasite biomass, and P. falciparum malaria disease severity in adults living in a malaria-endemic region of Indonesia. CD4(+)CD25(+)Foxp3(+)CD127(lo) Treg cells were significantly elevated in patients with uncomplicated (UM; n = 17) and severe malaria (SM; n = 16) relative to exposed asymptomatic controls (AC; n = 10). In patients with SM, Treg cell frequency correlated positively with parasitemia (r = 0.79, p = 0.0003) and total parasite biomass (r = 0.87, p<0.001), both major determinants for the development of severe and fatal malaria, and Treg cells were significantly increased in hyperparasitemia. There was a further significant correlation between Treg cell frequency and plasma concentrations of soluble tumor necrosis factor receptor II (TNFRII) in SM. A subset of TNFRII(+) Treg cells with high expression of Foxp3 was increased in severe relative to uncomplicated malaria. In vitro, P. falciparum-infected red blood cells dose dependently induced TNFRII(+)Foxp3(hi) Treg cells in PBMC from malaria-unexposed donors which showed greater suppressive activity than TNFRII(-) Treg cells. The selective enrichment of the Treg cell compartment for a maximally suppressive TNFRII(+)Foxp3(hi) Treg subset in severe malaria provides a potential link between immune suppression, increased parasite biomass, and malaria disease severity. The findings caution against the induction of TNFRII(+)Foxp3(hi) Treg cells when developing effective malaria vaccines.
A systematic review was performed to determine the effectiveness of different approaches for eradicating methicillin-resistant Staphylococcus aureus carriage. Twenty-three clinical trials were selected that evaluated oral antibiotics (7 trials), topically applied antibiotics (12 trials), or both (4 trials). Because of clinical heterogeneity, quantitative analysis of all studies was deemed to be inappropriate, and exploratory subgroup analyses were performed for studies with similar study populations, methods, and targeted bacteria. The estimated pooled relative risk of treatment failure 1 week after short-term nasal mupirocin treatment, compared with placebo, was 0.10 (range, 0.07-0.14). There was low heterogeneity between study outcomes, and effects were similar for patients and healthy subjects, as well as in studies that included only methicillin-susceptible S. aureus carriers or both methicillin-susceptible S. aureus and methicillin-resistant S. aureus carriers. The development of drug resistance during treatment was reported in 1% and 9% of patients receiving mupirocin and oral antibiotics, respectively. Short-term nasal application of mupirocin is the most effective treatment for eradicating methicillin-resistant S. aureus carriage, with an estimated success rate of 90% 1 week after treatment and approximately 60% after a longer follow-up period.
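A pooled relative risk such as the 0.10 reported above is conventionally obtained by inverse-variance weighting of the log relative risks across trials. A generic fixed-effect sketch; the trial-level estimates below are invented for illustration and are not the trials from this review:

```python
import math

# Fixed-effect (inverse-variance) pooling of relative risks.
# (RR, standard error of log RR) pairs are invented for illustration;
# they are NOT the 23 trials from the review.
trials = [(0.08, 0.30), (0.12, 0.25), (0.10, 0.40)]

weights = [1 / se**2 for _, se in trials]
pooled_log_rr = sum(w * math.log(rr)
                    for (rr, _), w in zip(trials, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))   # SE of the pooled log RR

pooled_rr = math.exp(pooled_log_rr)
ci_low = math.exp(pooled_log_rr - 1.96 * pooled_se)
ci_high = math.exp(pooled_log_rr + 1.96 * pooled_se)
print(f"pooled RR = {pooled_rr:.3f} (95% CI {ci_low:.3f}-{ci_high:.3f})")
```

Working on the log scale keeps the ratio estimates approximately normal before back-transforming, which is why the pooled value and its interval are exponentiated at the end.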
BACKGROUND: Artemether-lumefantrine is the most widely recommended artemisinin-based combination treatment for falciparum malaria. Quantification of artemether and its metabolite dihydroartemisinin in biological matrices has traditionally been difficult, with sensitivity being an issue. RESULTS: A high-throughput bioanalytical method for the analysis of artemether and its metabolite dihydroartemisinin in human plasma using solid-phase extraction in the 96-well plate format and liquid chromatography coupled to positive ion mode tandem mass spectroscopy has been developed and validated according to US FDA guidelines. The method uses 50 µl plasma and covers the calibration range 1.43-500 ng/ml with a limit of detection at 0.36 ng/ml. CONCLUSIONS: The developed liquid chromatography-tandem mass spectrometry assay is more sensitive than all previous methods despite using a lower plasma volume (50 µl) and is highly suitable for clinical studies where plasma volumes are limited, such as pediatric trials.
BACKGROUND: Transmission of highly pathogenic avian H5N1 viruses from poultry to humans has raised fears of an impending influenza pandemic. Concerted efforts are underway to prepare effective vaccines and therapies including polyclonal or monoclonal antibodies against H5N1. Current efforts are hampered by the paucity of information on protective immune responses against avian influenza. Characterizing the B cell responses in convalescent individuals could help in the design of future vaccines and therapeutics. METHODS AND FINDINGS: To address this need, we generated whole-genome-fragment phage display libraries (GFPDL) expressing fragments of 15-350 amino acids covering all the proteins of A/Vietnam/1203/2004 (H5N1). These GFPDL were used to analyze neutralizing human monoclonal antibodies and sera of five individuals who had recovered from H5N1 infection. This approach led to the mapping of two broadly neutralizing human monoclonal antibodies with conformation-dependent epitopes. In H5N1 convalescent sera, we have identified several potentially protective H5N1-specific human antibody epitopes in H5 HA[(-10)-223], neuraminidase catalytic site, and M2 ectodomain. In addition, for the first time to our knowledge in humans, we identified strong reactivity against PB1-F2, a putative virulence factor, following H5N1 infection. Importantly, novel epitopes were identified, which were recognized by H5N1-convalescent sera but did not react with sera from control individuals (H5N1 naïve, H1N1 or H3N2 seropositive). CONCLUSION: This is the first study, to our knowledge, describing the complete antibody repertoire following H5N1 infection. Collectively, these data will contribute to rational vaccine design and new H5N1-specific serodiagnostic surveillance tools.
OBJECTIVE: To describe the prevalence of hypoxaemia in children admitted to a hospital in Kenya for the purpose of identifying clinical signs of hypoxaemia for emergency triage assessment, and to test the hypothesis that such signs lead to correct identification of hypoxaemia in children, irrespective of their diagnosis. METHODS: From 2002 to 2005 we prospectively collected clinical data and pulse oximetry measurements for all paediatric admissions to Kilifi District Hospital, Kenya, irrespective of diagnosis, and assessed the prevalence of hypoxaemia in relation to the WHO clinical syndromes of 'pneumonia' on admission and the final diagnoses made at discharge. We used the data collected over the first three years to derive signs predictive of hypoxaemia, and data from the fourth year to validate those signs. FINDINGS: Hypoxaemia was found in 977 of 15 289 (6.4%) admissions (5% to 19% depending on age group) and was strongly associated with inpatient mortality (age-adjusted risk ratio: 4.5; 95% confidence interval, CI: 3.8-5.3). Although most hypoxaemic children aged > 60 days met the WHO criteria for a syndrome of 'pneumonia' on admission, only 215 of the 693 (31%) such children had a final diagnosis of lower respiratory tract infection (LRTI). The most predictive signs for hypoxaemia included shock, a heart rate < 80 beats per minute, irregular breathing, a respiratory rate > 60 breaths per minute and impaired consciousness. However, 5-15% of the children who had hypoxaemia on admission were missed, and 18% of the children were incorrectly identified as hypoxaemic. CONCLUSION: The syndromes of pneumonia make it possible to identify most hypoxaemic children, including those without LRTI. Shock, bradycardia and irregular breathing are important predictive signs, and severe malaria with respiratory distress is a common cause of hypoxaemia.
Overall, however, clinical signs are poor predictors of hypoxaemia, and using pulse oximetry in resource-poor health facilities to target oxygen therapy is likely to save costs.
The diagnosis of severe Streptococcus pneumoniae infection relies heavily on insensitive culture techniques. To improve the usefulness of PCR assays, we developed a dual-PCR protocol (targeted at pneumolysin and autolysin) for EDTA blood samples. This was compared to the Binax NOW S. pneumoniae urine antigen test in patients with bacteremic pneumococcal infections. Patients with nonbacteremic community-acquired pneumonia also were tested by these methods to determine what proportion could be confirmed as pneumococcal infections. A direct comparison was made in a group of patients who each had both tests performed. The Binax NOW S. pneumoniae urine antigen test was positive in 51 of 58 bacteremic pneumococcal cases (sensitivity, 88%; 95% confidence interval [CI], 77 to 95%), whereas the dual PCR was positive in 31 cases (sensitivity, 53.5%; 95% CI, 40 to 67%; P < 0.0001), and all of these had detectable urinary antigens. Both tests gave positive results in 2 of 51 control patients (referred to as other-organism septicemia), giving a specificity of 96% (95% CI, 86.5 to 99.5%). In 77 patients with nonbacteremic community-acquired pneumonia, urinary antigen was detected significantly more often (in 21 patients [27%]) than a positive result by the dual-PCR protocol (6 [8%]) (P = 0.002). The development of a dual-PCR protocol enhanced the sensitivity compared to that of the individual assays, but it is still significantly less sensitive than the Binax NOW urine antigen test, as well as being more time-consuming and expensive. Urinary antigen detection is the nonculture diagnostic method of choice for patients with possible severe pneumococcal infection.
Neth J Med, 67 (4), pp. 127-133. 2009. Establishment of reference values for endocrine tests. Part VII: growth hormone deficiency.
BACKGROUND: Plasma insulin-like growth factor (IGF-I) concentration can be used as a rough indicator of the growth-hormone status. However, for the diagnosis of growth hormone deficiency, dynamic tests are required. The growth hormone (GH) response in the insulin tolerance test (ITT) is considered to be the gold standard in this respect. An alternative to the ITT is the GHRH/GHRP-6 test, which has fewer side effects. In this study we established reference values for IGF-I levels and for the GH response in both dynamic tests. METHODS: We studied 296 subjects recruited from the general population, equally distributed according to sex and aged between 20 and 70 years. Serum IGF-I level was measured in all subjects and an insulin tolerance test (0.15 U/kg Actrapid iv) and GHRH/GHRP-6 test (1 microg GHRH/kg and 1 microg GHRP-6/kg) were performed in 49 subjects. RESULTS: In multivariate analyses both IGF-I and the GH response in the ITT were significantly influenced by age, whereas the GH response in the GHRH/GHRP-6 test was significantly affected by BMI. There was no sex difference in IGF-I or in the GHRH/GHRP-6 test, but in the ITT males had a higher GH peak. There was a significant correlation between the GH responses in both tests, and the GH response was significantly higher in the GHRH/GHRP-6 test than in the ITT. Age-adjusted reference values were established for each test. CONCLUSION: We have established age-adjusted reference values for serum IGF-I and for the GH response in the ITT and GHRH/GHRP-6 test.
OBJECTIVES: To assess impact of serial lumbar punctures on association between cerebrospinal fluid (CSF) opening pressure and prognosis in HIV-associated cryptococcal meningitis; to explore time course and relationship of opening pressure with neurological findings, CSF fungal burden, immune response, and CD4 cell count. DESIGN: Evaluation of 163 HIV-positive ART-naive patients enrolled in three trials of amphotericin B-based therapy for cryptococcal meningitis in Thailand and South Africa. METHODS: Study protocols required four lumbar punctures with measurements of opening pressure over the first 2 weeks of treatment and additional lumbar punctures if opening pressure was raised. Fungal burden and clearance, CSF immune parameters, CD4 cell count, neurological symptoms and signs, and outcome at 2 and 10 weeks were compared between groups categorized by opening pressure at cryptococcal meningitis diagnosis. RESULTS: Patients with higher baseline fungal burden had higher baseline opening pressure. High fungal burden appeared necessary but not sufficient for development of high pressure. Baseline opening pressure was not associated with CD4 cell count, CSF pro-inflammatory cytokines, or altered mental status. Day 14 opening pressure was associated with day 14 fungal burden. Overall mortality was 12% (20/162) at 2 weeks and 26% (42/160) at 10 weeks, with no significant differences between opening pressure groups. CONCLUSION: Studies are needed to define factors, in addition to fungal burden, associated with raised opening pressure. Aggressive management of raised opening pressure through repeated CSF drainage appeared to prevent any adverse impact of raised opening pressure on outcome in patients with cryptococcal meningitis. The results support increasing access to manometers in resource-poor settings and routine management of opening pressure in patients with cryptococcal meningitis.
PLoS Med, 6 (3), pp. e52. 2009. Guidelines for field surveys of the quality of medicines: a proposal.
BACKGROUND: Efficient allocation of resources to intervene against malaria requires a detailed understanding of the contemporary spatial distribution of malaria risk. It is exactly 40 y since the last global map of malaria endemicity was published. This paper describes the generation of a new world map of Plasmodium falciparum malaria endemicity for the year 2007. METHODS AND FINDINGS: A total of 8,938 P. falciparum parasite rate (PfPR) surveys were identified using a variety of exhaustive search strategies. Of these, 7,953 passed strict data fidelity tests for inclusion into a global database of PfPR data, age-standardized to 2-10 y for endemicity mapping. A model-based geostatistical procedure was used to create a continuous surface of malaria endemicity within previously defined stable spatial limits of P. falciparum transmission. These procedures were implemented within a Bayesian statistical framework so that the uncertainty of these predictions could be evaluated robustly. The uncertainty was expressed as the probability of predicting correctly one of three endemicity classes, previously stratified to be an informative guide for malaria control. Population at risk estimates, adjusted for the transmission modifying effects of urbanization in Africa, were then derived with reference to human population surfaces in 2007. Of the 1.38 billion people at risk of stable P. falciparum malaria, 0.69 billion were found in Central and South East Asia (CSE Asia), 0.66 billion in Africa, Yemen, and Saudi Arabia (Africa+), and 0.04 billion in the Americas. All those exposed to stable risk in the Americas were in the lowest endemicity class (PfPR2-10 ≤5%). The vast majority (88%) of those living under stable risk in CSE Asia were also in this low endemicity class; a small remainder (11%) were in the intermediate endemicity class (PfPR2-10 >5% to <40%); and the remaining fraction (1%) in high endemicity (PfPR2-10 ≥40%) areas.
High endemicity was widespread in the Africa+ region, where 0.35 billion people are at this level of risk. Most of the rest live at intermediate risk (0.20 billion), with a smaller number (0.11 billion) at low stable risk. CONCLUSIONS: High levels of P. falciparum malaria endemicity are common in Africa. Uniformly low endemic levels are found in the Americas. Low endemicity is also widespread in CSE Asia, but pockets of intermediate and very rarely high transmission remain. There are therefore significant opportunities for malaria control in Africa and for malaria elimination elsewhere. This 2007 global P. falciparum malaria endemicity map is the first of a series with which it will be possible to monitor and evaluate the progress of this intervention process.
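The three endemicity bands used in the map follow the PfPR2-10 cut-offs quoted above (≤5%, >5% to <40%, ≥40%). A minimal classifier for those bands; the per-region mean PfPR values used to exercise it are illustrative assumptions, while the population-at-risk totals are the regional figures from the abstract:

```python
# Classify a PfPR(2-10) estimate (as a fraction, 0-1) into the three
# endemicity bands used in the 2007 map:
# low (<=5%), intermediate (>5% to <40%), high (>=40%).
def endemicity_class(pfpr: float) -> str:
    if not 0.0 <= pfpr <= 1.0:
        raise ValueError("PfPR must be a fraction between 0 and 1")
    if pfpr <= 0.05:
        return "low"
    if pfpr < 0.40:
        return "intermediate"
    return "high"

# Population at risk (billions) is taken from the abstract; the mean
# PfPR per region is an illustrative assumption, not a mapped value.
regions = {"Americas": (0.04, 0.02),
           "CSE Asia": (0.69, 0.03),
           "Africa+":  (0.66, 0.45)}
for name, (pop, pfpr) in regions.items():
    print(f"{name}: {pop} billion at risk, class {endemicity_class(pfpr)}")
```

Note the boundary handling: 5% itself falls in the low class and 40% in the high class, matching the ≤ and ≥ cut-offs in the text.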
N Engl J Med, 360 (12), pp. 1254. 2009. Antimalarial therapies in children from Papua New Guinea.
N Engl J Med, 360 (12), pp. 1253-1254. 2009. RTS,S/AS01E Vaccine against Malaria: The Authors Reply.
PLoS Med, 6 (3), pp. e44. 2009. Furious rabies after an atypical exposure.
European Journal of Clinical Nutrition, 63 (3), pp. 450-450. 2009. Postpartum traditions and nutrition practices among urban Lao women and their infants in Vientiane, Lao PDR (vol 63, pg 323, 2009).
OBJECTIVE: To explore the cost-effectiveness of artesunate against quinine based principally on the findings of a large multi-centre trial carried out in Southeast Asia. METHODS: Trial data were used to compare mortality of patients with severe malaria treated with either artesunate or quinine. This was combined with retrospectively collected cost data to estimate the incremental cost per death averted with the use of artesunate instead of quinine. RESULTS: The incremental cost per death averted using artesunate was approximately 140 USD. Artesunate maintained this high level of cost-effectiveness even when allowing for the uncertainty surrounding the cost and effectiveness assessments. CONCLUSION: This analysis confirms the vast superiority of artesunate for treatment of severe malaria from an economic as well as a clinical perspective.
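The headline figure above is an incremental cost-effectiveness ratio (ICER): the extra cost of artesunate divided by the extra deaths it averts, per patient treated. A sketch of that arithmetic; the per-patient costs and case-fatality proportions below are assumptions chosen so the ratio lands near the reported 140 USD, not the study's actual inputs:

```python
# Incremental cost per death averted:
#   ICER = (cost_artesunate - cost_quinine) / (mortality_quinine - mortality_artesunate)
# All inputs are illustrative assumptions, NOT the study's data; they
# are picked only so the ratio lands near the reported ~140 USD.
cost_artesunate = 25.0   # assumed mean treatment cost per patient (USD)
cost_quinine = 18.0      # assumed mean treatment cost per patient (USD)
mort_artesunate = 0.15   # assumed case-fatality proportion
mort_quinine = 0.20      # assumed case-fatality proportion

incremental_cost = cost_artesunate - cost_quinine
deaths_averted_per_patient = mort_quinine - mort_artesunate
icer = incremental_cost / deaths_averted_per_patient
print(f"incremental cost per death averted: ${icer:.0f}")
```

The denominator is an absolute risk difference, so a modest mortality advantage translates a small cost increment into a low cost per death averted.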
The effects of loading doses and probenecid coadministration on oseltamivir pharmacokinetics at four increasing dose levels in groups of eight healthy adult Thai volunteers (125 individual series) were evaluated. Doses of up to 675 mg were well-tolerated. The pharmacokinetics were dose linear. Oseltamivir phosphate (OS) was rapidly and completely absorbed and converted (median conversion level, 93%) to the active carboxylate metabolite. Median elimination half-lives (and 95% confidence intervals [CI]) were 1.0 h (0.9 to 1.1 h) for OS and 5.1 h (4.7 to 5.7 h) for oseltamivir carboxylate (OC). One subject repeatedly showed markedly reduced OS-to-OC conversion, indicating constitutionally impaired carboxylesterase activity. The coadministration of probenecid resulted in a mean contraction in the apparent volume of distribution of OC of 40% (95% CI, 37 to 44%) and a reduction in the renal elimination of OC of 61% (95% CI, 58 to 62%), thereby increasing the median area under the concentration-time curve (AUC) for OC by 154% (range, 71 to 278%). The AUC increase for OC in saliva was approximately three times less than the AUC increase for OC in plasma. A loading dose 1.25 times the maintenance dose should be given for severe influenza pneumonia. Probenecid coadministration may allow considerable dose saving for oseltamivir, but more information on OC penetration into respiratory secretions is needed to devise appropriate dose regimens.
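Elimination half-lives like those quoted above are conventionally estimated from the terminal log-linear slope of the concentration-time profile. A sketch on simulated mono-exponential data; the 5.1 h OC half-life from the abstract is used only to generate the fake concentrations, and the sampling times and starting concentration are assumptions:

```python
import math

# Estimate an elimination half-life from the terminal log-linear slope
# of a concentration-time profile. Concentrations are simulated
# mono-exponential decay using the 5.1 h OC half-life reported above;
# in a real analysis they would come from the assay.
true_t_half = 5.1
k = math.log(2) / true_t_half                       # elimination rate constant
times = [4.0, 6.0, 8.0, 12.0, 16.0]                 # h post-dose (assumed)
conc = [100.0 * math.exp(-k * t) for t in times]    # ng/mL (simulated)

# Ordinary least-squares slope of ln(C) versus t.
logs = [math.log(c) for c in conc]
n = len(times)
mt, ml = sum(times) / n, sum(logs) / n
slope = (sum((t - mt) * (l - ml) for t, l in zip(times, logs))
         / sum((t - mt) ** 2 for t in times))
est_t_half = math.log(2) / -slope
print(f"estimated half-life: {est_t_half:.2f} h")
```

With noise-free simulated data the regression recovers the input half-life exactly; with real assay data, only points on the terminal phase should enter the fit.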
BACKGROUND: Statins reduce the rates of heart attacks, strokes, and revascularization procedures (ie, major vascular events) in a wide range of circumstances. Randomized controlled trial data from 20,536 adults have been used to estimate the cost-effectiveness of prescribing statin therapy in the United States for people at different levels of vascular disease risk and to explore whether wider use of generic statins beyond the populations currently recommended for treatment in clinical guidelines is indicated. METHODS AND RESULTS: Randomized controlled trial data, an internally validated vascular disease model, and US costs of statin therapy and other medical care were used to project lifetime risks of vascular events and evaluate the cost-effectiveness of 40 mg simvastatin daily. For an average of 5 years, allocation to simvastatin reduced the estimated US costs of hospitalizations for vascular events by approximately 20% (95% CI, 15 to 24) in the different subcategories of participants studied. At a daily cost of $1 for 40 mg generic simvastatin, the estimated costs of preventing a vascular death within the 5-year study period ranged from a net saving of $1300 (95% CI, $15,600 saving to $13,200 cost) among participants with a 42% 5-year major vascular event risk to a net cost of $216,500 ($123,700 to $460,000 cost) among those with a 12% 5-year risk. The costs per life year gained with lifetime simvastatin treatment ranged from $2500 (-$40 to $3820) in people aged 40 to 49 years with a 42% 5-year major vascular event risk to $10,990 ($9430 to $14,700) in people aged 70 years and older with a 12% 5-year risk. CONCLUSIONS: Treatment with generic simvastatin appears to be cost-effective for a much wider population in the United States than that recommended by current guidelines.
Amodiaquine retains efficacy against infection by chloroquine-resistant Plasmodium falciparum; however, little information is available on its efficacy against infection by chloroquine-resistant Plasmodium vivax. Patients presenting to a rural clinic with a pure P. vivax infection that recurred after recent antimalarial treatment were retreated, this time with amodiaquine monotherapy, and the risk of further recurrence within 4 weeks was assessed. Of the 87 patients with pure P. vivax infection, 15 patients did not complete a full course of treatment, 4 of whom were intolerant to treatment. In the 72 patients completing treatment, 91% (63 of 69) had cleared their parasitemia within 48 h with no early treatment failure. Follow-up to day 28 or recurrent parasitemia was achieved for 56 patients (78%). The cumulative incidence of treatment failure by day 28 was 22.8% (95% confidence interval, 7.3 to 38%). The in vitro sensitivity profile was determined for a separate set of isolates from outpatients with pure P. vivax infection. The median 50% inhibitory concentration of amodiaquine was 11.3 nM (range, 0.37 to 95.8) and was correlated significantly with that of chloroquine (Spearman rank correlation coefficient, 0.602; P < 0.001). Although amodiaquine results in a rapid clinical response, the risk of recurrence by day 28 is unacceptably high, reducing its suitability as an alternative treatment of infection by chloroquine-resistant P. vivax in this region.
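A cumulative incidence of treatment failure with incomplete follow-up, as in the study above where only 56 of 72 patients reached day 28 or failure, is normally estimated with a Kaplan-Meier-type estimator rather than a crude proportion. A compact sketch on invented follow-up records, not the study's data:

```python
# Kaplan-Meier estimate of cumulative incidence of treatment failure,
# allowing for censoring (patients lost before day 28).
# (day, failed) pairs are invented for illustration; failed=False
# means the patient was censored at that day.
followup = [(7, True), (10, False), (14, True), (14, True),
            (21, False), (21, True), (28, False), (28, False)]

surv = 1.0
at_risk = len(followup)
for day in sorted({d for d, _ in followup}):
    failures = sum(1 for d, f in followup if d == day and f)
    if failures:
        surv *= (at_risk - failures) / at_risk   # step down at failure days
    at_risk -= sum(1 for d, _ in followup if d == day)  # remove leavers

cum_incidence = 1 - surv
print(f"KM cumulative incidence by day 28: {cum_incidence:.1%}")
```

Censored patients contribute to the risk set up to the day they leave, which is why the estimator differs from simply dividing failures by patients enrolled.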
Streptococcus suis infection is acquired through exposure to contaminated pigs or pig meat. Over the past few years, the number of reported S. suis infections in humans has increased significantly, with most cases originating in Southeast Asia, where there is a high density of pigs. Increased awareness, improved diagnostics, and the occurrence of outbreaks have contributed to this increase. Meningitis and sepsis are the most common clinical manifestations of S. suis infection; hearing loss is a frequent complication. In this article, we provide an overview of the emergence and clinical manifestations of S. suis infection.
BACKGROUND: Surveillance for invasive pneumococcal disease has been conducted using a variety of case ascertainment methods and diagnostic tools. Interstudy differences in observed rates of invasive pneumococcal disease could reflect variations in surveillance methods or true epidemiological differences in disease incidence. To facilitate comparisons of surveillance data among countries, investigators of Pneumococcal Vaccines Accelerated Development and Introduction Plan-sponsored projects have developed standard case definitions and data reporting methods. METHODS: Investigators developed case definitions for meningitis, pneumonia, and very severe disease using existing World Health Organization guidelines and clinical definitions from Africa and Asia. Standardized case definitions were used to standardize reporting of aggregated results. Univariate analyses were conducted to compare results among countries and to identify factors contributing to detection of Streptococcus pneumoniae. RESULTS: Surveillance sites varied with regard to the age groups targeted, disease syndromes monitored, specimens collected, and laboratory methods employed. The proportion of specimens positive for pneumococcus was greater for cerebrospinal fluid specimens (1.2%-19.4%) than for blood specimens (0.1%-1.4%) in all countries (range, 1.3-38-fold greater). The distribution of disease syndromes and pneumonia severity captured by surveillance differed among countries. The proportion of disease cases with pneumococcus detected varied by syndrome (meningitis, 1.4%-10.8%; pneumonia, 0.2%-1.3%; other, 0.2%-1.2%) and illness severity (nonsevere pneumonia, 0%-2.7%; severe pneumonia, 0.2%-1.2%), although these variations were not consistent for all sites. Antigen testing and polymerase chain reaction increased the proportion of cerebrospinal fluid specimens with pneumococcus identified by 1.3-5.5-fold, compared with culture alone. 
CONCLUSIONS: Standardized case definitions and data reporting enhanced our understanding of pneumococcal epidemiology and enabled us to assess the contributions of specimen type, disease syndrome, pneumonia severity, and diagnostic tools to rate of pneumococcal detection. Broader standardization and more-detailed data reporting would further improve interpretation of surveillance results.
BACKGROUND: Accurate etiological diagnosis of meningitis in developing countries is needed to improve clinical care and to optimize disease-prevention strategies. Cerebrospinal fluid (CSF) culture and latex agglutination testing are currently the standard diagnostic methods but lack sensitivity. METHODS: We prospectively assessed the utility of an immunochromatographic test (ICT) of pneumococcal antigen (NOW Streptococcus pneumoniae Antigen Test; Binax), compared with culture, in 5 countries that are conducting bacterial meningitis surveillance in Africa and Asia. Most CSF samples were collected from patients aged 1-59 months. RESULTS: A total of 1173 CSF samples from suspected meningitis cases were included. The ICT results were positive for 68 (99%) of the 69 culture-confirmed pneumococcal meningitis cases and negative for 124 (99%) of 125 culture-confirmed bacterial meningitis cases caused by other pathogens. By use of culture and latex agglutination testing alone, pneumococci were detected in samples from 7.4% of patients in Asia and 15.6% in Africa. The ICT increased pneumococcal detection, resulting in similar identification rates across sites, ranging from 16.2% in Nigeria to 20% in Bangladesh. ICT detection in specimens from culture-negative cases varied according to region (8.5% in Africa vs. 18.8% in Asia; P < .001), prior antibiotic use (24.2% with prior antibiotic use vs. 12.2% without; P < .001), and WBC count (9.0% for WBC count of 10-99 cells/mL, 22.1% for 100-999 cells/mL, and 25.4% for ≥1000 cells/mL; P < .001 by test for trend). CONCLUSIONS: The ICT provided substantial benefit over the latex agglutination test and culture at Asian sites but not at African sites. With the addition of the ICT, the proportion of meningitis cases attributable to pneumococci was determined to be similar in Asia and Africa. These results suggest that previous studies have underestimated the proportion of pediatric bacterial meningitis cases caused by pneumococci.
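The headline "99%" figures follow directly from the counts reported in the abstract; a minimal sketch of the sensitivity and specificity calculation for the ICT against culture confirmation (counts taken from the abstract, nothing else assumed):

```python
# Sensitivity/specificity of the ICT against culture-confirmed meningitis,
# using only the counts reported in the abstract.
def proportion_pct(numerator, denominator):
    """Return a percentage rounded to the nearest whole number."""
    return round(100 * numerator / denominator)

# 68 of 69 culture-confirmed pneumococcal cases were ICT positive.
sensitivity = proportion_pct(68, 69)
# 124 of 125 culture-confirmed non-pneumococcal cases were ICT negative.
specificity = proportion_pct(124, 125)
print(sensitivity, specificity)  # both round to 99 (%)
```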
In a region with high rates of mortality among children aged <5 years, the underfunded health care systems of sub-Saharan Africa have few resources available to perform surveillance activities that can help determine the causes of morbidity and mortality in the region. At present, there are few examples of attempts to promote public health care surveillance that might inform current debates about how to expand and improve surveillance, particularly for bacterial diseases. Driven by this gap in knowledge, we attempted to explore the successes and failures of the Network for Surveillance of Pneumococcal Disease in the East African Region and to share the experiences of what are essentially nonresearch public-sector hospitals in East Africa, in the hope that surveillance systems for other diseases, especially those that require complex diagnostic support, may be informed by these experiences. The state of services essential for surveillance and the measures taken to overcome any shortcomings are described, as is the progress made in improving clinical diagnosis, laboratory processing, and data management. For surveillance to play a role in public health care, ministries of health and associated institutions must own and push forward the surveillance agenda, with support from global partners, and take advantage of the developments that have been achieved within the institutions.
BACKGROUND: Case management guidelines use a limited set of clinical features to guide assessment and treatment for common childhood diseases in poor countries. Using video records of clinical signs we assessed agreement among experts and assessed whether Kenyan health workers could identify signs defined by expert consensus. METHODOLOGY: 104 videos representing 11 clinical sign categories were presented to experts using a web questionnaire. Proportionate agreement and agreement beyond chance were calculated using kappa and the AC1 statistic. 31 videos were selected and presented to local health workers, 20 for which experts had demonstrated clear agreement and 11 for which experts could not demonstrate agreement. PRINCIPAL FINDINGS: Experts reached very high level of chance adjusted agreement for some videos while for a few videos no agreement beyond chance was found. Where experts agreed Kenyan hospital staff of all cadres recognised signs with high mean sensitivity and specificity (sensitivity: 0.897-0.975, specificity: 0.813-0.894); years of experience, gender and hospital had no influence on mean sensitivity or specificity. Local health workers did not agree on videos where experts had low or no agreement. Results of different agreement statistics for multiple observers, the AC1 and Fleiss' kappa, differ across the range of proportionate agreement. CONCLUSION: Videos provide a useful means to test agreement amongst geographically diverse groups of health workers. Kenyan health workers are in agreement with experts where clinical signs are clear-cut supporting the potential value of assessment and management guidelines. However, clinical signs are not always clear-cut. Video recordings offer one means to help standardise interpretation of clinical signs.
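The abstract's observation that kappa and the AC1 statistic differ across the range of proportionate agreement can be illustrated with a two-rater, binary-sign toy example. The formulas below are the standard two-rater forms of Cohen's kappa and Gwet's AC1 (the study itself used the multi-observer Fleiss' kappa and AC1); the ratings are invented for illustration:

```python
# Toy illustration of why chance-adjusted agreement statistics can diverge:
# two-rater Cohen's kappa vs Gwet's AC1 for a binary clinical sign.
def kappa_and_ac1(a, b):
    n = len(a)
    pa = sum(x == y for x, y in zip(a, b)) / n    # observed agreement
    p1, p2 = sum(a) / n, sum(b) / n               # each rater's "sign present" rate
    pe_kappa = p1 * p2 + (1 - p1) * (1 - p2)      # kappa's chance-agreement term
    pi = (p1 + p2) / 2
    pe_ac1 = 2 * pi * (1 - pi)                    # AC1's chance-agreement term
    return (pa - pe_kappa) / (1 - pe_kappa), (pa - pe_ac1) / (1 - pe_ac1)

# 20 videos of a rare sign: the raters agree on 19/20, and the sign is
# marked "present" only once, by one rater.
kappa, ac1 = kappa_and_ac1([1] + [0] * 19, [0] * 20)
print(round(kappa, 2), round(ac1, 2))  # kappa is 0 here, AC1 ≈ 0.95
```

Despite 95% proportionate agreement, kappa collapses to zero because of the skewed prevalence, while AC1 remains high; this is the kind of divergence the abstract reports.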
OBJECTIVES: Assessments of population-level effects of antiretroviral therapy (ART) programmes in Africa are rare. We use data from burial sites to estimate trends in adult AIDS mortality and the mitigating effects of ART in Addis Ababa. ART has been available since 2003, and for free since 2005. METHODS: To substitute for deficient vital registration, we use surveillance of burials at all cemeteries. We present trends in all-cause mortality, and estimate AIDS mortality (ages 20-64 years) from lay reports of causes of death. These lay reports are first used as a diagnostic test for the true cause of death. As reference standard, we use the cause of death established via verbal autopsy interviews conducted in 2004. The positive predictive value and sensitivity are subsequently used as anchors to estimate the number of AIDS deaths for the period 2001-2007. Estimates are compared with Spectrum projections. RESULTS: Between 2001 and 2005, the number of AIDS deaths declined by 21.9 and 9.3% for men and women, respectively. Between 2005 and 2007, the number of AIDS deaths declined by 38.2 for men and 42.9% for women. Compared with the expected number in the absence of ART, the reduction in AIDS deaths in 2007 is estimated to be between 56.8 and 63.3%, depending on the coverage of the burial surveillance. CONCLUSION: Five years into the ART programme, adult AIDS mortality has been reduced by more than half. Following the free provision of ART in 2005, the decline accelerated and became more sex balanced. Substantial AIDS mortality, however, persists.
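The anchoring step described above, scaling lay-reported causes of death by the positive predictive value and sensitivity obtained against the verbal-autopsy reference standard, can be sketched as follows. The numeric values are illustrative assumptions, not the study's estimates:

```python
def estimate_aids_deaths(lay_reported_aids, ppv, sensitivity):
    """Anchor lay reports to a reference standard:
    true positives among lay reports = lay_reported * PPV;
    dividing by sensitivity scales up for AIDS deaths the lay reports missed."""
    return lay_reported_aids * ppv / sensitivity

# Illustrative values only (the study derived PPV and sensitivity from the
# 2004 verbal-autopsy interviews): 1000 lay-reported AIDS deaths, assumed
# PPV 0.85 and sensitivity 0.70.
print(round(estimate_aids_deaths(1000, 0.85, 0.70)))  # -> 1214 estimated deaths
```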
DNA microarrays and specific reverse-transcription polymerase chain reaction assays were used to reveal transcriptional patterns in the blood of children presenting with dengue shock syndrome (DSS) and well-matched patients with uncomplicated dengue. The transcriptome of patients with acute uncomplicated dengue was characterized by a metabolically demanding "host-defense" profile; transcripts related to oxidative metabolism, interferon signaling, protein ubiquitination, apoptosis, and cytokines were prominent. In contrast, the transcriptome of patients with DSS was surprisingly benign, particularly with regard to transcripts derived from apoptotic and type I interferon pathways. These data highlight significant heterogeneity in the type or timing of host transcriptional immune responses precipitated by dengue virus infection, independent of the duration of illness. In particular, they suggest that, if transcriptional events in the blood compartment contribute to capillary leakage leading to hypovolemic shock, they occur before cardiovascular decompensation, a finding that has implications for rational adjuvant therapy in this syndrome.
The tumor necrosis factor gene (TNF) and lymphotoxin-alpha gene (LTA) have long attracted attention as candidate genes for susceptibility traits for malaria, and several of their polymorphisms have been found to be associated with severe malaria (SM) phenotypes. In a large study involving >10,000 individuals and encompassing 3 African populations, we found evidence to support the reported associations between the TNF -238 polymorphism and SM in The Gambia. However, no TNF/LTA polymorphisms were found to be associated with SM in cohorts in Kenya and Malawi. It has been suggested that the causal polymorphisms regulating the TNF and LTA responses may be located some distance from the genes. Therefore, more-detailed mapping of variants across TNF/LTA genes and their flanking regions in the Gambian and allied populations may need to be undertaken to find any causal polymorphisms.
BACKGROUND: Most malaria deaths occur in rural areas. Rapid progression from illness to death can be interrupted by prompt, effective medication. Antimalarial treatment cannot rescue terminally ill patients but could be effective if given earlier. If patients who cannot be treated orally are several hours from facilities for injections, rectal artesunate can be given before referral and acts rapidly on parasites. We investigated whether this intervention reduced mortality and permanent disability. METHODS: In Bangladesh, Ghana, and Tanzania, patients with suspected severe malaria who could not be treated orally were allocated randomly to a single artesunate (n=8954) or placebo (n=8872) suppository by taking the next numbered box, then referred to clinics at which injections could be given. Those with antimalarial injections or negative blood smears before randomisation were excluded, leaving 12 068 patients (6072 artesunate, 5996 placebo) for analysis. Primary endpoints were mortality, assessed 7-30 days later, and permanent disability, reassessed periodically. All investigators were masked to group assignment. Analysis was by intention to treat. This study is registered in all three countries, numbers ISRCTN83979018, 46343627, and 76987662. RESULTS: Mortality was 154 of 6072 artesunate versus 177 of 5996 placebo (2.5% vs 3.0%, p=0.1). Two versus 13 (0.03% vs 0.22%, p=0.0020) were permanently disabled; total dead or disabled: 156 versus 190 (2.6% vs 3.2%, p=0.0484). There was no reduction in early mortality (56 vs 51 deaths within 6 h; median 2 h). In patients reaching clinic within 6 h (median 3 h), pre-referral artesunate had no significant effect on death after 6 h or permanent disability (71/4450 [1.6%] vs 82/4426 [1.9%], risk ratio 0.86 [95% CI 0.63-1.18], p=0.35).
In patients still not in clinic after more than 6 h, however, half were still not there after more than 15 h, and pre-referral rectal artesunate significantly reduced death or permanent disability (29/1566 [1.9%] vs 57/1519 [3.8%], risk ratio 0.49 [95% CI 0.32-0.77], p=0.0013). INTERPRETATION: If patients with severe malaria cannot be treated orally and access to injections will take several hours, a single inexpensive artesunate suppository at the time of referral substantially reduces the risk of death or permanent disability. FUNDING: UNICEF/UNDP/World Bank Special Programme for Research and Training in Tropical Diseases (WHO/TDR); WHO Global Malaria Programme (WHO/GMP); Sall Family Foundation; the European Union (QLRT-2000-01430); the UK Medical Research Council; USAID; Irish Aid; the Karolinska Institute; and the University of Oxford Clinical Trial Service Unit (CTSU).
Background: Shigella dysenteriae type 1 (Sd1) is a cause of major dysentery outbreaks, particularly among children and displaced populations in tropical countries. Although outbreaks continue, the characteristics of such outbreaks have rarely been documented. Here, we describe the Sd1 outbreaks occurring between 1993 and 1995 in 11 refugee settlements in Rwanda, Tanzania and Democratic Republic of the Congo (DRC). We also explored the links between the different types of camps and the magnitude of the outbreaks. Methodology/Principal Findings: Numbers of cases of bloody diarrhea and deaths were collected on a weekly basis in 11 refugee camps and analyzed retrospectively. Between November 1993 and February 1995, 181,921 cases of bloody diarrhea were reported. Attack rates ranged from 6.3% to 39.1% and case fatality ratios (CFRs) from 1.5% to 9.0% (available for 5 camps). The CFRs were higher in children under age 5. In Tanzania, where the response was rapidly deployed, the mean attack rate was lower than in camps in the region of Goma without an immediate response (13.3% versus 32.1%, respectively). Conclusions/Significance: This description, and the areas where data are missing, highlight the importance of collecting data in future epidemics, the difficulties of documenting outbreaks occurring in complex emergencies and, most importantly, the need to ensure that minimal requirements are met.
Understanding the evolution of drug resistance in malaria is a central area of study at the intersection of evolution and medicine. Antimalarial drug resistance is a major threat to malaria control and directly related to trends in malaria attributable mortality. Artemisinin combination therapies (ACT) are now recommended worldwide as first-line treatment for uncomplicated malaria, and losing them to resistance would be a disaster for malaria control. Understanding the emergence and spread of antimalarial drug resistance in the context of different scenarios of antimalarial drug use is essential for the development of strategies protecting ACTs. In this study, we review the basic mechanisms of resistance emergence and describe several simple equations that can be used to estimate the probabilities of de novo resistance mutations at three stages of the parasite life cycle: sporozoite, hepatic merozoite and asexual blood stages; we discuss the factors that affect parasite survival in a single host in the context of different levels of antimalarial drug use, immunity and parasitaemia. We show that in the absence of drug effects, and despite very different parasite numbers, the probability of resistance emerging at each stage is very low and similar in all stages (for example, a per-infection probability of 10^-10 to 10^-9 if the per-parasite chance of mutation is 10^-10 per asexual division). However, under the selective pressure provided by antimalarial treatment and particularly in the presence of hyperparasitaemia, the probability of resistance emerging in the blood stage of the parasite can be approximately five orders of magnitude higher than in the absence of drugs. Detailed models built upon these basic methods should allow us to assess the relative probabilities of resistance emergence in the different phases of the parasite life cycle.
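A minimal numerical sketch of the kind of calculation the review describes, not the paper's own model: with a per-parasite mutation probability u per asexual division and a probability s that a lone mutant lineage survives, the chance that at least one surviving resistant mutant arises among n parasites is 1 - (1 - u·s)^n. The survival values below are assumptions chosen only to reproduce the orders of magnitude quoted in the abstract:

```python
import math

def p_de_novo_resistance(n_parasites, u=1e-10, survival=1.0):
    """P(at least one surviving resistant mutant) = 1 - (1 - u*survival)**n,
    computed stably via log1p/expm1 for tiny per-parasite probabilities."""
    return -math.expm1(n_parasites * math.log1p(-u * survival))

# Blood stage, ~1e11 parasites. Without drugs a lone mutant lineage almost
# never establishes (survival ~1e-10, an assumed value), giving the
# ~1e-10 to 1e-9 per-infection probability quoted in the abstract.
no_drug = p_de_novo_resistance(1e11, survival=1e-10)
# Under drug selection the mutant lineage survives far more often
# (survival ~1e-5 assumed): roughly five orders of magnitude higher risk.
with_drug = p_de_novo_resistance(1e11, survival=1e-5)
print(no_drug, with_drug, with_drug / no_drug)
```

Note how the five-orders-of-magnitude gap comes entirely from the mutant's survival probability under selection, not from any change in parasite number or mutation rate.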
OBJECTIVES: Sepsis is associated with immunosuppression (characterized by a reduced capacity of circulating monocytes to release proinflammatory cytokines), which has been implicated in late mortality. Melioidosis, caused by the Gram-negative bacterium Burkholderia pseudomallei, is an important cause of community-acquired sepsis in Southeast Asia with a mortality of up to 40%. Previous in vitro and murine studies have suggested a key role for the so-called negative regulators of the toll-like receptor (TLR) signaling pathway in immunosuppression. In this study, we investigated the expression of these negative TLR regulators in patients with septic melioidosis in association with the responsiveness of peripheral blood leukocytes of these patients to lipopolysaccharide and B. pseudomallei. DESIGN: Ex vivo study. SETTING: Academic research laboratory. PATIENTS: Thirty-two healthy controls and 34 patients with sepsis caused by B. pseudomallei. INTERVENTIONS: None. MEASUREMENTS: 1) Plasma cytokine levels; 2) ex vivo cytokine production capacity of whole blood; and 3) purified mononuclear cell-derived messenger RNA (mRNA) levels of key inhibitory molecules of the TLR-signaling cascade were investigated. MAIN RESULTS: In accordance with an immunosuppressed state, whole blood of patients demonstrated a strongly decreased capacity to release the proinflammatory cytokines tumor necrosis factor-[alpha], interleukin-1[beta], and the chemokine interleukin-8 after ex vivo stimulation with lipopolysaccharide or B. pseudomallei. Analysis of myeloid-differentiation-88-short, interleukin-1R-associated-kinase (IRAK)-M, IRAK-1, suppressor-of-cytokine signaling-3, Src-homology-2-domain-containing inositol-5-phosphatase-1, single-immunoglobulin-interleukin-1R-related-molecule, and A20 mRNA expression in purified mononuclear cells showed decreased IRAK-1 and elevated IRAK-M expression in patients with septic melioidosis. 
Immunosuppression was correlated with mortality; furthermore, patients who eventually died had higher IRAK-M mRNA levels on admission than the patients who survived. CONCLUSIONS: Immunosuppression in sepsis caused by B. pseudomallei is associated with an upregulation of IRAK-M and is an indicator of poor outcome.
OBJECTIVE: Markers of oxidative stress are reported to be increased in severe malaria. It has been suggested that the antioxidant N-acetylcysteine (NAC) may be beneficial in treatment. We studied the efficacy and safety of parenteral NAC as an adjunct to artesunate treatment of severe falciparum malaria. DESIGN: A randomized, double-blind, placebo-controlled trial on the use of high-dose intravenous NAC as adjunctive treatment to artesunate. SETTING: A provincial hospital in Western Thailand and a tertiary referral hospital in Chittagong, Bangladesh. PATIENTS: One hundred eight adult patients with severe falciparum malaria. INTERVENTIONS: Patients were randomized to receive NAC or placebo as an adjunctive treatment to intravenous artesunate. MEASUREMENTS AND MAIN RESULTS: A total of 56 patients were treated with NAC and 52 received placebo. NAC had no significant effect on mortality, lactate clearance times (p = 0.74), or coma recovery times (p = 0.46). Parasite clearance time was increased from 30 hours (range, 6-144 hours) to 36 hours (range, 6-120 hours) (p = 0.03), but this could be explained by differences in admission parasitemia. Urinary F2-isoprostane metabolites, measured as a marker of oxidative stress, were increased in severe malaria compared with patients with uncomplicated malaria and healthy volunteers. Admission red cell rigidity correlated with mortality, but did not improve with NAC. CONCLUSION: Systemic oxidative stress is increased in severe malaria. Treatment with NAC had no effect on outcome in patients with severe falciparum malaria in this setting.
Orientia tsutsugamushi, the cause of scrub typhus, is a major pathogen in the Asia-Pacific region. The severity of infection ranges from mild features to multiorgan failure and death. The aim of this prospective study was to define the O. tsutsugamushi loads in the blood samples of patients with scrub typhus on the day of hospital admission and to determine whether this was associated with disease severity. Quantitation was performed using a real-time PCR assay targeting the 16S rRNA gene of O. tsutsugamushi. A total of 155 patients with a confirmed diagnosis of scrub typhus had a median (interquartile range [IQR], range) O. tsutsugamushi DNA load in blood of 13 (0 to 334, 0 to 310,253) copies/ml. This included 74 patients who had undetectable bacterial loads. An analysis of bacterial load versus clinical features for all 155 patients demonstrated that duration of illness (P < 0.001), presence of eschar (P = 0.004), and concentrations of aspartate aminotransferase, alanine aminotransferase, and alkaline phosphatase (P < 0.001 for all three) were positively correlated with bacterial load. Patients who died had a significantly higher bacterial load than those who survived (mean [standard deviation] values: 17,154 [12.7] versus 281 [5.2] copies/ml; P < 0.001). This study has demonstrated a relationship between bacterial load and disease severity in adults with scrub typhus.
Multiple displacement amplification (MDA) using phi29 has proved to be an efficient, high-fidelity method for whole genome amplification in many organisms. This project was designed to evaluate this approach for use with the malaria parasite Plasmodium falciparum. In particular, we were concerned that the AT richness and presence of contaminating human DNA could limit efficiency of MDA in this system. We amplified 60 DNA samples using phi29 and scored 14 microsatellites, 9 single-nucleotide polymorphisms (SNPs), and gene copy number at GTP-cyclohydrolase I both before and after MDA. We observed 100% concordance in 829 microsatellite genotypes and in 499 SNP genotypes. Furthermore, copy number estimates for the GTP-cyclohydrolase I gene were correlated (r² = 0.67) in pre- and postamplification samples. These data confirm that MDA permits scoring of a range of different types of polymorphisms in P. falciparum malaria and can be used to extend the life of valuable DNA stocks.
By contrast with high-income countries, Staphylococcus aureus disease ranks low on the public-health agenda in low-income countries. We undertook a literature review of S aureus disease in resource-limited countries in south and east Asia, and found that its neglected status as a developing world pathogen does not equate with low rates of disease. The incidence of the disease seems to be highest in neonates, its range of clinical manifestations is as broad as that seen in other settings, and the mortality rate associated with serious S aureus infection, such as bacteraemia, is as high as 50%. The prevalence of meticillin-resistant S aureus (MRSA) infection across much of resource-limited Asia is largely unknown. Antibiotic drugs are readily and widely available from pharmacists in most parts of Asia, where ease of purchase and frequent self-medication are likely to be major drivers in the emergence of drug resistance. In our global culture, the epidemiology of important drug-resistant pathogens in resource-limited countries is inextricably linked with the health of both developing and developed communities. An initiative is needed to raise the profile of S aureus disease in developing countries, and to define a programme of research to find practical solutions to the health-care challenges posed by this important global pathogen.
Rapid emergence of third generation cephalosporin resistant Shigella spp. in Southern Vietnam. J Med Microbiol, 58 (Pt 2), pp. 281-283, 2009.
Maturation of Plasmodium falciparum decreases the deformability of infected red blood cells (RBCs), increasing their clearance as they attempt to pass through endothelial slits of the splenic sinus. Previous studies of Plasmodium vivax-infected RBCs led to opposite conclusions with respect to cellular deformability. To resolve this controversy, P. vivax-infected RBCs were passed through a 2-microm microfluidic channel. In contrast to P. falciparum-infected RBCs, mature P. vivax-infected RBCs readily became deformed through 2-microm constrictions. After this extreme deformation, 67% of P. vivax-infected RBCs recovered a normal appearance; however, 15% of uninfected RBCs were destroyed. Results suggest mechanisms for both avoidance of splenic clearance and anemia in vivax malaria.
No association between human herpesvirus 6 reactivation and cryptococcosis in human immunodeficiency virus-infected patients. J Med Microbiol, 58 (Pt 2), pp. 276-277, 2009.
The frequency of typical and atypical Beijing strains of Mycobacterium tuberculosis was determined in the Netherlands; Vietnam; and Hong Kong Special Administrative Region, People's Republic of China. The strains' associations with drug resistance, M. bovis BCG vaccination, and patient characteristics were assessed. BCG vaccination may have positively selected the prevalent typical Beijing strains.
BACKGROUND: Studies of glycoconjugate vaccines have traditionally used an immune challenge with a plain polysaccharide vaccine to demonstrate immunologic memory. Plain polysaccharide vaccines are poorly immunogenic in children and can induce subsequent immunologic hyporesponsiveness. We therefore assessed the use of glycoconjugate vaccines as an alternative method of demonstrating immunologic memory. METHODS: Children immunized with hepatitis B vaccine or serogroup C meningococcal glycoconjugate vaccine (MenCC) at 2, 3, and 4 months of age received a plain polysaccharide meningococcal serogroup A/C vaccine (MenACP) or MenCC at age 12 months. A post hoc analysis of serum bactericidal activity responses to MenCC assessed whether this differed in MenCC-primed and MenCC-naive infants. RESULTS: MenCC-primed children displayed higher geometric mean serum bactericidal titers than MenCC-naive children following MenACP (1518 compared with 30; P = 0.003). A similar difference was seen after a dose of MenCC to toddlers (MenCC-primed: 8663, MenCC-naive: 710; P < 0.001). The latter comparison became borderline significant after adjusting for the higher serum bactericidal geometric mean titers before toddler immunization in the MenCC-primed group (P = 0.068). CONCLUSIONS: Administration of glycoconjugate vaccines provides an important alternative method of demonstrating immunologic memory, avoiding the use of plain polysaccharide vaccines that are potentially deleterious in children. This has implications for the design of all future clinical trials of glycoconjugate vaccines.
AIM: To provide data on changes in illegal drug use in women following imprisonment. DESIGN: Prospective cohort study. SETTING: Recruitment took place in two prisons in the Midlands and South-East England and follow-up in 13 prisons across England. PARTICIPANTS: A total of 505 women prisoners participated, a response rate of 82%. MEASUREMENTS: Questions about drug use were contained within a questionnaire that examined broad aspects of health. On entry into prison, women answered questions about daily drug use and injecting drug use prior to imprisonment. One month later the questionnaires examined drug use during this period of imprisonment. FINDINGS: Prior to imprisonment, 53% [95% confidence interval (CI): 49-58%] of women took at least one illegal drug daily and 38% (CI: 34-42%) said they had ever injected drugs. Following imprisonment, some women continued to use drugs; 14% (CI: 10-20%) of women reported using at least one illegal drug daily and 2% (CI: 0.7-5%) of women had injected drugs. There were important changes in the types of drugs used: prior to imprisonment, women most commonly used crack and heroin, but in prison the two most commonly used illegal drugs were benzodiazepines and opiate substitutes. CONCLUSIONS: The study provides quantitative evidence of the impact of imprisonment on drug use among women. It highlights the need for enhanced drug treatment services and stronger measures to reduce the availability of illegal drugs to women in prison.
Bipolar elongation of filaments of the bacterial actin homolog ParM drives movement of newly replicated plasmid DNA to opposite poles of a bacterial cell. We used a combination of vitreous sectioning and electron cryotomography to study this DNA partitioning system directly in native, frozen cells. The diffraction patterns from overexpressed ParM bundles in electron cryotomographic reconstructions were used to unambiguously identify ParM filaments in Escherichia coli cells. Using a low-copy number plasmid encoding components required for partitioning, we observed small bundles of three to five intracellular ParM filaments that were situated close to the edge of the nucleoid. We propose that this may indicate the capture of plasmid DNA within the periphery of this loosely defined, chromosome-containing region.
CONTEXT: Ready-to-use therapeutic foods (RUTFs) are an important component of effective outpatient treatment of severe wasting. However, their effectiveness in the population-based prevention of moderate and severe wasting has not been evaluated. OBJECTIVE: To evaluate the effect of a 3-month distribution of RUTF on the nutritional status, mortality, and morbidity of children aged 6 to 60 months in Niger. DESIGN, SETTING, AND PARTICIPANTS: A cluster randomized trial of 12 villages in Maradi, Niger. Six villages were randomized to intervention and 6 to no intervention. All children in the study villages aged 6 to 60 months were eligible for recruitment. INTERVENTION: Children with weight-for-height 80% or more of the National Center for Health Statistics reference median in the 6 intervention villages received a monthly distribution of 1 packet per day of RUTF (92 g [500 kcal/d]) from August to October 2006. Children in the 6 nonintervention villages received no preventive supplementation. Active surveillance for conditions requiring medical or nutritional treatment was conducted monthly in all 12 study villages from August 2006 to March 2007. MAIN OUTCOME MEASURES: Changes in weight-for-height z score (WHZ) according to the World Health Organization Child Growth Standards and incidence of wasting (WHZ <-2) over 8 months of follow-up. RESULTS: The number of children with height and weight measurements in August, October, December, and February was 3166, 3110, 2936, and 3026, respectively. The WHZ difference between the intervention and nonintervention groups was -0.10 z (95% confidence interval [CI], -0.23 to 0.03) at baseline and 0.12 z (95% CI, 0.02 to 0.21) after 8 months of follow-up. The adjusted effect of the intervention on WHZ from baseline to the end of follow-up was thus 0.22 z (95% CI, 0.13 to 0.30). 
The absolute rate of wasting and severe wasting, respectively, was 0.17 events per child-year (140 events/841 child-years) and 0.03 events per child-year (29 events/943 child-years) in the intervention villages, compared with 0.26 events per child-year (233 events/895 child-years) and 0.07 events per child-year (71 events/1029 child-years) in the nonintervention villages. The intervention thus resulted in a 36% (95% CI, 17% to 50%; P < .001) reduction in the incidence of wasting and a 58% (95% CI, 43% to 68%; P < .001) reduction in the incidence of severe wasting. There was no reduction in mortality, with a mortality rate of 0.007 deaths per child-year (7 deaths/986 child-years) in the intervention villages and 0.016 deaths per child-year (18 deaths/1099 child-years) in the nonintervention villages (adjusted hazard ratio, 0.51; 95% CI, 0.25 to 1.05). CONCLUSION: Short-term supplementation of nonmalnourished children with RUTF reduced the decline in WHZ and the incidence of wasting and severe wasting over 8 months. TRIAL REGISTRATION: clinicaltrials.gov Identifier: NCT00682708.
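The crude incidence-rate reductions can be recomputed directly from the event counts quoted above as a quick check (the paper's estimates are adjusted for clustering and covariates, which is why the severe-wasting figure differs slightly from this crude calculation):

```python
# Crude incidence-rate reductions from the reported event counts.
# The paper's adjusted estimates account for clustering and covariates,
# so they differ slightly from these crude figures.
def rate_reduction(events_i, py_i, events_c, py_c):
    """Percent reduction: 1 - (intervention rate / control rate)."""
    irr = (events_i / py_i) / (events_c / py_c)
    return 100.0 * (1.0 - irr)

wasting = rate_reduction(140, 841, 233, 895)   # crude ~36%, matching the reported 36%
severe = rate_reduction(29, 943, 71, 1029)     # crude ~55%, versus the adjusted 58%
print(f"wasting: {wasting:.0f}%  severe wasting: {severe:.0f}%")
```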
The hepatitis C virus (HCV), which currently infects an estimated 3% of people worldwide, has been present in some human populations for several centuries, notably HCV genotypes 1 and 2 in West Africa and genotype 6 in Southeast Asia. Here we use newly developed methods of sequence analysis to conduct the first comprehensive investigation of the epidemic and evolutionary history of HCV in Asia. Our analysis includes new HCV core (n = 16) and NS5B (n = 14) gene sequences, obtained from serum samples of jaundiced patients from Laos. These exceptionally diverse isolates were analyzed in conjunction with all available reference strains using phylogenetic and Bayesian coalescent methods. We performed statistical tests of phylogeographic structure and applied a recently developed "relaxed molecular clock" approach to HCV for the first time, which indicated an unexpectedly high degree of rate variation. Our results reveal a >1,000-year-long development of genotype 6 in Asia, characterized by substantial phylogeographic structure and two distinct phases of epidemic history, before and during the 20th century. We conclude that HCV lineages representing preexisting and spatially restricted strains were involved in multiple, independent local epidemics during the 20th century. Our analysis explains the generation and maintenance of HCV diversity in Asia and could provide a template for further investigations of HCV spread in other regions.
Bartonella clarridgeiae in a cat in the UK. Vet Rec, 164 (2), pp. 58-59, 2009.
BACKGROUND: The understanding of the epidemiology of severe malaria in African children remains incomplete across the spectrum of Plasmodium falciparum transmission intensities through which communities might expect to transition as intervention coverage expands. METHODS: Paediatric admission data were assembled from 13 hospitals serving 17 communities between 1990 and 2007. Estimates of Plasmodium falciparum transmission intensity in these communities were assembled to be spatially and temporally congruent with the clinical admission data. The analysis focused on the relationships between community-derived parasite prevalence and the age and clinical presentation of paediatric malaria in children aged 0-9 years admitted to hospital. RESULTS: As transmission intensity declined, a greater proportion of malaria admissions were in older children. There was a strong linear relationship between increasing transmission intensity and the proportion of paediatric malaria admissions that were infants (R2 = 0.73, p < 0.001). Cerebral malaria was reported among 4% and severe malaria anaemia among 17% of all malaria admissions. Cerebral malaria was a less common presentation at higher transmission intensities than at lower-transmission sites. There was no obvious relationship between the proportion of children with severe malaria anaemia and transmission intensity. CONCLUSION: As the intensity of malaria transmission declines in Africa through the scaling up of insecticide-treated nets and other vector control measures, a focus of disease prevention on very young children becomes less appropriate. The understanding of the relationship between parasite exposure and patterns of disease risk should be used to adapt malaria control strategies in different epidemiological settings.
BACKGROUND: Insecticide-treated bednets (ITNs) provide a means to improve child survival across Africa. Sales figures of these nets and survey coverage data presented nationally mask inequities in populations at biological and economic risk, and do not allow for precision in the estimation of unmet commodity needs. We gathered subnational ITN coverage sample survey data from 40 malaria-endemic countries in Africa between 2000 and 2007. METHODS: We computed the projected ITN coverage among children aged less than 5 years for age-adjusted population data that were stratified according to malaria transmission risks, proximate determinants of poverty, and methods of ITN delivery. FINDINGS: In 2000, only 1.7 million (1.8%) of African children living in stable malaria-endemic conditions were protected by an ITN; by 2007 the number had increased to 20.3 million (18.5%), leaving 89.6 million children unprotected. Of these, 30 million were living in some of the poorest areas of Africa: 54% were living in only seven countries and 25% in Nigeria alone. Overall, 33 (83%) countries were estimated to have ITN coverage of less than 40% in 2007. On average, we noted a greater increase in ITN coverage in areas where free distribution had operated between survey periods. INTERPRETATION: By mapping the distribution of populations in relation to malaria risk and intervention coverage, we provide a means to track future requirements for scaling up essential disease-prevention strategies. The present coverage of ITNs in Africa remains inadequate, and a focused effort to improve distribution in selected areas would have a substantial effect on the continent's malaria burden.
BACKGROUND: Cystatin C has been proposed as an alternative marker of renal function. We sought to determine whether participants randomized to episodic use of antiretroviral therapy guided by CD4 cell count (drug conservation) had altered cystatin C levels compared with those randomized to continuous antiretroviral therapy (viral suppression) in the Strategies for Management of Antiretroviral Therapy trial, and to identify factors associated with increased cystatin C. METHODS: Cystatin C was measured in plasma collected at randomization, 1, 2, 4, 8 and 12 months after randomization in a random sample of 249 and 250 participants in the drug conservation and viral suppression groups, respectively. Logistic regression was used to model the odds of at least 0.15 mg/dl increase in cystatin C (1 SD) in the first month after randomization, adjusting for demographic and clinical characteristics. RESULTS: At randomization, mean (SD) cystatin C level was 0.99 (0.26 mg/dl) and 1.01 (0.28 mg/dl) in the drug conservation and viral suppression arms, respectively (P = 0.29). In the first month after randomization, 21.8 and 10.6% had at least 0.15 mg/dl increase in cystatin C in the drug conservation and viral suppression arms, respectively (P = 0.0008). The difference in cystatin C between the treatment arms was maintained through 1 year after randomization. After adjustment, participants in the viral suppression arm had significantly reduced odds of at least 0.15 mg/dl increase in cystatin C in the first month (odds ratio 0.42; 95% confidence interval 0.23-0.74, P = 0.0023). CONCLUSION: These results demonstrate that interruption of antiretroviral therapy is associated with an increase in cystatin C, which may reflect worsened renal function.
BACKGROUND: Facilitation of endogenous neuroprotective pathways, such as the erythropoietin (Epo) pathway, has been proposed as an adjuvant treatment strategy in cerebral malaria. We investigated whether different endogenous protein expression levels of Epo, or differences in the abundance of its receptor components, could account for the extent of structural neuropathological changes or neurological complications in adults with severe malaria. METHODS: High-sensitivity immunohistochemistry was used to assess the frequency, distribution and concordance of Epo and components of its homodimeric and heteromeric receptors, Epo receptor and CD131, within the brainstem of adults who died of severe malaria. The following relationships with Epo and its receptor components were also defined: (i) sequestration and indicators of hypoxia; (ii) vascular damage in the form of plasma protein leakage and haemorrhage; (iii) clinical complications and neuropathological features of severe malaria disease. Brainstems of patients dying in the UK from unrelated non-infectious causes were examined for comparison. RESULTS: The incidence of endogenous Epo in parenchymal brain cells did not greatly differ between severe malaria and non-neurological UK controls at the time of death. However, EpoR and CD131 labelling of neurons was greater in severe malaria compared with non-neurological controls (P = .009). EpoR labelling of vessels was positively correlated with admission peripheral parasite count (P = .01) and cerebral sequestration (P < .0001). There was a strong negative correlation between arterial oxygen saturation and EpoR labelling of glia (P = .001). There were no significant correlations with indicators of vascular damage, neuronal chromatolysis, axonal swelling or vital organ failure. CONCLUSION: Cells within the brainstem of severe malaria patients showed protein expression of Epo and its receptor components. 
However, the incidence of endogenous expression did not reflect protection from vascular or neuronal injury, nor from clinical manifestations such as coma. These findings do not provide support for Epo as an adjuvant neuroprotective agent in adults with severe malaria.
BACKGROUND: Artemisinin combination treatments (ACT) are recommended as first-line treatment for falciparum malaria throughout the malaria-affected world. We reviewed the efficacy of a 3-day regimen of mefloquine and artesunate (MAS(3)) over a 13-year period of continuous deployment as first-line treatment in camps for displaced persons and in clinics for migrant populations along the Thai-Myanmar border. METHODS AND FINDINGS: 3,264 patients were enrolled in prospective treatment trials between 1995 and 2007 and treated with MAS(3). The proportion of patients with parasitaemia persisting on day 2 increased significantly from 4.5% before 2001 to 21.9% since 2002 (p<0.001). Delayed parasite clearance was associated with an increased risk of developing gametocytaemia (AOR = 2.29; 95% CI, 2.00-2.69, p = 0.002). Gametocytaemia on admission and carriage also increased over the years (p = 0.001, test for trend, for both). MAS(3) efficacy has declined slightly but significantly (hazard ratio 1.13; 95% CI, 1.07-1.19, p<0.001), although efficacy in 2007 remained well within acceptable limits: 96.5% (95% CI, 91.0-98.7). The in vitro susceptibility of P. falciparum to artesunate increased significantly until 2002, but thereafter declined to levels close to those of 13 years ago (geometric mean in 2007: 4.2 nmol/l; 95% CI, 3.2-5.5). The proportion of infections caused by parasites with increased pfmdr1 copy number rose from 30% (12/40) in 1996 to 53% (24/45) in 2006 (p = 0.012, test for trend). CONCLUSION: Artesunate-mefloquine remains a highly efficacious antimalarial treatment in this area despite 13 years of widespread intense deployment, but there is evidence of a modest increase in resistance. Of particular concern is the slowing of the parasitological response to artesunate and the associated increase in gametocyte carriage.
BACKGROUND: Analytical approaches for the interpretation of anti-malarial clinical trials vary considerably. The aim of this study was to quantify the magnitude of the differences between efficacy estimates derived from these approaches and to identify the factors underlying these differences. METHODS: Data from studies conducted in Africa and Thailand were compiled and the risk estimates of treatment failure, adjusted and unadjusted by genotyping, were derived by three methods (intention to treat (ITT), modified intention to treat (mITT) and per protocol (PP)) and then compared. RESULTS: 29 clinical trials (15 from Africa and 14 from Thailand) with a total of 65 treatment arms (38 from Africa and 27 from Thailand) were included in the analysis. Of the 15,409 patients enrolled, 2,637 (17.1%) had incomplete follow-up for the unadjusted analysis and 4,489 (33.4%) for the adjusted analysis. Estimates of treatment failure were consistently higher when derived from the ITT or PP analyses compared to the mITT approach. In the unadjusted analyses the median difference between the ITT and mITT estimates was greater in Thai studies (11.4% [range 2.1-31.8]) than in African studies (1.8% [range 0-11.7]). In the adjusted analyses the median difference between PP and mITT estimates was 1.7%, but ranged from 0 to 30.9%. The discrepancy between estimates was significantly correlated with the proportion of patients with incomplete follow-up (p < 0.0001). The proportion of studies with a major difference (> 5%) between the adjusted PP and mITT estimates was 28% (16/57), with the risk difference greater in African (37%, 14/38) than in Thai studies (11%, 2/19). In the African studies, a major difference in the adjusted estimates was significantly more likely in high-transmission sites (62%, 8/13) than in moderate-transmission sites (24%, 6/25); p = 0.035. 
CONCLUSION: Estimates of anti-malarial clinical efficacy vary significantly depending on the analytical methodology from which they are derived. In order to monitor temporal and spatial trends in anti-malarial efficacy, standardized analytical tools need to be applied in a transparent and systematic manner.
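Why the discrepancy between approaches tracks the proportion of incomplete follow-up can be illustrated with a toy calculation; the counts below are hypothetical, not drawn from any of the 29 trials:

```python
# Toy illustration (hypothetical counts) of how the analytical approaches
# treat patients with incomplete follow-up.
failures, successes, incomplete = 10, 80, 10   # 100 patients enrolled

# ITT: patients with incomplete follow-up are counted as treatment failures.
itt = (failures + incomplete) / (failures + successes + incomplete)

# PP: patients with incomplete follow-up are excluded from the denominator.
pp = failures / (failures + successes)

# mITT censors incomplete follow-up in a survival analysis; the review
# found mITT gave the lowest estimates, so the gaps between approaches
# widen as the proportion of incomplete follow-up rises.
print(f"ITT: {itt:.1%}  PP: {pp:.1%}")
```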
BACKGROUND: Preventing the emergence of anti-malarial drug resistance is critical for the success of current malaria elimination efforts. Prevention strategies have focused predominantly on qualitative factors, such as choice of drugs, use of combinations and deployment of multiple first-line treatments. The importance of anti-malarial treatment dosing has been underappreciated. Treatment recommendations are often for the lowest doses that produce "satisfactory" results. METHODS: The probability of de-novo resistant malaria parasites surviving and transmitting depends on the relationship between their degree of resistance and the blood concentration profiles of the anti-malarial drug to which they are exposed. The conditions required for the in-vivo selection of de-novo emergent resistant malaria parasites were examined and relative probabilities assessed. RESULTS: Recrudescence is essential for the transmission of de-novo resistance. For rapidly eliminated anti-malarials, high-grade resistance can arise from a single drug exposure, but low-grade resistance can arise only from repeated inadequate treatments. Resistance to artemisinins is, therefore, unlikely to emerge with single drug exposures. Hyperparasitaemic patients are an important source of de-novo anti-malarial drug resistance: their parasite populations are larger, their control of the infection is insufficient, and their rates of recrudescence following anti-malarial treatment are high. As use of substandard drugs, poor adherence, unusual pharmacokinetics, and inadequate immune responses are host characteristics likely to pertain to each recurrence of infection, a small subgroup of patients provides the particular circumstances conducive to de-novo resistance selection and transmission. CONCLUSION: Current dosing recommendations provide a resistance selection opportunity in those patients with low drug levels and high parasite burdens (often children or pregnant women). 
Patients with hyperparasitaemia who receive outpatient treatments provide the greatest risk of selecting de-novo resistant parasites. This emphasizes the importance of ensuring that only quality-assured anti-malarial combinations are used, that treatment doses are optimized on the basis of pharmacodynamic and pharmacokinetic assessments in the target populations, and that patients with heavy parasite burdens are identified and receive sufficient treatment to prevent recrudescence.
Hyponatremia in severe malaria: evidence for an appropriate anti-diuretic hormone response to hypovolemia. Am J Trop Med Hyg, 80 (1), pp. 141-145, 2009.
Although hyponatremia occurs in most patients with severe malaria, its pathogenesis, prognostic significance, and optimal management have not been established. Clinical and biochemical data were prospectively collected from 171 consecutive Bangladeshi adults with severe malaria. On admission, 57% of patients were hyponatremic. Plasma sodium and Glasgow Coma Score were inversely related (r(s) = -0.36, P < 0.0001). Plasma antidiuretic hormone concentrations were similar in hyponatremic and normonatremic patients (median, range: 6.1, 2.3-85.3 versus 32.7, 3.0-56.4 pmol/L; P = 0.19). Mortality was lower in hyponatremic than normonatremic patients (31.6% versus 51.4%; odds ratio [95% confidence interval]: 0.44 [0.23-0.82]; P = 0.01 by univariate analysis). Plasma sodium normalized with crystalloid rehydration from (median, range) 127 (123-140) mmol/L on admission to 136 (128-149) mmol/L at 24 hours (P = 0.01). Hyponatremia in adults with severe malaria is common and associated with preserved consciousness and decreased mortality. It likely reflects continued oral hypotonic fluid intake in the setting of hypovolemia and requires no therapy beyond rehydration.
BACKGROUND: Characterization of anti-malarial drug concentration profiles is necessary to optimize dosing, and thereby to maximize cure rates and reduce both toxicity and the emergence of resistance. Population pharmacokinetic studies determine the drug concentration-time profiles in the target patient populations, including children, who have limited sampling options. Currently, population pharmacokinetic studies of anti-malarial drugs are designed based on logistical, financial and ethical constraints, and prior knowledge of the drug concentration-time profile. Although these factors are important, the proposed design may be unable to determine the desired pharmacokinetic profile because there is no formal consideration of the complex statistical models used to analyse the drug concentration data. METHODS: Optimal design methods incorporate prior knowledge of the pharmacokinetic profile of the drug, the statistical methods used to analyse data from population pharmacokinetic studies, and also the practical constraints of sampling the patient population. The methods determine the statistical efficiency of the design by evaluating the information content of the candidate study design before the pharmacokinetic study is conducted. RESULTS: In a hypothetical population pharmacokinetic study of intravenous artesunate, in which the numbers of patients and blood samples to be assayed were constrained to 50 and 200 respectively, an evaluation of varying elementary designs using optimal design methods found that designs with more patients and fewer samples per patient improved the precision of the pharmacokinetic parameters and inter-patient variability estimates, and the overall statistical efficiency, by at least 50%. CONCLUSION: Optimal design methods ensure that the proposed study designs for population pharmacokinetic studies are robust and efficient. 
It is unethical to continue conducting population pharmacokinetic studies when the sampling schedule may be insufficient to estimate precisely the pharmacokinetic profile.
BACKGROUND: The soil-dwelling Gram-negative bacterium Burkholderia pseudomallei is the cause of melioidosis. Extreme structuring of genotype and genotypic frequency has been demonstrated for B. pseudomallei in uncultivated land, but its distribution and genetic diversity in agricultural land, where most human infections are probably acquired, is not well defined. METHODS: Fixed-interval soil sampling was performed in a rice paddy in northeast Thailand: 100 grams of soil was sampled at a depth of 30 cm from each of 100 sampling points arranged in a 10 x 10 grid, each grid square measuring 2.5 m by 2.5 m. Soil was cultured for the presence of B. pseudomallei, and genotyping of colonies present on primary culture plates was performed using a combination of pulsed-field gel electrophoresis (PFGE) and multilocus sequence typing (MLST). PRINCIPAL FINDINGS: B. pseudomallei was cultured from 28/100 samples. Genotyping of 630 primary colonies drawn from 11 sampling points demonstrated 10 PFGE banding pattern types, which on MLST were resolved into 7 sequence types (STs). Overlap of genotypes was observed more often between sampling points that were closely positioned. Two sampling points contained mixed B. pseudomallei genotypes, each with a numerically dominant genotype and one or more additional genotypes present as minority populations. CONCLUSIONS: Genetic diversity and structuring of B. pseudomallei exist despite the effects of flooding and the physical and chemical processes associated with farming. These findings form an important baseline for future studies of environmental B. pseudomallei.
Does artesunate prolong the electrocardiograph QT interval in patients with severe malaria? Am J Trop Med Hyg, 80 (1), pp. 126-132, 2009.
Several antimalarials can cause significant prolongation of the electrocardiographic QT interval, which can be associated with an increased risk of potentially lethal ventricular arrhythmias. High doses of artemether and artemotil have been associated with QT prolongation in dogs, raising the possibility of a class effect with the artemisinin derivatives. Serial electrocardiograms were recorded, and the QTc interval was calculated before and after administration of artesunate by intravenous injection in patients with severe falciparum malaria in Bangladesh. Of 21 adult patients with severe malaria enrolled, 8 (38%) died. The mean QTc interval was unaffected by bolus intravenous artesunate (2.4 mg/kg). In two patients, the QTc interval exceeded 0.5 seconds, but in both cases an alternative explanation was plausible. No effect was observed on the JTc or PR interval, QRS width, blood pressure, or heart rate. Intravenous artesunate does not have significant cardiovascular effects in patients with severe falciparum malaria.
BACKGROUND: Invasive Staphylococcus aureus infection is increasingly recognised as an important cause of serious sepsis across the developing world, with mortality rates higher than those in the developed world. The factors determining mortality in developing countries have not been identified. METHODS: A prospective, observational study of invasive S. aureus disease was conducted at a provincial hospital in northeast Thailand over a 1-year period. All-cause and S. aureus-attributable mortality rates were determined, and the relationship was assessed between death and patient characteristics, clinical presentations, antibiotic therapy and resistance, drainage of pus and carriage of genes encoding Panton-Valentine Leukocidin (PVL). PRINCIPAL FINDINGS: A total of 270 patients with invasive S. aureus infection were recruited. The range of clinical manifestations was broad and comparable to that described in developed countries. All-cause and S. aureus-attributable mortality rates were 26% and 20%, respectively. Early antibiotic therapy and drainage of pus were associated with a survival advantage (both p<0.001) on univariate analysis. Patients infected by a PVL gene-positive isolate (122/248 tested, 49%) had a strong survival advantage compared with patients infected by a PVL gene-negative isolate (all-cause mortality 11% versus 39% respectively, p<0.001). Multiple logistic regression analysis using all variables significant on univariate analysis revealed that age, underlying cardiac disease and respiratory infection were risk factors for all-cause and S. aureus-attributable mortality, while one or more abscesses as the presenting clinical feature and procedures for infectious source control were associated with survival. CONCLUSIONS: Drainage of pus and timely antibiotic therapy are key to the successful management of S. aureus infection in the developing world. 
Defining the presence of genes encoding PVL provides no practical bedside information and draws attention away from identifying verified clinical risk factors and those interventions that save lives.
BACKGROUND: Malaria has recently been identified as a candidate for global eradication. This process will take the form of a series of national eliminations. Key issues must be considered for elimination strategy, as distinct from the control of disease: namely the spread of drug resistance, data scarcity and the adverse effects of failed elimination attempts. Mathematical models of various levels of complexity have been produced to consider the control and elimination of malaria infection. If available, detailed data on malaria transmission (such as the vector life cycle and behaviour, human population behaviour, the acquisition and decay of immunity, heterogeneities in transmission intensity, and age profiles of clinical and subclinical infection) can be used to populate complex transmission models that can then be used to design control strategy. However, in many malaria-endemic countries reliable data are not available and policy must be formed on the basis of information such as an estimate of the average parasite prevalence. METHODS: A simple deterministic model, requiring only a single estimate of parasite prevalence as input, was developed for comparison with other, more complex models. The model is designed to include key aspects of malaria transmission and integrated control. RESULTS: The simple model is shown to have similar short-term dynamic behaviour to three complex models. The model is used to demonstrate the potential of alternative methods of delivery of controls. The adverse effects on clinical infection and the spread of resistance are predicted for failed elimination attempts. Since elimination strategies present an increased risk of the spread of drug resistance, the model is used to demonstrate the population-level protective effect of multiple controls against this very serious threat. 
CONCLUSION: A simple model structure for the elimination of malaria is suitable for situations where data are sparse yet strategy design requirements are urgent with the caveat that more complex models, populated with new data, would provide more information, especially in the long-term.
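For readers unfamiliar with this class of model, the sketch below shows a generic prevalence-driven deterministic model of the same flavour. It is a minimal illustration under assumed parameter values (recovery rate, control efficacy), not the authors' actual model:

```python
# Generic sketch (NOT the paper's model): a minimal deterministic SIS-type
# description of malaria prevalence x(t),
#     dx/dt = beta*(1 - x)*x - r*x,
# where the transmission coefficient beta is back-calculated from a single
# observed equilibrium prevalence x* and r is the recovery rate. A control
# that reduces transmission by a fraction c scales beta by (1 - c).
def beta_from_prevalence(x_star, r):
    # At a positive equilibrium, dx/dt = 0 implies beta*(1 - x*) = r.
    return r / (1.0 - x_star)

def simulate(x0, beta, r, control=0.0, days=3650, dt=1.0):
    """Forward-Euler integration of the prevalence ODE."""
    x = x0
    for _ in range(int(days / dt)):
        x += dt * (beta * (1.0 - control) * (1.0 - x) * x - r * x)
    return x

r = 1.0 / 200.0                       # assumed: mean infection duration of 200 days
beta = beta_from_prevalence(0.4, r)   # calibrate to an observed prevalence of 40%
unchanged = simulate(0.4, beta, r, control=0.0)   # no control: stays at 40%
controlled = simulate(0.4, beta, r, control=0.5)  # halved transmission: declines toward elimination
```

With transmission halved, the effective reproduction of infection falls below replacement and prevalence decays toward zero over the ten simulated years, illustrating how a single prevalence estimate can parameterize a model and support a go/no-go elimination argument.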
BACKGROUND: Investigations of Plasmodium vivax are restricted to samples collected from infected persons or primates, because this parasite cannot be maintained in in vitro culture. Contamination of P. vivax isolates with host leukocytes and platelets is detrimental to a range of ex vivo and molecular investigations. Easy-to-produce CF11 cellulose filters have recently provided us with an inexpensive method for the removal of leukocytes and platelets. This contrasted with previous reports of unacceptably high levels of infected red blood cell (IRBC) retention by CF11. The aims of this study were to compare the ability of CF11 cellulose filters and the commercial Plasmodipur filter to remove leukocytes and platelets, and to investigate the retention of P. vivax IRBCs by CF11 cellulose filtration. METHODS AND RESULTS: Side-by-side comparison of six leukocyte removal methods using blood samples from five healthy donors showed that CF11 filtration reduced the mean initial leukocyte count from 9.4 x 10^3 per microl [95%CI 5.2-13.5] to 0.01 x 10^3 [95%CI 0.01-0.03]. CF11 was particularly effective at removing neutrophils. CF11 treatment also reduced the initial platelet count from 211.6 x 10^3 per microl [95%CI 107.5-315.7] to 0.8 x 10^3 per microl [95%CI -0.7 to 2.2]. Analysis of 30 P. vivax blood samples before and after CF11 filtration showed only a minor loss in parasitaemia (<=7.1% of initial counts). Stage-specific retention of P. vivax IRBCs was not observed. CONCLUSION: CF11 filtration is the most cost- and time-efficient method for the production of leukocyte- and platelet-free P. vivax-infected erythrocytes from field isolates.
BACKGROUND: Most information on invasive Staphylococcus aureus infections comes from temperate countries. There are considerable knowledge gaps in epidemiology, treatment, drug resistance and outcome of invasive S. aureus infection in the tropics. METHODS: A prospective, observational study of S. aureus bacteraemia was conducted in a 1000-bed regional hospital in northeast Thailand over 1 year. Detailed clinical data were collected and final outcomes determined at 12 weeks, and correlated with antimicrobial susceptibility profiles of infecting isolates. PRINCIPAL FINDINGS: Ninety-eight patients with S. aureus bacteraemia were recruited. The range of clinical manifestations was similar to that reported from temperate countries. The prevalence of endocarditis was 14%. The disease burden was highest at both extremes of age, whilst mortality increased with age. The all-cause mortality rate was 52%, with a mortality attributable to S. aureus of 44%. Methicillin-resistant S. aureus (MRSA) was responsible for 28% of infections, all of which were healthcare-associated. Mortality rates for MRSA and methicillin-susceptible S. aureus (MSSA) were 67% (18/27) and 46% (33/71), respectively (p = 0.11). MRSA isolates were multidrug resistant. Only vancomycin or fusidic acid would be suitable as empirical treatment options for suspected MRSA infection. CONCLUSIONS: S. aureus is a significant pathogen in northeast Thailand, with comparable clinical manifestations and a similar endocarditis prevalence but higher mortality than industrialised countries. S. aureus bacteraemia is frequently associated with exposure to healthcare settings with MRSA causing a considerable burden of disease. Further studies are required to define setting-specific strategies to reduce mortality from S. aureus bacteraemia, prevent MRSA transmission, and to define the burden of S. aureus disease and emergence of drug resistance throughout the developing world.
For many years the IFN-gamma ex vivo ELISPOT has been a major assay for assessing human T-cell responses generated by malaria vaccines. The ELISPOT assay is sensitive, but an imperfect correlate of protection against malaria. Monokine induced by gamma interferon (MIG), or CXCL9, is a chemokine induced by IFN-gamma and has the potential to provide amplification of the IFN-gamma signal. MIG secretion could provide a measure of bio-active IFN-gamma and a functional IFN-gamma signalling pathway. We report that detecting MIG by flow cytometry and by RT-PCR can be more sensitive than the detection of IFN-gamma using these methods. We also find that there is little inter-individual variability in MIG secretion when detected by flow cytometry and that the MIG assay may be used to estimate the amount of bio-active IFN-gamma present. Measurement of MIG alongside IFN-gamma may provide a fuller picture of Th1-type responses post-vaccination.
Interferon Regulatory Factor 1 (IRF-1) is a member of the IRF family of transcription factors, which have key and diverse roles in the gene-regulatory networks of the immune system. IRF-1 has been described as a critical mediator of IFN-gamma signalling and as the major player in driving TH1 type responses. It is therefore likely to be crucial in both innate and adaptive responses against intracellular pathogens such as Plasmodium falciparum. Polymorphisms at the human IRF1 locus have been previously found to be associated with the ability to control P. falciparum infection in populations naturally exposed to malaria. In order to test whether genetic variation at the IRF1 locus also affects the risk of developing severe malaria, we performed a family-based test of association for 18 Single Nucleotide Polymorphisms (SNPs) across the gene in three African populations, using genotype data from 961 trios consisting of one affected child and his/her two parents (555 from The Gambia, 204 from Kenya and 202 from Malawi). No significant association with severe malaria or severe malaria subphenotypes (cerebral malaria and severe malaria anaemia) was observed for any of the SNPs/haplotypes tested in any of the study populations. Our results offer no evidence that the molecular pathways regulated by the transcription factor IRF-1 are involved in the immune-based pathogenesis of severe malaria.
BACKGROUND: Dengue is a public health problem in many countries. Rapid diagnosis of dengue can assist patient triage and management. Detection of the dengue viral protein NS1 represents a new approach to dengue diagnosis. METHODOLOGY/PRINCIPAL FINDINGS: The sensitivity and specificity of the Platelia NS1 ELISA and an NS1 lateral flow rapid test (LFRT) were compared against a gold standard reference diagnostic algorithm in 138 Vietnamese children and adults. Overall, the Platelia NS1 ELISA was modestly more sensitive (82%) than the NS1 LFRT (72%) in confirmed dengue cases. Both the ELISA and the LFRT were more sensitive for primary than secondary dengue, and for specimens collected within 3 days of illness onset relative to later time points. The presence of measurable DENV-reactive IgG, and to a lesser extent IgM, in the test sample was associated with a significantly lower rate of NS1 detection in both assays. NS1 positivity reflected the underlying viraemia, as NS1-positive samples had a significantly higher viraemia than NS1-negative samples matched for duration of illness. The Platelia NS1 ELISA and the NS1 LFRT were 100% specific, being negative in all febrile patients without evidence of recent dengue, as well as in patients with enteric fever, malaria, Japanese encephalitis and leptospirosis. CONCLUSIONS/SIGNIFICANCE: Collectively, these data suggest NS1 assays deserve inclusion in the diagnostic evaluation of dengue patients, but with due consideration for their limitations in patients who present late in their illness or have a concomitant humoral immune response.
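Sensitivity and specificity figures like those above are derived from 2x2 counts against the gold standard. A minimal sketch in Python, using hypothetical counts chosen only to illustrate the arithmetic (not the study's actual tables):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP),
    both relative to a gold standard diagnosis."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for illustration only: 100 gold-standard dengue
# cases of which 82 test NS1-positive, and 38 non-dengue febrile
# controls with no false positives.
sens, spec = sens_spec(tp=82, fn=18, tn=38, fp=0)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")
```

With counts like these an assay would be reported as 82% sensitive and 100% specific, mirroring the shape of the comparison above.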
BACKGROUND: Artemisinin combination therapy (ACT) is now the recommended first-line treatment for falciparum malaria throughout the world. Initiatives to eliminate malaria are critically dependent on its efficacy. There is recent worrying evidence that artemisinin resistance has arisen on the Thai-Cambodian border. Urgent containment interventions are planned and about to be executed. Mathematical modeling approaches to intervention design are now integrated into the field of malaria epidemiology and control. We describe the use of such an approach to investigate the likely effectiveness of different containment measures, with the ultimate aim of eliminating artemisinin-resistant malaria. METHODS: A population dynamic mathematical modeling framework was developed to explore the relative effectiveness of a variety of containment interventions in eliminating artemisinin-resistant malaria in western Cambodia. RESULTS: The most effective intervention to eliminate artemisinin-resistant malaria was a switch of treatment from artemisinin monotherapy to ACT (mean time to elimination 3.42 years, 95% CI 3.32-3.60 years). However, with this approach it is predicted that elimination of artemisinin-resistant malaria using ACT can be achieved only by elimination of all malaria. This is because the various forms of ACT are more effective against infections with artemisinin-sensitive parasites, leaving the more resistant infections as an increasing proportion of the dwindling parasite population. CONCLUSION: Containment of artemisinin-resistant malaria can be achieved by elimination of malaria from western Cambodia using ACT. The "last man standing" is the most resistant, and thus this strategy must be sustained until elimination is truly achieved.
BACKGROUND: Efforts to tackle the enormous burden of ill-health in low-income countries are hampered by weak health information infrastructures that do not support appropriate planning and resource allocation. For health information systems to function well, a reliable inventory of health service providers is critical. The spatial referencing of service providers to allow their representation in a geographic information system is vital if the full planning potential of such data is to be realized. METHODS: A disparate series of contemporary lists of health service providers were used to update a public health facility database of Kenya last compiled in 2003. These new lists were derived primarily through the national distribution of antimalarial and antiretroviral commodities since 2006. A combination of methods, including global positioning systems, was used to map service providers. These spatially-referenced data were combined with high-resolution population maps to analyze disparity in geographic access to public health care. FINDINGS: The updated 2008 database contained 5,334 public health facilities (67% ministry of health; 28% mission and nongovernmental organizations; 2% local authorities; and 3% employers and other ministries). This represented an overall increase of 1,862 facilities compared to 2003. Most of the additional facilities belonged to the ministry of health (79%) and the majority were dispensaries (91%). Overall, 93% of the health facilities were spatially referenced, 38% using global positioning systems, compared to 21% in 2003. In 2008, 89% of the population was within 5 km Euclidean distance of a public health facility, compared to 71% in 2003. Over 80% of the population outside 5 km of public health service providers was in the sparsely settled pastoralist areas of the country. CONCLUSION: We have shown that, with concerted effort, a relatively complete inventory of mapped health services is possible, with enormous potential for improving planning.
Expansion in public health care in Kenya has resulted in significant increases in geographic access, although several areas of the country need further improvement. This information is key to future planning, and with this paper we have released the digital spatial database into the public domain to assist the Kenyan Government and its partners in the health sector.
Understanding the host genetic susceptibility to typhoid fever may provide a better understanding of pathogenesis and help in the development of new therapeutics and vaccines. Here we determine the genetic variation within the human TLR4 gene, encoding the principal receptor for bacterial endotoxin recognition, in typhoid fever patients. It is possible that genetic variants of TLR4 could detrimentally affect the innate immune response against S. typhi infection. Mutation detection and genotyping of TLR4 were performed on DNA from 414 Vietnamese typhoid fever patients and 372 population controls. dHPLC detected a total of 10 polymorphisms within the upstream and exonic regions of TLR4, of which 7 are novel. Two SNPs, T4025A and C4215G, were more frequent in typhoid cases than in controls; however, owing to their low allele frequencies, the associations were of borderline significance (T4025A: OR 1.9, 95%CI 0.9-4.3, P = 0.07 and C4215G: OR 6.7, 95%CI 0.8-307, P = 0.04). Six missense mutations were identified, with 5/6 positioned in the ectoplasmic domain. Four missense mutations and one promoter SNP (A-271G) were present only in typhoid cases, albeit at low allele frequencies. Here we determined the extent of genetic variation within TLR4 in a Vietnamese population and suggest that TLR4 may be involved in defense against typhoid fever in this population.
West Nile virus (WNV) has emerged globally as an increasingly important pathogen for humans and domestic animals. Studies of the evolutionary diversity of the virus over its known history will help to elucidate conserved sites, and characterize their correspondence to other pathogens and their relevance to the immune system. We describe a large-scale analysis of the entire WNV proteome, aimed at identifying and characterizing evolutionarily conserved amino acid sequences. This study, which used 2,746 WNV protein sequences collected from the NCBI GenPept database, focused on analysis of peptides of length 9 amino acids or more, which are immunologically relevant as potential T-cell epitopes. Entropy-based analysis of the diversity of WNV sequences revealed the presence of numerous evolutionarily stable nonamer positions across the proteome (entropy value of ≤ 1). The representation (frequency) of nonamers variant to the predominant peptide at these stable positions was generally low (≤ 10% of the WNV sequences analyzed). Eighty-eight fragments of length 9-29 amino acids, representing approximately 34% of the WNV polyprotein length, were identified to be identical and evolutionarily stable in all analyzed WNV sequences. Of the 88 completely conserved sequences, 67 are also present in other flaviviruses, and several have been associated with the functional and structural properties of viral proteins. Immunoinformatic analysis revealed that the majority (78/88) of conserved sequences are potentially immunogenic, while 44 contained experimentally confirmed human T-cell epitopes. This study identified a comprehensive catalogue of completely conserved WNV sequences, many of which are shared by other flaviviruses, and the majority of which are potential epitopes.
The complete conservation of these immunologically relevant sequences through the entire recorded WNV history suggests they will be valuable as components of peptide-specific vaccines or other therapeutic applications, for sequence-specific diagnosis of a wide range of Flavivirus infections, and for studies of homologous sequences among other flaviviruses.
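The stability criterion used above is Shannon entropy computed over the peptide variants observed at each nonamer position. A Python sketch on a made-up position; the logarithm base (assumed here to be 2) and the peptide strings are illustrative assumptions, not taken from the WNV data:

```python
import math
from collections import Counter

def shannon_entropy(variants):
    """Shannon entropy of the nonamer variants observed at one
    alignment position; low entropy means the position is stable."""
    counts = Counter(variants)
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Made-up position: one predominant nonamer in 95 of 100 sequences
# plus two rare variants -> low entropy, passing a <= 1 cutoff.
position = ["SGVDVFYTP"] * 95 + ["SGVDVFYTA"] * 3 + ["SGVEVFYTP"] * 2
print(round(shannon_entropy(position), 3))
```

A position occupied by a single nonamer in every sequence has entropy 0; entropy grows as variants become more frequent and more evenly distributed.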
BACKGROUND: Genetic heterozygosity is increasingly being shown to be a key predictor of fitness in natural populations, both through inbreeding depression (inbred individuals having low heterozygosity) and through chance linkage between a marker and a gene under balancing selection. One important component of fitness that is often highlighted is resistance to parasites and other pathogens. However, the significance of equivalent loci in human populations remains unclear. Consequently, we performed a case-control study of fatal invasive bacterial disease in Kenyan children using a genome-wide screen with microsatellite markers. METHODS: 148 cases, comprising children aged <13 years who died of invasive bacterial disease (variously bacteraemia, bacterial meningitis or neonatal sepsis), and 137 age-matched, healthy children were sampled in a prospective study conducted at Kilifi District Hospital, Kenya. Samples were genotyped for 134 microsatellite markers using the ABI LD20 marker set and analysed for an association between homozygosity and mortality. RESULTS: At five markers, homozygosity was strongly associated with mortality (odds ratio range 4.7-12.2), with evidence of interactions between some markers. Mortality was associated with different, non-overlapping marker groups in Gram-positive and Gram-negative bacterial disease. Homozygosity at susceptibility markers was common (prevalence 19-49%) and, given the large effect sizes, this suggests that bacterial disease mortality may be strongly genetically determined. CONCLUSION: Balanced polymorphisms appear to be more widespread in humans than previously appreciated and play a critical role in modulating susceptibility to infectious disease. The effect sizes we report, coupled with the stochasticity of exposure to pathogens, suggest that infection and mortality are far from random, reflecting a strong genetic basis.
BACKGROUND: Streptococcus suis can cause severe systemic infection in adults exposed to infected pigs or after consumption of undercooked pig products. S. suis is often misdiagnosed, due to lack of awareness and improper testing. Here we report the first fifty cases diagnosed with S. suis infection in northern Viet Nam. METHODOLOGY/PRINCIPAL FINDINGS: In 2007, diagnostics for S. suis were set up at a national hospital in Hanoi. That year there were 43 S. suis positive cerebrospinal fluid samples, of which 32 were culture positive and 11 were positive by PCR only. Seven patients were blood culture positive for S. suis but CSF culture and PCR negative, making a total of 50 patients with laboratory confirmed S. suis infection in 2007. The number of S. suis cases peaked during the warmer months. CONCLUSIONS/SIGNIFICANCE: S. suis was commonly diagnosed as a cause of bacterial meningitis in adults in northern Viet Nam. In countries where there is intense and widespread exposure of humans to pigs, S. suis can be an important human pathogen.
BACKGROUND: In Vietnam the blackwater fever syndrome (BWF) has been associated with malaria infection, quinine ingestion and G6PD deficiency. The G6PD variants within the Vietnamese Kinh contributing to the disease risk in this population, and more generally to haemoglobinuria, are currently unknown. METHOD: Eighty-two haemoglobinuria patients and 524 healthy controls were screened for G6PD deficiency using either the methylene blue reduction test, the G-6-PDH kit or the micro-methaemoglobin reduction test. The G6PD gene variants were screened using SSCP combined with DNA sequencing in 82 patients with haemoglobinuria, and in 59 healthy controls found to be G6PD deficient. RESULTS: This study confirmed that G6PD deficiency is strongly associated with haemoglobinuria (OR = 15, 95% CI [7.7 to 28.9], P < 0.0001). Six G6PD variants were identified in the Vietnamese population, of which two are novel (Vietnam1 [Glu3Lys] and Vietnam2 [Phe66Cys]). G6PD Viangchan [Val291Met], common throughout south-east Asia, accounted for 77% of the variants detected and was significantly associated with haemoglobinuria within G6PD-deficient ethnic Kinh Vietnamese (OR = 5.8, 95% CI [1.14 to 55.4], P = 0.022). CONCLUSION: The frequencies of several G6PD mutations, including two novel mutations, in the Vietnamese Kinh population are reported and the contribution of G6PD mutations to the development of haemoglobinuria is investigated.
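The association measures above are odds ratios from 2x2 tables of deficiency status by case/control status. A minimal Python sketch with hypothetical counts, chosen only so the arithmetic is easy to follow (not the study's data):

```python
def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Cross-product odds ratio for an exposure (e.g. G6PD deficiency)
    from a case-control 2x2 table."""
    return (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

# Hypothetical 2x2 table, for illustration only:
#             deficient   normal
#   cases         30        20
#   controls      10       100
print(odds_ratio(30, 20, 10, 100))  # -> 15.0
```

Confidence intervals and P values, as reported in the abstract, would additionally require the standard error of the log odds ratio.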
BACKGROUND: The fixed dose antimalarial combination of dihydroartemisinin-piperaquine (DP) is a promising new artemisinin-based combination therapy (ACT). We present an individual patient data analysis of efficacy and tolerability in acute uncomplicated falciparum malaria, from seven published randomized clinical trials conducted in Africa and South East Asia using a predefined in-vivo protocol. Comparator drugs were mefloquine-artesunate (MAS3) in Thailand, Myanmar, Laos and Cambodia; artemether-lumefantrine in Uganda; and amodiaquine+sulfadoxine-pyrimethamine and artesunate+amodiaquine in Rwanda. METHODS AND FINDINGS: In total, 3,547 patients were enrolled: 1,814 patients (32% children under five years) received DP and 1,733 received a comparator antimalarial at 12 different sites and were followed for 28-63 days. There was no significant heterogeneity between trials. DP was well tolerated, with 1.7% early vomiting. There were fewer adverse events with DP in children and adults compared to MAS3, except for diarrhea; ORs (95%CI) 2.74 (2.13 to 3.51) and 3.11 (2.31 to 4.18), respectively. DP treatment resulted in rapid clearance of fever and parasitaemia. The PCR genotype-corrected efficacy at Day 28 of DP assessed by survival analysis was 98.7% (95%CI 97.6-99.8). DP was superior to the comparator drugs in protecting against both P. falciparum recurrence and recrudescence (P = 0.001, weighted by site). There was no difference between DP and MAS3 in treating P. vivax co-infections and in suppressing the first relapse (median interval to P. vivax recurrence: 6 weeks). Children under 5 years were at higher risk of recurrence for both infections. The proportion of patients developing gametocytaemia (P = 0.002, weighted by site) and the subsequent gametocyte carriage rates were higher with DP (11/1000 person gametocyte weeks, PGW) than with MAS3 (6/1000 PGW, P = 0.001, weighted by site).
CONCLUSIONS: DP proved to be a safe, well tolerated, and highly effective treatment of P. falciparum malaria in Asia and Africa, but its effect on gametocyte carriage was inferior to that of MAS3.
BACKGROUND: Pharmacokinetic (PK) data on amodiaquine (AQ) and artesunate (AS) are limited in children, an important risk group for malaria. The aim of this study was to evaluate the PK properties of a newly developed and registered fixed dose combination (FDC) of artesunate and amodiaquine. METHODS: A prospective population pharmacokinetic study of AS and AQ was conducted in children aged six months to five years. Participants were randomized to receive the new artesunate and amodiaquine FDC or the same drugs given in separate tablets. Children were divided into two groups of 70 (35 in each treatment arm) to evaluate the pharmacokinetic properties of AS and AQ, respectively. Population pharmacokinetic models for dihydroartemisinin (DHA) and desethylamodiaquine (DeAq), the principal pharmacologically active metabolites of AS and AQ, respectively, and total artemisinin anti-malarial activity, defined as the sum of the molar equivalent plasma concentrations of DHA and artesunate, were constructed using the non-linear mixed effects approach. Relative bioavailability between products was compared by estimating the ratios (and 95% CI) between the areas under the plasma concentration-time curves (AUC). RESULTS: The two regimens had similar PK properties in young children with acute malaria. The ratio of loose formulation to fixed co-formulation AUCs, was estimated as 1.043 (95% CI: 0.956 to 1.138) for DeAq. For DHA and total anti-malarial activity AUCs were estimated to be the same. Artesunate was rapidly absorbed, hydrolysed to DHA, and eliminated. Plasma concentrations were significantly higher following the first dose, when patients were acutely ill, than after subsequent doses when patients were usually afebrile and clinically improved. Amodiaquine was converted rapidly to DeAq, which was then eliminated with an estimated median (range) elimination half-life of 9 (7 to 12) days. 
Efficacy was similar in the two treatment groups, with cure rates of 0.946 (95% CI: 0.840-0.982) in the AS+AQ group and 0.892 (95% CI: 0.787-0.947) in the AS/AQ group. Four out of five patients with PCR-confirmed recrudescence received AQ doses < 10 mg/kg. Both regimens were well tolerated. No child developed severe post-treatment neutropaenia (<1,000/µL). There was no evidence of AQ dose-related hepatotoxicity, but one patient developed an asymptomatic rise in liver enzymes that was resolving by day 28. CONCLUSION: The bioavailability of the co-formulated AS-AQ FDC was similar to that of the separate tablets for desethylamodiaquine, DHA and the total anti-malarial activity. These data support the use of this new AS-AQ FDC in children with acute uncomplicated falciparum malaria.
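Relative bioavailability above is reported as the ratio of areas under the plasma concentration-time curves (AUC). The study itself fitted non-linear mixed-effects population models; a much simpler non-compartmental sketch using the linear trapezoidal rule, on made-up concentration profiles (all numbers are illustrative assumptions), shows what an AUC ratio measures:

```python
def auc_trapezoid(times, concs):
    """AUC of a concentration-time profile by the linear trapezoidal rule."""
    pairs = list(zip(times, concs))
    return sum((t1 - t0) * (c0 + c1) / 2
               for (t0, c0), (t1, c1) in zip(pairs, pairs[1:]))

# Made-up plasma concentration profiles (hours, ng/mL), illustration only.
t = [0, 1, 2, 4, 8, 24]
loose_tablets = [0, 120, 90, 50, 20, 2]
fixed_dose    = [0, 115, 88, 48, 19, 2]
ratio = auc_trapezoid(t, loose_tablets) / auc_trapezoid(t, fixed_dose)
print(round(ratio, 3))  # loose/fixed AUC ratio, close to 1 for similar exposure
```

A ratio near 1, with a confidence interval contained in a pre-specified equivalence range, is the usual basis for concluding similar bioavailability between formulations.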
BACKGROUND: Mefloquine and artesunate combination therapy is the recommended first-line treatment for uncomplicated malaria throughout much of south-east Asia. Concerns have been raised about the potential central nervous system (CNS) effects of both drug components, and there are no detailed reports in very young children. METHODS: Children aged between three months and five years with acute uncomplicated Plasmodium falciparum malaria were randomized to either 7 days of artesunate monotherapy or the same schedule of artesunate plus mefloquine on days 7 and 8. Neurological testing targeting coordination and behaviour was carried out on days 0, 7, 9, 10, 14 and 28. Non-febrile healthy control children from the same population were tested on days 0, 7, 14 and 28. RESULTS: From December 1994 to July 1997, 91 children with uncomplicated P. falciparum malaria, 45 treated with artesunate monotherapy and 46 treated with mefloquine and artesunate combination therapy, and 36 non-febrile controls underwent neurological testing. Malaria and fever had a significant negative impact on testing performance. By contrast, the anti-malarial treatments were not associated with worsening performance in the various components of the test. Artesunate and mefloquine do not appear to have a significant influence on coordination and behaviour. Children treated with mefloquine were significantly less likely to suffer recurrent malaria infection during follow-up than those treated with artesunate alone (P = 0.033). CONCLUSION: In keeping with the results of randomized controlled trials in adults, mefloquine was not associated with a decrease in specific items of neurological performance. Likewise, children treated with artesunate did not perform significantly differently from control children. This study does not exclude subtle or rare CNS effects of treatment with artesunate or mefloquine.
Treatment of acute uncomplicated malaria results in a significant improvement in items of neurological performance.
BACKGROUND: A novel variant of influenza A (H1N1) is causing a pandemic and, although the illness is usually mild, there are concerns that its virulence could change through reassortment with other influenza viruses. This is of particular concern in parts of Southeast Asia, where population density is high, influenza is less seasonal, human-animal contact is common and avian influenza is still endemic. METHODS: We developed an age- and spatially-structured mathematical model in order to estimate the potential impact of pandemic H1N1 in Vietnam and the opportunities for reassortment with animal influenza viruses. The model tracks human infection among domestic animal owners and non-owners and also estimates the numbers of animals that may be exposed to infected humans. RESULTS: In the absence of effective interventions, the model predicts that the introduction of pandemic H1N1 will result in an epidemic that spreads to half of Vietnam's provinces within 57 days (interquartile range (IQR): 45-86.5) and peaks 81 days after introduction (IQR: 62.5-121 days). For the current published range of the 2009 H1N1 influenza's basic reproductive number (1.2-3.1), we estimate a median of 410,000 cases among swine owners (IQR: 220,000-670,000) with 460,000 exposed swine (IQR: 260,000-740,000), 350,000 cases among chicken owners (IQR: 170,000-630,000) with 3.7 million exposed chickens (IQR: 1.9 M-6.4 M), and 51,000 cases among duck owners (IQR: 24,000-96,000) with 1.2 million exposed ducks (IQR: 0.6 M-2.1 M). The median number of overall human infections in Vietnam for this range of the basic reproductive number is 6.4 million (IQR: 4.4 M-8.0 M). CONCLUSION: It is likely that, in the absence of effective interventions, the introduction of a novel H1N1 into a densely populated country such as Vietnam will result in a widespread epidemic.
A large epidemic in a country with intense human-animal interaction and continued co-circulation of other seasonal and avian viruses would provide substantial opportunities for H1N1 to acquire new genes.
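The projections above come from an age- and spatially-structured model. The basic dependence of overall attack rate on the basic reproductive number R0 can nonetheless be illustrated with the classical final-size relation for a simple unstructured SIR epidemic, a textbook sketch rather than the paper's model:

```python
import math

def sir_final_size(r0, tol=1e-10):
    """Final epidemic fraction z solving z = 1 - exp(-R0 * z)
    by fixed-point iteration (classical SIR final-size relation)."""
    z = 0.5
    for _ in range(10_000):
        z_next = 1 - math.exp(-r0 * z)
        if abs(z_next - z) < tol:
            break
        z = z_next
    return z

# Attack rates implied by the published R0 range (1.2-3.1) for 2009 H1N1.
for r0 in (1.2, 2.0, 3.1):
    print(f"R0={r0}: final attack rate ~{sir_final_size(r0):.0%}")
```

Even at the bottom of the published R0 range, a substantial fraction of a fully susceptible population is eventually infected, which is why the abstract predicts a widespread epidemic absent effective interventions.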
BMJ, 339 pp. b3991 (2009). Online video sharing and patients' privacy.
BACKGROUND: The scaling of malaria control to achieve universal coverage requires a better understanding of the population sub-groups that are least protected and act as barriers to interrupting transmission. Here we examine the age pattern of use of insecticide treated nets (ITNs) in Africa in relation to biological vulnerabilities and the implications for future prospects for universal coverage. METHODS: Recent national household survey data for 18 malaria endemic countries in Africa were assembled to identify information on use of ITNs by age and sex. Age-structured medium variant projected population estimates for the mid-point year of the earliest and most recent national surveys were derived to compute the population by age protected by ITNs. RESULTS: All surveys were undertaken between 2005 and 2009, either as demographic health surveys (n = 12) or malaria indicator surveys (n = 6). Countries were categorized into three ITN use groups (<10%; 10 to <20%; and ≥20%) and projected population estimates for the mid-point year of 2007 were computed. In general, the pattern of overall ITN use with age was similar by country and across the three country groups, with ITN use initially high among children <5 years of age, declining sharply among the population aged 5-19 years, before rising again across the ages 20-44 years and finally decreasing gradually at older ages. For all groups of countries, the highest proportion of the population not protected by ITNs (38%-42%) was among those aged 5-19 years. CONCLUSION: In malaria-endemic Africa, school-aged children are the least protected by ITNs but represent the greatest reservoir of infections. With increasing school enrollment rates, school-based delivery of ITNs should be considered as an approach to reach universal ITN coverage and improve the likelihood of impacting upon parasite transmission.
BACKGROUND: A number of molecular tools have been developed to monitor the emergence and spread of anti-malarial drug resistance in Plasmodium falciparum. One of the major obstacles to the wider implementation of these tools is the absence of practical methods enabling high-throughput analysis. Here a new zip-code array, called FlexiChip, linked to a dedicated software program, is described, which largely overcomes this problem. METHODS: Previously published microarray probes detecting single-nucleotide polymorphisms (SNPs) associated with parasite resistance to anti-malarial drugs (ResMalChip) were adapted to a universal microarray FlexiChip format. To evaluate the overall sensitivity of the FlexiChip package (microarray + software), the results of FlexiChip were compared to those of the ResMalChip microarray, using the same extension probes and the same PCR products. In both cases, sequence results were used as the gold standard to calculate sensitivity and specificity. FlexiChip results obtained with a set of field isolates were then compared to those assessed in an independent reference laboratory. RESULTS: The FlexiChip package gave results identical to the ResMalChip results in 92.7% of samples (kappa coefficient 0.8491, with a standard error of 0.021) and had a sensitivity of 95.88% and a specificity of 97.68% compared to sequencing as the reference method. Moreover, the method performed well compared to the results obtained in the reference laboratories, with 99.7% identical results (kappa coefficient 0.9923, S.E. 0.0523). CONCLUSION: Microarrays could be employed to monitor P. falciparum drug resistance markers with greater cost effectiveness and the possibility of high-throughput analysis. The FlexiChip package is a promising tool for use in resource-poor settings of malaria endemic countries.
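Agreement between the two platforms above is summarized with Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A Python sketch on hypothetical binary call counts (not the study's data):

```python
def cohens_kappa(both_pos, both_neg, only_a_pos, only_b_pos):
    """Cohen's kappa for two raters making binary calls,
    from the four cells of their 2x2 agreement table."""
    n = both_pos + both_neg + only_a_pos + only_b_pos
    p_obs = (both_pos + both_neg) / n
    # Chance agreement from each rater's marginal positive-call rate.
    a_pos = (both_pos + only_a_pos) / n
    b_pos = (both_pos + only_b_pos) / n
    p_exp = a_pos * b_pos + (1 - a_pos) * (1 - b_pos)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical counts for illustration only: 93/100 concordant calls.
kappa = cohens_kappa(both_pos=45, both_neg=48, only_a_pos=4, only_b_pos=3)
print(round(kappa, 3))
```

Note that 93% raw agreement translates into a kappa noticeably below 0.93, because some agreement would occur by chance alone; values above 0.8 are conventionally read as excellent agreement.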
BACKGROUND: In areas where non-falciparum malaria is common, rapid diagnostic tests (RDTs) capable of distinguishing malaria species reliably are needed. Such tests are often based on the detection of parasite lactate dehydrogenase (pLDH). METHODS: In Dawei, southern Myanmar, three pLDH-based RDTs (CareStart Malaria pLDH (Pan), CareStart Malaria pLDH (Pan, Pf) and OptiMAL-IT) were evaluated in patients presenting with clinically suspected malaria. Each RDT was read independently by two readers. A subset of patients with microscopically confirmed malaria had their RDTs repeated on days 2, 7 and then weekly until negative. At the end of the study, samples of the study batches were sent for heat stability testing. RESULTS: Between August and November 2007, 1004 patients aged between 1 and 93 years were enrolled in the study. Slide microscopy (the reference standard) diagnosed 213 Plasmodium vivax (Pv) mono-infections, 98 Plasmodium falciparum (Pf) mono-infections and no malaria in 650 cases. The sensitivities (sens) and specificities (spec) of the RDTs for the detection of malaria were: CareStart Malaria pLDH (Pan) test: sens 89.1% [CI95 84.2-92.6], spec 97.6% [CI95 96.5-98.4]. OptiMAL-IT: Pf+/- other species detection: sens 95.2% [CI95 87.5-98.2], spec 94.7% [CI95 93.3-95.8]; non-Pf detection alone: sens 89.6% [CI95 83.6-93.6], spec 96.5% [CI95 94.8-97.7]. CareStart Malaria pLDH (Pan, Pf): Pf+/- other species: sens 93.5% [CI95 85.4-97.3], spec 97.4% [CI95 95.9-98.3]; non-Pf: sens 78.5% [CI95 71.1-84.4], spec 97.8% [CI95 96.3-98.7]. Inter-observer agreement was excellent for all tests (kappa > 0.9). The median time for the RDTs to become negative was two days for the CareStart Malaria tests and seven days for OptiMAL-IT. Tests were heat stable for up to 90 days, except for OptiMAL-IT (Pf-specific pLDH stable to day 20 at 35 degrees C). CONCLUSION: None of the pLDH-based RDTs evaluated was able to detect non-falciparum malaria with high sensitivity, particularly at low parasitaemias.
OptiMAL-IT performed best overall and would be most useful in areas of high malaria prevalence among screened fever cases. However, its heat stability was unacceptable and the number of steps required to perform the test is a significant drawback in the field. A reliable, heat-stable, highly sensitive RDT capable of diagnosing all Plasmodium species has yet to be identified.
BACKGROUND: Between 2003 and 2005, highly pathogenic avian influenza A (H5N1) viruses caused large-scale outbreaks in poultry in the Ho Chi Minh City area of Vietnam. We studied the prevalence of antibodies against H5N1 in poultry workers and cullers who were active in the culling program in Ho Chi Minh City in 2004 and 2005. METHODOLOGY/PRINCIPAL FINDINGS: Single sera from 500 poultry workers and poultry cullers exposed to infected birds were tested for antibodies to avian influenza H5N1, using microneutralization assays and a hemagglutination inhibition assay with horse blood. All sera tested negative using microneutralization tests. Three samples showed a 1:80 titer in the hemagglutination inhibition assay. CONCLUSIONS/SIGNIFICANCE: This study provides additional support for the low transmissibility of clade 1 H5N1 to humans, but limited transmission to highly exposed persons cannot be excluded given the presence of low antibody titers in some individuals.
Naturally acquired immunity to falciparum malaria protects millions of people routinely exposed to Plasmodium falciparum infection from severe disease and death. There is no clear concept about how this protection works. There is no general agreement about the rate of onset of acquired immunity or what constitutes the key determinants of protection; much less is there a consensus regarding the mechanism(s) of protection. This review summarizes what is understood about naturally acquired and experimentally induced immunity against malaria with the help of evolving insights provided by biotechnology and places these insights in the context of historical, clinical, and epidemiological observations. We advocate that naturally acquired immunity should be appreciated as being virtually 100% effective against severe disease and death among heavily exposed adults. Even the immunity that occurs in exposed infants may exceed 90% effectiveness. The induction of an adult-like immune status among high-risk infants in sub-Saharan Africa would greatly diminish disease and death caused by P. falciparum. The mechanism of naturally acquired immunity that occurs among adults living in areas of hyper- to holoendemicity should be understood with a view toward duplicating such protection in infants and young children in areas of endemicity.
BACKGROUND TO THE DEBATE: Current guidelines recommend that all fever episodes in African children be treated presumptively with antimalarial drugs. But declining malarial transmission in parts of sub-Saharan Africa, declining proportions of fevers due to malaria, and the availability of rapid diagnostic tests mean it may be time for this policy to change. This debate examines whether enough evidence exists to support abandoning presumptive treatment and whether African health systems have the capacity to support a shift toward laboratory-confirmed rather than presumptive diagnosis and treatment of malaria in children under five.
ANNALS OF NUTRITION AND METABOLISM, 55 pp. 215 (2009). MUAC and BMI: usefulness in pregnancy among refugee and migrant populations along the Thai-Myanmar border.
ANNALS OF NUTRITION AND METABOLISM, 55 pp. 299 (2009). Growth velocity of term infants born in Maela camp for displaced populations along the Thai-Myanmar border.
ANNALS OF NUTRITION AND METABOLISM, 55 pp. 440-441 (2009). Iron, thiamine, and micronutrient status in pregnant and breast-feeding women: assessment of food rations and supplementation in Maela refugee camp, northern Thailand.
BACKGROUND: In 2006, the Médecins sans Frontières nutritional program in the region of Maradi (Niger) included 68,001 children 6-59 months of age with either moderate or severe malnutrition, according to the NCHS reference (weight-for-height <80% of the NCHS median, and/or mid-upper arm circumference <110 mm for children taller than 65 cm, and/or presence of bipedal edema). Our objective was to identify baseline risk factors for death among children diagnosed with severe malnutrition using the newly introduced WHO growth standards. As the release of the WHO growth standards changed the definition of severe malnutrition, which now includes many children formerly identified as moderately malnourished under the NCHS reference, studying this new category of children is crucial. METHODOLOGY: Program monitoring data were collected from the medical records of all children admitted to the program. Data included age, sex, height, weight, MUAC, clinical signs on admission including edema, and type of discharge (recovery, death, and default/loss to follow-up). Additional data included results of a rapid diagnostic test for malaria due to Plasmodium falciparum (Paracheck) and whether the child was a resident of the region of Maradi or came from bordering Nigeria to seek treatment. Multivariate logistic regression was performed on a subset of 27,687 children meeting the new WHO growth standards criteria for severe malnutrition (weight-for-height <-3 Z score, mid-upper arm circumference <110 mm for children taller than 65 cm, or presence of bipedal edema). We explored two different models: one with only basic anthropometric data and a second model that included perfunctory clinical signs. PRINCIPAL FINDINGS: In the first model, including only weight, height, sex and presence of edema, the risk factors retained were the weight/height^1.84 ratio (OR: 5,774; 95% CI: [2,284; 14,594]) and presence of edema (7.51 [5.12; 11.0]).
A second model, taking into account supplementary data from perfunctory clinical examination, identified other risk factors for death: apathy (9.71 [6.92; 13.6]), pallor (2.25 [1.25; 4.05]), anorexia (1.89 [1.35; 2.66]), fever>38.5 degrees C (1.83 [1.25; 2.69]), and age below 1 year (1.42 [1.01; 1.99]). CONCLUSIONS: Although clinicians will continue to perform screening using clinical signs and anthropometry, these risk indicators may provide additional criteria for the assessment of absolute and relative risk of death. Better appraisal of the child's risk of death may help orientate the child towards either hospitalization or ambulatory care. As the transition from the NCHS growth reference to the WHO standards will increase the number of children classified as severely malnourished, further studies should explore means to identify children at highest risk of death within this group using simple and standardized indicators.
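The risk factors above are reported as odds ratios from multivariate logistic regression. The basic arithmetic behind any single odds ratio and its Wald 95% confidence interval can be sketched from a 2x2 table; the sketch below uses invented counts for illustration, not the study data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed deaths, b = exposed survivors,
    c = unexposed deaths, d = unexposed survivors."""
    or_ = (a * d) / (b * c)
    # standard error of log(OR) is sqrt of summed reciprocal cell counts
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts (e.g. deaths by presence of a clinical sign):
or_, lo, hi = odds_ratio_ci(30, 170, 20, 380)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A full multivariate model additionally adjusts each OR for the other covariates, which this univariate sketch does not attempt.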
BACKGROUND: Until the 1970s, the prevalence of non-venereal treponematoses, including yaws, was greatly reduced after worldwide mass treatment. In 2005, cases were again reported in the Democratic Republic of the Congo. We carried out a survey to estimate the village-level prevalence of yaws in the region of Equator in the north of the country in order to define appropriate strategies to effectively treat the affected population. METHODOLOGY/PRINCIPAL FINDINGS: We designed a community-based survey using the Lot Quality Assurance Sampling method to classify the prevalence of active yaws in 14 groups of villages (lots). The classification into high, moderate, or low yaws prevalence corresponded to World Health Organization prevalence thresholds for identifying appropriate operational treatment strategies. Active yaws cases were defined by suggestive clinical signs and positive rapid plasma reagin and Treponema pallidum hemagglutination serological tests. The overall prevalence in the study area was 4.7% (95% confidence interval: 3.4-6.0). Two of 14 lots had high prevalence (>10%), three moderate prevalence (5-10%) and nine low prevalence (<5%). CONCLUSIONS/SIGNIFICANCE: Although yaws is no longer a World Health Organization priority disease, the presence of yaws in a region where it was supposed to have been eradicated demonstrates the importance of continued surveillance and control efforts. Yaws should remain a public health priority in countries where it was previously known to be endemic. The integration of sensitive surveillance systems together with free access to effective treatment is recommended. As a consequence of our study results, more than 16,000 people received free treatment against yaws.
Although Staphylococcus aureus is a bacterial species of medical significance, only approximately 30% of all humans carry staphylococcal cells persistently but asymptomatically in their nasopharynx and/or other body sites. This goes largely unnoticed by the host, which shows that in the natural situation the human ecosystem is hospitable, or at least receptive, to the bacteria and that a process of co-evolution has led to a state of mutual acceptance or tolerance. However, upon disturbance of this balanced, neutral state, localized or disseminated invasive infection can occur. Unfortunately, the events leading to infection are still largely unknown, and especially the causal events leading to the transition from colonisation to infection are ill-defined in vivo. Whether certain genotypes of S. aureus are more prone to colonise and/or infect humans is still heavily debated. The genetic population structure of S. aureus has been largely solved using a number of different DNA polymorphism-interrogating laboratory methods. However, even this major effort has not (yet) revealed major clues with respect to the colonisation and infection potency of the clonal lineages thus identified, except for the fact that certain lineages are highly epidemic. The overall picture is that in principle all S. aureus strains can become invasive given the proper circumstances. What these, primarily host-defined, circumstances are is still enigmatic. However, a large variety of staphylococcal virulence and colonisation factors have been identified, as well as a number of host colonisation and infection susceptibility traits. How these are specifically involved in colonisation and infection has been experimentally substantiated in only a limited number of cases. The present review explores the relevance of these and other factors, for instance environmental ones, that define the colonisation or infection state in humans. Once the nature of these states is known in more detail, this knowledge could be used to design novel, knowledge-driven means of preventing colonisation from proceeding to S. aureus infection.
BACKGROUND: Lassa fever is a viral hemorrhagic fever endemic in West Africa. The reservoir host of the virus is a multimammate rat, Mastomys natalensis. Prevalence estimates of Lassa virus antibodies in humans vary greatly between studies, and the main modes of transmission of the virus from rodents to humans remain unclear. We aimed to (i) estimate the prevalence of Lassa virus-specific IgG antibodies (LV IgG) in the human population of a rural area of Guinea, and (ii) identify risk factors for positive LV IgG. METHODS AND FINDINGS: A population-based cross-sectional study design was used. In April 2000, all individuals one year of age and older living in three prefectures located in the tropical secondary forest area of Guinea (Gueckedou, Lola and Yomou) were sampled using two-stage cluster sampling. For each individual identified by the sampling procedure and who agreed to participate, a standardized questionnaire was completed to collect data on personal exposure to potential risk factors for Lassa fever (mainly contact with rodents), and a blood sample was tested for LV IgG. A multiple logistic regression model was used to determine risk factors for positive LV IgG. A total of 1424 subjects were interviewed and 977 sera were tested. Prevalence of positive LV IgG was 12.9% [10.8%-15.0%] and 10.0% [8.1%-11.9%] in rural and urban areas, respectively. Two risk factors for positive LV IgG were identified: having undergone an injection in the past twelve months (odds ratio [OR] = 1.8 [1.1-3.1]) and having lived with someone displaying a haemorrhage (OR = 1.7 [1.1-2.9]). No factors related to contact with rats and/or mice remained statistically significant in the multivariate analysis. CONCLUSIONS: Our study underlines the potential importance of person-to-person transmission of Lassa fever, via close contact in the same household or nosocomial exposure.
The Pastorex® (Bio-Rad) rapid agglutination test is one of the main rapid diagnostic tests (RDTs) for meningococcal disease currently in use in the "meningitis belt". Earlier evaluations, performed after heating and centrifugation of cerebrospinal fluid (CSF) samples under good laboratory conditions, showed high sensitivity and specificity. However, during an epidemic, the test may be used without prior sample preparation. Recently a new, easy-to-use dipstick RDT for meningococcal disease detection on CSF was developed by the Centre de Recherche Médicale et Sanitaire in Niger and the Pasteur Institute in France. We estimated diagnostic accuracy in the field during the 2006 outbreak of Neisseria meningitidis serogroup A in Maradi, Niger, for the dipstick RDT and Pastorex® on unprepared CSF, (a) by comparing each test's sensitivity and specificity with previously reported values and (b) by comparing results for each test on paired samples, using McNemar's test. We also (c) estimated diagnostic accuracy of the dipstick RDT on diluted whole blood. We tested unprepared CSF and diluted whole blood from 126 patients with suspected meningococcal disease presenting at four health posts. (a) Pastorex® sensitivity (69%; 95%CI 57-79) was significantly lower than found previously for prepared CSF samples [87% (81-91) or 88% (85-91)], as was specificity [81% (95%CI 68-91) vs 93% (90-95) or 93% (87-96)]. Sensitivity of the dipstick RDT [89% (95%CI 80-95)] was similar to previously reported values for ideal laboratory conditions [89% (84-93) and 94% (90-96)]. Specificity, at 62% (95%CI 48-75), was significantly lower than found previously [94% (92-96) and 97% (94-99)]. (b) McNemar's test for the dipstick RDT vs Pastorex® was statistically significant (p<0.001). (c) The dipstick RDT did not perform satisfactorily on diluted whole blood (sensitivity 73%; specificity 57%). Sensitivity and specificity of Pastorex® without prior CSF preparation were poorer than previously reported results from prepared samples; we therefore caution against using this test during an epidemic if sample preparation is not possible. For the dipstick RDT, sensitivity was similar to, while specificity was not as high as, previously reported during a more stable context. Further studies are needed to evaluate its field performance, especially in different populations and for other serogroups.
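McNemar's test, used above to compare the paired dipstick and Pastorex results, depends only on the two discordant-pair counts (samples positive on one test but not the other). A minimal sketch of the chi-square version with continuity correction, using invented counts rather than the study data:

```python
def mcnemar_chi2(b, c):
    """McNemar's chi-square statistic with continuity correction for
    paired binary tests; b = pairs positive on test 1 only,
    c = pairs positive on test 2 only. One degree of freedom:
    chi2 > 3.84 gives p < 0.05, chi2 > 10.83 gives p < 0.001."""
    return (abs(b - c) - 1) ** 2 / (b + c)

# Invented discordant-pair counts for illustration:
print(mcnemar_chi2(25, 5))
```

With small discordant counts an exact binomial version is usually preferred over the chi-square approximation.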
BACKGROUND: Shigella dysenteriae type 1 (Sd1) is a cause of major dysentery outbreaks, particularly among children and displaced populations in tropical countries. Although outbreaks continue, the characteristics of such outbreaks have rarely been documented. Here, we describe the Sd1 outbreaks occurring between 1993 and 1995 in 11 refugee settlements in Rwanda, Tanzania and the Democratic Republic of the Congo (DRC). We also explored the links between the different types of camps and the magnitude of the outbreaks. METHODOLOGY/PRINCIPAL FINDINGS: Numbers of cases of bloody diarrhea and deaths were collected on a weekly basis in 11 refugee camps and analyzed retrospectively. Between November 1993 and February 1995, 181,921 cases of bloody diarrhea were reported. Attack rates ranged from 6.3% to 39.1% and case fatality ratios (CFRs) from 1.5% to 9.0% (available for 5 camps). The CFRs were higher in children under age 5. In Tanzania, where the response was rapidly deployed, the mean attack rate was lower than in camps in the region of Goma without an immediate response (13.3% versus 32.1%, respectively). CONCLUSIONS/SIGNIFICANCE: This description, and the areas where data are missing, highlight the importance of collecting data in future epidemics, the difficulties in documenting outbreaks occurring in complex emergencies and, most importantly, the need to ensure that minimal requirements are met.
Background: The fixed-dose antimalarial combination of dihydroartemisinin-piperaquine (DP) is a promising new artemisinin-based combination therapy (ACT). We present an individual patient data analysis of efficacy and tolerability in acute uncomplicated falciparum malaria, from seven published randomized clinical trials conducted in Africa and South East Asia using a predefined in-vivo protocol. Comparator drugs were mefloquine-artesunate (MAS3) in Thailand, Myanmar, Laos and Cambodia; artemether-lumefantrine in Uganda; and amodiaquine+sulfadoxine-pyrimethamine and artesunate+amodiaquine in Rwanda. Methods and Findings: In total 3,547 patients were enrolled: 1,814 patients (32% children under five years) received DP and 1,733 received a comparator antimalarial at 12 different sites and were followed for 28-63 days. There was no significant heterogeneity between trials. DP was well tolerated, with 1.7% early vomiting. There were fewer adverse events with DP in children and adults compared to MAS3, except for diarrhea; ORs (95%CI) 2.74 (2.13 to 3.51) and 3.11 (2.31 to 4.18), respectively. DP treatment resulted in a rapid clearance of fever and parasitaemia. The PCR genotype-corrected efficacy at Day 28 of DP assessed by survival analysis was 98.7% (95%CI 97.6-99.8). DP was superior to the comparator drugs in protecting against both P. falciparum recurrence and recrudescence (P = 0.001, weighted by site). There was no difference between DP and MAS3 in treating P. vivax co-infections and in suppressing the first relapse (median interval to P. vivax recurrence: 6 weeks). Children under 5 years were at higher risk of recurrence for both infections. The proportion of patients developing gametocytaemia (P = 0.002, weighted by site) and the subsequent gametocyte carriage rates were higher with DP (11/1000 person-gametocyte-weeks, PGW) than MAS3 (6/1000 PGW; P = 0.001, weighted by site).
Conclusions: DP proved a safe, well-tolerated, and highly effective treatment of P. falciparum malaria in Asia and Africa, but the effect on gametocyte carriage was inferior to that of MAS3. © 2009 Zwang et al.
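The Day-28 efficacy above was assessed by survival analysis rather than as a crude cure proportion, so patients lost before Day 28 are censored instead of being counted as failures or excluded. A minimal Kaplan-Meier sketch with invented follow-up data (not the trial data):

```python
def kaplan_meier(events):
    """Kaplan-Meier survival estimate at the end of follow-up.
    events: (day, failed) pairs; failed=True is a treatment failure
    (e.g. PCR-confirmed recrudescence), failed=False is censoring
    (loss to follow-up, or completing follow-up cured)."""
    at_risk = len(events)
    surv = 1.0
    # by convention, process failures before censorings at tied times
    for day, failed in sorted(events, key=lambda e: (e[0], not e[1])):
        if failed:
            surv *= (at_risk - 1) / at_risk
        at_risk -= 1
    return surv

# 100 invented patients: 5 lost at day 7, failures at days 21 and 28,
# 93 completing follow-up cured at day 28
events = [(7, False)] * 5 + [(21, True), (28, True)] + [(28, False)] * 93
print(round(kaplan_meier(events) * 100, 1))
```

Note how the five censored patients shrink the risk set without counting as failures; a naive "cures / enrolled" proportion would understate efficacy.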
New England Journal of Medicine, 361 (18), pp. 1808-1809. | Citations: 2 (Scopus). 2009. Peginterferon Alfa-2b or Alfa-2a with Ribavirin for Hepatitis C
BACKGROUND: Audio computer-assisted self-interview (ACASI) may elicit more frequent reporting of socially sensitive behaviours than face-to-face (FtF) interview. However, no study has compared responses to both methods in female and male sex workers (FSW; MSW) in Africa. METHODOLOGY/PRINCIPAL FINDINGS: We sequentially enrolled adults recruited for an HIV-1 intervention trial into a comparative study of ACASI and FtF interview, in a clinic near Mombasa, Kenya. Feasibility and acceptability of ACASI, and a comparative analysis of enrolment responses between ACASI and FtF on an identical risk assessment questionnaire, were evaluated. In total, 139 women and 259 men, 81% of eligible cohort participants, completed both interviews. ACASI captured a higher median number of regular (2 vs. 1, p<0.001, both genders) and casual partners in the last week (3 vs. 2, p = 0.04 in women; 2 vs. 1, p<0.001 in men). Group sex (21.6 vs. 13.5%, p<0.001, in men), intravenous drug use (IDU; 10.8 vs. 2.3%, p<0.001 in men; 4.4 vs. 0%, p = 0.03 in women), and rape (8.9 vs. 3.9%, p = 0.002, in men) were reported more frequently in ACASI. A surprisingly high number of women reported in ACASI that they had paid for sex (49.3 vs. 5.8%, p<0.001). Behaviours for recruitment (i.e. anal sex, sex work, sex between males) were reported less frequently in ACASI. The majority of women (79.2%) and men (69.7%) felt that answers given in ACASI were more honest. Volunteers who were not able to take ACASI (84 men and 37 women) mostly lacked reading skills. CONCLUSIONS/SIGNIFICANCE: About 1 in 5 cohort participants was not able to complete ACASI, mostly for lack of reading skills. Participants who completed ACASI were more likely to report IDU, rape, group sex, and payment for sex by women than when asked in FtF interview. ACASI appears to be a useful tool for high-risk behaviour assessments in the African context.
BACKGROUND: Clinical laboratory reference intervals have not been established in many African countries, and non-local intervals are commonly used in clinical trials to screen and monitor adverse events (AEs) among African participants. Using laboratory reference intervals derived from other populations excludes potential trial volunteers in Africa and makes AE assessment challenging. The objective of this study was to establish clinical laboratory reference intervals for 25 hematology, immunology and biochemistry values among healthy African adults typical of those who might join a clinical trial. METHODS AND FINDINGS: Equal proportions of men and women were invited to participate in a cross sectional study at seven clinical centers (Kigali, Rwanda; Masaka and Entebbe, Uganda; two in Nairobi and one in Kilifi, Kenya; and Lusaka, Zambia). All laboratories used hematology, immunology and biochemistry analyzers validated by an independent clinical laboratory. Clinical and Laboratory Standards Institute guidelines were followed to create study consensus intervals. For comparison, AE grading criteria published by the U.S. National Institute of Allergy and Infectious Diseases Division of AIDS (DAIDS) and other U.S. reference intervals were used. 2,990 potential volunteers were screened, and 2,105 (1,083 men and 1,022 women) were included in the analysis. While some significant gender and regional differences were observed, creating consensus African study intervals from the complete data was possible for 18 of the 25 analytes. Compared to reference intervals from the U.S., we found lower hematocrit and hemoglobin levels, particularly among women, lower white blood cell and neutrophil counts, and lower amylase. Both genders had elevated eosinophil counts, immunoglobulin G, total and direct bilirubin, lactate dehydrogenase and creatine phosphokinase, the latter being more pronounced among women. When graded against U.S.-derived DAIDS AE grading criteria, we observed 774 (35.3%) volunteers with grade one or higher results; 314 (14.9%) had elevated total bilirubin, and 201 (9.6%) had low neutrophil counts. These otherwise healthy volunteers would be excluded or would require special exemption to participate in many clinical trials. CONCLUSIONS: To accelerate clinical trials in Africa, and to improve their scientific validity, locally appropriate reference ranges should be used. This study provides ranges that will inform inclusion criteria and evaluation of adverse events for studies in these regions of Africa.
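The Clinical and Laboratory Standards Institute approach cited above estimates a reference interval nonparametrically as the central 95% of results from healthy subjects. A simplified rank-based sketch (the toy haemoglobin data and parameters are invented; the full CLSI method adds requirements such as minimum sample sizes, outlier screening, and partitioning by sex):

```python
import random

def reference_interval(values, lower_p=0.025, upper_p=0.975):
    """Nonparametric reference interval: rank-based percentile
    estimates using rank = p * (n + 1), clamped to the sample."""
    v = sorted(values)
    n = len(v)

    def pct(p):
        rank = min(max(round(p * (n + 1)), 1), n)
        return v[rank - 1]  # ranks are 1-based

    return pct(lower_p), pct(upper_p)

# Toy data: 200 simulated haemoglobin results (g/dL) from healthy adults
random.seed(1)
sample = [random.gauss(13.5, 1.2) for _ in range(200)]
low, high = reference_interval(sample)
print(round(low, 1), round(high, 1))
```

A trial screening rule would then flag a volunteer's result as out of range (or grade it against AE criteria) when it falls outside the locally derived interval.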
BACKGROUND: Adjunctive dexamethasone reduces mortality from tuberculous meningitis (TBM), but how it produces this effect is not known. Matrix metalloproteinases (MMPs) are important in the immunopathology of many inflammatory CNS diseases; thus we hypothesized that their secretion is important in TBM and might be influenced by dexamethasone. METHODOLOGY/PRINCIPAL FINDINGS: The kinetics of cerebrospinal fluid (CSF) MMP and tissue inhibitor of MMP (TIMP) concentrations were studied in a subset of HIV-uninfected adults (n = 37) with TBM recruited to a randomized, placebo-controlled trial of adjuvant dexamethasone. Analysis followed a pre-defined plan. Dexamethasone significantly reduced CSF MMP-9 concentrations in early follow-up samples (median 5 days (range 3-8) of treatment), but had no significant influence on other MMPs/TIMPs. Additionally, CSF MMP-9 concentration was strongly correlated with concomitant CSF neutrophil count. CONCLUSIONS/SIGNIFICANCE: Dexamethasone decreased CSF MMP-9 concentrations early in treatment, and this may represent one mechanism by which corticosteroids improve outcome in TBM. The strong correlation between CSF MMP-9 and neutrophil count suggests that polymorphonuclear leukocytes may play a central role in the early pathogenesis of TBM.
Understanding host genetic susceptibility to typhoid fever may provide a better understanding of pathogenesis and help in the development of new therapeutics and vaccines. Here we determine the genetic variation within the human TLR4 gene, encoding the principal receptor for bacterial endotoxin recognition, in typhoid fever patients. It is possible that genetic variants of TLR4 could detrimentally affect the innate immune response against S. typhi infection. Mutation detection and genotyping of TLR4 were performed on DNA from 414 Vietnamese typhoid fever patients and 372 population controls. dHPLC detected a total of 10 polymorphisms within the upstream and exonic regions of TLR4, of which 7 are novel. Two SNPs, T4025A and C4215G, were more frequent in typhoid cases than in controls; however, due to their low allele frequencies they showed borderline significance (T4025A: OR 1.9, 95% CI 0.9-4.3, P = 0.07 and C4215G: OR 6.7, 95% CI 0.8-307, P = 0.04). Six missense mutations were identified, with 5/6 positioned in the ectoplasmic domain. Four missense mutations and one promoter SNP (A-271G) were present only in typhoid cases, albeit at low allele frequencies. Here we determined the extent of genetic variation within TLR4 in a Vietnamese population and suggest that TLR4 may be involved in defense against typhoid fever in this population. © 2009 Hue et al.
médecine/sciences, 25 (10), pp. 867-869. | Citations: 2 (Scopus). 2009. Paludisme et grossesse : un dilemme thérapeutique [Malaria and pregnancy: a therapeutic dilemma]
European Journal of Clinical Pharmacology, 65 (8), pp. 847. 2009. Erratum: The pharmacokinetics of artemether and lumefantrine in pregnant women with uncomplicated falciparum malaria (European Journal of Clinical Pharmacology (2006) 62, 1021-1031, DOI: 10.1007/s00228-006-0199-7)
MODS is a novel liquid culture-based technique that has been shown to be effective and rapid for early diagnosis of tuberculosis (TB). We evaluated the MODS assay for diagnosis of TB in children in Viet Nam. 217 consecutive samples, including sputum (n=132), gastric fluid (n=50), CSF (n=32) and pleural fluid (n=3), collected from 96 children with suspected TB, were tested by smear, MODS and MGIT. When test results were aggregated by patient, the sensitivity and specificity of smear, MGIT and MODS against "clinical diagnosis" (confirmed and probable groups) as the gold standard were 28.2% and 100%, 42.3% and 100%, and 39.7% and 94.4%, respectively. The sensitivity of MGIT and MODS was not significantly different in this analysis (P = 0.5), but MGIT was more sensitive than MODS when analysed at the sample level using a marginal model (P = 0.03). The median times to detection for MODS and MGIT were 8 days and 13 days, respectively, and the time to detection was significantly shorter for MODS in samples where both tests were positive (P<0.001). An analysis of time-dependent sensitivity showed that the detection rates were significantly higher for MODS than for MGIT by day 7 and by day 14 (P<0.001 and P = 0.04, respectively). MODS is a rapid and sensitive alternative method for the isolation of M. tuberculosis from children. © 2009 Ha et al.
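Sensitivity and specificity as reported above reduce to simple ratios from a 2x2 table of test calls against the gold standard. A minimal sketch; the counts below are illustrative, chosen only to reproduce the patient-level MODS percentages quoted (the actual patient totals behind those figures are not given in the abstract):

```python
def sens_spec(results):
    """Sensitivity and specificity of a test against a gold standard.
    results: iterable of (test_positive, truly_positive) pairs."""
    tp = sum(t and g for t, g in results)          # true positives
    fn = sum((not t) and g for t, g in results)    # false negatives
    tn = sum((not t) and (not g) for t, g in results)  # true negatives
    fp = sum(t and (not g) for t, g in results)    # false positives
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative patient-level calls (assumed counts, not the study data):
data = ([(True, True)] * 31 + [(False, True)] * 47 +
        [(False, False)] * 17 + [(True, False)] * 1)
sens, spec = sens_spec(data)
print(round(sens * 100, 1), round(spec * 100, 1))
```

The same arithmetic applies per sample or per patient; the abstract's point is that the two aggregation levels can give different comparisons between tests.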
Travel Med Infect Dis, 7 (4), pp. 179-180. | Citations: 4 (Scopus). 2009. Travel medicine has come of age and a new examination marks the 21st birthday.
European Journal of Clinical Nutrition, 63 (3), pp. 450. 2009. Erratum: Postpartum traditions and nutrition practices among urban Lao women and their infants in Vientiane, Lao PDR (European Journal of Clinical Nutrition (2009) 63, 323-331, 10.1038/sj.ejcn.1602928)
Journal of Infection, 59 (1), pp. 73-73. 2009. Corrigendum to "UK malaria treatment guidelines" [Journal of Infection 54 (2007) 111-121]
BACKGROUND: Chromobacterium violaceum is a Gram-negative facultative anaerobic bacillus, found in soil and stagnant water, that usually has a violet pigmented appearance on agar culture. It is rarely described as a human pathogen, mostly from tropical and subtropical areas. CASE PRESENTATION: A 53-year-old farmer died with Chromobacterium violaceum septicemia in Laos. A modified oxidase method was used to demonstrate that this violaceous organism was oxidase positive. Forensic analysis of the glucose-6-phosphate dehydrogenase genotypes of his family suggests that the deceased patient did not have this possible predisposing condition. CONCLUSION: C. violaceum infection should be included in the differential diagnosis in patients presenting with community-acquired septicaemia in tropical and subtropical areas. The apparently neglected but simple modified oxidase test may be useful in the oxidase assessment of other violet-pigmented organisms or of those growing on violet coloured agar.
Development and optimization of first generation malaria vaccine candidates has been facilitated by the existence of a well-established Plasmodium falciparum clinical challenge model in which infectious sporozoites are administered to human subjects via mosquito bite. While ideal for testing pre-erythrocytic stage vaccines, some researchers believe that the sporozoite challenge model is less appropriate for testing blood stage vaccines. Here we report a consultation, co-sponsored by PATH MVI, USAID, EMVI and WHO, where scientists from all institutions globally that have conducted such clinical challenges in recent years and representatives from regulatory agencies and funding agencies met to discuss clinical malaria challenge models. Participants discussed strengthening and harmonizing the sporozoite challenge model and considered the pros and cons of further developing a blood stage challenge possibly better suited for evaluating the efficacy of blood stage vaccines. This report summarizes major findings and recommendations, including an update on the Plasmodium vivax clinical challenge model, the prospects for performing experimental challenge trials in malaria endemic countries and an update on clinical safety data. While the focus of the meeting was on the optimization of clinical challenge models for evaluation of blood stage candidate malaria vaccines, many of the considerations are relevant for the application of challenge trials to other purposes.
BACKGROUND: Intervention coverage and funding for the control of malaria in Africa have increased in recent years; however, there are few descriptions of the changing disease burden, and the few reports available are from isolated, single-site observations or are reported at country level. Here we present a nationwide assessment of changes over 10 years in paediatric malaria hospitalization across Kenya. METHODS: Paediatric admission data on malaria and non-malaria diagnoses were assembled for the period 1999 to 2008 from in-patient registers at 17 district hospitals in Kenya, representing the diverse malaria ecology of the country. These data were then analysed using autoregressive moving average time series models, with malaria and all-cause admissions as the main outcomes, adjusted for rainfall, changes in service use and populations-at-risk within each hospital's catchment, to establish whether there had been a statistically significant decline in paediatric malaria hospitalization during the observation period. RESULTS: Among the 17 hospital sites, adjusted paediatric malaria admissions had declined significantly at 10 hospitals over the 10 years since 1999, had increased significantly at four hospitals, and remained unchanged at three hospitals. The overall estimated average reduction in malaria admission rates was 0.0063 cases per 1,000 children aged 0 to 14 years per month, representing an average percentage reduction of 49% across the 10 hospitals registering a significant decline by the end of 2008. Paediatric admissions for all causes had declined significantly, with a reduction in admission rates of greater than 0.0050 cases per 1,000 children aged 0 to 14 years per month, at 6 of 17 hospitals. Where malaria admissions had increased, three of the four sites were located in western Kenya close to Lake Victoria. Conversely, the areas with the largest declines in malaria admission rates were located along the Kenyan coast and at some sites in the highlands of Kenya. CONCLUSION: A country-wide assessment of trends in malaria hospitalizations indicates that all is not equal: important variations exist in the temporal pattern of malaria admissions between sites, and these differences require more detailed investigation to understand what is required to promote a clinical transition across Africa.
BACKGROUND: Genotype 2/3 hepatitis C virus (HCV) has a good response to treatment with peginterferon and ribavirin. Patients with psychiatric disorders and injecting drug users (IDUs) are considered 'difficult to treat' and are often excluded from treatment despite the lack of evidence supporting this decision. AIMS: To investigate the outcome and factors associated with treatment failure in these groups. METHODS: This is an observational study of a cohort of patients infected by genotype 2/3 HCV. IDUs and patients with psychiatric diseases were not excluded from treatment. We performed an intention-to-treat analysis to evaluate factors related to treatment failure. RESULTS: A sustained virological response (SVR) was achieved in 91 of the 125 patients treated (72.8%). Patients with chronic psychotic disorders or former IDUs had SVR rates similar to other groups. After multivariate analysis, independent factors associated with treatment failure were liver cirrhosis [odds ratio (OR) 3.4, 95% confidence interval (CI) 1.1-10.4], a history of depression and not being on antidepressants at the commencement of HCV treatment (OR 4.4, 95% CI 1.2-16) and active IDUs (OR 7.3, 95% CI 1.77-30.4). CONCLUSIONS: Patients with a history of depression who were not receiving antidepressants and active IDUs are more likely to fail treatment for genotype 2/3 HCV and will need additional support.
Digestive Diseases and Sciences, pp. 1-5. | Citations: 1 (Scopus). 2009. Reduction in Neutrophil Count During Hepatitis C Treatment: Drug Toxicity or Predictor of Good Response?
BACKGROUND: The evidence base for forensic mental health (FMH) services has been developing since the late 1990s. Are outcome measures sound enough for the evaluation tasks? AIMS: To identify, from published literature, outcome measures used in FMH research and, where feasible, assess their quality. METHOD: A structured review was undertaken of trials and intervention studies published between 1990 and 2006. Details of outcome variables and measures were abstracted. Evidence regarding the most frequently occurring outcome measures was assessed. RESULTS: Four hundred and fifty different instruments were used to assess outcomes, incorporating 1038 distinct variables. Very little evidence could be found to support the measurement properties of commonly used instruments. CONCLUSIONS AND IMPLICATIONS FOR PRACTICE: There is little consistency in the use of outcome measures in FMH research. Effort is required to reach consensus on validated outcome measures in this field in order to better inform practice.
BACKGROUND: Little is known about how primary care physicians (PCPs) in Asia diagnose and manage prostatitis-like symptoms. This study investigated the clinical diagnosis of and care provided for prostatitis-like symptoms by PCPs in a Malaysian population, and compared these findings to reports from other areas. METHODS: All members of the Penang Private Medical Practitioners' Society were asked to complete a self-administered survey. Nonresponders were contacted after 3 weeks and received a telephone request after 6 weeks. RESULTS: Of the 786 practitioners contacted, 669 considered themselves to be PCPs, including 279 (42%) who responded to the survey. Adult males with prostatitis-like symptoms typically constitute <1% of the patients seen by PCPs. Most PCPs (72%) believe that prostatitis-like symptoms are caused by bacterial infection. 61% of PCPs base their diagnosis of prostatitis-like symptoms on clinical history, a physical examination and dipstick urinalysis. Standard management was to prescribe 1 or 2 courses of antimicrobials. CONCLUSIONS: Despite the 8.7% prevalence found in a previous survey in this population, prostatitis remains underdiagnosed in Malaysia. In contrast to many other clinical settings, urologists in Malaysia see a large proportion of newly diagnosed and treatment-naive prostatitis patients, providing an opportunity for clinical diagnostic and treatment studies.
Cochrane Database of Systematic Reviews, (4). 2009. Antiviral prophylaxis for varicella zoster in immunocompromised patients (excluding haematological malignancies)
Background. Surveillance for invasive pneumococcal disease has been conducted using a variety of case ascertainment methods and diagnostic tools. Interstudy differences in observed rates of invasive pneumococcal disease could reflect variations in surveillance methods or true epidemiological differences in disease incidence. To facilitate comparisons of surveillance data among countries, investigators of Pneumococcal Vaccines Accelerated Development and Introduction Plan-sponsored projects have developed standard case definitions and data reporting methods. Methods. Investigators developed case definitions for meningitis, pneumonia, and very severe disease using existing World Health Organization guidelines and clinical definitions from Africa and Asia. Standardized case definitions were used to standardize reporting of aggregated results. Univariate analyses were conducted to compare results among countries and to identify factors contributing to detection of Streptococcus pneumoniae. Results. Surveillance sites varied with regard to the age groups targeted, disease syndromes monitored, specimens collected, and laboratory methods employed. The proportion of specimens positive for pneumococcus was greater for cerebrospinal fluid specimens (1.2%-19.4%) than for blood specimens (0.1%-1.4%) in all countries (range, 1.3- to 38-fold greater). The distribution of disease syndromes and pneumonia severity captured by surveillance differed among countries. The proportion of disease cases with pneumococcus detected varied by syndrome (meningitis, 1.4%-10.8%; pneumonia, 0.2%-1.3%; other, 0.2%-1.2%) and illness severity (nonsevere pneumonia, 0%-2.7%; severe pneumonia, 0.2%-1.2%), although these variations were not consistent for all sites. Antigen testing and polymerase chain reaction increased the proportion of cerebrospinal fluid specimens with pneumococcus identified by 1.3- to 5.5-fold, compared with culture alone. Conclusions. 
Standardized case definitions and data reporting enhanced our understanding of pneumococcal epidemiology and enabled us to assess the contributions of specimen type, disease syndrome, pneumonia severity, and diagnostic tools to rate of pneumococcal detection. Broader standardization and more-detailed data reporting would further improve interpretation of surveillance results. © 2009 by the Infectious Diseases Society of America. All rights reserved.
OBJECTIVES: Primary care facilities are increasingly becoming the focal point for distribution of malaria intervention strategies, but physical access to these facilities may limit the extent to which communities can be reached. We aimed to investigate the impact of travel time to primary care on the incidence of hospitalized malaria episodes in a rural district in Kenya. METHODS: The incidence of hospitalized malaria in a population under continuous demographic surveillance was recorded over 3 years. The time to travel to the nearest primary health care facility was calculated for every child between birth and 5 years of age, and trends in the incidence of hospitalized malaria as a function of travel time were evaluated. RESULTS: The incidence of hospitalized malaria more than doubled as travel time to the nearest primary care facility increased from 10 min to 2 h. Good access to primary health facilities may reduce the burden of disease by as much as 66%. CONCLUSIONS: Our results highlight the potential of the primary health care system both to reach those most at risk and to reduce the disease burden. Insufficient access is an important risk factor, one that may be inequitably distributed to the poorest households.
BACKGROUND: Streptococcus suis is a zoonotic pathogen that infects pigs and can occasionally cause serious infections in humans. S. suis infections occur sporadically in humans in Europe and North America, but a recent major outbreak has been described in China with high levels of mortality. The mechanisms of S. suis pathogenesis in humans and pigs are poorly understood. METHODOLOGY/PRINCIPAL FINDINGS: The sequencing of whole genomes of S. suis isolates provides opportunities to investigate the genetic basis of infection. Here we describe whole genome sequences of three S. suis strains from the same lineage: one from European pigs, and two from human cases from China and Vietnam. Comparative genomic analysis was used to investigate the variability of these strains. S. suis is phylogenetically distinct from other Streptococcus species for which genome sequences are currently available. Accordingly, approximately 40% of the approximately 2 Mb genome is unique in comparison to other Streptococcus species. Finer genomic comparisons within the species showed a high level of sequence conservation; virtually all of the genome is common to the S. suis strains. The only exceptions are three approximately 90 kb regions, present in the two isolates from humans, composed of integrative conjugative elements and transposons. Carried in these regions are coding sequences associated with drug resistance. In addition, small-scale sequence variation has generated pseudogenes in putative virulence and colonization factors. CONCLUSIONS/SIGNIFICANCE: The genomic inventories of genetically related S. suis strains, isolated from distinct hosts and diseases, exhibit high levels of conservation. However, the genomes provide evidence that horizontal gene transfer has contributed to the evolution of drug resistance.
BACKGROUND: Endemic human pathogens are subject to strong immune selection, and interrogation of pathogen genome variation for signatures of balancing selection can identify important target antigens. Several major antigen genes in the malaria parasite Plasmodium falciparum have shown such signatures in polymorphism-versus-divergence indices (comparing with the chimpanzee parasite P. reichenowi), and in allele frequency based indices. METHODOLOGY/PRINCIPAL FINDINGS: To compare methods for prospective identification of genes under balancing selection, 26 additional genes known or predicted to encode surface-exposed proteins of the invasive blood stage merozoite were first sequenced from a panel of 14 independent P. falciparum cultured lines and P. reichenowi. Six genes at the positive extremes of one or both of the Hudson-Kreitman-Aguade (HKA) and McDonald-Kreitman (MK) indices were identified. Allele frequency based analysis was then performed on a Gambian P. falciparum population sample for these six genes and three others as controls. Tajima's D (TjD) index was most highly positive for the msp3/6-like PF10_0348 (TjD = 1.96) as well as the positive control ama1 antigen gene (TjD = 1.22). Across the genes there was a strong correlation between population TjD values and the relative HKA indices (whether derived from the population or the panel of cultured laboratory isolates), but no correlation with the MK indices. CONCLUSIONS/SIGNIFICANCE: Although few individual parasite genes show significant evidence of balancing selection, analysis of population genomic and comparative sequence data with the HKA and TjD indices should discriminate those that do, and thereby identify likely targets of immunity.
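Tajima's D, the index highlighted in the abstract above, has a standard closed-form definition (Tajima, 1989): it contrasts the mean pairwise difference π with Watterson's estimator S/a1 and scales the difference by its expected standard deviation under neutrality; a positive value suggests balancing selection, a negative value an excess of rare variants. A minimal sketch on toy sequences (not the paper's data):

```python
from itertools import combinations
from math import sqrt

def tajimas_d(seqs):
    """Tajima's D for a list of equal-length aligned sequences."""
    n = len(seqs)
    # S: number of segregating (polymorphic) sites
    S = sum(1 for site in zip(*seqs) if len(set(site)) > 1)
    # pi: mean number of pairwise differences between sequences
    pairs = list(combinations(seqs, 2))
    pi = sum(sum(a != b for a, b in zip(x, y)) for x, y in pairs) / len(pairs)
    # constants from Tajima (1989)
    a1 = sum(1 / i for i in range(1, n))
    a2 = sum(1 / i**2 for i in range(1, n))
    b1 = (n + 1) / (3 * (n - 1))
    b2 = 2 * (n**2 + n + 3) / (9 * n * (n - 1))
    c1 = b1 - 1 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1**2
    e1 = c1 / a1
    e2 = c2 / (a1**2 + a2)
    # D = (pi - theta_W) / sd, with theta_W = S / a1
    return (pi - S / a1) / sqrt(e1 * S + e2 * S * (S - 1))
```

Intermediate-frequency variants push D positive (the balancing-selection signature sought in the study), while singletons push it negative.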
OBJECTIVES: To evaluate final year medical students' access to new medical information. METHOD: Cross-sectional survey of final year medical students at the University of Nairobi using anonymous, self-administered questionnaires. RESULTS: Questionnaires were distributed to 85% of a possible 343 students and returned by 44% (152). Half reported having accessed some form of new medical information within the previous 12 months, most commonly from books and the internet. Few students reported regular access, and specific new journal articles were rarely accessed. Absence of internet facilities, slow internet speed and cost impeded access to literature, and current training seems rarely to encourage students to seek new information. CONCLUSION: Almost half the students had not accessed any new medical information in their final year in medical school. This means they are ill prepared for a career that may increasingly demand life-long, self-learning.
Background: Knowledge of treatment cost is essential in assessing cost effectiveness in healthcare. Evidence of the potential impact of implementing available interventions against childhood illnesses in developing countries challenges us to define the costs of treating these diseases. The purpose of this study is to describe the total costs associated with treatment of pneumonia, malaria and meningitis in children less than five years in seven Kenyan hospitals. Methods: Patient resource use data were obtained from largely prospective evaluation of medical records, and household expenditure during illness was collected from interviews with caretakers. The estimates for costs per bed day were based on published data. A sensitivity analysis was conducted using WHO-CHOICE values for costs per bed day. Results: Treatment costs for 572 children (pneumonia = 205, malaria = 211, meningitis = 102 and mixed diagnoses = 54) and household expenditure for 390 households were analysed. From the provider perspective the mean cost per admission at the national hospital was US$95.58 for malaria, US$177.14 for pneumonia and US$284.64 for meningitis. In the public regional or district hospitals the mean cost per child treated ranged from US$47.19 to US$81.84 for malaria and US$54.06 to US$99.26 for pneumonia. The corresponding treatment costs in the mission hospitals were between US$43.23 and US$88.18 for malaria and between US$43.36 and US$142.22 for pneumonia. Meningitis was treated for US$189.41 at the regional hospital and US$201.59 at one mission hospital. The total treatment cost estimates were sensitive to changes in the source of bed day costs. The median treatment related household payments within quintiles defined by total household expenditure differed by type of facility visited. Public hospitals recovered up to 40% of provider costs through user charges while mission facilities recovered 44% to 100% of costs. 
Conclusion: Treatment costs for inpatient malaria, pneumonia and meningitis vary by facility type, with mission and tertiary referral facilities being more expensive than primary referral facilities. Households of sick children contribute significantly towards provider costs through payment of user fees. These findings could be used in cost-effectiveness analyses of health interventions. © 2009 Ayieko et al; licensee BioMed Central Ltd.
Considerable uncertainty persists regarding the efficacy and safety of methylxanthines (caffeine and theophylline; the intravenous form of theophylline is named aminophylline) for the prevention and treatment of infant apnea. To help inform national guideline development in Kenya we undertook structured literature searches to identify current evidence on caffeine therapy for infant apnea. Available evidence shows that caffeine is as effective as intravenous theophylline (aminophylline), but is safer and easier to give and has better therapeutic properties. It is therefore recommended for the treatment of apnea of prematurity. Caffeine is also the preferred drug if clinicians plan to provide apnea prophylaxis. As prematurity is likely to result in more than 1 million deaths a year, mostly in resource-poor settings, greater efforts need to be made to ensure that interventions such as caffeine, currently unavailable in countries such as Kenya, are made more widely available.
We conducted a prospective audit of 101 children with severe malnutrition aged 6 to 59 months admitted to Kenyatta National Hospital, Kenya's largest tertiary level health facility, from February-April 2008. A structured tool was prepared to capture data to allow assessment of implementation of steps 1-8 of the WHO guidelines. Overall, 58% of children had marasmus and 47% of children were younger than one year old. Common co-morbidities at admission were diarrhoea (70.3%) and pneumonia (51.4%). The highest degree of implementation was observed for Step 5, treatment of potentially severe infections (90%, 95% CI 85.1-96.9). Only 55% of the patients had F75 prescribed although this starter formula was available in this hospital. There was a delay in initiating feeds, with a median time of 14.7 hours from the time of admission. There was modest implementation of Step 2, ensuring warmth (46.5%, 95% CI 36.8-56.2), Step 3, treating dehydration (54.9%, 95% CI 43.3-66.5), and Step 4, correcting electrolyte imbalance (45.5%, 95% CI 35.6-55.8). There was least implementation of Step 8, transition to catch-up feeding (23.8%, 95% CI 13.6-34.0). We conclude that quality of care for children admitted with severe malnutrition at KNH is inadequate and often does not follow the WHO guidelines. Improving care will require a holistic and not simply medical approach.
Background. Although considerable efforts are directed at developing international guidelines to improve clinical management in low-income settings, they appear to influence practice rarely. This study aimed to explore barriers to guideline implementation in the early phase of an intervention study in four district hospitals in Kenya. Methods. We developed a simple interview guide based on a characterisation of the intervention informed by a review of major theories on barriers to uptake of guidelines. In-depth interviews, non-participatory observation, and informal discussions were then used to explore perceived barriers to guideline introduction and general improvements in paediatric and newborn care. Data were collected four to five months after in-service training in the hospitals. Data were transcribed, themes explored and revised in two rounds of coding and analysis using NVivo 7 software, subjected to a layered analysis, and reviewed and revised after discussion with four hospital staff who acted as within-hospital facilitators. Results. A total of 29 health workers were interviewed. Ten major themes preventing guideline uptake were identified: incomplete training coverage; inadequacies in local standard setting and leadership; lack of recognition and appreciation of good work; poor communication and teamwork; organizational constraints and limited resources; counterproductive health worker norms; absence of perceived benefits linked to adoption of new practices; difficulties accepting change; lack of motivation; and conflicting attitudes and beliefs. Conclusion. While the barriers identified are broadly similar in theme to those reported from high-income settings, their specific nature often differs. For example, at an institutional level there is an almost complete lack of systems to introduce or reinforce guidelines, poor teamwork across different cadres of health worker, and failure to confront poor practice. 
At an individual level, lack of interest in the evidence supporting guidelines, feelings that they erode professionalism, and expectations that people should be paid to change practice threaten successful implementation. © 2009 Nzinga et al.
Background. Organizational factors are considered to be an important influence on health workers' uptake of interventions that improve their practices. These are additionally influenced by factors operating at individual and broader health system levels. We sought to explore contextual influences on worker motivation, a factor that may modify the effect of an intervention aimed at changing clinical practices in Kenyan hospitals. Methods. Franco et al.'s model of motivational influences (Health sector reform and public sector health worker motivation: a conceptual framework. Soc Sci Med 2002; 54: 1255-66) was used to frame the study. Qualitative methods, including individual in-depth interviews, small-group interviews and focus group discussions, were used to gather data from 185 health workers during one-week visits to each of eight district hospitals. Data were collected prior to a planned intervention aiming to implement new practice guidelines and improve quality of care. Additionally, on-site observations of routine health worker behaviour in the study sites were used to inform analyses. Results. Study settings are likely to have important influences on worker motivation. Effective management at hospital level may create an enabling working environment modifying the impact of resource shortfalls. Supportive leadership may foster good working relationships between cadres, improve motivation through provision of local incentives and appropriately handle workers' expectations in terms of promotions, performance appraisal processes, and good communication. Such organisational attributes may counteract de-motivating factors at a national level, such as poor schemes of service, and enhance personally motivating factors such as the desire to maintain professional standards. Conclusion. Motivation is likely to influence powerfully any attempts to change or improve health worker and hospital practices. 
Some factors influencing motivation may themselves be influenced by the processes chosen to implement change.
Background: It is increasingly appreciated that the interpretation of health systems research studies is greatly facilitated by detailed descriptions of study context and the process of intervention. We have undertaken an 18-month hospital-based intervention study in Kenya aiming to improve care for admitted children and newborn infants. Here we describe the baseline characteristics of the eight hospitals as environments receiving the intervention, as well as the general and local health system context and its evolution over the 18 months. Methods. Hospital characteristics were assessed using previously developed tools assessing the broad structure, process, and outcome of health service provision for children and newborns. Major health system or policy developments over the period of the intervention at a national level were documented prospectively by monitoring government policy announcements, the media, and through informal contacts with policy makers. At the hospital level, a structured, open questionnaire was used in face-to-face meetings with senior hospital staff every six months to identify major local developments that might influence implementation. These data provide an essential background for those seeking to understand the generalisability of reports describing the intervention's effects, and whether the intervention plausibly resulted in these effects. Results. Hospitals had only modest capacity, in terms of infrastructure, equipment, supplies, and human resources available to provide high-quality care at baseline. For example, hospitals were lacking between 30% and 56% of items considered necessary for the provision of care to the seriously ill child or newborn. An increase in spending on hospital renovations, attempts to introduce performance contracts for health workers, and post-election violence were recorded as examples of national level factors that might influence implementation success generally. 
Examples of factors that might influence success locally included frequent and sometimes numerous staff changes, movements of senior departmental or administrative staff, and the presence of local 'donor' partners with alternative priorities. Conclusion. The effectiveness of interventions delivered at hospital level over periods realistically required to achieve change may be influenced by a wide variety of factors at national and local levels. We have demonstrated how dynamic such contexts are, and therefore the need to consider context when interpreting an intervention's effectiveness. © 2009 English et al; licensee BioMed Central Ltd.
Background. We have conducted an intervention study aiming to improve hospital care for children and newborns in Kenya. In judging whether an intervention achieves its aims, an understanding of how it is delivered is essential. Here, we describe how the implementation team delivered the intervention over 18 months and provide some insight into how health workers, the primary targets of the intervention, received it. Methods. We used two approaches. First, a description of the intervention is based on an analysis of records of training, supervisory and feedback visits to hospitals, and brief logs of key topics discussed during telephone calls with local hospital facilitators. Record keeping was established at the start of the study for this purpose with analyses conducted at the end of the intervention period. Second, we planned a qualitative study nested within the intervention project and used in-depth interviews and small group discussions to explore health worker and facilitators' perceptions of implementation. After thematic analysis of all interview data, findings were presented, discussed, and revised with the help of hospital facilitators. Results. Four hospitals received the full intervention including guidelines, training and two to three monthly support supervision and six monthly performance feedback visits. Supervisor visits, as well as providing an opportunity for interaction with administrators, health workers, and facilitators, were often used for impromptu, limited refresher training or orientation of new staff. The personal links that evolved with senior staff seemed to encourage local commitment to the aims of the intervention. Feedback seemed best provided as open meetings and discussions with administrators and staff. 
Supervision, although sometimes perceived as fault finding, helped local facilitators become the focal point of much activity including key roles in liaison, local monitoring and feedback, problem solving, and orientation of new staff to guidelines. In four control hospitals receiving a minimal intervention, local supervision and leadership to implement new guidelines, despite their official introduction, were largely absent. Conclusion. The actual content of an intervention and how it is implemented and received may be critical determinants of whether it achieves its aims. We have carefully described our intervention approach to facilitate appraisal of the quantitative results of the intervention's effect on quality of care. Our findings suggest ongoing training, external supportive supervision, open feedback, and local facilitation may be valuable additions to more typical in-service training approaches, and may be feasible. © 2009 Nzinga et al; licensee BioMed Central Ltd.
New England Journal of Medicine, 360 (1), pp. 20-31. 2009. Decontamination of the Digestive Tract and Oropharynx in ICU Patients
BACKGROUND: Accurate measures of the severity of pandemic (H1N1) 2009 influenza (pH1N1) are needed to assess the likely impact of an anticipated resurgence in the autumn in the Northern Hemisphere. Severity has been difficult to measure because jurisdictions with large numbers of deaths and other severe outcomes have had too many cases to assess the total number with confidence. Also, detection of severe cases may be more likely, resulting in overestimation of the severity of an average case. We sought to estimate the probabilities that symptomatic infection would lead to hospitalization, ICU admission, and death by combining data from multiple sources. METHODS AND FINDINGS: We used complementary data from two US cities: Milwaukee attempted to identify cases of medically attended infection whether or not they required hospitalization, while New York City focused on the identification of hospitalizations, intensive care admission or mechanical ventilation (hereafter, ICU), and deaths. New York data were used to estimate numerators for ICU and death, and two sources of data (medically attended cases in Milwaukee or self-reported influenza-like illness [ILI] in New York) were used to estimate ratios of symptomatic cases to hospitalizations. Combining these data with estimates of the fraction detected for each level of severity, we estimated the proportion of symptomatic patients who died (symptomatic case-fatality ratio, sCFR), required ICU (sCIR), and required hospitalization (sCHR), overall and by age category. Evidence, prior information, and associated uncertainty were analyzed in a Bayesian evidence synthesis framework. Using medically attended cases and estimates of the proportion of symptomatic cases medically attended, we estimated an sCFR of 0.048% (95% credible interval [CI] 0.026%-0.096%), sCIR of 0.239% (0.134%-0.458%), and sCHR of 1.44% (0.83%-2.64%). Using self-reported ILI, we obtained estimates approximately 7-9 times lower. 
sCFR and sCIR appear to be highest in persons aged 18 y and older, and lowest in children aged 5-17 y. sCHR appears to be lowest in persons aged 5-17; our data were too sparse to allow us to determine the group in which it was the highest. CONCLUSIONS: These estimates suggest that an autumn-winter pandemic wave of pH1N1 with comparable severity per case could lead to a number of deaths in the range from considerably below that associated with seasonal influenza to slightly higher, but with the greatest impact in children aged 0-4 and adults 18-64. These estimates of impact depend on assumptions about total incidence of infection and would be larger if incidence of symptomatic infection were higher or shifted toward adults, if viral virulence increased, or if suboptimal treatment resulted from stress on the health care system; numbers would decrease if the total proportion of the population symptomatically infected were lower than assumed.
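The multiplier logic behind such severity estimates can be sketched with a small Monte Carlo: divide the observed deaths and the observed medically attended cases by assumed detection fractions, then take the ratio of the corrected counts. The counts and uniform priors below are hypothetical placeholders, not the paper's data, and simple uniform draws stand in for the full Bayesian evidence synthesis:

```python
import random

def scfr_interval(deaths, med_attended, draws=20_000, seed=0):
    """Monte Carlo sketch of sCFR = corrected deaths / estimated symptomatic.

    Detection-fraction priors below are illustrative assumptions only.
    Returns (2.5th percentile, median, 97.5th percentile).
    """
    rng = random.Random(seed)
    samples = []
    for _ in range(draws):
        # hypothetical priors on the fractions detected
        p_attend = rng.uniform(0.4, 0.6)      # P(medically attended | symptomatic)
        p_death_det = rng.uniform(0.7, 1.0)   # P(a pH1N1 death is detected)
        symptomatic = med_attended / p_attend  # scale up to all symptomatic cases
        samples.append((deaths / p_death_det) / symptomatic)
    samples.sort()
    return (samples[int(0.025 * draws)],
            samples[int(0.5 * draws)],
            samples[int(0.975 * draws)])
```

The width of the resulting interval comes entirely from uncertainty in the detection fractions, which is why the paper's ILI-based and medically-attended-based estimates can differ severalfold.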
BACKGROUND: The effectiveness of single-drug antiviral interventions to reduce morbidity and mortality during the next influenza pandemic will be substantially weakened if transmissible strains emerge which are resistant to the stockpiled antiviral drugs. We developed a mathematical model to test the hypothesis that a small stockpile of a secondary antiviral drug could be used to mitigate the adverse consequences of the emergence of resistant strains. METHODS AND FINDINGS: We used a multistrain stochastic transmission model of influenza to show that the spread of antiviral resistance can be significantly reduced by deploying a small stockpile (1% population coverage) of a secondary drug during the early phase of local epidemics. We considered two strategies for the use of the secondary stockpile: early combination chemotherapy (ECC; individuals are treated with both drugs in combination while both are available); and sequential multidrug chemotherapy (SMC; individuals are treated only with the secondary drug until it is exhausted, then treated with the primary drug). We investigated all potentially important regions of unknown parameter space and found that both ECC and SMC reduced the cumulative attack rate (AR) and the resistant attack rate (RAR) unless the probability of emergence of resistance to the primary drug p(A) was so low (less than 1 in 10,000) that resistance was unlikely to be a problem or so high (more than 1 in 20) that resistance emerged as soon as primary drug monotherapy began. For example, when the basic reproductive number was 1.8 and 40% of symptomatic individuals were treated with antivirals, AR and RAR were 67% and 38% under monotherapy if p(A) = 0.01. If the probability of resistance emergence for the secondary drug was also 0.01, then SMC reduced AR and RAR to 57% and 2%. The effectiveness of ECC was similar if combination chemotherapy reduced the probabilities of resistance emergence by at least ten times. 
We extended our model using travel data between 105 large cities to investigate the robustness of these resistance-limiting strategies at a global scale. We found that as long as populations that were the main source of resistant strains employed these strategies (SMC or ECC), then those same strategies were also effective for populations far from the source even when some intermediate populations failed to control resistance. In essence, through the existence of many wild-type epidemics, the interconnectedness of the global network dampened the international spread of resistant strains. CONCLUSIONS: Our results indicate that the augmentation of existing stockpiles of a single anti-influenza drug with smaller stockpiles of a second drug could be an effective and inexpensive epidemiological hedge against antiviral resistance if either SMC or ECC were used. Choosing between these strategies will require additional empirical studies. Specifically, the choice will depend on the safety of combination therapy and the synergistic effect of one antiviral in suppressing the emergence of resistance to the other antiviral when both are taken in combination.
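The stockpile-sequencing idea can be caricatured in a few lines. The sketch below is a deterministic two-strain SIR, not the authors' stochastic multistrain model; all parameter values are illustrative and early combination chemotherapy (ECC) is omitted. Under sequential multidrug chemotherapy (SMC), de-novo resistance to the primary drug only begins once the small secondary stockpile is exhausted, which delays and shrinks the resistant epidemic:

```python
def resistant_attack_rate(strategy, pop=1_000_000, r0=1.8, gamma=0.25,
                          tau=0.4, effect=0.5, p_res=0.01,
                          stockpile=0.01, steps=600):
    """Resistant attack rate under 'mono' or 'SMC' (illustrative parameters)."""
    beta = r0 * gamma
    beta_w = beta * (1 - tau * effect)   # treatment damps wild-type spread
    S, Iw, Ir = pop - 10.0, 10.0, 0.0    # susceptible, wild-type, resistant
    cum_r = 0.0
    second = stockpile * pop             # secondary-drug doses (1% coverage)
    for _ in range(steps):
        new_w = min(beta_w * S * Iw / pop, S)
        new_r = min(beta * S * Ir / pop, S - new_w)
        treated = tau * new_w
        if strategy == "SMC":
            on_secondary = min(treated, second)  # use secondary drug first
            second -= on_secondary
            on_primary = treated - on_secondary
        else:                            # primary-drug monotherapy
            on_primary = treated
        emerged = p_res * on_primary     # de-novo resistance on primary drug
        S -= new_w + new_r
        Iw += new_w - emerged - gamma * Iw
        Ir += new_r + emerged - gamma * Ir
        cum_r += new_r + emerged
    return cum_r / pop
```

Because the resistant strain grows exponentially once seeded, delaying seeding by even a modest stockpile substantially reduces the resistant attack rate, which is the qualitative result the paper reports.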
New England Journal of Medicine, 360 (12), pp. 1253-1254. 2009. The authors reply.
We investigated factors associated with persistence of different Salmonella serovars in buildings housing laying hens in Great Britain using survival analysis. A total of 264 incidents of Salmonella detection occurring between July 1998 and August 2007 in 152 houses were recorded. For incidents involving Salmonella Enteritidis (SE), both the rodent score of the house and the type of house were positively associated with persistence. For non-SE serovars, only the type of house was associated with persistence. Persistence of SE in the houses was longest (>15 months) in step-cage and cage-scraper houses when high levels of rodents were present, and lowest in non-cage and cage-belt houses. We estimated that 42% (95% CI 23.3-63.1) of SE incidents may be cleared during the lay period, and this was related to elimination of rodents from the houses. From January 2009, EU legislation will ban the sale of fresh eggs from SE-positive and S. Typhimurium-positive flocks over their remaining lifespan. If infection is eliminated from such flocks, they would cease to represent a public health risk.
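The survival-analysis approach described above can be sketched with a minimal Kaplan-Meier estimator for right-censored persistence data. The incident durations below are hypothetical placeholders for illustration, not the study's data: each pair is (months observed, whether the incident cleared), with cleared = False marking an incident still ongoing at the end of observation (censored).

```python
# Minimal Kaplan-Meier estimator (pure Python) for right-censored
# persistence data. The incidents below are hypothetical examples:
# (months_observed, cleared) -- cleared=False means the incident was
# still ongoing when observation ended (a censored observation).
incidents = [(3, True), (5, True), (5, False), (9, True), (15, False), (18, True)]

def kaplan_meier(data):
    """Return [(time, estimated probability of remaining Salmonella-positive)]
    at each observed clearance time."""
    data = sorted(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        cleared = sum(1 for tt, e in data if tt == t and e)   # clearance events at t
        at_risk = sum(1 for tt, _ in data if tt >= t)         # still positive just before t
        if cleared:
            surv *= 1 - cleared / at_risk
            curve.append((t, surv))
        i += sum(1 for tt, _ in data if tt == t)              # skip all ties at t
    return curve

print(kaplan_meier(incidents))
# e.g. an entry (9, 0.444...) reads: an estimated 44% of incidents persist past 9 months
```

Censored incidents contribute to the at-risk counts up to the time observation ended but are never counted as clearances, which is what lets incomplete lay periods be used without biasing the persistence estimate.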
AIMS: To compare the efficiency of various sampling methods for the detection of Salmonella in turkey flocks. METHODS AND RESULTS: In a field study that compared various sampling methods, one pair of boot swabs taken from the whole turkey house provided suitably sensitive results for fattening and rearing flocks, and was no less sensitive than two pairs, each taken from half the house and tested as a pooled sample. The sensitivity was further enhanced by adding a dust sample. The dust sample appeared to be particularly useful in flocks with a low prevalence, especially in breeding flocks, and was more sensitive than a method that used five pairs of boot swabs per flock. Combined incubation of a boot swab and a dust sample showed no interference between the two sample types and gave the maximum sensitivity of detection. Litter samples and commercial sponge drag swabs provided a lower level of detection. CONCLUSIONS: A single pair of boot swabs taken from the whole house is recommended for routine sampling of commercial rearing or fattening flocks. An additional dust sample could be added to increase detection in flocks with a low prevalence or in breeding flocks, but adding a further pair of boot swabs would not increase detection compared with a single pair. SIGNIFICANCE AND IMPACT OF THE STUDY: This study demonstrates that significant efficiencies can be made in sampling programmes for the detection of Salmonella in turkey flocks without detracting from sensitivity. Similar studies are recommended for other poultry sectors, particularly chicken breeding flocks.
AIMS: To investigate the effectiveness of pooled sampling methods for the detection of Salmonella in turkey flocks. METHODS AND RESULTS: Individual turkey droppings were taken from 43 flocks, with half of each dropping tested for Salmonella as an individual sample and the other half included in a pool of five. A pair of boot swabs and a dust sample were also taken from each flock. The results were analysed using Bayesian methods in the absence of a gold standard. This showed a dilution effect when true-positive samples were mixed with negative samples, but even so the pooled faecal samples were found to be a highly efficient method of testing compared with individual faecal samples. The more samples included in the pool, the more sensitive the pooled sampling method was predicted to be. Dust sampling was much more sensitive than faecal sampling at low prevalence. CONCLUSIONS: Pooled faecal sampling is an efficient method of Salmonella detection in turkey flocks. The additional testing of a dust sample greatly increased the effectiveness of sampling, especially at low prevalence. SIGNIFICANCE AND IMPACT OF THE STUDY: This is the first study to relate the sensitivity of the sampling methods to the within-flock prevalence.
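The pooling trade-off described above, where dilution lowers per-culture sensitivity but each culture screens more droppings, can be illustrated with a toy Monte Carlo simulation. All parameters here are illustrative assumptions, not the study's Bayesian estimates: p is the within-flock prevalence, d_ind the sensitivity of culturing one positive dropping, and d_pool the (dilution-reduced) sensitivity of culturing a pool containing at least one positive dropping.

```python
import random

# Toy Monte Carlo sketch of pooled vs individual faecal sampling.
# All numbers are illustrative assumptions, not the study's estimates:
#   p       within-flock prevalence of shedding
#   d_ind   sensitivity of culturing a single positive dropping
#   d_pool  sensitivity of culturing a pool with >=1 positive dropping
#           (lower than d_ind: the dilution effect)
random.seed(1)
p, d_ind, d_pool, POOL = 0.05, 0.9, 0.7, 5

def flock_detected(n_cultures, pooled):
    """One simulated infected flock: does any of n_cultures cultures test positive?"""
    for _ in range(n_cultures):
        droppings = POOL if pooled else 1
        m = sum(random.random() < p for _ in range(droppings))  # positives sampled
        sens = d_pool if pooled else d_ind
        if m > 0 and random.random() < sens:
            return True
    return False

def sensitivity(pooled, runs=20000, n_cultures=12):
    """Estimated flock-level sensitivity for a fixed laboratory workload."""
    return sum(flock_detected(n_cultures, pooled) for _ in range(runs)) / runs

print(f"12 individual cultures: {sensitivity(False):.2f}")
print(f"12 pools of 5:          {sensitivity(True):.2f}")
```

Even with the per-culture sensitivity cut by dilution, the pooled design examines five times as many droppings for the same number of cultures, so its flock-level sensitivity is much higher at low prevalence, mirroring the paper's conclusion that pooling remains highly efficient despite the dilution effect.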
The aim of this study was to determine the efficacy of a killed Salmonella vaccine and three live vaccines in preventing caecal colonisation of Hy-line Brown pullets by Salmonella Enteritidis PT 4. The lowest number of Salmonella-positive birds following the largest challenge (10^8 cfu) was recorded for live vaccine 1. However, birds treated with the killed vaccine had a significantly lower number of salmonellae in their caeca compared with both the control group and the other vaccine groups (P<0.05).
AIMS: To evaluate the performance of three Salmonella plating media (Rambach, Xylose Lysine Deoxycholate agar and modified Brilliant Green Agar plus Novobiocin) as part of ISO 6579:2002 (Annex D) on poultry environmental samples. METHODS AND RESULTS: The samples analysed were those for the European Union Salmonella baseline surveys of laying (N = 3087), broiler (N = 1550), turkey fattening (N = 1540) and turkey breeding (N = 580) flocks in Great Britain. Results were considered separately for Rambach (including and excluding pale orange colonies) and for growth on selective media [Modified semi-solid Rappaport Vassiliadis (MSRV)] after 24 and 48 h of incubation. Overall, Rambach was the most sensitive medium, provided that pale orange colonies were checked. In all cases, an increase in the sensitivity of detection was obtained by plating growth on MSRV after 48 h of incubation. In broiler and laying flocks, the specificity improved significantly when only Rambach was used. CONCLUSION: The use of Rambach results in considerable savings compared with the two-plate method prescribed by ISO 6579:2002 (Annex D) without compromising sensitivity. SIGNIFICANCE AND IMPACT OF THE STUDY: Salmonella isolation protocols should be reviewed in terms of their efficiency and cost.
Vet Rec, 165 (23), pp. 681-688. Citations: 22 (Scopus). 2009. Retrospective analysis of Salmonella isolates recovered from animal feed in Great Britain.
To examine feed contamination rates with Salmonella, the diversity of serovars and the antimicrobial resistance of isolates from animal feedingstuffs in Great Britain, and to compare Salmonella strains found in animal feed and in livestock, data collected under voluntary and statutory Salmonella surveillance during the period 1987 to 2006 were analysed retrospectively. The feed contamination rate decreased from 3.8 per cent in 1993 to 1.1 per cent in 2006. A total of 263 Salmonella serovars were recovered: S Mbandaka (11.2 per cent), S Tennessee (10.4 per cent), S Senftenberg (8.4 per cent), S Agona (6.4 per cent), S Montevideo (6.4 per cent) and S Ohio (3.1 per cent) were the most prevalent. S Typhimurium was recovered at a proportion of 1.6 per cent from raw ingredients and 2.4 per cent from finished feed, while S Enteritidis was recovered at a proportion of 0.5 per cent from raw ingredients and 0.6 per cent from finished feed; 14.1 per cent of the isolates were resistant to at least one antimicrobial, and 1.9 per cent were multiresistant. There was no evidence of a statistical association (at the P<0.05 level) between the top 10 serovars recovered from feed and from livestock.
Eight pig breeding units previously associated with Salmonella Typhimurium were visited during a period of up to seven years. Samples from voided faeces, surfaces, fomites and wildlife were cultured. Certain serovars (Derby, Stanley, Give, Bredeney, Mbandaka and Manhattan) were isolated repeatedly on certain units, while others (Agona, Ajiobo, Heidelberg, Meleagridis, Muenchen, Montevideo, Rissen and Senftenberg) were detected only once or intermittently. Serovars Kedougou, Newport and Typhimurium were isolated consistently on some units but only intermittently on others. There was an association between the Salmonella serovar in pens and in the immediate environment of the pens. Pens holding breeding stock destined for production herds were frequently positive for Salmonella. Herds under common ownership showed similar serovar combinations. Serovars from wildlife were typical of the associated premises. Cleaning and disinfection was frequently ineffective. On one unit, a low level of Salmonella was attributed to a small herd size, good cleaning and disinfection, and good rodent control. Breeding herds are therefore susceptible to endemic infections with multiple Salmonella serovars, and cleaning, disinfection and vector control may be inadequate in many cases. The prevalence of S Typhimurium was greater in youngstock, which may have important implications for public health.
Serovar and antimicrobial resistance data from the scanning surveillance of British turkey flocks for Salmonella between 1995 and 2006 were analysed and compared with prevalence data from other livestock and animal feed. A total of 2753 incidents of 57 different serovars were reported. The five most prevalent serovars were Salmonella Typhimurium (20.8%), Salmonella Newport (14.7%), Salmonella Derby (10.6%), Salmonella Indiana (8.3%) and Salmonella Agona (6.4%). S. Typhimurium reports peaked in the mid- to late 1990s; this occurred in parallel with the S. Typhimurium DT104 epidemic in other livestock species. S. Enteritidis reports peaked in the mid- to late 1990s, followed by a considerable decrease after 2000, which was also noted in flocks of domestic fowl. S. Newport, Salmonella Montevideo, Salmonella Senftenberg and Salmonella Binza occurred in marked clusters, indicating that they were introduced into one or more flocks at a certain time (e.g. via contaminated feed or infected 1-day-old chicks). A proportion of 43.1% of the reported Salmonella isolates were resistant to at least one antimicrobial, while 17.7% were multi-resistant. No isolates were resistant to ciprofloxacin or to the third-generation cephalosporins ceftazidime and cefotaxime. Resistance to ampicillin, chloramphenicol, streptomycin, sulphonamide compounds and tetracycline was common, and it was mainly a characteristic of S. Typhimurium DT104 compared with S. Typhimurium non-DT104 and non-S. Typhimurium isolates (P<0.001). Resistance to nalidixic acid decreased from 16.9% in 1995 to 11.8% in 2006. Nalidixic acid resistance was most frequently found in Salmonella Hadar (71.4%), S. Typhimurium DT104 (30.0%), S. Newport (17.9%) and S. Typhimurium non-DT104 (11.1%).
Effective terminal cleaning and disinfection (C&D) is regarded as a necessary step for the elimination of Salmonella spp. from laying houses. A total of 60 commercial laying houses, representative of all production systems (cage, barn, free-range), that had housed laying flocks infected with Salmonella enterica serovar Enteritidis or Salmonella enterica serovar Typhimurium were intensively sampled immediately after C&D as well as in the follow-on flock. The procedures investigated were: (1) a compound disinfectant consisting of a mixture of formaldehyde, glutaraldehyde and quaternary ammonium applied at the recommended concentration; (2) a 10% (vol/vol) dilution of the standard 37% commercial formalin, applied by a contractor; and (3) other disinfection procedures selected and applied by the farmer. The recovery of Salmonella in the cleaned and disinfected houses was variable, with samples from floor and dropping boards/belts (cage houses) and scratching areas (non-cage houses) being the most likely to remain contaminated. In cage houses, the use of the 10% formalin dilution led to a statistically greater reduction in the sample prevalence than any of the other C&D methods. A negative post-C&D result predicted clearance of Salmonella in 52% of cases, although the isolation of Salmonella from the houses immediately after C&D was not a perfect predictor of carry-over of infection.
PLoS Curr, 1, RRN1130. 2009. The Early Transmission Dynamics of H1N1pdm Influenza in the United Kingdom.
Lancet, 373 (9663), pp. 522-523. Citations: 5 (Scopus). 2009. Pre-referral rectal artesunate in severe malaria.
Wilderness Environ Med, 20 (1), pp. 81-82. 2009. Clinical images: a pneumonic confusion.
We present a case of splenic artery embolisation (SAE) after traumatic splenic injury that was complicated by acute necrotizing pancreatitis, caused by unintentionally extensive embolisation of the splenic artery. Although SAE is increasingly used for splenic preservation in trauma, there is insufficient knowledge of its efficacy and pitfalls. This report aims to draw attention to a rare but potentially serious complication of SAE.
Total publications on this page: 315
Total citations for publications on this page: 14282