Search results
In a guest blog, Professor Stephen Baker explains the importance of monitoring the emergence of infectious diseases in Asia. Zoonotic diseases that pass from animals to humans are an international public health problem regardless of location, but in lower-income countries the opportunities for such pathogens to enter the food chain are amplified.
Antiepileptic properties of quinine: a systematic review
Background: Quinine has anti-epileptic properties in animals. However, this has not been systematically investigated in humans. Purpose: To examine the available research evidence on the effects of quinine on seizures in adults and children. Methods: We searched online databases for published and unpublished studies in any language from January 1966 to March 2011. We considered randomized controlled trials (RCTs) evaluating the use of quinine in comparison to other drugs in humans with malaria or other conditions that reported the prevalence of seizures. Random-effects meta-analysis was used to pool effect estimates to determine the effect of quinine on the prevalence of seizures. Results: We identified six RCTs in severe malaria. Quinine was compared to artemisinin derivatives in all trials. A total of 8,244 patients were included. In the meta-analysis, there was no significant effect of quinine on the prevalence of seizures compared to the artemisinin derivatives (odds ratio (OR) = 0.90, 95% confidence interval (95% CI) = 0.63-1.30). There was significant heterogeneity (I² = 66%, chi-square = 17.44, p = 0.008). Subgroup analysis showed that quinine significantly reduced seizures compared to artemether (OR = 0.66, 95% CI = 0.49-0.88), but compared to artesunate the prevalence of seizures increased significantly (OR = 1.24, 95% CI = 1.05-1.47). Conclusion: There is insufficient evidence to conclude that quinine has any antiepileptic properties in humans.
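The random-effects pooling used in this review can be sketched with the DerSimonian-Laird method on the log-odds-ratio scale. The per-trial values below are hypothetical placeholders, not the six trials from the review; only the procedure is illustrated.

```python
import math

# Hypothetical per-trial log odds ratios and standard errors (NOT the
# review's data) used to demonstrate DerSimonian-Laird pooling.
log_or = [math.log(x) for x in (0.66, 0.72, 0.95, 1.10, 1.24, 1.30)]
se = [0.15, 0.20, 0.18, 0.22, 0.10, 0.25]

# Fixed-effect (inverse-variance) weights and Cochran's Q
w_fixed = [1 / s**2 for s in se]
ybar = sum(w * y for w, y in zip(w_fixed, log_or)) / sum(w_fixed)
q = sum(w * (y - ybar) ** 2 for w, y in zip(w_fixed, log_or))
df = len(log_or) - 1

# Between-trial variance (tau^2) and the I-squared heterogeneity statistic
c = sum(w_fixed) - sum(w**2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - df) / c)
i2 = max(0.0, 100 * (q - df) / q)

# Random-effects weights incorporate tau^2, then pool on the log scale
w_rand = [1 / (s**2 + tau2) for s in se]
pooled = sum(w * y for w, y in zip(w_rand, log_or)) / sum(w_rand)
se_pooled = math.sqrt(1 / sum(w_rand))
or_pooled = math.exp(pooled)
ci = (math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled))
```

The pooled OR always lies between the most extreme trial estimates, and larger tau² widens the confidence interval, which is the behaviour the abstract's heterogeneity statistics (I² = 66%) are describing.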
Roles of medical, nursing and clinical specialists in selected African health systems: a document review of numbers, norms, training and scope of practices
Background: Specialist health professionals are essential for meeting the evolving health needs of Sub-Saharan Africa (SSA), especially as the burden of complex and chronic conditions rises. They contribute not only to patient care but also to teaching, research, and policy development. However, there is a significant shortfall and uneven distribution of specialists across the region, creating major challenges for health systems. This paper examines the roles, numbers, training pathways, and scope of practice of medical, nursing, and clinical specialists in four SSA countries, with the aim of informing more effective workforce planning. Methods: Between September 2023 and July 2024, we conducted a document review of policies and guidelines related to specialist health professionals in Kenya, Uganda, Nigeria, and South Africa. Sources included ministries of health, regulatory bodies, academic institutions, and professional associations. We focused on the composition of the specialist workforce, training pathways, and defined roles across different health cadres. Results: There is marked variation in specialist workforce composition between countries. South Africa and Kenya reported the highest numbers of medical specialists, while nursing and clinical officer specialists were more common in Kenya and Uganda. Training pathways ranged from university-based master’s programmes to national or regional fellowship systems. However, many curricula lacked essential non-clinical competencies, such as leadership, management, and communication skills, limiting specialists’ effectiveness in broader health system roles. Conclusion: Strengthening the specialist workforce in SSA requires better alignment between training and health system needs. This includes integrating non-clinical competencies into curricula, enhancing data systems for workforce planning, and addressing gaps in distribution and capacity. Policy reforms and strong leadership are critical to building a sustainable, well-equipped specialist workforce to meet the region’s growing healthcare challenges.
Using routinely collected data to inform infection-prevention policy decisions
Measures to reduce transmission are a vital response to infectious disease epidemics. Collectively such measures are effective in reducing the burden of infectious disease but effectiveness of individual interventions is less certain. Methodologies for causal inference from observational data are well developed, but many methods have requirements that are not met by epidemic data. They may require an individual's outcome to be independent of anyone else's treatment, but the very purpose of infection-prevention measures is to break chains of transmission, benefiting both treated and untreated individuals. I combine causal inference methods, mechanistic models, and observational data to estimate effects of interventions that were used to reduce the spread of severe acute respiratory syndrome coronavirus 2 in the United Kingdom. I combine difference-in-differences methodology with a renewal-equation model. If its assumptions are met, this can detect effects of interventions on transmissibility, but if assumptions are violated, erroneous results can arise with no indication that an error is occurring. I apply the method to mass testing and mandatory use of face masks. Difference-in-differences results suggest that interventions increased incidence of detected infections. I investigate optimal timing of vaccination against respiratory viral infections with models incorporating immune boosting from re-exposure to the virus. Boosting can lead to synchrony in susceptibility and cause periodic outbreaks even without seasonal variation in infectiousness. In scenarios with more immune boosting, vaccinating sooner tends to lead to fewer infections, while in scenarios with less boosting, later vaccination is beneficial. Analyses in this thesis highlight potential problems with causal analyses that disregard mechanisms of disease transmission, and with models that oversimplify immunity. 
These analyses suggest that greater understanding of changing immunity over time is necessary to determine optimal approaches to reducing transmission of these respiratory viral infections.
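The difference-in-differences contrast at the heart of this thesis can be sketched minimally on hypothetical case counts. The thesis embeds the contrast in a renewal-equation transmission model; the sketch below is only the raw DiD calculation on log incidence, with invented numbers.

```python
import math

# Hypothetical case counts (NOT from the thesis): an intervention area and
# a comparison area, each observed before and after an intervention.
treated_pre, treated_post = 100, 80
control_pre, control_post = 100, 90

# DiD on the log scale: the treated area's change minus the control area's
# change. Under the parallel-trends assumption, this isolates the
# intervention effect from shared background trends.
did = (math.log(treated_post) - math.log(treated_pre)) - (
    math.log(control_post) - math.log(control_pre)
)
relative_effect = math.exp(did)  # multiplicative effect on incidence
```

Here the treated area's 20% decline exceeds the control area's 10% decline, so the estimated multiplicative effect is below 1. As the thesis argues, this simple contrast can mislead for infectious diseases, because interventions that break transmission chains also change outcomes in the "untreated" comparison group.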
Factors affecting integration of an early warning system for antimalarial drug resistance within a routine surveillance system in a pre-elimination setting in Sub-Saharan Africa
To address the current threat of antimalarial resistance, countries need innovative solutions for timely and informed decision-making. Integrating molecular surveillance for drug-resistant malaria into routine malaria surveillance in pre-elimination contexts offers a potential early warning mechanism for further investigation and response. However, there is limited evidence on what influences the performance of such a system in resource-limited settings. From March 2018 to February 2020, a sequential mixed-methods study was conducted in primary healthcare facilities in a South African pre-elimination setting to explore factors influencing the flow, quality and linkage of malaria case notification and molecular resistance marker data. Using a process-oriented framework, we undertook monthly and quarterly data linkage and consistency analyses at different levels of the health system, as well as a survey, focus group discussions and interviews to identify potential barriers to, and enhancers of, the roll-out and uptake of this integrated information system. Over two years, 4,787 confirmed malaria cases were notified from 42 primary healthcare facilities in the Nkomazi sub-district, Mpumalanga, South Africa. Of the notified cases, 78.5% (n = 3,758) were investigated, and 55.1% (n = 2,636) were successfully linked to their Plasmodium falciparum molecular resistance marker profiles. Five tangible processes (malaria case detection and notification, sample collection, case investigation, analysis, and reporting) were identified within the process-oriented logic model. Workload, training, ease of use, supervision, leadership, and resources were recognized as cross-cutting factors affecting the programme's performance. Approaching malaria elimination, linking molecular markers of antimalarial resistance to routine malaria surveillance is feasible. However, cross-cutting barriers inherent in the healthcare system can influence its success in a resource-limited setting.
Pneumococcal density and respiratory co-detection in severe pediatric pneumonia in Laos.
There is growing evidence of the importance of bacterial/viral interaction in the course of pneumonia. In Laos, no study has investigated respiratory pathogen co-detection. We conducted a study at Mahosot Hospital in Vientiane to determine whether bacterial/viral co-detection and pneumococcal density are associated with severe pneumonia. Between December 2013 and December 2016, 934 hospitalized children under 5 years of age with acute respiratory infection (ARI) were enrolled. Swabs from the upper respiratory tract were collected and analyzed by real-time PCR. The most commonly co-detected microorganisms were Streptococcus pneumoniae/Haemophilus influenzae (24%), Respiratory Syncytial Virus (RSV)/S. pneumoniae (12%) and RSV/H. influenzae (16%). Pneumococcal density was 4.52 times higher in influenza-virus-positive participants. RSV/S. pneumoniae and RSV/H. influenzae co-detections were positively associated with severe pneumonia in univariate analysis (OR 1.86, 95%CI:1.22-2.81, p = 0.003 and OR 2.09, 95%CI:1.46-3.00), but these associations were not confirmed in adjusted analyses (aOR 0.72, 95%CI:0.38-1.6, p = 0.309 and aOR 1.37, 95%CI:0.73-2.58). In RSV-positive patients, there was no association between pneumococcal density and severe pneumonia. Our findings confirm an association between pneumococcal density and severe influenza pneumonia, but not severe RSV pneumonia, in young children. These results highlight the complexity of viral/bacterial pathogen interactions, which may not act in simple synergy in the evolution of pneumonia.
Global Immune Biomarkers and Donor Serostatus Can Predict Cytomegalovirus Infection Within Seropositive Lung Transplant Recipients.
BACKGROUND: Predicting which lung transplant recipients (LTRs) will develop cytomegalovirus (CMV) infection remains challenging. The aim of this retrospective cohort study was to further explore the predictive utility of global immune biomarkers within recipient-seropositive (R+) LTRs, focusing on the mitogen component of the QuantiFERON (QF)-CMV assay and the absolute lymphocyte count (ALC). METHODS: R+ LTRs with QF-CMV testing performed at 5 mo posttransplant were included. ALC and mitogen were evaluated as predictors of CMV infection (>150 IU/mL) in plasma and/or bronchoalveolar lavage fluid using Cox regression, controlling for antiviral prophylaxis. Optimal cutoffs were calculated with receiver-operating characteristic curves. RESULTS: CMV infection occurred in 111 of 204 patients (54%) and was associated with donor seropositivity (80/111 [72%] versus 42/93 [45%], P
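The "optimal cutoffs" step described in the methods is commonly done by maximising Youden's J (sensitivity + specificity - 1) across candidate thresholds on the ROC curve. The sketch below uses invented biomarker scores and outcome labels, and assumes (as one plausible reading of this study) that a lower score predicts infection; the paper's actual data and direction are not reproduced here.

```python
# Hypothetical scores (e.g. a mitogen response) and outcomes (1 = CMV
# infection). Both are invented for illustration.
scores = [0.2, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.1, 1.3, 1.5]
labels = [1, 1, 1, 0, 1, 0, 1, 0, 0, 0]

def youden_cutoff(scores, labels):
    """Return the cutoff maximising Youden's J = sensitivity + specificity - 1.

    A test result is called 'positive' (predicting infection) when the
    score is at or below the cutoff, matching the assumption that low
    immune responses precede infection.
    """
    best_j, best_cut = -1.0, None
    for cut in sorted(set(scores)):
        tp = sum(1 for s, l in zip(scores, labels) if s <= cut and l == 1)
        fn = sum(1 for s, l in zip(scores, labels) if s > cut and l == 1)
        tn = sum(1 for s, l in zip(scores, labels) if s > cut and l == 0)
        fp = sum(1 for s, l in zip(scores, labels) if s <= cut and l == 0)
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        j = sens + spec - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

cutoff, j = youden_cutoff(scores, labels)
```

Several cutoffs can tie on J; this sketch keeps the first, whereas published analyses often also weigh clinical considerations when breaking ties.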
Impact of Late HIV Diagnosis on Costs of Care in a Public Health Care Setting.
Despite increased HIV testing and access to treatment in Australia, presentations with advanced disease occur, placing a significant burden on the health system. We sought to describe costs associated with HIV care in the first year post diagnosis in a specialized, tertiary-level HIV service and identify factors predicting increased health care costs. People newly diagnosed with HIV from 2016 to 2020 were included in the study. Data were gathered regarding their demographics (age, gender, birthplace, and first language), HIV parameters (viral load [VL] and CD4 cell count), antiretroviral therapy start date, opportunistic illness history, and health care costs (inpatient, outpatient, and emergency) within 12 months of diagnosis. Multivariable modeling was used to identify factors associated with increased costs. We identified 147 people; median age 38 years, 90% male, median CD4 count at diagnosis 338 cells/µL, with median initial cost of care AUD $22,929 (interquartile range $11,902-$39,175). Costs associated with advanced HIV diagnosis (CD4 < 200 cells/µL; n = 52) were more than double those of an early HIV diagnosis (CD4 ≥ 350 cells/µL; n = 69) (median $46,406 vs. $20,274; p < .001). In univariate analysis, older age, higher VL, low CD4 count, and VL >200 copies/mL after 6 months were associated with increased costs. In multivariate analysis, older age (p = .001) and CD4 count <200 cells/µL (p = .001) were the only factors predicting increased cost in the first year after HIV diagnosis. Prioritizing HIV testing strategies to allow earlier diagnosis of HIV would significantly reduce the financial burden of HIV care.
Injecting drug use is a risk factor for methicillin resistance in patients with Staphylococcus aureus bloodstream infections.
We investigated whether injecting drug use was a risk factor for methicillin resistance among inpatients with Staphylococcus aureus bloodstream infections (SABSIs) at an Australian health service. Of 273 inpatients, 46 (16.9%) had methicillin-resistant S. aureus (MRSA). MRSA was more frequent in those who had injected drugs in the past 6 months (20.6%) than in other inpatients (15.7%). Injecting drug use was associated with a 4.82-fold (95% confidence interval = 1.54-16.29) increased odds of MRSA after accounting for confounders.
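The building block behind this kind of result is a 2x2 table and an odds ratio with a Woolf (log-scale) confidence interval. The cell counts below are hypothetical, and the paper's 4.82-fold estimate is a confounder-adjusted OR from a regression model, not this crude calculation.

```python
import math

# Hypothetical 2x2 table (rows: exposure, columns: outcome). These counts
# are invented and do NOT reconstruct the study's data.
a, b = 20, 60    # MRSA / non-MRSA among people who recently injected drugs
c, d = 26, 167   # MRSA / non-MRSA among other inpatients

# Crude odds ratio: cross-product of the 2x2 table
or_crude = (a * d) / (b * c)

# Woolf 95% CI: the log OR is approximately normal with this standard error
se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(or_crude) - 1.96 * se_log)
hi = math.exp(math.log(or_crude) + 1.96 * se_log)
```

Crude and adjusted ORs can differ substantially when exposure groups differ on other risk factors, which is why the study reports the adjusted figure.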
Study protocol for ADAPT-TDM: A beta-lactam antibiotic Dose AdaPtation feasibility randomised controlled Trial using Therapeutic Drug Monitoring.
Introduction: Critically ill patients are at risk of suboptimal beta-lactam antibiotic (beta-lactam) exposure due to the impact of altered physiology on pharmacokinetics. Suboptimal concentrations can lead to treatment failure or toxicity. Therapeutic drug monitoring (TDM) involves adjusting doses based on measured plasma concentrations, individualising dosing to improve the likelihood of achieving target exposure. Despite its potential benefits, its adoption has been slow, and data on implementation, dose adaptation and safety are sparse. The aim of this trial is to assess the feasibility and fidelity of implementing beta-lactam TDM-guided dosing in the intensive care unit setting. Methods and analysis: A beta-lactam antibiotic Dose AdaPtation feasibility randomised controlled Trial using Therapeutic Drug Monitoring (ADAPT-TDM) is a single-centre, unblinded, feasibility randomised controlled trial aiming to enroll up to 60 critically ill adult participants (≥18 years). TDM and dose adjustment will be performed daily in the intervention group; the standard of care group will undergo plasma sampling, but no dose adjustment. The main outcomes include: (1) feasibility of recruitment, defined as the number of participants who are recruited from a pool of eligible participants, and (2) fidelity of TDM, defined as the degree to which TDM as a test is delivered as intended, from accurate sample collection and sample processing to result availability. Secondary outcomes include target attainment, uptake of TDM-guided dosing and incidence of neurotoxicity, hepatotoxicity and nephrotoxicity. Ethics and dissemination: This study has been approved by the Alfred Hospital human research ethics committee, Office of Ethics and Research Governance (reference: Project No. 565/22; date of approval: 22/11/2022). Prospective consent will be obtained and the study will be conducted in accordance with the Declaration of Helsinki.
The finalised manuscript, including aggregate data, will be submitted for publication in a peer-reviewed journal. ADAPT-TDM will determine whether beta-lactam TDM-guided dose adaptation is reproducible and feasible and provide important information required to implement this intervention in a phase III trial. Trial registration number: Australian New Zealand Clinical Trials Registry, ACTRN12623000032651.
Pilot study to evaluate the need and implementation of a multifaceted nurse-led antimicrobial stewardship intervention in residential aged care.
Objectives: To evaluate the need and feasibility of a nurse-led antimicrobial stewardship (AMS) programme in two Australian residential aged care homes (RACHs) to inform a stepped-wedge, cluster randomized controlled trial (SW-cRCT). Methods: A mixed-methods pilot study of a nurse-led AMS programme was performed in two RACHs in Victoria, Australia (July-December 2019). The AMS programme comprised education, infection assessment and management guidelines, and documentation to support appropriate antimicrobial use in urinary, lower respiratory and skin/soft tissue infections. The programme was implemented over three phases: (i) pre-implementation education and integration (1 month); (ii) implementation of the intervention (3 months); and (iii) post-intervention evaluation (1 month). Baseline RACH and resident data and weekly infection and antimicrobial usage were collected and analysed descriptively to evaluate the need for AMS strategies. Feedback on intervention resources and implementation barriers was identified from semi-structured interviews, an online staff questionnaire and researcher field notes. Results: Six key barriers to implementation of the intervention were identified and used to refine the intervention: aged care staffing and capacity; access to education; resistance to practice change; role of staff in AMS; leadership and ownership of the intervention at the RACH and organization level; and family expectations. A total of 61 antimicrobials were prescribed for 40 residents over the 3-month intervention. Overall, 48% of antibiotics did not meet minimum criteria for appropriate initiation (respiratory: 73%; urinary: 54%; skin/soft tissue: 0%). Conclusions: Several barriers and opportunities to improve implementation of AMS in RACHs were identified. Findings were used to inform a revised intervention to be evaluated in a larger SW-cRCT.
Tocilizumab, sarilumab and anakinra in critically ill patients with COVID-19: a randomised, controlled, open-label, adaptive platform trial.
Introduction: Tocilizumab improves outcomes in critically ill patients with COVID-19. Whether other immune-modulator strategies are equally effective or better is unknown. Methods: We investigated treatment with tocilizumab, sarilumab, anakinra and no immune modulator in these patients. In this ongoing, adaptive platform trial at 133 sites in 9 countries, we randomly assigned patients with allocation ratios dependent on the number of interventions available at each site. The primary outcome was an ordinal scale combining in-hospital mortality (assigned -1) and days free of organ support to day 21 in survivors. The trial used a Bayesian statistical model with predefined triggers for superiority, inferiority, efficacy, equivalence or futility. Results: Of 2274 critically ill participants enrolled between 25 March 2020 and 10 April 2021, 972 were assigned to tocilizumab, 485 to sarilumab, 378 to anakinra and 418 to control. Median organ support-free days were 7 (IQR -1, 16), 9 (IQR -1, 17), 0 (IQR -1, 15) and 0 (IQR -1, 15) for tocilizumab, sarilumab, anakinra and control, respectively. Median adjusted ORs were 1.46 (95% credible interval (CrI) 1.13, 1.87), 1.50 (95% CrI 1.13, 2.00) and 0.99 (95% CrI 0.74, 1.35) for tocilizumab, sarilumab and anakinra relative to control, yielding 99.8%, 99.8% and 46.6% posterior probabilities of superiority, respectively, compared with control. All treatments appeared safe. Conclusions: In critically ill patients with COVID-19, tocilizumab and sarilumab have equivalent effectiveness at reducing duration of organ support and death. Anakinra is not effective in this population. Trial registration number: NCT02735707.
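The reported "posterior probability of superiority" can be sanity-checked from the published summary alone, if one assumes the posterior for the OR is approximately log-normal (an assumption of this sketch, not the trial's actual Bayesian model). Recovering the log-scale mean and spread from the tocilizumab median (1.46) and 95% CrI (1.13, 1.87) gives:

```python
import math

# Assume an approximately normal posterior on the log-OR scale, with the
# median as its mean and the 95% CrI spanning +/- 1.96 standard deviations.
mu = math.log(1.46)
sigma = (math.log(1.87) - math.log(1.13)) / (2 * 1.959964)

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# P(OR > 1) is the posterior mass above log(OR) = 0
p_superiority = normal_cdf(mu / sigma)
```

Under this approximation the result lands very close to the 99.8% reported for tocilizumab, which suggests the published interval and probability are internally consistent.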
A LC‐MS/MS Assay for Quantification of Amodiaquine and Desethylamodiaquine in Dried Blood Spots on Filter Paper
Artesunate–amodiaquine (ARS–AQ) is a first‐line antimalarial treatment recommended by the World Health Organization. AQ is the long-acting partner drug in this combination, and therapeutic success is correlated with the terminal exposure to AQ. Dried blood spot (DBS) sampling for AQ is a convenient and minimally invasive technique, especially suitable for clinical studies in resource-limited settings and pediatric studies. Our primary aim was to develop and validate a bioanalytical method for quantification of AQ and its active metabolite in capillary blood applied onto filter paper as a DBS sample. The separation was achieved using a reverse-phase column (Zorbax SB‐CN, 50 × 4.6 mm i.d., 3.5 μm particle size) and a mobile phase consisting of acetonitrile:ammonium formate 20 mM with 0.5% formic acid (15:85, v/v). Five 3.2 mm discs were punched from each 50 μL DBS, together corresponding to approximately 15 μL of dried blood. The blood was then extracted using a mixture of 0.5% formic acid in water:acetonitrile (50:50, v/v), along with stable isotope‐labeled internal standards (AQ‐D10 and desethylamodiaquine [DAQ]‐D5). Mass spectrometry was used for quantification over the range of 2.03–459 ng/mL for AQ and 3.13–1570 ng/mL for DAQ. The validation of the method was carried out in compliance with regulatory requirements. The intra‐ and interbatch precisions were below 15% and passed all validation acceptance criteria. No carryover was detected, and normalized matrix factors (analyte/internal standard) ranged from 0.96 to 1.03 for all analytes, indicating no matrix effects. AQ and DAQ were stable in all conditions evaluated. Long‐term stability in DBS samples was demonstrated for up to 10 years when stored at −80°C and for 15 months when stored at room temperature. The developed method was demonstrated to be reliable and accurate.
This assay may be particularly useful in the context of resource limited settings and in pediatric field studies.
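Quantification in assays of this kind rests on back-calculating unknowns from a calibration curve; 1/x² weighted linear regression is a common bioanalytical choice, though the abstract does not state which regression model this assay used. The calibrator responses below are invented; only the AQ concentration range (2.03–459 ng/mL) is taken from the abstract.

```python
# Hypothetical calibrators: concentrations (ng/mL, spanning the assay's AQ
# range) and invented peak-area ratios (analyte / internal standard).
conc = [2.03, 5, 20, 80, 200, 459]
resp = [0.041, 0.099, 0.402, 1.598, 4.01, 9.17]

# Weighted least squares with 1/x^2 weights, which balances relative error
# across a wide calibration range instead of letting high calibrators
# dominate the fit.
w = [1 / x**2 for x in conc]
sw = sum(w)
swx = sum(wi * x for wi, x in zip(w, conc))
swy = sum(wi * y for wi, y in zip(w, resp))
swxx = sum(wi * x * x for wi, x in zip(w, conc))
swxy = sum(wi * x * y for wi, x, y in zip(w, conc, resp))

slope = (sw * swxy - swx * swy) / (sw * swxx - swx**2)
intercept = (swy - slope * swx) / sw

def back_calc(response):
    """Back-calculate a concentration from an instrument response."""
    return (response - intercept) / slope

qc = back_calc(0.402)  # should land near the 20 ng/mL calibrator
```

Validation criteria such as the ±15% precision limit mentioned in the abstract are assessed against back-calculated values like `qc`.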
Practice of ventilation in critically ill pediatric patients: protocol for an international, long-term, observational study, and results of the pilot feasibility study.
Objective: This manuscript describes the protocol of an investigator-initiated, international, multicenter, long-term, prospective observational study named PRactice of VENTilation in PEDiatric Patients (PRoVENT-PED), designed to investigate the epidemiology, respiratory support practices and outcomes of critically ill pediatric patients. Design: Data will be collected biannually over 10 years during predefined 4-week intervals, with an additional optional period to accommodate data collection during an epidemic or pandemic. The specific focus of PRoVENT-PED will evolve as the study progresses, initially emphasizing collection of detailed ventilator data from invasively ventilated patients. In later phases, the focus will shift to noninvasive respiratory support and typical aspects of respiratory support, such as patient-ventilator asynchronies, weaning practices, and rescue therapies like extracorporeal support. PRoVENT-PED includes patients under 18 years of age, admitted to a participating intensive care unit, and receiving respiratory support. The endpoints vary with the focus of each phase but will always include a set of key settings and ventilation parameters and related outcomes. Where applicable, potentially modifiable factors and associations with outcomes will be studied. The pilot feasibility study demonstrated that the electronic data-capture system effectively collects all necessary data within a reasonable time limit, with little missing data. Conclusion: PRoVENT-PED is a 10-year, international, multicenter study focused on collecting data on respiratory support practices in critically ill pediatric patients. Its scope evolves from invasive to noninvasive ventilatory support, ultimately encompassing patient-ventilator asynchronies, weaning practices, and rescue therapies.
Enhanced data quality to improve malaria surveillance in Papua, Indonesia
Background: Papua has a high burden of malaria, with an annual parasite incidence 300 times the national average. A key component of malaria elimination strategies is robust surveillance, which is essential for monitoring trends in case numbers, guiding public health interventions, and prioritizing resource allocation. This study aimed to enhance malaria surveillance in Central Papua, Indonesia, by improving data collection, record-keeping, and treatment practices. Methods: The study was conducted at five public clinics in Central Papua province, Indonesia, as part of a wider health systems strengthening programme to promote safer and more effective anti-malarial treatment (the SHEPPI Study). Clinical and laboratory details of patients with malaria and their treatment were documented in clinic registers, which were digitized into an electronic database. Automated reports were generated each month and used to provide regular feedback to clinic staff. Continuous Quality Improvement (CQI) workshops were conducted with clinic staff using the Plan-Do-Study-Act approach to address challenges and drive sustained improvements. Results: Between January 2019 and December 2023, a total of 314,561 patients were tested for malaria, of whom 41.9% (131,948) had peripheral parasitaemia detected. The first round of CQI workshops was held in May 2019 and improved data quality significantly, increasing data completeness from 46.3% (4,540/9,802) in the initial period (Jan–May 2019) to 71.5% (9,053/12,665) after the first CQI round (Jun–Oct 2019), p < 0.001. The second CQI round reduced dihydroartemisinin-piperaquine (DHP) prescribing errors from 17.1% (1,111/6,489) in the initial period to 5.7% (607/10,669) after the second round (Sep 2019–Jan 2020), and primaquine (PQ) prescribing errors from 17.4% (552/3,175) to 3.4% (160/4,659) over the same interval, p < 0.001. In total, 347 patients were prescribed fewer than the recommended number of PQ tablets during the initial period, 89 (25.6%) of whom were erroneously given only a single dose. Over the 4-year study period, a total of 11 workshops were conducted, driving continuous improvements in data quality and prescribing practices. Conclusion: One or two rounds of CQI, supported by regular follow-up, can enhance the quality of malariometric surveillance; however, interventions need to be tailored to the specific needs of participating clinics. Improvements in data quality and prescribing practices have the potential to contribute to better malaria management, improved clinical outcomes, and strengthened trust in healthcare providers.
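The significance of the completeness improvement can be checked directly from the counts in the abstract (4540/9802 before vs 9053/12665 after the first CQI round) with a standard two-proportion z-test. The abstract does not say which test produced its p-value, so this is an independent check rather than a reproduction of the authors' analysis.

```python
import math

# Data completeness counts taken from the abstract
x1, n1 = 4540, 9802    # complete records, initial period (Jan-May 2019)
x2, n2 = 9053, 12665   # complete records, after first CQI (Jun-Oct 2019)

p1, p2 = x1 / n1, x2 / n2

# Two-proportion z-test using the pooled proportion under the null
pooled = (x1 + x2) / (n1 + n2)
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se
p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
```

With samples this large and a 25-percentage-point improvement, the z statistic is enormous and the p-value is far below the 0.001 threshold reported.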