Progression of lymphatic filariasis antigenaemia and microfilaraemia over 4.5 years in antigen-positive individuals, Samoa 2019-2023
Objectives: The first round of triple-drug mass drug administration (MDA) for lymphatic filariasis (LF) in Samoa was in 2018. This study aims to i) examine progression of LF antigen (Ag) and microfilaria (Mf) status in Ag-positive individuals from 2019-2023; and ii) compare Ag/Mf prevalence in household members of Mf-positive vs Mf-negative participants. Methods: In 2023, we tested Ag-positive participants (indexes) from a 2019 survey in Samoa, and their household members. We tested for Ag (Alere/Abbott Filariasis Test Strip) and Mf. We examined changes in Ag/Mf status in index participants and compared Ag/Mf prevalence between household members of Mf-positive and Mf-negative indexes. Results: We recruited 91 indexes and 317 household members. In 2023, all 17 Mf-positive indexes remained Ag-positive and 11/15 with Mf results (73.3%) were Mf-positive. Of 74 Mf-negative indexes, 79.7% remained Ag-positive in 2023 and 31.1% became Mf-positive. Household members of Mf-positive indexes were more likely to be Ag-positive (odds ratio 3.3, 95% CI 1.0-10.3) than those of Mf-negative indexes. Conclusion: Our results raise concerns regarding the long-term effectiveness of a single dose of triple-drug MDA for sustained clearance of Mf in Samoa. Guidelines for follow-up and treatment of Ag/Mf-positive people and their household members are urgently required.
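The household-member comparison reported above reduces to an odds ratio from a 2x2 table, typically with a Wald confidence interval on the log scale. A minimal sketch of that calculation (the counts below are hypothetical placeholders for illustration, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed & positive,   b = exposed & negative,
    c = unexposed & positive, d = unexposed & negative."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) under the Wald approximation
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only (not the survey's data)
print(odds_ratio_ci(10, 40, 15, 200))
```

A wide interval whose lower bound sits near 1.0, as in the abstract, is the typical signature of a small exposed group.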
Patient-reported harm from NHS treatment or care, or the lack of access to care: a cross-sectional survey of general population prevalence, impact and responses.
OBJECTIVES: The aim of this article is to provide an estimate of the proportion of the general public reporting healthcare-related harm in Great Britain, its location, impact, responses post-harm and desired reactions from healthcare providers. DESIGN: We used a cross-sectional survey, using quota sampling. SETTING: This research was conducted in Great Britain. PARTICIPANTS: The survey had 10 064 participants (weighted analysis). RESULTS: In our survey, 9.7% of participants reported harm caused by the National Health Service (NHS) in the last 3 years through treatment or care (6.2%) or the lack of access to care (3.5%). The main location where the harm first occurred was hospitals. A total of 37.6% of participants reported a moderate impact and 44.8% a severe impact of harm. The most common response to harm was to share their experience with others (67.1%). Almost 60% sought professional advice and support, with 11.6% contacting the Patient Advice and Liaison Service (PALS). Only 17% submitted a formal complaint, and 2.1% made a claim for financial compensation. People wanted treatment or care to redress the harm (44.4%) and an explanation (34.8%). Two-thirds of those making a complaint felt it was not handled well, and approximately half were satisfied with PALS. Experiences and responses differed according to sex and age (eg, women reported more harm). People with long-term illness or disability, those in lower social grades, and people in other disadvantaged groups reported higher rates and more severe impact of harm. CONCLUSIONS: We found that 9.7% of the British general population reported harm by the NHS, a higher rate than reported in two previous surveys. Our study used a broader and more inclusive definition of harm and was conducted during the COVID-19 pandemic, making comparison to previous surveys challenging. People responded to harm in different ways, such as sharing experiences with others and seeking professional advice and support.
Mostly, people who were harmed wanted help to redress the harm or to gain access to the care needed. Low satisfaction with PALS and complaints services may reflect that these services do not always deliver the required support. There is a need to better understand the patient perspective following harm and for further consideration of what a person-centred approach to resolution and recovery might look like.
Recurrence of microfilaraemia after triple-drug therapy for lymphatic filariasis in Samoa: Recrudescence or reinfection?
OBJECTIVES: Contrasting evidence is emerging on the long-term effectiveness of triple-drug therapy for elimination of lymphatic filariasis (LF) in the Pacific region. We evaluated the effectiveness of ivermectin, diethylcarbamazine and albendazole (IDA) for sustained clearance of microfilariae (Mf) in Samoa. METHODS: We enrolled two cohorts of Mf-positive participants. Cohort A were Mf-positive participants from 2018, who received directly observed triple-drug therapy in 2019 and were retested and retreated in 2023 and 2024. Cohort B were Mf-positive and treated in 2023 and retested in 2024. Participants were tested for LF antigen and Mf. RESULTS: In Cohort A, eight of the 14 participants from 2018/2019 were recruited in 2023; six were Mf-positive. In 2024, six participants were retested, and two were Mf-positive. Cohort B included eight participants, and two remained Mf-positive in 2024. Mf prevalence in 2023 for Cohort A (71.4%, 95% CI 29.0%-96.3%) was significantly higher than among their household members (12.0%, 95% CI 2.5%-31.2%). CONCLUSION: One or two doses of directly observed IDA was not sufficient for sustained clearance of Wuchereria bancrofti Mf in Samoa. The high Mf prevalence in treated individuals compared to household members suggests recrudescence rather than reinfection.
Global, regional, and national incidence and mortality burden of non-COVID-19 lower respiratory infections and aetiologies, 1990-2021: a systematic analysis from the Global Burden of Disease Study 2021.
BACKGROUND: Lower respiratory infections (LRIs) are a major global contributor to morbidity and mortality. In 2020-21, non-pharmaceutical interventions associated with the COVID-19 pandemic reduced not only the transmission of SARS-CoV-2, but also the transmission of other LRI pathogens. Tracking LRI incidence and mortality, as well as the pathogens responsible, can guide health-system responses and funding priorities to reduce future burden. We present estimates from the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2021 of the burden of non-COVID-19 LRIs and corresponding aetiologies from 1990 to 2021, inclusive of pandemic effects on the incidence and mortality of select respiratory viruses, globally, regionally, and for 204 countries and territories. METHODS: We estimated mortality, incidence, and aetiology attribution for LRI, defined by the GBD as pneumonia or bronchiolitis, not inclusive of COVID-19. We analysed 26 259 site-years of mortality data using the Cause of Death Ensemble model to estimate LRI mortality rates. We analysed all available age-specific and sex-specific data sources, including published literature identified by a systematic review, as well as household surveys, hospital admissions, health insurance claims, and LRI mortality estimates, to generate internally consistent estimates of incidence and prevalence using DisMod-MR 2.1. 
For aetiology estimation, we analysed multiple causes of death, vital registration, hospital discharge, microbial laboratory, and literature data using a network analysis model to produce the proportion of LRI deaths and episodes attributable to the following pathogens: Acinetobacter baumannii, Chlamydia spp, Enterobacter spp, Escherichia coli, fungi, group B streptococcus, Haemophilus influenzae, influenza viruses, Klebsiella pneumoniae, Legionella spp, Mycoplasma spp, polymicrobial infections, Pseudomonas aeruginosa, respiratory syncytial virus (RSV), Staphylococcus aureus, Streptococcus pneumoniae, and other viruses (ie, the aggregate of all viruses studied except influenza and RSV), as well as a residual category of other bacterial pathogens. FINDINGS: Globally, in 2021, we estimated 344 million (95% uncertainty interval [UI] 325-364) incident episodes of LRI, or 4350 episodes (4120-4610) per 100 000 population, and 2·18 million deaths (1·98-2·36), or 27·7 deaths (25·1-29·9) per 100 000. 502 000 deaths (406 000-611 000) were in children younger than 5 years, among which 254 000 deaths (197 000-320 000) occurred in countries with a low Socio-demographic Index. Of the 18 modelled pathogen categories in 2021, S pneumoniae was responsible for the highest proportions of LRI episodes and deaths, with an estimated 97·9 million (92·1-104·0) episodes and 505 000 deaths (454 000-555 000) globally. The pathogens responsible for the second and third highest episode counts globally were other viral aetiologies (46·4 million [43·6-49·3] episodes) and Mycoplasma spp (25·3 million [23·5-27·2]), while those responsible for the second and third highest death counts were S aureus (424 000 [380 000-459 000]) and K pneumoniae (176 000 [158 000-194 000]). From 1990 to 2019, the global all-age non-COVID-19 LRI mortality rate declined by 41·7% (35·9-46·9), from 56·5 deaths (51·3-61·9) to 32·9 deaths (29·9-35·4) per 100 000. 
From 2019 to 2021, during the COVID-19 pandemic and implementation of associated non-pharmaceutical interventions, we estimated a 16·0% (13·1-18·6) decline in the global all-age non-COVID-19 LRI mortality rate, largely accounted for by a 71·8% (63·8-78·9) decline in the number of influenza deaths and a 66·7% (56·6-75·3) decline in the number of RSV deaths. INTERPRETATION: Substantial progress has been made in reducing LRI mortality, but the burden remains high, especially in low-income and middle-income countries. During the COVID-19 pandemic, with its associated non-pharmaceutical interventions, global incident LRI cases and mortality attributable to influenza and RSV declined substantially. Expanding access to health-care services and vaccines, including S pneumoniae, H influenzae type B, and novel RSV vaccines, along with new low-cost interventions against S aureus, could mitigate the LRI burden and prevent transmission of LRI-causing pathogens. FUNDING: Bill & Melinda Gates Foundation, Wellcome Trust, and Department of Health and Social Care (UK).
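The mortality-rate declines quoted in this abstract are plain relative changes between two rates, and the rounded point estimates can be checked directly (the small discrepancy against the published 41.7% reflects rounding of the underlying estimates):

```python
def pct_decline(rate_start, rate_end):
    """Relative decline between two rates, as a percentage."""
    return 100 * (rate_start - rate_end) / rate_start

# Global all-age non-COVID-19 LRI mortality, deaths per 100 000 (1990 vs 2019)
print(round(pct_decline(56.5, 32.9), 1))  # ~41.8, vs the published 41.7%
```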
Protocol for the process evaluation for a cluster randomised controlled trial evaluating primary school-based screening and intervention delivery for childhood anxiety problems.
INTRODUCTION: Anxiety problems are prevalent in childhood and, without intervention, can persist into adulthood. Effective evidence-based interventions for childhood anxiety disorders exist, specifically cognitive-behavioural therapy (CBT) in a range of formats. However, only a small proportion of children successfully access and receive treatment. Conducting mental health screening in schools and integrating evidence-based interventions for childhood anxiety problems may be an effective way to ensure support reaches children in need. The Identifying Child Anxiety Through Schools-Identification to Intervention (iCATS i2i) trial involves screening for childhood anxiety problems and offering a brief online parent-led CBT intervention. This paper presents the protocol for the process evaluation of the iCATS i2i trial, which aims to examine the implementation and acceptability of the study procedures, the mechanisms of change and whether any external factors had an impact on procedure engagement or delivery. METHODS AND ANALYSIS: This process evaluation will use both quantitative and qualitative methods to evaluate the implementation and acceptability of and barriers/facilitators to engagement and delivery of the iCATS screening/intervention procedures. Quantitative data sources will include opt-out and completion rates of baseline measures and usage analytics extracted from the online intervention platform. Qualitative interviews will be conducted with children, parents, school staff, iCATS i2i clinicians and researchers delivering study procedures. The Medical Research Council framework for process evaluations will guide study design and analysis. ETHICS AND DISSEMINATION: This study has received ethical approval from the University of Oxford Research Ethics Committee (R66068_RE003). Findings from the study will be disseminated via peer-reviewed publications in academic journals, conferences, digital and social media platforms and stakeholder meetings. 
TRIAL REGISTRATION: ISRCTN76119074.
Insect pest control, approximate dynamic programming, and the management of the evolution of resistance.
Ecological decision problems, such as those encountered in agriculture, often require managing conflicts between short-term costs and long-term benefits. Dynamic programming is an ideal method for optimally solving such problems but agricultural problems are often subject to additional complexities that produce state spaces intractable to exact solutions. In contrast, look-ahead policies, a class of approximate dynamic programming (ADP) algorithm, may attempt to solve problems of arbitrary magnitude. However, these algorithms focus on a temporally truncated caricature of the full decision problem over a defined planning horizon and as such are not guaranteed to suggest optimal actions. Thus, look-ahead policies may offer promising means of addressing detail-rich ecological decision problems but may not be capable of fully utilizing the information available to them, especially in scenarios where the best short- and long-term solutions may differ. We constructed and applied look-ahead policies to the management of a hypothetical, stage-structured, continually reproducing, agricultural insect pest. The management objective was to minimize the combined costs of management actions and crop damage over a 16-week growing season. The manager could elect to utilize insecticidal sprays or one of six release ratios of male-selecting transgenic insects where the release ratio determines the number of transgenic insects to be released for each wild-type male insect in the population. Complicating matters was the expression of insecticide resistance at non-trivial frequencies in the pest population. We assessed the extent to which look-ahead policies were able to recognize the potential threat of insecticide resistance and successfully integrate insecticides and transgenic releases to capitalize upon their respective benefits. Look-ahead policies were competent at anticipating and responding to ecological and economic information. 
Policies with longer planning horizons made fewer, better-timed insecticidal sprays and made more frequent transgenic releases, which consequently facilitated lower resistance allele frequencies. However, look-ahead policies were ultimately inefficient resistance managers, and directly responded to resistance only when it was dominant and prevalent. Effective long-term agricultural management requires the capacity to anticipate and respond to the evolution of resistance. Look-ahead policies can accommodate all the information pertinent to making the best long-term decision but may lack the perspective to actually do so.
Management of a stage-structured insect pest: an application of approximate optimization.
Ecological decision problems frequently require the optimization of a sequence of actions over time where actions may have both immediate and downstream effects. Dynamic programming can solve such problems only if the dimensionality is sufficiently low. Approximate dynamic programming (ADP) provides a suite of methods applicable to problems of arbitrary complexity at the expense of guaranteed optimality. The most easily generalized method is the look-ahead policy: a brute-force algorithm that identifies reasonable actions by constructing and solving a series of temporally truncated approximations of the full problem over a defined planning horizon. We develop and apply this approach to a pest management problem inspired by the Mediterranean fruit fly, Ceratitis capitata. The model aims to minimize the cumulative costs of management actions and medfly-induced losses over a single 16-week season. The medfly population is stage-structured and grows continuously while management decisions are made at discrete, weekly intervals. For each week, the model chooses between inaction, insecticide application, or one of six sterile insect release ratios. Look-ahead policy performance is evaluated over a range of planning horizons, two levels of crop susceptibility to medfly and three levels of pesticide persistence. In all cases, the actions proposed by the look-ahead policy are contrasted with those of a myopic policy that minimizes costs over only the current week. We find that look-ahead policies always outperformed the myopic policy and that decision quality is sensitive to the temporal distribution of costs relative to the planning horizon: it is beneficial to extend the planning horizon when it excludes pertinent costs. However, longer planning horizons may reduce decision quality when major costs are resolved imminently.
ADP methods such as the look-ahead-policy-based approach developed here render questions intractable to dynamic programming amenable to inference but should be applied carefully as their flexibility comes at the expense of guaranteed optimality. However, given the complexity of many ecological management problems, the capacity to propose a strategy that is "good enough" using a more representative problem formulation may be preferable to an optimal strategy derived from a simplified model.
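The look-ahead policy described in these two abstracts is conceptually simple: at each decision point, enumerate candidate action sequences over the planning horizon, simulate each through the system model, and commit only to the first action of the cheapest sequence. A minimal sketch with a toy one-dimensional pest model (the dynamics, costs and action names are illustrative assumptions, not the papers' medfly model):

```python
from itertools import product

def lookahead_policy(state, actions, step, cost, horizon):
    """Rolling-horizon look-ahead: enumerate every action sequence of
    length `horizon`, simulate its cumulative cost from `state`, and
    return the first action of the cheapest sequence.

    step(state, action) -> next state   (the population model)
    cost(state, action) -> immediate cost (damage + action cost)
    """
    best_seq, best_cost = None, float("inf")
    for seq in product(actions, repeat=horizon):
        s, total = state, 0.0
        for a in seq:
            total += cost(s, a)
            s = step(s, a)
        if total < best_cost:
            best_seq, best_cost = seq, total
    return best_seq[0]

# Toy model: state = pest density; "spray" halves it at a fixed cost,
# "wait" lets the population grow (values are arbitrary assumptions).
step = lambda n, a: n * 0.5 if a == "spray" else n * 1.8
cost = lambda n, a: n + (4.0 if a == "spray" else 0.0)

print(lookahead_policy(10.0, ["wait", "spray"], step, cost, horizon=3))  # spray
```

With `horizon=1` the same function reduces to the myopic policy used as the baseline in the abstract above; here it prefers "wait" because spraying costs more than one week's damage, illustrating how a short horizon can miss downstream benefits.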
The evolutionary stability of attenuators that mask information about animals that social partners can exploit.
Signals and cues are fundamental to social interactions. A well-established concept in the study of animal communication is an amplifier, defined as a trait that does not add extra information to that already present in the original cue or signal, but rather enhances the fidelity with which variation in the original cue or signal is correctly perceived. Attenuators are the logical complement of amplifiers: attenuators act to reduce the fidelity with which variation in a signal or cue can be reliably evaluated by perceivers. Where amplifiers reduce the effect of noise on the perception of variation, attenuators add noise. Attenuators have been subject to much less consideration than amplifiers; here, they are the focus of our theoretical study. We utilize an extension of a well-established model incorporating signal or cue inaccuracy and costly investments by emitter and perceiver in sending and attending to the signal or cue. We present broad conditions, involving some conflict of interest between emitter and perceiver, where it may be advantageous for emitters to invest in costly attenuators to mask cues from potential perceivers, and a subset of these conditions where the perceiver may be willing to invest in costly anti-attenuators to mitigate the loss of information to them. We demonstrate that attenuators can be evolutionarily stable even if they are costly, even if they are sometimes disadvantageous and even if a perceiver can mount counter-measures to them. As such, we feel that attenuators of cues may be deserving of much more research attention.
Type of fitness cost influences the rate of evolution of resistance to transgenic Bt crops.
The evolution of resistance to pesticides by insect pests is a significant challenge for sustainable agriculture. For transgenic crops expressing Bacillus thuringiensis (Bt) crystalline (Cry) toxins, resistance evolution may be delayed by the high-dose/refuge strategy, in which a non-toxic refuge is planted to promote the survival of susceptible insects. The high-dose/refuge strategy may interact with fitness costs associated with resistance alleles to further delay resistance. However, while a diverse range of fitness costs are reported in the field, they are typically represented as a fixed reduction in survival or viability which is insensitive to ecological conditions such as competition. Furthermore, the potential dynamic consequences of restricting susceptible insects to a refuge which represents only a fraction of the available space have rarely been considered. We present a generalized discrete-time model which utilizes dynamic programming methods to derive the optimal management decisions for the control of a theoretical insect pest population exposed to Bt crops. We consider three genotypes (susceptible homozygotes, resistant homozygotes and heterozygotes) and implement fitness costs of resistance to Bt toxins as either a decrease in the relative competitive ability of resistant insects or as a penalty on fecundity.
Model analysis is repeated and contrasted for two types of density dependence: uniform density dependence, which operates equally across the landscape, and heterogeneous density dependence, where the intensity of competition scales inversely with patch size and is determined separately for the refuge and the Bt crop. When the planting of Bt is decided optimally, fitness costs to fecundity allow for the planting of larger areas of Bt crops than equivalent fitness costs that reduce the competitive ability of resistant insects. Heterogeneous competition only influenced model predictions when the proportional area of Bt planted in each season was decided optimally and resistance was not recessive. Synthesis and applications. The high-dose/refuge strategy alone is insufficient to preserve susceptibility to transgenic Bacillus thuringiensis (Bt) crops in the long term when constraints upon the evolution of resistance are not insurmountable. Fitness costs may enhance the delaying effect of the refuge, but the extent to which they do so depends upon how the cost is realized biologically. Fitness costs which apply independently of other variables may be more beneficial to resistance management than costs which are only visible to selection under a limited range of ecological conditions.
Investment in attending to cues and the evolution of amplifiers.
Signals and cues are extensively used in social interactions across diverse communication systems. Here, we extend an existing theoretical framework to explore investment by emitters and perceivers in the fidelity with which cues and signals associated with the former are detected by the latter. Traits of the emitter that improve cue or signal fidelity without adding information are termed 'amplifiers'. We assume that each party can invest in improving fidelity but that it is increasingly costly the more fidelity is improved. Our model predicts that evolution of amplifier traits of a pre-existing cue occurs over a broader range of circumstances than evolution of signalling in situations where the emitter offered no pre-existing cue to the perceiver. It further predicts that the greater the intrinsic informational value of a cue, the more likely it is that the perceiver (and not the emitter) will invest in the fidelity of detecting that cue. A consequence of this predicted asymmetry is that true communication with reciprocal adaptations in emitters and perceivers to improve signal fidelity is likely to occur predominantly for traits of intermediate reliability. The corollary is that uncertainty of the perceiver will then be a key feature of communication. Uncertainty can arise because perceivers misinterpret signals or do not perceive them correctly, but here we argue that uncertainty is more fundamentally at the root of communication because traits that are intrinsically highly informative will induce only the perceiver and not the emitter to invest in improved fidelity of perception of that trait.
Linking signal fidelity and the efficiency costs of communication.
The handicap principle has been the overarching framework to explain the evolution and maintenance of communication. Yet, it is becoming apparent that strategic costs of signalling are not the only mechanism maintaining signal honesty. Rather, the fidelity of detecting signals can itself be strongly selected. Specifically, we argue that the fidelity of many signals will be constrained by the investment in signal generation and reception by the signaller and perceiver, respectively. Here, we model how investments in signal fidelity influence the emergence and stability of communication using a simple theoretical framework. The predictions of the model indicate that high-cost communication can be stable whereas low-cost intermediates are generally selected against. This dichotomy suggests that the most parsimonious route to the evolution of communication is for initial investment in communicative traits to be driven by noncommunicative functions. Such cues can appeal to pre-existing perceptual biases and thereby stimulate signal evolution. We predict that signal evolution will vary between systems in ways that can be linked to the economics of communication to the two parties involved.
Unpredicted impacts of insect endosymbionts on interactions between soil organisms, plants and aphids.
Ecologically significant symbiotic associations are frequently studied in isolation, but such studies of two-way interactions cannot always predict the responses of organisms in a community setting. To explore this issue, we adopt a community approach to examine the role of plant-microbial and insect-microbial symbioses in modulating a plant-herbivore interaction. Potato plants were grown under glass in controlled conditions and subjected to feeding from the potato aphid Macrosiphum euphorbiae. By comparing plant growth in sterile, uncultivated and cultivated soils and the performance of M. euphorbiae clones with and without the facultative endosymbiont Hamiltonella defensa, we provide evidence for complex indirect interactions between insect- and plant-microbial systems. Plant biomass responded positively to the live soil treatments, on average increasing by 15% relative to sterile soil, while aphid feeding produced shifts (increases in stem biomass and reductions in stolon biomass) in plant resource allocation irrespective of soil treatment. Aphid fecundity also responded to soil treatment with aphids on sterile soil exhibiting higher fecundities than those in the uncultivated treatment. The relative allocation of biomass to roots was reduced in the presence of aphids harbouring H. defensa compared with plants inoculated with H. defensa-free aphids and aphid-free control plants. This study provides evidence for the potential of plant and insect symbionts to shift the dynamics of plant-herbivore interactions.
Within-host modeling of primaquine-induced hemolysis in hemizygote glucose-6-phosphate dehydrogenase deficient healthy volunteers.
Primaquine is the only widely available drug to prevent relapses of Plasmodium vivax malaria. Primaquine is underused because of concerns over oxidant hemolysis in glucose-6-phosphate dehydrogenase (G6PD) deficiency. A pharmacometric trial showed that ascending-dose radical cure primaquine regimens causing 'slow burn' hemolysis were safe in G6PD-deficient Thai and Burmese male volunteers. We developed and calibrated a within-host model of primaquine hemolysis in G6PD deficiency, using detailed serial hemoglobin and reticulocyte count data from 23 hemizygote deficient volunteers given ascending-dose primaquine (1,523 individual measurements over 656 unique time points). We estimate that primaquine doses of ~0.75 mg base/kg reduce the circulating lifespan of deficient erythrocytes by ~30 days in individuals with common Southeast Asian G6PD variants. We predict that a 5 mg/kg total primaquine dose can be administered safely to G6PD-deficient individuals over 14 days, with expected hemoglobin drops of 18 to 43% (a 2.7 to 6.5 g/dL drop from a baseline of 15 g/dL). CLINICAL TRIALS: This study is registered with the Thai Clinical Trials Registry (TCTR) as TCTR20170830002 and TCTR20220317004.
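The predicted percentage drops quoted above follow directly from the absolute drops and the stated 15 g/dL baseline:

```python
def hb_drop_pct(drop_g_dl, baseline_g_dl=15.0):
    """Hemoglobin drop as a percentage of baseline."""
    return 100 * drop_g_dl / baseline_g_dl

# 2.7 and 6.5 g/dL drops from a 15 g/dL baseline
print(hb_drop_pct(2.7), round(hb_drop_pct(6.5), 1))  # 18.0 43.3
```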
Intensive blood-glucose control with sulphonylureas or insulin compared with conventional treatment and risk of complications in patients with type 2 diabetes (UKPDS 33)
Background: Improved blood-glucose control decreases the progression of diabetic microvascular disease, but the effect on macrovascular complications is unknown. There is concern that sulphonylureas may increase cardiovascular mortality in patients with type 2 diabetes and that high insulin concentrations may enhance atheroma formation. We compared the effects of intensive blood-glucose control with either sulphonylurea or insulin and conventional treatment on the risk of microvascular and macrovascular complications in patients with type 2 diabetes in a randomised controlled trial. Methods: 3867 newly diagnosed patients with type 2 diabetes, median age 54 years (IQR 48-60 years), who after 3 months' diet treatment had a mean of two fasting plasma glucose (FPG) concentrations of 6.1-15.0 mmol/L were randomly assigned intensive policy with a sulphonylurea (chlorpropamide, glibenclamide, or glipizide) or with insulin, or conventional policy with diet. The aim in the intensive group was FPG less than 6 mmol/L. In the conventional group, the aim was the best achievable FPG with diet alone; drugs were added only if there were hyperglycaemic symptoms or FPG greater than 15 mmol/L. Three aggregate endpoints were used to assess differences between conventional and intensive treatment: any diabetes-related endpoint (sudden death, death from hyperglycaemia or hypoglycaemia, fatal or non-fatal myocardial infarction, angina, heart failure, stroke, renal failure, amputation [of at least one digit], vitreous haemorrhage, retinopathy requiring photocoagulation, blindness in one eye, or cataract extraction); diabetes-related death (death from myocardial infarction, stroke, peripheral vascular disease, renal disease, hyperglycaemia or hypoglycaemia, and sudden death); all-cause mortality. Single clinical endpoints and surrogate subclinical endpoints were also assessed. All analyses were by intention to treat and frequency of hypoglycaemia was also analysed by actual therapy. 
Findings: Over 10 years, haemoglobin A(1c) (HbA(1c)) was 7.0% (6.2-8.2) in the intensive group compared with 7.9% (6.9-8.8) in the conventional group - an 11% reduction. There was no difference in HbA(1c) among agents in the intensive group. Compared with the conventional group, the risk in the intensive group was 12% lower (95% CI 1-21, p = 0.029) for any diabetes-related endpoint; 10% lower (-11 to 27, p = 0.34) for any diabetes-related death; and 6% lower (-10 to 20, p = 0.44) for all-cause mortality. Most of the risk reduction in the any diabetes-related aggregate endpoint was due to a 25% risk reduction (7-40, p = 0.0099) in microvascular endpoints, including the need for retinal photocoagulation. There was no difference for any of the three aggregate endpoints between the three intensive agents (chlorpropamide, glibenclamide, or insulin). Patients in the intensive group had more hypoglycaemic episodes than those in the conventional group on both types of analysis (both p < 0.0001). The rates of major hypoglycaemic episodes per year were 0.7% with conventional treatment, 1.0% with chlorpropamide, 1.4% with glibenclamide, and 1.8% with insulin. Weight gain was significantly higher in the intensive group (mean 2.9 kg) than in the conventional group (p < 0.001), and patients assigned insulin had a greater gain in weight (4.0 kg) than those assigned chlorpropamide (2.6 kg) or glibenclamide (1.7 kg). Interpretation: Intensive blood-glucose control by either sulphonylureas or insulin substantially decreases the risk of microvascular complications, but not macrovascular disease, in patients with type 2 diabetes. None of the individual drugs had an adverse effect on cardiovascular outcomes. All intensive treatment increased the risk of hypoglycaemia.
A study protocol for developing a spatial vulnerability index for infectious diseases of poverty in the Caribbean region.
Infectious diseases of poverty (IDoP) disproportionately affect resource-limited and marginalized populations, resulting in spatial patterns of vulnerability across various geographical areas. Currently, no spatial indices exist to quantify vulnerability to IDoP at a fine geographical level within countries, such as municipalities or provinces. Without such an index, policymakers cannot effectively allocate resources or target interventions in the most vulnerable areas. This protocol aims to specify a methodological approach to measure spatial variation in vulnerability to IDoP. We will evaluate this methodological approach using surveillance and seroprevalence data from the Dominican Republic (DR) as part of a broader effort to develop a regional index for the Caribbean region. The study will consist of three main components. The first component involves identifying the relevant factors associated with IDoP in the Caribbean region through a scoping review, supplemented by expert-elicited opinion. The second component will apply a Fuzzy Analytic Hierarchy Process to weight the aforementioned factors and develop a spatial composite index, using open data and available national surveys in the DR. In the final component, we will evaluate and validate the index by analysing the prevalence of at least three IDoPs at a fine-grained municipal level in the DR, using seroprevalence data from a 2021 national field study and other national surveillance programs. The spatial vulnerability index framework developed in this study will assess the degree of vulnerability to IDoP across different geographical scales, depending on data availability in each country.
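A spatial composite index of the kind proposed in this protocol is, at its core, a weighted aggregation of normalized area-level indicators, with the weights supplied by the (Fuzzy) AHP step. A minimal sketch using min-max normalization and made-up indicator names, values and weights (the protocol's actual factors and weights are yet to be determined):

```python
def minmax(values):
    """Min-max normalize a list of indicator values to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def composite_index(indicator_matrix, weights):
    """indicator_matrix: {indicator_name: [value per area]};
    weights: {indicator_name: weight}, summing to 1 (e.g. AHP-derived).
    Returns one composite vulnerability score per area."""
    names = list(indicator_matrix)
    normed = {k: minmax(indicator_matrix[k]) for k in names}
    n_areas = len(next(iter(indicator_matrix.values())))
    return [sum(weights[k] * normed[k][i] for k in names)
            for i in range(n_areas)]

# Hypothetical indicators for three municipalities (illustration only)
data = {"poverty_rate":   [0.40, 0.10, 0.25],
        "no_piped_water": [0.55, 0.05, 0.30]}
w = {"poverty_rate": 0.6, "no_piped_water": 0.4}
print([round(s, 3) for s in composite_index(data, w)])  # [1.0, 0.0, 0.5]
```

Validation against observed prevalence, as planned in the final component, would then amount to correlating these scores with area-level seroprevalence.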
Global burden of bacterial antimicrobial resistance 1990-2021: a systematic analysis with forecasts to 2050.
BACKGROUND: Antimicrobial resistance (AMR) poses an important global health challenge in the 21st century. A previous study quantified the global and regional burden of AMR for 2019, followed by additional publications that provided more detailed estimates for several WHO regions by country. To date, no studies have produced comprehensive estimates of AMR burden across locations that encompass historical trends and future forecasts. METHODS: We estimated all-age and age-specific deaths and disability-adjusted life-years (DALYs) attributable to and associated with bacterial AMR for 22 pathogens, 84 pathogen-drug combinations, and 11 infectious syndromes in 204 countries and territories from 1990 to 2021. We collected and used multiple cause of death data, hospital discharge data, microbiology data, literature studies, single drug resistance profiles, pharmaceutical sales, antibiotic use surveys, mortality surveillance, linkage data, outpatient and inpatient insurance claims data, and previously published data, covering 520 million individual records or isolates and 19 513 study-location-years. We used statistical modelling to produce estimates of AMR burden for all locations, including those with no data. Our approach leverages the estimation of five broad component quantities: the number of deaths involving sepsis; the proportion of infectious deaths attributable to a given infectious syndrome; the proportion of infectious syndrome deaths attributable to a given pathogen; the percentage of a given pathogen resistant to an antibiotic of interest; and the excess risk of death or duration of an infection associated with this resistance.
Using these components, we estimated disease burden attributable to and associated with AMR, which we define based on two counterfactuals: respectively, an alternative scenario in which all drug-resistant infections are replaced by drug-susceptible infections, and an alternative scenario in which all drug-resistant infections are replaced by no infection. Additionally, we produced global and regional forecasts of AMR burden until 2050 for three scenarios: a reference scenario that is a probabilistic forecast of the most likely future; a Gram-negative drug scenario that assumes future drug development that targets Gram-negative pathogens; and a better care scenario that assumes future improvements in health-care quality and access to appropriate antimicrobials. We present final estimates aggregated to the global, super-regional, and regional level. FINDINGS: In 2021, we estimated 4·71 million (95% UI 4·23-5·19) deaths were associated with bacterial AMR, including 1·14 million (1·00-1·28) deaths attributable to bacterial AMR. Trends in AMR mortality over the past 31 years varied substantially by age and location. From 1990 to 2021, deaths from AMR decreased by more than 50% among children younger than 5 years yet increased by over 80% for adults 70 years and older. AMR mortality decreased for children younger than 5 years in all super-regions, whereas AMR mortality in people 5 years and older increased in all super-regions. For both deaths associated with and deaths attributable to AMR, meticillin-resistant Staphylococcus aureus increased the most globally (from 261 000 associated deaths [95% UI 150 000-372 000] and 57 200 attributable deaths [34 100-80 300] in 1990, to 550 000 associated deaths [500 000-600 000] and 130 000 attributable deaths [113 000-146 000] in 2021).
Among Gram-negative bacteria, resistance to carbapenems increased more than any other antibiotic class, rising from 619 000 associated deaths (405 000-834 000) in 1990, to 1·03 million associated deaths (909 000-1·16 million) in 2021, and from 127 000 attributable deaths (82 100-171 000) in 1990, to 216 000 (168 000-264 000) attributable deaths in 2021. There was a notable decrease in non-COVID-related infectious disease in 2020 and 2021. Our forecasts show that an estimated 1·91 million (1·56-2·26) deaths attributable to AMR and 8·22 million (6·85-9·65) deaths associated with AMR could occur globally in 2050. Super-regions with the highest all-age AMR mortality rate in 2050 are forecasted to be south Asia and Latin America and the Caribbean. Increases in deaths attributable to AMR will be largest among those 70 years and older (65·9% [61·2-69·8] of all-age deaths attributable to AMR in 2050). In stark contrast to the strong increase in number of deaths due to AMR of 69·6% (51·5-89·2) from 2022 to 2050, the number of DALYs showed a much smaller increase of 9·4% (-6·9 to 29·0) to 46·5 million (37·7 to 57·3) in 2050. Under the better care scenario, across all age groups, 92·0 million deaths (82·8-102·0) could be cumulatively averted between 2025 and 2050, through better care of severe infections and improved access to antibiotics, and under the Gram-negative drug scenario, 11·1 million AMR deaths (9·08-13·2) could be averted through the development of a Gram-negative drug pipeline to prevent AMR deaths. INTERPRETATION: This study presents the first comprehensive assessment of the global burden of AMR from 1990 to 2021, with results forecasted until 2050. Evaluating changing trends in AMR mortality across time and location is necessary to understand how this important global health threat is developing and prepares us to make informed decisions regarding interventions. 
Our findings show the importance of infection prevention, as shown by the reduction of AMR deaths in those younger than 5 years. Simultaneously, our results underscore the concerning trend of AMR burden among those older than 70 years, alongside a rapidly ageing global community. The opposing trends in the burden of AMR deaths between younger and older individuals explain the moderate future increase in global number of DALYs versus number of deaths. Given the high variability of AMR burden by location and age, it is important that interventions combine infection prevention, vaccination, minimisation of inappropriate antibiotic use in farming and humans, and research into new antibiotics to mitigate the number of AMR deaths that are forecasted for 2050. FUNDING: UK Department of Health and Social Care's Fleming Fund using UK aid, and the Wellcome Trust.
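The attributable/associated distinction in this abstract rests on the two counterfactuals described in the Methods. A minimal numerical sketch (all numbers hypothetical) shows how the two definitions diverge for the same set of resistant infections:

```python
# Two counterfactuals for AMR burden (hypothetical numbers):
#  - "associated": compare against a world with no infection at all
#  - "attributable": compare against the same infections being drug-susceptible
deaths_resistant = 2000        # observed deaths among drug-resistant infections
deaths_if_susceptible = 1500   # expected deaths had those infections been susceptible

deaths_associated = deaths_resistant                          # no-infection counterfactual
deaths_attributable = deaths_resistant - deaths_if_susceptible  # susceptible counterfactual

print(deaths_associated, deaths_attributable)  # 2000 500
```

The associated count is always the larger of the two, which matches the abstract's 4·71 million associated versus 1·14 million attributable deaths in 2021.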
Global burden and strength of evidence for 88 risk factors in 204 countries and 811 subnational locations, 1990-2021: a systematic analysis for the Global Burden of Disease Study 2021.
BACKGROUND: Understanding the health consequences associated with exposure to risk factors is necessary to inform public health policy and practice. To systematically quantify the contributions of risk factor exposures to specific health outcomes, the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2021 aims to provide comprehensive estimates of exposure levels, relative health risks, and attributable burden of disease for 88 risk factors in 204 countries and territories and 811 subnational locations, from 1990 to 2021. METHODS: The GBD 2021 risk factor analysis used data from 54 561 total distinct sources to produce epidemiological estimates for 88 risk factors and their associated health outcomes for a total of 631 risk-outcome pairs. Pairs were included on the basis of data-driven determination of a risk-outcome association. Age-sex-location-year-specific estimates were generated at global, regional, and national levels. Our approach followed the comparative risk assessment framework predicated on a causal web of hierarchically organised, potentially combinative, modifiable risks. Relative risks (RRs) of a given outcome occurring as a function of risk factor exposure were estimated separately for each risk-outcome pair, and summary exposure values (SEVs), representing risk-weighted exposure prevalence, and theoretical minimum risk exposure levels (TMRELs) were estimated for each risk factor. These estimates were used to calculate the population attributable fraction (PAF; ie, the proportional change in health risk that would occur if exposure to a risk factor were reduced to the TMREL). The product of PAFs and disease burden associated with a given outcome, measured in disability-adjusted life-years (DALYs), yielded measures of attributable burden (ie, the proportion of total disease burden attributable to a particular risk factor or combination of risk factors). 
Adjustments for mediation were applied to account for relationships involving risk factors that act indirectly on outcomes via intermediate risks. Attributable burden estimates were stratified by Socio-demographic Index (SDI) quintile and presented as counts, age-standardised rates, and rankings. To complement estimates of RR and attributable burden, newly developed burden of proof risk function (BPRF) methods were applied to yield supplementary, conservative interpretations of risk-outcome associations based on the consistency of underlying evidence, accounting for unexplained heterogeneity between input data from different studies. Estimates reported represent the mean value across 500 draws from the estimate's distribution, with 95% uncertainty intervals (UIs) calculated as the 2·5th and 97·5th percentile values across the draws. FINDINGS: Among the specific risk factors analysed for this study, particulate matter air pollution was the leading contributor to the global disease burden in 2021, contributing 8·0% (95% UI 6·7-9·4) of total DALYs, followed by high systolic blood pressure (SBP; 7·8% [6·4-9·2]), smoking (5·7% [4·7-6·8]), low birthweight and short gestation (5·6% [4·8-6·3]), and high fasting plasma glucose (FPG; 5·4% [4·8-6·0]). For younger demographics (ie, those aged 0-4 years and 5-14 years), risks such as low birthweight and short gestation and unsafe water, sanitation, and handwashing (WaSH) were among the leading risk factors, while for older age groups, metabolic risks such as high SBP, high body-mass index (BMI), high FPG, and high LDL cholesterol had a greater impact. 
From 2000 to 2021, there was an observable shift in global health challenges, marked by a decline in the number of all-age DALYs broadly attributable to behavioural risks (decrease of 20·7% [13·9-27·7]) and environmental and occupational risks (decrease of 22·0% [15·5-28·8]), coupled with a 49·4% (42·3-56·9) increase in DALYs attributable to metabolic risks, all reflecting ageing populations and changing lifestyles on a global scale. Age-standardised global DALY rates attributable to high BMI and high FPG rose considerably (15·7% [9·9-21·7] for high BMI and 7·9% [3·3-12·9] for high FPG) over this period, with exposure to these risks increasing annually at rates of 1·8% (1·6-1·9) for high BMI and 1·3% (1·1-1·5) for high FPG. By contrast, the global risk-attributable burden and exposure to many other risk factors declined, notably for risks such as child growth failure and unsafe water source, with age-standardised attributable DALYs decreasing by 71·5% (64·4-78·8) for child growth failure and 66·3% (60·2-72·0) for unsafe water source. We separated risk factors into three groups according to trajectory over time: those with a decreasing attributable burden, due largely to declining risk exposure (eg, diet high in trans-fat and household air pollution) but also to proportionally smaller child and youth populations (eg, child and maternal malnutrition); those for which the burden increased moderately in spite of declining risk exposure, due largely to population ageing (eg, smoking); and those for which the burden increased considerably due to both increasing risk exposure and population ageing (eg, ambient particulate matter air pollution, high BMI, high FPG, and high SBP). INTERPRETATION: Substantial progress has been made in reducing the global disease burden attributable to a range of risk factors, particularly those related to maternal and child health, WaSH, and household air pollution. 
Maintaining efforts to minimise the impact of these risk factors, especially in low SDI locations, is necessary to sustain progress. Successes in moderating the smoking-related burden by reducing risk exposure highlight the need to advance policies that reduce exposure to other leading risk factors such as ambient particulate matter air pollution and high SBP. Troubling increases in high FPG, high BMI, and other risk factors related to obesity and metabolic syndrome indicate an urgent need to identify and implement interventions. FUNDING: Bill & Melinda Gates Foundation.
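The PAF machinery described in the Methods can be illustrated with Levin's formula for a single dichotomous risk factor. The prevalence, relative risk, and DALY total below are hypothetical, and GBD itself integrates over continuous exposure distributions relative to the TMREL rather than using this two-level shortcut:

```python
# Population attributable fraction (Levin's formula, dichotomous exposure).
# All inputs are hypothetical illustrations.
p = 0.30    # prevalence of exposure in the population
rr = 2.5    # relative risk of the outcome given exposure

paf = p * (rr - 1) / (p * (rr - 1) + 1)

# Attributable burden = PAF x total disease burden for the outcome
dalys_total = 1_000_000
dalys_attributable = paf * dalys_total

print(round(paf, 3), round(dalys_attributable))  # 0.31 310345
```

That is, if 30% of the population is exposed at RR = 2.5, roughly 31% of the outcome's burden would be averted by reducing exposure to the (here, zero-exposure) counterfactual level.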
Integrated Serosurveillance of Infectious Diseases Using Multiplex Bead Assays: A Systematic Review.
Integrated serological surveillance (serosurveillance) involves testing for antibodies to multiple pathogens (or species) simultaneously and can be achieved using multiplex bead assays (MBAs). This systematic review aims to describe pathogens studied using MBAs, the operational implementation of MBAs, and how the data generated were synthesised. In November and December 2023, four databases were searched for studies utilising MBAs for the integrated serosurveillance of infectious diseases. Two reviewers independently screened and extracted data regarding the study settings and population, methodology, seroprevalence results, and operational implementation elements. Overall, 4765 studies were identified; 47 were eligible for inclusion, of which 41% (n = 19) investigated multiple malaria species, and 30% (n = 14) performed concurrent surveillance of malaria in combination with other infectious diseases. Additionally, 14 studies (29%) investigated a combination of multiple infectious diseases (other than malaria), and seven studies examined a combination of vaccine-preventable diseases. Haiti (n = 8) was the most studied country, followed by Ethiopia (n = 6), Bangladesh (n = 3), Kenya (n = 3), and Tanzania (n = 3). Only seven studies were found where integrated serosurveillance was the primary objective. The synthesis of data varied and included the investigation of age-specific seroprevalence (n = 25), risk factor analysis (n = 15), and spatial analysis of disease prevalence (n = 8). This review demonstrated that the use of MBAs for integrated surveillance of multiple pathogens is gaining traction; however, more research and capabilities in lower- and middle-income countries are needed to optimise and standardise sample collection, survey implementation, and the analysis and interpretation of results.
Geographical and population seroprevalence data can enable targeted public health interventions, highlighting the potential and importance of integrated serological surveillance as a public health tool.