Search results
Distribution of Burkholderia pseudomallei within a 300-cm deep soil profile: implications for environmental sampling
Abstract: The environmental distribution of Burkholderia pseudomallei, the causative agent of melioidosis, remains poorly understood. B. pseudomallei is known to occupy a variety of environmental niches, particularly in soil. This paper provides novel information about a putative association between soil biogeochemical heterogeneity and the vertical distribution of B. pseudomallei. We investigated (1) the distribution of B. pseudomallei along a 300-cm deep soil profile together with the variation of a range of soil physico-chemical properties; (2) whether correlations between the distribution of B. pseudomallei and soil physico-chemical properties exist; and (3) where such correlations exist, what they indicate with regard to the environmental conditions conducive to the occurrence of B. pseudomallei in soils. Unexpectedly, the highest concentrations of B. pseudomallei were observed between 100 and 200 cm below the soil surface. Our results indicate that unravelling the environmental conditions favorable to B. pseudomallei entails considering many aspects of the actual complexity of soil. Important recommendations regarding environmental sampling for B. pseudomallei can be drawn from this work, in particular that collecting samples down to the water table is of foremost importance, as groundwater persistence appears to be a controlling factor in the occurrence of B. pseudomallei in soil.
Rickettsial Infections Are Neglected Causes of Acute Febrile Illness in Teluk Intan, Peninsular Malaysia
Rickettsial infections are among the leading etiologies of acute febrile illness in Southeast Asia. However, recent data from Malaysia are limited. This prospective study was conducted in Teluk Intan, Peninsular Malaysia, from January to December 2016. We recruited 309 hospitalized adult patients with acute febrile illness. Clinical and biochemistry data were obtained, and patients were stratified into mild and severe infections based on the quick Sequential Organ Failure Assessment (qSOFA) score. Diagnostic assays including blood cultures, real-time PCR, and serology (IFA and MAT) were performed. In this study, pathogens were identified in 214 (69%) patients, of whom 199 (93%) had a single etiology and 15 (5%) had more than one etiology. The top three causes of febrile illness requiring hospitalization in this Malaysian study were leptospirosis (68; 32%), dengue (58; 27%), and rickettsioses (42; 19%). Fifty-five (18%) patients presented with severe disease with a qSOFA score of >2. Mortality was documented in 38 (12%) patients and was highest in leptospirosis (16; 42%) followed by rickettsiosis (4; 11%). While the significance of leptospirosis and dengue is recognized, the impact of rickettsial infections in Peninsular Malaysia remains underappreciated. Management guidelines for the in-patient care of acute febrile illness in Peninsular Malaysia are needed.
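The qSOFA stratification used in this study follows the standard bedside criteria (respiratory rate, systolic blood pressure, mentation). A minimal sketch of the standard scoring, with illustrative input values that are not taken from the study:

```python
def qsofa(resp_rate, systolic_bp, gcs):
    """quick SOFA (qSOFA): one point each for respiratory rate >= 22/min,
    systolic blood pressure <= 100 mmHg, and altered mentation (GCS < 15)."""
    return int(resp_rate >= 22) + int(systolic_bp <= 100) + int(gcs < 15)

# A score of 2 or more flags higher risk of poor outcome in suspected infection
print(qsofa(resp_rate=24, systolic_bp=95, gcs=15))  # 2
```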
Abattoir-Based Serological Surveillance and Spatial Risk Analysis of Foot-and-Mouth Disease, Brucellosis, and Q Fever in Lao PDR Large Ruminants
A national animal disease surveillance network initiated by the Lao PDR government was adopted and reinforced by a joint research project between the National Animal Health Laboratory (NAHL), the Department of Livestock and Fisheries (DLF), and the Mahidol Oxford Tropical Medicine Research Unit (MORU). The network was strengthened by staff training and practical exercises and was used to provide zoonotic and high-impact disease information on a national scale. Between January and December 2020, large ruminant samples were collected monthly from 18 abattoirs, one in each province, by provincial and district agriculture and forestry officers. The surveillance network collected a total of 4247 serum samples (1316 from buffaloes and 2931 from cattle) over this period. Samples were tested for antibodies against Brucella spp., Coxiella burnetii (Q fever) and Foot-and-Mouth Disease Non-Structural Protein (FMD NSP) using commercial ELISA kits and the Rose Bengal test. Seroprevalences of Q fever and brucellosis in large ruminants were low, at 1.7% (95% CI: 1.3, 2.1) and 0.7% (95% CI: 0.5, 1.0) respectively, while FMD NSP seroprevalence was 50.5% (95% CI: 49.0, 52.0). Univariate analyses showed differences in Q fever seroprevalence by destination (abattoir) province (p-value = 0.005), province of origin (p-value = 0.005), animal type (buffalo or cattle) (p-value = 0.0008), and collection month (p-value = 3.4 × 10−6). Similarly, brucellosis seroprevalences differed significantly by destination province (p-value < 0.00001), province of origin (p-value < 0.00001), animal type (p-value = 9.9 × 10−5), collection month (p-value < 0.00001), body condition score (p-value = 0.003), and age (p-value = 0.007).
Additionally, risk factors in the FMD NSP dataset included destination province (p-value < 0.00001), province of origin (p-value < 0.00001), sex (p-value = 7.97 × 10−8), age (p-value = 0.009), collection date (p-value < 0.00001), and collection month (p-value < 0.00001). Spatial analyses revealed no spatial correlation among FMD NSP-seropositive animals but did identify high-risk areas for Q fever and brucellosis. Further investigation of these higher-risk areas would provide a better epidemiological understanding of both diseases in Lao PDR. In conclusion, the abattoir serological survey provided useful information about disease exposure and potential risk factors, and the network is a good base for training field and laboratory staff in practical technical skills. However, the sustainability of such surveillance is low without an external source of funding, given the operational costs and the insufficient government budget. The cost-effectiveness of the abattoir survey could be increased by targeting hotspot areas, reducing fixed costs, and extending the focus to cover more diseases.
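Seroprevalence point estimates with 95% CIs like those reported above can be reproduced from raw counts; a sketch using the Wilson score interval. The positive count of 72 is back-calculated from the reported 1.7% of 4247 sera and is an assumption, since the abstract does not state it:

```python
import math

def seroprevalence_ci(positives, n, z=1.96):
    """Point seroprevalence with a Wilson score 95% confidence interval."""
    p = positives / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return p, centre - half, centre + half

# Assumed count: ~72 Q fever positives out of 4247 sera
p, lo, hi = seroprevalence_ci(72, 4247)
print(f"{p:.1%} (95% CI: {lo:.1%}, {hi:.1%})")  # 1.7% (95% CI: 1.3%, 2.1%)
```

The Wilson interval behaves better than the simple normal approximation at prevalences this close to zero, which is likely why the reported interval is asymmetric around the point estimate.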
Implementation of corticosteroids in treatment of COVID-19 in the ISARIC WHO Clinical Characterisation Protocol UK: prospective, cohort study
Background: Dexamethasone was the first intervention proven to reduce mortality in patients with COVID-19 being treated in hospital. We aimed to evaluate the adoption of corticosteroids in the treatment of COVID-19 in the UK after the RECOVERY trial publication on June 16, 2020, and to identify discrepancies in care. Methods: We did an audit of clinical implementation of corticosteroids in a prospective, observational, cohort study in 237 UK acute care hospitals between March 16, 2020, and April 14, 2021, restricted to patients aged 18 years or older with proven or high likelihood of COVID-19, who received supplementary oxygen. The primary outcome was administration of dexamethasone, prednisolone, hydrocortisone, or methylprednisolone. This study is registered with ISRCTN, ISRCTN66726260. Findings: Between June 17, 2020, and April 14, 2021, 47 795 (75·2%) of 63 525 patients on supplementary oxygen received corticosteroids; rates were higher among patients requiring critical care than among those receiving ward care (11 185 [86·6%] of 12 909 vs 36 415 [72·4%] of 50 278). Patients 50 years or older were significantly less likely to receive corticosteroids than those younger than 50 years (adjusted odds ratio 0·79 [95% CI 0·70–0·89], p=0·0001, for 70–79 years; 0·52 [0·46–0·58], p<0·0001, for >80 years), independent of patient demographics and illness severity. 84 (54·2%) of 155 pregnant women received corticosteroids. Rates of corticosteroid administration increased from 27·5% in the week before June 16, 2020, to 75–80% in January, 2021. Interpretation: Implementation of corticosteroids into clinical practice in the UK for patients with COVID-19 has been successful, but not universal. Patients older than 70 years, independent of illness severity, chronic neurological disease, and dementia, were less likely to receive corticosteroids than those who were younger, as were pregnant women.
This could reflect appropriate clinical decision making, but the possibility of inequitable access to life-saving care should be considered. Funding: UK National Institute for Health Research and UK Medical Research Council.
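As a rough check on the critical-care versus ward contrast reported above, a crude (unadjusted) odds ratio with a Woolf log-interval can be computed directly from the stated counts. Note that the paper's own estimates are adjusted, so this is only an illustration of the method, not a reproduction of the published figures:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a 95% CI (Woolf/log method) from a 2x2 table:
    a = exposed with outcome, b = exposed without, c = unexposed with, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, or_ * math.exp(-z * se), or_ * math.exp(z * se)

# Counts from the abstract: critical care 11185/12909 vs ward 36415/50278 received steroids
or_, lo, hi = odds_ratio_ci(11185, 12909 - 11185, 36415, 50278 - 36415)
print(f"OR {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```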
Spatiotemporal Epidemiology of Tuberculosis in Thailand from 2011 to 2020
Tuberculosis (TB) is a leading cause of infectious disease globally, especially in developing countries. Better knowledge of the spatial and temporal patterns of the TB burden is important for effective control programs as well as for informing resource and budget allocation. Studies have demonstrated that TB exhibits highly complex dynamics in both spatial and temporal dimensions at different levels. In Thailand, TB research has focused primarily on surveys and clinical aspects of the disease burden, with little attention to spatiotemporal heterogeneity. This study aimed to describe temporal trends and spatial patterns of TB incidence and mortality in Thailand from 2011 to 2020. Monthly TB case and death notification data were aggregated at the provincial level. Age-standardized incidence and mortality were calculated; time series and global and local clustering analyses were performed for the whole country. There was an overall decreasing trend, with seasonal peaks in the winter. There was spatial heterogeneity, with disease clusters in many regions, especially along international borders, suggesting that population movement and socioeconomic variables might affect the spatiotemporal distribution in Thailand. Understanding the space-time distribution of TB is useful for planning targeted disease control activities. This is particularly important in low- and middle-income countries, including Thailand, to help prioritize the allocation of limited resources.
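Age-standardized incidence, as used in this study, weights stratum-specific rates by a standard population's age distribution so that provinces with different age structures can be compared. A minimal sketch of direct standardization, using hypothetical rates and weights rather than the study's data:

```python
def direct_standardized_rate(stratum_rates, standard_weights):
    """Direct age-standardization: weight age-stratum-specific rates by a
    standard population's age distribution (weights must sum to 1)."""
    assert abs(sum(standard_weights) - 1.0) < 1e-9
    return sum(r * w for r, w in zip(stratum_rates, standard_weights))

# Hypothetical: TB incidence per 100,000 in three age bands, with standard weights
rates = [20.0, 150.0, 400.0]
weights = [0.4, 0.45, 0.15]
print(direct_standardized_rate(rates, weights))  # 8 + 67.5 + 60 = 135.5
```

The same function applies to mortality; only the stratum-specific rates change.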
Infectious diseases data observatory (IDDO) visceral leishmaniasis library of clinical therapeutic studies: A protocol for a living systematic review of clinical studies
Introduction: Visceral leishmaniasis (VL) is a vector-borne disease caused by protozoan parasites of the genus Leishmania. The disease is endemic in parts of South Asia, East Africa, South America and the Mediterranean region, with an estimated 50,000 to 90,000 cases occurring annually. A living systematic review of the existing scientific literature is proposed to identify clinical drug efficacy studies against VL, conducted following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Methods and analysis: The proposed living systematic review builds on a previous systematic review first carried out in 2016, and the current protocol is designed to capture any published or registered VL clinical study from Nov-2021 onwards. The following databases will be searched by a medical librarian: PubMed, Ovid Embase, Scopus, Web of Science Core Collection, Cochrane Central Register of Controlled Trials, clinicaltrials.gov, WHO ICTRP, as well as IMEMR, IMSEAR, and LILACS from the WHO Global Index Medicus. The systematic review will consider both randomised and non-randomised interventional studies, including single-armed studies. Ethics and dissemination: A database of eligible studies, including study characteristics, is openly available (https://www.iddo.org/tool/vl-surveyor) and will be updated every six months. All findings will be published in a peer-reviewed journal. PROSPERO registration: CRD42021284622 (29/11/2021)
No advantage of quadruple- or triple-class antiretroviral therapy as initial treatment in patients with very high viraemia.
Background: We assessed whether quadruple- or triple-class therapy for the initial treatment of HIV-1 infection provides a virological benefit over standard triple therapy in patients with very high plasma viraemia. The assessment was based on a national observational HIV cohort in the Netherlands. Methods: Inclusion criteria were age ≥18 years, treatment-naive, plasma viral load (pVL) ≥500,000 copies/ml, and initiation of quadruple or triple therapy between 2001 and 2011. Time to viral suppression, defined as pVL <50 copies/ml, was compared between the two groups using Kaplan-Meier plots and multivariate Cox regression analysis. Results: A total of 675 patients were included: 125 (19%) initiated quadruple and 550 (81%) triple therapy. Median pVL was 5.9 (IQR 5.8-6.1) log10 copies/ml in both groups (P=0.49). Twenty-two (18%) patients on quadruple and 63 (12%) on triple therapy interrupted the treatment regimen because of drug-related toxicity (P=0.06). Median time to viral suppression was 5.8 (IQR 4.6-7.9) and 6.0 (IQR 4.0-9.4) months in the patients on quadruple and triple therapy, respectively (log-rank, P=0.42). In the adjusted Cox analysis, quadruple therapy was not associated with time to viral suppression (HR 1.07 [95% CI 0.86, 1.33], P=0.53). Similar results were seen when comparing triple-class versus dual-class therapy (n=72 versus n=601, respectively). Conclusions: Initial quadruple- or triple-class therapy was as effective as standard triple therapy in suppressing HIV-1 in treatment-naive patients with very high viraemia and did not result in faster pVL decreases, but did expose patients to additional toxicity.
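Time to viral suppression in this study was compared with Kaplan-Meier estimates. A from-scratch sketch of the product-limit estimator on toy data; the follow-up times and censoring indicators below are illustrative, not the cohort's:

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator. times: follow-up (e.g. months
    to pVL < 50 copies/ml); events: 1 = event observed, 0 = censored.
    Returns [(time, S(t))] at each distinct event time."""
    pairs = sorted(zip(times, events))
    n_risk = len(pairs)
    survival, curve = 1.0, []
    for t, group in groupby(pairs, key=lambda p: p[0]):
        group = list(group)
        d = sum(e for _, e in group)      # events at time t
        if d:
            survival *= 1 - d / n_risk    # product-limit step
            curve.append((t, survival))
        n_risk -= len(group)              # events and censored leave the risk set
    return curve

# Toy data: 5 patients; censoring is marked 0
for t, s in kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0]):
    print(t, round(s, 3))
```

The log-rank test reported in the abstract then compares two such curves under the null hypothesis that the groups share a common survival function.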
Temporary antiretroviral treatment during primary HIV-1 infection has a positive impact on health-related quality of life: data from the Primo-SHM cohort study.
Objectives: The aim of the study was to compare health-related quality of life (HRQL) over 96 weeks in patients receiving no treatment or 24 or 60 weeks of combination antiretroviral therapy (cART) during primary HIV-1 infection (PHI). Methods: A multicentre prospective cohort study of PHI patients, with an embedded randomized trial, was carried out. HRQL was assessed with the Medical Outcomes Study Health Survey for HIV (MOS-HIV) and a symptom checklist administered at weeks 0, 8, 24, 36, 48, 60, 72, 84 and 96. Mixed linear models were used for the analysis of differences in HRQL among the three groups. Results: A total of 112 patients were included in the study: 28 received no treatment, 45 received 24 weeks of cART and 39 received 60 weeks of cART. Over 96 weeks of follow-up, the groups receiving 24 and 60 weeks of cART had better cognitive functioning than the no-treatment group (P = 0.005). Patients receiving 60 weeks of cART had less pain (P = 0.004), better role functioning (P = 0.001), better physical functioning (P = 0.02) and a better physical health summary score (P = 0.006) than the groups receiving no treatment or 24 weeks of cART. Mental health was better in patients receiving 24 weeks of cART than in patients in the no-treatment group or the group receiving 60 weeks of cART (P = 0.02). At week 8, patients in the groups receiving 24 and 60 weeks of cART reported more nausea (P = 0.002) and diarrhoea (P …). Conclusions: Temporary cART during PHI had a significant positive impact on patients' HRQL as compared with no treatment, despite the initial, short-term occurrence of more physical symptoms, probably related to drug toxicity.
A Case of Kaposi's Sarcoma during Primary HIV-1 Infection.
The majority of cases of Kaposi's sarcoma (KS) occur at low CD4 counts during chronic HIV-1 infection. We present a case of KS diagnosed during primary HIV-1 infection. This report aims to draw attention to the fact that KS may occur early in the course of HIV-1 infection and that primary HIV-1 infection may progress rapidly to AIDS.
Adaptation of HIV-1 envelope gp120 to humoral immunity at a population level.
By comparing HIV-1 variants from people who became infected at the beginning of the epidemic and from people who have recently contracted the virus, we observed an enhanced resistance of the virus to antibody neutralization over time, accompanied by an increase in the length of the variable loops and in the number of potential N-linked glycosylation sites on the HIV-1 envelope gp120 subunit. The enhanced neutralization resistance of HIV-1 in contemporary seroconverters coincided with the poorer elicitation of neutralizing antibody responses, which may have implications for vaccine design.
Activity of broadly neutralizing antibodies, including PG9, PG16, and VRC01, against recently transmitted subtype B HIV-1 variants from early and late in the epidemic.
For the development of a neutralizing antibody-based human immunodeficiency virus type 1 (HIV-1) vaccine, it is important to characterize which antibody specificities are most effective against currently circulating HIV-1 variants. We recently reported that HIV-1 has become more resistant to antibody neutralization over the course of the epidemic, and we here explore whether this increased neutralization resistance is also observed for the newly identified broadly neutralizing antibodies (BrNAbs) PG9, PG16, and VRC01. Furthermore, we performed a comprehensive analysis of the neutralization sensitivity of currently circulating, recently transmitted subtype B viruses to the best-characterized BrNAbs currently known. Virus variants isolated less than 6 months after seroconversion from individuals who seroconverted between 2003 and 2006 (n = 21) were significantly more resistant to neutralization by VRC01 than viruses from individuals who seroconverted between 1985 and 1989 (n = 14). In addition, viruses from contemporary seroconverters tended to be more resistant to neutralization by PG16, which coincided with the presence of more mutations at positions in the viral envelope that may potentially influence neutralization by this antibody. Despite this increased neutralization resistance, all recently transmitted viruses from contemporary seroconverters were sensitive to at least one BrNAb at concentrations of ≤5 μg/ml, with PG9, PG16, and VRC01 showing the greatest breadth of neutralization at lower concentrations. These results suggest that a vaccine capable of eliciting multiple BrNAb specificities will be necessary for protection of the population against HIV-1 infection.
Altered dynamics and differential infection profiles of lymphoid and myeloid cell subsets during acute and chronic HIV-1 infection.
The dynamics of immune cell populations during acute HIV-1 infection are not fully deciphered, especially for non-T cells. In this study, we tested whether specific cellular subsets of the innate arm of the immune response are affected early after HIV-1 infection. Using a cohort of HIV-1-infected individuals, we have monitored the relative frequency of blood T lymphocytes, monocytes, and DCs at various infection stages and measured their respective intracellular HIV-1 DNA loads. The HIV-1 DNA load in naive CD4(+) T lymphocytes, which are lost very early during acute infection, was ten- to 100-fold lower than in CD57(-) and CD57(+) memory CD4(+) T lymphocytes. We observed that despite rapid, persistent loss after HIV-1 infection, pDCs represented a non-negligible HIV-1 DNA reservoir. CD16(+) proinflammatory cDCs and monocytes accumulated gradually, and HIV-infected CD16(+) monocytes contained higher HIV-1 DNA loads than their CD16(-) counterpart during acute infection. During chronic infection, CD16(+) cDCs exhibited higher HIV-1 DNA loads than the CD16(-) population. Overall, our results demonstrate that non-T cell compartments are a major HIV-1 DNA reservoir, and CD16(+) monocytes and CD16(+) cDCs potentially play an important role in HIV-1 dissemination.
With Bare Feet in the Soil: Podoconiosis, a Neglected Cause of Tropical Lymphoedema.
Podoconiosis is a form of lymphoedema that occurs in tropical highland areas in genetically susceptible individuals who are exposed to irritant volcanic soils. The disease is preventable through consistent use of footwear and attention to foot hygiene; however, in endemic areas there is a strong barefoot tradition, and many cannot afford shoes. Patients with podoconiosis face significant physical disability, psychological comorbidity, reduced quality of life and experience frequent episodes of systemic illness due to acute dermatolymphangioadenitis. This review provides an overview of this important and neglected tropical skin disease and summarizes the latest research findings.
No treatment versus 24 or 60 weeks of antiretroviral treatment during primary HIV infection: the randomized Primo-SHM trial.
Background: The objective of this study was to assess the benefit of temporary combination antiretroviral therapy (cART) during primary HIV infection (PHI). Methods and findings: Adult patients with laboratory evidence of PHI were recruited in 13 HIV treatment centers in the Netherlands and randomly assigned to receive no treatment or 24 or 60 wk of cART (allocation in a 1:1:1 ratio); if therapy was clinically indicated, participants were randomized over the two treatment arms (allocation in a 1:1 ratio). Primary end points were (1) viral set point, defined as the plasma viral load 36 wk after randomization in the no-treatment arm and 36 wk after treatment interruption in the treatment arms, and (2) the total time that patients were off therapy, defined as the time between randomization and start of cART in the no-treatment arm, and the time between treatment interruption and restart of cART in the treatment arms. cART was (re)started in case of a confirmed CD4 cell count < 350 cells/mm3 or symptomatic HIV disease. In total, 173 participants were randomized. The modified intention-to-treat analysis comprised 168 patients: 115 were randomized over the three study arms, and 53 over the two treatment arms. Of the 115 patients randomized over the three study arms, the mean viral set point was 4.8 (standard deviation 0.6) log10 copies/ml in the no-treatment arm, and 4.0 (1.0) and 4.3 (0.9) log10 copies/ml in the 24- and 60-wk treatment arms (between groups: p < 0.001). The median total time off therapy in the no-treatment arm was 0.7 (95% CI 0.0-1.8) y compared to 3.0 (1.9-4.2) and 1.8 (0.5-3.0) y in the 24- and 60-wk treatment arms (log-rank test, p < 0.001).
In the adjusted Cox analysis, both 24 wk (hazard ratio 0.42 [95% CI 0.25-0.73]) and 60 wk of early treatment (hazard ratio 0.55 [0.32-0.95]) were associated with time to (re)start of cART. Conclusions: In this trial, temporary cART during PHI was found to transiently lower the viral set point and defer the restart of cART during chronic HIV infection.