Hepatology
○ Ovid Technologies (Wolters Kluwer Health)
Preprints posted in the last 7 days, ranked by how well they match Hepatology's content profile, based on 18 papers previously published here. The average preprint has a 0.02% match score for this journal, so anything above that is already an above-average fit.
Soundararajan, V.; Venkatakrishnan, A. J.; Murugadoss, K.; K, P.; Varma, G.; Aman, A.
Semaglutide has shown benefit in metabolic dysfunction-associated steatohepatitis (MASH), but real-world evidence across longitudinal liver phenotypes remains limited, particularly regarding how liver remodeling relates to weight loss and dose exposure. Using a de-identified federated electronic health record network spanning more than 29 million patients in the United States, including 489,785 semaglutide-treated adults, we analyzed 6,734 patients with baseline liver disease burden. We find that higher attained pre-landmark (0-2 years) semaglutide dose was associated with lower post-landmark (2-4 years) risk of steatohepatitis, alcoholic liver disease, and all-cause mortality, whereas greater pre-landmark weight loss was associated with lower post-landmark risk of steatohepatitis, steatotic liver disease, and hepatorenal syndrome, indicating distinct dose- and weight-linked patterns of long-term liver benefit. These associations were notable because semaglutide prescribing was generally lower during the post-landmark period, raising the possibility of durable benefit beyond peak exposure. To better understand the mechanistic basis of liver protection, we performed a complementary longitudinal study of 326 adults with paired noninvasive liver elastography measurements before and after treatment initiation. Median liver stiffness decreased from 4.85 [3.02-7.20] to 3.9 [2.6-5.8] kPa after semaglutide initiation (median change = -0.38 kPa; p<0.001), with 194 of 326 patients (59.5%) showing lower follow-up stiffness. A clinically meaningful reduction of at least 20% was observed in 133 of 326 patients (40.8%), and 69 of 326 (21.2%) shifted to a lower fibrosis stage by prespecified elastography thresholds. Larger improvements were also seen in patients with higher baseline stiffness (p<0.001); notably, 80% of patients with cirrhosis-range baseline stiffness (≥12.5 kPa) achieved ≥20% improvement versus 29.5% with minimal baseline disease (p<0.001). 
The proportion achieving at least 20% stiffness improvement was similar across weight-loss strata, including patients with no weight loss or weight gain and those with at least 10% weight loss (38.0% in each group), and liver stiffness change showed negligible correlation with changes in weight, BMI, HbA1c, alanine aminotransferase, or aspartate aminotransferase. To provide biological context, single-cell RNA analyses demonstrated sparse overall hepatic GLP1R expression (0.0239%), with enrichment in non-parenchymal niches including cholangiocytes, intrahepatic cholangiocytes, liver sinusoidal endothelial cells, and hepatic stellate cells implicated in fibrogenesis and vascular remodeling. Together, this real-world evidence suggests diverse liver benefits of semaglutide beyond weight loss, with intricate dose-response relationships.
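The paired-elastography endpoints above (median per-patient stiffness change and the fraction with a ≥20% relative reduction) can be sketched in miniature. The data below are synthetic values for illustration only, not study data, and the function name is our own:

```python
import statistics

def stiffness_response(baseline, followup, threshold=0.20):
    """Summarise paired liver-stiffness readings (kPa): the median
    per-patient change and the fraction of patients with a relative
    reduction of at least `threshold` (e.g. 0.20 for >=20%)."""
    changes = [f - b for b, f in zip(baseline, followup)]
    responders = sum(1 for b, f in zip(baseline, followup)
                     if (b - f) / b >= threshold)
    return statistics.median(changes), responders / len(baseline)

# Synthetic paired readings for illustration only (not study data).
base = [12.5, 4.8, 7.2, 3.0, 6.1]
post = [8.0, 4.6, 5.0, 3.1, 4.9]
median_change, responder_fraction = stiffness_response(base, post)
```

With these toy values, the patient starting at cirrhosis-range stiffness (12.5 kPa) clears the 20% threshold easily, mirroring the abstract's observation that larger baseline stiffness left more room for improvement.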
Sergeant, S.; Easter, L.; Mustin, T.; Ivester, P.; Legins, J.; Seeds, M. C.; Standage-Beier, C. S.; Cox, A.; Furdui, C. M.; Hallmark, B.; Chilton, F. H.
The modern Western diet (MWD) provides high linoleic acid (LA) exposure, typically contributing 6-9% of total caloric intake. These high LA levels have fueled a longstanding debate regarding whether this dietary pattern confers benefit or risk. Importantly, LA intake is disproportionately elevated among lower socioeconomic populations due to greater reliance on industrial seed oils and ultra-processed foods. Despite decades of research, controlled dietary intervention studies directly evaluating the biological consequences of varying LA exposure remain limited. The current randomized, double-blind intervention compared the effects of a 12-week Low LA diet (2.5% energy) versus a High LA diet (10.0% energy) in healthy adults. Primary outcomes included plasma highly unsaturated fatty acid (HUFA) concentrations and ex vivo zymosan-stimulated whole-blood oxylipin generation. Fifty-two participants completed the intervention. High LA exposure resulted in a marked reduction in plasma n-3 eicosapentaenoic acid (EPA) concentrations compared with the Low LA arm. In contrast, levels of arachidonic acid (ARA), dihomo-gamma-linolenic acid (DGLA), and docosahexaenoic acid (DHA) did not differ by dietary LA exposure. Analysis of oxylipin species revealed that levels of EPA-derived relative to ARA-derived mediators were significantly reduced in the High LA arm. These findings reveal that higher dietary LA selectively suppresses EPA pools and EPA-derived oxylipins without altering ARA, shifting the lipid mediator balance toward a more n-6-dominant profile.
Xie, R.; Schöttker, B.
Background & Aims: Clonal hematopoiesis of indeterminate potential (CHIP) has been linked to chronic liver disease progression, yet its role across the full spectrum of metabolic dysfunction-associated steatotic liver disease (MASLD), from initial development to end-stage complications, remains unclear. We aimed to comprehensively investigate the association of CHIP and its major subtypes with both the incidence and progression of MASLD. Methods: We conducted a prospective cohort study of 353,218 UK Biobank participants, stratified into a healthy cohort free of MASLD at baseline (Cohort 1; n=230,270) and a prevalent MASLD cohort (Cohort 2; n=122,948). CHIP was ascertained from whole-exome sequencing data. We used multivariable Cox regression, competing risk models, and mediation analyses to assess the associations of CHIP (overall, by driver gene, and by clone size) with incident MASLD, cirrhosis, hepatocellular carcinoma (HCC), and liver-related death. Results: In Cohort 1, CHIP was associated with an increased risk of incident MASLD (HR 1.25, 95% CI 1.08-1.44) and cirrhosis (HR 1.57, 95% CI 1.10-2.25). These associations were driven by non-DNMT3A mutations, particularly TET2, and showed a linear dose-response relationship with clone size. In Cohort 2, non-DNMT3A CHIP was associated with progression to cirrhosis (HR 1.82, 95% CI 1.28-2.58). The associations were more pronounced in males and in individuals without obesity or diabetes. C-reactive protein partially mediated the CHIP-MASLD association. Conclusion: CHIP, driven predominantly by non-DNMT3A mutations (particularly TET2), is an independent risk factor for both the development and progression of MASLD. These findings position CHIP as a novel player in the pathophysiology of MASLD and suggest potential avenues for risk stratification and targeted anti-inflammatory intervention. 
Impact and Implications: This large-scale, prospective study establishes clonal hematopoiesis of indeterminate potential (CHIP) as a novel and independent risk factor for the entire spectrum of metabolic dysfunction-associated steatotic liver disease (MASLD), from its initial development to its progression to cirrhosis and liver-related death. For hepatologists and hematologists, these findings identify a genetically defined, high-risk subpopulation, particularly individuals with non-DNMT3A mutations, who may benefit from enhanced liver surveillance. The identification of systemic inflammation as a partial mediator of the CHIP-MASLD association suggests that anti-inflammatory therapies currently under development for liver disease could represent a targeted treatment strategy for this growing patient population.
Diaz, F. C.; Waldrup, B.; Carranza, F. G.; Manjarrez, S.; Velazquez-Villarreal, E.
Background: Despite extensive characterization of key oncogenic drivers, pancreatic ductal adenocarcinoma (PDAC) continues to exhibit profound molecular heterogeneity and inconsistent responses to standard therapies, including gemcitabine. The role of pathway-level alterations, particularly in the context of age at onset and therapeutic exposure, remains insufficiently defined. Methods: In this study, we leveraged a conversational artificial intelligence framework (AI-HOPE-TP53 and AI-HOPE-PI3K) to enable precision oncology-driven interrogation of clinical and genomic data from 184 PDAC tumors, stratified by age at diagnosis and gemcitabine exposure. Using AI-enabled cohort construction and pathway-centric analyses, we evaluated alterations in TP53 and PI3K signaling networks, with findings validated through conventional statistical methods. Results: TP53 pathway analysis revealed a significantly higher frequency of TP53 mutations in early-onset compared to late-onset PDAC among gemcitabine-treated patients (86.7% vs. 57.1%, p = 0.04), with a similar trend observed between treated and untreated early-onset cases (86.7% vs. 40%, p = 0.07). Notably, in late-onset PDAC patients not treated with gemcitabine, absence of TP53 pathway alterations was associated with improved overall survival (p = 0.011). Complementary analyses of the PI3K pathway demonstrated a higher prevalence of pathway alterations in late-onset gemcitabine-treated tumors compared to untreated counterparts (13.2% vs. 2.7%, p = 0.02). Importantly, among late-onset patients not receiving gemcitabine, those without PI3K pathway alterations exhibited significantly improved overall survival (p < 0.0001). Conclusion: Together, these findings identify distinct TP53 and PI3K pathway dependencies that are modulated by both age of onset and treatment exposure in PDAC. 
This work highlights the utility of conversational artificial intelligence in enabling rapid, integrative, and hypothesis-generating analyses within a precision oncology framework, supporting the identification of clinically relevant molecular stratification strategies for this aggressive disease.
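The mutation-frequency comparisons above (e.g. 86.7% vs. 57.1%) are the kind of small-sample 2x2 contrast typically tested with Fisher's exact test. As a minimal, self-contained sketch (the abstract does not state which exact test was used, so this is illustrative):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of every table (with the same
    margins) that is no more probable than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def prob(x):
        # P(x successes when drawing row1 items from n, col1 of which succeed)
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = prob(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1)
               if prob(x) <= p_obs * (1 + 1e-9))

# Hypothetical table: 3/4 mutated in one group vs 1/4 in the other.
p = fisher_exact_two_sided(3, 1, 1, 3)
```

The `1 + 1e-9` tolerance guards against floating-point ties when deciding which tables count as "as extreme" as the observed one.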
Haeusler, I. L.; Etoori, D.; Campbell, C. N. J.; McDonald, S. L. R.; Lopez Bernal, J.; Mounier-Jack, S.; Kasstan-Dabush, B.; McDonald, H. I.; Parker, E. P. K.; Suffel, A.
Background: In England, individuals with chronic liver disease (CLD) are among those with the lowest seasonal influenza vaccine uptake despite being at elevated risk of severe influenza. We examined the relationship between CLD severity and aetiology and influenza vaccine uptake in England. Methods: A retrospective cohort study of adults (18-115 years) using Clinical Practice Research Datalink Aurum primary care data was conducted for five seasons (2019/20-2023/24). Poisson regression was used to estimate rates of uptake by CLD severity (clinical diagnoses categorised as low, moderate, or severe) and aetiology (alcohol-related, viral-related, and diagnoses in the Green Book guidelines). Findings: There were 182,174-277,470 individuals with CLD per cohort. Among those who were additionally age-eligible for vaccination, uptake was 71.1-79.7% compared to 30.9-40.5% in those not additionally age-eligible. Among individuals below age eligibility without other comorbidities, severity was associated with higher uptake (incidence rate ratio [IRR] moderate 1.80, 95% CI 1.69-1.90; severe 1.95, 95% CI 1.84-2.08 in 2023/24); there was no effect in those with at least one additional comorbidity (moderate 1.05, 95% CI 0.99-1.10; severe 1.05, 95% CI 1.01-1.09). Alcohol- and viral-related aetiology were also associated with increased uptake in those not additionally age-eligible. Among individuals meeting age eligibility without additional comorbidities, severity was associated with reduced uptake (moderate 0.81, 95% CI 0.73-0.90; severe 0.79, 95% CI 0.74-0.85), with attenuation in those with additional comorbidities (moderate 0.99, 95% CI 0.94-1.04; severe 0.91, 95% CI 0.89-0.94). 
Interpretation: CLD severity and aetiology were important determinants of uptake in the absence of additional indications for influenza vaccination. Future research should prioritise understanding facilitators and barriers to vaccine uptake in individuals with CLD, particularly for those at highest risk of severe infection. Funding: NIHR Health Protection Research Unit in Vaccines and Immunisation (NIHR200929/NIHR207408). Research in context. Evidence before this study: We searched PubMed up to June 2025 using the terms "chronic liver disease", "cirrhosis", "hepatitis", "influenza vaccination", "seasonal influenza", and "vaccine uptake". Previous research, including national data from England, has shown that people with chronic liver disease tend to have lower seasonal influenza vaccine uptake than individuals with other medical comorbidities which qualify for vaccination, such as diabetes, chronic kidney disease, or immunosuppression. The reasons for low influenza vaccine uptake in people with chronic liver disease are not well understood, and it is therefore difficult for vaccination providers, principally primary care services in England, to tailor interventions aimed at increasing uptake. Qualitative research involving individuals aged less than 65 years living in England with clinical risk comorbidities, most commonly diabetes, found that chronic disease management pathways inconsistently provided information about the importance of influenza vaccination as part of chronic disease management. Individuals with long-term conditions reported low perceived risk of influenza infection and limited awareness of vaccine benefits as important reasons for non-uptake. We hypothesised that the severity and aetiology of chronic liver disease may be important determinants of uptake. Added value of this study: We conducted a population-based study to examine how chronic liver disease severity and aetiology influence seasonal influenza vaccine uptake in adults in England. 
Using primary care electronic health record data from five consecutive influenza seasons (2019/20-2023/24), we found that more severe chronic liver disease was associated with a substantial increase in vaccine uptake in those without additional indications for seasonal influenza vaccination (age-based eligibility or other qualifying clinical risk comorbidities). Alcohol- and viral-related aetiology were also associated with increased uptake in those who were not additionally age-eligible for vaccination. In contrast, severity and alcohol- and viral-related underlying aetiology were associated with a modest reduction in uptake for individuals with chronic liver disease who also qualified for vaccination due to age. Implications of all the available evidence: Despite clear clinical vulnerability to infection and a substantially elevated risk of morbidity and mortality following infection, a large proportion of adults with chronic liver disease, particularly those aged under 65 years, remain unvaccinated against seasonal influenza each year. This study suggests that chronic liver disease severity and underlying aetiology are important determinants of uptake in individuals not meeting age-based vaccine eligibility, particularly in those without additional clinical risk comorbidities. This could be because of differing perceptions of influenza risk, or due to varying degrees of interaction with healthcare specialists as part of chronic disease management. In individuals who met age-based vaccination eligibility, the negative effect of severity on influenza vaccine uptake may reflect greater barriers to accessing vaccination services by those with more complex health needs, or competing medical priorities for long-term condition management during consultations. To inform targeted vaccination strategies, future research should aim to understand the specific facilitators and barriers to influenza vaccination experienced by individuals with chronic liver disease. 
This should include perspectives of individuals with different disease severity, across different age groups, in those with and without additional co-morbidities.
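The incidence rate ratios reported above come from Poisson regression; the basic quantity, an IRR with a Wald confidence interval computed on the log scale, can be sketched from raw event counts and person-time (hypothetical numbers, not the study's data):

```python
import math

def irr_ci(events_a, time_a, events_b, time_b, z=1.96):
    """Incidence rate ratio (group a vs. group b) with a Wald 95% CI.
    The standard error of log(IRR) is sqrt(1/events_a + 1/events_b)."""
    irr = (events_a / time_a) / (events_b / time_b)
    se = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Hypothetical: 90 vaccinations per 1,000 person-seasons vs. 50.
irr, lo, hi = irr_ci(90, 1000, 50, 1000)
```

A full Poisson regression additionally adjusts for covariates, but the unadjusted IRR above (1.8 in this toy example) is the same quantity the model's exponentiated coefficients estimate.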
Hoskins, J. W.; Christensen, T. A.; Eiser, D.; Char, E.; Mobaraki, M.; O'Brien, A.; Collins, I.; Zhong, J.; Patel, M. B.; Prasad, G.; Pancreatic Cancer Cohort Consortium and Pancreatic Cancer Case-Control Consortium (PanScan/PanC4), ; Arda, E.; Connelly, K. E.; Amundadottir, L. T.
Pancreatic ductal adenocarcinoma (PDAC) remains one of the deadliest human cancers. The largest published PDAC genome-wide association study (GWAS) to date identified 23 genetic risk signals, but most lack sufficient characterization. This study aimed to functionally characterize the chr13q12.2 (PLUT/PDX1) PDAC GWAS risk locus. Fine-mapping, luciferase reporter assays, and electrophoretic mobility shift assays implicated rs9581943, a PDX1 promoter SNP, as a functional variant underlying this GWAS signal. GTEx expression QTL analyses identified rs9581943 as a significant PDX1 eQTL in pancreas, and CRISPR/Cas9 editing in PDAC-derived cell lines confirmed a functional relationship. PDX1 is a transcription factor involved in early pancreas development and β-cell homeostasis, but its role in exocrine pancreatic cells is unclear. Single-nucleus RNA-seq analyses of pancreatic acinar and ductal cells from neonatal, adult, and chronic pancreatitis donors suggested that PDX1 activity alleviates high secretory load and ER stress in acinar cells and biases ducts toward homeostatic phenotypes. Similarly, scRNA-seq analyses of pancreatic tumors suggested that PDX1 activity reduces biosynthetic and inflammatory stress and promotes epithelial differentiation. Our study therefore implicates rs9581943 as a causal variant for the chr13q12.2 PDAC GWAS signal, wherein the risk allele reduces PDX1 expression, eroding PDX1's capacity to buffer stress and stabilize epithelial cell fate in the exocrine compartment.
Flevaris, K.; Trbojevic-Akmacic, I.; Goh, D.; Lalli, J. S.; Vuckovic, F.; Capin Vilaj, M.; Stambuk, J.; Kristic, J.; Mijakovac, A.; Ventham, N.; Kalla, R.; Latiano, A.; Manetti, N.; Li, D.; McGovern, D. P. B.; Kennedy, N. A.; Annese, V.; Lauc, G.; Satsangi, J.; Kontoravdi, C.
Background and Aims: Alterations in immunoglobulin G (IgG) N-glycosylation are implicated in inflammatory bowel disease (IBD); however, the robustness of IgG glycan signatures across IBD cohorts with diverse demographics and geographic origins remains underexplored. We aimed to determine whether compositional data analysis (CoDA) and machine learning (ML) can identify IBD-related IgG N-glycan signatures and whether these signatures capture disease-associated acceleration of biological aging. Methods: We analyzed the IgG glycome profiles of 1,367 plasma samples collected from healthy controls (HC), symptomatic controls (SC), and people with newly diagnosed Crohn's disease (CD) and ulcerative colitis (UC) across four cohorts (UK, Italy, United States, and Netherlands). IgG glycosylation was analyzed by ultra-high-performance liquid chromatography, yielding 24 total-area-normalized glycan peaks (GPs). Analyses were performed using cross-sectional data obtained at baseline. CoDA-powered association analyses were used to identify disease-related effects on GPs while controlling for demographic covariates. ML models were trained and evaluated to assess generalizability to unseen cohorts and demographic subgroups, with a focus on discrimination and reliability. Results: Across all cohorts, people with IBD demonstrated accelerated biological aging as quantified by the GlycanAge index. This was accompanied by consistent reductions in IgG galactosylation, with effects partially modulated by age. Classification models trained on glycomics and demographics achieved robust discrimination (AUROC~0.80) between non-IBD (HC+SC) and IBD across cohorts. Conclusion: These findings reveal accelerated biological aging in people with IBD and support the translational potential of IgG glycans as biomarkers and a novel route toward clinically interpretable personalized risk estimates.
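Total-area-normalised glycan peaks are compositional data: the parts sum to a constant, so standard statistics on the raw proportions can mislead. The workhorse of CoDA is the centred log-ratio (CLR) transform, sketched below (an illustrative implementation, not the authors' pipeline):

```python
import math

def clr(parts):
    """Centred log-ratio (CLR) transform for a composition of strictly
    positive parts (e.g. total-area-normalised glycan peaks): divide
    each part by the geometric mean before taking logs, which removes
    the unit-sum constraint and makes the values scale-invariant."""
    logs = [math.log(p) for p in parts]
    mean_log = sum(logs) / len(logs)
    return [l - mean_log for l in logs]
```

Two defining properties fall out directly: CLR values always sum to zero, and rescaling the whole composition (e.g. re-normalising peak areas) leaves the transform unchanged.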
Shen, Q.; Wang, G.; Fu, M.; Yao, K.; Yang, Y.; Zeng, Q.; Guo, Y.
Background: Lateral lymph node metastasis (LLNM) is associated with poor prognosis in patients with rectal cancer and may influence the indication for lateral lymph node dissection. Accurate preoperative identification of LLNM remains challenging. This study aimed to develop and internally validate a clinicoradiological model for preoperative prediction of LLNM in rectal cancer. Methods: A retrospective cohort of 64 patients undergoing lateral lymph node dissection (LLND) for rectal cancer was analysed; 21 (32.8%) had pathological LLNM. A prespecified preoperative clinicoradiological model was fitted using penalised logistic regression with L2 regularisation (ridge), incorporating MRI-measured lateral lymph node short-axis diameter (LLN-SAD), dichotomised clinical T stage (T3-4 vs T1-2), dichotomised clinical N stage (N+ vs N0), and log(CA19-9+1). Model performance was evaluated using the area under the receiver operating characteristic curve (AUC), calibration analysis, and bootstrap internal validation. Results: The model showed good discrimination (AUC 0.914), with an optimism-corrected AUC of 0.887 on bootstrap validation. Calibration remained acceptable after optimism correction (calibration intercept -0.127; slope 1.045). Decision curve analysis suggested net benefit across clinically relevant threshold probabilities, particularly between 0.10 and 0.30. The model was implemented as a web-based calculator to facilitate clinical use. Conclusion: This clinicoradiological model showed good discrimination, acceptable calibration, and potential clinical utility for preoperative assessment of LLNM risk in rectal cancer. It may assist individualized risk stratification and treatment planning, although external validation is required before routine clinical implementation.
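The modelling approach named above — ridge-penalised logistic regression, with discrimination summarised by the AUC — can be sketched in miniature. This is a toy illustration (gradient descent on a handful of synthetic points), not the authors' code:

```python
import math

def fit_ridge_logistic(X, y, lam=1.0, lr=0.1, iters=2000):
    """L2-penalised (ridge) logistic regression via gradient descent.
    The intercept w[0] is left unpenalised, as is conventional."""
    n, p = len(X), len(X[0])
    w = [0.0] * (p + 1)
    for _ in range(iters):
        grad = [0.0] * (p + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = 1 / (1 + math.exp(-z)) - yi   # prediction - label
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        for j in range(1, p + 1):
            grad[j] += lam * w[j]               # ridge penalty term
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

def auc(scores, labels):
    """AUC via the rank (Mann-Whitney) identity: the probability that a
    random positive scores higher than a random negative."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Tiny synthetic example: outcome becomes more likely as x grows.
X, y = [[0.0], [1.0], [2.0], [3.0]], [0, 0, 1, 1]
w = fit_ridge_logistic(X, y)
scores = [w[0] + w[1] * x[0] for x in X]
```

The penalty `lam` shrinks coefficients toward zero, which is what keeps a four-predictor model stable with only 21 events, and bootstrap optimism correction (refitting on resamples and comparing AUCs) then estimates how much of the apparent 0.914 is overfitting.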
Du, J.; Manna, A. K.; Medina-Serpas, M. A.; Hughes, E. P.; Bisoma, P.; Evason, K. J.; Young, A.; Wilson, W. D.; Brusko, T.; Farahat, A. A.; Tantin, D.
The transcription coregulator OCA-B promotes CD4+ T cell memory recall responses and autoimmunity. OCA-B T cell deletion prevents spontaneous type-1 diabetes (T1D) onset in non-obese diabetic (NOD) mice and blunts T1D in a subset of more aggressive models. However, the role of OCA-B in diabetes induced by treatment with immune checkpoint inhibitors (ICIs), and in the control of tumors with and without ICI treatment, has not been studied. Here we show that islet and pancreatic lymph node T cells from T1D individuals express measurable POU2AF1 mRNA. Deletion of OCA-B in T cells fully insulates 8-week-old NOD mice against ICI-induced diabetes and partially protects 12-week-old mice. Salivary and lacrimal gland infiltration and inflammation were also reduced. Protection was associated with a block in the differentiation of progenitor exhausted CD8+ T cells (TPEX) into terminally exhausted CD8+ T cells (TEX). We show that OCA-B T cell loss preserves anti-tumor immune responses following PD-1 blockade in different tumors and mouse strains. These findings point to a potential therapeutic window in which pharmaceuticals targeting OCA-B could be used to block the emergence of both spontaneous and ICI-induced autoimmunity while sparing anti-tumor immunity. We develop first-in-class small-molecule inhibitors of Oct1/OCA-B transcription complexes and show that administration into NOD mice also blocks diabetes emergence following PD-1 blockade. These results identify OCA-B as a promising therapeutic target for the prevention of autoimmunity and immune-related adverse events (irAEs).
Ni Chan Chin (Chengqin Ni), M.; Berrio, J. A.
Background: Accelerometer-derived behavioral phenotypes capture multidimensional aspects of human behavior extending well beyond physical activity, encompassing light exposure, step counts, physical activity patterns, sleep, and circadian rhythms. Whether these five domains constitute a unified behavioral architecture underlying cancer risk, and whether circadian organization and light exposure confer incremental predictive value beyond movement volume alone, remain to be comprehensively established. Methods: We conducted an accelerometer-wide association study (AWAS) encompassing the complete accelerometer-derived behavioral exposome across five behavioral domains in UK Biobank participants with valid wrist accelerometry data. Incident solid cancers were designated as the primary endpoint, with prespecified site-specific solid cancers and hematological malignancy as secondary outcomes. Cox proportional hazards models with age as the timescale were used. The minimal covariate set served as the primary reporting tier, followed by sensitivity analyses additionally adjusting for adiposity/metabolic factors, independent activity patterns, shift work history, and accelerometry measurement quality. Nominal statistical significance was defined as two-sided P < 0.05. Results: Among 89,080 participants, 6,598 incident solid cancer events were observed over a median follow-up of 8.39 years. In the minimally adjusted model, the pan-solid-tumor association atlas was dominated by signals from activity volume, inactivity fragmentation, and circadian rhythm. Higher overall acceleration (HR per SD: 0.91, 95% CI: 0.89-0.94) and higher daily step counts (HR: 0.93, 95% CI: 0.90-0.95) were independently associated with reduced solid cancer risk, while inactivity fragmentation metrics were consistently linked to higher risk. 
Notably, circadian rhythms, most prominently the cosinor mesor (Midline Estimating Statistic of Rhythm under the cosinor model), emerged as leading inverse risk signals, underscoring the independent contribution of circadian behavioral architecture. Site-specific analyses revealed pronounced heterogeneity across tumor sites. Lung cancer exhibited a robust inverse activity-risk gradient, while breast cancer showed reproducible associations with MVPA. Most strikingly, nocturnal light exposure demonstrated a tumor-site-specific association confined to pancreatic cancer, a signal absent across all other sites examined. Associations for uterine cancer were predominantly inactivity-related and substantially attenuated following adjustment for adiposity and metabolic factors. Conclusions: Across five accelerometer-derived behavioral domains, solid cancers as a whole were most consistently associated with a high-movement, low-fragmentation, and circadian-coherent behavioral profile. While site-specific heterogeneity exists, the broad cancer risk landscape is dominated by movement volume, inactivity fragmentation, and circadian rhythmicity. Light exposure, although more localized in its contribution, demonstrates a potentially novel and specific association with pancreatic cancer risk. These findings support a five-domain behavioral exposome framework for cancer epidemiology and, importantly, position circadian rhythm integrity and nocturnal light exposure as critically understudied dimensions warranting dedicated mechanistic investigation.
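The cosinor mesor highlighted above is the rhythm-adjusted mean of a single-component cosinor fit, y(t) ≈ M + A·cos(2πt/period + φ). A minimal sketch follows; it assumes samples evenly spaced over whole periods (so the cosine and sine regressors are orthogonal and least squares reduces to closed-form sums), and uses synthetic hourly data rather than accelerometry:

```python
import math

def cosinor(y, times, period=24.0):
    """Single-component cosinor fit y(t) ~ M + A*cos(2*pi*t/period + phi).

    Assumes evenly spaced samples covering an integer number of periods.
    Returns (mesor, amplitude, acrophase); the mesor M is the
    rhythm-adjusted mean (MESOR)."""
    n = len(y)
    ang = [2 * math.pi * t / period for t in times]
    mesor = sum(y) / n
    beta = 2 / n * sum(yi * math.cos(a) for yi, a in zip(y, ang))
    gamma = 2 / n * sum(yi * math.sin(a) for yi, a in zip(y, ang))
    return mesor, math.hypot(beta, gamma), math.atan2(-gamma, beta)

# Synthetic hourly signal with a 24 h rhythm (illustration only).
ts = list(range(24))
ys = [5 + 2 * math.cos(2 * math.pi * t / 24 - math.pi / 3) for t in ts]
mesor, amplitude, acrophase = cosinor(ys, ts)
```

For irregular sampling the same model is fitted by ordinary least squares on the cos/sin basis; the orthogonal shortcut above is the textbook special case.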
Moon, J.-Y.; Filigrana, P.; Gallo, L. C.; Perreira, K. M.; Cai, J.; Daviglus, M.; Fernandez-Rhodes, L. E.; Garcia-Bedoya, O.; Qi, Q.; Thyagarajan, B.; Tarraf, W.; Wang, T.; Kaplan, R.; Isasi, C. R.
Childhood socioeconomic position (SEP) can have lifelong effects on health. Many studies have used adult height as a surrogate marker for early-life conditions. In this study, we derived the non-genetic component of height, calculated as the residual from sex-specific standardized height regressed on genetically predicted height, as a surrogate for childhood SEP, using data from the Hispanic Community Health Study/Study of Latinos (2008-2011). A positive residual would indicate favorable early-life conditions promoting growth, while a negative residual indicates early-life adversity that may stunt development. The height residual was associated with early-life variables such as parental education, year of birth, US nativity, and age at first migration to the US (50 states/DC), supporting its validity as a surrogate for early-life conditions. Furthermore, the height residual was positively associated with better cardiovascular health (CVH) and cognitive function among middle-aged and older adults. Interestingly, among adults <35 years old, the height residual was negatively associated with the "Life's Essential 8" clinical CVH scores. These results suggest the non-genetic component of height as a surrogate for childhood environment, with predictive value for CVH and cognitive function.
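The height residual described above is simply the part of observed (standardized) height not explained by a linear regression on genetically predicted height. A minimal one-predictor sketch (illustrative only; the study's version is sex-specific and standardized):

```python
def ols_residuals(y, x):
    """Residuals of y after simple least-squares regression on x,
    i.e. the component of y not linearly explained by x."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    return [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
```

By construction the residuals are uncorrelated with the genetic predictor and sum to zero, so a positive value flags someone taller than their genetics would predict, the proposed marker of favorable early-life conditions.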
Andrei, F.; Tizzoni, M.; Veltri, G. A.
Background: Dengue is rapidly emerging in parts of Europe. How households value vector control attributes, and whether inferences depend on decision models or message framing, is unclear. Methods: We conducted a split-ballot online experiment among adults in Italy and France, as well as a hotspot subsample from Marche, Italy. National samples included 1,505 respondents in Italy and 1,501 in France; 183 respondents were recruited in Marche. Participants were randomised to a discrete choice experiment (random utility maximisation) or a regret-based choice experiment (random regret minimisation) and to one of three pre-task messages (control, loss aversion, community values). Each respondent completed 12 choice tasks comparing two dengue control programmes and an opt-out. We estimated mixed logit and mixed random-regret models with random parameters and treatment effects. Results: Across frameworks, nearby cases and high mosquito prevalence were the dominant drivers of programme uptake, whereas cost and operational burden were secondary. In pooled analyses, loss-aversion messaging increased the weight on high mosquito prevalence in both models (from 0.483 to 0.547 in the utility model; from 0.478 to 0.557 in the regret model). Cost effects were small nationally but larger in the hotspot subsample. Conclusions: Risk salience dominates preferences for dengue vector control in these European settings. Random utility and random regret models yield consistent rankings of attributes but differ in behavioural interpretation and some secondary effects; messaging effects were modest and context dependent.
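The two decision models compared above differ in how attribute levels map to choice probabilities: random utility maximisation scores each alternative on its own attributes, while random regret minimisation penalises each alternative by how much any competitor beats it, attribute by attribute. A minimal sketch of both (hypothetical attributes and coefficients, not the fitted models):

```python
import math

def utility_choice_probs(V):
    """Random-utility (multinomial logit) probabilities: softmax of the
    systematic utilities V, one per alternative."""
    e = [math.exp(v) for v in V]
    s = sum(e)
    return [x / s for x in e]

def regret_choice_probs(X, beta):
    """Random-regret-minimisation probabilities. The regret of
    alternative i sums ln(1 + exp(beta_k * (x_jk - x_ik))) over
    competitors j and attributes k; lower regret -> higher probability."""
    n = len(X)
    regrets = []
    for i in range(n):
        r = 0.0
        for j in range(n):
            if j == i:
                continue
            for bk, xik, xjk in zip(beta, X[i], X[j]):
                r += math.log1p(math.exp(bk * (xjk - xik)))
        regrets.append(r)
    e = [math.exp(-r) for r in regrets]
    s = sum(e)
    return [x / s for x in e]
```

Because regret is driven by pairwise attribute comparisons, RRM exhibits a compromise effect absent from RUM, which is why the two frameworks can rank attributes identically yet carry different behavioural interpretations.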
Fitzgerald, O.; Keller, E.; Illingworth, P.; Lieberman, D.; Peate, M.; Kotevski, D.; Paul, R.; Rodino, I.; Parle, A.; Hammarberg, K.; Copp, T.; Chambers, G. M.
Study question: What are the characteristics and treatment outcomes of women who undertook planned egg freezing (PEF) in Australia and New Zealand between 2009 and 2023? Summary answer: There has been an average yearly increase in the uptake of PEF of 35%, with most women undergoing a single PEF procedure in their mid-thirties. With at least ten years of follow-up, a little over one in four women returned, with nearly half of those using donor sperm and one-third achieving a live birth. What is known already: PEF, where women freeze their eggs as a strategy to preserve fertility, has increased dramatically in high-income countries in the last decade. Despite the rapid uptake of PEF, there remains limited information to guide women, clinicians, and policy makers regarding the characteristics of women undertaking this procedure and treatment outcomes. Study design, size, duration: A retrospective population-based cohort study of all women who undertook PEF in Australia and New Zealand between 2009 and 2023, including their subsequent return to thaw their eggs and treatment outcomes. Where women returned to utilise their eggs, all subsequent embryo transfer procedures were linked, enabling calculation of live birth rates per woman. Participants/materials, setting, methods: 20,209 women who undertook PEF in Australia and New Zealand between 2009 and 2023, including 1,657 women who returned to thaw their eggs. Main results and the role of chance: There has been a huge increase in uptake of PEF, from 55 women in 2009 to 4,919 in 2023. Women who freeze their eggs are typically aged 34-38 years (interquartile range) and nulliparous (98.6%). For women with at least 10 years of follow-up (i.e. undertook PEF in 2009-13; N=514), 27.9% returned and thawed their frozen eggs (average time to return: 4.9 years). This reduced to 22.1% in those with at least 5 years of follow-up (i.e. undertook PEF in 2009-2018; N=4,288). Of those who used their frozen eggs, 47% used donor sperm. 
After at least two years follow up, 33.9% had a live birth, rising over time to 37.8% for eggs thawed between 2019-2021. Limitations, reasons for cautionIn the timeframe 2009-2019 we did not have information on whether egg freezing occurred because of a cancer diagnosis, a cohort we wished to exclude from the study. As a result, for this timeframe we weighted observations by the probability that egg freezing occurred due to cancer, with the prediction model developed on the years 2020-2023. Wider implications of the findingsThis study provides recent and comprehensive data on PEF to guide prospective patients and clinicians and inform policy. The exponential growth in PEF in Australia and New Zealand mirrors trends in other high-income countries, suggesting a doubling time of 2-3 years. Study findings highlight the need for setting realistic expectations about the likelihood of returning to use frozen eggs and live birth rates. Study funding/competing interest(s)2020-2025 MRFF Emerging Priorities and Consumer Driven Research initiative: EPCD000014
Huang, X.; Hsieh, C.; Nguyen, Q.; Renteria, M. E.; Gharahkhani, P.
Wearable-derived physiological features have been associated with disease risk, but most studies to date focus on single conditions, limiting understanding of cross-disease patterns. This study adopts a transdiagnostic approach to examine whether wearable data capture shared and condition-specific physiological signatures across multiple chronic conditions spanning physical and mental health, and then evaluates the utility of these features for disease classification. A total of 9,301 patients with at least 21 days of consecutive Fitbit data from the All of Us Controlled Tier Dataset version 8 were analyzed. Disease subcohorts included cardiovascular disease (CVD), diabetes, obstructive sleep apnea (OSA), major depressive disorder (MDD), anxiety, bipolar disorder, and attention-deficit/hyperactivity disorder (ADHD), chosen based on prevalence and relevance. Logistic regression and XGBoost models were fitted for each disease subcohort versus the control cohort. We found that, compared with using only baseline demographic and lifestyle features, incorporating wearable-derived features improved classification performance in all subcohorts for both models, except for ADHD, where improvement was mainly observed for ROC-AUC in the logistic regression model, likely due to the smaller sample size of the ADHD subcohort. The largest performance gains were observed in MDD (increase in ROC-AUC of 0.077 for logistic regression, 0.071 for XGBoost; p < 0.001) and anxiety (increase in ROC-AUC of 0.077 for logistic regression, 0.108 for XGBoost; p < 0.001). This study provides one of the first comprehensive transdiagnostic evaluations of wearable-derived features for disease classification, highlighting their potential to enhance risk stratification in real-world settings as a practical complement to clinical assessments and providing a foundation for exploring more fine-grained wearable data.
Author summary: Wearable devices such as fitness trackers and smartwatches are becoming increasingly popular and affordable, providing continuous measurements of heart rate, physical activity, and sleep. Alongside the growing digitization of health records, this creates new opportunities for large-scale, real-world health studies. In this study, we analyzed wearable-derived physiological patterns across a range of chronic conditions spanning both physical and mental health to better understand how these signals relate to disease risk. We found that incorporating wearable-derived heart rate, activity and sleep features improved disease risk classification across several conditions, with particularly strong gains for major depressive disorder and anxiety. By examining how individual features contributed to model predictions, we also identified meaningful associations between physiological signals and disease risk. For example, both duration and day-to-day variation of deep and rapid eye movement (REM) sleep were associated with increased risk in certain conditions. Our study supports the development of real-time, automated tools to assess disease risk alongside clinical care.
Baldry, G.; Harb, A.-K.; Findlater, L.; Ogaz, D.; Migchelsen, S. J.; Fifer, H.; Saunders, J.; Mohammed, H.; Sinka, K.
Objectives: We determined the frequency of sexually transmitted infection (STI) testing among people accessing sexual health services (SHSs) in England.
Methods: We assessed STI testing frequency in face-to-face and online SHSs in England using data from the GUMCAD STI surveillance system. We quantified different combinations of tests (e.g. a single chlamydia test or a full STI screen), the number of tests completed in 2024, and test positivity by sociodemographic and behavioural characteristics, as well as clinical setting and outcomes.
Results: Overall, there were 2,222,028 attendances at SHSs in England in 2024 that involved tests for chlamydia, gonorrhoea, syphilis and/or HIV. Most of these attendances involved tests for all four of these STIs. Most people accessing SHSs in England tested once (80.1%), and a small minority (1.9%) tested at least quarterly (4+ times). Some groups had a comparably larger proportion of quarterly testers, including gay, bisexual, and other men who have sex with men (GBMSM) (6.7%), London residents (3.6%), online testers (2.5%), people using HIV-PrEP (13%), and people with 5+ partners in the previous 3 months (10.6%). Only 10.5% of GBMSM reporting higher-risk sexual behaviours tested quarterly, despite recommendations for quarterly testing in this group.
Conclusions: The majority of those who tested for STIs in England in 2024 tested only once. The minority who tested at least quarterly included a higher proportion of GBMSM, people using HIV-PrEP, London residents, and people reporting higher-risk behaviours. Quarterly testing appears broadly aligned with current testing recommendations in England; however, we also observed that only a low proportion of behaviourally high-risk GBMSM and HIV-PrEP users are meeting these recommendations. Groups with lower or higher testing frequency should be considered when developing interventions and updating guidelines related to STI testing.
WHAT IS ALREADY KNOWN ON THIS TOPIC: The effectiveness of asymptomatic testing for chlamydia and gonorrhoea in gay, bisexual and other men who have sex with men (GBMSM), and the potential impact of the consequent increased antibiotic use on rising antimicrobial resistance and individual harm, have recently been questioned. Testing and treatment remain a key pillar of STI prevention and management; despite this, there is limited evidence on STI testing frequency within sexual health services (SHSs) at a national level.
WHAT THIS STUDY ADDS: This analysis shows that the majority of people attending SHSs in England in 2024 tested once, and only a small proportion of behaviourally high-risk people tested frequently.
HOW THIS STUDY MIGHT AFFECT RESEARCH, PRACTICE OR POLICY: Awareness of groups that are behaviourally high risk but test infrequently is important to guide interventions and messaging regarding STI testing. The low levels of frequent testing, even among those for whom quarterly testing is recommended under UK guidelines, provide important context for the wider discussion around asymptomatic STI screening.
Soltys, K.; Sara-Buchbut, R.; Ish Shalom, N.; Stokar, J.; Klein, B. Y.; Calderon-Margalit, R.; Greenblatt, C. L.; Ben-Haim, M. S.
Dementia affects tens of millions of people worldwide, yet disease-modifying treatments remain strikingly limited. Although the recombinant zoster vaccine Shingrix has been associated with reduced dementia incidence, its potential influence on individuals already living with dementia is unknown. Here, we followed a propensity-score matched cohort of 68,960 US dementia patients using a nationwide electronic health record network, comparing Shingrix recipients within two years of diagnosis to recipients of any other vaccine. Shingrix was associated with substantially reduced all-cause mortality across the first three years of follow-up (hazard ratios 0.74, 0.88, and 0.89; P≤0.006), robust across multiple sensitivity analyses. Furthermore, within-individual subgroup analyses of repeated Mini-Mental State Examinations conducted 3-6 years apart revealed significantly divergent cognitive decline rates across groups (time-by-group interaction P=0.002). Interval vaccination was associated with more stable cognition, contrasting with steeper declines in unvaccinated individuals. These findings support prospective evaluation of recombinant zoster vaccination as a potential strategy to improve outcomes in patients with established dementia.
Mullen, C.; Barr, R. D.; Strumpf, E.; El-Zein, M.; Franco, E. L.; Malagon, T.
Background: Timely cancer diagnosis in children and adolescents is critical to improving outcomes, yet substantial variation in diagnostic intervals persists across cancer types and care settings. We aimed to quantify time to diagnosis and assess variation by patient, demographic, and system-level factors.
Methods: We conducted a retrospective population-based study of children and adolescents aged 0-19 years diagnosed with one of 12 common cancers between 2010 and 2022 in Quebec, Canada. The diagnostic interval was defined as the time from the first cancer-related healthcare encounter to diagnosis. We calculated medians and interquartile ranges (IQR) overall and by cancer type, and used multivariable quantile regression to identify factors associated with time to diagnosis at the 25th, 50th, and 75th percentiles.
Results: Among 2,927 individuals with cancer, diagnostic intervals varied by cancer type and age. Median intervals were longest for carcinomas (100 days; IQR 33-192) and shortest for leukemias (8 days; IQR 3-44). Compared with children living in Montreal, living in regional areas and other large urban centres was associated with longer 50th and 75th percentiles of time to diagnosis for hepatic and central nervous system (CNS) tumours. Diagnostic intervals were shorter in the post-pandemic period (2020-2022) across several cancer sites, with CNS tumours showing reductions across all quantiles.
Interpretation: Diagnostic timeliness differed by cancer type, age, and rurality, but not by sex or by material or social deprivation. The shorter diagnostic intervals observed in the post-pandemic period suggest that pandemic-related changes in care pathways may have expedited diagnosis for some cancers.
Panapruksachat, S.; Troupin, C.; Souksavanh, M.; Keeratipusana, C.; Vongsouvath, M.; Vongphachanh, S.; Vongsouvath, M.; Phommasone, K.; Somlor, S.; Robinson, M. T.; Chookajorn, T.; Kochakarn, T.; Day, N. P.; Mayxay, M.; Letizia, A. G.; Dubot-Peres, A.; Ashley, E. A.; Buchy, P.; Xangsayarath, P.; Batty, E. M.
We used 2,492 whole-genome sequences from Laos to investigate the molecular epidemiology of SARS-CoV-2 from 2021 through 2024, covering the major waves of COVID-19 in Laos, including periods of travel restrictions and the period after travel across international borders was relaxed. We identify successive waves of COVID-19 caused by shifts in the dominant lineage, beginning with the Alpha variant in April 2021 and continuing through the Delta and Omicron variants. We quantify a shift from a small number of viral introductions responsible for widespread transmission in early waves to a larger number of introductions for each variant after travel restrictions were lifted, and identify potential routes of introduction into the country. Our study underscores the importance of genomic surveillance in public health responses for characterizing viral transmission dynamics during pandemics.
Meagher, N.; Hettiarachchi, D.; Hawkins, M. R.; Tavlian, S.; Spirkoska, V.; McVernon, J.; Carville, K. S.; Price, D. J.; Villanueva Cabezas, J. P.; Marcato, A. J.
Background: The World Health Organization has developed several global template protocols for epidemiological investigations, including for household transmission investigations (HHTIs). These investigations facilitate rapid characterisation of novel or re-emerging respiratory pathogens and support evidence-based public health actions. Beyond technical readiness, community buy-in is central to the feasibility and acceptability of HHTIs. Research is needed to determine their perceived legitimacy among the community, to inform local protocol adaptation and the development of implementation plans that consider community attitudes and needs.
Methods: In 2025, we conducted a convenience survey of community members living in Victoria, Australia to explore: their understanding of emerging respiratory diseases; their willingness to take part in public health surveillance activities such as HHTIs; the acceptability of clinical and epidemiological data collection and respiratory/blood sample collection as main components of HHTIs; and their comfort with including their companion animals in HHTIs.
Results: We received 282 survey responses, of which 235 were included in the analysis dataset. Compared with the general Victorian population, our sample included a higher proportion of participants who reported being female, tertiary-educated, of Aboriginal and/or Torres Strait Islander heritage, born in Australia, and speaking only English at home. Participants indicated overall high levels of comfort and acceptability towards participation in HHTIs, particularly in relation to clinical and epidemiological data collection, with lesser but still high levels of comfort with providing multiple respiratory specimens over a 14-day period. Participants were least comfortable with other specimens such as urine and blood. Involving companion animals in HHTIs was as acceptable as the human-focused components.
Conclusions: Although our survey population was not representative of the general Victorian population, our findings provide valuable descriptive insights into the acceptability of HHTIs in Victoria, Australia, from which to benchmark future local and international surveys and community engagement activities.
Ahmed, W.; Gebrewold, M.; Verhagen, R.; Koh, M.; Gazeley, J.; Levy, A.; Simpson, S.; Nolan, M.
Wastewater surveillance (WWS) is established as a vital tool for monitoring polio and SARS-CoV-2, with potential to improve surveillance for many other infectious diseases. This study evaluated the feasibility of detecting measles virus (MeV) RNA in wastewater as part of a national WWS preparedness trial in Brisbane, Australia, from March to June 2025. Composite and passive sampling methods were employed in parallel at three wastewater treatment plants serving populations between 230,000 and 584,000. Nucleic acids were extracted and analyzed using RT-qPCR targeting the MeV N and M genes to distinguish wild-type and vaccine strains. MeV RNA was detected in both 24-hour composite and passive samples on May 26-27, 2025 from the largest catchment (584,000), which also included an international airport. No measles cases were reported in this city or region within 4 weeks of the WWS detections. The detections were confirmed as a vaccine-derived measles virus (MeVV) strain via a specific RT-qPCR assay. Extraction recoveries varied (11.5% to 70.5%), with passive sampling showing higher efficiency. This is the first report of the use of passive samples for detection of MeV. These findings are consistent with other studies reporting WWS detections of both MeVV genotype A and wild-type genotypes B and/or D. The study demonstrates the potential for sensitive MeV WWS with rapid differentiation of MeVV from wild-type MeV shedding, including at airport transport hubs and with different sample types. WWS could strengthen measles surveillance by enabling rapid detection of MeV RNA and supporting outbreak preparedness and response; this requires optimised methods that are specific to, or can differentiate, wild-type MeV from MeVV. Furthermore, the successful detection of MeV using passive sampling in this study highlights its potential for deployment in diverse global contexts, including non-sewered settings.