Psychopharmacology
Springer Science and Business Media LLC
Preprints posted in the last 7 days, ranked by how well they match Psychopharmacology's content profile, based on 59 papers previously published here. The average preprint has a 0.03% match score for this journal, so anything above that is already an above-average fit.
Monson, E. T.; Shabalin, A. A.; Diblasi, E.; Staley, M. J.; Kaufman, E. A.; Docherty, A. R.; Bakian, A. V.; Coon, H.; Keeshin, B. R.
Importance: Suicide is a leading cause of death in the United States, with risk strongly influenced by interpersonal trauma, contributing to treatment resistance and clinical complexity. Objective: To assess clinical and genetic factors in individuals who died from suicide, with and without interpersonal trauma exposure. Design: Individuals who died from suicide with and without trauma were compared in a retrospective case-case design. Prevalence of 19 broad clinical categories was assessed between groups. Results directed selection of 42 clinical subcategories and 40 polygenic scores (PGS) for further assessment. Multivariable logistic regression models, adjusted for critical covariates and multiple tests, were formulated. Models were also stratified by age group (<26 and >=26 years), sex, and age/sex. Setting: A population-based evaluation of comorbidity and polygenic scoring in two suicide death subgroups. Participants: A total of 8,738 Utah Suicide Mortality Research Study individuals (23.9% female, average age = 42.6 years) who died from suicide were evaluated, divided into trauma-exposed (N = 1,091) and non-trauma-exposed (N = 7,647) individuals. A subset of unrelated European genotyped individuals was also assessed in PGS analyses (trauma N = 491; non-trauma N = 3,233). Exposures: Trauma is here defined as interpersonal trauma exposure, including abuse, assault, and neglect, from International Classification of Diseases coding. Main Outcomes and Measures: Prevalence of comorbid clinical categories and subcategories and PGS enrichment in trauma-exposed versus non-trauma-exposed suicide deaths. Results: Overall, trauma-exposed individuals died from suicide earlier (mean age of 38.1 versus 43.3 years; P < 0.0001) and were disproportionately female (38% versus 21%, OR = 3.3, CI = 2.9-3.8). Prevalence of asphyxiation and overdose methods, prior suicidality, psychiatric diagnoses, and substance use (OR range = 1.3-3.7) was elevated in trauma-exposed individuals who died from suicide.
PGS were also elevated in trauma-exposed individuals who died from suicide for depression, bipolar disorder, cannabis use, PTSD, insomnia, and schizophrenia (OR range = 1.1-1.4), with ADHD and opioid use showing uniquely elevated PGS in trauma-exposed males (OR range = 1.2-1.4). Conclusions and Relevance: Results demonstrated multiple convergent lines of age- and sex-specific evidence differentiating trauma-exposed from non-trauma-exposed suicide deaths. Such findings suggest unique biological backgrounds and may refine identification and treatment of this high-risk group.
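As a generic illustration of how PGS enrichment odds ratios such as those above are obtained (a sketch, not the study's own code): in a logistic regression on a standardised polygenic score, the OR per 1-SD increase is simply the exponentiated coefficient.

```python
import math

def or_per_sd(beta, ci_beta=None):
    """Convert a logistic-regression coefficient for a standardised
    polygenic score into an odds ratio per 1-SD increase; optionally
    exponentiate the coefficient's confidence bounds as well."""
    if ci_beta is None:
        return math.exp(beta)
    lo, hi = ci_beta
    return math.exp(beta), (math.exp(lo), math.exp(hi))

# A hypothetical coefficient of 0 corresponds to no enrichment (OR = 1).
or_per_sd(0.0)
```

Because the OR is per standard deviation, scores must be standardised in the same reference sample for ORs to be comparable across traits.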
Wei, M.; Zhang, H.; Peng, Q.
Background: Early initiation of substance use is linked to later adverse outcomes, and risk factors come from multiple domains and are shared across substances. In our previous work, traditional time-to-event Cox models identified individual risk factors, but these models are not designed to jointly model multiple outcomes or capture complex non-linear relationships. Multi-task learning (MTL) can leverage shared structure across related outcomes to improve prediction and distinguish common versus substance-specific predictors. However, most MTL studies rely on baseline features and focus on single outcomes, which limits their ability to capture shared risk and temporal changes. Substance use initiation is a time-dependent process that unfolds during development and reflects changing exposures over time. Baseline-only models cannot capture these changes or represent risk dynamics. Discrete-time modeling provides a practical approach by estimating interval-level initiation risk and combining it into cumulative risk at the subject level. By integrating multi-task learning with dynamic modeling, it is possible to share information across outcomes while capturing how risk evolves over time, which may improve prediction performance. Methods: Using the Adolescent Brain Cognitive Development (ABCD) Study (release 5.1), we developed two complementary MTL frameworks to predict initiation of alcohol, nicotine, cannabis, and any substance use. A baseline MTL model predicted fixed-horizon (48-month) initiation using one record per participant, while a dynamic discrete-time MTL model incorporated longitudinal interval data to model time-varying risk. Both models used multi-domain environmental exposures, core covariates, and polygenic risk scores (PRS). Performance was evaluated on a held-out test set using AUROC, PR-AUC, and calibration metrics, and compared with single-task logistic regression (LR).
Feature importance was assessed using permutation importance and compared with Cox proportional hazards models. Results: MTL showed comparable or improved performance relative to LR, with larger gains for low-prevalence outcomes (cannabis and nicotine). Incorporating longitudinal information led to consistent improvements across all outcomes. Dynamic models increased AUROC by +0.044 to +0.062 for MTL and +0.050 to +0.084 for LR, indicating that temporal information was the primary driver of performance gains. Feature importance analyses showed modest overlap across methods, with higher agreement between dynamic MTL and Cox models than static MTL. A small set of features, including externalizing behavior, parental monitoring, and developmental factors, was consistently identified across all approaches. Conclusions: Dynamic multi-task learning improves the prediction of substance use initiation by leveraging longitudinal structure and shared information across outcomes. While MTL provides additional gains, incorporating time-varying information is the dominant factor for improving performance. Combining baseline and dynamic frameworks offers a comprehensive strategy for identifying robust risk factors and modeling adolescent substance use initiation.
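The discrete-time framing described above estimates interval-level initiation risk and combines it into subject-level cumulative risk. A minimal sketch of that combination step, assuming per-interval hazard estimates are already available (e.g. as outputs of such a dynamic model):

```python
def cumulative_risk(interval_hazards):
    """Combine per-interval initiation hazards h_t into a subject-level
    cumulative risk: P(initiate by T) = 1 - prod_t (1 - h_t)."""
    survival = 1.0
    for h in interval_hazards:
        survival *= 1.0 - h  # probability of still not initiating
    return 1.0 - survival

# A hypothetical subject with rising hazard over three follow-up intervals:
cumulative_risk([0.05, 0.10, 0.20])  # 1 - 0.95*0.90*0.80 = 0.316
```

Because each interval contributes one record, a subject censored mid-follow-up simply contributes fewer intervals, which is what makes the discrete-time expansion practical for longitudinal cohorts.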
Wei, M.; Peng, Q.
Background: Substance use initiation in adolescence is influenced by both genetic and environmental factors; however, large-scale genetic studies often treat initiation as a binary outcome and underuse longitudinal timing information. Methods: We conducted time-to-event (survival) genome-wide association analyses (GWAS) of initiation for four outcomes (alcohol, nicotine, cannabis, and any substance use) using longitudinal follow-up data from the Adolescent Brain Cognitive Development (ABCD) Study. We performed ancestry-stratified GWAS within European (EUR), African (AFR), and Hispanic (HISP) groups, applying consistent quality control and covariate adjustment. Summary statistics were harmonized across ancestries and meta-analyzed using inverse-variance weighted fixed-effects and DerSimonian-Laird random-effects models. We evaluated genomic inflation and heterogeneity (Cochran's Q and I^2), identified independent lead variants at genome-wide and suggestive significance thresholds, and assessed cross-trait overlap of associated loci. Results: In the multi-ancestry meta-analysis, we observed suggestive association signals across traits (minimum p-values: alcohol ~1 × 10^-7, any ~1 × 10^-7, cannabis ~5 × 10^-8, nicotine ~1 × 10^-8). Nicotine initiation showed one genome-wide significant variant in both fixed- and random-effects meta-analyses (p < 5 × 10^-8). Across traits, suggestive loci demonstrated limited overlap, with the strongest concordance between alcohol and any substance use, consistent with shared liability. Heterogeneity statistics indicated that some loci exhibited cross-ancestry variation in effect estimates. Conclusions: Survival GWAS leveraging initiation timing can identify genetic signals that may be missed by binary designs and enables principled multi-ancestry synthesis.
Our results highlight both shared and trait-specific genetic contributions to early substance initiation and provide a foundation for downstream functional annotation and integrative modeling with environmental risk factors. These findings demonstrate the value of incorporating developmental timing into genetic discovery and provide a framework for integrating longitudinal risk modeling with genomic analyses.
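The inverse-variance weighted fixed-effects meta-analysis named above has a simple closed form. As a hedged sketch (illustrative, not the authors' pipeline), the pooled effect, its standard error, and Cochran's Q / I^2 for cross-ancestry heterogeneity can be computed per variant as:

```python
def fixed_effects_meta(betas, ses):
    """Inverse-variance weighted fixed-effects meta-analysis across
    strata: w_i = 1/se_i^2, pooled beta = sum(w_i*b_i)/sum(w_i)."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    # Cochran's Q measures heterogeneity of effects across strata;
    # I^2 expresses the share of variation beyond chance (floored at 0).
    q = sum(w * (b - pooled) ** 2 for w, b in zip(weights, betas))
    i2 = max(0.0, (q - (len(betas) - 1)) / q) if q > 0 else 0.0
    return pooled, pooled_se, q, i2

# Two hypothetical ancestry-specific estimates with equal precision:
fixed_effects_meta([0.1, 0.3], [0.1, 0.1])  # pooled beta = 0.2
```

DerSimonian-Laird random-effects differs only in inflating the weights' denominators by an estimated between-stratum variance derived from this same Q.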
Umar, M.; Hussain, F.; Khizar, B.; Khan, I.; Khan, F.; Cotic, M.; Chan, L.; Hussain, A.; Ali, M. N.; Gill, S. A.; Mustafa, A. B.; Dogar, I. A.; Nizami, A. T.; Haq, M. M. u.; Mufti, K.; Ansari, M. A.; Hussain, M. I.; Choudhary, S. T.; Maqsood, N.; Rasool, G.; Ali, H.; Ilyas, M.; Tariq, M.; Shafiq, S.; Khan, A. A.; Rashid, S.; Ahmad, H.; Bettani, K. U.; Khan, M. K.; Choudhary, A. R.; Mehdi, M.; Shakoor, A.; Mehmood, N.; Mufti, A. A.; Bhatia, M. R.; Ali, M.; Khan, M. A.; Alam, N.; Naqvi, S. Q.-i.-H.; Mughal, N.; Ilyas, N.; Channar, P.; Ijaz, P.; Din, A.; Agha, H.; Channa, S.; Ambreen, S.; Rehman,
Background: Major depressive disorder (MDD), a leading cause of disability worldwide, exhibits substantial heterogeneity in treatment outcomes. Patients who do not respond to standard antidepressant therapy account for the majority of MDD's disease burden. Risk factors have been implicated in treatment response, including genes affecting how antidepressants are metabolised. Yet, despite its clinical importance, risk factors for treatment-resistant depression (TRD) remain unexplored in low- and middle-income countries (LMIC). We used data from the DIVERGE study on MDD to investigate the risk factors of TRD in Pakistan. Methods: DIVERGE is a genetic epidemiological study that recruited adult MDD patients (≥18 years) between Sep 27, 2021, and Jun 30, 2025, from psychiatric care facilities across Pakistan. Detailed phenotypic information was collected by trained interviewers and blood samples were taken. The Infinium Global Diversity Array with Enhanced PGx-8 from Illumina was used for genotyping, followed by DRAGEN calling to infer metaboliser phenotypes for Cytochrome P450 (CYP) enzyme genes. We defined TRD as minimal to no improvement after ≥12 weeks of adherent antidepressant therapy. We conducted multi-level logistic regression to test the association of demographic, clinical and pharmacogenetic variables with TRD. Findings: Among 3,677 eligible patients, polypharmacy was rampant; 86% were prescribed another psychotropic drug along with an antidepressant. Psychological therapies were uncommon (6%), while 49% of patients had previously visited a religious leader/faith healer in relation to their mental health problems. TRD was experienced by 34% (95% CI: 32-36%) of patients. The TRD group was characterised by more psychotic symptoms and suicidal behaviour (OR=1.39, 95% CI=1.04-1.84, p=0.02; OR=1.03, 95% CI=1.01-1.05, p=0.005).
Social support (OR=0.55, 95% CI=0.44-0.69, p=1.4 × 10^-7) and parents being first cousins (OR=0.81, 95% CI=0.69-0.96, p=0.01) were associated with lower odds of TRD. In 1,085 patients with CYP enzyme data, poor (OR=1.85, 95% CI=1.11-3.07, p=0.01) and ultra-rapid (OR=3.11, 95% CI=1.59-6.12, p=0.0009) metabolisers for CYP2C19 had increased risk of TRD compared with normal metabolisers. Interpretation: There was excessive use of polypharmacy in the treatment of depression, while psychological therapies were uncommon, highlighting the need for more evidence-based practice. This first large study of MDD from Pakistan uncovered the importance of culture-specific forms of social support in preventing TRD, highlighting opportunities for interventions in low-income settings. Pharmacogenetic markers can be leveraged to predict TRD.
Spann, D. J.; Hall, L. M.; Moussa-Tooks, A.; Sheffield, J. M.
Background: Negative symptoms are core features of schizophrenia that relate strongly to functional impairment, yet interventions targeting these symptoms remain largely ineffective. Emerging theoretical work highlights how environmental factors may shape and maintain negative symptoms. Although racial disparities in schizophrenia diagnosis among Black Americans are well documented and linked to racial stress and psychosis, the impact of racial stress on negative symptoms has not been examined. This study provides an initial test of a novel theory proposing that racial stress (here measured by racial discrimination) influences negative symptom severity through exacerbation of negative cognitions about the self, particularly defeatist performance beliefs (DPB). Study Design: Participants diagnosed with a schizophrenia-spectrum disorder (SSD) (N = 208; 80 Black, 128 White) completed the Positive and Negative Syndrome Scale (PANSS), the Defeatist Beliefs Scale, and self-report measures of subjective racial and ethnic discrimination (Racial and Ethnic Minority Scale and General Ethnic Discrimination Scale). Relationships among variables were tested using linear regression and mediation analysis. Study Results: Black participants exhibited significantly greater total and experiential negative symptoms than White participants, with no group difference in DPB. Racial discrimination explained 46% of the relationship between race and negative symptoms. Among Black participants, higher DPB were associated with greater negative symptom severity. Discrimination was positively related to both DPB and negative symptoms. DPB partially mediated the relationship between discrimination and negative symptoms. Conclusions: Findings suggest that racial stress contributes to negative symptom severity via defeatist beliefs among Black individuals, highlighting potential targets for culturally informed interventions.
Xu, M.; Philips, R.; Singavarapu, A.; Zheng, M.; Martin, D.; Nikolin, S.; Mutz, J.; Becker, A.; Firenze, R.; Tsai, L.-H.
Background: Gamma oscillation dysfunction has been implicated in neuropsychiatric disorders. Restoring gamma oscillations via brain stimulation represents an emerging therapeutic approach. However, the strength of its clinical effects and treatment moderators remain unclear. Method: We conducted a systematic review and meta-analysis to examine the clinical effects of gamma neuromodulation in neuropsychiatric disorders. A literature search for controlled trials using gamma stimulation was performed across five databases up to April 2025. Effect sizes were calculated using Hedges' g. Separate analyses using the random-effects model examined the clinical effects in schizophrenia (SZ), major depressive disorder (MDD), bipolar disorder, and autism spectrum disorder. For SZ and MDD, subgroup analyses evaluated the effects of stimulation modality, stimulation frequency, treatment duration, and pulses per session. Results: Fifty-six studies met the inclusion criteria (N_SZ = 943, N_MDD = 916, N_BD = 175, N_ASD = 232). In SZ, gamma stimulation was associated with improvements in positive (k = 10, g = -0.60, p < 0.001), negative (k = 12, g = -0.37, p = 0.03), depressive (k = 8, g = -0.39, p < 0.001), and anxiety symptoms (k = 5, g = -0.59, p < 0.001), and in overall cognitive function (k = 7, g = 0.55, p < 0.001). Stimulation frequency and treatment duration moderated therapeutic effects. In MDD, reductions in depressive symptoms were observed (k = 23, g = -0.34, p = 0.007). Conclusion: Gamma neuromodulation showed moderate therapeutic benefits in SZ and MDD. Substantial heterogeneity likely reflects protocol differences, highlighting the need for well-powered future trials.
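Hedges' g, the effect size used above, is the standardised mean difference (Cohen's d with a pooled SD) scaled by a small-sample correction factor J ≈ 1 - 3/(4N - 9). A minimal sketch, illustrative rather than the meta-analytic software actually used:

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Hedges' g: Cohen's d with a pooled SD, scaled by the
    small-sample bias correction J = 1 - 3/(4N - 9)."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                          / (n_t + n_c - 2))
    d = (mean_t - mean_c) / pooled_sd
    j = 1.0 - 3.0 / (4.0 * (n_t + n_c) - 9.0)
    return d * j
```

With equal SDs and group sizes the pooled SD reduces to the common SD, so a two-point symptom improvement against SD 4 gives d = -0.5 before correction; the correction matters most in the small trials typical of this literature.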
Jacobsen, A. M.; Quednow, B. B.; Bavato, F.
Importance: Blood neurofilament light chain (NfL) and glial fibrillary acidic protein (GFAP) are entering clinical use in neurology as markers of neuroaxonal and astrocytic injury, but their utility in psychiatry is unclear. Objective: To determine whether psychiatric diagnoses are associated with altered plasma NfL and GFAP levels. Design, Setting, and Participants: This population-based study examined plasma NfL and GFAP among 47,495 participants from the UK Biobank (54.0% female; 93.5% White; mean [SD] age 56.8 [8.2] years) who provided blood samples and sociodemographic and clinical data between 2006 and 2010. Normative modeling was applied to assess associations between 7 lifetime psychiatric diagnostic categories and deviations from expected NfL and GFAP levels, while accounting for neurological diagnoses, cardiometabolic burden, and substance use. Data were analyzed between July 2025 and March 2026. Main Outcomes and Measures: Deviations in plasma NfL and GFAP levels from normative predictions. Results: Relative to the reference population, plasma NfL levels were higher among individuals with bipolar disorder (d=0.20; 95% CI, 0.03-0.37; p=0.03), recurrent depressive disorder (d=0.23; 95% CI, 0.07-0.38; p=0.009), and depressive episodes (d=0.06; 95% CI, 0.02-0.10; p=0.01), lower among individuals with anxiety disorders (d=-0.07; 95% CI, -0.12 to -0.02; p=0.008), and did not differ in schizophrenia spectrum, stress-related, or other psychiatric disorders. Plasma GFAP levels were not elevated in any psychiatric disorder. Variability in NfL levels was greater among individuals with schizophrenia spectrum disorders (variance ratio [VR]=1.30; p=0.005), depressive episodes (VR=1.06; p=0.006), and anxiety disorders (VR=1.08; p=0.005). Variability in GFAP levels was increased only in anxiety disorders (VR=1.08; p=0.01).
Plasma NfL levels exceeding percentile-based normative thresholds were more common among individuals with schizophrenia spectrum disorders, bipolar disorder, recurrent depressive disorder, and depressive episodes. Neurological diagnoses, cardiometabolic burden, and substance use were associated with plasma NfL and GFAP levels. Conclusions and Relevance: This study provides population-level evidence of plasma NfL elevation in bipolar and depressive disorders and increased variability in schizophrenia spectrum, bipolar and depressive disorders, supporting its potential as a biomarker in psychiatry and informing its ongoing neurological applications. Plasma GFAP levels, in contrast, were largely unaltered across psychiatric disorders. Key Points. Question: Are plasma neurofilament light chain (NfL) and glial fibrillary acidic protein (GFAP) levels altered in psychiatric disorders? Findings: In this cohort study including 47,495 individuals, normative modeling revealed that plasma NfL levels were elevated in bipolar and depressive disorders, whereas plasma GFAP levels were not elevated in any psychiatric disorder. Plasma NfL levels also showed higher variability in schizophrenia spectrum, bipolar, and depressive disorders. Meaning: Plasma NfL shows distinct alterations in schizophrenia spectrum and affective disorders, supporting its further investigation as a biomarker in clinical psychiatry and highlighting the need to consider psychiatric comorbidity in neurological applications.
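The variance ratios (VR) reported above compare the spread of normative-model deviations in a diagnostic group with that of the reference population; VR > 1 indicates a more heterogeneous group even when its mean deviation is unremarkable. An illustrative sketch of the statistic (the study's inference procedure is not reproduced here):

```python
def variance_ratio(group_devs, ref_devs):
    """Variance ratio (VR) of normative-model deviations: VR > 1 means
    the diagnostic group is more heterogeneous than the reference."""
    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return sample_var(group_devs) / sample_var(ref_devs)

# Toy deviations: same mean (zero) but twice the spread in the group.
variance_ratio([-2.0, 0.0, 2.0], [-1.0, 0.0, 1.0])  # 4.0
```

This is why VR complements the mean-shift effect sizes (d): a disorder can leave the average NfL level unchanged while still producing an excess of extreme values.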
Quide, Y.; Lim, T. E.; Gustin, S. M.
Background: Early-life adversity (ELA) is a risk factor for enduring pain in youth and is associated with alterations in brain morphology and function. However, it remains unclear whether ELA-related neurobiological changes contribute to the development of enduring pain in early adolescence. Methods: Using data from the Adolescent Brain Cognitive Development (ABCD) Study, we examined multimodal magnetic resonance imaging (MRI) markers in children assessed at baseline (ages 9-11 years) and at 2-year follow-up (ages 11-13 years). ELA exposure was defined at baseline to maximise temporal separation between early adversity and later enduring pain. Participants with enduring pain at follow-up (n = 322) were compared to matched pain-free controls (n = 644). Structural MRI, diffusion MRI (fractional anisotropy, mean diffusivity), and resting-state functional connectivity data were analysed. Linear models tested main effects of enduring pain, ELA, and their interaction on brain metrics, controlling for relevant covariates. Results: ELA exposure was associated with smaller caudate and nucleus accumbens volumes, and reduced surface area of the left rostral middle frontal gyrus. No significant effects of enduring pain or ELA-by-enduring-pain interaction were observed across grey matter, white matter, or functional connectivity measures. Conclusions: ELA was associated with alterations in fronto-striatal regions in late childhood, but these changes were not linked to enduring pain in early adolescence. These findings suggest that ELA-related neurobiological alterations may represent early markers of vulnerability rather than concurrent correlates of enduring pain. Longitudinal follow-up is needed to determine whether these alterations contribute to later chronic pain risk.
Trivedi, S.; Simons, N. W.; Tyagi, A.; Ramaswamy, A.; Nadkarni, G. N.; Charney, A. W.
Background: Large language models (LLMs) are increasingly used in mental health contexts, yet their detection of suicidal ideation is inconsistent, raising patient safety concerns. Objective: To evaluate whether an independent safety monitoring system improves detection of suicide risk compared with native LLM safeguards. Methods: We conducted a cross-sectional evaluation using 224 paired suicide-related clinical vignettes presented in a single-turn format under two conditions (with and without structured clinical information). Native LLM safeguard responses were compared with an independent supervisory safety architecture with asynchronous monitoring. The primary outcome was detection of suicide risk requiring intervention. Results: The supervisory system detected suicide risk in 205 of 224 evaluations (91.5%) versus 41 of 224 (18.3%) for native LLM safeguards. Among 168 discordant evaluations, 166 favored the supervisory system and 2 favored the LLM (matched odds ratio ≈83.0). Both systems detected risk in 39 evaluations, and neither in 17. Detection was highest in scenarios with explicit suicidal ideation and lower in more ambiguous presentations. Conclusions: Native LLM safeguards frequently failed to detect suicide risk in this structured evaluation. An independent monitoring approach substantially improved detection, supporting the role of external safety systems in high-risk mental health applications of LLMs.
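The matched odds ratio of roughly 83 quoted above follows the standard paired-design logic: concordant evaluations (both systems detect, or neither does) carry no information about which system is better, so the OR is just the ratio of the two discordant counts. A minimal sketch:

```python
def matched_odds_ratio(n_favoring_a, n_favoring_b):
    """Conditional odds ratio for a paired (matched) binary comparison:
    concordant pairs drop out, leaving the ratio of discordant counts."""
    return n_favoring_a / n_favoring_b

# 166 discordant evaluations favoured the supervisory system, 2 the LLM:
matched_odds_ratio(166, 2)  # 83.0
```

The same discordant counts feed McNemar's test, which is the usual significance test accompanying this estimate in paired designs.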
Imtiaz, Z.; Kopell, B. H.; Olson, S.; Saez, I.; Song, H. N.; Mayberg, H. S.; Choi, K. S.; Waters, A. C.; Figee, M.; Smith, A. H.
Background: Deep brain stimulation (DBS) of the anterior limb of the internal capsule (ALIC) is an effective treatment for severe obsessive-compulsive disorder (OCD). Identifying brain readouts of positive response may guide further DBS optimization. Methods: We measured local field potential (LFP) changes from bilateral DBS leads in 10 OCD patients implanted at a uniform tractographic network target derived from prior DBS responders. We consistently stimulated dorsal lead contacts in the ALIC white matter, while recording LFP from the ventral lead contacts in grey matter of the anterior globus pallidus externus (GPe), a key node in the basal ganglia non-motor indirect pathway. Results: After six months of DBS, OCD symptoms decreased on average by 40% across subjects, along with a significant decrease in alpha activity across both hemispheres. Only one patient did not show improvement in symptoms, and this was also the only patient who never exhibited an alpha decrease in either hemisphere. Conclusions: Our findings suggest that therapeutic ALIC DBS coincides with a stable decrease in limbic-cognitive GPe alpha power, which should be further investigated as a potential biomarker of sustained response.
Meagher, N.; Hettiarachchi, D.; Hawkins, M. R.; Tavlian, S.; Spirkoska, V.; McVernon, J.; Carville, K. S.; Price, D. J.; Villanueva Cabezas, J. P.; Marcato, A. J.
Background: The World Health Organization has developed several global template protocols for epidemiological investigations, including for household transmission investigations (HHTIs). These investigations facilitate rapid characterisation of novel or re-emerging respiratory pathogens and support evidence-based public health actions. Beyond technical readiness, community buy-in is central to the feasibility and acceptability of HHTIs. Research is needed to determine the perceived legitimacy among the community to inform local protocol adaptation and development of implementation plans that consider community attitudes and needs. Methods: In 2025, we conducted a convenience survey of community members living in Victoria, Australia to explore: their understanding of emerging respiratory diseases; their willingness to take part in public health surveillance activities such as HHTIs; the acceptability of clinical and epidemiological data collection and respiratory/blood sample collection as main components of HHTIs; and participant comfort towards including their companion animals in HHTIs. Results: We received 282 survey responses, of which 235 were included in the analysis dataset. Compared to the general Victorian population, our sample included a higher proportion of participants who reported being female, tertiary-educated, of Aboriginal and/or Torres Strait Islander heritage, born in Australia and speaking only English at home. Participants indicated overall high levels of comfort and acceptability towards participation in HHTIs, particularly in relation to clinical and epidemiological data collection, with lesser but still high levels of comfort with providing multiple respiratory specimens in a 14-day period. Participants were least comfortable with other specimens such as urine and blood. Involving companion animals in HHTIs was as acceptable as the human-focused components.
Conclusions: Despite our survey population being non-representative of the general Victorian population, our findings provide valuable descriptive insights into the acceptability of HHTIs in Victoria, Australia, from which to benchmark future local and international surveys and community engagement activities.
Panapruksachat, S.; Troupin, C.; Souksavanh, M.; Keeratipusana, C.; Vongsouvath, M.; Vongphachanh, S.; Vongsouvath, M.; Phommasone, K.; Somlor, S.; Robinson, M. T.; Chookajorn, T.; Kochakarn, T.; Day, N. P.; Mayxay, M.; Letizia, A. G.; Dubot-Peres, A.; Ashley, E. A.; Buchy, P.; Xangsayarath, P.; Batty, E. M.
We used 2492 whole genome sequences from Laos to investigate the molecular epidemiology of SARS-CoV-2 from 2021 through 2024, covering the major waves of COVID-19 in Laos, including periods of travel restrictions and the period after travel across international borders was relaxed. We identify successive waves of COVID-19 caused by shifts in the dominant lineage, beginning with the Alpha variant in April 2021 and continuing through the Delta and Omicron variants. We quantify a shift from a small number of viral introductions responsible for widespread transmission in early waves to a larger number of introductions for each variant after travel restrictions were lifted, and identify potential routes of introduction into the country. Our study underscores the importance of genomic surveillance in public health responses for characterizing viral transmission dynamics during pandemics.
Mullen, C.; Barr, R. D.; Strumpf, E.; El-Zein, M.; Franco, E. L.; Malagon, T.
Background: Timely cancer diagnosis in children and adolescents is critical to improving outcomes, yet substantial variation in diagnostic intervals persists across cancer types and care settings. We aimed to quantify time to diagnosis and assess variations by patient, demographic, and system-level factors. Methods: We conducted a retrospective population-based study of children and adolescents aged 0-19 years diagnosed with one of 12 common cancers between 2010 and 2022 in Quebec, Canada. The diagnostic interval was defined as the time from first cancer-related healthcare encounter to diagnosis. We calculated medians and interquartile ranges (IQR) overall and by cancer type and used multivariable quantile regression to identify factors associated with time to diagnosis at the 25th, 50th, and 75th percentiles. Results: Among 2,927 individuals with cancer, diagnostic intervals varied by cancer type and age. Median intervals were longest for carcinomas (100 days; IQR 33-192) and shortest for leukemias (8 days; IQR 3-44). Compared with children living in Montreal, living in regional areas and other large urban centres was associated with longer 50th and 75th percentiles of time to diagnosis for hepatic and central nervous system (CNS) tumours. Diagnostic intervals were shorter in the post-pandemic period (2020-2022) across several cancer sites, with CNS tumours showing reductions across all quantiles. Interpretation: Diagnostic timeliness differed by cancer type, age, and rurality, but not by sex or by material or social deprivation. The shorter diagnostic intervals observed in the post-pandemic period suggest that pandemic-related changes in care pathways may have expedited diagnosis for some cancers.
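Quantile regression, used above to model the 25th, 50th, and 75th percentiles of the diagnostic interval, minimises the pinball (check) loss rather than squared error, which is what lets covariate effects differ across the distribution. An illustrative sketch of that loss, with tau as the target quantile (this is the objective, not the fitting routine):

```python
def pinball_loss(y_true, y_pred, tau):
    """Mean pinball (check) loss minimised by quantile regression at
    quantile tau; tau = 0.5 recovers median (LAD) regression."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        diff = y - p
        # Under-prediction is weighted tau, over-prediction (1 - tau).
        total += tau * diff if diff >= 0 else (tau - 1.0) * diff
    return total / len(y_true)

# At tau = 0.75, under-predicting a 100-day interval by 40 days costs
# three times as much as over-predicting it by the same amount:
pinball_loss([100], [60], 0.75)   # 0.75 * 40 = 30.0
pinball_loss([100], [140], 0.75)  # 0.25 * 40 = 10.0
```

The asymmetry is what pushes the fitted line up toward the 75th percentile instead of the mean.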
Baldry, G.; Harb, A.-K.; Findlater, L.; Ogaz, D.; Migchelsen, S. J.; Fifer, H.; Saunders, J.; Mohammed, H.; Sinka, K.
Objectives: We determined the frequency of sexually transmitted infection (STI) testing among people accessing sexual health services (SHS) in England. Methods: We assessed STI testing frequency in face-to-face and online SHS in England using data from the GUMCAD STI surveillance system. We quantified different combinations of tests (e.g. single chlamydia test or full STI screen), the number of tests completed in 2024, and test positivity by sociodemographic and behavioural characteristics, as well as by clinical setting and outcomes. Results: Overall, there were 2,222,028 attendances at SHS in England in 2024 that involved tests for chlamydia, gonorrhoea, syphilis and/or HIV. Most of these attendances involved tests for all four of these STIs. Most people accessing SHS in England tested once (80.1%), and a small minority (1.9%) tested at least quarterly (4+ times). Some groups had a comparably larger proportion of quarterly testers; these included gay, bisexual, and other men who have sex with men (GBMSM) (6.7%), London residents (3.6%), online testers (2.5%), people using HIV-PrEP (13%), and people with 5+ partners in the previous 3 months (10.6%). Only 10.5% of GBMSM reporting higher-risk sexual behaviours tested quarterly despite recommendations for quarterly testing in this group. Conclusions: The majority of those who tested for STIs in England in 2024 tested only once. The minority who tested at least quarterly included a higher proportion of GBMSM, people using HIV-PrEP, London residents and people reporting higher-risk behaviours. Quarterly testing often appears to be aligned with current testing recommendations in England; however, we also observed that only a low proportion of behaviourally high-risk GBMSM and HIV-PrEP users are meeting these recommendations. It is important to acknowledge groups with lower or higher testing frequency when developing interventions and updating guidelines related to STI testing.
WHAT IS ALREADY KNOWN ON THIS TOPIC: The effectiveness of asymptomatic testing for chlamydia and gonorrhoea in gay, bisexual and other men who have sex with men (GBMSM), and the potential impact of the consequent increased antibiotic use on rising antimicrobial resistance and individual harm, has recently been questioned. Testing and treatment remain a key pillar of STI prevention and management; despite this, there is limited evidence of STI testing frequency within sexual health services (SHS) at a national level. WHAT THIS STUDY ADDS: This analysis shows that the majority of people attending SHS in England in 2024 tested once, and only a small proportion of behaviourally high-risk people tested frequently. HOW THIS STUDY MIGHT AFFECT RESEARCH, PRACTICE OR POLICY: Awareness of groups that are behaviourally high risk but testing infrequently is important to guide interventions and messaging regarding STI testing. The low levels of frequent testing, even among those who would be recommended quarterly testing under UK guidelines, provide important context for the wider discussion around asymptomatic STI screening.
Huang, X.; Hsieh, C.; Nguyen, Q.; Renteria, M. E.; Gharahkhani, P.
Wearable-derived physiological features have been associated with disease risk, but most current studies focus on single conditions, limiting understanding of cross-disease patterns. This study adopts a trans-diagnostic approach to examine whether wearable data capture shared and condition-specific physiological signatures across multiple chronic conditions spanning physical and mental health, and then evaluates the utility of these features for disease classification. A total of 9,301 patients with at least 21 days of consecutive Fitbit data from the All of Us Controlled Tier Dataset version 8 were analyzed. Disease subcohorts included cardiovascular disease (CVD), diabetes, obstructive sleep apnea (OSA), major depressive disorder (MDD), anxiety, bipolar disorder, and attention-deficit/hyperactivity disorder (ADHD), chosen based on prevalence and relevance. Logistic regression and XGBoost models were fitted for each disease subcohort versus the control cohort. We found that, compared to using just baseline demographic and lifestyle features, incorporating wearable-derived features improved classification performance in all subcohorts for both models, except for ADHD, where improvement was mainly observed for ROC-AUC in the logistic regression model, likely due to the smaller sample size of the ADHD subcohort. The largest performance gains were observed in MDD (increase in ROC-AUC of 0.077 for logistic regression, 0.071 for XGBoost; p < 0.001) and anxiety (increase in ROC-AUC of 0.077 for logistic regression, 0.108 for XGBoost; p < 0.001). This study provides one of the first comprehensive transdiagnostic evaluations of wearable-derived features for disease classification, highlighting their potential to enhance risk stratification in the real-world setting as a practical complement to clinical assessments and providing a foundation to explore more fine-grained wearable data. 
Author summary: Wearable devices such as fitness trackers and smartwatches are becoming increasingly popular and affordable, providing continuous measurements of heart rate, physical activity, and sleep. Alongside the growing digitization of health records, this creates new opportunities for large-scale, real-world health studies. In this study, we analyzed wearable-derived physiological patterns across a range of chronic conditions spanning both physical and mental health to better understand how these signals relate to disease risk. We found that incorporating wearable-derived heart rate, activity, and sleep features improved disease risk classification across several conditions, with particularly strong gains for major depressive disorder and anxiety. By examining how individual features contributed to model predictions, we also identified meaningful associations between physiological signals and disease risk. For example, both duration and day-to-day variation of deep and rapid eye movement (REM) sleep were associated with increased risk in certain conditions. Our study supports the development of real-time, automated tools to assess disease risk alongside clinical care.
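The study's core comparison — classification with baseline features alone versus baseline plus wearable-derived features, judged by ROC-AUC — can be illustrated on synthetic data. This is a hedged sketch, not the All of Us pipeline: the feature names, effect sizes, and use of logistic regression only (rather than logistic regression plus XGBoost) are assumptions for illustration.

```python
# Illustrative sketch (synthetic data): compare ROC-AUC of a classifier using
# a demographic feature alone vs demographics plus wearable-derived features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
age = rng.normal(50, 10, n)            # baseline demographic feature
resting_hr = rng.normal(70, 8, n)      # wearable-derived feature (assumed)
sleep_var = rng.normal(1.0, 0.3, n)    # wearable-derived feature (assumed)

# Synthetic outcome partly driven by the wearable signals (assumed effects).
logit = -0.5 + 0.02 * (age - 50) + 0.08 * (resting_hr - 70) + 1.5 * (sleep_var - 1.0)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_base = age.reshape(-1, 1)
X_full = np.column_stack([age, resting_hr, sleep_var])
Xb_tr, Xb_te, Xf_tr, Xf_te, y_tr, y_te = train_test_split(
    X_base, X_full, y, random_state=0)

auc_base = roc_auc_score(y_te, LogisticRegression().fit(Xb_tr, y_tr).predict_proba(Xb_te)[:, 1])
auc_full = roc_auc_score(y_te, LogisticRegression().fit(Xf_tr, y_tr).predict_proba(Xf_te)[:, 1])
print(f"baseline AUC {auc_base:.3f}, +wearables AUC {auc_full:.3f}")
```

The gain in AUC from adding wearable features here mirrors, in miniature, the improvements the authors report for MDD and anxiety.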
Ahmed, W.; Gebrewold, M.; Verhagen, R.; Koh, M.; Gazeley, J.; Levy, A.; Simpson, S.; Nolan, M.
Wastewater surveillance (WWS) is established as a vital tool for monitoring polio and SARS-CoV-2, with potential to improve surveillance for many other infectious diseases. This study evaluated the feasibility of detecting measles virus (MeV) RNA in wastewater as part of a national WWS preparedness trial in Brisbane, Australia, from March to June 2025. Composite and passive sampling methods were employed in parallel at three wastewater treatment plants serving populations between 230,000 and 584,000. Nucleic acids were extracted and analyzed using RT-qPCR targeting the MeV N and M genes to distinguish wild-type and vaccine strains. MeV RNA was detected in both 24-hour composite and passive samples on May 26-27, 2025, in the largest catchment of 584,000, which also included an international airport. No measles cases were reported in this city or region within 4 weeks of the WWS detections. The detections were confirmed as a vaccine-derived measles virus (MeVV) strain via a specific RT-qPCR assay. Extraction recoveries varied (11.5% to 70.5%), with passive sampling showing higher efficiency. This is the first report of the use of passive samples for detection of MeV. These findings are consistent with other studies reporting WWS results of both MeVV genotype A and wild-type genotypes B and/or D. The study demonstrates the potential for sensitive MeV WWS with rapid differentiation of MeVV from wild-type MeV shedding, including in airport transport hubs and with different sample types. Use of WWS could strengthen measles surveillance by enabling rapid detection of MeV RNA and supporting outbreak preparedness and response. This requires optimised methods that are specific to, or can differentiate, wild-type MeV from MeVV. Furthermore, the successful detection of MeV using passive sampling in this study highlights its potential for deployment in diverse global contexts, which may include non-sewered settings.
Ni Chan Chin (Chengqin Ni), M.; Berrio, J. A.
Background: Accelerometer-derived behavioral phenotypes capture multidimensional aspects of human behavior extending well beyond physical activity, encompassing light exposure, step counts, physical activity patterns, sleep, and circadian rhythms. Whether these five domains constitute a unified behavioral architecture underlying cancer risk, and whether circadian organization and light exposure confer incremental predictive value beyond movement volume alone, remain to be comprehensively established. Methods: We conducted an accelerometer-wide association study (AWAS) encompassing the complete accelerometer-derived behavioral exposome across five behavioral domains in UK Biobank participants with valid wrist accelerometry data. Incident solid cancers were designated as the primary endpoint, with prespecified site-specific solid cancers and hematological malignancy as secondary outcomes. Cox proportional hazards models with age as the timescale were used. The minimal covariate set served as the primary reporting tier, followed by sensitivity analyses additionally adjusting for adiposity/metabolic factors, independent activity patterns, shift work history, and accelerometry measurement quality. Nominal statistical significance was defined as two-sided P < 0.05. Results: Among 89,080 participants, 6,598 incident solid cancer events were observed over a median follow-up of 8.39 years. In the minimally adjusted model, the pan-solid-tumor association atlas was dominated by signals from activity volume, inactivity fragmentation, and circadian rhythm. Higher overall acceleration (HR per SD: 0.91, 95% CI: 0.89-0.94) and higher daily step counts (HR: 0.93, 95% CI: 0.90-0.95) were independently associated with reduced solid cancer risk, while inactivity fragmentation metrics were consistently linked to higher risk. 
Notably, circadian rhythms, most prominently cosinor mesor (Midline Estimating Statistic of Rhythm under the cosinor model), emerged as leading inverse risk signals, underscoring the independent contribution of circadian behavioral architecture. Site-specific analyses revealed pronounced heterogeneity across tumor sites. Lung cancer exhibited a robust inverse activity-risk gradient, while breast cancer showed reproducible associations with MVPA. Most strikingly, nocturnal light exposure demonstrated a tumor-site-specific association confined to pancreatic cancer, a signal absent across all other sites examined. Associations for uterine cancer were predominantly inactivity-related and substantially attenuated following adjustment for adiposity and metabolic factors. Conclusions: Across five accelerometer-derived behavioral domains, solid cancers as a whole were most consistently associated with a high-movement, low-fragmentation, and circadian-coherent behavioral profile. While site-specific heterogeneity exists, the broad cancer risk landscape is dominated by movement volume, inactivity fragmentation, and circadian rhythmicity. Light exposure, although more localized in its contribution, demonstrates a potentially novel and specific association with pancreatic cancer risk. These findings support a five-domain behavioral exposome framework for cancer epidemiology and, importantly, position circadian rhythm integrity and nocturnal light exposure as critically understudied dimensions warranting dedicated mechanistic investigation.
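The cosinor mesor highlighted above has a simple computational form: a 24-hour cosine model fitted by least squares, whose intercept is the MESOR (rhythm-adjusted mean). A minimal sketch on synthetic data, not the UK Biobank pipeline:

```python
# Cosinor fit: y(t) = MESOR + A*cos(2*pi*t/24 - phi) + noise.
# Expanding the cosine gives a linear model in cos(wt) and sin(wt),
# so MESOR, amplitude, and acrophase come from ordinary least squares.
import numpy as np

t = np.arange(0, 24 * 7, 1 / 6)          # one week of 10-minute epochs (hours)
true_mesor, true_amp = 30.0, 12.0        # assumed activity-level parameters
true_phase = 2 * np.pi * 14 / 24         # peak at 14:00
rng = np.random.default_rng(1)
y = true_mesor + true_amp * np.cos(2 * np.pi * t / 24 - true_phase) \
    + rng.normal(0, 3, t.size)

# Design matrix: intercept, cos, sin; solve by least squares.
X = np.column_stack([np.ones_like(t),
                     np.cos(2 * np.pi * t / 24),
                     np.sin(2 * np.pi * t / 24)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

mesor = beta[0]                                   # rhythm-adjusted mean
amplitude = np.hypot(beta[1], beta[2])            # half peak-to-trough
acrophase_h = (np.arctan2(beta[2], beta[1]) % (2 * np.pi)) * 24 / (2 * np.pi)
print(f"MESOR {mesor:.1f}, amplitude {amplitude:.1f}, acrophase {acrophase_h:.1f} h")
```

The fitted MESOR, amplitude, and acrophase recover the simulated values; in practice these per-person parameters become the circadian features entering the Cox models.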
Andrei, F.; Tizzoni, M.; Veltri, G. A.
Background: Dengue is rapidly emerging in parts of Europe. How households value vector control attributes, and whether inferences depend on decision models or message framing, is unclear. Methods: We conducted a split-ballot online experiment among adults in Italy and France, as well as a hotspot subsample from Marche, Italy. National samples included 1,505 respondents in Italy and 1,501 in France; 183 respondents were recruited in Marche. Participants were randomised to a discrete choice experiment (random utility maximisation) or a regret-based choice experiment (random regret minimisation) and to one of three pre-task messages (control, loss aversion, community values). Each respondent completed 12 choice tasks comparing two dengue control programmes and an opt-out. We estimated mixed logit and mixed random-regret models with random parameters and treatment effects. Results: Across frameworks, nearby cases and high mosquito prevalence were the dominant drivers of programme uptake, whereas cost and operational burden were secondary. In pooled analyses, loss-aversion messaging increased the weight on high mosquito prevalence in both models (from 0.483 to 0.547 in the utility model; from 0.478 to 0.557 in the regret model). Cost effects were small nationally but larger in the hotspot subsample. Conclusions: Risk salience dominates preferences for dengue vector control in these European settings. Random utility and random regret models yield consistent rankings of attributes but differ in behavioural interpretation and some secondary effects; messaging effects were modest and context dependent.
Koyra, A. B.; Mohammed, F.; Eshete, T.
Background: Family-based HIV index case testing identifies family members with unknown HIV status and links them to care. Data are limited in southern Ethiopia. Methods: A facility-based cross-sectional study was conducted among 377 adults on antiretroviral therapy (ART) in Wolaita Zone, Southern Ethiopia, from November 2022 to May 2023. Participants were selected using systematic random sampling. Data were collected via an interviewer-administered semi-structured questionnaire. Multivariable logistic regression identified factors associated with index case family testing. Adjusted odds ratios (AOR) with 95% confidence intervals (CI) were calculated, and statistical significance was declared at p < 0.05. Results: The proportion of index case family testing for HIV was 84.9% (95% CI: 81.2-88.6). In multivariable analysis, urban residence (AOR = 2.8; 95% CI: 1.16-6.75), duration on ART greater than 12 months (AOR = 13.0; 95% CI: 4.6-36.9), disclosure of HIV status to family members (AOR = 5.6; 95% CI: 1.9-16.5), discussion of HIV status with family members (AOR = 6.6; 95% CI: 1.9-23.2), and being counselled by health professionals to bring families for testing (AOR = 6.3; 95% CI: 2.1-19.0) were significantly associated with index case family testing. Conclusion: The prevalence of family-based HIV index case testing in Wolaita Zone was 84.9%, below the national 95% target. Health professionals should strengthen counselling on ART adherence, status disclosure, family discussion, and active referral to improve testing uptake among family members of people living with HIV.
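The adjusted odds ratios reported above come from exponentiating multivariable logistic-regression coefficients. A hedged sketch on synthetic data (the variable names and effect sizes are illustrative, not the study's dataset):

```python
# Sketch (synthetic data): AORs are exp(coefficients) from a multivariable
# logistic regression; two binary covariates stand in for the study's factors.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 5000
urban = rng.integers(0, 2, n)        # illustrative covariate
disclosed = rng.integers(0, 2, n)    # illustrative covariate

# Assumed true effects: log(2.8) for urban residence, log(5.6) for disclosure.
logit = -1.0 + np.log(2.8) * urban + np.log(5.6) * disclosed
tested = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([urban, disclosed])
# Large C makes the fit effectively unpenalized, so coefficients are log-odds.
model = LogisticRegression(C=1e6).fit(X, tested)
aor = np.exp(model.coef_[0])
print(dict(zip(["urban", "disclosed"], aor.round(2))))
```

With a large sample, the recovered AORs approximate the simulated ones; the study additionally adjusts for the other listed covariates in the same model.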
Fitzgerald, O.; Keller, E.; Illingworth, P.; Lieberman, D.; Peate, M.; Kotevski, D.; Paul, R.; Rodino, I.; Parle, A.; Hammarberg, K.; Copp, T.; Chambers, G. M.
Study question: What are the characteristics and treatment outcomes of women who undertook planned egg freezing (PEF) in Australia and New Zealand between 2009 and 2023? Summary answer: There has been an average yearly increase in the uptake of PEF of 35%, with most women undergoing a single PEF procedure in their mid-thirties. With at least ten years of follow-up, a little over one in four women returned, nearly half of them using donor sperm and one-third achieving a live birth. What is known already: PEF, where women freeze their eggs as a strategy to preserve fertility, has increased dramatically in high-income countries in the last decade. Despite the rapid uptake of PEF, there remains limited information to guide women, clinicians and policy makers regarding the characteristics of women undertaking this procedure and treatment outcomes. Study design, size, duration: A retrospective population-based cohort study of all women who undertook PEF in Australia and New Zealand between 2009 and 2023, including their subsequent return to thaw their eggs and treatment outcomes. Where women returned to utilise their eggs, all subsequent embryo transfer procedures were linked, enabling calculation of live birth rates per woman. Participants/materials, setting, methods: 20,209 women who undertook PEF in Australia and New Zealand between 2009 and 2023, including 1,657 women who returned to thaw their eggs. Main results and the role of chance: There has been a marked increase in uptake of PEF, from 55 women in 2009 to 4,919 in 2023. Women who freeze their eggs are typically aged 34-38 years (interquartile range) and nulliparous (98.6%). For women with at least 10 years of follow-up (i.e. who undertook PEF in 2009-13; N=514), 27.9% returned and thawed their frozen eggs (average time to return: 4.9 years). This reduced to 22.1% in those with at least 5 years of follow-up (i.e. who undertook PEF in 2009-2018; N=4,288). Of those who used their frozen eggs, 47% used donor sperm. 
After at least two years of follow-up, 33.9% had a live birth, rising over time to 37.8% for eggs thawed between 2019 and 2021. Limitations, reasons for caution: In the timeframe 2009-2019 we did not have information on whether egg freezing occurred because of a cancer diagnosis, a cohort we wished to exclude from the study. As a result, for this timeframe we weighted observations by the probability that egg freezing occurred due to cancer, with the prediction model developed on the years 2020-2023. Wider implications of the findings: This study provides recent and comprehensive data on PEF to guide prospective patients and clinicians and to inform policy. The exponential growth in PEF in Australia and New Zealand mirrors trends in other high-income countries, suggesting a doubling time of 2-3 years. Study findings highlight the need to set realistic expectations about the likelihood of returning to use frozen eggs and live birth rates. Study funding/competing interest(s): 2020-2025 MRFF Emerging Priorities and Consumer Driven Research initiative: EPCD000014
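The weighting step in the limitations section — building a prediction model on years where cancer status is known and weighting earlier, unlabelled years by predicted probability — can be sketched generically. This is a heavily hedged illustration on synthetic data: the direction of the weighting (toward records predicted not cancer-related), the single predictor, and all variable names are assumptions, not the authors' specification.

```python
# Hedged sketch (synthetic data): train a classifier on labelled years,
# predict the probability a freeze was cancer-related for unlabelled years,
# and weight each unlabelled record toward "not cancer-related".
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 3000
age = rng.uniform(25, 45, n)
# Assumption: younger age raises the (synthetic) chance of a cancer-related freeze.
p_cancer_true = 1 / (1 + np.exp(0.3 * (age - 30)))
is_cancer = rng.random(n) < p_cancer_true
labelled = rng.random(n) < 0.5            # stand-in for the 2020-2023 cohort

clf = LogisticRegression().fit(age[labelled].reshape(-1, 1), is_cancer[labelled])
p_cancer_hat = clf.predict_proba(age[~labelled].reshape(-1, 1))[:, 1]
weights = 1.0 - p_cancer_hat              # up-weight likely non-cancer PEF records
print(f"mean weight for unlabelled years: {weights.mean():.2f}")
```

Downstream statistics for the unlabelled years would then be computed with these weights, discounting records likely to belong to the excluded cancer cohort.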