Kidney360
○ Ovid Technologies (Wolters Kluwer Health)
Preprints posted in the last 7 days, ranked by how well they match Kidney360's content profile, based on 22 papers previously published here. The average preprint has a 0.03% match score for this journal, so anything above that is already an above-average fit.
Ahmadi, A.; Rahaman, M.; Harsh, A.; Yang, J.; Ghanim, B.; Dasgupta, S.; Weinreb, R. N.; Rahman, T.; Houben, A. J. H. M.; Ix, J. H.; Malhotra, R.
Background: Microvascular dysfunction contributes to chronic kidney disease (CKD), but reproducible clinical measures are limited. Laser Doppler flowmetry (LDF) provides a noninvasive assessment of cutaneous microvascular blood flow and may reflect systemic microvascular health. Its relationship with kidney function and histopathology in CKD remains unclear. Methods: We assessed cutaneous microvascular function in 150 participants with CKD (eGFR <90 mL/min/1.73 m²) using a standardized forearm LDF protocol. Baseline perfusion was recorded at ~30°C, followed by local heating to 44°C to induce hyperemia. Percent change in perfusion units (PU) defined microvascular functional reserve. Associations of LDF measures with eGFR and urine protein-to-creatinine ratio (uPCR) were evaluated using multivariable linear regression. K-means clustering identified microvascular phenotypes. In a subset (n=20), associations with glomerulosclerosis (GS) and interstitial fibrosis/tubular atrophy (IFTA) were examined. Results: The mean (SD) age was 64 (14) years, and 46% were female. The mean eGFR was 42 (21) mL/min/1.73 m², and the median uPCR was 0.21 (interquartile range [IQR], 0.11 to 1.20) mg/mg. Higher baseline PU (β = -12; 95% CI, -24 to -1) and reduced percent change in PU (β = 7; 95% CI, 2 to 13) were associated with lower eGFR, independent of covariates. Neither measure was associated with uPCR. Clustering identified four phenotypes with graded differences in perfusion and reserve. In biopsy participants, higher baseline PU and lower percent change were associated with greater GS and IFTA severity. Conclusion: CKD is characterized by elevated resting perfusion and impaired microvascular reserve, which are associated with lower eGFR and histopathologic injury.
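The functional-reserve metric in this abstract is a simple percent change from baseline to heated perfusion. A minimal sketch of that calculation (the function name and example values are illustrative, not from the study protocol):

```python
def microvascular_reserve(baseline_pu: float, heated_pu: float) -> float:
    """Percent change in laser Doppler perfusion units (PU) from
    baseline (~30 degrees C) to local heating (44 degrees C)."""
    if baseline_pu <= 0:
        raise ValueError("baseline perfusion must be positive")
    return 100.0 * (heated_pu - baseline_pu) / baseline_pu

# Hypothetical participants: a brisk hyperemic response (10 -> 45 PU)
# yields a 350% reserve; a blunted response (10 -> 14 PU) yields 40%.
print(microvascular_reserve(10, 45))  # 350.0
print(microvascular_reserve(10, 14))  # 40.0
```

The abstract's finding pairs a high resting PU with a low percent change, i.e., perfusion that is already elevated at baseline has little room to rise with heating.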
Miura, A.; Okabe, M.; Okabayashi, Y.; Sasaki, T.; Haruhara, K.; Tsuboi, N.; Yokoo, T.
Background: Single-nephron glomerular filtration rate (GFR) represents a nephron-level functional index that may reveal key pathophysiological mechanisms driving progression in patients with diabetic nephropathy. However, its clinical relevance remains incompletely understood. This cross-sectional study assessed single-nephron estimated GFR (eGFR) across different chronic kidney disease (CKD) stages in patients with advanced diabetic nephropathy. Methods: Nephron number was estimated as the number of nonglobally sclerotic glomeruli per kidney using computed tomography-derived cortical volume combined with biopsy stereology. Single-nephron eGFR was calculated by dividing eGFR by the nephron number of both kidneys. Patients were stratified according to CKD stage at kidney biopsy. Associations between CKD stages and single-nephron eGFR were evaluated using multivariable linear regression models adjusted for age, sex, urinary protein excretion, and eGFR. Results: The study included 105 patients with biopsy-proven diabetic nephropathy and overt proteinuria (median age 59 years, 83% male, HbA1c 6.6%, 57% had nephrotic range proteinuria). The percentage of globally sclerotic glomeruli, mesangial expansion score, and prevalence of nodular lesions increased significantly with advancing CKD stage. Median nephron number declined from 529,178 to 224,458 per kidney, whereas glomerular volume remained constant. Single-nephron eGFR decreased markedly with CKD stage and remained significantly inversely associated with CKD stage after adjustment for clinicopathologic covariates (P for trend <0.001). Conclusion: In overt diabetic nephropathy, single-nephron eGFR decreased with advancing CKD stage, despite relatively preserved glomerular volume. At this stage of disease, structural alterations specific to diabetic nephropathy may impair effective single-nephron filtration capacity.
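As described in the Methods, single-nephron eGFR divides whole-kidney eGFR by the estimated number of nonsclerotic nephrons in both kidneys. A back-of-envelope sketch (the nL/min unit conversion and the example inputs are illustrative assumptions, not the study's exact computation):

```python
def single_nephron_egfr(egfr_ml_min: float, nephrons_per_kidney: float) -> float:
    """Single-nephron eGFR in nL/min: whole-kidney eGFR divided by the
    estimated number of nonglobally sclerotic nephrons in BOTH kidneys.
    The two-kidney convention follows the abstract; body-surface-area
    handling in the study may differ."""
    total_nephrons = 2 * nephrons_per_kidney
    return egfr_ml_min * 1e6 / total_nephrons  # mL -> nL

# Hypothetical example using the abstract's lower nephron count:
# eGFR 30 mL/min with ~224,458 nephrons per kidney gives ~66.8 nL/min.
print(single_nephron_egfr(30, 224458))
```

The sketch makes the abstract's point concrete: as nephron number falls faster than eGFR, the per-nephron filtration estimate rises or falls depending on which declines first.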
Ren, Y.; Shafi, T.; Segal, M. R.; Li, H.; Pico, A. R.; Shin, M.-G.; Schelling, J. R.; Hulleman, J. D.; He, J.; Li, C.; Choles, H. R.; Brown, J.; Dobre, M. A.; Mehta, R.; Deo, R.; Srivastava, A.; Taliercio, J.; Sozio, S. M.; Jaar, B.; Estrella, M. M.; Chen, W.; Chertow, G. M.; Parekh, R.; Ganz, P.; Dubin, R.; CRIC Study Investigators,
Background: Patients with kidney failure undergoing maintenance hemodialysis suffer high rates of major adverse cardiovascular events (MACE) that are not accurately predicted by traditional cardiovascular risk models. There is an urgent need to identify novel, modifiable cardiovascular risk factors for these patients. Methods: We analyzed associations of 6,287 circulating proteins with MACE among 1,048 participants undergoing hemodialysis in the Chronic Renal Insufficiency Cohort (CRIC; 14-year follow-up), with validation in the Predictors of Arrhythmic and Cardiovascular Risk in End-Stage Renal Disease study (PACE; 7-year follow-up). In both cohorts, proteins were measured shortly after dialysis initiation and one year later. We compared protein-based risk models derived by elastic net regression to the Pooled Cohort Equations (PCE) optimized for these cohorts (Refit PCE), and to an Expanded Refit PCE that included troponin T and N-terminal pro-B-type natriuretic peptide. Results: In CRIC, 149 proteins were associated with MACE at a false discovery rate <0.05. Among 22 proteins significant at Bonferroni p<8x10^-6, those that validated in PACE included Sushi, von Willebrand factor type A, EGF and pentraxin domain-containing protein 1 (SVEP1), complement component C7, R-spondin 4, tenascin, fibulin-3, and fibulin-5. Complement pathways were prominent in network analyses. SVEP1 was the most statistically significant marker, with a CRIC HR per log2 increment of 1.8 (p=2.1x10^-12) and an HR per annual doubling of 1.6 (p=6.8x10^-6). For 2-year MACE, the AUC (95% CI) for SVEP1 alone was 0.72 (0.59, 0.84) in CRIC and 0.73 (0.63, 0.81) in PACE. SVEP1 surpassed the Expanded Refit PCE in CRIC (0.61 (0.48, 0.73); p=0.038). In the pooled CRIC + PACE cohort, the SVEP1 AUC (95% CI) of 0.79 (0.70, 0.88) surpassed the Refit PCE (0.61 (0.51, 0.72); p=0.004).
Conclusions: SVEP1, a 390 kDa protein unlikely to be renally cleared, surpassed over 6,000 other proteins and by itself outperformed traditional clinical risk models in predicting MACE in two populations of patients undergoing maintenance hemodialysis. Future studies should investigate the mechanisms behind these findings.
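A single-marker AUC like those reported for SVEP1 is equivalent to the Mann-Whitney probability that a randomly chosen case scores above a randomly chosen control. A minimal rank-based sketch (illustrative only; the study's AUCs are for 2-year MACE and its risk models used elastic net regression):

```python
def auc_mann_whitney(case_scores, control_scores):
    """AUC for a single continuous marker via the Mann-Whitney
    probability P(case score > control score), counting ties as 1/2.
    Equivalent to the trapezoidal ROC AUC. O(n*m) pairwise sketch."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Hypothetical marker values: cases [3, 5] vs controls [1, 4]
# -> 3 of 4 pairs concordant -> AUC 0.75.
print(auc_mann_whitney([3, 5], [1, 4]))  # 0.75
```

A marker with no discrimination gives 0.5 by this construction, which is why 0.72-0.79 for a single protein is notable relative to the refit clinical equations.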
Melville, S.; MacKinnon, M.; Michaud, J.
Background: Life-sustaining hemodialysis (HD) is onerous for patients, especially those with multiple comorbidities and advanced age. A standard HD prescription is 720 minutes per week. Alternative HD regimens have been proposed in an attempt to maintain quality of life (QOL). Studies are needed to investigate the efficacy and safety of less frequent HD prescriptions in this population. This is an institution-wide observational study in New Brunswick, Canada, comparing HD prescriptions and their impact on QOL and mortality. Objective: To assess the current HD prescribing practices at a provincial healthcare institution in relation to patient QOL. Design: Prospective observational study. Setting: Single-centre hospital and satellite hemodialysis units. Patients: Voluntarily consented patients undergoing in-centre hemodialysis treatment. Measurements: Observational clinical data were collected for each study participant from their hospital and dialysis electronic medical records. The KDQOL-36™ questionnaire was used to assess patient-reported quality of life at the time of consent. Methods: Adults undergoing in-centre or satellite-site HD for at least 3 months were eligible to participate. Consenting participants were grouped by HD prescription: 720 minutes or more per week, or less than 720 minutes per week. All participants completed the KDQOL-36™ questionnaire to estimate QOL, and groups were compared using the Mann-Whitney U test. Emergency department visits, hospitalizations, and mortality were analyzed using negative binomial or logistic regression. Results: We enrolled 140 participants; 41 were undergoing less than 720 minutes per week of HD and 99 were undergoing 720 minutes or more of HD per week. Patients undergoing less than 720 minutes per week of HD were older [median (IQR): 76 (72-81) vs. 64 (55-75) years; p < 0.001], had higher median (IQR) QOL scores on the Symptoms/Problems List scale of the KDQOL-36™ questionnaire [79.2 (70.8-88.5) vs. 70.8 (62.5-81.3); p = 0.0022], and were less likely to present to the emergency department (incidence rate ratio 0.52; 95% confidence interval [CI], 0.33-0.81). Mortality was similar between groups, even when adjusted for age and comorbidity score (odds ratio 1.62; 95% CI, 0.59-4.49). Limitations: Participant enrollment was limited by the single-centre nature of this study. As this was an observational study, we did not account for how long patients had been prescribed less than 720 minutes of hemodialysis. We did not include a frailty assessment of the study participants. A larger number of participants might have identified significant trends in mortality. Conclusions: Patients undergoing less than 720 minutes of weekly HD had a higher QOL score on the KDQOL-36™ Symptoms/Problems List scale, presented less frequently to the emergency department, and were not more likely to die than patients undergoing 720 minutes or more of weekly HD. Further studies are required to assess the feasibility and safety of a conservative model of HD prescribing to improve QOL for patients with palliative care treatment goals.
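The emergency-department result above is an incidence rate ratio from negative binomial regression; an unadjusted rate ratio is the back-of-envelope analogue. A minimal sketch with hypothetical counts (not the study's adjusted model):

```python
def crude_irr(events_a, time_a, events_b, time_b):
    """Crude incidence rate ratio for group A vs group B:
    (events_a / person-time_a) / (events_b / person-time_b).
    Unadjusted sanity check only; the study used negative binomial
    regression to handle overdispersed visit counts."""
    return (events_a / time_a) / (events_b / time_b)

# Hypothetical: 5 ED visits over 100 person-months in group A vs
# 20 visits over 200 person-months in group B -> IRR 0.5.
print(crude_irr(5, 100, 20, 200))
```

An IRR below 1 (like the reported 0.52) means the first group accrued emergency department visits at roughly half the rate of the comparator per unit of follow-up time.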
de Boer, S.; Häntze, H.; Ziegelmayer, S.; van Ginneken, B.; Prokop, M.; Bressem, K. K.; Hering, A.
Background: Medical imaging, especially computed tomography (CT) and magnetic resonance imaging, is essential in the clinical care of patients with renal cell carcinoma (RCC). Artificial intelligence (AI) research into computer-aided diagnosis, staging, and treatment planning needs curated and annotated datasets. Across the literature, The Cancer Genome Atlas (TCGA) datasets are widely used for model training and validation. However, re-annotation is often necessary due to limited access to public annotations, raising entry barriers and hindering comparison with prior work. Methods: We screened 1,915 CT scans from three TCGA-RCC databases and employed a segmentation model to annotate kidney lesions. After a metadata-based exclusion step, we conducted a reader study with all papillary (n=56) and chromophobe (n=27) cases and 200 randomly selected clear cell RCC cases. Two students quality-checked and corrected the data and annotated tumors and cysts. Uncertain cases were reviewed by a board-certified radiologist. Results: After data exclusion and quality control, a total of 142 annotated CT scans from 101 patients (26 female, 75 male; mean age 56 years) remained. These include 95 CTs with clear cell RCC, 29 with papillary RCC, and 18 with chromophobe RCC. Images and voxel-level annotations of kidneys and lesions are open-sourced at https://zenodo.org/records/19630298. Conclusion: By making the annotations open source, we encourage accessible and reproducible AI research on renal cell carcinoma. We invite other researchers who have previously annotated any of these cohorts to share their annotations.
Yang, H.; Liu, Y.; Kim, C.; Huang, C.; Sawano, M.; Young, P.; McPadden, J.; Anderson, M.; Burrows, J. S.; Krumholz, H. M.; Brush, J. E.; Lu, Y.
Background: Hypertension is the leading modifiable risk factor for ischemic stroke, yet the adequacy of preventive hypertension care in routine clinical practice remains suboptimal. Whether gaps in hypertension management represent missed opportunities for stroke prevention remains unclear. Objective: To evaluate the association between hypertension care delivery and the risk of incident ischemic stroke. Methods: We conducted a retrospective, matched, nested case-control study among adults with hypertension using electronic health record data from a large regional health system (2010-2024). Patients with a first-ever ischemic stroke were matched 1:2 to controls on age, sex, race and ethnicity, and calendar time. Three care metrics were assessed during follow-up: (1) outpatient visits with blood pressure (BP) measurement per year; (2) number of antihypertensive medication ingredients; and (3) medication intensification score. Conditional logistic regression estimated adjusted odds ratios (aORs). Results: The study included 13,476 cases and 26,952 matched controls (N = 40,428). Mean (SD) age was 64.8 (12.2) years, 54.1% were female, and mean follow-up was 2,497 (1,308) days. Cases had fewer BP visits per year (median, 2.50 vs. 3.01; p < 0.001), a similar number of medication ingredients (2.00 vs. 2.00), and lower treatment intensification scores (-0.211 vs. -0.125). In adjusted models, >5 BP visits per year was associated with lower stroke odds (aOR, 0.55; 95% CI, 0.51-0.59) compared with ≤1 visit. Use of 2-3 medication ingredients (vs. 0) was also associated with reduced stroke odds (aOR, 0.80; 95% CI, 0.75-0.86), whereas >3 ingredients was not significant. The highest quartile of treatment intensification showed the strongest association (aOR, 0.47; 95% CI, 0.44-0.51). Findings were consistent across subgroup and sensitivity analyses, including strata defined by baseline and follow-up SBP.
Conclusions: Greater engagement in hypertension care was associated with lower odds of ischemic stroke, suggesting that gaps in routine management may represent missed opportunities for prevention.
Neely, M.; Wojdyla, D. M.; Hong, H.; Wang, P.; Anderson, M. R.; Arroyo, K.; Belperio, J.; Benvenuto, L.; Budev, M.; Combs, M.; Dhillon, G.; Hsu, J. Y.; Kalman, L.; Martinu, T.; McDyer, J.; Oyster, M.; Pandya, K.; Reynolds, J. M.; Rim, J. G.; Roe, D. W.; Shah, P. D.; Singer, J. P.; Singer, L.; Snyder, L. P.; Tsuang, W.; Weigt, S. S.; Christie, J. D.; Palmer, S. M.; Todd, J.
Background: We aimed to identify data-driven FEV1 trajectory phenotypes post-chronic lung allograft dysfunction (CLAD), relate these phenotypes to patient factors and future graft loss, and develop a classification approach for prospective patients. Methods: We studied adult first lung recipients with probable CLAD from two prospective multicenter cohorts: CTOT-20 (n=206) and LTOG (n=1418). FEV1 trajectories over the first nine months post-CLAD were characterized using joint latent class mixed models, jointly modelling time-to-graft loss to account for informative censoring. Models were fit independently in each cohort and separately among LTOG bilateral recipients. A classification and regression tree (CART) model was derived in LTOG bilateral recipients and applied to CTOT-20 bilateral recipients. Findings: Four distinct early FEV1 trajectory classes were identified in CTOT-20, with large differences in nine-month graft loss (72.3%, 31.1%, 2.2%, 0%). In LTOG, similar trajectory patterns were reproduced, with an additional class demonstrating early post-CLAD FEV1 improvement. Among bilateral recipients, trajectory classes showed a clear risk gradient, including a high-risk class with 100% graft loss and a low-risk class with no early graft loss. A CART model incorporating clinical and spirometric variables demonstrated good discrimination in LTOG bilateral recipients (multiclass AUC 0.85) and consistent class assignment and trajectory patterns when applied to CTOT-20. Interpretation: We identified reproducible, clinically meaningful early post-CLAD FEV1 trajectory phenotypes with differential graft loss risk. These phenotypes and a pragmatic classification tool may support risk stratification, trial enrichment, and improved prognostication for patients and clinicians.
Anthonio, O. G.; Olowu, B. I.; Olawuyi, D. A.; Aderemi, T. V.; Ajayi, O. J.
Background: Polycyclic aromatic hydrocarbons (PAHs) and volatile organic compounds (VOCs) are combustion-derived pollutants linked to cardiovascular disease. Prior NHANES analyses have evaluated these chemicals individually, failing to capture the correlated co-exposure structures that characterize real-world environmental burden and underscoring the need for multi-chemical approaches. In this study, we applied an unsupervised machine learning pipeline to urinary biomarker data to identify multi-chemical exposure clusters and quantify their differential cardiovascular risk profiles in a nationally representative US sample. Methods: We analyzed 2,979 participants from the 2017-2018 NHANES cycle, representing an estimated 36.8 million US adults after complex survey weighting. Twenty-five urinary biomarkers (6 PAH and 19 VOC metabolites) were log-transformed, imputed using Multivariate Imputation by Chained Equations (MICE), and standardized. Uniform Manifold Approximation and Projection (UMAP) was used for dimensionality reduction, followed by Gaussian Mixture Model (GMM) clustering. Survey-weighted prevalence estimates with 95% confidence intervals (CIs) were calculated for hypertension and high total cholesterol within each cluster. Weighted multivariable logistic regression was used to estimate odds ratios (ORs) for hypertension, adjusting for age, sex, race/ethnicity, and income. Results: Four exposure clusters were identified, with a mean assignment probability of 0.948. The high-combustion cluster (n=370; an estimated 5.1 million US adults) exhibited the highest multi-chemical burden and a weighted hypertension prevalence of 39.3% (95% CI, 37.2-41.4%), compared with 28.7% (95% CI, 21.9-35.5%) in the low-exposure reference group. After demographic adjustment, high-combustion cluster membership was independently associated with 38.4% higher odds of prevalent hypertension (OR 1.38).
The prediction model achieved a cross-validated area under the receiver operating characteristic curve (AUC) of 0.849 (SD 0.017). Non-Hispanic Black participants constituted approximately 40% of the high-combustion cluster, exceeding their representation in lower-risk clusters. Conclusions: Multi-chemical exposome profiling identified four cardiovascularly distinct subpopulations among US adults. Membership in the high-combustion exposure cluster was associated with higher odds of prevalent hypertension and disproportionately affected non-Hispanic Black participants. These findings support the use of multi-chemical approaches over single-pollutant analyses and highlight the relevance of environmental exposure patterns for policymaking and targeted cardiovascular risk stratification.
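The cluster-level results above are survey-weighted prevalences with confidence intervals. A simplified sketch using a Kish effective-sample-size approximation (NHANES variance estimation properly uses the full stratified, clustered design, e.g. Taylor linearization, which this deliberately ignores):

```python
import math

def weighted_prevalence_ci(flags, weights, z=1.96):
    """Survey-weighted prevalence with a normal-approximation 95% CI.
    flags: 0/1 outcome indicators; weights: survey weights.
    Variance is approximated via the Kish effective sample size,
    an illustrative shortcut, not a design-based estimator."""
    wsum = sum(weights)
    p = sum(w * y for w, y in zip(weights, flags)) / wsum
    n_eff = wsum ** 2 / sum(w * w for w in weights)  # Kish n_eff
    se = math.sqrt(p * (1 - p) / n_eff)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical mini-sample: equal weights reduce to the plain proportion.
p, lo, hi = weighted_prevalence_ci([1, 0, 1, 1], [1.0, 1.0, 1.0, 1.0])
print(p, lo, hi)
```

With highly unequal weights, n_eff shrinks well below the raw n, which is why the small low-exposure group above carries a wide interval (21.9-35.5%).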
Hesen, S.; Kassem, K. F.; Salah, M. S.
Type 2 diabetes mellitus (T2DM) is a progressive metabolic disorder characterized by persistent hyperglycemia, insulin resistance, and chronic low-grade inflammation. Despite the widespread use of established therapies such as metformin, long-term glycemic control remains suboptimal, and disease progression is often not adequately prevented. This highlights the need for novel therapeutic strategies that address both metabolic dysfunction and the underlying immunometabolic components of the disease. In this study, GLX10 (GLXM100) was evaluated as a novel immune modulator in a high-fat diet (HFD) and low-dose streptozotocin (STZ)-induced rat model of T2DM over a 91-day period. Glycemic outcomes were assessed using terminal random blood glucose and oral glucose tolerance testing (OGTT), with glucose exposure quantified by the area under the curve (AUC 0-120 min). Complementary in vitro investigations were performed in hepatic and macrophage cell models to assess cytocompatibility, nitric oxide production, and modulation of pro-inflammatory cytokines, including IL-6 and TNF-α. GLX10 treatment resulted in a significant reduction in random blood glucose levels and a marked improvement in glucose tolerance compared with diabetic control animals. Importantly, GLX10 demonstrated greater improvement in OGTT AUC than metformin under the same experimental conditions, indicating enhanced dynamic glucose regulation. In vitro, GLX10 maintained viability in normal hepatic cells while significantly suppressing nitric oxide production and inflammatory cytokine output in macrophages, supporting a favorable safety and immune profile. Collectively, these findings demonstrate that GLX10 exerts robust antidiabetic activity through a dual mechanism involving metabolic regulation and suppression of inflammatory signaling.
The integration of in vivo efficacy with supportive in vitro safety and mechanistic data provides a strong preclinical foundation and supports the further development of GLX10 as a promising therapeutic candidate for T2DM.
Khattab, A.; Wang, Z.; Srinivasasainagendra, V.; Tiwari, H. K.; Loos, R.; Limdi, N.; Irvin, M. R.
Background: Diabetic kidney disease (DKD) is a leading cause of kidney failure in individuals with type 2 diabetes (T2D), yet risk identification in routine clinical practice remains incomplete. A critical and often overlooked barrier is risk observability: how much of a patient's underlying risk is actually captured in their clinical record at the time of screening. Existing prediction models evaluate performance using model-specific thresholds, making it difficult to understand how additional data sources alter real-world screening behavior or which individuals benefit when models are expanded. Methods: We developed a series of five nested machine learning models evaluated at a one-year landmark following T2D diagnosis using data from the All of Us Research Program (N = 39,431; cases = 16,193). Each successive model added a distinct information layer -- intrinsic risk, laboratory snapshots, medication exposure, longitudinal care trajectories, and social determinants of health (SDOH) -- while retaining all prior features. All models were evaluated under a fixed screening policy targeting 90% specificity, so that the false positive rate remained constant as the information available to the model grew. External validation was conducted in the BioMe Biobank (N = 9,818) without retraining. Results: Discrimination improved consistently across layers, from an AUROC of 0.673 (M1) to 0.797 (M5). Under the fixed screening policy, sensitivity nearly doubled from 0.27 to 0.49, with a cumulative recovery of 30.4% of cases missed by the base model.
Gains were driven by distinct subgroups at each transition: laboratory features identified biologically high-risk individuals; medication features captured those with high treatment intensity reflecting advanced cardiometabolic burden; longitudinal care trajectory features rescued cases with biological instability observable only through repeated measurements; and SDOH features recovered individuals with limited clinical observability, with rescue probability highest among those with the fewest recorded monitoring domains. Sparse data in the clinical record indicated low observability, not low risk. Social and genetic features each contributed most when downstream physiologic signal was limited, supporting a contextual rather than universal role for each. In BioMe, discrimination was attenuated (M4 AUROC 0.659), but the relative ordering of information layers was fully preserved, and a systematic upward shift in predicted probability distributions underscored the need for recalibration before deployment in a new setting. Conclusions: DKD risk detection in T2D is substantially improved by integrating complementary information layers under a fixed clinical screening policy, with gains arising from distinct domains that identify at-risk individuals in different clinical contexts. The layered landmark framework introduced here reveals how risk observability -- shaped by monitoring intensity, healthcare engagement, and access -- determines what a screening model can detect, and provides a foundation for context-aware EHR-based screening that accounts for data availability at the time of risk assessment.
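The fixed screening policy described here pins the false-positive rate by choosing the score cutoff at the 90th percentile of non-case scores and then reading off sensitivity at that cutoff. A minimal sketch (the exact quantile convention is an assumption; the study's implementation may differ):

```python
def threshold_at_specificity(control_scores, specificity=0.90):
    """Pick the score cutoff so that `specificity` of non-cases fall
    below it, fixing the false-positive rate as models change.
    Simple order-statistic quantile; interpolation conventions vary."""
    s = sorted(control_scores)
    idx = int(specificity * len(s))
    return s[min(idx, len(s) - 1)]

def sensitivity_at(threshold, case_scores):
    """Fraction of cases flagged (score at or above the cutoff)."""
    return sum(c >= threshold for c in case_scores) / len(case_scores)

# Hypothetical scores: 100 controls with scores 0..99 give a 90%-
# specificity cutoff of 90; cases scoring [85, 95, 99, 50] then
# yield sensitivity 0.5 at that fixed cutoff.
t = threshold_at_specificity(list(range(100)))
print(t, sensitivity_at(t, [85, 95, 99, 50]))
```

Holding the cutoff policy fixed is what makes the M1-to-M5 sensitivity gain (0.27 to 0.49) interpretable: richer inputs flag more cases without spending more false positives.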
[Graphical abstract] Study design and layered DKD screening framework. The top row defines the cohort timeline, in which predictors are derived from clinical data collected between T2D diagnosis and the 1-year landmark, and incident DKD is ascertained after the landmark. The second row depicts the nested model architecture, in which five successive models sequentially incorporate intrinsic risk, laboratory snapshot features, medication exposure, longitudinal care trajectories, and social determinants of health, while retaining all features from prior layers. The third row summarizes model development in the All of Us Research Program (N = 39,431) and external validation in the BioMe Biobank (N = 9,818), where the same trained models and risk thresholds were applied without retraining. The bottom row highlights the three evaluation domains: predictive performance, fixed-policy screening, and missed-case recovery context. DKD, diabetic kidney disease; T2D, type 2 diabetes; PRS, polygenic risk scores; AUROC, area under the receiver operating characteristic curve; AUPRC, area under the precision-recall curve; PPV, positive predictive value; SHAP, SHapley Additive exPlanations.
Wright, R.; Martyn, T.; Keshishian, A.; Nagelhout, E.; Zeldow, B.; Udall, M.; Lanfear, D.; Judge, D. P.
Background: Progression of transthyretin (TTR) amyloid cardiomyopathy (ATTR-CM) can lead to worsening congestion requiring diuretic intensification (DI), heart failure (HF)-related hospitalizations (HFH), and death. Tafamidis was the only approved ATTR-CM therapy in the US from 2019 until the 2024 approval of acoramidis, which achieves near-complete (≥90%) TTR stabilization. As head-to-head trials are lacking, real-world comparative effectiveness (CE) data are needed to guide treatment selection. Objective: To evaluate the real-world CE of acoramidis versus tafamidis in newly treated patients with ATTR-CM. Methods: Retrospective study using Komodo Healthcare Map® US claims data tokenized to Claritas. Patients newly initiating acoramidis or tafamidis between 12/11/2024 and 04/30/2025 with ≥1 prescription claim (the first defining the index date) and ≥6 months of continuous enrollment before the index date were included and followed until disenrollment, death, treatment switch, or the study end date (07/31/2025). Outcomes included DI (initiation or dose-equivalent escalation of oral loop diuretics, parenteral loop diuretic use, or addition of a thiazide-like diuretic) and a composite of DI, HFH (inpatient admission with an HF-related ICD-10-CM diagnosis code in any position), and mortality. Propensity score weighting balanced baseline characteristics, disease severity, comorbidity burden, and baseline medication use. Time-to-event outcomes were assessed using weighted Cox proportional hazards models. Results: After weighting, acoramidis (n=170) and tafamidis (weighted sample size=448) patients were comparable at baseline (mean age, 78.6 vs. 78.7 years; male, 80.0% vs. 80.2%), with mean follow-up of 139 and 143 days, respectively. DI cumulative incidence curves separated early and remained divergent, with acoramidis significantly reducing the hazard of DI events by 43% compared with tafamidis (11.8% vs. 20.5%; HR, 0.57; 95% CI, 0.35-0.92; P=0.021).
Acoramidis also carried a significantly lower risk of composite events, with a 34% reduction in hazard compared with tafamidis (17.6% vs. 26.4%; HR, 0.66; 95% CI, 0.44-0.99; P=0.046). Conclusions: In this first real-world CE study of newly treated patients, acoramidis had a significantly lower risk of DI events and of composite events of DI, HFH, and mortality than tafamidis, potentially supporting improved clinical stability with acoramidis initiation. Additional evaluation with longer follow-up, larger cohorts, and/or prospective clinical outcomes is warranted.
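Propensity score weighting as used here can take several forms; one common choice when comparing a newer drug's initiators against a weighted comparator group is ATT-style weighting. A sketch under that assumption (the abstract does not state the estimand, propensity model, or trimming rules):

```python
def att_weights(treated, propensity):
    """Inverse-probability weights targeting the treated population
    (ATT): treated units get weight 1; comparators get p/(1-p),
    where p is the estimated probability of treatment. Illustrative
    only; the study's weighting scheme is not specified."""
    return [1.0 if t else p / (1.0 - p)
            for t, p in zip(treated, propensity)]

# Hypothetical pair: a treated patient keeps weight 1; a comparator
# with propensity 0.75 is up-weighted to 3, making the weighted
# comparator group resemble the treated group at baseline.
print(att_weights([1, 0], [0.9, 0.75]))  # [1.0, 3.0]
```

Up-weighting comparators in this way is consistent with the abstract's pattern of an unweighted acoramidis n (170) alongside a weighted tafamidis sample size (448).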
Di Somma, S.; Gervais, R.; Bains, M.; Carter-Williams, S.; Messner, S.; Onsongo, N.
Background: Chronic conditions such as hypertension can significantly disrupt daily life and emotional wellbeing. The interaction between patients' perceptions, adherence to antihypertensive medication and quality of life (QoL) remains underexplored outside structured clinical settings. Objectives: To capture unprompted patient perspectives and assess whether hypertension affects QoL, and to investigate whether patient-reported experiences are associated with self-reported antihypertensive medication adherence. Methods: Social media listening (SML) study analyzing 86,368 anonymized posts from individuals with hypertension in 12 countries, collected between January 2022 and May 2024. Posts from 11 countries (n=81,368) were analyzed using artificial intelligence-enabled natural language processing. Posts from China (n=5,000) were analyzed separately using a harmonized framework. Quantitative and qualitative methods assessed variations by country, age, and gender, and associations between emotional expression and antihypertensive medication adherence. Results: Across the 11-country core sample, 45% of posts mentioned at least one QoL impact, most commonly worry/anxiety (11%). Impacts varied across countries. Among 8,096 posts with age identified, individuals <40 years reported emotional balance impacts in 28% of posts versus 22% among those aged 40+. Work/education impacts were mentioned in 17% of posts by those <40 years vs. 12% in those 40+. Among 7,968 posts explicitly referencing adherence, expressed worry was associated with stricter adherence (62% association score), as were structured routines (79%), home monitoring (77%), dietary changes (77%), and exercise (71%). In contrast, sadness/depression was associated with inconsistent adherence (71%), as were forgetfulness (79%), side effects (73%), and cost/insurance concerns (65%).
Conclusions: These results emphasize the importance of the psychological and emotional impact of hypertension, including on adherence to medication regimens, reinforcing the value of a holistic approach to patient care.
Tokodi, M.; Kagiyama, N.; Pandey, A.; Nakamura, Y.; Akama, Y.; Takamatsu, S.; Toki, M.; Kitai, T.; Okada, T.; Lam, C. S.; Yanamala, N.; Sengupta, P.
Background: Accurate assessment of diastolic function and left ventricular (LV) filling pressure is central to heart failure diagnosis and risk stratification. Contemporary guideline algorithms rely on complex parameters that are not consistently available in routine clinical practice. Objective: To compare the diagnostic and prognostic performance of the 2016 American Society of Echocardiography/European Association of Cardiovascular Imaging (ASE/EACVI) and 2025 ASE guidelines with a deep learning model based on routinely acquired echocardiographic variables. Methods: This study evaluated the guideline-based algorithms and a deep learning model in participants from the Atherosclerosis Risk in Communities (ARIC) cohort (n=5450) for prognostication and in two invasive hemodynamic validation cohorts from the United States (n=83) and Japan (n=130) for detection of elevated left ventricular filling pressure. Results: In the ARIC cohort, the deep learning model demonstrated superior prognostic performance compared with the 2016 and 2025 guidelines (C-index: 0.676 vs. 0.638 and 0.602, respectively; both p<0.001). Similar findings were observed among participants with preserved ejection fraction (C-index: 0.660 vs. 0.628 and 0.590; both p<0.001), with improved performance compared with the H2FPEF score (C-index: 0.660 vs. 0.607; p<0.001). In the US hemodynamic validation cohort, the deep learning model showed higher diagnostic performance than the 2025 guidelines (AUC: 0.879 vs. 0.822; p=0.041) and similar performance compared with the 2016 guidelines (AUC: 0.879 vs. 0.812; p=0.138). In the Japanese hemodynamic validation cohort, the deep learning model outperformed both guidelines (AUC: 0.816 vs. 0.634 and 0.694; both p<0.05).
Conclusions: A deep learning model leveraging routinely available echocardiographic parameters demonstrated improved diagnostic and prognostic performance compared with contemporary guideline-based approaches, potentially offering a scalable alternative for assessing diastolic function and left ventricular filling pressures.
Carlquist, J.; Scott, S. S.; Wright, J. C.; Jianing, M.; Peng, J.; Mokadam, N. A.; Whitson, B. A.; Smith, S.
Purpose: Obstructive sleep apnea (OSA) is a common comorbidity in heart failure (HF) patients, with prevalence increasing as HF severity worsens. While CPAP/BiPAP has been shown to reduce disease burden and mortality in the general HF population, it is unclear whether these benefits extend to patients with left ventricular assist devices (LVADs). We sought to determine whether OSA affects long-term survival in newly implanted LVAD patients and whether CPAP/BiPAP treatment confers mortality benefits. Methods: This single-center retrospective study included patients who underwent LVAD implantation between January 2007 and February 2022. Recipients were stratified by OSA status (OSA vs No-OSA), and those with OSA were further categorized based on CPAP/BiPAP compliance. Comparative statistics and Kaplan-Meier survival analyses were performed, with log-rank tests used to compare groups and assess survival differences. A Cox proportional hazards model was conducted to evaluate the association between risk factors and survival among patients with OSA and No-OSA. Results: Before LVAD implantation, patients with OSA had a higher body mass index, a higher prevalence of hypertension, and a higher rate of implantable cardioverter-defibrillator placement than those without OSA. OSA was not associated with increased postoperative complications. Although survival did not differ significantly between OSA and No-OSA patients (p=0.33), CPAP/BiPAP-compliant OSA patients had significantly better survival than noncompliant patients (p=0.0099). Conclusions: LVAD patients with OSA who consistently use CPAP/BiPAP have better survival than those who do not. CPAP/BiPAP is a simple, low-risk treatment that can reduce mortality in this population. Therefore, increased perioperative screening for OSA should be considered for patients receiving LVADs. Multicenter studies are needed to further confirm our findings.
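The survival comparisons above rest on the Kaplan-Meier estimator. A minimal sketch of how that estimator steps the survival curve down at each event time (the times and censoring flags below are illustrative, not study data):

```python
# Kaplan-Meier product-limit estimator (standard method; toy data only).
# event = 1 marks an observed death, event = 0 marks censoring.
def kaplan_meier(times, events):
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)   # events at this time
        n_t = sum(1 for tt, _ in data if tt == t)      # all leaving risk set
        if deaths:
            surv *= 1 - deaths / n_at_risk             # product-limit step
            curve.append((t, surv))
        n_at_risk -= n_t
        i += n_t
    return curve

times  = [3, 5, 5, 8, 12, 12, 15]   # hypothetical months on support
events = [1, 1, 0, 1, 1, 0, 1]      # 1 = death, 0 = censored
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

A log-rank test, as used in the study, would then compare two such curves by contrasting observed versus expected events at each event time.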
Adams, J. C.; Pullmann, D.; Belostotsky, H.; Mestvirishvili, T.; Chiu, E.; Oh, C.; Rabbani, P. S.
Objective: This study evaluates the impact of systemic GLP-1 receptor agonist (GLP-1RA) use on surgical wound healing in high-risk surgical populations, including patients with diabetes, and its implications for perioperative planning and healing outcomes. Approach: This pilot retrospective cohort study compared adult surgery patients with non-healing postoperative wounds according to their GLP-1RA use. Outcomes included healing status, time to wound closure, and number of surgical interventions. Results: The cohort included 35 non-GLP-1RA users and 16 GLP-1RA users with comparable baseline characteristics, except for a significantly higher prevalence of venous insufficiency among users. Although median time to closure was similar across patients, users required fewer surgical interventions, and a significantly greater proportion of their wounds reached closure compared with non-users. Among patients with diabetes, all GLP-1RA users achieved healing, a significant difference from non-users. Innovation: The impact of GLP-1RA therapy on wound healing in high-risk reconstructive and soft-tissue surgery remains poorly defined. This pilot cohort addresses that gap, offering an early signal that GLP-1RA use is associated with improved wound healing and fewer postoperative interventions. These findings may inform perioperative practice by identifying a systemic pharmacologic factor that optimizes surgical outcomes in high-risk populations. Conclusion: GLP-1RA use was associated with higher healing rates and fewer interventions, particularly among patients with diabetes. These findings support a beneficial role in surgical wound healing and warrant larger multi-site studies.
Zhang, H.; Dromard, E.; Tsang, K. C. H.; Guemes, A.; Guo, Z.; Baldeweg, S. E.; Li, K.
Non-invasive glucose monitoring (NIGM) has been pursued for decades, yet no device has achieved regulatory approval despite numerous studies reporting high accuracy. This systematic review and meta-analysis of 32 studies (38 cohorts: 20 NIGM, 18 iCGM; N = 1,693) investigated methodological factors underlying this accuracy-regulatory gap. The pooled Mean Absolute Relative Difference (MARD) for NIGM (10.21%; 95% CI: 8.73-11.69%) showed no significant difference from iCGM (11.82%; 95% CI: 10.36-13.29%; p = 0.13), with extreme heterogeneity (I² = 95.2%). Meta-regression revealed that study duration was the strongest predictor of NIGM accuracy (β = 3.94, p < 0.001), with MARD degrading from 8.7% in short-term to 15.2% in long-term studies, while iCGM accuracy remained stable. Only 15% of NIGM cohorts validated in the hypoglycemia range, compared to 89% of iCGM studies (p < 0.001). These findings suggest that reported NIGM accuracy is substantially influenced by methodological asymmetries.
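MARD, the accuracy metric pooled above, has a standard definition: the mean of the absolute device-versus-reference differences, each taken relative to the reference value. A minimal sketch with hypothetical readings (not study data):

```python
# Mean Absolute Relative Difference (MARD), the standard CGM accuracy metric.
def mard(device, reference):
    """MARD in percent: mean of |device - reference| / reference."""
    pairs = list(zip(device, reference))
    return 100 * sum(abs(d - r) / r for d, r in pairs) / len(pairs)

device_mgdl    = [95, 140, 180, 70]   # hypothetical sensor readings (mg/dL)
reference_mgdl = [100, 150, 170, 80]  # hypothetical reference values (mg/dL)

print(f"MARD = {mard(device_mgdl, reference_mgdl):.1f}%")  # MARD = 7.5%
```

Because each error is divided by the reference value, a fixed absolute error inflates MARD most in the hypoglycemia range, which is one reason the review's finding of sparse hypoglycemia validation among NIGM cohorts matters.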
Goldwater, J. C.; Harris, Y.; Das, S. K.; Fernandez Galvis, M. A.; Maru, D.; Jordan, W. B.; Sacaridiz, C.; Norwood, C.; Kim, S. S.; Neustrom, K.
OBJECTIVE: To evaluate the return on investment (ROI) of a community-based Diabetes Self-Management Program (DSMP) enhanced with health-related social needs (HRSN) screening and referrals, implemented by the New York City (NYC) Department of Health and Mental Hygiene with three community-based organizations in highly impacted, under-resourced neighborhoods. RESEARCH DESIGN AND METHODS: A retrospective cost-benefit analysis from a public-sector payer perspective was conducted among 171 adults with type 2 diabetes who completed a six-week, peer-led DSMP delivered by community health workers (CHWs) in English, Spanish, and Korean during 2018-2019. A time-driven, activity-based costing model captured direct implementation costs, CHW workforce turnover, and administrative overhead. Monetized benefits included avoided diabetes-related complications, reductions in self-reported emergency department (ED) visits and hospitalizations, and quality-adjusted life-year (QALY) gains from improved medication adherence. Univariate sensitivity analyses tested robustness under conservative assumptions. RESULTS: Total program costs were $179,224; monetized benefits totaled $1,824,213, yielding a net benefit of $1,644,989 and an ROI of 918%, approximately $10 returned per $1 invested. Excluding QALY gains, ROI remained 551%. Self-reported ED visits declined from 149 to 82 and hospitalizations from 93 to 24 in the six months following intervention. Over 80% of participants reported housing instability; 72% were Medicaid-covered and 16% uninsured. Sensitivity analyses confirmed a positive ROI under all conservative scenarios. CONCLUSIONS: A CHW-led, community-based DSMP integrated with HRSN screening and referrals delivered substantial economic and public health value among adults facing housing instability and structural barriers to care. Findings support inclusion of DSMP as a covered benefit in Medicaid managed care, value-based payment arrangements, and housing access initiatives to advance equitable diabetes outcomes.
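The abstract's headline figures are internally consistent and can be reproduced from the reported totals (a quick arithmetic check, not part of the study's costing model):

```python
# Sanity check of the reported cost-benefit arithmetic, using only the
# two totals stated in the abstract.
costs = 179_224          # total program costs ($)
benefits = 1_824_213     # total monetized benefits ($)

net_benefit = benefits - costs
roi = net_benefit / costs             # ROI = net benefit per dollar invested
return_per_dollar = benefits / costs  # gross return per $1 invested

print(f"net benefit: ${net_benefit:,}")           # net benefit: $1,644,989
print(f"ROI: {roi:.0%}")                          # ROI: 918%
print(f"return per $1: ${return_per_dollar:.2f}") # return per $1: $10.18
```

Note the two conventions: the 918% ROI is net benefit over costs, while "$10 per $1" is the gross benefit-cost ratio; both follow from the same two totals.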
Than, M.; Pickering, J. W.; Joyce, L. R.; Buchan, V. A.; Florkowski, C. M.; Mills, N. L.; Hamill, L.; Prystowsky, J.; Harger, S.; Reed, M.; Bayless, J.; Feberwee, A.; Attenburrow, T.; Norman, T.; Welfare, O.; Heiden, T.; Kavsak, P.; Jaffe, A. S.; Apple, F.; Peacock, W. F.; Cullen, L.; Aldous, S.; Richards, A. M.; Lacey, C.; Troughton, R.; Frampton, C.; Body, R.; Mueller, C.; Lord, S. J.; George, P. M.; Devlin, G.
BACKGROUND Point-of-care (POC) high-sensitivity cardiac troponin (hs-cTn) testing has the potential to expedite decision-making and reduce emergency department (ED) length of stay for patients presenting with possible myocardial infarction (MI) by ensuring that results are consistently available when looked for by clinicians. We assessed the real-life effectiveness and safety of implementing POC hs-cTn testing in the ED. METHODS We conducted a pragmatic, stepped-wedge cluster randomized trial. The control arm was usual care with an accelerated diagnostic pathway utilizing a single-sample rule-out step with a central laboratory hs-cTn assay. The intervention arm used the same pathway with a POC hs-cTnI. The primary effectiveness outcome was ED length of stay assessed using a generalized linear mixed model, and the safety outcome was 30-day MI or cardiac death. RESULTS Six sites participated with 59,980 ED presentations (44,747 individuals, 61±19 years, 49.5% female) from February 2023 to January 2025, of which 31,392 presentations occurred during the intervention arm. After adjustment for covariates associated with length of stay, the intervention reduced length of stay by 13% (95% confidence interval [CI], 9 to 16%; P<0.001), corresponding to a reduction of 47 minutes (95% CI, 33 to 61 minutes) from a mean length of stay in the control arm of 376 minutes. The 30-day MI or cardiac death rate was similar in the control and intervention arms (0.39% and 0.39%, respectively; P=0.54). CONCLUSIONS Implementation of whole-blood hs-cTnI testing at the POC into an accelerated diagnostic pathway was safe and reduced length of stay in the ED compared with laboratory testing.
Zhang, R.
Aims The oral glucose tolerance test (OGTT) is effective for detecting post-load dysglycemia, but it is burdensome and therefore not routinely used. Continuous glucose monitoring (CGM) offers a convenient way to capture real-world glucose patterns, yet it remains unclear whether CGM-derived metrics reflect OGTT-defined dysglycemia. We therefore aimed to evaluate CGM-derived and clinical metrics for predicting OGTT 2-hour glucose, classifying OGTT-defined dysglycemia, and assessing day-to-day repeatability. Methods We analyzed a cohort with paired free-living CGM and OGTT. Multiple CGM-derived metrics and clinical measures were compared for prediction of OGTT 2-hour glucose, classification of OGTT-defined dysglycemia, and day-to-day stability. Predictive performance was assessed primarily by leave-one-out (LOO) R², and day-to-day repeatability by intraclass correlation coefficients (ICC). Results The glycemic persistence index (GPI), a metric integrating the magnitude and duration of glycemic elevation, was the strongest single predictor of OGTT 2-hour glucose (LOO R² = 0.439). GPI also showed strong day-to-day repeatability (ICC = 0.665) and ranked first on a combined prediction-stability score. For classification of OGTT-defined dysglycemia, HbA1c had a slightly higher AUC than GPI, but GPI plus HbA1c performed best overall, indicating complementary information. Conclusions GPI was a strong predictor of OGTT 2-hour glucose and showed a favorable balance between predictive performance and day-to-day stability, supporting its potential utility as a CGM-derived marker of dysglycemia.
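The LOO R² used to rank predictors above scores each observation by a model fit on the remaining n-1 points, then computes R² over those held-out predictions. A minimal sketch with a one-predictor least-squares fit (the line model and the data are illustrative assumptions, not the paper's method):

```python
# Leave-one-out (LOO) R^2 for a simple one-predictor least-squares model.
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def loo_r2(xs, ys):
    """R^2 computed from held-out predictions: each point is predicted
    by a line fit on the other n-1 points."""
    preds = []
    for i in range(len(xs)):
        b, a = fit_line(xs[:i] + xs[i+1:], ys[:i] + ys[i+1:])
        preds.append(a + b * xs[i])
    my = sum(ys) / len(ys)
    ss_res = sum((y - p) ** 2 for y, p in zip(ys, preds))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

xs = [1.0, 2.0, 3.0, 4.0, 5.0]    # hypothetical CGM-derived metric values
ys = [5.1, 6.9, 9.2, 10.8, 13.1]  # hypothetical OGTT 2-hour glucose values
print(round(loo_r2(xs, ys), 3))
```

Because every prediction is out-of-sample, LOO R² penalizes overfitting and can be much lower than in-sample R², which makes it a fair basis for comparing metrics of differing flexibility.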
Haines, M. H.; Ronayne, S. M.; Pickles, K.; Begg, D. A.; Hurley, P. J.; Ferraccioli, M.; Desmond, P.; Opie, N. L.
This research demonstrates that the trans-aqueduct approach is a feasible, minimally invasive access pathway to the third ventricle, offering a potential route to the deep brain for therapeutic technologies. Further pre-clinical investigation is required to thoroughly evaluate physiological tolerance, trauma risk, and the long-term implications of intraventricular implantation. The third ventricle is a high-value site for neuromodulation due to its proximity to deep-brain targets, including the subthalamic nucleus (STN) and globus pallidus internus (GPi). This study defined the anatomical pathway and evaluated the technical feasibility of retrograde access to the third ventricle via the cerebral aqueduct using minimally invasive interventional techniques. Evaluation was conducted in three phases using human MRI datasets (n=16; mean age 48.4 years) and cadaveric specimens (n=6; mean age 88.2 years). Phase 1 involved morphometric MRI analysis of the aqueduct and ventricles. Phase 2 tested trans-aqueduct access on cadaver specimens via fluoroscopically guided guidewires and catheters. Phase 3 utilized direct anatomical dissections on cadaver specimens (n=3) to morphometrically measure the third ventricular cavity and its relationship to deep-brain nuclei. Measurements across the sample groups showed a mean aqueduct diameter of 1.6 mm (SD=0.14). Third ventricle dimensions averaged 27.6 mm (ventral-dorsal), 19.9 mm (caudal-cranial), and 5.7 mm (lateral). Successful access to the third ventricle was achieved in 83% (5/6) of cadaveric specimens. The optimal technical configuration utilized a 0.018-inch angled-tip guidewire and 5-6 Fr catheters; the aqueduct accommodated diameters up to 2.0 mm with minimal resistance. The STN and GPi were localized within 5-20 mm of the ventricular volumetric centroid. The trans-aqueduct approach is a technically feasible, minimally invasive pathway for accessing the third ventricle.
This route offers a potential alternative for the delivery of therapeutic neurotechnologies. Further research is required to assess physiological tolerance, trauma risk, and the long-term safety of intraventricular implantation.