Transplantation
Ovid Technologies (Wolters Kluwer Health)
All preprints, ranked by how well they match Transplantation's content profile, based on 13 papers previously published here. The average preprint has a 0.03% match score for this journal, so anything above that is already an above-average fit. Older preprints may already have been published elsewhere.
Zirngibl, F.; Gebert, P.; Materne, B.; Launspach, M.; Kuenkele, A.; Hundsoerfer, P.; Cyrull, S.; Deubzer, H. E.; Kuehl, J. S.; Eggert, A.; Lang, P.; Oevermann, L.; von Stackelberg, A.; Schulte, J. H.
Allogeneic hematopoietic stem cell transplantation (HSCT) serves as a therapeutic intervention for various pediatric diseases. Acute kidney injury afflicts 21-84% of pediatric HSCT cases, significantly compromising clinical outcomes. This retrospective single-institution analysis scrutinized the practice of substituting nephrotoxic ciclosporin A (CsA) with the everolimus/mycophenolate mofetil (MMF) combination as graft-versus-host disease (GVHD) prophylaxis in 57 patients following first allogeneic matched donor HSCT. The control cohort comprised 74 patients not receiving everolimus during the same timeframe. Study endpoints encompassed the emergence of retention parameters subsequent to the switch to everolimus, overall survival, relapse incidence of the underlying disease, and acute and chronic GVHD in both treatment groups. Our findings reveal a significant improvement in renal function, evidenced by reduced creatinine and cystatin C levels 14 days after ceasing CsA and initiating everolimus treatment. Crucially, the transition to everolimus did not adversely affect overall survival post-HSCT (HR 1.4; 95% CI: 0.64-3.1; p=0.39). Comparable incidences of grade 2-4 and grade 3-4 acute GVHD as well as severe chronic GVHD were observed in both groups. Patients with an underlying malignant disease exhibited similar event-free survival in both treatment arms (HR 0.87; 95% CI: 0.39-1.9; p=0.73). This study provides compelling real-world clinical evidence supporting the feasibility of replacing CsA with everolimus and of using the everolimus/MMF combination to manage acute kidney injury following HSCT in children. Key points: (i) Everolimus with or without MMF restores kidney function in children with acute kidney injury after allogeneic HSCT. (ii) Everolimus with or without MMF effectively prevents acute and chronic GVHD and leads to overall survival similar to standard therapy.
Charles, P. D.; Fawaz, S.; Vaughan, R. H.; Davis, S.; Joshi, P.; Vendrell, I.; Tam, K. H.; Fischer, R.; Kessler, B. M.; Sharples, E. J.; Santos, A.; Ploeg, R. J.; Kaisar, M.
Background: Organ availability limits kidney transplantation, the best treatment for end-stage kidney disease. Globally, deceased donor acceptance criteria have been relaxed to include older donors, which comes with a higher risk of inferior posttransplant outcomes. Donor age, although it negatively impacts transplant outcomes, lacks granularity in predicting graft dysfunction. Better donor kidney assessment, and characterization of the biological mechanisms underlying age-associated donor organ damage and transplant outcomes, is key to improving donor kidney utilisation and transplant longevity. Methods: 185 deceased-donor pretransplant biopsies (from brain and circulatory death donors aged 18-78 years) were obtained from the Quality in Organ Donation (QUOD) biobank, and proteomic profiles were acquired by mass spectrometry. Machine learning exploration using prediction rule ensembles guided LASSO regression modeling of kidney proteomes, which identified protein signatures and biological mechanisms associated with 12-month posttransplant outcome. Data modeling was validated on held-out data and contextualised against published spatially resolved kidney injury-related transcriptomes. Results: Our analysis highlighted that outcomes were best modeled using a combination of donor age and protein abundance signatures, revealing 539 proteins with these characteristics. Modeled age:protein interactions demonstrated stronger associations with transplant outcomes than age or protein alone and revealed mechanisms of kidney injury, including metabolic changes and innate immune responses, correlated with poor outcome. Comparison to single-cell transcriptome data links protein-outcome associations to specific cell types. Conclusions: Molecular signatures resulting from the integration of donor age and proteomic profiles in deceased donor kidney biopsies offer the potential to develop improved pretransplant organ assessment and to aid decisions on perfusion interventions.
Rothwell, A.; Nita, G.; Howse, M.; Ridgway, D.; Hammad, A.; Mehra, S.; Jones, A. R.; Goldsmith, P.
The development of de novo donor-specific antibodies (DSAs) against HLA is associated with premature graft failure in kidney transplantation. However, rates and factors influencing de novo DSA formation vary widely across the literature. We aimed to identify pre-transplant factors influencing the development of de novo HLA-specific antibodies following kidney transplantation using machine learning. Data from 460 kidney transplant recipients at a single centre between 2009 and 2014 were analysed. Pre-transplant variables were collected, and post-transplant sera were screened for HLA antibodies. Positive samples were investigated using Single Antigen Bead (SAB) testing. Machine learning models (Classification and Regression Trees, Random Forest, XGBoost, CatBoost) were trained on a training set of pre-transplant data to predict de novo DSA formation, with and without SMOTE oversampling. Model performance was evaluated on an independent testing set using F1 scores, and feature importance was assessed using SHAP. In the full cohort analysis, XGBoost models performed best, with F1 scores of 0.54-0.59 without SMOTE and 0.72-0.79 with SMOTE. The strongest predictors were pre-transplant HLA antibodies, number of kidney transplants, cold ischemia time (CIT), recipient age, and female gender. SHAP dependence plots showed that pre-existing HLA antibodies and past transplants increased the risk of de novo DSA development. In the unsensitised subgroup analysis, model performance was poor. Machine learning models can be used to identify pre-transplant risk factors for de novo HLA-specific antibody development in kidney transplantation. Monitoring and risk-stratifying patients based on these factors may help guide preventive immunological strategies and recipient selection to improve long-term allograft outcomes. Translational statement: This study identified pre-transplant risk factors for the development of de novo HLA-specific antibodies in kidney transplantation. Monitoring and risk-stratifying patients based on these factors may help guide preventive immunological strategies and recipient selection to improve long-term allograft outcomes.
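The F1 score used to evaluate these models is the harmonic mean of precision and recall, which is why it suits an imbalanced outcome like de novo DSA formation. A minimal sketch of the computation from confusion-matrix counts (the counts below are hypothetical, not taken from the study):

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 = harmonic mean of precision and recall, from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical test-set counts for a de novo DSA classifier:
# 20 true positives, 10 false positives, 10 false negatives.
print(round(f1_score(20, 10, 10), 2))  # → 0.67
```

A classifier that always predicts "no DSA" would score high accuracy on a rare outcome but zero recall, and hence an F1 of zero; this is why the study reports F1 rather than accuracy.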
Mauduit, V.; Durand, A.; Morin, M.; Collins, K.; Roder, M.; Most, P. v. d.; Maudet, K.; Silva, N. D. S. B.; Rousseau, O.; Shanmugam, A.; Gilbert, E.; Lord, G.; Cavalleri, G.; Snieder, H.; Bakker, S.; Gourraud, P.-A.; Ribatet, M.; Jean, G.; Kerleau, C.; Giral, M.; Vince, N.; de Borst, M. H.; Viklicky, O.; Conlon, P.; Limou, S.
BACKGROUND: Despite a sharp rise in kidney graft short-term survival rates and better donor-recipient HLA matching, mid- and long-term survival have not sufficiently improved over the past decades. Several studies suggest that non-HLA factors could be involved in kidney allograft injury, but no validated marker has yet been identified. Here, we aimed to find genetic variations and mismatches associated with graft function and survival. METHODS: Using genome-wide strategies, we tested recipients' and donors' common variations (SNPs and CNVs), and donor-recipient genetic mismatches, for association with 1-year kidney graft function and with time-to-death-censored kidney graft failure in a monocentric European cohort of 1,482 complete donor-recipient pairs. We validated our findings through a meta-analysis in two independent European cohorts gathering a total of 1,842 additional complete pairs. RESULTS: We did not identify any significant association with 1-year graft function. However, we discovered four non-HLA mismatches (3 SNPs and 1 CNV) associated with time to kidney graft failure. One signal in a regulatory region upstream of the TOM1L1 gene (p=6.3x10^-9, HR=4.1) was successfully replicated in the validation cohorts (p_meta-analysis=6.7x10^-9, HR=2.9) and ranked among the top 50 rejection-specific genes in a pan-organ transcriptomics study. This locus was also associated with time to cellular and humoral rejection (p=0.02) in the discovery cohort in patients achieving primary graft function. CONCLUSIONS: By running one of the largest kidney transplantation genomic analyses ever performed, we identified and confirmed a novel donor-recipient genetic mismatch in a biologically relevant non-HLA locus associated with kidney allograft failure.
Lay Summary: Despite significant advances in immunosuppression, kidney transplant recipients remain at risk of graft rejection and, in more severe cases, graft failure, which can lead to retransplantation, return to dialysis, or even patient death. Donor-recipient HLA compatibility is integrated into kidney transplant clinical care, as this genetic region is key for immunity and graft tolerance. However, several studies suggest that compatibility outside of the HLA region could also influence graft survival. We investigated this hypothesis in a cohort of 1,482 donor-recipient pairs and found a novel genetic region involved in kidney graft dysfunction that was validated in a meta-analysis of two independent cohorts. These findings contribute to a better understanding of the impact of donor-recipient genetic compatibility on kidney transplant outcomes.
Einsamtrakoon, T.; Tharabenjasin, P.; Pabalan, N.; Tasanarong, A.
Aim: Allograft survival post-kidney transplantation (KT) is in large part attributed to genetics, which renders the recipient susceptible to, or protected from, allograft rejection. KT studies involving single nucleotide polymorphisms (SNPs) have reported the association of interleukin-18 (IL-18) with KT and its role in allograft rejection. However, the reported outcomes have been inconsistent, prompting a meta-analysis to obtain more precise estimates. Methods: We posed two hypotheses about the IL-18 SNPs: their association with KT (H1), and an increase or decrease in the risk of allograft rejection (H2). Using standard genetic models, we estimated odds ratios (ORs) and 95% confidence intervals by comparing the IL-18 genotypes between two groups: (i) patients and controls for H1 (GD: genotype distribution analysis); (ii) rejectors and non-rejectors for H2 (allograft analysis). Multiple comparisons were corrected with the Holm-Bonferroni (HB) test. Subgrouping was ethnicity-based (Asians and Caucasians). Heterogeneity was addressed by outlier treatment, and robustness of outcomes was assessed by sensitivity analysis. Results: This meta-analysis generated eight significant outcomes, which HB correction filtered down to four core outcomes, found in the dominant/codominant models. Two of the four were in GD, indicating associations of the IL-18 SNPs with KT (ORs 1.34 to 1.39, 95% CIs 1.13-1.70, PHB = .0007-.004). The other two were in the allograft analysis, indicating reduced risk, with HB P-values of .03 for the overall group (OR 0.74, 95% CI 0.56-0.93) and Asians (OR 0.70, 95% CI 0.53-0.92). In contrast to the protected Asian subgroup, Caucasians showed a non-significant increased risk (OR 1.20, 95% CI 0.82-1.75, Pa = .35). Sensitivity treatment conferred robustness on all the core outcomes. Conclusions: The overall association of IL-18 SNPs with KT was significant (up to 1.4-fold), and Asian KT recipients were protected (up to 30%). Enabled by outlier treatment, these findings were supported by non-heterogeneity and robustness. More studies may confirm or modify our findings.
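The Holm-Bonferroni step-down correction used above can be sketched in a few lines: sort the p-values, multiply the k-th smallest by the number of hypotheses not yet rejected, cap at 1, and keep the adjusted values non-decreasing along the sort order. The example p-values here are hypothetical, not those of the meta-analysis:

```python
def holm_bonferroni(pvalues):
    """Holm step-down adjusted p-values, returned in the input order."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        # Multiply the rank-th smallest p-value by (m - rank), cap at 1,
        # and enforce monotonicity across the sorted sequence.
        running_max = max(running_max, min(1.0, (m - rank) * pvalues[i]))
        adjusted[i] = running_max
    return adjusted

print(holm_bonferroni([0.01, 0.04, 0.03, 0.005]))  # → [0.03, 0.06, 0.06, 0.02]
```

Compared with plain Bonferroni (multiply every p-value by m), the step-down scheme is uniformly at least as powerful while still controlling the family-wise error rate, which is why it is a common choice for the handful of outcomes tested here.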
Bocchi, F.; Beldi, G.; Kuhn, C.; Storni, F.; Mueller, N.; Sidler, D.
The demographics of donor and recipient candidates for kidney transplantation (KT) have substantially changed. Recipients tend to be older and polymorbid, and KT to marginal recipients is associated with delayed graft function (DGF), prolonged hospitalization, inferior long-term allograft function, and poorer patient survival. In parallel, donors are also older and suffer from several comorbidities, and donations after circulatory death (DCD) predominate, which in turn leads to early and late complications. However, it is unclear how donor and recipient risk factors interact. In this retrospective cohort study, we assess the overall and combined impact of KT from marginal donors to marginal recipients. We focused on: 1) DGF; 2) hospital stay and number of dialysis days after KT; and 3) allograft function at 6 months. Among the 369 KTs included, the overall DGF rate was 25% (n = 92) and the median time from reperfusion to DGF resolution was 7.8 days (IQR: 3.0-13.8 days). Overall, patients received a median of four dialysis sessions (IQR: 2-8). The combination of pre-KT anuria (< 200 ml/24h, 32%) and DCD procurement (14%) was significantly associated with DGF, length of hospital stay, and severe perioperative complications, predominantly in recipients 50 years and older.
Morin, M.; Mauduit, V.; Bugnon, A.; Danger, R.; Brouard, S.; Durand, A.; Masset, C.; Ville, S.; Rousseau, O.; Kerleau, C.; Renaudin, K.; Blancho, G.; Vince, N.; Giral, M.; Limou, S.
BACKGROUND: Chronic antibody-mediated rejection (CAMR) is the main cause of late kidney allograft loss, and no specific effective treatment has been identified so far. Here, we explored the non-invasive peripheral blood transcriptome signature of CAMR. METHODS: First, we compared PBMC gene expression from bulk RNA-seq between 35 patients experiencing late CAMR (mean = 7.1 years) and 43 patients without graft dysfunction at late stages (Stable, mean = 4.4 years) to identify the molecular drivers of CAMR. Second, we explored the 1-year gene expression signature in stable patients exhibiting (n=11) or not (n=51) a subsequent CAMR to define possible predictive biomarkers of CAMR. RESULTS: We reported 188 differentially regulated genes during late CAMR (q<0.05). Importantly, CAMR is associated with an upregulation of genes from the degranulation pathway (e.g. MMP9, MMP8 and LCN2) and from the C1q complement complex (e.g. C1QA, C1QB and C1QC), as well as with a downregulation of genes associated with subclinical rejection (e.g. TCL1A). The upregulated degranulation and complement signatures were validated in six independent cohorts gathering a total of 360 stable and 131 chronic rejection patients. Contrary to the injury effect observed during late stages, MMP9 was downregulated at 1 year in PBMCs of patients who later experienced CAMR. CONCLUSIONS: These results suggest a dual role for MMP9 expression, with an early protective effect against CAMR and deleterious effects at the later stage. MMP9 peripheral expression appears to be a promising biomarker candidate for kidney transplantation follow-up. Translational Statement: CAMR is the main cause of late kidney allograft loss, and we aimed to identify molecular targets and biomarkers. By comparing CAMR and stable kidney-transplanted patients prior to and during diagnosis, we identified MMP9 as a potential biomarker for both CAMR prognosis and diagnosis. We also highlighted the C1q complement complex and TCL1A in CAMR as potential diagnostic biomarkers. These results provide new insights into CAMR pathophysiology and may guide the development of innovative treatments targeting MMP9 expression in kidney transplant recipients. Ultimately, this work lays the foundation for exploring MMP9 expression kinetics and developing new ways of treating CAMR patients.
Baghai Arassi, M.; Feisst, M.; Krupka, K.; Awan, A.; Benetti, E.; Duzova, A.; Guzzo, I.; Kim, J. J.; Koenig, S.; Litwin, M.; Oh, J.; Pape, L.; Buecher, A.; Peruzzi, L.; Shenoy, M.; Testa, S.; Weber, L. T.; Zieg, J.; Hoecker, B.; Fichtner, A.; Toenshoff, B.
Background: Data on age-related differences in rejection rates, infectious episodes and tacrolimus exposure in pediatric kidney transplant recipients (pKTR) on a uniform tacrolimus-based immunosuppressive regimen are scarce. Methods: We therefore performed a large-scale analysis of 802 pKTR from the CERTAIN registry from 40 centers in 14 countries. Inclusion criteria were a tacrolimus-based immunosuppressive regimen and at least two years of follow-up. The patient population was divided into three age groups (infants <6 years, school-aged children 6-12 years, and adolescents >12 years) to assess age-related differences in outcome. Results: Median follow-up was 48 months (IQR, 36-72). Within the first 2 years post-transplant, infants had a significantly higher incidence of infections (80.6% vs. 55.0% in adolescents, P<0.001) and a significantly higher number of cumulative hospital days (median 13 days vs. 7 days in adolescents, P<0.001). Adolescents had a significantly higher rate of biopsy-proven acute rejection episodes in the first year post-transplant (21.7%) than infants (12.6%, P=0.007). Infants had significantly lower tacrolimus trough levels, lower concentration-to-dose ratios as an approximation for higher tacrolimus clearance, and higher intra-patient variability (all P<0.01) than adolescents. Conclusions: This largest study to date of European pKTR on a tacrolimus-based immunosuppressive regimen shows important age-related differences in rejection rates, infection episodes, and tacrolimus exposure and clearance. These data suggest that immunosuppressive therapy in pKTR should be tailored according to the age-specific risk profiles of this heterogeneous patient population.
Poppelaars, F.; Eskandari, S. K.; Damman, J.; Seelen, M. A.; Faria, B.; Gaya da Costa, M.
Background: Despite current matching efforts to identify optimal donor-recipient pairs in kidney transplantation, alloimmunity remains a major driver of late transplant failure. While kidney allocation based on human leukocyte antigen (HLA) matching has markedly prolonged short-term graft survival, new data suggest that additional genetic parameters in donor-recipient matching could help improve long-term outcomes. Here, we studied the impact of a recently discovered non-muscle myosin heavy chain 9 gene (MYH9) polymorphism on kidney allograft failure. Methods: We conducted a prospective observational cohort study, analyzing the DNA of 1,271 kidney donor-recipient transplant pairs from a single academic hospital for the MYH9 rs11089788 C>A polymorphism. The association of the MYH9 genotype with the risk of graft failure (primary outcome), biopsy-proven acute rejection (BPAR), and delayed graft function (DGF) (secondary outcomes) was determined. Results: The MYH9 polymorphism in the donor was not associated with 15-year death-censored kidney graft survival, whereas a trend was seen for an association between the MYH9 polymorphism in the recipient and graft failure (recessive model, P=0.056). The AA-genotype of the MYH9 polymorphism in recipients was associated with a higher risk of DGF (P=0.031) and BPAR (P=0.021), although significance was lost after adjustment for potential confounders (P=0.15 and P=0.10, respectively). The combined presence of the MYH9 polymorphism in donor-recipient pairs was significantly associated with long-term kidney allograft survival (P=0.036), in which recipients with an AA-genotype receiving a graft with an AA-genotype had the worst outcome. After adjustment for covariates, this combined genotype remained significantly associated with 15-year death-censored kidney graft survival (HR 1.68, 95% CI: 1.05-2.70, P=0.031). Conclusions: Our results reveal that recipients with an AA-genotype MYH9 polymorphism receiving a donor kidney with an AA-genotype have a significantly elevated risk of graft failure after kidney transplantation. Key points: (i) In recipients, the MYH9 SNP was associated with delayed graft function and biopsy-proven acute rejection after kidney transplantation, although significance was lost in multivariable analysis. (ii) Presence of the MYH9 variant in both the donor and recipient was significantly associated with long-term kidney allograft survival in multivariable analysis. (iii) Our present findings suggest that matching donor-recipient transplant pairs based on the MYH9 polymorphism may attenuate the risk of graft loss.
Hoyos-Domingo, A.; Ruiz-Lopez, F.; Garcia-Bueno, B.; de la Torre-Alamo, M. M.; Mateo, S. V.; Vidal-Correoso, D.; Guzman Martinez-Valls, P. L.; Lopez-Abad, A.; Garcia-Rivas, F.; Lopez-Cubillana, P.; Baroja-Mazo, A.
Background: Kidney transplantation is the preferred treatment for end-stage renal disease, but delayed graft function (DGF) remains a significant complication. Cold ischemia during organ preservation can lead to the release of danger-associated molecular patterns (DAMPs), which may influence graft outcomes. This study aimed to quantify DAMPs in kidney preservation fluid and assess their correlation with DGF. Methods: Preservation fluid samples from 88 deceased kidney donors were analyzed for various DAMPs, including mitochondrial DNA (mitDNA), cytochrome c, nucleosomes, hyaluronan, and inflammasome-related molecules (IL-18 and IL-1β). The influence of donor type (DBD vs. DCD) and cold ischemia time (CIT) on DAMP concentrations was evaluated. Additionally, the correlation between DAMP levels and DGF was assessed. Results: Multiple DAMPs were detected in preservation fluid, including mitDNA, cytochrome c, nucleosomes, and hyaluronan. The type of donation (DBD vs. DCD) had minimal impact on DAMP concentrations, except for HSP70, which was significantly higher in DCD donors. CIT positively correlated with hyaluronan and nucleosome levels. Cytochrome c emerged as a potential biomarker for DGF, showing a significant increase in patients with early dysfunction and correlating with post-transplant creatinine levels. Conclusions: Quantifying DAMPs in kidney preservation fluid is feasible and may provide valuable insights into graft quality and early post-transplant outcomes. Cytochrome c, in particular, shows promise as a biomarker for predicting delayed graft function. These findings highlight the importance of minimizing cold ischemia time and suggest that DAMP analysis could improve graft assessment prior to transplantation.
Batko, K.; Saczek, A.; Banaszkiewicz, M.; Krzanowski, M.; Małyszko, J.; Koc-Zorawska, E.; Zorawski, M.; Niezabitowska, K.; Siek, K.; Betkowska-Prokop, A.; Krzanowska, K.
Introduction: Limited tools exist for predicting kidney function in long-term kidney transplant recipients (KTRs). Elabela and apelin are APJ receptor agonists that constitute the apelinergic axis, a recently discovered system regulating vascular and cardiac tissue in opposition to renin-angiotensin-aldosterone. Methods: Longitudinal, observational cohort of 102 KTRs who maintained graft function ≥24 months, with no history of acute rejection and no current active or chronic infection. Serum apelin, elabela, fibroblast growth factor 23 (FGF-23) and α-Klotho were tested using enzyme-linked immunoassay and compared with a control group of 32 healthy volunteers. Results: Median (IQR) follow-up time was 83 (42, 85) months. Higher serum FGF-23 and elabela, but lower Klotho, concentrations were observed in KTRs. Most KTRs had stable trajectories of renal function. All candidate markers were significantly associated with mean two-year eGFR over follow-up, which itself was validated against dialysis requirement censored for death with a functioning graft. Using a cross-validation approach, we demonstrated eGFR at the initial visit to be the most salient predictor of future renal function. Machine learning models incorporating both clinical and biochemical (candidate marker) assessments were estimated to explain 15% of the variance in future eGFR when considering eGFR-independent predictions. Conclusions: Utilization of machine learning tools that incorporate clinical information and biochemical assessments, including serum markers of the apelinergic axis, may help stratify risk and aid decision-making in the care of long-term KTRs.
Fang, Y.; You, L.; Kimenai, H. J. A. N.; Chien, M.-P.; Dor, F.; de Bruin, R. W. F.; Minnee, R. C.
Background: Although living donor kidney transplantation (LDKT) generally achieves excellent outcomes, 5-12% of recipients experience early graft dysfunction, which is associated with poorer long-term survival. Current predictive tools rely mainly on clinical parameters and lack intraoperative applicability. Laser speckle contrast imaging (LSCI) enables real-time, non-contact assessment of renal microcirculation, and may provide complementary insight when integrated with machine learning (ML). Methods: In this prospective cohort study, we performed intraoperative LSCI measurement in 110 adult LDKT recipients at Erasmus Medical Center. Early graft function was assessed by estimated glomerular filtration rate (eGFR) at 1 week posttransplant, with patients classified as Group I (eGFR ≥ 30 mL/min/1.73 m²) and Group II (eGFR < 30 mL/min/1.73 m²). Two predefined feature sets were used for model development: (i) a Clinical Model (selected clinical variables) and (ii) a Combined Model (clinical + convolutional neural network [CNN]-derived LSCI features). Four ML algorithms (support vector machine [SVM], logistic regression, random forest [RF], and XGBoost) were trained using 5-fold cross-validation with Synthetic Minority Oversampling Technique (SMOTE) and evaluated on independent test sets across 30 repeated iterations. Results: Of 110 recipients, 15 (17%) had eGFR < 30 mL/min/1.73 m² at 1 week. Patients in Group II received kidneys from older donors with lower predonation eGFR, had higher BMI, more cardiovascular comorbidity, and greater intraoperative blood loss. The Combined Model consistently outperformed the Clinical Model across all algorithms. For example, SVM achieved higher accuracy (0.89 [95% CI, 0.85-0.92] vs. 0.79 [0.75-0.84]) and logistic regression yielded higher recall (0.86 [0.83-0.88] vs. 0.76 [0.74-0.79]). In independent test sets, Combined Models maintained better performance, with SVM achieving the highest F1 score (0.60 [0.50-0.71]) and RF achieving the highest recall (0.88 [0.50-1.00]). Grad-CAM visualizations confirmed that CNN-extracted features localized to physiologically relevant perfusion regions. LSCI also enabled real-time detection and correction of vascular complications in two cases. Conclusions: Integrating intraoperative LSCI features with clinical variables using ML significantly improved prediction of low one-week eGFR (< 30 mL/min/1.73 m²) in LDKT compared with clinical data alone. LSCI also enabled real-time detection of vascular complications, underscoring its role as both a predictive and an intraoperative guidance tool. Larger multicenter studies are warranted to validate its generalizability and explore applications in other transplant scenarios.
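SMOTE, used here against the heavy class imbalance (few low-eGFR recipients), synthesizes new minority-class points by interpolating between a minority sample and one of its nearest minority neighbours. A minimal pure-Python sketch of that core interpolation step, with hypothetical 2-feature vectors (real use would typically go through a library such as imbalanced-learn):

```python
import random

def smote_sample(minority, k=2, seed=0):
    """Synthesize one minority-class point: pick a base point, find its k
    nearest minority neighbours (squared Euclidean distance), and step a
    random fraction of the way towards one of them (the core SMOTE step)."""
    rng = random.Random(seed)
    base = rng.choice(minority)
    neighbours = sorted(
        (p for p in minority if p is not base),
        key=lambda p: sum((a - b) ** 2 for a, b in zip(base, p)),
    )[:k]
    target = rng.choice(neighbours)
    gap = rng.random()  # interpolation fraction in [0, 1)
    return tuple(a + gap * (b - a) for a, b in zip(base, target))

# Hypothetical scaled minority-class feature vectors:
minority = [(0.1, 0.2), (0.3, 0.4), (0.2, 0.9)]
print(smote_sample(minority))
```

Because each synthetic point lies on a segment between two real minority samples, oversampling is applied only inside the cross-validation training folds, never to the held-out test set, exactly as described in the methods above.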
Benning, L.; Akifova, A.; Oellerich, M.; Osmanodja, B.; Morath, C.; Beck, J.; Schuetz, A.; Bornemann-Kolatzki, K.; Schrezenmeier, E. V.; Tran, T. H.; Schwenger, V.; Shipkova, M.; Wieland, E.; Schuetz, E.; Budde, K.
Introduction: Donor-derived cell-free DNA (dd-cfDNA) is a standard-of-care biomarker in kidney transplantation, reported as a percentage of total cfDNA or as copies/mL. We hypothesized that combining both metrics into a continuous composite score would reduce false classifications, thus improving the accuracy of rejection diagnosis. Methods: We analyzed 443 dd-cfDNA measurements in 383 individual patients from five independent kidney transplant cohorts with both percentage and copies/mL available. A random discovery set of 31 biopsy-proven rejection and 29 non-rejection cases was used to derive a continuous composite score (CM-Score) integrating dd-cfDNA (%) and copies/mL. Fixed thresholds from the discovery group were validated in 75 rejections and 279 non-rejections for diagnostic performance and compared to twelve published cohorts. Results: The CM-Score outperformed dd-cfDNA percentage and copies/mL alone. At the predefined threshold (at 25% prevalence), the CM-Score retained a high NPV of 91% (89-93%) while significantly improving PPV to 80% (78-83%; P<0.002), compared to the published values (weighted average NPV: 88%, 89-90%; PPV: 51%, 50-53%; N=6,861). In addition, a decision curve analysis yielded a significantly higher net benefit for the CM-Score (P<0.003; prevalence 10-25% at thresholds 15-50%). This superior diagnostic performance can broaden the applicability of dd-cfDNA to real-world transplantation populations. Conclusions: The CM-Score is a robust and clinically meaningful tool that improves diagnostic accuracy for both ruling out and ruling in rejection using dd-cfDNA, and establishes itself as a superior stand-alone metric with clear potential to improve decision-making in kidney transplantation. Lay Summary (Clinical): Kidney transplant failure remains a major concern, and early detection of injury is crucial to prevent long-term damage. A less invasive alternative to a biopsy is a simple blood test that measures tiny fragments of DNA from the transplanted kidney, called donor-derived cell-free DNA (dd-cfDNA). These results can be reported either as a percentage of total DNA or as the number of DNA copies in the blood. We examined whether combining both measurements could improve the detection of rejection. Using data from five transplant cohorts, we developed a combined measure (CM-Score) and validated it in 279 samples without rejection and 75 with rejection. At an assumed rejection rate of 25%, it correctly identified rejection with a positive predictive value of 80% and ruled it out with a negative predictive value of 91%. Overall, the CM-Score may help doctors diagnose rejection more accurately and make timely treatment decisions.
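The reason the abstract quotes PPV and NPV "at 25% prevalence" is that predictive values, unlike sensitivity and specificity, depend on how common rejection is in the population; they follow from Bayes' rule. A small sketch of that conversion (the sensitivity/specificity values below are hypothetical, chosen only to illustrate the arithmetic, not taken from the CM-Score study):

```python
def predictive_values(sensitivity, specificity, prevalence):
    """PPV and NPV at a given prevalence, via Bayes' rule."""
    tp = sensitivity * prevalence              # true positives (per unit population)
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    fn = (1 - sensitivity) * prevalence        # false negatives
    tn = specificity * (1 - prevalence)        # true negatives
    return tp / (tp + fp), tn / (tn + fn)

# Hypothetical test: 80% sensitivity, 93% specificity, 25% rejection prevalence.
ppv, npv = predictive_values(0.80, 0.93, 0.25)
print(round(ppv, 2), round(npv, 2))  # → 0.79 0.93
```

The same hypothetical test applied in a 10%-prevalence population yields a much lower PPV with a similar NPV, which is why reporting predictive values without an assumed prevalence is uninterpretable.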
Mohsin, B.; Zabani, N.; Odah, N.; Yamani, F.; Hefni, L.; Alhowaiti, N.; Almutairi, A.; Kausar, M. T.; Butt, N. S.; Habhab, W.
Background: ABO-incompatible (ABOi) kidney transplantation is increasingly utilized to address donor shortages in end-stage kidney disease (ESKD) patients. However, the impact of immunosuppressive regimens on infection risk remains a concern. This study examines the spectrum of infections, associated risk factors, and their influence on graft outcomes over a 5-year period. Methods: A retrospective analysis of 24 adult ABOi kidney transplant recipients (2015-2019) was conducted, with follow-up until December 2024. Desensitization included rituximab, plasma exchange (PLEX), and IV immunoglobulin (IVIG). Infections were classified as bacterial, viral, fungal, or other opportunistic infections, and their associations with graft survival and rejection were assessed. Results: A total of 49 infectious episodes were recorded in 19 patients (79.2%); 5 patients had infection-free follow-up of more than 5 years. Urinary tract infections (UTIs) were most common (23/49), followed by COVID-19 (11/49) and Influenza A (7/49). No episode of fungal infection was observed. Infection incidence was highest in females (52.6%), diabetics (47.4%), and patients with prior rejection episodes (10.5%). Kaplan-Meier analysis showed significantly lower infection-free survival in patients with graft rejection (p=0.0086). Despite frequent infections, overall graft survival remained high (91.7%), with no direct statistical association between infections and rejection. Conclusion: Infections are prevalent in ABOi kidney transplant recipients, particularly in high-risk subgroups including females and patients with diabetes or prior graft rejection. However, long-term graft survival remains favorable, with no association between infections and graft rejection. Optimized immunosuppression and infection surveillance are crucial for improving patient outcomes. Larger multicenter studies are warranted to validate these findings.
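The infection-free survival comparison above rests on the Kaplan-Meier product-limit estimator, which multiplies the conditional probabilities of surviving each observed event time while properly handling censored follow-up. A minimal sketch with made-up data (times in months; event = 1 means infection observed, 0 means censored):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate.
    Returns (time, S(t)) at each distinct time where at least one event occurred."""
    curve, s = [], 1.0
    for t in sorted(set(times)):
        n = sum(1 for u in times if u >= t)                        # at risk just before t
        d = sum(1 for u, e in zip(times, events) if u == t and e)  # events at t
        if d:
            s *= 1 - d / n  # conditional survival through time t
            curve.append((t, s))
    return curve

# Hypothetical follow-up of five recipients (one censored at 3, one at 8):
print(kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0]))
```

Censored patients still count in the at-risk denominator until their censoring time, which is what distinguishes this estimate from a naive cumulative event fraction; the log-rank p-value reported in the abstract then compares two such curves.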
Dedinska, I.; Dadhania, D. M.; Li, C.; Hauser, N.; Lamba, P.; Lee, J. R.; Muthukumar, T.; Suthanthiran, M.
The long-term impact of SARS-CoV-2 infection on kidney allograft survival remains incompletely understood, particularly regarding the influence of vaccination, acute kidney injury (AKI), and post-infection immunosuppression. We conducted a retrospective analysis of 129 kidney transplant recipients with confirmed SARS-CoV-2 infection between March 2020 and March 2022, with a median follow-up of 50 months. Among the 129 recipients, 106 (82%) received vaccination at any time before or after SARS-CoV-2 infection, while 23 (18%) remained unvaccinated. Unvaccinated patients experienced significantly lower long-term graft survival (52% vs. 85%; p = 0.0004) and patient survival (83% vs. 99%; p = 0.0003) compared with vaccinated recipients. AKI occurred in 15% of recipients and independently predicted graft failure (aHR 2.88; p = 0.0341). Post-SARS-CoV-2 serum creatinine and albuminuria were strong prognostic markers of graft loss. Unvaccinated status independently predicted graft failure in both transplantation-anchored (aHR 2.80; p = 0.0342) and SARS-CoV-2-anchored models (aHR 5.31; p = 0.0004). Continuation of mycophenolate mofetil at post-infection assessment was associated with reduced graft-failure risk (aHR 0.99; p = 0.0193). These findings underscore the importance of sustained vaccination in preserving long-term allograft function.
Lovinfosse, P.; Bouquegneau, A.; Massart, A.; Pipeleers, L.; Bonvoisin, C.; Carp, L.; Everaert, H.; Jadoul, A.; Dendooven, A.; Geers, C.; Grosch, S.; Erpicum, P.; Hellemans, R.; Seidel, L.; Weekers, L.; Hustinx, R.; Jouret, F.
Background: Subclinical kidney allograft acute rejection (SCR) corresponds to "the unexpected histological evidence of acute rejection in a stable patient". The diagnosis of SCR relies on surveillance biopsy. Positron emission tomography/computed tomography (PET/CT) after injection of 18F-fluorodeoxyglucose ([18F]FDG) has been proposed as a non-invasive screening approach. In the present multicenter prospective study, we assess the diagnostic yield of [18F]FDG PET/CT to rule out SCR in stable kidney transplant recipients at 3 months post-transplantation. Methods: From 01/2021 to 03/2025, we prospectively combined surveillance biopsy and [18F]FDG PET/CT at ~3 months post-transplantation in adult kidney transplant recipients from 4 independent imaging centers. The mean standardized uptake value (mSUV) was measured in the kidney cortex and referenced as a ratio to psoas muscle mSUV (mSUVR). Results: Our multicentric 185-patient cohort was categorized according to the Banff 2022 classification: normal (n=158); borderline (n=18); SCR (n=9, including 6 T-cell-mediated rejection and 3 microvascular inflammation). No significant correlation was observed between the mSUVR and the ti score (R=0.032, p=0.67). The mSUVR reached 2.33 [1.97-2.93], 2.71 [2.50-3.33] and 2.42 [2.27-3.14] in the normal, borderline and SCR groups, respectively. In multivariate models stratified by center, the risk of non-normal histology (n=27, including borderline and SCR) increased with donor age (OR=1.05 [1.01-1.1], p=0.02) but not with the mSUVR (OR=4.11 [0.91-18.48], p=0.07). The risk of biopsy-proven SCR (n=9) was not significantly associated with the mSUVR. Conclusions: The mSUVR of [18F]FDG PET/CT does not reliably rule out SCR on surveillance biopsy.
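The mSUVR described in the Methods is a simple ratio of regional mean uptake values: mean cortical SUV divided by mean psoas-muscle SUV. A minimal sketch, assuming per-region SUV measurements are already extracted from the PET images (the function name and inputs are hypothetical):

```python
def msuvr(cortex_suvs, psoas_suvs):
    """Mean standardized uptake value ratio (mSUVR): mean SUV of the
    kidney cortex referenced to the mean SUV of the psoas muscle."""
    mean_cortex = sum(cortex_suvs) / len(cortex_suvs)
    mean_psoas = sum(psoas_suvs) / len(psoas_suvs)
    return mean_cortex / mean_psoas

# Toy example: cortical SUVs of 2.4 and 2.6 against a psoas mean of 1.0
ratio = msuvr([2.4, 2.6], [1.0, 1.0])  # -> 2.5
```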
Keslar, K. S.; Xu, W.; Zmijewska, A. A.; Margeta, D.; Bromberg, J. S.; Friedewald, J. J.; Abecassis, M. M.; Newell, K. A.; Morrison, Y.; Bridges, N. D.; Heeger, P. S.; Fairchild, R. L.
Non-invasive approaches to detect kidney graft injury and distinguish donor-specific immune-mediated injury from other causes of inflammation are needed to guide recipient therapy while avoiding the morbidities of transplant biopsies. We used a multiplex platform to interrogate RNA isolated from kidney transplant recipient urine and investigate gene expression patterns distinguishing grafts with no injury vs. ongoing injury, and further differentiating acute T cell-mediated rejection (TCMR) from BK virus nephropathy (BKVN). As a training set, we quantified expression of 796 immune function genes from 25 control recipients with stable graft function, 17 with biopsy-proven acute TCMR, and 13 with biopsy-proven BKVN. We identified a 20-gene signature that differentiated intragraft injury from grafts with stable function (area under the curve (AUC), 0.991) and a distinct 40-gene signature distinguishing acute TCMR from BKVN (AUC = 1.00). Validation in a separate set of 118 urine RNA samples obtained at the time of surveillance or for-cause biopsies from the Clinical Trials in Organ Transplantation (CTOT)-08 and CTOT-19 studies showed an AUC of 0.77 for the 20-gene injury signature and an AUC of 0.79 for the 40-gene signature. Our results highlight the utility of this flexible, non-invasive biomarker platform for rapid detection and differentiation of immune processes causing ongoing kidney graft injury.
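The AUC values above summarize how well each signature's score separates the two classes; AUC equals the probability that a randomly chosen positive sample scores higher than a randomly chosen negative one (the Mann-Whitney interpretation, with ties counted as 0.5). A minimal sketch with hypothetical labels and scores, not the study's classifier:

```python
def roc_auc(labels, scores):
    """ROC AUC via the Mann-Whitney U statistic: fraction of
    positive/negative pairs where the positive scores higher
    (ties contribute 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Toy example: perfectly separated scores give AUC = 1.0
auc = roc_auc([1, 1, 0, 0], [0.9, 0.8, 0.3, 0.4])
```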
Emara, M. M.; Elsedeiq, M.; Elmorshedi, M.; Neamatallah, H.; Abdelkhalek, M.; Yassen, A.; Nabhan, A.
Background: Management of COVID-19 in transplant patients is a major challenge. Data on immunosuppression management, clinical picture, and outcomes are lacking. Objectives: To summarize the current literature on COVID-19 in transplant patients, especially data regarding immunosuppression protocols, clinical presentation, and outcomes. Search strategy: A systematic search of the MEDLINE, EBSCO, CENTRAL, CINAHL, LitCovid, Web of Science, and Scopus electronic databases. The references of the relevant studies were also searched. The search was last updated on June 3, 2020. Selection criteria: Primary reports of solid organ transplant patients who developed COVID-19. An overlap of cases in different reports was checked. Data collection and analysis: A descriptive summary of immunosuppression therapy (before and after COVID-19), clinical presentation (symptoms, imaging, laboratory, and disease severity), management (oxygen therapy, antiviral, and antibacterial), major outcomes (intensive care admission, invasive mechanical ventilation, acute kidney injury), and mortality. Main results: We identified 74 studies reporting 823 cases of solid organ transplantation with COVID-19. Among 372 patients with available severity data, 114 (30.6%) had mild COVID-19, 101 (27.2%) moderate, and 157 (42.2%) severe or critical. Major outcomes included intensive care unit admission, invasive ventilation, and acute kidney injury, which occurred in 121 (14.7%), 97 (11.8%), and 63 (7.7%) of patients, respectively. Mortality was reported in 160 (19.4%) patients. Missing individual data hindered making clinical correlations. Conclusion: COVID-19 in solid organ transplant patients is probably associated with greater disease severity, worse major outcomes (intensive care admission, invasive ventilation, acute kidney injury), and higher mortality than in non-transplant patients.
Poppelaars, F.; Gaya da Costa, M.; Faria, B.; Eskandari, S. K.; Petr, V.; Holers, V. M.; Daha, M. R.; Berger, S. P.; Damman, J.; Seelen, M. A.; Thurman, J. M.
Background: Genetic analysis in transplantation offers potential for personalized medicine. Given the crucial role of the complement system in renal allograft injury, we investigated the impact of complement polymorphisms on long-term outcomes in kidney transplant pairs. Methods: In this observational cohort study, we analyzed polymorphisms in the C3 (C3 R102G), factor B (CFB R32Q), and factor H (CFH V62I) genes of 1,271 donor-recipient kidney transplant pairs and assessed their association with 15-year death-censored allograft survival. Results: Individually, only the presence of the CFB 32Q variant in the donor and its combined presence in donor-recipient pairs were associated with better graft survival (P=0.027 and P=0.045, respectively). In the combined analysis, the C3 R102G, CFB R32Q, and CFH V62I variants in the donor independently associated with the risk of graft loss (HR 1.32; 95% CI, 1.08-1.58; P=0.005). Thus, donor kidneys carrying the genetic variants that promote the highest complement activity exhibited the worst graft survival, whereas those with the genetic variants causing the lowest complement activity showed the best graft survival (15-year death-censored allograft survival: 48.8% vs 87.8%, P=0.001). Conclusion: Our study demonstrates that the combination of complement polymorphisms in the donor strongly associates with long-term allograft survival following kidney transplantation. These findings hold significance for therapeutic strategies involving complement inhibition in kidney transplantation.
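The combined analysis effectively counts how many complement-activity-raising variants a donor carries and relates that count to graft-loss risk. A purely illustrative sketch of such a variant count: the per-variant hazard-ratio scaling (HR of 1.32 raised to the variant count) is an assumption for illustration, not the study's actual Cox model:

```python
def donor_complement_risk(variants, hr_per_variant=1.32):
    """Count a donor's high-complement-activity variants among
    C3 R102G, CFB R32Q and CFH V62I, and scale a hypothetical
    per-variant hazard ratio (multiplicative assumption for
    illustration only).

    variants: dict like {"C3R102G": True, "CFBR32Q": False, ...}
    Returns (variant count, illustrative combined hazard ratio).
    """
    n = sum(bool(present) for present in variants.values())
    return n, hr_per_variant ** n

# Toy donor carrying two of the three variants
count, combined_hr = donor_complement_risk(
    {"C3R102G": True, "CFBR32Q": True, "CFHV62I": False}
)
```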
Mamber Czeresnia, J.; Tsai, H.; Ajaimy, M.; Tow, C. Y.; Patel, S. R.; Jorde, U. P.; Madan, S.; Hemmige, V.
The COVID-19 pandemic has reduced access to solid organ transplantation, compounding organ shortages and waitlist mortality. A continued area of uncertainty is the safety of transplanting organs recovered from SARS-CoV-2-infected donors, as autopsies of patients who died with COVID-19 show that the virus can be found in extra-pulmonary organs [1]. Case reports and series on transplantation of these organs have been published [2, 3], but population-level data are lacking. We queried a national transplant database for recipients of organs recovered from donors recently infected by SARS-CoV-2. For organs with more than 50 cases, recipients were propensity-score matched at a ratio of 1:10 to similar recipients of organs recovered from donors who tested negative for SARS-CoV-2 (controls). Data were extracted from the Scientific Registry of Transplant Recipients (SRTR, v2203, updated March 2022), which collects detailed information on all solid organ transplants in the United States since 1986. Cases were defined as adult (≥18 years) recipients of organs recovered from deceased donors who tested positive for SARS-CoV-2 by nasopharyngeal or lower respiratory sample polymerase chain reaction or antigen assay within 7 days of organ transplantation. Multiple organ transplants were excluded. There were 775 kidney, 330 liver, 123 heart, 44 kidney-pancreas, 16 lung, 5 pancreas, and 3 small bowel transplants of organs recovered from 393 deceased donors recently infected by SARS-CoV-2. For kidney, liver, and heart transplants, Kaplan-Meier curves of both overall and graft survival at 90 days were similar between cases and controls. Our data show that transplanting kidneys, livers, and hearts recovered from deceased donors recently infected by SARS-CoV-2 was not associated with increased recipient mortality or worse graft survival.
This should help transplant providers make decisions regarding acceptance of these organs and counsel transplant candidates on the safety of receiving them. The limited number of kidney-pancreas, lung, pancreas, and intestinal cases precludes significant conclusions for these organs. Limitations include the lack of data on the donor infection timeline, estimates of viral load (PCR cycle thresholds), donor COVID-19 symptomatology at organ procurement, and donor or recipient vaccination or prior COVID-19 infection status, none of which are tracked in the database. We also did not have information regarding transmission of COVID-19 to transplant recipients; future analysis of updated versions of the database should help address this. Our data strongly support the notion that donors with recent COVID-19 infection should not be automatically excluded from the donor pool. Prospective studies are needed to confirm our findings and provide insights on optimal post-transplant management of these recipients.
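The 1:10 propensity-score matching described in this study can be illustrated with a greedy nearest-neighbor matcher operating on precomputed propensity scores. This is a didactic sketch: the caliper value and function names are hypothetical, and the study's exact matching algorithm is not specified in the abstract:

```python
def greedy_match(cases, controls, ratio=10, caliper=0.05):
    """Greedy 1:ratio nearest-neighbor matching on a precomputed
    propensity score, without replacement.

    cases, controls : lists of (id, propensity_score) tuples
    caliper         : maximum allowed score difference (hypothetical)
    Returns {case_id: [matched control ids]}.
    """
    available = dict(controls)  # control_id -> propensity score
    matches = {}
    # Match cases in score order so early picks don't exhaust one region
    for case_id, ps in sorted(cases, key=lambda c: c[1]):
        nearest = sorted(available.items(), key=lambda kv: abs(kv[1] - ps))
        chosen = [cid for cid, s in nearest if abs(s - ps) <= caliper][:ratio]
        for cid in chosen:
            del available[cid]  # matching without replacement
        matches[case_id] = chosen
    return matches

# Toy example: one case matched 1:2 against three candidate controls
m = greedy_match([("A", 0.5)],
                 [("c1", 0.51), ("c2", 0.49), ("c3", 0.7)],
                 ratio=2)
```

In practice the propensity scores themselves would come from a logistic regression of case status on recipient and donor covariates.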