Gigabyte
● GigaScience Press
Preprints posted in the last 7 days, ranked by how well they match Gigabyte's content profile, based on 60 papers previously published here. The average preprint has a 0.06% match score for this journal, so anything above that is already an above-average fit.
Valestrino, K. J.; Ihediwa, C. V.; Dorius, G. T.; Conger, A. M.; Glinka-Przybysz, A.; McCormick, Z. L.; Fogarty, A. E.; Mahan, M. A.; Hernandez-Bello, J.; Konrad, P. E.; Burnham, T. R.; Dalrymple, A. N.
Objectives: Epidural spinal cord stimulation (SCS) is an emerging therapy for motor rehabilitation following spinal cord injury (SCI) and other motor disorders. Conventionally, SCS leads are placed along the dorsal spinal cord (SCSD), where stimulation activates large diameter afferent fibers, which indirectly activate motoneurons through reflex pathways. This leads to broad activation of flexor and extensor muscles and limited fine-tuned control of motor output. Targeting the ventral spinal cord (SCSV) may enable more direct activation of motoneuron pools, potentially improving the specificity of muscle activation; however, there is currently no established method to place leads ventrally. To address this, we evaluated the feasibility of four modified percutaneous implantation techniques to target the ventrolateral thoracolumbar spinal cord. Materials and methods: Percutaneous SCSV implantation was performed in three human cadaver torso specimens under fluoroscopic guidance. The following approaches were evaluated: sacral hiatus, transforaminal, interlaminar contralateral, and interlaminar ipsilateral. The leads in the latter 3 approaches were inserted between L1 and L5. Eighteen implants were attempted, with nine leads retained for analysis. Lead and electrode position were assessed using computed tomography (CT) with three-dimensional reconstruction, along with anatomical dissection to verify lead and electrode placement within the epidural space. Results: Successful ventral epidural lead placement was achieved using all four implantation approaches. The sacral hiatus (16/16 electrodes) and transforaminal (8/8 electrodes) approaches resulted in exclusively ventrolateral placement. The interlaminar contralateral approach led to 27/32 electrodes positioned ventrolaterally and 5/32 dorsally. The interlaminar ipsilateral implantation approach led to 14/32 electrodes positioned ventrolaterally and 18/32 positioned ventromedially.
Conclusions: These findings demonstrate that ventral epidural SCS lead placement can be achieved using modified percutaneous implant techniques. The four approaches outlined here provide a clinically feasible pathway to SCSV and establish a foundation for future clinical studies investigating SCSV for motor rehabilitation following SCI.
Sarwin, G.; Ricciuti, V.; Staartjes, V. E.; Carretta, A.; Daher, N.; Li, Z.; Regli, L.; Mazzatenta, D.; Zoli, M.; Seungjun, R.; Konukoglu, E.; Serra, C.
Background and Objectives: We report the first intraoperative deployment of a real-time machine vision system in neurosurgery, derived from our previous anatomical detection work, automatically identifying structures during endoscopic endonasal surgery. Existing systems demonstrate promising performance in offline anatomical recognition, yet so far none have been implemented during live operations. Methods: A real-time anatomy detection model was trained using the YOLOv8 architecture (Ultralytics). Following training completion in the PyTorch environment, the model was exported to ONNX format and further optimized using the NVIDIA TensorRT engine. Deployment was carried out using the NVIDIA Holoscan SDK; the system ran on an NVIDIA Clara AGX developer kit. We used the model for real-time recognition of intraoperative anatomical structures and compared its output with the same video labelled manually as reference. Model performance was reported using the average precision at an intersection-over-union threshold of 0.5 (AP50). Furthermore, end-to-end delay from frame acquisition to the display of the annotated output was measured. Results: A mean AP50 of 0.56 was achieved. The model demonstrated reliable detection of the most relevant landmarks in the transsphenoidal corridor. The mean end-to-end latency of the model was 47.81 ms (median 46.57 ms). Conclusion: For the first time, we demonstrate that clinical-grade, real-time machine-vision assistance during neurosurgery is feasible and can provide continuous, automated anatomical guidance from the surgical field. This approach may enhance intraoperative orientation, reduce cognitive load, and offer a powerful tool for surgical training. These findings represent an initial step toward integrating real-time AI support into routine neurosurgical workflows.
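The end-to-end delay reported in this abstract (frame acquisition to display of the annotated output) can be measured with simple per-frame wall-clock timing. A minimal sketch follows; the frame source and the processing callable are stand-ins for illustration, not the study's Holoscan pipeline.

```python
# Sketch of per-frame end-to-end latency measurement: timestamp at frame
# acquisition, process (detect + annotate + display stand-in), then record the
# elapsed time in milliseconds. Mean and median are reported as in the abstract.
import time
import statistics

def measure_latency(frames, process):
    """Return (mean, median) end-to-end delay in ms for a processing callable."""
    delays_ms = []
    for frame in frames:
        t0 = time.perf_counter()                      # frame acquisition time
        process(frame)                                # stand-in for the pipeline
        delays_ms.append((time.perf_counter() - t0) * 1000.0)
    return statistics.mean(delays_ms), statistics.median(delays_ms)

# Illustrative run over 100 dummy "frames" with a dummy workload
mean_ms, median_ms = measure_latency(range(100), lambda f: sum(range(1000)))
```

In a real deployment the `process` callable would wrap inference and rendering, so the measured delay includes every stage from capture to display, as described in the Methods.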
Morrissey, D.; Sharif, F.; Fearon, A.; Neal, B. S.; Bremer, T.; Swinton, P.; Newman, P.; Lack, S.; Cooper, K.; Rabello, R.; D2P Group,
Introduction: Musculoskeletal conditions have high, and increasing, incidence and prevalence. Although there are many clinical guidelines available for common conditions, most are poor quality and sparsely adopted into practice. We aim to improve patient outcomes by developing robust Best Practice Guidelines (BPG) to get research findings into practice for a range of common musculoskeletal conditions. Methods and analysis: Mixed methods with systematic review of high-quality studies and qualitative elicitation of both patients' perspectives and expert clinical reasoning through in-depth interviews will form the basis for the BPGs. A segregated convergent synthesis, informed throughout by stakeholder engagement, will guide the format and structure of the BPGs. Ethics, outputs and dissemination: Ethical approval for the qualitative studies and implementation events will be obtained from university and health service research ethics committees. Educational packages for each BPG condition will be hosted online and be available for students, clinicians, and education providers. Dissemination will follow traditional routes including publications and presentations, alongside innovative approaches such as collaboration with higher education institutions, online hosting, adoption by professional bodies, and a social media campaign. Implementation will occur adaptively in multiple national contexts to reflect local requirements and resources, deploying participatory and implementation methods that are contextually and culturally appropriate. Key messages:
- What is already known on this topic: Clinical guidelines for the management of musculoskeletal conditions are common, but have limitations regarding quality, applicability, editorial independence, and patient perspective. They are rarely adopted into clinical practice.
- What this study adds: We have developed a robust (supported by Patient and Participant Involvement) mixed-methods approach that integrates the three components of evidence-based medicine: synthesis of high-quality evidence, patients' perspectives/values, and expert clinical reasoning. We have also developed an education, dissemination, and implementation approach to facilitate international adoption of these guidelines.
- How this study might affect research, practice or policy: The guideline development methods will integrate the three pillars of evidence-based practice and ensure they are robust and clinically applicable. Creation of educational material combined with an implementation and dissemination plan will support adoption into clinical practice in different countries and cultures, designed to lead to improved patient outcomes.
Seckin, E.; Colinet, D.; Bailly-Bechet, M.; Seassau, A.; Bottini, S.; Sarti, E.; Danchin, E. G.
Orphan genes, lacking homologs in other species, are systematically found across genomes. Their presence may result from extensive divergence from pre-existing genes or from de novo gene birth, which occurs when a gene emerges from a previously non-genic region. In this study, we identified orphan genes in the genomes of globally distributed plant-parasitic nematodes of the genus Meloidogyne and investigated their origins, evolution, and characteristics. Using a comparative genomics framework across 85 nematode species, we found that 18% of Meloidogyne genes are genus-specific, transcriptionally supported orphans. By combining ancestral sequence reconstruction and synteny-based approaches, we inferred that 20% of these orphan genes originated through high divergence, while 18% likely emerged de novo. Proteomic and translatomic evidence confirmed the translation of a subset of these genes, and feature analyses revealed distinctive molecular signatures, including shorter length, signal peptide enrichment, and a tendency for extracellular localization. These findings highlight orphan genes as a substantial and previously underexplored component of the Meloidogyne genome, with potential roles in their worldwide parasitism.
Yamasaki, F.; Seike, M.; Hirota, T.; Sato, T.
Background: Deep brain stimulation (DBS) is a treatment option for Parkinson disease (PD). However, the effect of DBS on the arterial pressure (AP) remains unexplored. We aimed to develop an artificial baroreflex system for treating orthostatic hypotension (OH) due to central baroreflex failure in patients with PD. To achieve this, we developed an appropriate algorithm after estimating the dynamic responses of the AP to DBS using a white noise system identification method. Methods: We randomly performed DBS while measuring the AP tonometrically in 3 trials involving 3 patients with PD treated with DBS. We calculated the frequency response of the AP to the DBS using a fast Fourier transform algorithm. Finally, the feedback correction factors were determined via numerical simulation. Results: The frequency responses of the systolic AP to random DBS were identifiable in all 3 trials, and the steady state gain was 8.24 mmHg/STM. Based on these results, the proportional correction factor was set to 0.12, and the integral correction factor was set to 0.018. The computer simulation revealed that the system could quickly and effectively attenuate a sudden AP drop induced by external disturbances such as head-up tilting. Conclusion: An artificial baroreflex system with DBS may be a novel therapeutic approach for OH caused by central baroreflex failure.
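The feedback correction factors reported here (proportional 0.12, integral 0.018, steady-state gain 8.24 mmHg per stimulation unit) can be exercised in a toy simulation of the described artificial baroreflex. This is a minimal sketch, not the study's numerical simulation: the first-order plant, its time constant, and the sampling interval are illustrative assumptions; only the gains come from the abstract.

```python
# Toy PI feedback loop: a sudden pressure drop (e.g. head-up tilt) is
# corrected by DBS commands computed from the pressure error. Plant dynamics
# (first-order, tau=20 s, dt=1 s) are assumptions for illustration.

def simulate_pi_baroreflex(target=120.0, disturbance=-20.0, steps=600,
                           dt=1.0, tau=20.0, gain=8.24,
                           kp=0.12, ki=0.018):
    """Simulate systolic AP (mmHg) after a sudden drop under PI-controlled DBS."""
    ap = target + disturbance      # pressure immediately after the disturbance
    integral = 0.0
    for _ in range(steps):
        error = target - ap
        integral += error * dt
        stim = kp * error + ki * integral            # PI control law
        # first-order plant: AP relaxes toward (disturbed baseline + gain*stim)
        ap += dt / tau * ((target + disturbance + gain * stim) - ap)
    return ap

final_ap = simulate_pi_baroreflex()   # approaches the 120 mmHg target
```

With the integral term active the steady-state error is driven to zero, which is why an integral correction factor is needed at all: a purely proportional controller would leave a residual offset after the disturbance.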
Farre, R.; Salama, R.; Rodriguez-Lazaro, M. A.; Kiarostami, K.; Fernandez-Barat, L.; Oliveira, V. D. C.; Torres, A.; Farre, N.; Dinh-Xuan, A. T.; Gozal, D.; Otero, J.
Background: The COVID-19 pandemic exposed critical shortages of mechanical ventilators, particularly in low-resource settings. Disruptions in global supply chains and dependence on specialized components highlighted the need for scalable, locally manufactured alternatives for emergency respiratory support. Aim: To describe and evaluate a simplified, supply-chain-independent mechanical ventilator assembled from widely available automotive and simple hardware components, and intended as a last-resort solution. Methods: The ventilator is based on a reciprocating air pump driven by an automotive windshield wiper motor coupled to parallel shaft bellows and readily assembled passive membrane valves, only requiring materials available from standard hardware retailers, minimal tools, and basic manual skills. Ventilator performance was assessed through bench testing using a patient model simulating severe lung disease in adult (R=20 cmH2O·s/L, C=15 mL/cmH2O) and pediatric (R=50 cmH2O·s/L, C=10 mL/cmH2O) patients. A realistic proof of concept was performed in four mechanically ventilated 50-kg pigs. Results: The device delivered tidal volumes up to 600 mL and respiratory rates up to 45 breaths/min with PEEP up to 10 cmH2O, covering pediatric and adult ventilation ranges. In vivo testing showed that the ventilator maintained arterial blood gases within the targeted range. Technical details for ventilator construction are provided in an open-source video tutorial. Discussion: This low-cost ventilator demonstrated adequate performance under demanding conditions. Although not a substitute for commercial intensive care ventilators, its simplicity, autonomy, and independence from fragile supply chains provide a potentially life-saving option in resource-constrained emergency scenarios.
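The single-compartment patient model used in such bench tests implies a peak airway pressure of P = PEEP + R·Q + Vt/C under constant inspiratory flow. A short sketch using the abstract's adult values (R=20 cmH2O·s/L, C=15 mL/cmH2O, Vt=600 mL, PEEP 10 cmH2O); the respiratory rate and 1:2 I:E ratio below are illustrative assumptions, not figures from the study.

```python
# Peak-pressure estimate for a single-compartment lung model driven by
# constant inspiratory flow: resistive term R*Q plus elastic term Vt/C on
# top of PEEP. R, C, Vt, PEEP come from the abstract; rr and ie_ratio are
# assumed for illustration.

def peak_pressure(vt_ml, r_cmh2o_s_l, c_ml_cmh2o, peep=10.0, rr=20, ie_ratio=2.0):
    """Peak airway pressure (cmH2O) for constant-flow inspiration."""
    t_insp = 60.0 / rr / (1.0 + ie_ratio)       # inspiratory time (s)
    flow_l_s = (vt_ml / 1000.0) / t_insp        # constant inspiratory flow (L/s)
    resistive = r_cmh2o_s_l * flow_l_s          # pressure across the resistance
    elastic = vt_ml / c_ml_cmh2o                # pressure across the compliance
    return peep + resistive + elastic

# Adult severe-lung-disease model: 1 s inspiration -> 0.6 L/s flow,
# 12 cmH2O resistive + 40 cmH2O elastic + 10 cmH2O PEEP
adult_peak = peak_pressure(600, 20, 15)   # -> 62.0 cmH2O
```

The elastic term (Vt/C = 40 cmH2O here) dominates for a stiff lung, which is why these R/C combinations make for a demanding bench scenario.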
Sherwani, M.; Azhar, M. K.; Khan, S.; Ali, D.; Husain, S.; Khan, A.
Introduction: Comparison of rectal cancer characteristics in Pakistani Americans and native Pakistanis remains poorly investigated, as migrant studies have predominantly concentrated on East and Southeast Asian groups. This research aims to compare clinicopathological characteristics between the two groups. We hypothesize that significant differences will exist between these cohorts, mediated by gene-environment interactions. Methods: This was a retrospective cohort study utilizing two multi-institutional databases to identify adult patients with rectal cancer: the National Cancer Database in the U.S. (2018-2022) and the Rectal Cancer Surgery and Epidemiology Study in Pakistan (2020-2021). Non-Hispanic Whites (NHWs) were included as a reference population for comparative analysis. Clinicopathological characteristics were compared using Wilcoxon rank-sum and chi-square tests. Results: A total of 523 Pakistani Americans and 608 native Pakistanis were included in the study. The median age at diagnosis was 57 years (IQR 48-68) in Pakistani Americans, 42 years (IQR 33-54) in native Pakistanis, and 63 years (IQR 54-73) in NHWs (p < 0.001). Native Pakistanis presented with early-stage disease less often than Pakistani Americans and NHWs (5.3%, 25.1%, and 20.5%, respectively; p < 0.001) and had markedly higher rates of signet cell carcinoma (20.1%, 0.6%, and 0.4%, respectively; p < 0.001) and poorly differentiated tumors (29.0%, 10.4%, and 11.4%, respectively; p < 0.001). Conclusions: This study found that native Pakistanis with rectal cancer presented at a younger age and with more aggressive tumor characteristics compared to both Pakistani Americans and NHWs. Notably, Pakistani Americans displayed a distinct clinical profile, intermediate between both groups.
Christiana, K. A.; Anselme, M.; Juliette, T.-D.; Aristote Wendpanga, D. N.; Boukary, D.; Issouf, K.; Samuel, K. D.; Lydie, T. Y.; Madi, K.; Abdoulaye, O.; Madi, S.; Sanata, B.; Jacques, Z.; Therese, K.; Abdoul-Salam, O.; Baptiste, A. J.; Macaire, O.; Pascal, N.
Social stigma surrounding chronic skin ulcers leads patients to hide their wounds or delay seeking medical care. The aim of this study was to explore the types and causes of chronic skin ulcers among patients seen in the dermatology departments of two university hospitals in Burkina Faso. This was a cross-sectional, retrospective study covering an 11-year period, from 2013 to 2023. A review of consultation records allowed for the collection of sociodemographic and clinical data from 104 patients seen for chronic skin ulcers over the 11-year period, averaging 9 patients per year. The patients were primarily adults (n=60) and older adults (n=21). Leg ulcers were the condition observed in most patients (n=59). Eight cases of Buruli ulcer (7.69%) were identified among the 104 patients. Five of the eight cases (62.50%) were aged between 0 and 19 years. Half of the eight patients resided in Ouagadougou. These results highlight low utilization of dermatology services for chronic skin ulcers. Furthermore, indigenous cases of Buruli ulcer have been identified in Burkina Faso. Consequently, our findings call for the implementation of strategies focused on addressing social perceptions of these ulcers and on the screening and management of this disease.
Miao, H.; LeBoutillier, B.; Lantis, J. C.; Fife, C.
Objective: To evaluate the real-world effectiveness of Intact Fish Skin Graft (IFSG) compared with standard of care (SOC) in the treatment of Stage 3-4 pressure ulcers, using clinically meaningful outcomes including wound healing rate and percent area reduction (PAR). Materials and Methods: A retrospective matched cohort study was conducted using deidentified electronic health record (EHR) data from the U.S. Wound Registry. Patients with Stage 3-4 pressure ulcers treated with IFSG (n=40) were compared to a matched SOC control group (n=40). Covariate matching (1:1) was performed to reduce confounding across key patient and wound characteristics, including age, mobility status, comorbidities (e.g., diabetes, peripheral artery disease), and wound features (age, size, location, and depth). Outcomes included healed status, healed or improved rate, and PAR. Results: The study population represented a high-risk, real-world cohort (n=40 per group), with only 37.5% ambulatory patients and a high prevalence of multiple concurrent wounds. IFSG treatment demonstrated superior clinical outcomes compared to SOC:
- Healed or improved: 67.5% (IFSG) vs 55.0% (SOC) (p=0.0379)
- Healed: 45.5% (IFSG) vs 33.3% (SOC)
- Percent area reduction: 49% (IFSG) vs 34% (SOC) (p=0.0028)
These findings indicate statistically significant improvements in percent area reduction and in the proportion of wounds that were healed or improved with IFSG. The proportion achieving complete healing was numerically higher with IFSG than with SOC, but this difference did not reach statistical significance. Conclusion: In this real-world matched cohort analysis, Intact Fish Skin Graft demonstrated superior effectiveness compared to standard of care in the management of Stage 3-4 pressure ulcers, with improvements in healing-related outcomes and percent area reduction. These results support the use of IFSG as an effective advanced therapy for hard-to-heal pressure ulcers.
Nguyen, T. T. T.; Nguyen, V. L.; Vo, N. N. Y.; Nguyen, H. C. D.; Nguyen, H. T. T.
Background: Type 2 diabetes mellitus (T2DM) is a chronic disease that imposes a significant burden on healthcare systems and society. In Vietnam, the prevalence of T2DM is rapidly increasing; however, evidence on treatment expenditure derived from large administrative databases remains limited. This study provides an overview of total treatment expenditures for T2DM across hospital tiers between 2018 and 2022. Methods: This cross-sectional descriptive study utilized retrospective health insurance (HI) data from 2018-2022. Data were collected and analyzed based on cost components (medications, diagnostic tests, hospital beds, etc.) across healthcare facilities classified by hospital level. Costs were converted to 2024 USD using the CCEMG-EPPI-Centre cost converter. Results: Total expenditure increased from 227.17 million USD in 2018 to 425.53 million USD in 2022, with spending concentrated in Class I and Class II healthcare facilities, although their shares declined over time, while the proportions attributed to unclassified and special-class facilities increased. Drugs accounted for the largest share of expenditure (49.65%-78.95%), followed by laboratory tests (7.31%-19.89%) across all hospital classifications. Other components, including hospital beds, diagnostic imaging, procedures/surgeries, and medical supplies, contributed smaller proportions but increased over time in several facility groups. Conclusion: The study indicates that medication costs constitute the largest share of treatment expenditure for type 2 diabetes mellitus at healthcare facilities, reflecting the long-term treatment requirements of this chronic disease. In addition, health expenditure remained concentrated in Class I and Class II healthcare facilities, although their shares declined over the study period, while the proportions attributed to unclassified and special-class facilities increased.
These findings suggest the need to strengthen diabetes screening, treatment, and follow-up at lower-level healthcare facilities in order to reduce the burden on higher-level hospitals and improve the efficiency of healthcare resource allocation.
Hoque, A.; Rahman, M.; Basak, S. K.; Mamun, A. A.
Background: In the absence of structured donor registries, social media platforms have become a dominant mechanism for blood donor recruitment in many low-resource settings. However, the implications of this shift for transfusion timeliness and system reliability remain unclear. Objective: To evaluate the impact of social media-sourced donors on transfusion delay, donor reliability, and hemovigilance-related outcomes compared with conventional donor pathways. Methods: This prospective analytical study included 400 transfusion episodes across tertiary hospitals in Bangladesh. Donor sources were categorized as social media (SM) or conventional (CON). The primary outcome was delay-to-transfusion. Secondary outcomes included donor-related irregularities, documentation completeness, near-miss events, and acute transfusion reactions. Multivariable logistic regression identified predictors of delay ≥4 hours. Results: Social media-sourced donors were associated with significantly longer transfusion delays (5.98 vs 2.97 hours; p<0.001). Delay ≥4 hours occurred in 83.6% of SM cases versus 17.6% of CON cases (OR 23.78). Donor-related irregularities were observed in 85% of SM episodes and absent in CON donors. Safety outcomes did not differ significantly between groups. Social media donor sourcing remained the strongest independent predictor of delay (adjusted OR 18.09). Conclusion: Unregulated social media-based donor recruitment introduces substantial delays and undermines system reliability without improving access. Integration of digital tools into regulated donor systems is essential to strengthen transfusion timeliness and hemovigilance in resource-limited settings.
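The unadjusted odds ratio reported above can be sanity-checked from the two delay proportions alone. A short sketch; because the abstract's percentages are rounded, this reproduces the reported OR of 23.78 (computed from raw counts) only approximately.

```python
# Odds ratio from two event proportions: odds = p / (1 - p) in each group,
# then the ratio of the two odds. Inputs are the abstract's delay >=4 h rates.

def odds_ratio(p_exposed, p_unexposed):
    """Unadjusted odds ratio from two event proportions."""
    odds_exp = p_exposed / (1.0 - p_exposed)
    odds_unexp = p_unexposed / (1.0 - p_unexposed)
    return odds_exp / odds_unexp

# 83.6% of social-media episodes vs 17.6% of conventional episodes delayed
or_est = odds_ratio(0.836, 0.176)   # ~23.9, close to the reported 23.78
```

The small discrepancy from 23.78 is rounding in the published percentages, not a different formula.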
Alqaderi, H.; Kapadia, U.; Brahmbhatt, Y.; Papathanasiou, A.; Rodgers, D.; Arsenault, P.; Cardarelli, J.; Zavras, A.; Li, H.
Background: Dental caries and periodontal disease represent the most prevalent global oral health conditions, collectively affecting several billion people. The diagnostic interpretation of dental radiographs, a cornerstone of modern dentistry, is associated with considerable inter-observer variability. In routine clinical practice, clinicians are required to evaluate a high volume of radiographic images daily, a cognitively demanding task in which diagnostic fatigue, time constraints, and the inherent complexity of overlapping anatomical structures can lead to the inadvertent oversight of early-stage pathologies. Artificial intelligence (AI) offers a transformative opportunity to augment clinical decision-making by providing rapid, objective, and consistent radiographic analysis, thereby serving as a tireless adjunct capable of flagging findings that may be missed during routine human inspection. Methods: This study developed and validated a deep learning system for the automated detection of dental caries and alveolar bone loss using a dataset of 1,063 periapical and bitewing radiographs. Two separate YOLOv8s object detection models were trained and evaluated using a rigorous 5-fold cross-validation methodology. To align with the clinical use-case of a screening tool where high sensitivity is paramount, a custom image-level evaluation criterion was employed: a true positive was recorded if any predicted bounding box had a Jaccard Index (IoU) > 0 with any ground truth annotation. Model performance was systematically evaluated at confidence thresholds of 0.10 and 0.05. Results: At a confidence threshold of 0.05, the caries detection model achieved a mean precision of 84.41% (±0.72%), recall of 85.97% (±4.72%), and an F1-score of 85.13% (±2.61%). The alveolar bone loss model demonstrated exceptionally high performance, with a mean precision of 95.47% (±0.94%), recall of 98.60% (±0.49%), and an F1-score of 97.00% (±0.46%).
Conclusion: The YOLOv8-based models demonstrated high accuracy and high sensitivity for detecting dental caries and alveolar bone loss on periapical radiographs. The system shows significant potential as a reliable automated assistant for dental practitioners, helping to improve diagnostic consistency, reduce the risk of missed pathology, and ultimately enhance the standard of patient care.
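The custom image-level criterion in the Methods (a true positive if any predicted box has IoU > 0 with any ground-truth box) is straightforward to express in code. A minimal sketch; boxes are (x1, y1, x2, y2) tuples and the function names are ours, not from the study's code.

```python
# Image-level true-positive check: compute the Jaccard index (IoU) of every
# prediction/ground-truth pair and accept the image if any pair overlaps at all.

def iou(a, b):
    """Intersection-over-union (Jaccard index) of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

def image_true_positive(predictions, ground_truths):
    """True positive if any prediction has IoU > 0 with any ground truth."""
    return any(iou(p, g) > 0 for p in predictions for g in ground_truths)

hit = image_true_positive([(0, 0, 10, 10)], [(5, 5, 15, 15)])     # True
miss = image_true_positive([(0, 0, 10, 10)], [(20, 20, 30, 30)])  # False
```

Note how lenient this is compared with the conventional AP50 criterion (IoU > 0.5): any overlap at all counts, which is exactly the high-sensitivity screening behaviour the Methods argue for.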
Gupta, V.; Podder, D.; Saha, S.; Shah, B.; Ghosh, S.; Kumar, J.; Jacoby, A. P.; Nag, A.; Chattopadhyay, D.; Javed, R.; Rath, A.; Chakraborty, S.; Demde, R.; Vinarkar, S.; Parihar, M.; Zameer, L.; Mishra, D.; Chandy, M.; Nair, R.
Waldenstrom macroglobulinemia (WM) is a rare indolent neoplasm characterized by the presence of more than 10% lymphoid cells in the bone marrow (BM) that exhibit plasmacytoid or plasma cell differentiation and secrete an IgM monoclonal protein. This is a retrospective analysis of 89 patients with WM describing their clinical and laboratory characteristics, treatment patterns, and outcomes. The median age of the entire cohort was 66 years, with male predominance (67.4%). The most common presentations were symptoms pertaining to anemia (77.5%) and constitutional symptoms (33.7%). Median bone marrow lymphoplasmacytic cell involvement was 41%. Positivity for MYD88 and CXCR4 mutations was seen in 81.8% and 2.4% of cases, respectively. BR was the most common regimen used (52.8%). The overall response rate was 87.8%. Median overall survival, progression-free survival, and time to next treatment were 8.49 years, 2.15 years, and 3.88 years, respectively. The BR regimen was associated with the highest event-free survival.
Shen, Q.; Wang, G.; Fu, M.; Yao, K.; Yang, Y.; Zeng, Q.; Guo, Y.
Background: Lateral lymph node metastasis (LLNM) is associated with poor prognosis in patients with rectal cancer and may influence the indication for lateral lymph node dissection. Accurate preoperative identification of LLNM remains challenging. This study aimed to develop and internally validate a clinicoradiological model for preoperative prediction of LLNM in rectal cancer. Methods: A retrospective cohort of 64 patients undergoing lateral lymph node dissection (LLND) for rectal cancer was analysed; 21 (32.8%) had pathological LLNM. A prespecified preoperative clinicoradiological model was fitted using penalised logistic regression with L2 regularisation (ridge), incorporating MRI-measured lateral lymph node short-axis diameter (LLN-SAD), dichotomised clinical T stage (T3-4 vs T1-2), dichotomised clinical N stage (N+ vs N0), and log(CA19-9+1). Model performance was evaluated using the area under the receiver operating characteristic curve (AUC), calibration analysis, and bootstrap internal validation. Results: The model showed good discrimination (AUC 0.914), with an optimism-corrected AUC of 0.887 on bootstrap validation. Calibration remained acceptable after optimism correction (calibration intercept -0.127; slope 1.045). Decision curve analysis suggested net benefit across clinically relevant threshold probabilities, particularly between 0.10 and 0.30. The model was implemented as a web-based calculator to facilitate clinical use. Conclusion: This clinicoradiological model showed good discrimination, acceptable calibration, and potential clinical utility for preoperative assessment of LLNM risk in rectal cancer. It may assist individualized risk stratification and treatment planning, although external validation is required before routine clinical implementation.
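The model's functional form, logit(p) = b0 + b1·LLN-SAD + b2·[cT3-4] + b3·[cN+] + b4·log(CA19-9 + 1), is what a web calculator like the one described would evaluate. A sketch of the prediction step follows; the predictor set matches the abstract, but every coefficient value below is a hypothetical placeholder, not a fitted ridge coefficient from the study.

```python
# Predicted LLNM probability from the abstract's four predictors via the
# logistic link. All coefficients are illustrative placeholders.
import math

def predict_llnm_probability(lln_sad_mm, ct34, cn_pos, ca199,
                             coefs=(-4.0, 0.4, 0.8, 0.6, 0.3)):
    """Illustrative probability of lateral lymph node metastasis."""
    b0, b1, b2, b3, b4 = coefs
    logit = (b0
             + b1 * lln_sad_mm          # MRI short-axis diameter (mm)
             + b2 * int(ct34)           # clinical T3-4 vs T1-2
             + b3 * int(cn_pos)         # clinical N+ vs N0
             + b4 * math.log(ca199 + 1.0))  # log-transformed CA19-9
    return 1.0 / (1.0 + math.exp(-logit))
```

With real ridge coefficients substituted for the placeholders, this is exactly the computation a clinician-facing calculator would perform per patient.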
Streicher, N. S.
Background and Objectives: Patient portals have become essential infrastructure for healthcare delivery following the 21st Century Cures Act, yet adoption remains inequitable. Understanding demographic and geographic determinants of portal activation is critical for addressing digital health disparities, particularly among neurology patients who face unique access barriers. We examined the demographic, geographic, and neighborhood-level factors associated with patient portal activation among neurology patients at multiple geographic scales in the Washington, DC metropolitan area. Methods: We conducted a retrospective cohort study of 72,417 adult neurology patients seen at two academic medical centers sharing an electronic health record in Washington, DC (February 2021-February 2026). We examined portal activation using multivariable logistic regression and geographic analysis at four nested scales: the metropolitan catchment area, DC's eight wards, individual census tracts (via geocoded patient addresses), and individual DC residents. Results: Portal activation was 64.7% overall. Activation varied by race/ethnicity (Non-Hispanic White 76.1%, Non-Hispanic Black 57.0%, Non-Hispanic Asian 57.6%, Hispanic 55.0%) and geography (DC Ward 2: 82.0% vs. Ward 7: 48.0%). Ward-level educational attainment (r = 0.948), broadband access (r = 0.889), and income (r = 0.811) were strongly correlated with activation. Within individual wards, Non-Hispanic White patients activated at 84-91% while Non-Hispanic Black patients activated at 48-64%, demonstrating that neighborhood resources alone do not explain disparities. Discussion: Patient portal activation is shaped by demographic, socioeconomic, and geographic factors operating at multiple levels. Persistent within-ward racial disparities indicate that geographically targeted interventions must be paired with culturally tailored approaches to achieve digital health equity.
Saeed, F. U.; Kubio, C.; Kutame, R.; Boateng, G.
Background: Laboratory services are essential to health service delivery across the world. In resource-constrained settings such as low- and middle-income countries like Ghana, maintaining strong laboratory capacity can be challenging. This study assessed the capacity and gaps in laboratory service delivery in three districts of the Savannah Region of Ghana. Methods: The WHO laboratory assessment tool (LAT) was adapted to collect data in 10 health facilities based on 11 operational system modules. Data were collected through interviews. Capacity was defined on a 100-point score scale and interpreted as weak (<50%), moderate (50-80%), and strong (>80%). Differences in median scores were determined using Friedman and Kruskal-Wallis tests. Statistical significance was set at p < 0.05. A scale (0-5) was used to identify the needs of the laboratory. Results: Overall, the capacity score was moderate (mean 50% ± 25.7%; median 52.5%, IQR: 30.0-68.5%). The testing module received the highest score (71.5%), while the documents module scored the lowest (14.5%). Scores did not differ significantly between system components after multiple comparisons (p > adjusted alpha). Hospital-level laboratories performed significantly higher than polyclinics (adjusted p = 0.044) and health centers (adjusted p < 0.001). The biggest needs were biosafety, equipment maintenance, and human and financial resources (median gap score: 3-4). Conclusion: The laboratory capacity in the health system of the Savannah Region was moderate, requiring improvements in all operational areas. The biggest needs include safety, equipment, and human and financial support systems. Addressing these critical gaps would have a direct impact on public health and patient outcomes.
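The capacity bands used in this assessment (weak <50%, moderate 50-80%, strong >80% on a 100-point scale) map directly onto a small classifier. A minimal sketch; the function name is ours, not the WHO LAT's.

```python
# Map a 0-100 capacity score to the study's interpretation bands.
# Boundary handling assumption: 50 and 80 fall in the "moderate" band,
# matching the stated weak (<50%) and strong (>80%) cutoffs.

def capacity_band(score_pct):
    """Classify a laboratory capacity score as weak, moderate, or strong."""
    if score_pct < 50:
        return "weak"
    if score_pct <= 80:
        return "moderate"
    return "strong"

# Testing module (71.5%) -> "moderate"; documents module (14.5%) -> "weak"
```

Applied to the abstract's figures, both the overall median (52.5%) and the top-scoring testing module (71.5%) land in the moderate band, consistent with the study's overall conclusion.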
Blythe, R.; Senanayake, S.; Bylstra, Y.; Roberts, J.; Choi, C.; Yeo, M. J.; Goh, J.; Graves, N.; Koh, A. L.; Jamuar, S. S.
Background: Carrier screening for inherited genetic disorders can reduce the burden of conditions that lead to childhood morbidity and mortality, including thalassaemia, cystic fibrosis, and spinal muscular atrophy. To be successful, national carrier screening programs should aim to maximise uptake, which may depend on population preferences for screening characteristics. In this study, we aimed to determine how expanded carrier screening in Singapore should be designed based on operational factors including suggested copayments, wait times, and disorders included in screening panels. Methods: We elicited stated preferences for the design of a hypothetical national carrier screening program with seven attributes from 500 Singaporeans of reproductive age (18 to 54). A discrete choice experiment was applied using 30 choice tasks with 3 alternatives per task, divided between 3 blocks. The mixed multinomial logit model was used to estimate willingness-to-pay for each attribute level. Predicted uptake for three plausible screening programs was assessed, with copayment amounts from $0 to $1,200 in increments of $30. Impact on the annual national budget was calculated as a function of 25,000 expected eligible couples per year. All costs were reported in 2026 SGD. Results: Respondents showed the strongest preferences for cost, followed by the number of diseases included in the panel, then wait times, with limited impact of the remaining attributes. With no copayments, predicted uptake ranged from 85% [95% CI: 83% to 87%] to 90% [88% to 92%] for the basic and utility-maximising screening programs, respectively. This declined to 61% [56% to 66%] and 69% [65% to 73%], respectively, at a copayment of $1,200 per test. The model predicted higher uptake if a selection of screening alternatives were available, compared to a single program.
The budget impact was highly dependent on population eligibility, copayments, and couples decision-making processes, but was unlikely to exceed $22.5m [$19.0m to $26.6m] per year unless expanded beyond married couples. ConclusionsThere was high predicted demand for carrier screening even as copayments increased. Successful strategies to improve uptake may include reducing copays and wait times, increasing the number of screening options available to prospective parents, and increasing program eligibility beyond pre-conception married couples.
Matimo, C. R.; Kacholi, G.; Mollel, H. A.
Background: Digital health plays an indispensable role in facilitating data analysis and use to enhance healthcare delivery across health settings. However, there is scant information on the extent to which digital health improves primary health service delivery through data use. This study examined the determinants influencing the use of digital health to improve health service delivery in council hospitals in Tanzania.

Methods: A cross-sectional design was employed in six regions, involving 12 council hospitals. We used a self-administered questionnaire to collect data from 203 members of hospital quality improvement teams. Descriptive analysis was used to determine the frequency, proportion, and mean of responses, while bootstrapping analysis was conducted to test the statistical significance of the influence of digital health factors on data use for improving health service delivery.

Results: Results show moderate agreement on data compatibility for planning and decision-making, with 40.4% of respondents agreeing that it supports ordering commodities, 43.8% staff allocation, and 38.4% planning. However, dissatisfaction was higher for user-friendliness (47.8%), reliability (up to 65.5%), and usefulness (up to 63.5%). Overall, 50.2% (M = 2.74 ± 0.87) disagreed that digital systems effectively support data use. Structural model analysis confirmed a significant positive influence of usefulness (β = 0.199, p < 0.001) and access to quality data (β = 0.729, p < 0.001) on data use, which in turn strongly influenced service delivery (β = 0.593, p < 0.001), although some factors showed no direct influence.

Conclusion: The study finds that current digital health initiatives only modestly improve the user-friendliness, reliability, and usefulness of data systems, partly due to fragmented, non-interoperable platforms that burden data management. However, the compatibility, usability, reliability, and usefulness of digital tools significantly enhance access to quality data and data-driven decisions. The study recommends strengthening and integrating existing systems and providing continuous digital health training to institutionalize data-informed decision-making.
Panapruksachat, S.; Troupin, C.; Souksavanh, M.; Keeratipusana, C.; Vongsouvath, M.; Vongphachanh, S.; Vongsouvath, M.; Phommasone, K.; Somlor, S.; Robinson, M. T.; Chookajorn, T.; Kochakarn, T.; Day, N. P.; Mayxay, M.; Letizia, A. G.; Dubot-Peres, A.; Ashley, E. A.; Buchy, P.; Xangsayarath, P.; Batty, E. M.
We used 2,492 whole-genome sequences from Laos to investigate the molecular epidemiology of SARS-CoV-2 from 2021 through 2024, covering the major waves of COVID-19 in Laos, including periods of travel restriction and the period after travel across international borders resumed. We identify successive waves of COVID-19 caused by shifts in the dominant lineage, beginning with the Alpha variant in April 2021 and continuing through the Delta and Omicron variants. We quantify a shift from a small number of viral introductions responsible for widespread transmission in early waves to a larger number of introductions for each variant after travel restrictions were lifted, and identify potential routes of introduction into the country. Our study underscores the importance of genomic surveillance in public health responses for characterizing viral transmission dynamics during pandemics.
Ahmed, W.; Gebrewold, M.; Verhagen, R.; Koh, M.; Gazeley, J.; Levy, A.; Simpson, S.; Nolan, M.
Wastewater surveillance (WWS) is established as a vital tool for monitoring polio and SARS-CoV-2, with potential to improve surveillance for many other infectious diseases. This study evaluated the feasibility of detecting measles virus (MeV) RNA in wastewater as part of a national WWS preparedness trial in Brisbane, Australia, from March to June 2025. Composite and passive sampling methods were employed in parallel at three wastewater treatment plants serving populations between 230,000 and 584,000. Nucleic acids were extracted and analyzed using RT-qPCR targeting the MeV N and M genes to distinguish wild-type and vaccine strains. MeV RNA was detected in both 24-hour composite and passive samples on May 26 to 27, 2025, from the largest catchment (584,000), which also included an international airport. No measles cases were reported in this city or region within 4 weeks of the WWS detections. The detections were confirmed as a vaccine-derived measles virus (MeVV) strain via a specific RT-qPCR assay. Extraction recoveries varied (11.5% to 70.5%), with passive sampling showing higher efficiency. This is the first report of the use of passive samples for detection of MeV. These findings are consistent with other studies reporting WWS detection of both MeVV genotype A and wild-type genotypes B and/or D. They demonstrate the potential for sensitive MeV WWS with rapid differentiation of MeVV from wild-type MeV shedding, including at airport transport hubs and with different sample types. WWS could strengthen measles surveillance by enabling rapid detection of MeV RNA and supporting outbreak preparedness and response; this requires optimised methods that are specific to wild-type MeV or can differentiate it from MeVV. Furthermore, the successful detection of MeV using passive sampling in this study highlights its potential for deployment in diverse global contexts, including non-sewered settings.