Eye
Springer Science and Business Media LLC
Preprints posted in the last 30 days, ranked by how well they match Eye's content profile, based on 11 papers previously published here. The average preprint has a 0.04% match score for this journal, so anything above that is already an above-average fit.
Javed, K. M. A. A.; Ozturk, B.; Anwar, S.; Butt, G.; Low, L.; Said, D. G.; Dua, H. S.; Rauz, S.; Ting, D. S. J.
Background/Aims: To evaluate the impact of socioeconomic deprivation on clinical presentation and outcomes of bacterial keratitis (BK) in the United Kingdom. Methods: A retrospective multicentre cohort study of 320 patients with BK presenting to two UK tertiary ophthalmic centres. Demographic, clinical and microbiological data were extracted from electronic health records. Socioeconomic status was assigned using residential postcodes mapped to the 2019 English Index of Multiple Deprivation (IMD) and grouped into quintiles (Q1 most deprived; Q5 least deprived). Presenting severity and outcomes were compared across IMD quintiles. Results: The mean age was 54.0 ± 20.9 years; 50.6% were male and 83.4% were White. Mean presenting CDVA was 1.10 ± 1.01 logMAR and time to presentation was a median of 3 days (IQR 1-6). Most cases had a small infiltrate (<3 mm; 68.4%), small epithelial defect (<3 mm; 63.4%) and no hypopyon (72.5%). Hospitalisation was required in 50.0%, and 17.5% underwent surgery. Culture positivity was 36.3%. There were no significant differences in presenting CDVA, time to presentation, clinical severity, admission, microbiological profile, surgical intervention or final CDVA across IMD quintiles (all p>0.05). Final CDVA improved to 0.75 ± 0.96 logMAR (p<0.001). On multivariable analysis, poorer final CDVA was associated with worse presenting CDVA, increasing age and Gram-positive organisms, but not IMD. Conclusion: Socioeconomic deprivation did not influence the clinical presentation or outcomes in BK. Clinical severity at presentation and microbiological profile were the principal determinants of outcome. In this acute, painful, sight-threatening condition, deprivation-related disparities may be attenuated by prompt presentation and universal access to emergency ophthalmic care.
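The Methods assign deprivation quintiles from IMD 2019 ranks. A minimal illustrative sketch of that grouping, assuming the standard English LSOA ranking (this is not the study's code, and the quintile boundaries here are a simple even split, not an official lookup):

```python
import math

# Illustrative sketch: mapping an English IMD 2019 LSOA rank to a
# deprivation quintile, Q1 = most deprived, Q5 = least deprived.
N_LSOAS = 32844  # number of English LSOAs ranked by IMD 2019

def imd_quintile(rank: int, n: int = N_LSOAS) -> str:
    """Return 'Q1'..'Q5' for an IMD rank (1 = most deprived)."""
    if not 1 <= rank <= n:
        raise ValueError("rank outside 1..n")
    return f"Q{math.ceil(5 * rank / n)}"

print(imd_quintile(1))      # most deprived LSOA -> Q1
print(imd_quintile(32844))  # least deprived LSOA -> Q5
```

In practice the study would join patient postcodes to LSOAs and use the published IMD lookup tables rather than computing boundaries this way.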
Windle, T.; Maliko, F.; Burgiss-Kasthala, S.; Blaikie, A.
Background The World Health Organisation (WHO) advocates integrating primary eye care (PEC) within community health systems, supported by task-shifting and frugal technologies. While low-cost tools such as the Arclight device and Wilson anterior segment loupe have demonstrated training and diagnostic value, their long-term impact on community health worker (CHW) roles and professional networks remains poorly understood. Methods We conducted a qualitative follow-up study 3 years after implementation of an Arclight Project-enabled cascade training-of-trainers (ToT) PEC programme in central Malawi. Ophthalmic Clinical Officers (OCOs) trained using the Arclight training and diagnostic package subsequently cascaded PEC training to Health Surveillance Assistants (HSAs). Semi-structured interviews were undertaken 3 years later with OCOs and HSAs to explore device use, evolving professional roles, training diffusion, and communication patterns. Data were analysed thematically, informed by concepts from social network analysis, to examine changes in advice-seeking, mentorship and peer collaboration. Results Frugal eye-care technologies functioned not only as diagnostic tools but as mechanisms of professional repositioning. HSAs equipped with low-cost diagnostic devices became recognised as community eye focal persons, receiving referrals from colleagues and community members. OCOs who delivered training emerged as central hubs for clinical advice and ongoing training, creating strong vertical networks between district and community levels. However, horizontal peer-to-peer networks among HSAs remained weak and largely informal. Communication relied heavily on ad-hoc phone calls and WhatsApp messaging, with limited structured communities of practice. Despite sustained use of devices and retention of key skills, network activity often declined over time without active reinforcement.
Conclusions Frugal eye-care technologies act as social as well as clinical interventions, reshaping CHW networks and professional hierarchies. Designing PEC programmes with explicit attention to strengthening and sustaining professional networks, rather than focusing solely on skills transfer, may further enhance alignment with WHO Integrated People-Centred Eye Care and improve long-term programme sustainability and impact.
Linntam, D.; Palumaa, K.; Palumaa, T.
Background: Despite strong evidence from controlled trials, uncertainty remains about the real-world use of 0.05% atropine in patients with lighter irises due to tolerability concerns, and predictors of treatment response are poorly understood. Here, we evaluated the effectiveness, tolerability, and early biometric response to 0.05% atropine in clinical practice among patients with predominantly light irises. Methods: This prospective cohort study included 33 patients treated with 0.05% atropine (82% with light irises). Cycloplegic spherical equivalent refraction (SER) was measured at baseline and 3-month intervals. Axial length (AL), photopic pupil diameter, accommodation amplitude, and subjective side effects were monitored more frequently initially. Results: Median age at treatment initiation was 11.97 years, SER -5.38 D, and AL 25.42 mm. Over 12 months, SER changed by -0.078 ± 0.349 D (mean ± SD), and AL increased by 0.052 ± 0.115 mm. Eighty-eight percent of participants had a SER change of <0.5 D, and 91% had axial elongation of <0.2 mm, indicating clinically limited myopia progression. Photopic pupil diameter was larger, and accommodation amplitude was reduced throughout follow-up. Early in treatment, side effects, including photophobia and near-work difficulties, were common but minimally disruptive. Their incidence decreased rapidly and rarely required treatment modification. In exploratory analyses, early AL changes predicted 12-month AL outcomes, with associations detectable as early as 1 week and strengthening over time. Conclusions: 0.05% atropine was well tolerated and effective in this population with light irises. Early AL changes may predict 12-month treatment response. These findings support the implementation of 0.05% atropine in routine clinical practice in populations with light irises and highlight the potential for early AL monitoring to guide timely treatment adjustments.
Antwi-Adjei, E. K.; Datta, S.; Girkin, C. A.; Owsley, C.; Rhodes, L. A.; Fifolt, M.; Racette, L.
Purpose To evaluate patient satisfaction and preferences for portable versus table-mounted visual field (VF) devices in a rural telemedicine setting and identify influencing factors. Methods We conducted a sequential explanatory mixed-methods study at three Federally Qualified Health Centers (FQHCs) within the Alabama Screening and Intervention for Glaucoma and eye Health through Telemedicine (AL-SIGHT) study. Participants completed VF testing with the table-mounted Humphrey Field Analyzer (HFA), tablet-based Melbourne Rapid Fields (MRF), and virtual reality (VR)-based VisuALL perimeters. Participants rated satisfaction, comfort, ease of use, and future testing preference. Chi-square tests assessed differences in device preferences. Twelve participants completed semi-structured interviews to explore reasons underlying preferences. Qualitative data were analyzed in NVivo 14 using reflexive thematic analysis. Results Among 271 respondents (mean age 60.4 years; 62.4% women), 50.6% preferred the VR-based, 35.1% the tablet-based, and 14.4% the table-mounted perimeter for future testing (χ²(2) = 53.52, p<0.001, Cramér's V = 0.31). Satisfaction was highest for the VR-based perimeter (56.9% very satisfied), followed by tablet-based (49.4%) and HFA (38.0%). The VR-based perimeter was most frequently selected as the most comfortable (55.7%; χ²(2) = 63.33, p<0.001, V = 0.34) and easiest to use (54.6%; χ²(2) = 71.96, p<0.001, V = 0.36). Preferences did not vary significantly across demographic variables (all p>0.05). Qualitative themes identified four key drivers: comfort and physical experience, visual experience, ease of use and interaction, and psychological and motivational factors. Portability and community suitability were valued. Conclusion Rural underserved patients strongly preferred portable visual field devices, particularly VR-based, over the table-mounted HFA. Comfort, ergonomic flexibility, immersive visual experience, and simplicity of interaction were central determinants of preference.
Portable perimetry may enhance patient-centered glaucoma monitoring within telemedicine programs and access in resource-limited settings.
Giachos, I.; Oreaba, A. H.; Kanj, U.; Anwar, S.; Chahal, R.; Aralikatti, A.; Ting, D. S. J.
Purpose: To highlight the roles of intraoperative optical coherence tomography (iOCT) in managing acute corneal hydrops (ACH) and the outcomes of iOCT-guided pneumodescemetopexy and corneal compression sutures. Methods: This was a retrospective, consecutive, interventional case series of patients with keratoconus who presented with significant ACH and underwent iOCT-guided pneumodescemetopexy (18% sulfur hexafluoride gas) and compression sutures at Birmingham and Midland Eye Centre, UK, between Aug 2023 and May 2025. Results: Five patients were included; the mean age was 32.3 ± 6.6 years and 3 (60%) were male. The mean follow-up duration was 16.3 ± 5.6 months. At presentation, the mean corrected distance visual acuity (CDVA) was 1.90 ± 0.67 logMAR, central corneal thickness (CCT) was 1187.6 ± 372.6 µm, maximal corneal thickness was 1624.0 ± 383.5 µm, and the maximal height and diameter of pre-Descemet layer/Descemet membrane (PDL/DM) detachment were 1014.6 ± 366.4 µm and 4456.0 ± 839.4 µm, respectively. The surgery successfully achieved complete PDL/DM attachment in all cases, with a mean time from surgery to ACH resolution of 17.8 ± 8.0 days. iOCT successfully visualized the area of PDL/DM break/detachment, revealed the involvement of the PDL (evidenced by a persistent taut type 1 DM detachment after gas tamponade), and guided the placement of compression sutures. Compared to preoperative values, there was a significant improvement in the mean CDVA (0.52 ± 0.32 logMAR; p=0.014) at last follow-up. One patient required a repeat procedure to fully attach the PDL/DM. Conclusions: This study demonstrated favorable outcomes of iOCT-guided pneumodescemetopexy and corneal compression sutures. iOCT revealed the involvement of the PDL in ACH and provided plausible explanations for why pneumodescemetopexy alone may not resolve significant ACH rapidly in certain cases.
Szabo, A.; Arpadffy-Lovas, T.; Toth-Molnar, E.
Purpose: To improve determination of the treatment area for the personalization of subliminal transscleral cyclophotocoagulation (SL-TSCPC) procedures in glaucoma treatment, we designed a biometry-based model of the human eye to find the estimated ciliary body (CB) arc length (ECBAL) and the calculated CB distance (CCBD). Methods: We developed a rotationally symmetric modified two-sphere eye model based on axial length (AL), mean keratometry (mean K), anterior chamber depth (ACD), lens thickness (LT), and white-to-white (WTW). ECBAL and CCBD were calculated for each eye. Fluence was calculated with standardized parameters. Results: Publicly accessible biometric measurements for 24,001 eyes were pooled for analysis. The mean ECBAL was 23.99 ± 1.8 mm. The correlations of ECBAL with AL and ACD were 0.723 and 0.754, respectively (p < 0.01). The number of eyes with an ECBAL of 21.7-22.0 mm was 131 of 24,001 (0.55%). The mean CCBD was 4.21 ± 0.8 mm. The number of eyes with a CCBD of 3.8 mm was 1,445 of 24,001 (6.02%). Mean fluence was 120.33 ± 9.0 J/cm². A mean difference of -8.18 ± 6.9%, ranging from -22.66% to +29.07%, in fluence was observed when treating only the recommended 22 mm versus the ECBAL. Conclusions: This study demonstrated that use of 22.0 mm as the standard treatment arc length may under- or overdose laser treatment in many eyes. Precise estimation or exact localization of the CB treatment area is required to accurately calculate fluence. Translational Relevance: The model shows that CB arc length is a variable, while current guidelines treat it as a constant.
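The reported percentage differences are consistent with the delivered dose scaling as the ratio of the treated arc to the eye's ECBAL. A hedged sketch under that assumption (our interpretation, not the authors' stated formula), using the abstract's own arc-length values:

```python
# Sketch reproducing the abstract's relative-fluence comparison under one
# assumed model: the probe delivers a fixed energy per millimetre of swept
# arc, so the whole-ciliary-body dose scales with treated arc / ECBAL.
STANDARD_ARC_MM = 22.0  # guideline treatment arc length

def relative_fluence_difference(ecbal_mm: float,
                                treated_mm: float = STANDARD_ARC_MM) -> float:
    """Percent dose difference when treating `treated_mm` instead of the
    eye's estimated ciliary body arc length (negative = underdose)."""
    return 100.0 * (treated_mm / ecbal_mm - 1.0)

print(round(relative_fluence_difference(23.99), 1))  # mean ECBAL -> about -8.3 %
print(round(relative_fluence_difference(28.45), 1))  # long CB arc -> about -22.7 %
print(round(relative_fluence_difference(17.05), 1))  # short CB arc -> about +29.0 %
```

Under this assumption the mean ECBAL of 23.99 mm gives roughly the -8% mean difference reported, and ECBALs near 28.5 mm and 17.1 mm reproduce the quoted extremes.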
Maurin, C.; Poinard, S.; Travers, G.; Gontier, E.; Karpathiou, G.; Decoeur, F.; He, Z.; Gain, P.; THURET, G.; French Fuchs Study Group,
Aim: To evaluate the potential of a three-dimensional microscope combining laser scanning confocal imaging and white-light interferometry for quantitative topographic characterisation of Descemet's membrane (DM) in Fuchs endothelial corneal dystrophy (FECD). Methods: Descemet's membranes were collected from 38 FECD patients undergoing endothelial keratoplasty and 4 healthy donors. After flat-mounting on a glass slide and drying, specimens were analysed using the VK-X3000 system (KEYENCE). Entire samples were reconstructed by image stitching at low magnification (x10) in white-light interferometry mode (0.01 nm axial resolution). Higher magnifications (x20-x150) in confocal mode (12 nm axial resolution) enabled detailed structural analysis. Three-dimensional height maps were generated to calculate standardised surface roughness parameters. Guttae and other DM features were classified according to spatial organisation and elevation profiles. Results: White-light interferometry enabled full-field mapping of whole 8 mm diameter DMs with nanometric vertical resolution (~2 hours/sample). Surface roughness (Sa) was higher in FECD than in controls (median ± IQR: 0.571 ± 0.259 µm vs 0.239 ± 0.161 µm; p = 0.0018). In FECD, three zones were identified: central (guttae buried in the posterior fibrillar layer; Sa 0.442 ± 0.112 µm), paracentral (large uncovered guttae; Sa 0.562 ± 0.170 µm; p = 0.0423), and outer (no confluent guttae; Sa 0.261 ± 0.143 µm; p < 0.0001). Confocal 3D imaging revealed radial striae, embossments and furrows in the DM, confluent central guttae, and fused or buried structures. Conclusions: Combining white-light interferometry and confocal microscopy enables label-free, high-resolution surface characterisation of the DM in FECD, providing quantitative metrics to compare histological subtypes and supporting the predominance of radial structural organisation.
Kumanan, K.; Hassani, A.; Husnain, M.; Papaefstratiou, E.; Estevez, J. J.
Purpose To evaluate associations between optical coherence tomography angiography (OCT-A) metrics and diabetic retinopathy (DR) and compare their discrimination against conventional clinical risk factors. Methods In this cross-sectional study, 108 adult eyes (right eye if both eligible) with diabetes were recruited from tertiary ophthalmology/optometry clinics. DR was clinically graded using ETDRS categories and dichotomised as no DR vs ≥ mild NPDR (primary outcome). Macular 6×6 mm OCT-A (Zeiss AngioPlex) was acquired; scans with signal strength >7 and without major artefact were included. Quantitative metrics from the superficial capillary plexus included vessel density (VD) and perfusion density (PD) (central/inner/outer/full regions); structural OCT measures and FAZ parameters were secondary. Associations with ≥ mild NPDR were assessed using multivariable logistic regression adjusted for age, sex, HbA1c, and diabetes duration. Discrimination was evaluated with ROC curves/AUC (95% CI) and DeLong comparisons of AUCs. Results DR was present in 63% of eyes. DR was associated with lower VD (central, inner, outer, full) and lower PD (central, inner, full) (all p≤0.04). After adjustment, central VD (OR 0.82, 95% CI 0.68-0.98) and central PD (OR 0.92, 95% CI 0.86-0.99) remained independently associated with DR. The OCT-A model outperformed the clinical model (AUC 0.73 vs 0.60); the combined model yielded an AUC of 0.76. Conclusion VD and PD from the superficial plexus are independently associated with DR and show superior discrimination versus conventional clinical factors alone, supporting OCT-A as an adjunct for earlier DR detection.
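The discrimination metric used here, the ROC AUC, has a simple probabilistic reading: the chance that a randomly chosen DR eye scores higher than a randomly chosen non-DR eye. A minimal sketch of that equivalence (scores are invented, not the study's data):

```python
# Minimal AUC sketch: probability that a random positive outscores a
# random negative, counting ties as 1/2 (equivalent to the ROC AUC).
def auc(scores_pos, scores_neg):
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in scores_pos for n in scores_neg
    )
    return wins / (len(scores_pos) * len(scores_neg))

dr_scores    = [0.9, 0.8, 0.7, 0.6, 0.4]   # hypothetical model scores, DR eyes
no_dr_scores = [0.5, 0.4, 0.3, 0.2, 0.1]   # hypothetical scores, non-DR eyes

print(auc(dr_scores, no_dr_scores))  # 0.94
```

An AUC of 0.73 for the OCT-A model, as reported, would mean the model ranks a DR eye above a non-DR eye about 73% of the time.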
Saxena, R.; Jethani, J.; Roy, L.; Matalia, J.; Verkicharla, P. K.; Ganesh, S.; Parthasarathy, A.; Nayak, S.; Gupta, V.; Narendran, K.; Panmand, P.; Ghosh, P.; Muthu, S.; Srivastava, K.; Prenat, O.
Objective: The study aims to evaluate the real-world effectiveness of Highly Aspherical Lenslets spectacles (HAL; Essilor® Stellest®) in slowing myopia progression among Indian children and adolescents aged between 4 and 16 years. Methods and analysis: This was a multicentre retrospective study conducted across 10 leading ophthalmic centres. The study participants comprised children aged between 4 and 16 years who were prescribed HAL spectacles (Essilor® Stellest®). Data were extracted from electronic medical records at three time points: T1, one year prior to intervention; T2, baseline at HAL spectacle prescription; T3, 6 to 24 months after prescription. The primary endpoint was myopia progression and axial elongation in the year following prescription, compared with the untreated year and with published meta-regression models. Only data from the right eye were analysed, with the expected physiological progression estimated from the individual progression trajectory after adjusting for age-related slowing as reported in published meta-regression models. Results: A total of 372 myopic children were included in the study. The annual myopia progression was -0.72 ± 0.47 D/year during the untreated period, reducing to -0.11 ± 0.29 D/year with HAL spectacle wear. The expected progression without treatment was -0.65 D/year, based on trajectory-adjusted modelling, indicating a treatment effect of 0.54 D/year and an estimated 83% slowing in myopia progression compared to expected progression. The expected axial elongation under physiological conditions was 0.29 mm/year, estimated using age-adjusted meta-regression models; with HAL lens wear, axial elongation was 0.11 ± 0.16 mm/year, corresponding to a ~62% relative slowing of elongation. Conclusion: The present study provides real-world evidence for the efficacy of HAL lenses as a myopia control intervention in Indian children and adolescents.
The retrospective design and limited follow-up period warrant future prospective, long-term studies to validate these findings.
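The treatment-effect figures above follow from simple arithmetic on the reported rates. A worked check using only the numbers in the Results (the helper function is just bookkeeping, not the study's analysis code):

```python
# Worked check of the treatment-effect arithmetic in the abstract:
# effect = |expected progression| - |observed progression|,
# % slowing = effect / |expected progression|.
def slowing(expected, observed):
    """Return (treatment effect, percent slowing) vs expected progression."""
    effect = abs(expected) - abs(observed)
    return effect, 100.0 * effect / abs(expected)

effect_d, pct = slowing(expected=-0.65, observed=-0.11)   # SER, D/year
print(round(effect_d, 2), round(pct))   # 0.54 D/year, 83 % slowing

_, ax_pct = slowing(expected=0.29, observed=0.11)         # axial length, mm/year
print(round(ax_pct))                    # 62 % slowing of elongation
```

Both derived figures (0.54 D/year, 83%; ~62% axial slowing) match the abstract's reported values.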
Yeh, T.-C.; Lin, J. B.; Mruthyunjaya, P.; Leng, T.; DeBoer, C.; Sepah, Y.; Almeida, D. R.; Mahajan, V. B.
Background and Objective As optical coherence tomography (OCT) has enabled the identification of an expanding set of age-related macular degeneration (AMD) risk biomarkers and become central to routine clinical practice, there remains a need for a simplified grading scheme that allows physicians to communicate and synchronize AMD grading directly from standard OCT imaging rather than relying on traditional color fundus imaging. This study aims to establish a standardized OCT-based AMD classification that balances diagnostic accuracy with practicality for use across clinical and research settings. Patients and Methods Spectral-domain optical coherence tomography scans were independently graded by two retinal specialists following the newly proposed Stanford OCT-Based AMD Classification (SOAC). Discrepancies were adjudicated by a third independent retinal specialist. Intergrader agreement was assessed using weighted kappa coefficients. Results Among the 109 eyes from 108 patients, AMD staging based on SOAC was distributed as follows: normal aging in 9 patients (8.3%), early AMD in 16 (14.7%), intermediate AMD in 32 (29.4%), neovascular AMD (nAMD) in 18 (16.5%), geographic atrophy (GA) in 20 (18.3%), and combined nAMD and GA in 14 (12.8%). The overall intergrader agreement demonstrated robust consistency, with a weighted kappa value of 0.95 (95% CI: 0.92 to 0.98), signifying excellent intergrader reliability and reinforcing the validity of SOAC. Conclusion SOAC provides a standardized, OCT-based framework for AMD grading that demonstrates high intergrader agreement. By enabling consistent classification from commonly acquired OCT scans, SOAC supports reliable disease staging and facilitates integration across clinical studies and translational research. As imaging and molecular data continue to expand, SOAC can serve as a common OCT-based reference for phenotype refinement and longitudinal AMD studies.
Toral, M. A.; Ng, B.; Velez, G.; Yang, J.; Tsang, S. H.; Bassuk, A. G.; Mahajan, V. B.
Purpose Anti-vascular endothelial growth factor (anti-VEGF) therapy is the standard of care for neovascular age-related macular degeneration (AMD), yet many patients exhibit persistent retinal degeneration, fibrosis, and incomplete therapeutic response. The molecular pathways underlying this incomplete response remain poorly understood. We sought to identify VEGF-independent signaling pathways active in the vitreous of anti-VEGF-treated AMD patients. Methods We performed multiplex antibody-based proteomic profiling of 1,000 human proteins in vitreous samples from patients with neovascular AMD receiving anti-VEGF therapy (n=8) and comparative controls (n=6). Differential protein expression was assessed using one-way ANOVA, followed by gene ontology and pathway enrichment analyses. Drug-target relationships were evaluated to identify potential opportunities for therapeutic repositioning. Results We identified 107 differentially expressed proteins (p<0.05), including key regulators of immune signaling, angiogenesis, and metabolism. Notably, multiple components of cytotoxic lymphocyte pathways were dysregulated, including IL-21R, SIGLEC-7, CTLA4, and IL-2-associated signaling. Enrichment analyses revealed significant activation of pathways related to T-cell activation, interleukin signaling, and leukocyte-mediated cytotoxicity. These immune signatures persisted despite suppression of VEGF signaling. Several clinically available immunomodulatory agents, including abatacept, sirolimus, and dupilumab, targeted pathways identified in this dataset. Conclusions Anti-VEGF-treated neovascular AMD exhibits persistent cytotoxic immune signaling in the vitreous, suggesting that VEGF-independent immune mechanisms may contribute to ongoing retinal damage and incomplete therapeutic response. These findings provide a rationale for combination therapeutic strategies targeting both angiogenic and immune pathways in AMD.
Woredekal, A. T.
Purpose Diabetic retinopathy (DR) is one of the most important complications of diabetes mellitus (DM) and the leading cause of blindness among working-age adults in developed countries. This study aimed to investigate the epidemiology and risk factors of DR in patients with diabetes mellitus in a hospital setting in Somalia. Methods This was an observational, descriptive, hospital-based cross-sectional study; data were collected from January 2023 to May 2023. A structured questionnaire was used to collect relevant demographic and clinical data. Both univariate and bivariate tables were used for analysis. Data analysis included frequency distributions, cross-tabulations, and tests of correlation and association between variables (χ², p-values, and confidence intervals). Results A total of 384 DM patients were studied, of whom 76% (n=293) had type 2 DM. The average duration of diabetes mellitus was 9.7 (SD 6.9) years and the mean age was 47.24 (SD 19.36) years (range 18-100 years). A majority (66%, n=253) were female, 44.8% (n=172) had a normal body mass index (BMI), and 170 (44.3%) had concomitant hypertension. About 51% of the patients (n=197) had DR, of whom 17% (n=67) had non-proliferative diabetic retinopathy (NPDR) and 26% (n=98) had macular oedema. Age above 40 years (p=0.020), marital status (p=0.010), employment status (p=0.002) and literacy status (p=0.020) were significantly associated with the presence of DR. Patients aged below 40 had a 37% lower risk of diabetic retinopathy than patients aged above 40 years. Longer duration of diabetes (p=0.001) and the presence of concomitant cardiac illness (p=0.001) were strongly associated with the presence of diabetic retinopathy. Patients with a duration of diabetes of more than 10 years had approximately twice the chance of developing DR compared with those with a duration of less than 10 years.
Conclusion: The very high prevalence of DR (51%) among our patients underscores the need for sound health policy to manage DM and DR in Somalia. Effective, regular eye screening and treatment for all diabetes patients should be prioritised.
Reddy, K. N.; Ibukun, F.; Huang, K.; Yi, J.; Jain, E.; Kuyyadiyil, S.; Parmar, G. S.; Shekhawat, N. S.
Purpose: To compare hypopyon detection using anterior segment optical coherence tomography (ASOCT) versus slit lamp examination (SLE) in microbial keratitis, and to evaluate intra- and inter-grader agreement for ASOCT hypopyon measurement. Methods: Two masked graders independently evaluated ASOCT images for hypopyon presence or absence in eyes with microbial keratitis, with disagreements resolved by consensus. A subset of hypopyon eyes underwent triplicate height measurement using two methods (endothelial length, vertical height). Cohen's kappa, intraclass correlation coefficients (ICC), sensitivity, and specificity were calculated to compare the diagnostic performance of ASOCT versus SLE. Results: Inter-grader agreement for hypopyon detection on ASOCT was excellent (κ=0.94; 95% CI 0.84-1.00), as was intra-grader agreement (κ=0.89-1.00). ASOCT detected hypopyon in 67.1% of eyes versus 57.0% by SLE (sensitivity 83.0%, specificity 96.2% using ASOCT as reference). Intra-grader reproducibility was excellent for both endothelial length and vertical height measurements (ICC 0.977-0.996). Inter-grader agreement was good for endothelial length (ICC 0.831) and vertical height (ICC 0.827), though a statistically significant inter-grader bias was identified for vertical height only (Wilcoxon p=0.008). Conclusions: ASOCT detected hypopyon with greater sensitivity than SLE and demonstrated excellent intra-grader and good inter-grader measurement reproducibility. Endothelial length showed slightly superior inter-grader concordance compared with vertical height measurement.
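Cohen's kappa, used above for the two-grader presence/absence ratings, corrects observed agreement for the agreement expected by chance. A minimal sketch of the unweighted statistic on invented ratings (not the study's data):

```python
from collections import Counter

# Sketch of unweighted Cohen's kappa for two raters' categorical labels:
# kappa = (p_observed - p_chance) / (1 - p_chance).
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n       # observed agreement
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / n**2    # chance agreement
    return (po - pe) / (1 - pe)

a = [1, 1, 1, 0, 0, 1, 0, 1, 1, 0]  # grader 1: hypopyon present (1) / absent (0)
b = [1, 1, 1, 0, 0, 1, 0, 1, 0, 0]  # grader 2 disagrees on one eye

print(round(cohens_kappa(a, b), 2))  # 0.8
```

With 9/10 raw agreement but 0.5 expected by chance here, kappa lands at 0.8; the study's κ=0.94 reflects near-perfect agreement after the same correction.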
Jones, L.; Higgins, B.; Devraj, K.; Crabb, D.; Thomas, P.; Moosajee, M.
This study evaluated the feasibility of collecting passive and active digital phenotyping data using the OverSight iOS application in individuals with inherited retinal diseases (IRDs), and explored associations between digital behavioural markers, visual function, and mental health. Participants with IRDs were recruited from Moorfields Eye Hospital (UK) and followed for 12 months. OverSight passively captures mobility data through HealthKit and typing-derived metrics through SensorKit. Participants completed patient-reported outcome measures (EQ-5D, NEI-VFQ-25, HADS, and MRDQ) within the app. Passive data included step count, walking speed, typing speed, total words typed, autocorrections, and sentiment word categories (anxiety, down, and health-related terms). Feasibility indices included enrolment, retention, and completeness of passive datastreams. Twenty-five participants were enrolled and 92% were retained at 12 months. Seventeen participants met the validity threshold for HealthKit data and 16 also met SensorKit thresholds. Median daily step count was 6,087, walking speed 1.18 m/s, and typing speed 2.19 characters/s. Age was negatively correlated with typing speed and anxiety-related word use, and photopic peripheral visual difficulty was negatively correlated with anxiety- and down-related word use. Digital phenotyping using OverSight was feasible over 12 months. Exploratory analyses suggest that mobility, typing behaviour and sentiment markers may represent useful adjunctive indicators of functional vision and psychological outcomes in patients with IRDs.
Wang, L.; Yang, Y.; Ng, T. K.; Chen, J.; Sun, X.
Purpose: To identify the ocular biometric parameters associated with refractive outcomes in Chinese primary angle-closure glaucoma (PACG) patients receiving phacoemulsification and intraocular lens (IOL) implantation (PEI) surgery. Methods: 165 Chinese PACG patients receiving PEI with goniosynechialysis (GSL) and 53 cataract control patients receiving PEI alone were recruited. The prediction accuracy of IOL power calculation was assessed by the prediction error (PE), mean absolute error (MAE), median absolute error (MedAE), and proportions of eyes with a PE within ±0.25 diopters (D), ±0.50 D, ±0.75 D, and ±1.00 D. The associations of different ocular biometric parameters with the PE of IOL calculation were evaluated. Results: The PACG patients had a significantly higher absolute PE than the control subjects, especially the acute PACG patients. The axial length (AL), the change in aqueous depth pre- and post-surgery (ΔAD), and the ratio ΔAD/AL were significantly associated with the PE in acute PACG patients. The association of ΔAD with the PE of IOL power calculation was found in PACG patients with AL ≥ 22 mm. Conclusions: This study revealed the association of AL and ΔAD with the PE of IOL calculation in Chinese PACG patients. Precise prediction of ΔAD is necessary for acute PACG patients, especially those with AL ≥ 22 mm, to improve refractive outcomes.
Rhode, L.; Reddy, K. N.; Ibukun, F.; Kuyyadiyil, S.; Jain, E.; Parmar, G. S.; Chellappa, R.; Shekhawat, N. S.
Purpose: To develop and evaluate deep learning models for automated detection of corneal perforation in microbial keratitis using anterior segment optical coherence tomography (ASOCT) images. Methods: We enrolled 150 patients with microbiologically confirmed keratitis. Contralateral healthy eyes served as controls. Four convolutional neural network models using ResNet architecture were trained and evaluated using ASOCT images to classify the presence or absence of corneal perforation at the eye level. Ground truth labels for perforation were established following consensus grading by two masked ophthalmologist graders. Models differed in inclusion of healthy controls and masking of non-corneal anterior segment anatomy. Results: The best-performing model (Model 1), which included healthy controls and randomly applied masking of the inferior image portion during training, achieved an AUC of 0.965 (95% CI, 0.911-0.995), sensitivity of 84.0% (95% CI, 70.0%-97.1%), and specificity of 97.8% (95% CI, 96.1%-99.3%) for detection of corneal perforation. Models including healthy controls outperformed those without, and lens masking improved discrimination. Conclusions: Deep learning models achieved high diagnostic accuracy for detecting corneal perforation on ASOCT imaging in eyes with microbial keratitis. These findings support the potential role of automated ASOCT analysis as a clinical decision support tool for identifying this vision-threatening complication.
Hagen, L. A.; Svarverud, E.; Krastina, I.; MacKenzie, G.; Baraas, R. C.
Purpose: To assess the repeatability of a prototype super acuity test chart for measuring visual acuity at 12.5 cm, and its ability to detect hyperopia in adolescents and young adults. Methods: Repeatability was estimated as the within-subject standard deviation of three repeated super acuity measurements performed in 41 university students (19-26 years). Associations between super acuity and cycloplegic refractive errors, ocular biometry, distance visual acuity, accommodation, age, and sex were assessed in 119 high school students (16-18 years) using linear mixed-effects models. ROC curves and the Youden index were used to estimate the best super acuity thresholds to detect rest hyperopia. Results: Mean super acuities in the university and high school cohorts were 0.14 ± 0.13 and 0.12 ± 0.11 logMAR, respectively. Repeatability was 0.031. Super acuity was poorer in those with uncorrected hyperopia (spherical equivalent refractive error (SER) ≥ 1.00 D) than in the others (SER < 1.00 D; P = 0.039). There were significant associations between poorer super acuity and more positive ametropia (SER; P = 0.026), poorer accommodation amplitude (P < 0.001), shorter axial length (P = 0.013), male sex (P < 0.001), and age (P = 0.037). Sensitivity and specificity for detecting hyperopia (SER ≥ 1.00 D) were 63.2% and 64.2%, respectively, at a super acuity threshold of 0.09 logMAR. Discussion: The super acuity prototype shows promise as a screening indicator for hyperopia. Further studies are needed to optimize the test and testing protocol, and to assess its ability to detect uncorrected hyperopia in children.
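The Youden-index threshold selection described in the Methods maximizes J = sensitivity + specificity - 1 over candidate cut-offs. A hedged sketch with invented logMAR values (higher = poorer super acuity, which the Results associate with hyperopia):

```python
# Sketch of Youden-index threshold selection: for each candidate cut-off t,
# J(t) = sensitivity(t) + specificity(t) - 1; pick the t maximizing J.
# Acuity values (logMAR) below are illustrative, not the study's data.
def best_threshold(hyperopes, non_hyperopes, thresholds):
    """Cut-off (acuity >= t flags hyperopia) maximizing Youden's J."""
    def j(t):
        sens = sum(x >= t for x in hyperopes) / len(hyperopes)
        spec = sum(x < t for x in non_hyperopes) / len(non_hyperopes)
        return sens + spec - 1
    return max(thresholds, key=j)

hyperopes     = [0.20, 0.15, 0.12, 0.10, 0.05]        # SER >= 1.00 D group
non_hyperopes = [0.12, 0.08, 0.06, 0.04, 0.02, 0.00]  # SER < 1.00 D group

print(best_threshold(hyperopes, non_hyperopes, [0.00, 0.05, 0.09, 0.15]))
```

On this toy data the procedure picks 0.09 logMAR, the same form of cut-off the study reports (with sensitivity 63.2% and specificity 64.2% in their cohort).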
Dolin, P.; Keogh, K. A.; Rowell, J.; Edmonds, C.; Kielar, D.; Meyers, J.; Esterberg, E.; Nham, T.; Chen, S. Y.
Purpose: We evaluated healthcare resource utilization (HCRU) and costs in patients with eosinophilic granulomatosis with polyangiitis (EGPA). Methods: Patients with newly diagnosed EGPA (2017–2021), ≥12 months' pre-diagnosis health plan enrollment, and ≥1 inpatient or ≥2 outpatient claims with an EGPA diagnosis were included. Follow-up was from EGPA diagnosis until disenrollment or database end. HCRU and health insurer payment costs during follow-up were compared with those for matched cohorts of general insured patients without EGPA (comparison A) and without EGPA but with severe uncontrolled asthma (SUA; comparison B). Results: In comparison A, all-cause HCRU was higher in the EGPA cohort (n = 213) versus matched patients (n = 779) for all clinical encounter/pharmacy claim types; annualized mean total all-cause costs were 16-fold higher ($117,563/patient) versus matched patients ($7,520/patient). In comparison B, all-cause HCRU was higher for the EGPA cohort (n = 182) versus the matched SUA cohort (n = 640) for all clinical encounter/pharmacy claim types, with 5-fold higher mean total all-cause costs ($118,127/patient vs $22,286/patient). In both EGPA cohorts, HCRU and associated costs increased between the baseline and follow-up periods. Conclusions: These findings highlight the need for more effective treatments to reduce the clinical and economic burden of EGPA.
SHI, M.; Afolabi, S. O.
Abstract Background Diabetic retinopathy (DR) is one of the leading causes of vision loss and blindness. AI models have provided an alternative to real-life medical treatment, which is costly and sometimes not readily available in developing and underdeveloped nations. However, most existing AI models are developed with high-quality clinical images, which makes them difficult to use in low-resource settings. This research bridges that gap by developing a low-resource, mobile-friendly, deployable deep learning (DL) model for DR detection using an imbalance-aware optimal transport (OT) learning approach. Methods We trained our proposed framework on both high-quality hospital-grade images and low-resource smartphone-acquired images, and evaluated it on the original test set from the smartphone domain. We also curated three levels of smartphone image-degradation quality and report results from multiple experiments with bootstrapping. All model evaluations used AUC, sensitivity, and specificity. Our results were compared with empirical risk minimization (ERM), prototype OT, and Sinkhorn OT methods. Results We assessed four strong backbone architectures. With our framework, MobileViT-S achieved the best performance: an AUC of 87%, sensitivity of 89%, and specificity of 95%. The statistical significance test (95% CI) shows the AUC results lie in the range of approximately 84% to 89%; for sensitivity, the range is 81% to 96%, and for specificity, 93% to 96%. This indicates a performance increase of more than 3-5% compared with baseline methods. Conclusion Our framework shows promising results for low-resource DR screening, with the potential to benefit less-advantaged groups and developing nations. Keywords Diabetic retinopathy, cost-effective AI, optimal transport, smartphone screening, deep learning.
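One of the baselines named above is Sinkhorn OT. For readers unfamiliar with it, entropic-regularised optimal transport rescales a Gibbs kernel K = exp(-C/eps) until the transport plan's row and column sums match the source and target marginals. A generic textbook sketch, not the paper's framework (all names and numbers here are illustrative):

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=200):
    """Entropic-regularised OT via Sinkhorn iterations: alternately
    rescale K = exp(-C/eps) so the plan's marginals match a (rows)
    and b (columns)."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        # combined update: v = b / (K^T u), then u = a / (K v)
        u = a / (K @ (b / (K.T @ u)))
    v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]   # the transport plan P

a = np.array([0.5, 0.5])                  # source marginal
b = np.array([0.5, 0.5])                  # target marginal
C = np.array([[0.0, 1.0], [1.0, 0.0]])   # pairwise transport cost
P = sinkhorn(a, b, C)
```

With this cost matrix the plan concentrates on the cheap diagonal; larger `eps` would blur it toward the independent coupling, which is the usual speed/accuracy trade-off of the Sinkhorn baseline.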
Fleet, D. M.; Messenger, A.; Bryden, A.; Harris, M. J.; Holmes, S.; Farrant, P.; Leaker, B.; Takwale, A.; Oakford, M.; Kaur, M.; Mowbray, M.; Macbeth, A.; Gangwani, P.; Gkini, M. A.; Jolliffe, V.
Background In clinical trials for alopecia areata (AA), the treatment effect (percentage of hair loss) is estimated using the Severity of Alopecia Tool (SALT) score. Trials in patients with severe AA (≥50% hair loss) employed a Local rating of the SALT score, performed at trial sites by different investigators. However, in mild-to-moderate AA (≤50% hair loss), where SALT scores are lower, potential inter-rater variability and margin of error may compromise the results. Objectives To compare Centralised and Local measurement of hair loss in mild-to-moderate AA. Methods In a Phase 2 clinical trial, a Centralised measurement of hair loss was performed from photographic images taken using a standardised protocol and professional camera equipment. Local scoring was also undertaken at screening/baseline for eligibility. We assessed the repeatability of the Central system (screening vs baseline values), the reproducibility of the Central versus the Local rating system, and the potential impact of each method on the endpoints using a Monte Carlo simulation method. Results There was good agreement and consistency of scoring with Central rating, which provided much smaller margins of error, 50% lower than Local rating. The simulations demonstrated that substituting Local rating for Central rating would reduce the likelihood of a statistically significant outcome by at least 50%, depending on the SALT score-defined clinical response endpoint. Conclusions Central rating is most appropriate at the Phase 2 learning stage of clinical development; it provides an accurate representation of the quantity of hair loss, minimising error and ensuring consistency in measurements.
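The endpoint analysis above uses Monte Carlo simulation to ask how rater measurement error erodes the chance of a statistically significant trial outcome. A generic sketch of that idea, not the trial's actual model (the effect size, standard deviations, sample size, and the two-sample z-test are all illustrative assumptions):

```python
import math
import numpy as np

rng = np.random.default_rng(42)

def sim_power(n_per_arm, effect, rater_sd, n_sims=2000, alpha=0.05):
    """Monte Carlo estimate of the fraction of simulated trials whose
    two-sample z-test on SALT change is significant, when the change
    score is blurred by additional rater measurement error."""
    hits = 0
    for _ in range(n_sims):
        # biological variability of 10 SALT points, plus rater noise on top
        trt = rng.normal(effect, 10, n_per_arm) + rng.normal(0, rater_sd, n_per_arm)
        ctl = rng.normal(0.0, 10, n_per_arm) + rng.normal(0, rater_sd, n_per_arm)
        se = math.sqrt(trt.var(ddof=1) / n_per_arm + ctl.var(ddof=1) / n_per_arm)
        z = (trt.mean() - ctl.mean()) / se
        p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        hits += p < alpha
    return hits / n_sims

central = sim_power(30, effect=8, rater_sd=2)    # small measurement error
local = sim_power(30, effect=8, rater_sd=10)     # larger measurement error
```

Under these made-up numbers, the arm with the larger rater error yields noticeably fewer significant trials, which is the qualitative effect the abstract reports for Local versus Central rating.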