COVID
○ MDPI AG
Preprints posted in the last 90 days, ranked by how well they match COVID's content profile, based on 12 papers previously published here. The average preprint has a 0.05% match score for this journal, so anything above that is already an above-average fit.
Abumueis, S. I.; Alqadi, S.; Al Tarteer, A.; Alrefai, W.; Alzoughool, F.; Jew, S.; Qudah, T.
Background: Vitamin D supplementation has been investigated for potential associations with cardiometabolic risk factors related to cardiovascular disease (CVD); however, findings from randomized controlled trials (RCTs) remain inconsistent. This meta-analysis aimed to assess the effects of vitamin D supplementation on cardiometabolic risk factors, including lipid profile, blood pressure, and glycaemic parameters, and to explore whether age and baseline serum vitamin D concentrations modify these associations. Research Design and Methods: We conducted a systematic review and meta-analysis of RCTs comparing oral vitamin D supplementation with placebo in adults, searching PubMed, the Cochrane Library, and ClinicalTrials.gov. Risk of bias was evaluated using the Cochrane tool, and pooled effect sizes with 95% confidence intervals (CIs) were calculated using random-effects models. Results: 14,051 abstracts were retrieved, of which 45 were included in the data analysis. Vitamin D supplementation reduced low-density lipoprotein cholesterol (LDL-C) by 0.136 mmol/L (95% CI: -0.215, -0.056), systolic blood pressure by 2.79 mm Hg (95% CI: -4.648, -0.938), fasting blood glucose by 0.11 (95% CI: -0.185, -0.036), and hemoglobin A1c by 0.164% (95% CI: -0.322, -0.006) compared with placebo. Subgroup analyses revealed reductions in SBP and LDL-C among participants aged ≥55 years and reductions in fasting blood glucose among participants aged <55 years, while favourable effects on fasting blood glucose and hemoglobin A1c were observed at baseline vitamin D concentrations below 50 nmol/L. Conclusions: Vitamin D supplementation may be associated with modest changes in selected cardiometabolic risk factors, including systolic blood pressure, LDL-C, fasting blood glucose, and hemoglobin A1c. Age and baseline vitamin D status appear to modulate these effects. The clinical relevance of these modest effects remains uncertain. 
Well-designed RCTs with standardized protocols are required to clarify potential effect modification by age and baseline vitamin D status. Trial Registration: PROSPERO (CRD42020165293). Funding: This research received funding from the Hashemite University, P.O. Box 330127, Zarqa 13133, Jordan.
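The random-effects pooling this meta-analysis describes can be sketched with the standard DerSimonian-Laird estimator. The effect sizes and standard errors below are hypothetical, not values from the study:

```python
import math

def dersimonian_laird(effects, ses):
    """Pool per-study effect sizes with a DerSimonian-Laird
    random-effects model; returns (pooled estimate, 95% CI)."""
    w = [1 / se**2 for se in ses]                     # inverse-variance weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed)**2 for wi, yi in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                     # between-study variance
    w_re = [1 / (se**2 + tau2) for se in ses]         # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical LDL-C mean differences (mmol/L) from three trials
pooled, ci = dersimonian_laird([-0.10, -0.20, -0.15], [0.05, 0.08, 0.06])
```

The pooled estimate is always a convex combination of the study effects, so it lies within their range; heterogeneity (tau²) widens the interval rather than shifting the point estimate.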
Mack, J.; Li, M.; Hurford, A.
The risk of highly pathogenic avian influenza infection to humans is challenging to estimate because many human avian influenza virus (AIV) infections go undetected: infected people may be asymptomatic or symptomatic but untested, and contact tracing is difficult because human-to-human spread is rare. We derive equations that consider the evolutionary mechanisms that give rise to pandemics and are parameterized to be consistent with records of past pandemics. We estimate that thousands of human AIV infections occur worldwide in an average year and estimate the infection fatality ratio as 32 deaths per 10,000 infections (95% confidence interval: [9.6, 75]). We estimate that preventing 20% of animal-to-human influenza spillovers annually would delay pandemic emergence by an average of 9.4 years. There is a high level of uncertainty in our estimates given the few records of past pandemics, but even so this infection fatality ratio is comparable to that of SARS-CoV-2 during the recent pandemic and higher than that of seasonal human influenza. Preventing human infections with AIV is necessary given the high risk of severe outcomes to individuals and to reduce the risk of future pandemics.
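The reported delay can be rationalized with a back-of-the-envelope model in which pandemic emergence is a Poisson process whose rate scales with the annual number of spillovers. This is not necessarily the authors' derivation, and the 37.6-year baseline waiting time below is back-calculated from the abstract's figures purely for illustration:

```python
def emergence_delay(baseline_wait_years, reduction):
    """Extra expected waiting time to pandemic emergence if spillovers
    (and hence the emergence rate) drop by `reduction`, assuming a
    Poisson emergence process: delay = T0 * (1/(1-r) - 1)."""
    rate_scale = 1.0 - reduction  # new emergence rate relative to baseline
    return baseline_wait_years * (1.0 / rate_scale - 1.0)

# Illustrative: a 37.6-year baseline wait and a 20% spillover reduction
# reproduce the abstract's 9.4-year average delay.
delay = emergence_delay(37.6, 0.20)
```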
Demir, T.; Tosunoglu, H. H.
In this research, we develop a new fractional-order SEIHRD framework to examine how the Nipah virus moves from one species to another (zoonotic spillover) and how it subsequently spreads within a community through person-to-person contact or within hospital and quarantine settings. The fractional-derivative formulation of the SEIHRD model captures memory-based effects in the progression of infection and reflects time-distributed effects of the surveillance and control measures applied to infected patients. We first establish that the basic epidemiological properties of the model are consistent by showing that solutions of the SEIHRD differential equations remain positive and bounded within biologically relevant parameter ranges. We then establish the well-posedness of the model by transforming the SEIHRD differential equations into an equivalent integral operator and applying fixed-point arguments to show that a unique solution always exists. To evaluate the threshold for transmission of Nipah virus within a population, we calculate the basic reproduction number using the next-generation method, which gives the expected number of secondary infections generated by a newly infected host. A main contribution of this work is an analysis of the robustness of solutions to perturbations (Ulam-Hyers and generalized Ulam-Hyers stability). In addition, we provide analytic results guaranteeing that small perturbations arising from approximate modeling, numerical discretization, or imperfect data produce only controlled deviations in the solutions. 
Finally, we perform a global sensitivity analysis to quantify each uncertain parameter's contribution to variability in the major outcome metrics and to identify the parameters that most strongly influence them. This supports prioritization among spillover control (reducing human-animal contact), contact reduction, and allocation of resources toward isolation-related interventions. The resulting framework turns fractional epidemic modeling from a descriptive simulation into a reproducible method with well-characterized behavior and verifiable predictions.
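The next-generation calculation the abstract refers to can be illustrated on a plain integer-order SEIR skeleton with two infected compartments (E and I); this is a deliberate simplification of the paper's fractional SEIHRD model, and the parameter values are illustrative:

```python
import math

def r0_seir(beta, sigma, gamma):
    """Basic reproduction number via the next-generation method for a
    simple SEIR skeleton: R0 = spectral radius of F V^-1, where F holds
    new-infection terms and V holds transitions, restricted to (E, I)."""
    F = [[0.0, beta], [0.0, 0.0]]        # new infections enter E at rate beta*I
    V = [[sigma, 0.0], [-sigma, gamma]]  # E -> I at sigma, I -> removed at gamma
    det = V[0][0] * V[1][1] - V[0][1] * V[1][0]
    Vinv = [[V[1][1] / det, -V[0][1] / det],
            [-V[1][0] / det, V[0][0] / det]]
    K = [[sum(F[i][k] * Vinv[k][j] for k in range(2)) for j in range(2)]
         for i in range(2)]
    tr = K[0][0] + K[1][1]
    dK = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    return (tr + math.sqrt(tr * tr - 4 * dK)) / 2  # spectral radius of 2x2 K
```

For this skeleton the spectral radius reduces to the familiar beta/gamma, so the numerics can be checked against the closed form.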
Wagle, U.; Sirur, F. M.; Lath, V.; Lingappa, D. J.; R, R.; Kulkarni, N. U.; Kamath, A.
Background: The hump-nosed pit viper is a recognized but neglected medically significant species causing morbidity and mortality, with no specific antivenom available. There are many gaps in our understanding of its envenomation, including burden, clinical syndrome, complications, and management. Methodology: This study is a retrospective sub-analysis of the prospective VENOMS registry and hospital records of hump-nosed pit viper envenomation from a single tertiary care center in coastal Karnataka from May 2018 to March 2024. Epidemiology, syndrome, complications, and treatment strategies are described. A linear mixed model analysis was conducted to study the effect of different therapeutic interventions on venom-induced consumptive coagulopathy (VICC). Principal Findings: Of 46 cases, 24 patients had VICC. The most common complications were AKI (21.7%), TMA (10.9%), and stroke (4.4%). Anaphylaxis to ASV (23.9%) was the most common therapeutic complication. Therapeutic interventions included ASV, administration of blood products, and therapeutic plasma exchange (TPE), along with supportive care. The linear mixed model revealed that administration of blood products (p<0.001) had the strongest influence on the INR value, although it often produced only a transient decline. ASV (p=0.052) produced only a marginally significant change in INR. The role of TPE could not be statistically inferred; however, individual cases with severe VICC improved without complications, so it warrants further study and can be considered in critical cases. Conclusions/Significance: This study describes the syndrome of hump-nosed pit viper envenomation, highlights the urgent need for a species-specific antivenom, and recommends treatment strategies that can be used in the interim. Additionally, geo-spatial mapping draws attention to hotspots and supports the hypothesis that HNPV in coastal Karnataka have regionally distinct toxicity trends.
Khalid, S.; Hassan, M.
Background: Consanguineous unions are marriages between individuals who are blood relatives. Researchers worldwide have examined the prevalence and effects of consanguinity across regions. This research was conducted in District Faisalabad, upper Punjab. Objective: To determine the rate of consanguinity, the coefficient of inbreeding (F), and its impacts. Methods: Data were collected from six tehsils of District Faisalabad by interviewing subjects over a period of six months. A total of 2,366 subjects were interviewed after giving informed consent. Results: The rate of consanguinity was 41.83%, with a coefficient of inbreeding of 0.03053. The highest rate of consanguinity (23.36%) was among first cousins. Distantly related and unrelated unions accounted for 35.64% and 22.56%, respectively. The rate of consanguineous unions across the six tehsils ranged from 33.99% in Jaranwala to 53.85% in Tandlianwala. Consanguineous marriages were more common among Punjabi-speaking subjects, housewives, reciprocal marital types, grandparent and one-couple family types, and the Rajpoot caste. No significant difference in consanguinity was found between rural and urban areas. The rate of stillbirths was high (82.25%) in consanguineous unions, while neonatal, post-neonatal, and child mortality were as low as 6.45%, 8.06%, and 3.22%, respectively. Prenatal mortality was slightly higher (44.94%) in consanguineous than in non-consanguineous unions. The congenital malformation rate was 6.29% across all marital unions, but was higher in consanguineous (59.06%) than in non-consanguineous unions (40.93%). This is a pilot study analyzing the potential of the inbreeding coefficient (F) in District Faisalabad.
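A population coefficient of inbreeding of the kind reported here is conventionally the weighted average of the offspring inbreeding coefficients of each union type. The union categories below are the textbook ones (not necessarily the study's exact classification), and the proportions in the example are hypothetical:

```python
# Standard inbreeding coefficients for offspring of each union type
F_BY_UNION = {
    "first cousin": 1 / 16,
    "first cousin once removed": 1 / 32,
    "second cousin": 1 / 64,
    "unrelated": 0.0,
}

def mean_inbreeding(proportions):
    """Population mean inbreeding coefficient F = sum over union types
    of (proportion of unions of that type) * (offspring F for that type)."""
    return sum(p * F_BY_UNION[union] for union, p in proportions.items())

# Hypothetical mix: 25% first-cousin unions, 75% unrelated
F = mean_inbreeding({"first cousin": 0.25, "unrelated": 0.75})
```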
Nilforoshan, H.; Reisler, J.; Jahanparast, E.; Moor, M.; Goodman, S.; Wager, S.; Leskovec, J.
COVID-19 has been shown to cause a range of harmful long-term effects on nearly every organ system1-3. These findings are based on retrospective studies comparing COVID-19 patients to patients with similar medical histories and demographics but no COVID-19 diagnosis4-16. However, concerns have emerged that these comparisons may be biased if COVID-19 patients had unrelated health conditions or other factors not recorded in their medical records17-21. Here, using a massive dataset of 14.4 billion health insurance claims from 244.7 million U.S. patients, we find that the large majority of long-term effects attributed to COVID-19 by conventional study methods are likely due to bias from selective testing. This bias arises because individuals with non-COVID health conditions producing long-term symptoms were more likely to seek care and be tested for COVID-19; as a result, their non-COVID symptoms are attributed to COVID-19. We develop a study design that reduces this bias by considering only individuals who have taken a COVID-19 PCR test, and then comparing similar patients whose first test came back positive versus negative. In this way the COVID-19 patients and the control group are more similar, and non-COVID factors play a smaller role. We examine 614 clinical outcomes over a two-year follow-up period and find an order-of-magnitude smaller, but still clinically significant, number of long-term effects attributable to COVID-19, persisting for up to one year after infection. We confirm that the long-term effects of COVID-19 span many organ systems, including the respiratory, cardiovascular, musculoskeletal, and integumentary systems, but are significantly narrower in scope and duration than previously believed. Although some symptoms persist more than one year after COVID-19 infection, they occur at similar rates in individuals who tested negative and are therefore not attributable to COVID-19 infection. 
Our findings pinpoint the specific long-term effects of COVID-19 and show how large-scale data can be used to enable careful evaluation and design of population health studies.
Hannan, M. A.; Selim, S.; Uddin, A. S. M. M.; Rana, M. M.
Background: Millions of people worldwide suffer from thyroid dysfunction, especially hypothyroidism, a prevalent endocrine disorder contributing extensively to systemic and metabolic illness. In hypothyroidism, triiodothyronine (T3) and thyroxine (T4), the thyroid hormones that regulate metabolism across organ systems, are insufficiently secreted. Objectives: To determine the effect of anti-thyroid antibodies on thyroid function in newly diagnosed Bangladeshi patients with hypothyroidism. Methods: A cross-sectional analysis of adult patients with newly diagnosed hypothyroidism was carried out. Thyroid function tests (FT4, TSH), thyroid autoantibodies (anti-TPO, anti-Tg), symptoms, physical findings, and demographics were obtained and analyzed. Results: The average age of the study participants was 36.07±11.00 years, and 70.1% were female; 72.7% of the cases were rural. Overall, 89% of patients were antibody-positive: 81.8% anti-TPO, 55.2% anti-Tg, and 48.1% both. Thyroid enlargement (p<0.001) and weight gain (p<0.043) were associated with antibody positivity. Grade 1 goitre alone was highly predictive of antibody positivity (AOR 11.766, p<0.001). Neither FT4 nor TSH correlated significantly with antibody titers; a significant correlation, however, was noted between anti-Tg and anti-TPO titers. Conclusion: Newly diagnosed hypothyroid patients are usually antibody-positive, especially anti-TPO positive, typically accompanied by goitre and a family history. Even when thyroid function tests are inconclusive, screening for thyroid antibodies can support early diagnosis and a better understanding of the disease process.
Harris, M. J.; Arani, A.; Goel, T.; Zhang, K.; Beckett, S. J.; Lo, N. C.; Dushoff, J.; Weitz, J. S.
Declining vaccine coverage across the United States has increased the risk of outbreaks of vaccine-preventable diseases. Even when vaccines have low primary failure rates, conventional epidemic theory predicts a strongly nonlinear, positive relationship between vaccine coverage and the fraction of breakthrough infections in vaccinated individuals. These breakthrough infections may generate misconceptions that vaccines are not working and accelerate declines in confidence and coverage. Here, we set out to test predictions of conventional epidemic theory, which assumes random mixing between individuals irrespective of vaccine status. In contrast to expectations from random mixing models, we find a far lower fraction of breakthrough infections in measles outbreak data from seven states in the United States. To explore this discrepancy, we evaluate an alternative compartmental disease model that accounts for preferential mixing (assortativity) between people with the same vaccination status. The model predicts significantly lower fractions of breakthrough infections, consistent with observations from measles outbreak data. Next, we leverage the deviation between statewide and school-level MMR vaccine coverage across kindergartens in sixteen states, finding substantial assortativity in all cases. Our model accounting for preferential mixing predicts that the total number of breakthrough infections is nonlinear in coverage, peaking at intermediate coverage below vaccine-derived herd immunity. Nationally, 94% of counties that report MMR coverage are above the model-predicted breakthrough-maximizing coverage, suggesting that they are at risk of increasing breakthrough infections if coverage declines. Vaccination outreach and monitoring campaigns should develop proactive strategies to contextualize breakthrough infections before low levels of primary vaccine failure contribute to population-scale increases in preventable disease. 
Significance Statement: Infections among vaccinated people may exacerbate vaccine hesitancy. Despite reports of breakthrough infections within ongoing measles outbreaks, we show that the realized fraction of breakthrough infections is far lower than predicted by conventional epidemic theory assuming random mixing. Combining epidemic models and school-level vaccine coverage data, we show that breakthrough infections may be partially limited by preferential mixing (assortativity) based on vaccination status. Given current MMR coverage, most of the country is at risk of increasing breakthrough infections if vaccine coverage declines further. We conclude that enhanced vaccine monitoring and outreach campaigns are needed to confront a potential positive feedback loop between increasing breakthrough infections and declining vaccine coverage that could substantially increase the burden of vaccine-preventable disease.
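The strongly nonlinear relationship between coverage and the vaccinated share of cases under random mixing can be seen from the textbook all-or-nothing formula. This sketch deliberately omits the assortative mixing central to the paper, and the 95%/97% values in the example are illustrative, not the study's estimates:

```python
def breakthrough_fraction(coverage, efficacy):
    """Expected fraction of infections occurring in vaccinated people
    under random mixing with an all-or-nothing vaccine: all susceptibles
    face the same attack rate, so breakthroughs are proportional to the
    vaccinated share of the susceptible pool."""
    susceptible_vax = coverage * (1 - efficacy)   # vaccinated but unprotected
    susceptible_unvax = 1 - coverage              # unvaccinated
    return susceptible_vax / (susceptible_vax + susceptible_unvax)

# Illustrative: at 95% coverage and 97% efficacy, random mixing predicts
# more than a third of cases occur in vaccinated people.
frac = breakthrough_fraction(0.95, 0.97)
```

Because the unvaccinated pool shrinks toward zero as coverage rises, the vaccinated share of cases climbs steeply at high coverage, which is exactly the nonlinearity the abstract contrasts with the much lower observed fractions.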
Animasahun, A. B.; Lawani, F.; Adekunle, M. O.; Animasahun, G. A.; Ariyibi, A. A.; Hughes-Darden, C.
Malnutrition is a deficiency of both macro- and micronutrients. It is common in children with congenital heart defects due to reduced intake, poor gastrointestinal absorption resulting from gut hypoperfusion, and increased metabolism. The present study aimed to determine the prevalence and predictors of hypovitaminosis D in children with congenital heart defects aged 1 to 12 years compared with apparently healthy controls. Methods: A comparative cross-sectional study was conducted from July to November 2020 involving 115 children with congenital heart disease and 115 apparently healthy controls matched for age, sex, and socio-economic class. A self-designed proforma was used to collect information on subjects' biodata, socio-economic class, health information, morbidity pattern, frequency and duration of sunlight exposure, and a 48-hour dietary recall of vitamin D-containing foods. Weight, height, and waist and hip circumferences were measured; BMI, waist-hip ratio, and weight-for-age, height-for-age, and BMI-for-age Z-scores were calculated. Blood samples were collected for measurement of serum 25-hydroxyvitamin D levels. Hypovitaminosis D encompassed both vitamin D deficiency and insufficiency, defined as serum 25(OH)D levels of less than 20 ng/ml and 20.1-29.9 ng/ml, respectively. Descriptive and inferential analysis was carried out using SPSS version 27. Results: The prevalence of hypovitaminosis D in subjects was 59.1%, with a median (interquartile range) of 28.13 ng/ml (21.2-42.3 ng/ml), while the prevalence in healthy controls was 41.7%; the difference was statistically significant (p=0.008). There was no statistically significant difference in the prevalence of hypovitaminosis D between acyanotic and cyanotic subjects. Age greater than 3 years, female sex, and duration of diagnosis greater than 48 months were independent predictors of hypovitaminosis D in subjects. 
Conclusion: Children with congenital heart defects should have their serum 25-hydroxyvitamin D measured at intervals and may benefit from treatment if found to have vitamin D deficiency.
Ong'era, E. M.; Katama, E. N.; Nyiro, J. U.; Lambisia, A. L.; Morobe, J. M.; Murunga, N.; Mwasya, S.; Mutunga, M.; Lewa, C.; Githinji, G.; Bejon, P.; Sande, C. J.; Kagucia, E. W.; Delicour, S.; Nokes, D. J.; Munywoki, P. K.; Holmes, E. C.; Agoti, C. N.
Background: Respiratory syncytial virus (RSV) is a leading cause of severe acute respiratory infection in infants, young children, and vulnerable adults. Despite implications for designing interventions, our understanding of RSV infection/reinfection patterns during community outbreaks is incomplete. Methods: To characterize respiratory virus infections regardless of symptom status, we performed prospective cohort surveillance in coastal Kenya from August 2023 to August 2024. Nasopharyngeal/oropharyngeal (NP/OP) swabs were collected 1-2 times weekly regardless of symptom status for quantitative PCR testing followed by genomic analysis. RSV reinfections were defined as two positive tests separated by ≥14 days with ≥1 intervening negative test. Results: Of 672 individuals screened, 74 tested positive (93/22,000 swabs; 0.4%). The median age among infected individuals was 4.2 years (interquartile range (IQR): 1.8-9.4), with 58.1% female, versus a median age of 14.3 years (IQR: 4.8-29.6) and 64.4% female among uninfected individuals. The overall incidence rate was 19.8 infections/100 person-years, with the highest incidence among infants (174.0/100 person-years, 95% CI: 103.0-274.0). Infection episodes fell into seven viral lineages: A.D (n=1), A.D.1 (n=2), A.D.1.11 (n=21), A.D.2.1 (n=19), A.D.3 (n=30), A.D.5.2 (n=1), and B.D.E.1 (n=2). Six individuals (8.1%; 13.7/100 person-years) experienced reinfections: three involved the same virus with 0-3 nucleotide differences across the entire RSV genome, while the other three had 20, 78, and 200 nucleotide differences. The (suspected) reinfected individuals were all under 2 years of age, included both males and females, and had no reported chronic illnesses. Conclusion: RSV community infections predominantly occur in children regardless of clinical presentation. Reinfections within the same season are rare. Key points: In a community cohort prospective study in coastal Kenya, RSV-A predominated the 2023/24 epidemic and seven lineages co-circulated. 
Overall incidence was 19.6 infections/100 person-years and highest in infants. Most reinfections (5/6) were asymptomatic, and only half involved amino acid changes.
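Incidence rates of this kind are typically computed as events per person-time, with a Poisson-based interval. The sketch below uses a normal approximation for the confidence interval, and the event and person-time counts are hypothetical, not the cohort's actual totals:

```python
import math

def incidence_rate(events, person_years, per=100):
    """Incidence rate per `per` person-years with a rough 95% CI,
    treating the event count as Poisson (normal approximation)."""
    rate = events / person_years * per
    se = math.sqrt(events) / person_years * per  # Poisson SE of the count, scaled
    return rate, max(0.0, rate - 1.96 * se), rate + 1.96 * se

# Hypothetical: 76 infection episodes over 384 person-years of follow-up
rate, lo, hi = incidence_rate(76, 384)
```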
Noorkhalisah, N.; Arisanti, R. R.; Ramtana, S. D.; Sitaresmi, M. N.
Pneumonia remains a leading cause of global child mortality. Following the Pneumococcal Conjugate Vaccine (PCV) introduction in Yogyakarta, Indonesia, uptake for the primary series (PCV1 and PCV2) exceeded 90%. However, PCV3 coverage remained suboptimal (60% in 2023; 75% in 2024), indicating significant dropout. This study aimed to identify determinants of PCV immunization completeness and timeliness to address this gap. We conducted a cross-sectional study using cluster sampling among 405 caregivers of children aged 13-37 months in Yogyakarta City in March 2025. Data were collected via structured digital questionnaires assessing socio-demographics, perinatal conditions, knowledge, support systems, and attitudes toward multiple injections. Multivariate logistic regression was employed to determine factors associated with PCV immunization completeness and timeliness. Of 398 participants (98.3% response rate), the majority were female (95.7%) and housewives (75.1%). The prevalence of PCV completeness was 66.3%, while timeliness was only 36.4%. Multivariate analysis revealed that acceptance of multiple injections was the strongest predictor for both completeness (aOR 49.18; 95% CI: 21.30-113.50) and timeliness (aOR 22.04; 95% CI: 6.55-74.08). Additionally, home ownership (aOR 1.93; 95% CI: 1.04-3.58) was associated with completeness, whereas high knowledge (aOR 1.85; 95% CI: 1.12-3.03) improved timeliness. Conversely, preterm birth was significantly associated with lower odds of timeliness (aOR 0.29; 95% CI: 0.09-0.88). Acceptance of multiple injections emerged as the most critical modifiable factor for both outcomes. To optimize the PCV program, health authorities should prioritize counselling strategies to alleviate parental concerns regarding multiple injections. Additionally, intensified monitoring for preterm infants is crucial to mitigate immunization delays.
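The adjusted odds ratios above come from multivariate logistic regression. As a simpler illustration of how an odds ratio and its Wald confidence interval are formed, here is the crude (unadjusted) calculation from a 2×2 table; the counts are hypothetical and the crude OR would differ from the study's covariate-adjusted aORs:

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio with 95% Wald CI from a 2x2 table:
       a = exposed with outcome,   b = exposed without outcome,
       c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Hypothetical: completeness by acceptance of multiple injections
or_, lo, hi = odds_ratio(50, 50, 25, 75)
```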
Maleva, J. J.; Linkanti, V. E.; Yongolo, M. A.; Sebogo, Y. D.; Mallya, E. F.; Kimario, E. F.; Msafiri, J. G.; Kameka, C. T.; Sekelwa, C. N.; Temba, V. M.; Msafiri, E. A.; Mwalim, A. H.; Felcian, E. B.; Mwampale, E.; Rashid, F.; Lyimo, B.
Campylobacter jejuni is a zoonotic bacterium causing foodborne gastroenteritis in humans worldwide, with diarrhea as the main symptom. Infection is most severe in children and in immunocompromised individuals. The bacterium has become increasingly resistant to antibiotics, especially the first-choice drugs used to treat campylobacteriosis, posing a significant threat to treatment outcomes. The burden of campylobacteriosis and antimicrobial resistance (AMR) remains significant, with limited genomic surveillance. This study aimed to characterize the resistome (antimicrobial resistance genes, ARGs), virulence factors, and population structure of C. jejuni across hosts (Homo sapiens, milk from dairy cattle, goat, Bos indicus, Ovis aries, and Gallus gallus) in three countries (Ethiopia, Kenya, and Tanzania) using a One Health whole-genome sequencing (WGS) approach. A total of 161 publicly available C. jejuni genome sequences were retrieved from the NCBI database and analyzed using established bioinformatics pipelines covering genome assembly and annotation, AMR gene identification via ResFinder/ABRicate, and virulence gene detection via ABRicate/VFDB. Gene distribution and population structure were visualized using heatmaps, Venn diagrams, principal component analysis (PCA), and minimum spanning trees for comparative analysis. Of the 161 C. jejuni genomes, 130 (80.75%) were positive for one or more ARGs. The resistome was dominated by β-lactam resistance genes (blaOXA-193, blaOXA-61, blaOXA-184, and blaOXA-489). Two tetracycline resistance genes (tet(O/32/O) and tet(O)) were found in Ethiopia and Tanzania, while the aminoglycoside resistance gene ant(6)-Ia was the least detected. Potential Gallus gallus-to-Homo sapiens (zoonotic) transmission was suggested by overlapping ARGs (blaOXA-193 and tet(O)) and PCA clustering. Conserved virulence gene profiles (cadF, jlpA, cdtA, cdtB, cdtC, and flagellar genes) were shared by all isolates. 
The present study adds to current knowledge of the molecular epidemiology and AMR development of C. jejuni in Eastern African countries and globally. The findings underscore the need for sustained, region-specific genomic surveillance under a One Health framework to inform AMR stewardship and public health interventions.
Yu, R.; Teichmann, P. N. N.; Shimizu-Jozi, A.; Luo, J. Y.; Arora, R. K.; Duarte, N.; Wagner, C. E.
The timeliness of infectious disease surveillance systems largely determines the speed at which mitigation interventions may be implemented. However, it is unclear how surveillance timeliness evolves during a pandemic with changing government policies, testing tools, and population-level infection and immunity landscapes. Here, we adapt an agent-based model of COVID-19 transmission to explore the timeliness of the surveillance signals obtained from polymerase chain reaction (PCR) and rapid antigen (RAT) tests relative to true infection incidence. Across different pandemic scenarios, we investigate how surveillance timeliness depends on the prevalence of co-circulating influenza-like illnesses (ILI) and on test quality. If only PCR tests are available with symptom-based eligibility, and if tests can detect post-recovery residual viral load, then a surveillance lag may emerge that is amplified by ILI prevalence. When limited RATs are introduced with symptom-based eligibility, and PCR eligibility requires a recent positive RAT, RAT/PCR timeliness is sensitive to ILI prevalence but insensitive to RAT failure probability. With unrestricted RAT supply, PCR timeliness varies with both ILI prevalence and RAT failure probability. Our work highlights how the timeliness of test-based surveillance signals can evolve throughout a pandemic, with important implications for interpreting real-time surveillance data and designing more effective, data-driven surveillance systems.
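One simple way to quantify the kind of surveillance lag discussed here (quite apart from the paper's agent-based model) is to shift the reported test-positive series against true incidence and pick the shift with maximal correlation. The three-day lag in the example is synthetic:

```python
def best_lag(incidence, reported, max_lag=14):
    """Estimate the surveillance lag (in days) as the shift that maximizes
    the Pearson correlation between true incidence and the reported series."""
    def corr(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return sxy / (sx * sy)
    return max(range(max_lag + 1),
               key=lambda L: corr(incidence[:len(incidence) - L], reported[L:]))

# Synthetic epidemic curve; "reported" is the same curve delayed by 3 days.
incidence = [1, 1, 2, 3, 5, 8, 13, 21, 30, 38, 42, 40, 33, 25,
             18, 12, 8, 5, 3, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
reported = [0, 0, 0] + incidence[:-3]
```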
Eilersen, A.; Poder, S. K.; Grenfell, B. T.; Simonsen, L.
In 1798, Jenner's smallpox vaccine made it possible to prevent the deadliest of childhood diseases. In Denmark the vaccine was used from 1801, and by 1810 a mandatory one-dose childhood vaccination program was instituted, free of charge. As proof of vaccination (or natural immunity) was required for church confirmation around age 13, about 90% of children were vaccinated, and smallpox disappeared from Copenhagen after 1808. After a 16-year "honeymoon period", it returned in 1824 with a new face: a milder disease affecting mostly young adults (1, 2). Here we investigate the effects of smallpox vaccination on epidemic patterns through the post-honeymoon era (1824-1875). We accessed data from the hospital "Sokvaesthuset", where all smallpox cases, mild and severe, were hospitalized during 1824-1835 in order to contain the outbreak. We identified ~3,000 smallpox cases and four separate epidemics during this period (1-3). We used a mechanistic (SEIR) model to assess factors explaining the return of smallpox and the changing age distribution, including vaccination coverage, the duration of immunity from vaccination and from natural infection, and the fate of the "lost generation" of people born around 1800, too early to be vaccinated and too late to have been infected with smallpox. Our model tracks well the disappearance and return of smallpox in 1824, the interval between epidemic peaks, and the aging pattern. We propose vaccine waning after ~20 years as the primary explanation for the return of smallpox and the epidemic pattern. Significance: Smallpox has played a major role in shaping modern medicine. Recently, it has received renewed attention due to fears of bioterrorism and the emergence of the closely related mpox. In this article, we use data from the carefully recorded smallpox outbreaks in Copenhagen in the 1800s to study smallpox dynamics following vaccine rollout. 
We show that the vaccine likely induced a long-lived but finite immunity and that the "lost generation" who were neither vaccinated nor had contracted smallpox in childhood continued to be plagued by the disease in the following decades. The study is relevant for understanding how smallpox was eradicated and the role of vaccination in dealing with present epidemic threats.
Houy, N.; Flaig, J.
Using the example of an unknown emerging disease with simple SIR (susceptible-infectious-recovered) dynamics, we show that an efficacy randomized clinical trial (RCT) for a vaccine can be misleading when it comes to the cost-effectiveness of that vaccine. An RCT is more likely to demonstrate efficacy with high confidence if it is carried out during the peak of the outbreak; however, in this scenario, the vaccine also has a higher chance of being approved too late to be cost-effective. A vaccine is more likely to be cost-effective if vaccination is implemented in the early stages of an epidemic, but an RCT is more likely to fail to demonstrate efficacy if it is implemented too early, that is, when disease transmission is still low.
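The timing trade-off can be made concrete with a minimal SIR simulation; the parameters (R0 = 3) and Euler discretization below are illustrative assumptions, not values from the paper:

```python
def simulate_sir(beta=0.3, gamma=0.1, i0=1e-4, days=300, dt=0.1):
    """Discrete-time (Euler) SIR sketch; returns daily infectious prevalence.
    beta = transmission rate, gamma = recovery rate (so R0 = beta/gamma)."""
    s, i, r = 1.0 - i0, i0, 0.0
    out = []
    steps_per_day = int(1 / dt)
    for _day in range(days):
        for _ in range(steps_per_day):
            new_inf = beta * s * i * dt
            new_rec = gamma * i * dt
            s -= new_inf
            i += new_inf - new_rec
            r += new_rec
        out.append(i)
    return out

traj = simulate_sir()
peak_day = traj.index(max(traj))  # case accrual for an RCT is fastest here
```

An RCT enrolling around `peak_day` accumulates endpoints quickly (favoring a confident efficacy result), but by then most of the epidemic's preventable burden has already occurred, which is the cost-effectiveness tension the abstract describes.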
Cheng, I.-H.; Huang, X.; Wang, Y.-H.; Hung, Y.-m.; Wei, J. C.-C.
Background: The safety of Paxlovid and Molnupiravir in COVID-19 patients with autoimmune diseases remains unclear, particularly concerning the risk of interstitial lung disease (ILD). Objective: To evaluate the risk of developing ILD among COVID-19 patients with autoimmune diseases treated with Paxlovid or Molnupiravir. Design: Retrospective cohort study. Setting: Data from the US Collaborative Network in TriNetX. Patients: 18,384 COVID-19 patients with pre-existing autoimmune diseases. Interventions: Treatment with Paxlovid or Molnupiravir within five days of COVID-19 diagnosis. Measurements: ILD diagnosis confirmed by ICD-10-CM codes and radiographic evidence. Results: ILD occurred in 54 patients in the Paxlovid group and 79 patients in the Molnupiravir group (HR: 0.73, 95% CI: 0.52-1.03), indicating no statistically significant difference. Subgroup analyses by age, sex, and race showed consistent results. Limitations: The observational design limits causal inference; residual confounding is possible. Conclusion: Treatment with Paxlovid or Molnupiravir does not significantly increase ILD risk in COVID-19 patients with autoimmune diseases. Primary Funding Source: Chung Shan Medical University Hospital. Registration: Not applicable. Key messages: What is already known on this topic: The safety of COVID-19 antiviral treatments, particularly nirmatrelvir-ritonavir (Paxlovid) and Molnupiravir, in patients with autoimmune diseases has been poorly understood. Patients with autoimmune conditions are considered high-risk due to compromised immune systems and potential susceptibility to complications such as ILD. Previous studies have generally excluded or underrepresented patients with autoimmune diseases, creating a significant knowledge gap regarding the safety of these treatments for this vulnerable population. 
C_LI What this study addsO_LIThis study provides evidence that there is no statistically significant difference in the risk of developing interstitial lung disease between COVID-19 patients with autoimmune diseases treated with nirmatrelvir-ritonavir or Molnupiravir. C_LIO_LIIt reinforces that both antiviral treatments can be used safely regarding the risk of interstitial lung disease in patients with pre-existing autoimmune conditions, addressing an important gap in the literature. C_LI How this study might affect research, practice, or policyO_LIThe findings could influence clinical guidelines and policy decisions regarding the management of COVID-19 in patients with autoimmune diseases, suggesting that healthcare providers can use either nirmatrelvir-ritonavir or Molnupiravir without an increased risk of interstitial lung disease. C_LIO_LIThis study might prompt further research into the long-term effects of COVID-19 treatments on different subpopulations, particularly those with chronic underlying conditions. C_LIO_LIPolicymakers might consider these results when developing targeted recommendations for COVID-19 treatment in populations at increased risk for severe outcomes. C_LI
Bents, S. J.; Bubar, K. M.; Park, H. J.; Tan, S. T.; Baker, R.; Mordecai, E. A.; Lo, N. C.
Show abstract
In the three years since Omicron emergence, SARS-CoV-2 dynamics have exhibited persistent twice-yearly waves in the United States, peaking in late summer and winter, with heterogeneity in timing and intensity across states. This semiannual pattern sharply contrasts with typical annual respiratory pathogen dynamics in the US, yet the underlying mechanisms, and whether the pattern will persist, remain poorly understood. Here, we tested several hypothesized mechanisms and found that a combination of waning immunity, the climatic factors of relative humidity and temperature, variant activity, and vaccination captured divergent patterns in COVID-19 hospitalization incidence across 10 US states from January 2022 through November 2024. Applying a compartmental disease model, we identified waning infection-derived immunity as the dominant driver of semiannual SARS-CoV-2 dynamics, with climate factors shaping the timing and magnitude of seasonal waves across US states. Scenario analyses indicated that if infection-derived immunity remains short in duration, semiannual dynamics influenced by climate are likely to persist, with attenuation in severe disease over time. In contrast, more durable infection-derived immunity, or a slower rate of immune-evading viral evolution, could lead to an epidemiologic transition to annual dynamics. In some states, summer waves approached the magnitude of winter waves, likely reflecting local climatic influences on transmission and suggesting that optimal vaccination strategies may vary by state. These findings have broad implications for understanding epidemic dynamics and informing vaccine policy, including seasonal timing and two-dose vaccine schedules for high-risk persons.
Nayeem, J.; Salek, M. A.; Nayeem, J.; Hossain, M. S.; Kabir, M. H.
Show abstract
To characterize tuberculosis transmission and assess the impact of important interventions, a data-driven SEITR TB model is created. The potential for disease persistence is assessed using the basic reproduction number. Stability and sensitivity analyses are conducted to determine the factors that most significantly affect the spread of tuberculosis. According to numerical simulations, strengthened treatment measures and optimized distancing significantly lower infection levels. The model solution is validated against real epidemiological data using a least-squares fitting technique, and the results emphasize that the best combinations of social distancing and treatment not only reduce the number of infections but also provide a cost-effective strategy for public health planning. Additionally, two techniques, Pearson correlation and partial rank correlation coefficients (PRCC), are used to assess the sensitivity of the model parameters; the two methods yield closely agreeing sensitivity results.
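The least-squares calibration step mentioned above can be sketched generically. This is a toy SIR-based illustration with synthetic data and a made-up true parameter, not the authors' SEITR model: simulate incidence for candidate transmission rates and keep the one minimizing the sum of squared errors against the observations.

```python
# Toy least-squares calibration sketch (synthetic data; not the SEITR model):
# recover a transmission rate beta by grid search over the squared error.
def prevalence(beta, gamma=0.1, i0=0.01, days=60):
    """Daily infectious prevalence from a coarse (dt = 1 day) SIR model."""
    s, i = 1.0 - i0, i0
    out = []
    for _ in range(days):
        out.append(i)
        new = beta * s * i
        s -= new
        i += new - gamma * i
    return out

observed = prevalence(beta=0.25)  # synthetic "data" with a known true beta

def sse(beta):
    """Sum of squared errors between model output and observations."""
    return sum((m - o) ** 2 for m, o in zip(prevalence(beta), observed))

grid = [b / 100 for b in range(10, 51)]  # candidate betas 0.10 .. 0.50
best_beta = min(grid, key=sse)
```

Because the synthetic data are noise-free and the true value lies on the grid, the search recovers it exactly; with real data one would minimize the same objective with a continuous optimizer and noisy observations.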
Greischar, M. A.; Childs, L. M.
Show abstract
Pathogenic organisms are typically thought to be constrained by a tradeoff between the rate and duration of transmission, an assumption that underpins a considerable body of evolutionary theory. Here we test for a transmission-duration tradeoff using detailed historical malaria infection data from an era prior to widespread use of antibiotics when humans were deliberately infected with malaria parasites as treatment for neurosyphilis (malariatherapy). These time series follow individual human infections until recovery or treatment with antimalarial drugs due to acute need (a proxy for virulence), and include data on the abundance of specialized transmission stages that can be used to project parasite fitness. We fit a model to estimate initial parasite multiplication rates (PMRs) and find that faster within-host multiplication extends infection duration (time until recovery) and enhances parasite fitness without a discernible cost, such as increased virulence. Initial PMRs exhibit strain-specific differences, a feature required for evolution by natural selection, but our results contradict the idea that the evolution of human malaria parasites is constrained by a transmission-duration tradeoff. Significance statement: Pathogenic organisms are usually assumed to face a tradeoff such that aggressively exploiting host resources enables more efficient transmission but at the cost of shorter infections. If such a classic transmission-duration tradeoff is not general, then it is not clear what prevents pathogenic organisms from evolving to exploit their hosts ever more aggressively. We use historical data from human malaria infections to show a remarkable lack of evidence for a transmission-duration tradeoff, since faster parasite multiplication tends to prolong infections and enable more efficient transmission.
Therefore, classic predictions regarding the evolution of infection-induced harm to hosts may not apply to human malaria parasites, and efforts to locate general evolutionary constraints on pathogenic organisms should look beyond a classic tradeoff. Funding: This work was supported by the Cornell University College of Agriculture & Life Sciences (M.A.G.). L.M.C. was partially supported by the National Science Foundation Grant # 2144680. Data availability: All supporting data and code are provided as supplemental files. Note: For ease of reviewing, this MS includes all elements (including figures) embedded in the text. If accepted, we would be happy to provide elements as separate files.
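Estimating a multiplication rate from early time-series counts, in spirit though not in detail like the PMR fitting described in this abstract, is commonly done with a log-linear least-squares fit. Everything below is a synthetic illustration, not the authors' fitted model: during early exponential growth, counts follow N_t = N_0 * m^t, so log(N_t) is linear in t with slope log(m).

```python
import math

# Toy log-linear fit (synthetic data; not the authors' within-host model):
# ordinary least squares on log counts recovers the per-cycle fold change m.
def fit_multiplication_rate(counts):
    """Return the estimated fold change per time step from positive counts."""
    t = list(range(len(counts)))
    y = [math.log(c) for c in counts]
    n = len(counts)
    t_bar, y_bar = sum(t) / n, sum(y) / n
    slope = (sum((ti - t_bar) * (yi - y_bar) for ti, yi in zip(t, y))
             / sum((ti - t_bar) ** 2 for ti in t))
    return math.exp(slope)

# Synthetic parasite densities growing eightfold per replication cycle:
densities = [10 * 8 ** t for t in range(5)]
pmr_estimate = fit_multiplication_rate(densities)
```

Fitting on the log scale keeps the regression linear and prevents the largest counts from dominating the fit, which is why log-linear estimation is the standard first pass for exponential-growth rates.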
Mbugua, G. W.; Kanyiri, C.
Show abstract
Cervical cancer remains a significant cause of mortality and economic burden, particularly in developing countries with low rates of human papillomavirus (HPV) vaccination and screening. To address this, we present a mathematical model for controlling cervical cancer by integrating strategic HPV vaccination, screening, and treatment. The population is divided into seven compartments: susceptible, vaccinated, infected with HPV, screened, cervical cancer, under treatment, and recovered. The model's well-posedness is first established by proving the boundedness and non-negativity of solutions, ensuring biological relevance. The basic reproduction number R0 is computed using the next-generation matrix. The local and global stability of the disease-free equilibrium is analysed using the Jacobian matrix and a Lyapunov function, respectively. Furthermore, bifurcation analysis is performed using the Castillo-Chavez and Song theorem, and sensitivity analysis is conducted on key parameters to identify their influence on disease dynamics. Numerical simulations of the model support the analytical results. The findings indicate that if the reproduction number is less than one, the solution converges to the disease-free state, signifying the asymptotic stability of the HPV and cervical cancer free steady state. Crucially, the model demonstrates that increasing vaccination, screening, and treatment rates significantly reduces HPV and cervical cancer incidence. This study underscores the value of mathematical modeling in informing public health policy and provides a framework for optimizing control measures against HPV and cervical cancer.
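The next-generation matrix computation mentioned in this abstract can be illustrated on a simpler model. The sketch below uses a standard SEIR model, not the seven-compartment HPV model: R0 is the spectral radius of F V^{-1}, where F collects the new-infection terms and V the transition terms of the infected compartments (E, I).

```python
# Next-generation matrix sketch for a standard SEIR model (illustrative;
# not the seven-compartment HPV model). Infected compartments (E, I):
#   F = [[0, beta], [0, 0]]            new infections enter E at rate beta*I
#   V = [[sigma, 0], [-sigma, gamma]]  E progresses to I; I recovers
# For SEIR this yields the familiar closed form R0 = beta / gamma.
def ngm_r0(beta, sigma, gamma):
    """R0 as the spectral radius of K = F V^{-1} for the SEIR model."""
    f = [[0.0, beta],
         [0.0, 0.0]]
    # Hand-computed inverse of V = [[sigma, 0], [-sigma, gamma]]:
    v_inv = [[1.0 / sigma, 0.0],
             [1.0 / gamma, 1.0 / gamma]]
    k = [[sum(f[i][m] * v_inv[m][j] for m in range(2)) for j in range(2)]
         for i in range(2)]  # K = F V^{-1}
    # Spectral radius of a 2x2 matrix via the eigenvalue formula:
    tr = k[0][0] + k[1][1]
    det = k[0][0] * k[1][1] - k[0][1] * k[1][0]
    disc = max(tr * tr - 4.0 * det, 0.0) ** 0.5
    return max(abs((tr + disc) / 2.0), abs((tr - disc) / 2.0))

r0 = ngm_r0(beta=0.3, sigma=0.2, gamma=0.1)  # beta/gamma = 3
```

The same recipe scales to larger models like the seven-compartment one: linearize the infected subsystem at the disease-free equilibrium, split it into F and V, and take the spectral radius of F V^{-1}, usually with a numerical eigenvalue routine once the matrices exceed 2x2.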