Efficient and Secure μ-Training and μ-Fine-Tuning for TinyML Optimization and Personalization at the Edge

Huang, Z.; Yu, L.; Herbozo Contreras, L. F.; Kavehei, O.

2025-02-04 · cardiovascular medicine
DOI: 10.1101/2025.01.30.25321374 · medRxiv

This study presents a novel, computationally efficient training framework demonstrated through bio-signal processing on edge medical devices. The approach integrates conventional full training with an innovative μ-Training technique, wherein the encoder and decoder of a compact model remain frozen while only the middle layer is updated. This design is further enhanced by a novel Future-Guided Self-Distillation mechanism that leverages the model's anticipated future state during training to boost performance and improve generalization on unseen data, using electrocardiogram (ECG) signals as the primary case study. Additionally, μ-Fine-Tuning facilitates on-device adaptation under resource-constrained conditions. We validate our framework using in-sample data from the Telehealth Network of Minas Gerais (TNMG) and out-of-sample testing on the China Physiological Signal Challenge 2018 (CPSC) dataset. Experimental results demonstrate that our integrated strategy (combining full training, self-distilled μ-Training, and μ-Fine-Tuning) consistently matches or surpasses conventional methods while significantly improving computational efficiency and mitigating catastrophic forgetting. Deployment on Radxa Zero hardware underscores the approach's practical applicability and scalability. Moreover, a demonstration incorporating the proposed self-distilled μ-Training into standard training procedures reveals performance improvements. This highlights the technique's potential for broader applications beyond medical diagnostics and TinyML systems, paving the way for its integration into existing training mechanisms to elevate overall model performance.
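The freezing scheme at the core of μ-Training — encoder and decoder fixed, only the middle layer updated — can be sketched in a few lines of PyTorch. This is a minimal illustration under assumed layer shapes and a generic reconstruction loss; the model class, dimensions, and loss here are hypothetical and not taken from the paper.

```python
import torch
import torch.nn as nn

class TinyECGModel(nn.Module):
    """Illustrative encoder-middle-decoder model; sizes are placeholders."""
    def __init__(self, in_dim=64, hidden=16):
        super().__init__()
        self.encoder = nn.Linear(in_dim, hidden)
        self.middle = nn.Linear(hidden, hidden)   # the only trainable part in μ-Training
        self.decoder = nn.Linear(hidden, in_dim)

    def forward(self, x):
        return self.decoder(torch.relu(self.middle(torch.relu(self.encoder(x)))))

model = TinyECGModel()

# Freeze encoder and decoder; leave the middle layer trainable.
for module in (model.encoder, model.decoder):
    for p in module.parameters():
        p.requires_grad = False

# The optimizer only sees the middle layer's parameters, so each update
# step touches a small fraction of the model's weights.
optimizer = torch.optim.Adam(model.middle.parameters(), lr=1e-3)

# One illustrative update on random data with a reconstruction loss.
x = torch.randn(8, 64)
loss = nn.functional.mse_loss(model(x), x)
loss.backward()
optimizer.step()
```

Because gradients are neither computed for nor applied to the frozen layers, both the backward pass and the optimizer state shrink accordingly, which is what makes this style of update attractive on resource-constrained edge hardware.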

Matching journals

The top 3 journals account for 50% of the predicted probability mass.

Rank  Journal                                            Papers in training set  Percentile  Probability
 1    IEEE Transactions on Biomedical Engineering                         38     Top 0.1%       27.7%
 2    Medical Image Analysis                                              33     Top 0.1%       18.7%
 3    Nature Communications                                             4913     Top 28%         6.4%
 4    Journal of Neural Engineering                                      197     Top 0.5%        6.3%
 5    PLOS ONE                                                          4510     Top 39%         3.6%
 6    IEEE Journal of Biomedical and Health Informatics                   34     Top 0.5%        3.6%
 7    Scientific Reports                                                3102     Top 37%         3.6%
 8    IEEE Access                                                         31     Top 0.3%        1.9%
 9    npj Digital Medicine                                                97     Top 2%          1.8%
10    Nature Medicine                                                    117     Top 2%          1.7%
11    iScience                                                          1063     Top 16%         1.7%
12    Advanced Science                                                   249     Top 12%         1.5%
13    Sensors                                                             39     Top 1%          1.5%
14    eLife                                                             5422     Top 49%         1.2%
15    Biology Methods and Protocols                                       53     Top 2%          0.8%
16    Nature Machine Intelligence                                         61     Top 3%          0.8%
17    JACC: Clinical Electrophysiology                                    11     Top 0.3%        0.8%
18    Computers in Biology and Medicine                                  120     Top 5%          0.7%
19    PLOS Digital Health                                                 91     Top 3%          0.7%
20    IEEE Transactions on Medical Imaging                                18     Top 0.5%        0.7%
21    Nature Biomedical Engineering                                       42     Top 2%          0.6%
22    PNAS Nexus                                                         147     Top 3%          0.6%
23    Computer Methods and Programs in Biomedicine                        27     Top 1%          0.6%
24    European Heart Journal - Digital Health                             15     Top 0.7%        0.6%