High-fidelity bioimage restoration via adversarial learning
Munoz-Barrutia, A.; Lachowski, D.; Rey-Paniagua, G.
Live-cell microscopy restoration is constrained by a trade-off between inference latency and texture preservation. While diffusion models provide high textural fidelity, the computational cost of iterative sampling currently limits their use in low-latency instrument feedback loops. Here, we present NAFNet GAN, a restoration framework that couples an activation-free backbone with a perceptual adversarial objective to enable high-throughput analysis. Unlike diffusion architectures, NAFNet GAN achieves an inference latency of ~110 ms for 1024 × 1024 inputs, making it potentially suitable for real-time feedback during acquisition. Across eight datasets ranging from STED nanoscopy to histopathology, the method achieves the lowest Learned Perceptual Image Patch Similarity (LPIPS) scores in 7 of 8 benchmarks while preserving structural coherence (e.g., MS-SSIM > 0.968 on cryo-EM), facilitating reliable downstream analysis. Supported by performance benchmarks in the AI4Life Denoising Challenge, NAFNet GAN restores structural features from low-photon-budget acquisitions while maintaining the temporal resolution required for dynamic live-cell workflows.
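The "perceptual adversarial objective" mentioned above typically combines a pixel-reconstruction term with a GAN generator loss. The abstract does not specify the exact formulation, so the following is a minimal NumPy sketch under common assumptions: an L1 pixel term plus a non-saturating generator loss, with a hypothetical weighting `lambda_adv` (not stated in the source).

```python
import numpy as np

def softplus(x):
    # Numerically stable log(1 + exp(x)).
    return np.logaddexp(0.0, x)

def restoration_loss(restored, target, d_logits, lambda_adv=0.01):
    """Hypothetical combined objective: L1 pixel loss plus a
    non-saturating GAN generator term, -log sigmoid(D(G(x))).

    `d_logits` are discriminator logits for the restored image;
    `lambda_adv` is an illustrative weight, not from the paper.
    """
    pixel = np.mean(np.abs(restored - target))
    adv = np.mean(softplus(-d_logits))  # -log sigmoid(logit)
    return pixel + lambda_adv * adv

# Toy usage: a perfect restoration leaves only the adversarial term.
rng = np.random.default_rng(0)
target = rng.random((16, 16))
loss = restoration_loss(target, target, d_logits=np.array([0.0]))
print(loss)  # 0.01 * log(2), since the pixel term vanishes
```

In practice the pixel term anchors structural fidelity (reflected in the MS-SSIM results) while the adversarial term sharpens texture (reflected in LPIPS); the balance between them is a tuning choice not detailed in the abstract.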