Bright-field to fluorescence microscopy image translation for cell nuclei health quantification

Ruixiong Wang*, Daniel Butt, Stephen Cross, Paul Verkade, Alin Achim

*Corresponding author for this work

Research output: Contribution to journal > Article (Academic Journal) > peer-review


Microscopy is a widely used method in biological research to observe the morphology and structure of cells. Amongst the plethora of microscopy techniques, fluorescent labeling with dyes or antibodies is the most popular method for revealing specific cellular organelles. However, fluorescent labeling also introduces new challenges to cellular observation: it increases the workload, and the process may result in nonspecific labeling. Recent advances in deep visual learning have shown that there are systematic relationships between fluorescence and bright-field images, thus facilitating image translation between the two. In this article, we propose the cross-attention conditional generative adversarial network (XAcGAN) model. It employs state-of-the-art generative adversarial network (GAN) techniques to solve the image translation task. The model uses supervised learning and incorporates attention-based networks to exploit spatial information during translation. In addition, we demonstrate the successful application of XAcGAN to infer the health state of translated nuclei from bright-field microscopy images. The results show that our approach achieves excellent performance both in terms of image translation and nuclei state inference.
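The abstract names cross-attention as the core mechanism the generator uses to relate spatial locations across feature maps. As an illustration only, the sketch below shows scaled dot-product cross-attention in NumPy, where queries from one set of features attend to keys and values from another; the function name, shapes, and the idea of decoder features attending to encoder features are assumptions for illustration, not the paper's actual XAcGAN architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys, values):
    """Scaled dot-product cross-attention (illustrative sketch).

    queries: (Nq, d) features from one stream (e.g. a hypothetical
             decoder feature map, flattened over spatial positions)
    keys, values: (Nk, d) features from another stream
                  (e.g. a hypothetical encoder feature map)
    Returns (Nq, d) attended features.
    """
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)   # (Nq, Nk) similarities
    weights = softmax(scores, axis=-1)       # each row sums to 1
    return weights @ values                  # weighted mix of values

# Toy usage: 4 query positions attend over 6 key/value positions.
rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))
k = rng.normal(size=(6, 8))
v = rng.normal(size=(6, 8))
out = cross_attention(q, k, v)
```

In a conditional GAN for image translation, a mechanism of this kind lets each output location aggregate information from all input locations rather than only a local receptive field, which is the spatial-information aspect the abstract refers to.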
Original language: English
Article number: e12
Journal: Biological Imaging
Publication status: Published - 15 Jun 2023
