FeMA: Feature Matching Auto-encoder for Predicting Ischaemic Stroke Evolution and Treatment Outcome

Research output: Contribution to journal › Article (Academic Journal) › peer-review

1 Citation (Scopus)
17 Downloads (Pure)


Although predicting ischaemic stroke evolution and treatment outcome provides important information and is a step towards individualised treatment planning, classifying functional outcome and modelling brain tissue evolution remain challenging due to data complexity and visually subtle changes in the brain. We propose a novel deep learning approach, the Feature Matching Auto-encoder (FeMA), that consists of two stages: predicting ischaemic stroke evolution at one week without voxel-wise annotation, and predicting ischaemic stroke treatment outcome at 90 days from a baseline scan. In the first stage, we introduce a feature similarity and consistency objective, and in the second stage, we show that adding stroke evolution information increases the performance of functional outcome prediction. Comparative experiments demonstrate that our proposed method is more effective at extracting representative follow-up features and achieves the best results for functional outcome of stroke treatment.
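The abstract's "feature similarity" objective can be illustrated with a minimal sketch. The function below is a generic cosine-based feature-matching loss between a predicted feature vector and the features encoded from the true follow-up scan; the names and the exact formulation are illustrative assumptions, not FeMA's published implementation.

```python
import numpy as np

def feature_matching_loss(f_pred: np.ndarray, f_follow: np.ndarray) -> float:
    """Hypothetical feature-similarity objective: the cosine distance
    between predicted follow-up features and features encoded from the
    actual follow-up scan. Identical directions give a loss near 0;
    orthogonal directions give a loss near 1."""
    f_pred = f_pred / np.linalg.norm(f_pred)
    f_follow = f_follow / np.linalg.norm(f_follow)
    return 1.0 - float(np.dot(f_pred, f_follow))

# Matching feature vectors incur (approximately) zero loss.
print(feature_matching_loss(np.array([1.0, 2.0]), np.array([1.0, 2.0])))
```

Minimising such a loss encourages the encoder of the baseline scan to produce features consistent with those of the one-week follow-up, which is one plausible reading of the "similarity and consistency" objective described above.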
Original language: English
Article number: 102089
Number of pages: 11
Journal: Computerized Medical Imaging and Graphics
Early online date: 11 Jun 2022
Publication status: Published - 20 Jun 2022

Bibliographical note

Funding Information:
The authors would like to thank the MR CLEAN Trial Principal Investigators: Prof Aad van der Lugt, Prof Diederik W.J. Dippel, Prof. Charles B.L.M. Majoie, Prof. Yvo B.W.E.M. Roos, Prof. Wim H. van Zwam and Prof. Robert J. van Oostenbrugge for providing the data. Zeynel Samak gratefully acknowledges funding from the Republic of Turkey Ministry of National Education (Grant MoNE-1416/YLSY).

Publisher Copyright:
© 2022 The Authors


