Paper ID | BIO-12.6 |
Paper Title | ECG HEART-BEAT CLASSIFICATION USING MULTIMODAL IMAGE FUSION |
Authors | Zeeshan Ahmad, Anika Tabassum, Ling Guan, Naimul Khan, Ryerson University, Canada |
Session | BIO-12: Feature Extraction and Fusion for Biomedical Applications |
Location | Gather.Town |
Session Time | Friday, 11 June, 11:30 - 12:15 |
Presentation Time | Friday, 11 June, 11:30 - 12:15 |
Presentation | Poster |
Topic | Biomedical Imaging and Signal Processing: [BIO-MIA] Medical image analysis |
Abstract | In this paper, we present a novel Image Fusion Model (IFM) for ECG heart-beat classification to overcome the weaknesses of existing machine learning techniques that rely either on manual feature extraction or on direct utilization of the raw 1D ECG signal. At the input of the IFM, we first convert ECG heart-beats into three different images using the Gramian Angular Field (GAF), Recurrence Plot (RP) and Markov Transition Field (MTF), and then fuse these images to create a single imaging modality. We use AlexNet for feature extraction and classification, thus employing end-to-end deep learning. We perform experiments on PhysioNet’s MIT-BIH dataset for five different arrhythmias in accordance with the AAMI EC57 standard, and on the PTB Diagnostic dataset for myocardial infarction (MI) classification. We achieve state-of-the-art results in terms of prediction accuracy, precision and recall. |
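The sketch below illustrates the pipeline described in the abstract: transform each 1D heart-beat into GAF, RP and MTF images, fuse them into a single multi-channel image, and classify with AlexNet. It is a minimal illustration, not the authors' implementation: the `pyts` library for the three transforms, channel-wise stacking as the fusion step, bilinear resizing to 224x224, a beat length of 187 samples, and torchvision's AlexNet with a 5-class head are all assumptions; the paper's exact fusion strategy and network configuration may differ.

```python
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from pyts.image import GramianAngularField, MarkovTransitionField, RecurrencePlot
from torchvision.models import alexnet

def heartbeats_to_fused_images(beats, out_size=224):
    """Map a batch of 1D heart-beats (n_beats, n_samples) to 3-channel images
    by stacking the GAF, RP and MTF transforms along the channel axis.
    Channel stacking is assumed here as the fusion step."""
    gaf_img = GramianAngularField().fit_transform(beats)     # (n, T, T)
    rp_img = RecurrencePlot().fit_transform(beats)           # (n, T, T)
    mtf_img = MarkovTransitionField().fit_transform(beats)   # (n, T, T)

    fused = np.stack([gaf_img, rp_img, mtf_img], axis=1)     # (n, 3, T, T)
    fused = torch.from_numpy(fused).float()
    # Upsample to the input resolution expected by AlexNet.
    return F.interpolate(fused, size=(out_size, out_size),
                         mode="bilinear", align_corners=False)

# End-to-end classifier: AlexNet with its final layer replaced for the
# five AAMI heart-beat classes (assumed configuration).
model = alexnet(weights=None)
model.classifier[6] = nn.Linear(4096, 5)

beats = np.random.randn(8, 187)                      # e.g. MIT-BIH beats, 187 samples each
logits = model(heartbeats_to_fused_images(beats))    # (8, 5) class scores
```

Stacking the three transforms as image channels lets a standard 2D CNN consume all three views of the same beat at once; other fusion choices (e.g. averaging or learned fusion layers) would slot into `heartbeats_to_fused_images` in the same place.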