Generation of Patient Specific Cardiac Chamber Models Using Generative Neural Networks Under a Bayesian Framework for Electroanatomical Mapping
Document Type
Article
Publication Date
3-2025
Publisher
Springer
Source Publication
Journal of Statistical Theory and Practice
Source ISSN
1559-8608
Original Item ID
DOI: 10.1007/s42519-024-00416-0
Abstract
Electroanatomical mapping is a technique used in cardiology to create a detailed 3D map of the electrical activity in the heart. It is useful for diagnosis, treatment planning, and real-time guidance in cardiac ablation procedures to treat arrhythmias such as atrial fibrillation. A probabilistic machine learning model trained on a library of CT/MRI scans of the heart can be used during electroanatomical mapping to generate a patient-specific 3D model of the chamber being mapped. Using probabilistic machine learning models under a Bayesian framework provides a way to quantify uncertainty in the results and a natural framework for interpreting the model. Here we introduce a Bayesian approach to surface reconstruction of cardiac chamber models from sparse 3D point-cloud data acquired during electroanatomical mapping. We show how probabilistic graphical models trained on segmented CT/MRI data can be used to generate cardiac chamber models from a few acquired locations, thereby reducing procedure time and X-ray exposure. We also show how they provide insight into what the neural network learns from the segmented CT/MRI images used to train it, lending explainability to the resulting cardiac chamber models.
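As one hedged illustration of the idea (not the paper's actual model), the sketch below uses a linear-Gaussian shape model, with PCA standing in for the generative neural network. Given a few observed coordinates of a shape (the "sparse point cloud"), the Gaussian posterior over the latent code is available in closed form: its mean gives the full reconstructed shape, and its covariance quantifies the uncertainty that a Bayesian framework provides. All data here are synthetic and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "training library": 200 shapes, each a flattened set of coordinates.
n_train, n_coords, k = 200, 150, 5
true_W = rng.normal(size=(n_coords, k))
train = rng.normal(size=(n_train, k)) @ true_W.T \
    + 0.05 * rng.normal(size=(n_train, n_coords))

# Learn the mean shape and k principal modes (the stand-in generative model).
mu = train.mean(axis=0)
_, S, Vt = np.linalg.svd(train - mu, full_matrices=False)
W = Vt[:k].T * (S[:k] / np.sqrt(n_train))   # (n_coords, k) shape basis
sigma2 = 0.05 ** 2                          # assumed observation noise variance

# Sparse acquisition: only 12 of the 150 coordinates are measured.
obs_idx = rng.choice(n_coords, size=12, replace=False)
true_z = rng.normal(size=k)
full_shape = mu + W @ true_z
y = full_shape[obs_idx] + 0.05 * rng.normal(size=obs_idx.size)

# Closed-form Gaussian posterior over the latent code z given sparse y:
#   z | y ~ N(m, C),  C = (I + W_o^T W_o / sigma2)^{-1},
#   m = C W_o^T (y - mu_o) / sigma2
W_o = W[obs_idx]
C = np.linalg.inv(np.eye(k) + W_o.T @ W_o / sigma2)
m = C @ W_o.T @ (y - mu[obs_idx]) / sigma2

recon = mu + W @ m                                       # full reconstruction
recon_var = sigma2 + np.einsum("ij,jk,ik->i", W, C, W)   # per-coordinate variance

print("reconstruction RMSE:", float(np.sqrt(np.mean((recon - full_shape) ** 2))))
```

The paper's generative network replaces the linear basis `W` with a learned nonlinear decoder, but the same structure carries over: a prior learned from the CT/MRI library, a likelihood for the sparsely acquired points, and a posterior whose spread quantifies reconstruction uncertainty.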
Recommended Citation
Mathew, Sunil; Sra, Jasbir; and Rowe, Daniel B., "Generation of Patient Specific Cardiac Chamber Models Using Generative Neural Networks Under a Bayesian Framework for Electroanatomical Mapping" (2025). Mathematical and Statistical Science Faculty Research and Publications. 159.
https://epublications.marquette.edu/math_fac/159
Comments
Journal of Statistical Theory and Practice, Vol. 19, No. 1 (March 2025). DOI: 10.1007/s42519-024-00416-0.