Authors
Joana Maria Ribeiro1,2, MD, MSc; Thijmen Hokken1, MD; Patricio Astudillo3, PhD; Rutger Jan Nuis1, MD, PhD; Ole de Backer4, MD, PhD; Giorgia Rocatello3, PhD; Joost Daemen1, MD, PhD; Nicolas M Van Mieghem1, MD, PhD; Paul Cummins1, MSc; Matthieu de Beule3, PhD; Joost Lumens5, PhD; Nico Bruining1, PhD; Peter PT de Jaegere*1, MD, PhD
1. Department of Cardiology, Thoraxcenter, Erasmus Medical Center, Rotterdam, The Netherlands.
2. Department of Cardiology, Centro Hospitalar e Universitário de Coimbra, Coimbra, Portugal.
3. FEops NV, Ghent, Belgium.
4. Department of Cardiology, Rigshospitalet University Hospital, Copenhagen, Denmark.
5. CARIM School for Cardiovascular Diseases, Maastricht University Medical Center, Maastricht, The Netherlands.
Abbreviations
AI – Artificial intelligence
CT – Computed tomography
DL – Deep learning
LAA – Left atrial appendage
LV – Left ventricle
ML – Machine learning
RCT – Randomized controlled trial
TAVI – Transcatheter aortic valve implantation
Abstract
In the current era of evidence-based medicine, guidelines, and hence clinical decisions, are strongly based on the data provided by large-scale randomized clinical trials. Although such trials provide valuable insights into treatment safety and efficacy, their extrapolation to real-world practice is often hindered by strict inclusion and exclusion criteria. Precision Medicine calls for patient-centred clinical management, focusing on the unique features of each individual, and is of particular importance when invasive procedures are being considered. When applied to transcatheter interventions for structural heart disease, Precision Medicine implies the selection of the right treatment for the right patient at the right time. Artificial intelligence and advanced computer modelling are powerful tools for patient-tailored treatment delivery. In this review, we reflect on how such technologies can be used to enhance Precision Medicine in transcatheter interventions for structural heart disease.
Introduction
The premises of treatment from the patient's and physician's perspective are that the proposed treatment is safe and effective and preferably associated with the lowest possible physical and/or emotional burden. In interventional cardiology, safety can be defined by the absence of peri-procedural complications such as death or stroke, but also of complications from which the patient may not suffer directly, such as conduction abnormalities. Effectiveness relates to short- and long-term clinical outcomes, including device performance and the eventual repair of the adverse cardiac remodelling that has occurred as a result of (longstanding) valve disease in the case of valve interventions, and the absence of stroke in the case of left atrial appendage (LAA) occlusion. In addition to these clinical and quality-oriented outcome measures, health care authorities also require treatment to be affordable and deliverable (i.e. access). In other words, treatment has to be accurate, patient-tailored and embedded in the patient's social environment. The latter is easy to address by engaging patients and their families in the decision-making process. Defining the most accurate treatment is much more complex and demands a new way of thinking.1
Evidence-based medicine is the foundation of contemporary clinical practice and relies heavily on the findings of randomized controlled trials (RCTs). However, because of the inclusion and exclusion criteria, the patients included in RCTs often do not represent the patient seen in one's own practice. Moreover, the selection criteria only partially capture the complex reality of the patient, as many patient-related features are not considered or cannot be accounted for. This justifies the call for Precision Medicine, defined as: “treatments targeted to the needs of an individual patient on the basis of genetic, biomarker, phenotypic or psychosocial characteristics that distinguish a given patient from another patient with a similar clinical presentation”.2,3
Precision Medicine has become possible with the advent of powerful computer systems capable of storing and analysing large datasets, combined with refined algorithms that allow the prediction of outcome clinically (i.e. prognosis), technically (i.e. device/host interaction and, thus, valve performance) and/or physiologically (e.g. intra-cardiac haemodynamics, recovery of left ventricular [LV] function). The digital transformation of health care may therefore be expected to improve not only safety and efficacy but also efficiency, by reducing the time of decision making while enhancing the quality of treatment decisions (utility vs futility) and, thus, cost-effectiveness. The purpose of this paper is to reflect on and demonstrate how artificial intelligence (AI) and advanced computer modelling can be applied in transcatheter interventions for structural heart disease and how they will affect our clinical practice.
Digital Health and Artificial Intelligence
Digital health has been defined as the use of digital technologies for health. The digitalization of medical records, telemedicine, patient monitoring through mobile devices (mobile health) and the remote recruitment of volunteers for research studies are examples of digital health applications in clinical practice.4-7 AI, however, might be the ultimate example of avant-garde digital technology applied to health care.
Artificial intelligence is defined as the capability of a machine to analyse and interpret external data and learn from it in order to achieve goals through flexible adaptation.7-11 It stems from an initiative taken at the Dartmouth Conference in 1956, during which scientists proposed the development of computers able to perform tasks that require human intelligence. Soon thereafter followed the concept of machine learning (ML, 1959), which aimed to develop mathematical methods for classification and was first applied to linear problems (data separable by a straight line). For non-linear problems, neural networks were introduced, initially using a small stack of densely connected layers. Such networks are shallow: the input signal flows through a few layers to the output. By using multiple different layers (for example, the convolutional layer) and thus enlarging the stack of layers, deeper networks were developed, introducing the concept of deep learning (DL). DL was stimulated by the advent of graphics processing units (GPUs), developed for the gaming industry, which are processors capable of performing large amounts of floating-point arithmetic in parallel. AI, and DL in particular, was further stimulated by cloud platforms such as Microsoft Azure, which enable the storage of the large amounts of data needed for the development and implementation of AI algorithms.
Machine learning is a class of AI algorithms by which the machine learns from experience: its performance on a given task improves as it is exposed to more data. A distinction is made between supervised, unsupervised and reinforcement learning (Figure 1).8,10,11 Supervised learning relies on the analysis of human-labelled (annotated) data: the machine is given examples of input-output relationships until it learns the function that best maps the input to the labelled output, and it ultimately becomes capable of predicting the output for unlabelled data. Unsupervised learning, on the other hand, relies on the processing of unlabelled data, minimising the need for human interaction, while looking for patterns in the data. An example of unsupervised learning is cluster analysis, which is increasingly used in medicine. The strength of cluster analysis is that it can analyse large amounts of unlabelled, highly heterogeneous data and differentiate distinct clusters of disease phenotypes (i.e. alike versus not alike). Reinforcement learning combines features of supervised and unsupervised learning: this area of ML focuses on how artificial agents take actions in an environment in order to maximise some notion of cumulative reward.8,10,11
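To make the distinction concrete, below is a minimal, purely illustrative sketch (assuming Python with NumPy and scikit-learn; the synthetic “patients” are random numbers, not clinical data). The same feature matrix is fed to a supervised classifier, which learns from labelled outcomes, and to an unsupervised clustering algorithm, which receives no labels at all:

```python
# Illustrative contrast between supervised and unsupervised learning.
# Synthetic data only: 200 "patients" with 4 random features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))            # input features
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # human-provided labels (outcome)

# Supervised: learn the input-output relationship from labelled examples.
clf = LogisticRegression().fit(X, y)
print("predicted outcomes:", clf.predict(X[:5]))

# Unsupervised: no labels given; the algorithm looks for structure (clusters).
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("cluster assignments:", km.labels_[:5])
```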
Deep learning is a specific type of ML that can, by itself, create the features required to perform the task at hand. Whereas ML requires feature engineering in order to represent the data, DL uses its depth and chain of layers to create these features itself (Figure 2). The created features may include, for example, an edge detector or a calcification detector, but may also include features that are incomprehensible to humans, which is why DL may be considered a black-box model. The combination of all these created features within a model can form non-linear decision boundaries, resulting in models that match (or even surpass) human abilities. For example, Poplin et al.12 presented a DL method to predict cardiovascular risk factors from retinal fundus photographs, information not previously thought to be contained in this imaging modality. The quality of the output of DL (or ML) depends on the quantity, quality and diversity of the input data. Large volumes of heterogeneous, multi-variable data from large samples of patients or populations with heterogeneous profiles, diseases and disease states (“big data”) are needed in order to fulfil the goal of Precision Medicine.10,11 One particular problem with DL is the risk of overfitting, calling for a critical selection of the variables relevant to a given problem.13 Also, the inclusion of a large amount of data increases algorithm complexity and processing time, particularly with supervised learning. For instance, Madani et al.14 needed to label only 4% of the available data to obtain accurate supervised and semi-supervised deep learning algorithms for the automated diagnosis of cardiac disease.
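What “enlarging the stack of layers” looks like in code is sketched below (assuming Python with PyTorch; the architecture and image size are illustrative and not taken from the cited studies). Each convolutional layer learns its own feature detectors, from simple edges in early layers to more abstract patterns deeper in the stack, replacing hand-engineered features:

```python
# Minimal sketch of a deep (convolutional) network; all dimensions illustrative.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),   # learned low-level detectors (e.g. edges)
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 64x64 -> 32x32
    nn.Conv2d(8, 16, kernel_size=3, padding=1),  # deeper, more abstract features
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 32x32 -> 16x16
    nn.Flatten(),
    nn.Linear(16 * 16 * 16, 2),                  # e.g. disease vs no disease
)

x = torch.randn(1, 1, 64, 64)  # one 64x64 single-channel image
print(model(x).shape)          # torch.Size([1, 2])
```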
DL thus requires considerable computing power, ideally delivered at a low level of energy demand. This may be achieved, for instance, by stochastic or probabilistic computing. In contrast to conventional computing, which operates via deterministic systems using information in binary code (0 and 1), probabilistic computing uses probabilistic bits (p-bits): classical entities that interact with each other in a manner analogous to neurons in a neural network.15
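As a hedged illustration of how a single p-bit behaves (a behavioural model commonly used in the p-bit literature, not taken from reference 15; all values illustrative), the output fluctuates randomly between -1 and +1, but its time-averaged value is steered by the input:

```python
# Behavioural model of a probabilistic bit: m = sign(tanh(I) + uniform noise).
# The time-averaged output then tracks tanh(I), so the bit is "steerable".
import numpy as np

rng = np.random.default_rng(2)

def p_bit_mean(input_signal: float, n_samples: int = 100_000) -> float:
    """Average output of a p-bit fluctuating between -1 and +1."""
    noise = rng.uniform(-1.0, 1.0, n_samples)
    return float(np.sign(np.tanh(input_signal) + noise).mean())

for I in (-2.0, 0.0, 2.0):
    print(f"I={I:+.1f}  mean output={p_bit_mean(I):+.2f}")  # approx. tanh(I)
```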
The coupling of DL techniques with modern advanced computer modelling may become the next step in treatment decision making and procedure planning in patients with structural heart disease, thereby stimulating Precision Medicine.
Artificial Intelligence and Advanced Computer Modelling in Clinical Practice
Treatment decision – disease phenotype & prognosis
How could AI help Mrs Jones, who comes to the outpatient clinic because of cardiac valve disease such as aortic stenosis, mitral regurgitation or a combination thereof? Echocardiography provides a series of quantitative measures of her heart (e.g. dimensions of the cavities and myocardium, function [LVEF, strain, etc.]) and valves (e.g. calcification, anatomic and effective orifice area), and MRI may additionally provide information on myocardial fibrosis, regurgitant volume if present, and so on. AI, and in particular cluster analysis, allows the processing of this tremendous amount of data, offering the possibility of identifying phenotypes associated with a benign outcome that do not warrant treatment at this point versus phenotypes associated with an impaired prognosis for which intervention is indicated. Furthermore, cluster analysis may identify phenotypes that convey such an advanced disease state that no treatment would impact prognosis, thus avoiding futile intervention.
Although these advanced computational techniques have not yet been widely implemented, their feasibility has been ascertained.16-20 ML algorithms have been used to identify different phenotypes of aortic stenosis, suggesting that two pathways of disease progression exist and that they may be related to different adaptive mechanisms.20 Primor and colleagues21 used hierarchical cluster analysis based on preoperative clinical and echocardiographic characteristics in patients with severe primary mitral regurgitation to define distinct phenotypes with different surgical outcomes. Cluster analysis has also been applied in atrial fibrillation to identify patients at risk of stroke and other adverse events.22,23 Conceptually, it could also be used to understand which anatomic features of the LAA predispose to thrombus formation and/or dislocation from the LAA. Upon further refinement and validation, AI can and will be used for patient-tailored medicine. It is conceivable that cluster analysis may identify patients or disease phenotypes that are currently not accepted for treatment (e.g. asymptomatic patients or patients with moderate-severe valve disease and preserved LV function) but who may benefit from valve replacement or repair with the primary objective of long-term preservation of LV function.
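A hedged sketch of such an analysis is given below (assuming Python with NumPy and SciPy; the feature names and synthetic values are illustrative assumptions, not data from the cited studies). Standardized echo-like features are grouped by hierarchical (Ward) clustering, and the resulting tree is cut into a small number of candidate phenotypes:

```python
# Illustrative hierarchical cluster analysis on synthetic echo-like features.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(1)
# columns: LVEF (%), LV end-diastolic diameter (mm), regurgitant volume (ml)
features = rng.normal(loc=[55.0, 52.0, 45.0], scale=[8.0, 6.0, 15.0], size=(100, 3))

Z = linkage(zscore(features, axis=0), method="ward")  # Ward linkage on z-scores
phenotype = fcluster(Z, t=3, criterion="maxclust")    # cut the tree into 3 phenotypes
print("patients per phenotype:", np.bincount(phenotype)[1:])
```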
Treatment Planning
A crucial step in procedure planning is device selection (type and size). With respect to sizing, current surgical practice is based upon in situ probing of the target anatomy, whereas device selection for catheter-based procedures is more intricate, since it relies on the analysis of echocardiography and computed tomography (CT) by experienced operators. Although more precise and accurate than surgical sizing, this image-based approach is more time-consuming and prone to significant intra- and inter-observer variability. For transcatheter aortic valve implantation (TAVI) planning, DL allows the automatic identification of relevant landmarks on pre-procedural CT images, as well as the measurement of aortic dimensions and the automatic selection of valve size, with variations within or below the range of inter-observer variability (Figure 3).24-26 This has reduced the processing time from several minutes per patient to a few seconds, and even to less than one second.24-26 The same concept has been used for sizing of the mitral valve annulus24,27 and could, in principle, be applied to other structures relevant to treatment planning, such as the mitral leaflets and the tricuspid and pulmonary valves, as well as to LAA closure procedures.
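Once the annulus has been measured, whether by an expert or by a DL algorithm, the sizing step itself reduces to a lookup against a device sizing chart. The sketch below illustrates only this final step (Python; the perimeter thresholds and valve sizes are hypothetical, not a manufacturer's chart):

```python
# Hypothetical mapping from a measured annulus perimeter to a valve size.
def select_valve_size(annulus_perimeter_mm: float) -> int:
    """Return a hypothetical valve size (mm) for a measured annulus perimeter."""
    sizing_chart = [      # (maximum annulus perimeter in mm, valve size in mm)
        (62.8, 23),
        (72.3, 26),
        (81.7, 29),
        (91.1, 34),
    ]
    for max_perimeter, size in sizing_chart:
        if annulus_perimeter_mm <= max_perimeter:
            return size
    raise ValueError("annulus outside the hypothetical device range")

print(select_valve_size(75.0))  # -> 29
```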
Device selection must encompass more than the valve size in relation to the dimensions of the target anatomy; it must also include an assessment of the interaction of the device with the host. For that purpose, advanced patient-specific computer modelling has been developed,28 incorporating the geometric and mechanical properties of both the device and the host (Figure 4). The combination with DL may further enhance modelling accuracy. This technology has already been applied in several interventions for structural heart disease. The most experience has been gained in TAVI in patients presenting with bicuspid and tricuspid aortic stenosis, where it has been shown to accurately predict the occurrence of paravalvular leakage (PVL), conduction abnormalities and coronary obstruction.28-33 The recently concluded TAVIGuide multicentre registry34 revealed that patient-specific computer simulation may affect not so much the selection of valve size as the execution of the TAVI procedure, mainly by changing the target depth of implantation. Dowling et al.35 even reported a change of the initial treatment decision (surgery instead of TAVI) in two of nine patients with bicuspid aortic stenosis who had been accepted for TAVI.
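The simplest geometric quantity feeding into such device/host analyses is the degree of oversizing of the prosthesis relative to the annulus; the patient-specific simulation then predicts how that mismatch translates into frame deformation, PVL or conduction injury. A minimal worked example of the oversizing arithmetic is sketched below (illustrative values only, not tied to any specific device):

```python
# Perimeter-based oversizing of a nominally circular device relative to the annulus.
import math

def perimeter_oversizing(device_diameter_mm: float, annulus_perimeter_mm: float) -> float:
    """Oversizing (%) = (device perimeter / annulus perimeter - 1) * 100."""
    device_perimeter = math.pi * device_diameter_mm
    return (device_perimeter / annulus_perimeter_mm - 1.0) * 100.0

# e.g. a 26 mm device in an annulus with a 75 mm perimeter:
print(f"{perimeter_oversizing(26.0, 75.0):.1f}% oversizing")  # ~8.9%
```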
Advanced computer simulation has also been proposed for surgical and catheter-based interventions on the mitral valve and LAA.36-41 It has been shown to accurately simulate transcatheter edge-to-edge mitral valve repair as well as transcatheter mitral valve replacement.35,38,39 The clinical application of patient-specific simulation in determining anatomic (un)suitability for transcatheter mitral valve replacement (i.e. left ventricular outflow tract obstruction) has been reported.40,41 Bavo and colleagues42 validated the first model for the simulation of percutaneous LAA occlusion and reported a good correlation between the simulated outcomes and post-procedural CT imaging. Some of these groups used ML to automate structure recognition and model creation from pre-procedural imaging, but DL has not yet been incorporated to refine the simulation process.
Treatment evaluation – Physiological impact
Advanced computer modelling also offers the possibility of simulating the haemodynamic changes of the heart and cardiovascular system that occur in a patient in response to an intervention such as valve replacement or repair (i.e. in silico experiments). Walmsley and colleagues43 used the CircAdapt model of the human heart and circulatory system to compare one-step versus gradual correction of mitral and tricuspid regurgitation (Figure 5). Their results suggested that gradual correction avoids the transient increase in ventricular load seen with sudden correction. Such application of advanced system-level computer modelling for the pre-clinical in silico evaluation of novel valvular devices may reduce the time to market of novel therapeutic options.44 Furthermore, in silico (pre-)clinical trials facilitate and improve the design of real-world clinical trials, thereby reducing the number of patients required.45 Other in silico experiments have provided new insights for prosthesis design by demonstrating the impact of the mode of mitral valve opening on LV haemodynamics.46 They have also proved useful in clarifying the haemodynamic impact of prosthesis position in both the aortic and mitral positions. For transcatheter mitral valve replacement, prosthesis protrusion into the LV outflow tract greater than 35% was shown to significantly increase LV afterload and decrease stroke volume.47 For TAVI, commissural misalignment showed no effect on coronary filling pressures or transprosthetic gradient but was associated with an increased incidence of mild aortic regurgitation.48 These experiments thus allow the tailoring of invasive procedures towards the haemodynamic profile that most closely resembles the original non-diseased state.
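A flavour of such in silico experiments can be conveyed with a deliberately minimal lumped-parameter model (this is not CircAdapt: a two-element Windkessel afterload driven by a prescribed half-sine ejection flow, with all parameter values illustrative). Varying a single parameter, here the systemic resistance, and observing the arterial pressure response mimics, in miniature, how system-level models probe the haemodynamic consequences of an intervention:

```python
# Two-element Windkessel: C * dP/dt = Q(t) - P/R, integrated with forward Euler.
import numpy as np

def windkessel_pressure(R=1.0, C=1.5, heart_rate=75, beats=10, dt=1e-3):
    """Arterial pressure (mmHg) for resistance R (mmHg*s/ml) and compliance C (ml/mmHg)."""
    T = 60.0 / heart_rate                    # cardiac cycle length (s)
    t = np.arange(0.0, beats * T, dt)
    phase = t % T
    ejecting = phase < 0.3 * T               # ejection in the first 30% of the cycle
    Q = np.where(ejecting, 400.0 * np.sin(np.pi * phase / (0.3 * T)), 0.0)  # flow, ml/s
    P = np.empty_like(t)
    P[0] = 80.0                              # initial pressure (mmHg)
    for i in range(1, len(t)):
        P[i] = P[i - 1] + dt * (Q[i - 1] - P[i - 1] / R) / C
    return t, P

for R in (0.8, 1.0, 1.2):                    # in silico "experiment": vary afterload
    _, P = windkessel_pressure(R=R)
    last_beat = P[-800:]                     # final cycle (0.8 s at dt = 1 ms)
    print(f"R={R}: pressure {last_beat.min():.0f}-{last_beat.max():.0f} mmHg")
```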
Summary
As doctors, we interact daily with our patients. We talk to them and experience their doubts, their concerns and their troubles. We eventually come to look at a patient as the person that she or he is, rather than just another case file. The medical dilemmas that so often trouble us are inevitabilities of clinical practice that arise precisely from the fact that we want to deliver the best treatment possible to the particular patient standing in front of us, yet our current medical armamentarium does not provide us with a clear answer. The advent of artificial intelligence offers the possibility of processing large and extensive sets of data in such a way that complex patterns and relationships between variables that would never be accessible to the human eye and mind will eventually come to light. Artificial intelligence must thus be regarded as a tool that will allow us to deliver truly patient-centred health care, rather than a technology that will take over our clinical tasks. Moreover, artificial intelligence, coupled with advanced simulation modelling, grants us the possibility of testing an invasive treatment in a patient-specific anatomic setting, thereby predicting which treatment is optimal for a specific patient. Advanced computer modelling also allows us to pursue a favourable haemodynamic outcome by helping us understand the physiological responses to a given treatment.
Declaration of Interest
The authors have nothing to declare.