Inside the Texas Heart Studio
Inside the Studio features interviews with special guests visiting The Texas Heart Institute’s TV studio.
From international leaders in the field of cardiovascular medicine to pioneering scientists to community leaders near and far, the Inside the Studio interviews amplify current trends in research and education related to the prevention, diagnosis, and treatment of heart and vascular disease.
View our website:
texasheart.org
Watch Interview Reels on Texas Heart TV:
tv.texasheart.org/inside-the-studio
Inside the Texas Heart Studio
Embracing the Future of Medicine: AI and Informatics in Cardiovascular Care
In this episode of Inside the Texas Heart Studio, Dr. Stephanie Coulter, Assistant Medical Director at The Texas Heart Institute (THI), sits down with former THI fellow, Dr. Tony S. Das, to explore the transformative role of artificial intelligence and informatics in modern cardiology.
Together, they discuss how AI-powered tools are already enhancing diagnostics, reducing variability, and improving patient care.
From advanced imaging algorithms to strategies for optimizing healthcare costs, this episode sheds light on how embracing technology can drive better outcomes and streamline practices in cardiovascular medicine.
Tune in to hear how the future of cardiology is closer than you think!
Watch the sit-down interview here.
Watch On Demand Videos on Texas Heart TV
Visit Our Website: texasheart.org
1
00:00:07.925 --> 00:00:08.325
I am Dr.
2
00:00:08.645 --> 00:00:11.565
Stephanie Coulter, and I'm here inside the studio today
3
00:00:11.875 --> 00:00:13.405
with my friend and colleague, Dr.
4
00:00:13.435 --> 00:00:14.925
Tony Das. Thanks for joining us.
5
00:00:15.195 --> 00:00:17.205
That's a great pleasure. He's an esteemed former
6
00:00:17.345 --> 00:00:20.525
fellow, and, um, he's showing us the,
7
00:00:20.825 --> 00:00:25.285
you are showing us the ropes on how to use medicine
8
00:00:25.425 --> 00:00:29.325
and informatics to help us look into the future.
9
00:00:29.395 --> 00:00:32.885
Listening to your presentation today, I was humbled
10
00:00:33.425 --> 00:00:35.645
and I was thinking that the whole time
11
00:00:37.625 --> 00:00:41.045
you were giving me an image of the Jetsons, right.
12
00:00:42.025 --> 00:00:46.845
And I'm behind, I'm already behind on all of this
13
00:00:46.845 --> 00:00:49.965
because it seems so futuristic
14
00:00:50.305 --> 00:00:54.325
and like your presentation today almost made it seem like
15
00:00:55.755 --> 00:00:57.685
it's approachable.
16
00:00:58.235 --> 00:01:00.525
It's not only approachable. Steph, thanks for having me,
17
00:01:00.545 --> 00:01:02.405
by the way. I think the thing about it is
18
00:01:02.405 --> 00:01:03.565
that you're already using it.
19
00:01:03.945 --> 00:01:06.805
You don't even know sometimes that you're using AI
20
00:01:06.825 --> 00:01:09.965
and informatics inside of your evaluation of these patients.
21
00:01:10.355 --> 00:01:14.045
Even the programs that you use for echo, nuclear, vascular,
22
00:01:14.385 --> 00:01:18.005
PET CT, they have algorithms in them
23
00:01:18.315 --> 00:01:20.445
that are making the diagnosis
24
00:01:20.445 --> 00:01:21.885
and assessment of these patients.
25
00:01:22.155 --> 00:01:25.165
Anatomy and physiology basically combined.
26
00:01:25.585 --> 00:01:26.685
And we're using a lot
27
00:01:26.685 --> 00:01:28.485
of these technologies on a daily basis.
28
00:01:28.545 --> 00:01:30.005
And I showed several of those today.
29
00:01:30.105 --> 00:01:32.205
So some of those were probably familiar names.
30
00:01:32.205 --> 00:01:33.965
Some of them are probably not familiar names.
31
00:01:34.265 --> 00:01:38.645
But at the end of the day, this is an evolution of a way
32
00:01:38.665 --> 00:01:41.925
to be able to assess patients and to be able to diagnose
33
00:01:41.945 --> 00:01:45.485
and then ultimately to treat by using, I would say,
34
00:01:45.485 --> 00:01:47.325
helpful hints from AI.
35
00:01:47.325 --> 00:01:49.285
And that's kind of it. I wouldn't be worried about it.
36
00:01:49.285 --> 00:01:52.125
I'd be more embracing what it's actually gonna do for you.
37
00:01:52.165 --> 00:01:53.325
Well, we have to embrace it. Yeah.
38
00:01:53.325 --> 00:01:55.845
Because either we embrace it or we're behind.
39
00:01:55.955 --> 00:01:58.685
Yeah, for sure. Right. Because the patients are demanding
40
00:01:58.685 --> 00:02:01.045
it, the payers are demanding it.
41
00:02:01.045 --> 00:02:03.205
And what was your funny comment?
42
00:02:03.425 --> 00:02:05.245
Fee for service or fee for value.
43
00:02:05.395 --> 00:02:07.325
Yeah, no, that's the transition that's
44
00:02:07.445 --> 00:02:08.445
Occurred. I mean, that's how you get
45
00:02:08.445 --> 00:02:09.245
people to
46
00:02:10.755 --> 00:02:13.405
pony up for the added costs.
47
00:02:13.745 --> 00:02:16.485
Or in fact, like you were describing, in many ways
48
00:02:17.255 --> 00:02:20.125
there are opportunities for us as physicians,
49
00:02:20.185 --> 00:02:24.165
as hospital systems to actually recoup some of the money
50
00:02:24.165 --> 00:02:27.005
that we're already spending on patient care
51
00:02:27.145 --> 00:02:28.885
that's going unfunded.
52
00:02:29.505 --> 00:02:32.605
And now in the new world reality, maybe some of that
53
00:02:33.115 --> 00:02:35.605
funding can be reestablished
54
00:02:35.625 --> 00:02:38.085
or reallocated back to the people
55
00:02:38.085 --> 00:02:40.365
that are on the front lines that are providing the care
56
00:02:40.545 --> 00:02:42.165
or the systems that are providing
57
00:02:42.165 --> 00:02:43.165
The care. I think the
58
00:02:43.165 --> 00:02:45.365
thing is that when you reduce variability
59
00:02:45.945 --> 00:02:49.885
and you increase the specificity of diagnosis,
60
00:02:50.355 --> 00:02:51.685
then you reduce costs.
61
00:02:51.685 --> 00:02:52.845
There's no question about it.
62
00:02:53.025 --> 00:02:55.765
If there's 10 of us that do things in 10 different ways
63
00:02:56.185 --> 00:02:58.565
and we're going to the administration saying we need 10
64
00:02:58.565 --> 00:03:01.725
different catheters or 10 different devices, the fact
65
00:03:01.725 --> 00:03:03.845
of the matter is that when you can show
66
00:03:04.155 --> 00:03:07.165
that there's an improvement in outcomes, that's fee for value,
67
00:03:07.625 --> 00:03:10.685
and you can show that there's less variability,
68
00:03:10.685 --> 00:03:12.925
meaning everybody's kind of rowing in the same direction.
69
00:03:13.545 --> 00:03:16.245
And that brings resources
70
00:03:16.245 --> 00:03:19.445
because the payers are very comfortable paying for things
71
00:03:19.445 --> 00:03:21.765
that they feel will reduce their costs.
72
00:03:21.765 --> 00:03:22.885
Right. And at the end of the day,
73
00:03:22.955 --> 00:03:25.525
it's not a reduction in the dollar amounts
74
00:03:25.865 --> 00:03:28.205
to each individual or each individual patient.
75
00:03:28.635 --> 00:03:30.005
It's the total cost of care.
76
00:03:30.145 --> 00:03:34.005
So if you can reduce the likelihood of needing a second echo
77
00:03:34.005 --> 00:03:35.645
because one wasn't done properly, right.
78
00:03:35.645 --> 00:03:38.205
Or, as you know, you're only gonna get one covered
79
00:03:38.205 --> 00:03:40.005
and the patient's gonna undergo that other thing.
80
00:03:40.105 --> 00:03:44.685
So part of this is reduce variability, increase
81
00:03:44.985 --> 00:03:48.445
the overall, you know, efficacy of what you're diagnosing,
82
00:03:48.665 --> 00:03:50.405
and then ultimately the cost will go down
83
00:03:50.405 --> 00:03:51.645
and the resources will go up.
84
00:03:52.185 --> 00:03:55.725
And partly, you know, I can see, like in a lot
85
00:03:55.725 --> 00:03:59.925
of ways we see the big box burden of medicine
86
00:04:00.825 --> 00:04:05.565
as a blanket that we can't, you know, get out from under.
87
00:04:06.465 --> 00:04:10.285
And yet in some ways it's this, um,
88
00:04:10.925 --> 00:04:14.045
verticalization of medicine that's actually going
89
00:04:14.045 --> 00:04:18.365
to encourage these kinds of investments
90
00:04:18.705 --> 00:04:23.045
and requirements to standardize what we do,
91
00:04:23.515 --> 00:04:25.045
justify why we're doing it.
92
00:04:25.115 --> 00:04:26.805
Like appropriate use criteria,
93
00:04:27.555 --> 00:04:31.885
guideline-directed therapy checkboxes. Yeah. I
94
00:04:31.885 --> 00:04:33.605
Mean, at the end of the day, I don't think we want
95
00:04:33.625 --> 00:04:36.365
to have, you know, so many different options
96
00:04:36.465 --> 00:04:39.965
for doing things, uh, that aren't validated by data.
97
00:04:39.965 --> 00:04:41.205
Right? I mean, we're data based.
98
00:04:41.455 --> 00:04:43.805
We're data-driven people, cardiologists
99
00:04:43.805 --> 00:04:44.885
and other specialists.
100
00:04:44.945 --> 00:04:46.885
We like the data. Mm-Hmm. So at the end of the day,
101
00:04:46.905 --> 00:04:49.445
we want to make sure that we're doing things
102
00:04:49.445 --> 00:04:51.205
that are guideline directed
103
00:04:51.505 --> 00:04:54.765
and we want that gentle nudge sometimes that allows us
104
00:04:54.785 --> 00:04:56.405
to basically take information
105
00:04:56.405 --> 00:04:58.125
that we may not see right in front of us.
106
00:04:58.455 --> 00:04:59.965
Maybe it's a piece of information
107
00:04:59.965 --> 00:05:02.285
that's not right in the electronic health record,
108
00:05:02.355 --> 00:05:03.405
easy to access, so
109
00:05:03.595 --> 00:05:06.045
that maybe in the future there'll be a unified health record
110
00:05:06.045 --> 00:05:08.485
that basically tells you, Hey, by the way,
111
00:05:08.485 --> 00:05:09.565
there was this test
112
00:05:09.565 --> 00:05:12.285
that was done eight years ago and this is what it showed.
113
00:05:12.285 --> 00:05:13.725
And that impacts what you're doing right now.
114
00:05:14.035 --> 00:05:15.445
It's actually huge. 'cause we,
115
00:05:15.645 --> 00:05:17.685
I don't know about you, but we're Epic.
116
00:05:17.685 --> 00:05:19.525
Yeah. So we get Care Everywhere. Yeah.
117
00:05:19.905 --> 00:05:23.525
And, um, we just implemented an Epic upgrade in the office
118
00:05:23.945 --> 00:05:27.605
and well, I can unfortunately go down
119
00:05:27.605 --> 00:05:29.965
that rabbit hole when I see a patient. Yeah.
120
00:05:30.035 --> 00:05:31.565
It's really helpful. And I spend a lot
121
00:05:31.565 --> 00:05:32.885
of time going through that.
122
00:05:33.105 --> 00:05:36.885
But, so I'm not being remunerated
123
00:05:37.025 --> 00:05:38.525
for my time Mm-Hmm.
124
00:05:38.715 --> 00:05:42.565
Even though I might save a lot of resources
125
00:05:42.585 --> 00:05:46.085
and redundancies by like
126
00:05:46.825 --> 00:05:51.685
having the time to review, you know, the full record
127
00:05:52.025 --> 00:05:55.885
and to see the patient as a full human as opposed
128
00:05:55.885 --> 00:05:59.485
to a myopic view of what's the arrhythmia or Right.
129
00:05:59.485 --> 00:06:01.445
What does the valve show? Because, well,
130
00:06:01.465 --> 00:06:04.205
You're gonna share in the remuneration at some point.
131
00:06:04.225 --> 00:06:05.405
And the reason is
132
00:06:05.405 --> 00:06:07.605
because there are shared savings that occur
133
00:06:08.145 --> 00:06:11.805
and the payers have always been open to the idea
134
00:06:11.865 --> 00:06:13.085
of sharing those savings.
135
00:06:13.265 --> 00:06:15.925
The problem is in a fee for service world,
136
00:06:16.225 --> 00:06:17.565
that doesn't really exist.
137
00:06:17.705 --> 00:06:19.765
So when you get to this other part, I think
138
00:06:19.765 --> 00:06:22.605
that physicians really can't fathom the idea
139
00:06:22.605 --> 00:06:25.365
that doing less can actually potentially make you more,
140
00:06:25.785 --> 00:06:27.245
and that's really not the goal.
141
00:06:27.485 --> 00:06:29.005
Well, but the goal is to do it right
142
00:06:29.345 --> 00:06:31.925
and do it without so much, you know, overlap
143
00:06:31.925 --> 00:06:33.165
and redundancy, et cetera.
144
00:06:33.225 --> 00:06:34.965
And that's value. Right?
145
00:06:34.965 --> 00:06:37.045
So ultimately those things are gonna help us do that.
146
00:06:37.275 --> 00:06:41.005
It's kind of cute. So you're all in for the value?
147
00:06:41.005 --> 00:06:42.525
I'm for sure into value.
148
00:06:43.145 --> 00:06:45.285
So what I really heard, like,
149
00:06:45.285 --> 00:06:49.405
the big takeaway is things are coming. Don't be afraid.
150
00:06:49.655 --> 00:06:51.405
Don't be afraid. Like embrace it.
151
00:06:51.885 --> 00:06:54.605
I think if you just recognize that it's gonna help you
152
00:06:54.665 --> 00:06:55.845
and not hurt you Yeah.
153
00:06:56.025 --> 00:06:58.205
You will embrace it. Because at the end of the day,
154
00:06:58.205 --> 00:07:00.005
what we wanna do is take care
155
00:07:00.005 --> 00:07:01.245
of our patients more efficiently.
156
00:07:01.345 --> 00:07:03.205
We want to have the data at our fingertips
157
00:07:03.425 --> 00:07:04.805
and we wanna do things that are
158
00:07:04.805 --> 00:07:06.325
guideline-driven, data-driven.
159
00:07:06.425 --> 00:07:08.685
And if we can do that without changing our workflow,
160
00:07:09.275 --> 00:07:10.885
then I think we'll be able to adopt them.
161
00:07:10.985 --> 00:07:14.165
That's been the biggest challenge to adopting new things, is
162
00:07:14.165 --> 00:07:15.445
that it changes the workflow.
163
00:07:15.445 --> 00:07:17.245
Everybody's out. If it doesn't change
164
00:07:17.245 --> 00:07:18.525
the workflow and it's kind of working in
165
00:07:18.525 --> 00:07:20.245
the background, or if it's an add-on. Yeah.
166
00:07:20.265 --> 00:07:21.445
Nobody wants to work harder.
167
00:07:21.805 --> 00:07:23.285
I mean, we wanna work smarter, right?
168
00:07:23.555 --> 00:07:26.845
Well, yeah. We all work... Well, you know,
169
00:07:27.905 --> 00:07:28.925
the paradigm is
170
00:07:28.925 --> 00:07:30.925
that the physicians are working harder
171
00:07:30.925 --> 00:07:32.125
than they ever did before.
172
00:07:32.235 --> 00:07:35.765
Yeah, for sure. So not putting more burdens on the physician
173
00:07:35.905 --> 00:07:40.405
and like having more computer-driven AI, you know,
174
00:07:40.915 --> 00:07:42.445
more confidence Mm-Hmm.
175
00:07:43.105 --> 00:07:45.765
In knowing what the next step will be. Yeah.
176
00:07:45.985 --> 00:07:50.445
You know, like how fast can you get this done, right? Sure.
177
00:07:50.445 --> 00:07:53.605
Because speed matters in a hospital-based climate.
178
00:07:54.385 --> 00:07:56.205
Um, so there's lots of things
179
00:07:56.275 --> 00:07:59.525
that having data scientists on your
180
00:07:59.525 --> 00:08:00.965
faculty can help you with.
181
00:08:01.275 --> 00:08:04.085
Yeah. And we have the ability to work closely
182
00:08:04.275 --> 00:08:05.805
with innovation teams
183
00:08:06.145 --> 00:08:09.565
and folks that are in the tech space when we do certain
184
00:08:09.565 --> 00:08:13.365
projects, which to me starts to give them an insight into
185
00:08:13.365 --> 00:08:15.445
what workflow for us actually looks like.
186
00:08:16.065 --> 00:08:18.685
And they can bring a lot of value to you. I agree with that.
187
00:08:18.685 --> 00:08:20.045
Agree. And I think that they don't know
188
00:08:20.055 --> 00:08:21.885
until they actually witness
189
00:08:21.955 --> 00:08:23.485
what you're actually doing on a daily basis,
190
00:08:23.715 --> 00:08:24.765
Something shocks them.
191
00:08:24.995 --> 00:08:26.525
Yeah, I agree. I agree. It's how much time we waste
192
00:08:26.585 --> 00:08:27.725
and how much time we spend,
193
00:08:28.025 --> 00:08:31.845
but also sometimes how they can take little changes
194
00:08:32.225 --> 00:08:36.125
and completely improve our daily workflow just
195
00:08:36.185 --> 00:08:37.485
by a couple of little things.
196
00:08:37.545 --> 00:08:40.005
And so I think it's worthwhile keeping those things in mind.
197
00:08:40.685 --> 00:08:44.725
I think the best lasting lesson from all this is
198
00:08:44.725 --> 00:08:47.965
that, you know, as we have accumulated
199
00:08:48.875 --> 00:08:51.045
massive amounts of data Mm-Hmm.
200
00:08:51.305 --> 00:08:54.485
On our patients and about their outcomes, um,
201
00:08:55.385 --> 00:08:58.725
we are gonna be able to use that data to improve what we do
202
00:08:58.945 --> 00:09:01.765
for the patients, for mankind, obviously,
203
00:09:02.065 --> 00:09:04.165
and for research-generated ideas.
204
00:09:05.105 --> 00:09:09.445
So being part of a bigger organization, there's some real
205
00:09:10.285 --> 00:09:11.645
upstream value for that.
206
00:09:11.945 --> 00:09:14.005
For sure. So doctors shouldn't be afraid.
207
00:09:14.265 --> 00:09:15.765
The med students, you know, like,
208
00:09:15.995 --> 00:09:18.245
Well, that's another great question:
209
00:09:18.385 --> 00:09:20.685
how are we gonna train the next generation?
210
00:09:21.385 --> 00:09:23.525
Is it gonna be what we're doing now?
211
00:09:23.825 --> 00:09:27.245
And how much is gonna be added to their knowledge base for
212
00:09:27.905 --> 00:09:30.565
AI-driven data assessment?
213
00:09:30.745 --> 00:09:33.165
So what are they gonna need to know in that space?
214
00:09:33.285 --> 00:09:34.565
I can tell you there's gonna be plenty.
215
00:09:35.105 --> 00:09:37.365
And that's gonna be a great way for them to learn
216
00:09:37.385 --> 00:09:40.525
and get educated without being only
217
00:09:40.625 --> 00:09:42.245
at the patient's bedside. They'll be able
218
00:09:42.245 --> 00:09:43.725
To... They can't be, because we need
219
00:09:43.725 --> 00:09:46.005
to train them like on, you know, um,
220
00:09:47.155 --> 00:09:48.565
Virtual and simulators.
221
00:09:48.565 --> 00:09:52.005
Virtual simulators. So in fact, we all have simulators,
222
00:09:52.585 --> 00:09:53.725
but you can't,
223
00:09:53.725 --> 00:09:55.605
unless you have the techie people around
224
00:09:55.745 --> 00:09:58.165
to simulate some grave medical,
225
00:09:58.825 --> 00:10:01.085
you know, conundrum. It's... Yeah.
226
00:10:01.085 --> 00:10:03.005
They're additive. Yeah. They are, definitely.
227
00:10:03.105 --> 00:10:05.165
So I think that there's value in that,
228
00:10:05.265 --> 00:10:09.285
but I love that you started out with just like
229
00:10:10.785 --> 00:10:14.085
the 1914 technology of the ECG Mm-Hmm.
230
00:10:14.165 --> 00:10:16.205
I mean, like, 'cause I kind of poo-pooh it. Right.
231
00:10:16.515 --> 00:10:18.045
Look at what you can figure out from this single
232
00:10:18.625 --> 00:10:20.125
EKG. How's that gonna help me?
233
00:10:20.215 --> 00:10:21.325
Right. It tells me
234
00:10:21.325 --> 00:10:23.085
what rhythm you're in, what your heart rate is.
235
00:10:23.865 --> 00:10:24.765
Now you can tell whether they have
236
00:10:24.925 --> 00:10:25.685
structural heart disease.
237
00:10:25.805 --> 00:10:27.725
I know you can follow it, you know, uh,
238
00:10:27.825 --> 00:10:29.405
on a regular basis for ischemia.
239
00:10:29.445 --> 00:10:31.005
I mean, there's a ton of stuff
240
00:10:31.005 --> 00:10:32.325
that can be found. And it's all,
241
00:10:32.355 --> 00:10:33.445
Well, not only stuff.
242
00:10:33.445 --> 00:10:35.605
Well, it's available. Yeah.
243
00:10:35.745 --> 00:10:38.045
And it doesn't interfere with your workflow. Mm-Hmm.
244
00:10:38.745 --> 00:10:42.485
And so having that dumped into some algorithm,
245
00:10:43.465 --> 00:10:46.245
you know, one of our fellows was involved in that big,
246
00:10:46.505 --> 00:10:47.685
um, aortic valve.
247
00:10:48.195 --> 00:10:50.765
Okay. Yeah. He was the one, he's the data guy behind it.
248
00:10:50.765 --> 00:10:51.965
Yeah. That's great. And he's a third
249
00:10:51.965 --> 00:10:53.085
year, he's a fourth year fellow.
250
00:10:53.425 --> 00:10:56.805
Um, actually from Southwestern. Okay. He's a total star.
251
00:10:56.805 --> 00:10:58.365
You're welcome. Thank you so much, Dr.
252
00:10:58.555 --> 00:11:02.165
Das, for joining us for another round of Inside the Studio.
253
00:11:02.905 --> 00:11:05.045
And thank you so much for having me, Stephanie.