AI is quickly working its way into healthcare. These clinicians want to be at the forefront

There’s a truism in medicine when it comes to artificial intelligence – I’ve heard it a number of times, and chances are, you have too.

AI won’t replace doctors, the saying goes, but doctors who know how to use AI will replace those who don’t.

And yet doctors and nurses have been cautious, skeptical even, when it comes to implementing the technology. As many as 55% of physicians indicated that they don’t think AI is ready for use in medicine, according to a global survey from GE HealthCare. Nurses too have qualms about the technology; when LinkedIn conducted a 2018 survey, AI received the most negative sentiment among all the digital tools we asked about. And just this past week, hundreds of nurses with National Nurses United rallied with Hollywood writers and actors to voice their concerns about AI. 

Still, ever since ChatGPT came on the scene, the conversation around artificial intelligence has exploded. Studies of the technology abound: evaluators in one paper preferred ChatGPT’s responses over doctors’ 79% of the time, rating them as more empathetic, while other research has found that the technology has a 72% accuracy rate when it comes to clinical decision-making.

ChatGPT certainly has its champions. Doctors-in-training have been urging their colleagues to keep an open mind about the tool, while industry veterans are touting the possibilities for reducing burnout in clinical practice.  

But if there’s another truism I’ve heard, it’s this: Silicon Valley doesn’t understand healthcare. And the disconnect manifests in a number of ways: from software that doesn’t integrate with the technology used in hospitals and clinics, to algorithms that aren’t open to peer review, to products that put more demands on a clinician’s time rather than simplify workflow.

Clinicians could hardly be faulted for tuning out the clamor from the scores of vendors jockeying to be first-to-market. 

“Clinicians are not the ones who are getting to choose the technology that they use,” said Subha Airan-Javia, the CEO and founder of health-tech company CareAlign. “That's probably why they're like, I don't even want to look at the stars if I can't get close to them.”

But healthcare professionals who are excited about AI see an opportunity for offloading the tasks that currently suck the joy out of medicine and nursing.  

“The operations of healthcare is clearly where I think it should be implemented first,” said Dr Rowland Illing, chief medical officer of international public sector health at Amazon Web Services. “It’s much more likely to impact whether the patients turn up on time to their appointments, or whether they are scheduled appropriately. It's not necessarily the diagnostic elements … it's actually triage.”

I chatted with a few clinicians who are helping to shape the next generation of AI platforms. We discussed what they’re building, how it might help their fellow doctors and nurses, and why they think AI is the solution. 

Here’s what they told me. 


Sara Well, founder and CEO, DropStat

Well had a fairly typical path in nursing: she spent years at the bedside in critical care before working her way up to charge nurse. It was then that she started to take on administrative tasks and saw how time-consuming and expensive scheduling is for hospitals.

Scheduling is often a manual process. When there’s a gap in the schedule – someone calls out sick, for example – charge nurses have to scramble for coverage. First, they might try offering incentive bonuses to other nurses on the unit, then they might call to see if there’s a float nurse available, and then finally, as a last resort, they might request a travel nurse from an agency. 

“It really struck me that there's gamma knife surgery on the eighth floor and minimally invasive robotics on the third, and I'm calling and texting people to come into work the same way we did 20 years ago,” Well said.  

“And what really bothered me was that I knew that I could only access my one unit, but there are six other units within the hospital where the staff are cross-trained to work. My unit policy allowed us to float. Technology did not.”

She created DropStat, a platform that can automatically reach out to any nurse who’s trained to work on a given unit. But what’s different about the technology is its ability to predict when there might be coverage gaps.

“We use machine learning and AI to trace callout patterns and to look down the road about 60 days in advance and predict what the staffing needs are going to be,” Well said.

By doing so, DropStat’s clients have been able to reduce incentive pay by about 75% in three months, according to Well.
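DropStat hasn't published how its forecasting actually works, but the idea Well describes, learning from callout patterns and projecting staffing needs roughly 60 days out, can be illustrated with a minimal sketch. Everything below is hypothetical: the function name, data shapes and the simple per-weekday callout rate are stand-ins for whatever models the company really uses.

```python
from collections import defaultdict
from datetime import date, timedelta

def forecast_coverage_gaps(callout_history, scheduled_staff, required_staff, horizon_days=60):
    """Flag days in the next `horizon_days` where projected staffing falls short.

    callout_history: list of past dates on which someone called out (hypothetical format)
    scheduled_staff: dict mapping a future date to the number of nurses scheduled
    required_staff:  minimum nurses the unit needs per shift
    """
    # Estimate an average callout count per weekday from history -- a crude
    # stand-in for the pattern tracing Well attributes to machine learning.
    weeks_observed = max(1, len({d.isocalendar()[:2] for d in callout_history}))
    callouts_by_weekday = defaultdict(float)
    for d in callout_history:
        callouts_by_weekday[d.weekday()] += 1 / weeks_observed

    gaps = []
    for offset in range(horizon_days):
        day = date.today() + timedelta(days=offset)
        projected = scheduled_staff.get(day, 0) - callouts_by_weekday[day.weekday()]
        if projected < required_staff:
            gaps.append((day, required_staff - projected))
    return gaps  # each entry: (date, expected shortfall)
```

In a system like the one Well describes, a flagged shortfall is where automated outreach would kick in, messaging every nurse cross-trained for the unit weeks before the gap turns into a same-day scramble.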

“This is something that really should be done by machines,” she said. “It shouldn't be done by people who are primarily tasked with taking care of patients.”


James Colbert, MD, MBA, SVP, care delivery, Memora Health

The responsibility for patient care doesn’t end after an appointment. Doctors still have to field questions and triage medical concerns. At the same time, they usually aren’t reaching out proactively – they don’t know if a patient has stopped taking a certain medication or if they’ve followed through on a referral to a specialist.

Enter Memora, a texting platform that allows patients to communicate with their care teams 24/7 and get an immediate response. Doctors can also use the platform to check in on their patients between visits – something that’s becoming more important under value-based payment models, where reimbursement is linked to outcomes.

The technology works because the majority of patient concerns can be anticipated ahead of time, Colbert said, pointing to the example of a new mom who has questions about how much bleeding is normal.

At least 70% of messages are answered automatically, according to Colbert, and upwards of 80% of patients say they’re satisfied with the support they received. 
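Memora hasn't described its routing in technical detail. The general pattern Colbert outlines, answering the questions a care team can anticipate and escalating the rest to a human, can be sketched roughly as below; the topic keywords, canned replies and urgency terms are invented for illustration and are not Memora's actual content.

```python
# Hypothetical sketch of automated patient-message triage: reply immediately to
# anticipated questions from a pre-written library, escalate everything else
# (and anything urgent) to the care team.

ANTICIPATED_REPLIES = {
    "bleeding": "Some bleeding after delivery is expected; your discharge sheet covers "
                "what's typical and when to call. Want a nurse to follow up?",
    "refill": "Your refill request has been forwarded to your care team.",
}

URGENT_TERMS = {"chest pain", "can't breathe", "severe bleeding"}

def triage_message(text: str) -> tuple[str, str]:
    """Return (route, reply): 'auto' for an immediate canned answer,
    'clinician' when a human needs to follow up."""
    lowered = text.lower()
    if any(term in lowered for term in URGENT_TERMS):
        return "clinician", "This may be urgent. A member of your care team is being notified now."
    for topic, reply in ANTICIPATED_REPLIES.items():
        if topic in lowered:
            return "auto", reply
    return "clinician", "Thanks for your message. Your care team will get back to you soon."

print(triage_message("How much bleeding is normal after giving birth?"))  # -> ('auto', ...)
```

A production system would presumably lean on trained language models rather than keyword matching, but the division of labor is the same: the predictable majority of messages gets an instant answer, and the rest reaches a clinician.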

That patient satisfaction piece is particularly top of mind for health systems that are looking over their shoulder at Amazon and other disruptors, he noted. 

“There is some threat right now in the industry where traditional health systems realize that they need to be more customer-centric,” he said. “That requires additional tools and technology that health systems can't develop themselves.”

Yet while patient engagement remains a concern for health systems, there’s an even bigger problem they’re grappling with: burnout. 

“In conversations that I have with hospital leaders, one of the biggest challenges that they have right now is just retaining their clinical staff,” Colbert said. “[Clinicians are] being inundated where they're getting messages from patients 24/7, and they don't have time during the day to even respond to those messages. That's where our value prop is.” 


Subha Airan-Javia, CEO, CareAlign

CareAlign doesn’t currently use AI in the purest sense – where the technology learns and adapts on its own. Nevertheless, the project management software has some of the features that people might colloquially refer to as artificial intelligence.

In particular, it uses natural language processing to detect when a doctor includes a word or phrase in a note that can perpetuate implicit bias. “Then we'll nudge them and say, Hey, this word can lead to ageism or patient blaming biases; use this instead,” Airan-Javia said.

Ageism and patient blaming are two of the most common examples of where stigmatizing language tends to pop up, according to her research. “In medicine, we talk as if patients are complaining, like ‘patient complains of blank,’” she said. “They're not complaining. They're just coming to you for help with a problem.” 

Other words and phrases that get flagged include “noncompliant,” “elderly” and “poor historian.”
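CareAlign's implementation isn't public, but the nudge Airan-Javia describes can be approximated with simple pattern matching over a note. In the sketch below, the flagged phrases are the examples from this article; the suggested alternatives are illustrative, not her validated replacements.

```python
import re

# Hypothetical term-to-suggestion map. The flagged phrases come from the article;
# the rewording suggestions are illustrative only.
NUDGES = {
    r"\bnon-?compliant\b": "describe the specific barrier (e.g. 'unable to afford the medication')",
    r"\belderly\b": "state the patient's age instead",
    r"\bpoor historian\b": "note what limited the history (e.g. 'history limited by confusion')",
    r"\bcomplain(s|ed|ing)? of\b": "use 'reports' or 'describes'",
}

def nudge_note(note: str) -> list[str]:
    """Return a nudge for each potentially stigmatizing phrase found in a clinical note."""
    nudges = []
    for pattern, suggestion in NUDGES.items():
        for match in re.finditer(pattern, note, flags=re.IGNORECASE):
            nudges.append(f"'{match.group(0)}' can carry bias; consider: {suggestion}")
    return nudges

print(nudge_note("Elderly patient complains of dizziness; noncompliant with metformin."))
```

A richer system would use NLP models rather than a fixed regex list, but the clinician-facing part is the same: a suggestion surfaced at the moment of writing.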

“With stigmatizing language, there is still a lot of disagreement around what are words that you should avoid,” Airan-Javia said. “But, to be honest, I think I was just tired of waiting. I wanted to start making an impact and start getting it out there.”

But that concern also speaks to a larger challenge with AI: knowing whether and how the results were validated. Academic communities are just starting to put together guidelines around responsible use of the technology.

Still, she emphasized that AI will make a significant impact on medical practice, particularly around workflow.

“Much like Excel is a tool that we use to get work done,” Airan-Javia said, “AI will be a tool that we use to be more efficient … so that we're doing fewer things that don't require as much higher-level learning for us.”
