News from 秘密研究所 Langone Health
STAT News – September 5
-Michael P. Recht, MD, the Louis Marx Professor of Radiology, chair, Department of Radiology
*Subscription required. Please see full text at end of report.
USA Today – September 5
-Nicole M. Ali, MD, clinical associate professor, Department of Medicine, Division of Nephrology
Daily Record – September 4
-Lisa Ganjhu, DO, clinical associate professor, Department of Medicine, Division of Gastroenterology and Hepatology
Epicenter NYC – September 1
-Nadia S. Islam, PhD, associate professor, Department of Population Health, Institute for Excellence in Health Equity
SciTech Daily – September 2
-Joanne Bruno, MD, PhD, fellow, Department of Medicine, Division of Endocrinology, Diabetes, & Metabolism
-Jose O. Aleman, MD, PhD, assistant professor, Department of Medicine, Division of Endocrinology, Diabetes & Metabolism
Cancer Network – September 4
-Joshua K. Sabari, MD, assistant professor, Department of Medicine, Division of Hematology and Medical Oncology, Perlmutter Cancer Center
Health Digest – September 2
-Ashley S. Roman, MD, the Silverman Professor of Obstetrics and Gynecology, vice chair for clinical affairs-Obstetrics, Department of Obstetrics and Gynecology
Medscape – September 5
-Julia Greenberg, MD, resident, Department of Neurology
Women's Health – September 5
-Nicole Lund, MPH, RDN, clinical nutritionist, Department of Rehabilitation Medicine, Sports Performance Center
Martha Stewart – September 1
-Nina Blachman, MD, associate professor, Department of Medicine, Division of Geriatric Medicine & Palliative Care
The Straits Times – September 5
-Benjamin M. Brucker, MD, associate professor, Departments of Urology, and Obstetrics and Gynecology
FOX News – September 3
-Michael B. Whitlow, MD, clinical associate professor, the Ronald O. Perelman Department of Dermatology
-Doris Day, MD, clinical associate professor, the Ronald O. Perelman Department of Dermatology
-Marc K. Siegel, MD, clinical professor, Department of Medicine, Division of General Internal Medicine
FOX News Rundown Podcast – September 1
-Marc K. Siegel, MD, clinical professor, Department of Medicine, Division of General Internal Medicine
USA Today – September 5
-Marc K. Siegel, MD, clinical professor, Department of Medicine, Division of General Internal Medicine
-Lenard A. Adler, MD, professor, Departments of Psychiatry, and Child and Adolescent Psychiatry
News from 秘密研究所 Langone Hospital–Long Island
LongIsland.com – September 4
-Erika Banks, MD, professor, Department of Obstetrics and Gynecology, 秘密研究所 Long Island School of Medicine
-秘密研究所 Langone Hospital–Long Island
The Island 360 – September 1
-Erika Banks, MD, professor, Department of Obstetrics and Gynecology, 秘密研究所 Long Island School of Medicine
-秘密研究所 Langone Hospital–Long Island
News from 秘密研究所 Langone Hospital–Brooklyn
Popsugar – September 1
-Meleen Chuang, MD, clinical associate professor, Department of Obstetrics and Gynecology, Family Health Centers at 秘密研究所 Langone
-Veleka Willis, MD, clinical assistant professor, Department of Obstetrics and Gynecology
*STAT News, September 5, 2023 - Moving beyond ChatGPT: How generative AI is inspiring dreams of a health data revolution - The world's largest technology companies are racing to build generative AI into every corner of health and medicine.
Microsoft has formed an alliance with the electronic health records vendor Epic to wire the technology into dozens of health software products. Google is infusing it into tools used by hospitals to collect and organize data on millions of patients. Not to be outdone, Amazon has unveiled a service to help build clinical note scribes, and is separately working to embed generative AI in drug research and development.
All of this has unfolded faster than federal regulators could blink – or answer questions about how the technology should be tested and evaluated, whether it will help or hurt patients, and how it will impact privacy and the use of personal data.
To keep tabs on its use, STAT built a tracker designed to catalog the emerging applications and experiments, and to trace the alliances quickly forming between the builders of generative AI models and health businesses eager to save time and money – and score marketing points – from their use.
"There is a lot of froth and hype out there," said Brian Anderson, chief digital health physician at Mitre Corp. and co-founder of an industry group developing standards for AI in medicine. "The concern many of us have is that, particularly with generative AI in a consequential space like health, it is inappropriate to use a tool that we don't have an agreed-upon set of standards or best practices for."
Trained on vast amounts of data, these AI systems use pattern recognition to produce human-like responses to questions posed in just about any kind of language, from written text, to imaging data, to computer code. Dozens of businesses are applying generative models built by Google or OpenAI, the developer of ChatGPT, to perform tasks in health care delivery, drug research, and medical billing. Most of the work involves bureaucratic jobs that human workers would like to offload, such as answering patient emails or filling out medical records. But some users are beginning to apply the technology to core medical tasks, such as early disease detection, diagnosis, and treatment.
Efforts to apply it are constantly invoked in press releases and advertisements by health businesses wanting to appear on the cutting edge, even as its ability to improve care remains speculative, and its harms largely hidden. "There's this disconnect between the flashy headlines and claims that have yet to be verified, and presumably unflashy applications that can have immediate value to patients and people in the health care system," said Nigam Shah, chief data scientist for Stanford Health Care.
The work ahead will only get harder. Clinicians and researchers must expose the technology's biases and blind spots – and develop strategies to fix them – before these systems are used to deliver information to patients, or to help doctors make decisions about their care. Here's a closer look at the projects underway.
Generative AI's early demonstrations provided the first inkling of its potential. But the next wave of uses is already unfolding, as hospitals test its ability to surface the most salient information in voluminous patient records.
HCA Healthcare, a for-profit hospital chain based in Nashville, is developing a tool using Google's PaLM model to write the messages nurses send one another about patients at shift change.
The goal is to relieve nurses of a time-consuming search through records to find details on medication changes, lab results, and other crucial information. It is a purely bureaucratic task, but also one that can lead to harm, or sloppy care, if done incorrectly.
The question facing the AI is not just whether it makes the nurses' jobs easier, but whether its use results in fewer errors and better patient care relative to the current manual process. Mike Schlosser, a physician and vice president of care transformation and innovation at HCA, said the system is testing the tool by comparing the automated and manually produced notes side by side.
"We're still learning," he said. "Right now we're using a lot of human-in-the-loop to ensure that it's providing the right output."
At Northwestern Medicine in Chicago, clinicians are working with Epic, through its partnership with Microsoft, to use generative AI to review and summarize records to help flag emerging medical issues and help clinicians be more responsive to treatment needs.
Doug King, chief information officer at Northwestern, said the technology offers an enormous opportunity to relieve clinicians of administrative tasks that soak up hours and reduce job satisfaction. But he also emphasized that its benefits still have to be proved.
"What people forget is that these things aren't free," King said. "They are not free to train, and you have to maintain them. Balancing the costs against the true benefit is a conversation that every health system is going to have to have."
Michael Recht, chief of radiology at 秘密研究所 Langone Health, is leveraging ChatGPT to help patients understand their imaging results.
The idea came from a project he's been working on to give his patients short videos explaining the abnormalities or problems found in their images. The hardest part of that work, he said, is getting radiologists to translate jargon-heavy reports into plain language.
So far, ChatGPT appears highly adept at boiling it down. But Recht said its accuracy depends on the effectiveness of the prompt provided to the AI.
"I'd love to tell you I have this great logical way of doing it," Recht said. "But to be honest with you, it's been trial and error." He said he's spent hours tinkering with prompts at night, one of many clinicians at 秘密研究所 applying the technology to a wide array of tasks. In some situations, he just asks the AI directly: Why are you getting this wrong? How can I make it easier for you to understand?
"Sometimes that does help. It gives me some clues on how to simplify my prompt," Recht said. Radiologists at 秘密研究所 are still monitoring its output to ensure any information delivered to patients is accurate.
"If we can't do that, then we won't use it," Recht said. "We think we can solve it, but we're still working on it, and that's why it's not in production at this point."
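The trial-and-error prompt work Recht describes can be sketched in a few lines. Everything below is hypothetical: the prompt template, the reading-level parameter, and the sample report are illustrative, not any health system's actual pipeline, and the call to a language model is omitted so the sketch stays self-contained.

```python
# Hypothetical sketch of prompt iteration for simplifying radiology
# reports. The template and parameter names are illustrative only;
# in practice a radiologist would review every model output before
# anything reached a patient.

def build_simplification_prompt(report: str, reading_level: str = "8th grade") -> str:
    """Wrap a jargon-heavy radiology report in instructions asking a
    language model to restate it in plain language without adding
    findings that are not in the report."""
    return (
        f"Rewrite the following radiology report so a patient at a "
        f"{reading_level} reading level can understand it. Define any "
        f"medical term you must keep, and do not add findings that "
        f"are not in the report.\n\n"
        f"Report:\n{report}"
    )

# Trial and error: generate prompt variants and compare which one
# yields the most faithful plain-language summary.
sample_report = ("Mild degenerative changes of the lumbar spine "
                 "without acute osseous abnormality.")
for level in ("8th grade", "5th grade"):
    prompt = build_simplification_prompt(sample_report, level)
    print(prompt.splitlines()[0])
```

The key design point mirrored from the article is that the report text is passed through verbatim and the instructions forbid adding findings, so a human reviewer can check the output line by line against the source report.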
In Boston, Mass General Brigham is already exploring a longer-term goal for generative AI: using it to help physicians diagnose patients and make the best treatment decisions.
A recent study by researchers at the health system showed ChatGPT, the popular model developed by OpenAI, was 77% accurate when diagnosing published case studies. However, the model was only 60% accurate in making differential diagnoses, or coming up with a list of plausible illnesses based on a patient's symptoms. It wasn't much better – 68% accurate – in making treatment decisions, reflecting the technology's difficulty dealing with uncertainty.
"If we're going to use this, we need to know where it's good and where it's not," said Marc Succi, a physician and associate chair of innovation and commercialization at Mass General Brigham. "It's got to be applied in very specific situations."
The health system is also pursuing lower-risk applications to draft responses to emails and automatically load appointment details into medical records. Succi said it could take many years of experimentation, and trust building, before the technology is used directly in patient care. The fastest way to do that, he said, is for model developers to be more open about how they're iterating their models to achieve better results.
"Companies should be sharing that data with hospitals if they want to get adoption," he said, adding that clinical users will likely need to fine-tune the AI systems before using them. "I do envision that, in 5, 10, 15 years, we'll be training sub-specialty chatbots as well as a generalist chatbot."
The largest category of generative AI tools publicly adopted by health systems is the clinical note scribe, which creates a transcript of a recorded doctor's visit and summarizes it into a note for a patient's medical record. While the tools have been in use for years, the release of GPT-4 in March has supercharged the technology – and demand for it.
"It became very clear pretty quickly that voice-to-text powered by GPT-4 especially, but also other models, was a very compelling use case right off the bat," said Byron Crowe, an internal medicine physician at Beth Israel Deaconess Medical Center and a digital health researcher tracking generative AI. It performs well at creating visit summaries, and it's possible to manage the risk of mistakes by having a human review the notes.
But some health systems are pushing the systems to do even more, piloting scribes that skip the review step and fully automate documentation. Northwestern Medicine is one of the few health systems that has acknowledged its planned testing of DAX Express, Nuance's GPT-4-powered automated scribe, in a bid to reduce some of the clinician burnout that can stem from documentation.
"You have caregivers leaving the industry. You have an aging population," said Northwestern's King. "Technology is one of the main levers, if not the only lever, that we can pull to try to take tasks off of clinicians."
Most clinical tools apply generative AI's superpowers to text in medical records. But in the future, large language models "will be directly applicable to more than just text," said Jeffrey Ferranti, senior vice president and chief digital officer of Duke Health, pointing out GPT-4's ability to analyze images.
Cedars-Sinai says it is using generative AI to create physician avatars that communicate with patients, with software developed by Acolyte Health. The idea is that patients will engage more with important preventive health care strategies, appointment follow-up information, and reminders when they're delivered by a trusted human – or at least something that looks like a human.
"It's to the point where I think most patients wouldn't identify that it's generated," said Craig Kwiatkowski, chief information officer at Cedars-Sinai. "It's a little similar to a deep fake, and I've seen some of those where I can't tell."
This year, its first pilot of the technology sent videos to patients encouraging them to follow up on colorectal cancer screening, using the likeness and voice of Cedars-Sinai physician Caroline Goldzweig. About 40% of those patients returned their screening tests, said Kwiatkowski – a response rate about twice that of the industry benchmark for screening outreach.
It plans to expand the technology to more physicians and use cases, including preventive care prompts for diabetes, and in the long run may use AI to generate more personalized video scripts by incorporating data from patients' clinical records. But that would incur more risk, Kwiatkowski said. "We're proceeding deliberately and cautiously," he said. "The content is established, curated, and authenticated so that we don't have AI that's just making up things that the doctor could say."
Many health systems aren't diving in head first, choosing a strategy of watchful waiting as research progresses.
Duke Health, which founded the Health AI Partnership, a group that advises hospitals on AI use, intends to use generative tools and is working to establish the necessary infrastructure and training. But it doesn't have any generative AI tools in production today, said Duke's Ferranti. Neither does Mount Sinai, a press officer told STAT.
And when more risk-averse health systems do start to experiment with the technology, they'll likely stay with safer administrative tasks rather than anything patient-facing. "From a health care system standpoint, I don't see the reason for being super excited about clinical use," said Stanford's Shah. "There's a whole bunch of other stuff we can do that is easy, straightforward, and direct value," including scheduling, translating patient instructions to the language of their choice, and acting as an interpreter during a visit.
"This new generative AI technology that we are seeing is a world-changing technology, and it really has the potential to impact all aspects of what we do in medicine from the bench-side research in the laboratory, to the bedside, to some of the administrative things that we do," said Ferranti. "But I think we all acknowledge there's a lot of real risks associated with these new technologies."