ChatGPT might show more empathy than human doctors: study

Artificial intelligence might be better at humanity than humans themselves.

AI assistants could express more sympathy toward patients, a new study suggests.

The study, published Friday in the journal JAMA Internal Medicine, found that OpenAI’s ChatGPT answers patient questions with more compassion than human physicians do.

“The opportunities for improving health care with AI are massive,” lead author John W. Ayers, an epidemiologist from the Qualcomm Institute at the University of California San Diego, said in a release. “AI-augmented care is the future of medicine.”

Researchers at the University of California San Diego in La Jolla took 195 patient questions from Reddit’s AskDocs forum — a social media page where people publicly post medical questions for doctors to respond to — and had both human physicians and ChatGPT answer them.

“ChatGPT might be able to pass a medical licensing exam,” study co-author Davey Smith, a physician-scientist, co-director of the UC San Diego Altman Clinical and Translational Research Institute and professor at the UC San Diego School of Medicine, said. “But directly answering patient questions accurately and empathetically is a different ballgame.” 


The responses were evaluated by a panel of licensed health-care professionals who rated each answer on “the quality of information provided” — very poor, poor, acceptable, good or very good — and on “the empathy or bedside manner provided” — not empathetic, slightly empathetic, moderately empathetic, empathetic or very empathetic.

ChatGPT won — and the competition didn’t even come close.

“ChatGPT messages responded with nuanced and accurate information that often addressed more aspects of the patient’s questions than physician responses,” Jessica Kelley, a nurse practitioner with the San Diego firm Human Longevity and a study co-author, said.


The panel of evaluators chose the chatbot’s response over the physician’s nearly 80% of the time.

“It’s pretty obvious why AI was better. It’s not constrained by time,” Ayers told Axios. “You could take a simple query like: ‘I have a headache, can you help me?’ and you’ll immediately see ChatGPT say ‘I’m sorry you have a headache.’ The doctor knows that, they feel that. They don’t have time to say it.”

While artificial intelligence is nowhere near replacing doctors, the findings suggest that bringing ChatGPT and other AI assistants into the health-care system could help human physicians provide higher-quality, more efficient and more empathetic care by improving workflow, reducing health disparities among minority patients and ultimately improving patient outcomes.

“We could use these technologies to train doctors in patient-centered communication, eliminate health disparities suffered by minority populations who often seek health care via messaging, build new medical safety systems, and assist doctors by delivering higher quality and more efficient care,” Mark Dredze, Ph.D., the John C. Malone Associate Professor of Computer Science at Johns Hopkins University and study co-author, said.

The study authors noted that further research would need to be conducted in clinical settings, including using chatbots to draft responses that human physicians could then edit. Trials could also help determine whether AI assistants “might improve responses, lower clinician burnout and improve patient outcomes.”

“I never imagined saying this, but ChatGPT is a prescription I’d like to give to my inbox. The tool will transform the way I support my patients,” Aaron Goodman, M.D., associate clinical professor at UC San Diego School of Medicine and study co-author, admitted.