ChatGPT becomes a tool for doctors to deliver bad news – 06/21/2023 – Health

OpenAI released the first free version of ChatGPT last November 30th. Within 72 hours, doctors were already using the AI-powered chatbot.

“I was amazed and interested, but frankly a little alarmed,” said Peter Lee, corporate vice president of research and incubation at Microsoft, which invested in OpenAI.

He and other experts predicted that ChatGPT and other AI-powered large language models could take over mundane tasks that consume hours of physicians’ time and contribute to burnout among them, things like writing health insurance referrals or summarizing patients’ charts.

But they worried that artificial intelligence also offered an all-too-tempting shortcut to finding diagnoses and medical information that might be incorrect or even made up — a daunting prospect in a field like medicine.

What Lee found most surprising, though, was a use he hadn’t anticipated: doctors were using ChatGPT to help them communicate with patients in more compassionate ways.

In one survey, 85% of patients said that a physician’s compassion was more important to them than the cost of care or the waiting time. In another, nearly three-quarters of respondents said they had seen doctors who were not compassionate. And a study of doctors’ conversations with the families of dying patients found that many of the doctors lacked empathy.

And then came chatbots, which doctors are using to find the words to convey bad news or express concern about a patient’s distress, or just to explain medical advice more clearly.

Even Microsoft’s Peter Lee said this is a bit disconcerting.

“Personally, as a patient I would feel weird about that,” he commented.

But Michael Pignone, director of the department of internal medicine at the University of Texas at Austin, has no qualms about the help he and other physicians on his staff have received from ChatGPT to communicate regularly with their patients.

He explained the problem in medical jargon: “We’re running a project on improving treatments for alcohol use disorder. How do we get patients who haven’t responded to behavioral interventions to adhere to treatment?”

Or, as ChatGPT might respond if asked to translate that into plain language: what can doctors do to help patients who drink too much but haven’t stopped, even after talking to a therapist?

Pignone asked his team to write a script for talking to these patients in a compassionate way.

“A week later, nobody had done it,” he said. All he had was a text put together by his research coordinator and a social worker on the staff, “and it wasn’t a real script.”

So Pignone tried ChatGPT, which responded immediately with text covering the topics doctors wanted.

But social workers said the script needed to be revised for patients with little medical knowledge and also needed to be translated into Spanish. The end result, which ChatGPT produced when asked to rewrite the text for a fifth-grade reading level, began with a reassuring introduction:

“If you think you drink too much alcohol, you’re not alone. Many people have this problem, but there are treatments that can help you feel better and lead a healthier, happier life.”

Then there was a simple explanation of the pros and cons of the different treatment options. The team started using the script this month.

Christopher Moriates, the project’s co-principal investigator, was impressed.

“Doctors are notorious for using language that is difficult to understand or overly advanced,” he said. “It’s interesting to see that even words that we think are easily understandable actually aren’t.”

Some experts question whether it’s really necessary to use an AI program to find compassionate words.

“Most of us want to trust our doctors and respect them,” said Isaac Kohane, professor of biomedical informatics at Harvard Medical School. “If they demonstrate that they are good listeners and are empathetic, the trust and respect we feel tends to increase.”

But empathy can be deceptive. Kohane says it can be easy to confuse a kind attitude on a doctor’s part with good medical advice.

For Douglas White, director of the critical illness ethics and decision-making program at the University of Pittsburgh School of Medicine, there’s a reason doctors may neglect compassion. “Physicians are generally cognitively focused, treating a patient’s medical problems as a series of problems to be solved,” he said. As a result, they may fail to pay attention to “the emotional side of what patients and their families are going through.”

At other times, physicians may be fully aware of the need for empathy, but cannot find the right words to express it.

Gregory Moore, until recently a senior executive heading Microsoft’s health and life sciences division, wanted to help a friend who had advanced cancer. Her situation was very serious, and she needed advice about her treatment and her future. Moore decided to put her questions to ChatGPT.

The result astonished him.

In long, compassionately worded responses to Moore’s prompts, the program gave him the words with which to explain to his friend the lack of effective treatments:

“I know this is a lot of information to process and you may be disappointed or frustrated with the lack of options… I wish there were more treatments and better treatments… And I hope there will be in the future.”

Moore became an evangelist for ChatGPT, telling his doctor friends what had happened. But he and others say that when doctors turn to the chatbot to find words to speak with more empathy, they are often hesitant to reveal it, except to a few friends.

“Perhaps it’s because we’re clinging to what we see as an intensely human part of our profession,” Moore said.

Or, as Harlan Krumholz, director of the Center for Outcomes Research and Evaluation at the Yale School of Medicine, pointed out, for a doctor to admit to using the chatbot for this purpose is equivalent to “admitting that he does not know how to talk to his patients.”

Yet those who have experimented with ChatGPT say the only way clinicians can decide how comfortable they are handing over tasks like this to a chatbot, whether cultivating an empathetic approach or reading patients’ records, is to ask it a few questions themselves.

“You’d be crazy not to try it and find out more about what it can do,” Krumholz said.

Translated by Clara Allain
