Parents trust ChatGPT over doctors, shocking new study claims

Paging Dr. Bot.

In a new study, researchers at the University of Kansas’ Life Span Institute found that parents trusted health information generated by artificial intelligence (AI), such as ChatGPT, more than information from healthcare professionals.

“Participants found minimal distinctions between vignettes written by experts and those generated by prompt-engineered ChatGPT,” says Calissa Leslie-Miller, a doctoral student in clinical child psychology at the university and lead author of the study. “When vignettes were statistically significantly different, ChatGPT was rated as more trustworthy, accurate, and reliable.”


The team conducted a study with 116 parents aged 18 to 65 who were given health-related texts about children.

Each participant reviewed the content without knowing its original author and judged whether they believed it had been produced by ChatGPT or by healthcare professionals.

Although the study did not examine why parents trusted ChatGPT more, it details factors that could contribute to their preference.

Jim Boswell, president and CEO of OnPoint Healthcare Partners, who has experience developing an AI-based platform, believes ChatGPT’s straightforward way of presenting information makes it easier for people to digest.

“I can understand why [parents], not knowing the source, would prefer the wording of AI,” says Mordechai Raskas, MD, EdM, chief medical information officer and director of telemedicine at PM Pediatric Care. “Think of AI as the ultimate salesperson; it knows exactly what to say to win you over.”

Parents also prefer relying on AI because they can get fast answers to their questions without waiting for a doctor’s appointment.


However, while using ChatGPT might be a quick fix for many parents, it comes with some drawbacks.

“Information may be inaccurate or not tailored to specific circumstances. For example, suggesting medication for a child who is too young or offering incorrect treatment advice could lead to a wide range of dangerous outcomes,” says Leslie-Miller.

Experts suggest checking the sources behind AI-generated answers, or consulting a medical professional, before acting on them.

“Reputable health content usually credits qualified medical writers or health professionals and links to research-backed sources,” Boswell adds.


AI tools like ChatGPT gather information from various online sources and compile it into a single answer. But when it comes to health questions, those answers lack a medical expert’s opinion tailored to the individual patient.

“Relying on these tools for medical advice could lead to missed symptoms, misinterpretations of serious conditions, or delays in seeking appropriate care,” says Boswell. “For kids, in particular, small health issues can escalate quickly, so having a qualified professional assess a situation is essential.”

Leslie-Miller recommends that parents also use trusted online medical sources such as the American Academy of Pediatrics (AAP), the Centers for Disease Control and Prevention (CDC), and the World Health Organization (WHO). Some hospitals also offer health information and advice from their healthcare providers.

“Reading online and searching can be very helpful,” says Dr. Raskas. “It just depends on the context and needs to be in conjunction with a trusted source or professional to help digest what you’ve read.”
