Patients Use Generative AI to Search for Doctors

Illustration: A patient uses generative AI on a laptop to find a doctor, with a doctor's profile appearing on a smartphone screen.


By Stuart Kerr, Technology Correspondent

Published: 23/08/2025
Last Updated: 23/08/2025
Contact: [email protected] | Twitter: @LiveAIWire
Author Bio: About Stuart Kerr


A growing number of patients are turning to generative AI tools such as ChatGPT and Gemini when searching for doctors online, according to new survey data. As healthcare costs rise and online information grows more fragmented, patients are increasingly using AI to cut through the noise and make better-informed decisions about providers.

TechTarget reports that nearly one-third of patients now rely on AI-powered search to identify healthcare providers. Patients use conversational prompts to filter doctors by location, specialty, insurance coverage, and even patient reviews. The appeal lies in AI’s ability to summarize multiple sources instantly, compared to the manual sifting required with traditional search engines.
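To make the pattern concrete, the sketch below shows the kind of multi-criteria, conversational prompt the survey describes, expressed through the OpenAI Python client rather than a chat window. It is purely illustrative: the model name, prompt wording, and filtering criteria are assumptions for this example, and nothing in it verifies provider information.

```python
# Illustrative sketch only: the sort of conversational, multi-criteria query
# the survey describes, sent via the OpenAI Python client. The model name,
# prompt wording, and criteria are assumptions, not any patient's actual query.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

prompt = (
    "I'm looking for an in-network dermatologist within 15 miles of Austin, TX "
    "who accepts my insurance, is taking new patients, and has strong patient "
    "reviews. Summarize the top options and note what I should verify directly "
    "with each office."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "system",
            "content": "You help patients compare healthcare providers. "
                       "Flag anything you cannot verify.",
        },
        {"role": "user", "content": prompt},
    ],
)

print(response.choices[0].message.content)
```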

Trust and Reliability Concerns

However, trust remains a significant hurdle. An Ipsos survey found that 31% of consumers already use generative AI for healthcare queries, but many worry about accuracy. While AI can quickly consolidate available information, it can also generate hallucinations—responses that sound authoritative but lack factual grounding.

A study from the Annenberg Public Policy Center found that 75% of respondents considered AI-generated health responses useful and 63% judged them reliable, yet a majority still double-checked with a doctor before acting on them. This tension highlights AI’s growing role as a complement to, rather than a replacement for, professional care.

The Promise and Pitfalls

Researchers warn of deeper ethical and practical issues. A BMJ report (PDF) examined how patients integrate AI into primary care decisions, raising concerns about uneven adoption across demographics. Younger digital natives may readily embrace AI guidance, while older patients remain more skeptical.

Similarly, a report from the AAFP (PDF) highlights the risks of bias and misinformation. If AI is trained on skewed or incomplete healthcare data, it could unintentionally reinforce systemic inequities in access and outcomes.

AI as a Front Door to Healthcare

For many, generative AI is becoming the front door to the healthcare journey. Instead of starting with a search engine or an insurance portal, patients ask AI tools to recommend providers, check availability, or draft initial outreach emails. This is particularly appealing in markets where insurance complexity and a lack of transparency make finding a new provider daunting.

This trend mirrors themes in AI & Autism — Neurodiverse Communication, where AI was shown to bridge communication gaps. In the context of healthcare, AI serves as an interpreter between patients and an often opaque medical system.

Economic and Social Drivers

The shift is also tied to rising economic pressures. As highlighted in AI Exodus: Automation & the Jobless Future, automation often emerges in response to cost constraints. Patients—like businesses—are adopting AI to reduce the time and expense of decision-making.

In addition, the expansion of generative AI aligns with advances in underlying model architectures. As observed in New AI Model Mor Succeeds Transformers, more capable models are enabling AI to move beyond simple Q&A toward personalized, context-aware assistance.

Looking Ahead

While healthcare regulators remain cautious, momentum is unlikely to slow. Patients are demonstrating that generative AI has a role in healthcare—not to diagnose or treat, but to navigate a labyrinth of options. The next step will be integrating these tools responsibly into official patient engagement strategies, balancing accessibility with reliability.

As with all AI shifts, the future will hinge on trust. Patients may continue to embrace AI as a search companion, but doctors, policymakers, and developers must collaborate to ensure it supports—not distorts—the path to care.


About the Author
Stuart Kerr is the Technology Correspondent for LiveAIWire. He writes about artificial intelligence, ethics, and how technology is reshaping everyday life. Read more.
