AI in healthcare is about clinicians, not technology

Nele Jessel, MD, athenahealth
December 18, 2025
7 min read

Over the course of my career as a physician and healthcare informaticist, I’ve watched technology reshape nearly every aspect of care delivery. Some of those shifts expanded our capacity. Others added complexity, often in ways that pulled clinicians further from the people they serve.

Today, we are at an exciting turning point. Artificial intelligence (AI) has moved from theoretical promise to daily practice. It now influences how documentation is created, how we retrieve and synthesize information, how staff manage intake and scheduling, and how clinical decisions are supported. But unlike previous waves of digital change, AI is offering something that has become increasingly rare in modern healthcare: relief.

This year, the athenaInstitute launched our AI on the Frontlines of Care study to understand what AI is truly doing inside practices — what’s helping, what’s not, what clinicians hope for, and what must be true before they will trust AI as part of the practice of care. The results reveal a field in motion: hopeful, discerning, and already building a clearer sense of how AI should serve the work of care.

AI is already part of daily practice and clinicians are using it with purpose

AI technologies are far more embedded in ambulatory care than many may realize. Among the 500+ clinicians and practice leaders in our study — all of whom use AI in their day-to-day work — 62% use four or more AI-enabled tools today. The most used AI-powered healthcare solutions include:

  • Self-scheduling (59%)
  • Automated intake and check-in (47%)
  • Ambient / NLP-powered documentation (36%)
  • Clinical decision support (35%)

These adoption levels signal something important: clinicians are already turning to AI where it helps them manage the pressures of modern medicine.

Yet adoption is not the same as maturity. We also heard consistently about friction and noise in how AI is currently deployed — discrepancies between AI output and clinical judgment (39%), difficulty integrating tools with existing EHR workflows (39%), and concerns around data privacy (35%). These challenges do not undermine adoption of AI, but they do help explain the shape and pace of its continued rollout. The tools must do more than provide relief; they must prove reliable before they are accepted fully as an integral part of providing care. As a physician, I see this as healthy skepticism grounded in professional responsibility.

The clearest value of AI today is burden relief — especially through ambient documentation

If there is one theme that came through most strongly, it is this: AI’s greatest contribution so far is giving clinicians and staff time back.

When asked why they adopted AI, respondents cited reducing documentation workload (45%) and administrative burden (44%) as their top motivations. And the impact is unmistakable:

  • 63% say AI is reducing documentation burden
  • 65% believe AI does more good than harm in the delivery of care
  • 69% of physicians see AI as a way to focus less on the EHR and more on connecting with patients

Ambient documentation stands out as the most powerful example. Clinicians describe it not just as a time-saver, but as a restoration of presence — the ability to sit with patients, observe non-verbal cues, and focus fully on the patient encounter.

It is difficult to overstate how meaningful that shift is. Care is relational at its core. When technology returns us to that center, everyone benefits.

Clinicians trust AI to help — but not to care

One of the most encouraging findings is how consistently clinicians draw the line between assistance and autonomy. They welcome AI for tasks requiring pattern recognition, information retrieval, or synthesis:

  • 76% believe AI can surface care gaps
  • 60% think AI can help identify easy-to-miss details
  • 62% see AI’s potential to help balance guidelines with individualized care

But when it comes to work defined by empathy, interpretation, or moral reasoning, clinicians are categorically clear that this is not the job of artificial intelligence. Clinicians are not worried about being replaced by AI, but they are worried about maintaining what makes care human:

  • 70% say building rapport must remain human
  • 69% say comforting and reassuring patients must remain human
  • 65% say interpreting non-verbal cues must remain human

Reliability and oversight are therefore essential. Human judgment before AI action (38%), consistent performance over time (32%), and routine accuracy monitoring (31%) are the trust signals clinicians rely on most, with less emphasis placed on algorithmic transparency and other controls. Perhaps most importantly, physicians in our qualitative research consistently spoke about this delineation of care responsibilities with a sense of purpose: they got into medicine to care for people, and when AI takes on the more rote and administrative tasks, they regain the joy and fulfillment of connecting with patients.

As a CMO, this resonates deeply. Clinicians are asking AI to take on what machines do best, so they can focus on what humans do best — and find their purpose and delight in practicing medicine again.

Patient expectations are outpacing practice capacity — and AI can help close the gap

Clinicians report feeling increasing pressure to deliver quicker access to appointments, more personalized communication, and smoother digital interactions. 84% of respondents say patient expectations are higher than ever, and nearly half believe AI can help practices meaningfully enhance the patient experience with smoother appointment management, more personalized communication and education, and better support outside traditional office hours. They also see strong potential for AI to streamline check-in processes and scheduling, with the goal of reducing wait times and enhancing care access.

At the same time, patients understandably have questions about privacy. 47% of respondents say patients have raised concerns about sharing their data externally. AI must lead with transparency in order to earn trust from both caregivers and those receiving care.

Interoperability, data quality, and financial pressure shape the real-world AI experience

It’s impossible to talk about AI without acknowledging the ecosystem into which it is deployed. Secure, connected healthcare data is a prerequisite for AI models to deliver the kinds of insights and efficiencies that will augment care. Yet clinicians still struggle to access complete and relevant patient information across fragmented systems:

  • 42% report difficulty accessing data across records
  • 46% cite inconsistent data formats across platforms
  • 45% struggle to integrate only relevant data into the chart

Financial pressure and staffing shortages compound the reality of day-to-day operations in U.S. healthcare:

  • 41% cite declining reimbursement
  • 36% identify burnout as significant
  • 34% point to staffing challenges

This backdrop of system-level challenges explains why clinicians primarily evaluate AI through the lens of burden reduction, revenue stability, and operational resilience. They are not looking for AI to redefine medicine itself, but to help make the current system more humane and sustainable.

A field in forward motion, but still in its formative stages

Across the findings, a clear pattern emerges: Clinicians are not resisting AI — they are learning how to incorporate it thoughtfully. They are discerning, not doubtful. Optimistic, but grounded. They want tools that are dependable, predictable, aligned with their workflows, and supportive of their judgment.

The evidence points to a field moving steadily along a trajectory of trust and tech adoption: from First Win, to Proving Value, and gradually toward a New Normal of medical practice infused with responsible AI across clinical and operational tasks.

Our responsibility as an industry is to support that progression — to build systems that honor the relational heart of care, ensure that innovation does not outrun trust, and to measure success not by the sophistication of the technology, but by how effectively it restores humanity to healthcare.

At athenahealth, we are committed to that path. And I am encouraged — not only by the progress we’re seeing, but by the clarity with which clinicians are telling us what they need next.

Tags: AI in healthcare, thought leadership, electronic health record, healthcare & burnout, patient engagement, reducing admin burden, clinical documentation, EHR usability, data & interoperability, staff shortages