Building trust in health care through better AI governance

As artificial intelligence rapidly transforms health care and other sectors, concerns about data security, outcomes, and other emerging risks are growing. During a recent Reuters Events Momentum AI San Jose 2025 panel, Vincent Liu, MD, MS, chief data officer, The Permanente Medical Group, highlighted the need for building trust in health care through better AI governance, transparency, and accountability.
“At the heart of all this, whether it’s about AI or a new medication or intervention, is trust,” said Dr. Liu, who is also a senior research scientist at the Kaiser Permanente Northern California Division of Research. “It’s about delivering high-quality, affordable care, doing it in a safe and effective way, and ultimately using technology to do that in a human way.”
Related health care innovation podcast: PermanenteDocs Chat on the promise of AI in health care
He added that Permanente physicians have long called AI “augmented intelligence” to emphasize its role in enhancing the work of clinicians and employees.
“I think the fear is that AI will disintermediate patients from their physicians or their nurses,” he said. “But we’re working to deploy AI in a way that really supports our clinicians and ultimately supports our patients in a way that brings trust to moments that can be very, very challenging.”
For example, Permanente clinicians use ambient AI listening technology to record clinical notes during patient visits, allowing them to focus on communication rather than documentation. Physicians review and edit these AI-generated notes before adding them to a patient’s electronic medical record. Over the past year, Kaiser Permanente has implemented this tool in 40 hospitals and more than 600 medical offices across 8 states and the District of Columbia.
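To make that review loop concrete, here is a minimal sketch of an ambient-documentation workflow. The names (draft_note_from_transcript, ReviewStatus, file_to_emr) and the stubbed summarization step are illustrative assumptions, not Kaiser Permanente's actual tooling; the point is simply that the AI output stays a draft until a physician signs it.

```python
from dataclasses import dataclass
from enum import Enum

class ReviewStatus(Enum):
    DRAFT = "draft"    # AI-generated, not yet reviewed by the physician
    EDITED = "edited"  # physician has revised the draft
    SIGNED = "signed"  # physician approved; eligible for the chart

@dataclass
class VisitNote:
    patient_id: str
    transcript: str    # ambient-captured conversation text
    draft: str = ""
    status: ReviewStatus = ReviewStatus.DRAFT

def draft_note_from_transcript(transcript: str) -> str:
    """Placeholder for the AI summarization step (model call not shown)."""
    return f"Summary of visit: {transcript[:80]}..."

def physician_review(note: VisitNote, edits: str = "") -> VisitNote:
    """The physician reviews the draft, optionally revises it, then signs it."""
    if edits:
        note.draft = edits
        note.status = ReviewStatus.EDITED
    # Signing is the explicit step that makes the note eligible for the chart.
    note.status = ReviewStatus.SIGNED
    return note

def file_to_emr(note: VisitNote) -> None:
    """Only signed notes are written to the (hypothetical) EMR interface."""
    assert note.status is ReviewStatus.SIGNED, "Unsigned notes never reach the chart"
    print(f"Filed note for patient {note.patient_id}")

note = VisitNote(patient_id="12345", transcript="Patient reports improved sleep ...")
note.draft = draft_note_from_transcript(note.transcript)
note = physician_review(note, edits="Patient reports improved sleep; continue current plan.")
file_to_emr(note)
```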
In the past, physicians spent hours at home writing clinical notes. Such administrative work, Dr. Liu said, “is not an effective use of their skills. It leads to burnout and it doesn’t move the needle on effective care. We’re using AI to recapture some of that lost time and restore joy to the practice of medicine.”
Related health care innovation story: How AI is giving physicians more time for what matters most
Dr. Liu said patient consent is required before the technology can be activated. “We want to make sure that our patients know that this technology is being used,” Dr. Liu said. “We’re also looking at how it is used across specialties because how it performs in a primary care doctor’s office might be very different than how it performs in a psychiatrist’s office or an oncologist’s office. Because this technology is changing so fast, we’re setting up systems that allow us to monitor it continuously.”
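One way to picture the consent gating and per-specialty monitoring Dr. Liu describes is the sketch below. The consent registry, the edit-rate metric, and all function names are assumptions made for illustration; a production system would rely on a real consent workflow and validated quality measures.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical consent registry: capture runs only if the patient opted in.
consent_registry: dict[str, bool] = {"12345": True, "67890": False}

def ambient_capture_allowed(patient_id: str) -> bool:
    """The tool activates only when explicit patient consent is on record."""
    return consent_registry.get(patient_id, False)

# Hypothetical per-specialty monitoring: track how heavily physicians edit the
# AI draft before signing, as a rough proxy for draft quality in that setting.
edit_rates: dict[str, list[float]] = defaultdict(list)

def log_edit_rate(specialty: str, draft: str, signed: str) -> None:
    changed = sum(1 for a, b in zip(draft, signed) if a != b) + abs(len(draft) - len(signed))
    edit_rates[specialty].append(changed / max(len(signed), 1))

def specialty_report() -> dict[str, float]:
    """Continuous monitoring view: average edit rate per specialty."""
    return {s: round(mean(vals), 3) for s, vals in edit_rates.items()}

if ambient_capture_allowed("12345"):
    log_edit_rate("primary care", "Pt reports improved sleep", "Patient reports improved sleep")
    log_edit_rate("psychiatry", "Mood stable", "Mood stable; continue current dose")
print(specialty_report())
```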
Dr. Liu stressed that care teams should participate in every stage of technology deployment. For example, frontline clinicians helped develop and implement Advance Alert Monitor, an AI-driven predictive analytics tool used in all 21 Kaiser Permanente Northern California hospitals. This program identifies patients at risk of clinical deterioration, enabling early intervention and preventing over 500 deaths annually.
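For readers unfamiliar with predictive alerting, the toy sketch below shows the general shape of such a workflow. The vital-sign features, weights, and threshold are invented for illustration and are not the Advance Alert Monitor model, which is a validated system built on much richer clinical data.

```python
from dataclasses import dataclass

@dataclass
class Vitals:
    heart_rate: int
    resp_rate: int
    systolic_bp: int
    spo2: int

def deterioration_score(v: Vitals) -> float:
    """Toy risk score: weighted deviations from nominal ranges (illustrative only)."""
    score = 0.0
    score += max(0, v.heart_rate - 110) * 0.02
    score += max(0, v.resp_rate - 24) * 0.05
    score += max(0, 90 - v.systolic_bp) * 0.03
    score += max(0, 92 - v.spo2) * 0.06
    return score

ALERT_THRESHOLD = 0.5  # hypothetical cutoff; a real system is calibrated and validated

def check_patient(patient_id: str, vitals: Vitals) -> None:
    score = deterioration_score(vitals)
    if score >= ALERT_THRESHOLD:
        # In practice the alert routes to a clinical team that can intervene early.
        print(f"ALERT patient {patient_id}: risk score {score:.2f}, escalate for review")
    else:
        print(f"patient {patient_id}: risk score {score:.2f} within expected range")

check_patient("A1", Vitals(heart_rate=128, resp_rate=30, systolic_bp=82, spo2=88))
check_patient("B2", Vitals(heart_rate=76, resp_rate=16, systolic_bp=118, spo2=98))
```

The design point mirrors Dr. Liu's emphasis on workflow fit: a risk score only prevents deaths if it reaches a team positioned to act on it early.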
“It’s important for our workforce to understand precisely how a new technology works and where it’s going to fit into their workflow, and for them to guide us,” Dr. Liu said.
It’s equally important for organizations to train physicians and clinicians who use AI tools to retain a sense of agency in their decision-making.
“They have to know when they should not follow the recommendation or the summarization of whatever output an AI produces,” Dr. Liu said. “We’ve gone through a lot of training to understand what are the factors that are giving us that spidey sense that something is not quite right…especially when AI seems to be diverging from what is good practice.”
UPDATE: Watch the full panel session here.
UPDATE: Watch a short interview from the event with Dr. Liu on ambient AI.