A patient walks into an exam room, sits down, and starts describing symptoms. They don’t know a microphone is on. They don’t know an AI is transcribing every word. They don’t know their private medical disclosures are being routed to a third-party cloud server.
This week, a group of Californians filed a class-action lawsuit against Sutter Health and MemorialCare, alleging exactly that. The complaint, submitted Wednesday in federal court in San Francisco, accuses both health systems of deploying Abridge AI, an ambient clinical documentation tool, to record physician-patient conversations without informed consent. According to the filing, Abridge “collected and processed their private physician-patient conversations,” including medical histories, diagnoses, medications, and treatment discussions, all without clear notice to patients that an artificial intelligence platform was listening.
This is not an isolated case. In November 2025, a patient named Jose Saucedo filed a nearly identical class action against Sharp HealthCare in San Diego. That complaint made an especially damning allegation: Sharp’s EHR notes contained boilerplate language stating patients had been “advised” of and “consented” to the AI recording, when no such advisement ever occurred. In other words, the system auto-generated false consent documentation.
Why Florida Physicians Should Pay Attention Right Now
These are California lawsuits. But the legal exposure for Florida physicians is arguably worse.
Florida is one of thirteen states with an all-party consent recording law. Under F.S. 934.03, intercepting or recording a wire, oral, or electronic communication without the consent of every party is a third-degree felony, punishable by up to five years in prison and a $5,000 fine. That statute does not carve out exceptions for healthcare settings, and it does not distinguish between a human scribe listening in and an AI scribe doing the same thing.
Here is the problem: Abridge is now deployed across 250-plus health systems nationwide, including Kaiser Permanente, the Mayo Clinic, Duke Health, Johns Hopkins, and WVU Medicine. If your health system or practice has adopted or is considering ambient AI documentation, the question is not whether this technology works. The question is whether your patients actually know it’s there.
A multi-state health system that runs a single ambient scribe workflow could be fully compliant in a one-party consent state and simultaneously committing a felony in Florida. If the consent process is a boilerplate checkbox buried in an intake packet, or an auto-generated EHR notation that the patient never actually heard, you are exposed.
The Consent Problem Is a Systems Problem
Physicians did not create this mess. Health systems purchased Abridge and similar tools, integrated them into clinical workflows, and in some cases rolled them out with consent protocols that were, to put it charitably, thin.
The Sharp HealthCare complaint paints a picture that should worry every practicing physician: an AI system that silently generates documentation claiming consent was obtained when it was not. That is not a technology failure. That is a documentation integrity problem. And when regulators or plaintiffs come knocking, the physician whose name is on that note will be the first one asked to explain it.
Ambient AI scribes can be genuinely useful. They reduce documentation burden, keep physicians present during encounters, and can improve note accuracy. None of that matters if the consent infrastructure is broken. A tool that saves you twenty minutes per visit is worthless if it generates a felony charge or a seven-figure settlement.
What Florida Physicians Should Do Now
- Find out whether your practice uses ambient AI documentation. Ask your IT department or practice administrator directly. If Abridge, Nuance DAX, DeepScribe, or any similar tool is active, confirm exactly what it records and when.
- Audit your consent process against F.S. 934.03. Florida law requires all-party consent. A buried clause in an intake form is not sufficient. Each patient, at each visit where recording occurs, needs to clearly understand that an AI system will be listening, what it will capture, and where that data goes. Document that consent explicitly, not through auto-generated EHR language.
- Demand transparency from your health system. If your employer deployed this technology, they owe you a clear answer about the legal review that was conducted before rollout. Ask for the compliance memo. If one does not exist, that tells you everything you need to know.
- Raise this at your next medical staff meeting. This is not a solo problem. It requires an organizational response. Bring the Sutter Health and Sharp HealthCare complaints to your chief medical officer and your risk management team. The time to address this is before a plaintiff’s attorney files in a Florida courtroom.
- Contact the Florida Medical Association. The FMA should be developing guidance on AI scribe consent protocols specific to Florida’s all-party consent statute. If that guidance does not yet exist, your call is the one that gets it started.
The Bigger Picture
Physicians should be leading the conversation about AI in clinical practice — not learning about its deployment from class-action filings. The technology itself is not the enemy. The enemy is a rollout model that treats patient consent as a checkbox and physician liability as an afterthought.
Florida’s recording statute exists to protect the sanctity of private conversations. There is no conversation more private than the one between a patient and their doctor. If we are going to bring AI into that room, we owe patients a clear explanation and a genuine choice. Anything less is a betrayal of the trust that makes medicine possible.
Frequently Asked Questions
What is an ambient AI scribe, and how does it work in a doctor’s office?
An ambient AI scribe is software that uses a microphone (typically on a physician’s device) to passively record clinical encounters in real time. The audio is transmitted to a cloud-based AI system that generates clinical notes, including history of present illness, assessments, and plans. The physician reviews and signs the note. Products like Abridge, Nuance DAX, and DeepScribe operate this way.
Is it legal to use AI scribes in Florida without patient consent?
No. Florida is an all-party consent state under F.S. 934.03. Recording an oral communication, including a clinical encounter, without the consent of every party to the conversation is a third-degree felony, carrying penalties of up to five years in prison and a $5,000 fine. Physicians and health systems must obtain explicit, informed consent from patients before activating ambient recording.
What should Florida physicians do if their health system already uses Abridge or a similar tool?
Start by confirming the consent protocol. Determine whether patients receive clear, verbal disclosure at each visit, not just a buried clause in intake paperwork. If the consent process relies on auto-generated EHR language that patients never see or hear, escalate to your compliance department and risk management team immediately.
Could a Florida physician face personal liability for AI scribe recordings?
Yes. While health systems bear institutional liability, the physician whose encounter was recorded and whose name appears on the resulting note could face regulatory scrutiny, malpractice claims, or criminal exposure under Florida’s wiretapping statute. Physicians should not assume that employer-provided technology automatically comes with adequate legal protection.
How does this California lawsuit affect physicians outside California?
The Sutter Health and MemorialCare lawsuit signals that plaintiffs’ attorneys are actively targeting health systems over AI scribe consent failures. Florida’s stricter recording law makes the legal exposure even greater here. Physicians in any all-party consent state should treat this lawsuit as an early warning and review their own practices before similar litigation arrives locally.
Get stories like this delivered to your inbox every two weeks — subscribe to Florida Doctor Magazine (free).