For Patients & Providers

AI in Healthcare: Know What's Happening. Own Your Records.

AI is already in your doctor's office. Here's how to make sure it's working for you, not against you.

Two-thirds of physicians now use AI tools at work. AI scribes are recording and transcribing your conversations with your doctor. AI is drafting clinical notes, flagging potential diagnoses, and summarizing your medical history. Some of this is genuinely helpful. Some of it introduces risks you need to understand.

This page gives you the strategies to stay in charge of your healthcare data — as a patient and as a provider.

1. AI Scribes: What's Happening in the Exam Room (High Priority)

An AI scribe is software (usually running on a smartphone or tablet) that listens to your conversation with your doctor, transcribes it, and automatically generates clinical notes for your medical record. Tools like Abridge, Nuance DAX, and DeepScribe are being deployed across major health systems. The idea is that your doctor can look you in the eye instead of typing on a computer. That's the upside. The downside: an AI is now processing one of the most intimate conversations you'll ever have.

Can an AI scribe get things wrong?

Yes. AI systems can hallucinate: generating false, incorrect, or misleading information presented as fact. If an AI scribe mishears something, misinterprets context, or fills in a gap with incorrect information, that error goes into your permanent medical record. If you don't catch it, it could affect future diagnoses, prescriptions, and treatment decisions. This isn't theoretical. Health systems are actively working to eliminate these errors, which means they're actively occurring.

Can I say no?

Yes. You can decline the use of AI scribes without it affecting your care. Several states now require explicit consent before recording encounters. Even where consent isn't legally required, best practice is for your provider to tell you AI is being used and give you the choice to opt out. If they don't mention it, ask: "Are you using any AI tools to record or document this visit?" You have the right to know.

What should I do before my visit?

Ask your provider's office before you arrive: "Do you use AI scribes or any AI-assisted documentation tools during visits?" If yes, ask how you can opt out. When you arrive, confirm at check-in. After the visit, review your notes in your patient portal carefully. If anything is inaccurate, request a correction immediately. Don't wait for the next visit. Errors in your chart can compound over time and lead to incorrect diagnoses, wrong prescriptions, or delayed care.

2. Digital Check-In and Terms of Service (High Priority)

What should I do when I'm asked to sign digital check-in terms?

This is becoming standard. Before you sign, copy the full terms into an AI tool (ChatGPT, Claude, Gemini) and ask: "Identify any clauses that grant access to my data beyond what's needed for my direct medical care. Flag anything related to AI processing, third-party data sharing, or research use of my information." The AI can surface overreach you might otherwise miss. Then you have two options: opt out of specific clauses (see below), or have a conversation with your provider about what you're consenting to.

Can I actually opt out of specific clauses?

You can try. Healthcare terms of service are generally adhesion contracts ("take it or leave it"), but writing a formal letter to your provider stating that you consent to care with specific exceptions to their AI and data-sharing policies creates a documented record of your intent. This matters if there's ever a dispute. It doesn't guarantee legal enforceability for every clause, but it demonstrates informed, deliberate consent rather than passive acceptance. It also signals to the provider that you're paying attention, which changes behavior. Bring the letter to your appointment. Ask that it be noted in your chart.

Here's a framework you can adapt:

"Dear [Provider/Practice Name], I consent to receive medical care and to the standard documentation of my visits. However, I respectfully decline the following: [1] The use of AI-powered recording or transcription tools during my appointments without my explicit verbal consent at each visit. [2] The use of my health data for AI model training, research, or any purpose beyond my direct care. [3] The sharing of my health data with third parties not directly involved in my treatment. I request that this letter be noted in my medical record. I'm happy to discuss any of these items with my care team. Sincerely, [Your Name, Date]"

Paste this into an AI tool and customize it for your specific situation. Ask the AI to review the provider's actual terms and tailor the exceptions to what's actually in them.
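If you'd rather fill in the template yourself, a short script can do the substitution mechanically. This is a minimal sketch; the practice name, patient name, and field names are illustrative placeholders, not anything from a real provider.

```python
from string import Template

# Consent-exception letter using $-style placeholders.
# All field values below are hypothetical examples.
LETTER = Template(
    "Dear $practice,\n"
    "I consent to receive medical care and to the standard documentation of my visits. "
    "However, I respectfully decline the following: "
    "(1) the use of AI-powered recording or transcription tools during my appointments "
    "without my explicit verbal consent at each visit; "
    "(2) the use of my health data for AI model training, research, or any purpose "
    "beyond my direct care; "
    "(3) the sharing of my health data with third parties not directly involved in my treatment.\n"
    "I request that this letter be noted in my medical record. "
    "I'm happy to discuss any of these items with my care team.\n"
    "Sincerely, $name, $date"
)

letter = LETTER.substitute(
    practice="Example Family Medicine",  # hypothetical practice name
    name="Jane Q. Patient",              # your name
    date="2025-01-15",                   # visit or signing date
)
print(letter)
```

Print the result, read it over, and adjust the three exceptions to match what's actually in your provider's terms.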

Can I record the visit myself?

In most cases, yes, but it depends on your state's recording consent laws. In "one-party consent" states (the majority), you can record any conversation you're part of without the other person's knowledge. In "two-party consent" states (California, Florida, Illinois, and others), all parties must agree to be recorded.

Practically: tell your provider you'd like to record the conversation for your own records. Most will agree. Some hospitals have policies about this, so ask at check-in. You can also use AI tools after the appointment to transcribe your own recording and compare it against the official notes in your patient portal.
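One way to do that comparison systematically is a line-by-line diff of your own transcript against the portal note. This is a toy sketch with made-up sentences; the point is that any line marked `-` or `+` is a discrepancy worth checking.

```python
import difflib

# Your own transcript of the visit vs. the official note (toy examples).
my_transcript = [
    "Patient reports mild headaches twice a week.",
    "No known drug allergies.",
    "Continue current dose of lisinopril 10 mg daily.",
]
official_note = [
    "Patient reports severe headaches twice a week.",
    "No known drug allergies.",
    "Continue current dose of lisinopril 20 mg daily.",
]

# unified_diff marks changed lines with - (your version) and + (the note's version).
diff = list(difflib.unified_diff(
    my_transcript, official_note,
    fromfile="my_transcript", tofile="official_note",
    lineterm="",
))
for line in diff:
    print(line)
```

In this toy example the diff flags "mild" vs. "severe" and "10 mg" vs. "20 mg": exactly the kinds of discrepancies (severity, dosage) that matter most to catch before they propagate through your chart.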

The bigger idea: Healthcare conversations could be moving toward a shared model where both provider and patient receive a verified transcript. We're not there yet. In the meantime, take ownership of your own record.

3. Review Your Medical Records Regularly (Medium Priority)

Whether notes were written by a human or an AI, errors happen. But AI-generated notes introduce a new category of error: hallucination. The AI might confidently document something that was never said, misattribute a symptom, or record an incorrect medication. If you don't catch it, the next provider who reads your chart will treat it as fact. Review your After Visit Summary in your patient portal after every appointment. If anything is wrong, message your provider immediately and request a correction.

What if my provider won't fix an error?

Under HIPAA, you have the right to request an amendment to your medical records. Your provider can deny the request if they believe the record is accurate, but they must document your disagreement. If you believe AI-generated notes contain errors, be specific: "On [date], the note states [X]. The actual conversation was [Y]. I request this be corrected." Document everything in writing, not just verbally.

4. Understand Where Your Health Data Goes (Medium Priority)

Is my health data used to train AI models?

It depends on your provider and their vendors. Some AI scribe companies state that recordings are used only for drafting notes, not for model training, and that data is deleted within 60 days. Others may use de-identified data for research and improvement. The problem: "de-identified" health data can often be re-identified, especially when combined with other datasets. Ask your provider directly: "Is any of my health data used to train AI systems?" If they don't know, that's a problem worth raising.

What about wearables and health apps?

Your Fitbit, Apple Watch, Oura Ring, or health tracking app may share data with third parties, including insurers, employers, and advertisers. HIPAA only protects data held by healthcare providers and their business associates; your health app is likely not covered. Read the terms of service (or have an AI read them for you). Check whether your data is being shared, sold, or used for advertising. Many apps that seem medical are actually consumer products with no HIPAA obligations.

5. For Healthcare Providers (High Priority)

Best practice: tell every patient when AI is being used in their care. Several states require explicit consent before recording encounters with AI scribes. Even where it's not legally mandated, transparency builds trust and protects you from liability. Add AI scribe disclosure to your consent forms. Train staff to explain it in plain language. Respect every opt-out without friction.

Review every AI-generated note before it's finalized. Don't treat AI output as a finished product. Cross-check against your own recollection of the encounter. Pay special attention to medications, dosages, allergies, and diagnoses — the categories where errors carry the most risk. Implement a verification workflow: AI drafts, provider reviews, patient confirms through their portal. Three layers of review for the most sensitive documentation in your practice.

Every AI vendor that processes patient data must sign a Business Associate Agreement (BAA) under HIPAA. Beyond that, ask: Is patient data used to train AI models? Where is data stored and for how long? What happens to recordings after notes are generated? Is the tool compliant with state-specific recording consent laws? Does the vendor carry their own cyber liability insurance? If your vendor can't answer these questions clearly, find one who can.

Patient Action Checklist
  • Ask before every visit: "Are you using AI to record or document this appointment?" (High)
  • Review your After Visit Summary or patient portal notes after every appointment. (High)
  • Request corrections, in writing, for any inaccurate AI-generated notes. (High)
  • Use AI to review digital check-in terms before signing. (High)
  • Prepare a consent exception letter for your provider. (Medium)
  • Ask your provider: "Is my health data used to train AI systems?" (Medium)
  • Review the terms of service for any health apps or wearables you use. (Medium)
  • Check your health insurance portal for AI-related disclosures. (Recommended)
Provider Action Checklist
  • Disclose AI use to every patient at every visit. (High)
  • Add AI scribe consent language to intake forms. (High)
  • Review every AI-generated note before finalization. (High)
  • Verify your AI vendor has a signed BAA and meets HIPAA requirements. (High)
  • Train staff to explain AI documentation in plain language. (Medium)
  • Establish a patient opt-out workflow that doesn't create friction. (Medium)
  • Ask your vendor whether patient data is used for model training. (Medium)
  • Encourage patients to review notes in their portal and report errors. (Recommended)

Next Steps

Your health data is some of the most sensitive information that exists. Protect it the way it deserves.