submitted 1 day ago by [email protected] to c/[email protected]

Nearly one in four GPs nationally are believed to be using AI digital scribes for note-taking, according to the Australian Medical Association (AMA). Other practitioners, like psychologists and podiatrists, are also using the technology.

top 5 comments
[-] [email protected] 17 points 1 day ago

Are these doctors paying for their own locally hosted instances? If not, there's a pretty big risk of people's medical records being ingested by the LLMs and spat back out randomly, with little or no concern for privacy.
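
For what it's worth, a locally hosted setup could be as simple as running an open-source speech-to-text model like Whisper on the clinic's own hardware, so the recording never leaves the building. A rough sketch (the model size and file name are just placeholders):

```python
# Minimal sketch: transcribing a consult recording entirely on local hardware
# with the open-source Whisper model (pip install openai-whisper).
# "consult_recording.wav" is a hypothetical file name for illustration.
import whisper

model = whisper.load_model("base")           # weights download once, then run locally
result = model.transcribe("consult_recording.wav")
print(result["text"])                        # raw transcript, never sent to a cloud API
```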

[-] [email protected] 6 points 1 day ago* (last edited 17 hours ago)

I was offered a permission form at my last trip to the GP: "Do you consent to AI-based note-taking assistants?"

I didn't. In addition to AI being unreliable, it requires that we surrender our data sovereignty, since the AI compute is all hosted in America.

[-] [email protected] 9 points 1 day ago

Literally just came from the doctor's office (Sydney) and saw a poster about AI transcription usage on the wall.

Pros I can see:

  • More time back for doctors
  • This may in turn lead to better patient outcomes.

Cons I can see:

  • Higher rates of adverse medical outcomes due to inaccuracies & hallucinations in patient notes
  • Higher chance of personal data breaches due to third parties holding and handling private materials.

There are so many issues here. The fact that GPs find themselves thinking this tradeoff is worth it points to a cluster of problems. The fact that big companies are convincing them it's acceptable is another layer. The history of big tech selling such data off to special interest groups (anti-abortion, real estate, etc.) is a third.

LLM companies are desperate for people to buy their products because nothing is profitable in the AI industry (other than selling the shovels like Nvidia does).

[-] [email protected] 9 points 1 day ago

FYI, medical scribe is a job that a ton of people do; not all doctors do their own transcription or have their nursing staff do it. It's often a role that people looking to get into medicine take on for the additional learning.

All the AI is going to do is misunderstand and hallucinate, like it does with everything else. At this point it's clear that the error rates of LLM-based AI are above standard human error.

There is no advantage here other than removing yet another person's job, for a worse result.

[-] [email protected] 1 points 1 day ago

My veterinarian's office started using this recently. I'm fine with it for that purpose, but definitely not for my own health notes. My PCP has her MA take notes.
