AI program provided hallucinated notes to Ontario doctors, report says


Auditor general's findings show program also generated incorrect, missing or incomplete info about patients

Published May 12, 2026  •  2 minute read

A doctor wears a stethoscope as he sees a patient. Photo by Joe Raedle / Files / Getty Images

An auditor general report into the use of an artificial intelligence program by Ontario doctors in recent years found it hallucinated and made many mistakes in the notes it provided.


The report released Tuesday by auditor general Shelley Spence said AI Scribe, a program that purportedly relieves pressure on physicians to take notes while speaking with a patient, led to many mistakes including incorrect and incomplete information as well as “AI hallucinations.”


The report said “hallucinations” occur when AI systems generate information that is made up, fabricated or not based on any data provided to it.

Since 2023, physicians in the province have been able to use the technology only if patients consent.

Some of the Ontario auditor general’s findings on AI Scribe, a program that purportedly relieves pressure on physicians to take notes. Photo by Toronto Sun graphic

Program creates SOAP note

Once authorized, the program listens to the conversation between a patient and doctor and compiles the information into a SOAP (subjective, objective, assessment and treatment plan) note.

However, Spence noted inaccuracies in the AI-generated notes: incorrect information from 12 of 20 vendors, hallucinations from nine of 20 and incomplete information from six of 20.

In one example of an inaccuracy, a note “captured a different drug than what was prescribed by the doctor,” the report said.


Even more alarming were the AI hallucinations. The report found AI Scribe systems “fabricated information and made suggestions to patients’ treatment plans, such as referring the patient to therapy or ordering blood tests, even though these steps were not mentioned in the simulated recordings.”

Other examples included statements that there were “no masses found” or that a patient had issues with anxiety, although anxiety was never discussed.


Concerns about privacy risks

Missing or incomplete information occurred in 85% of the AI-generated notes, including omitted key details about patients’ mental-health issues.

Another concern was the risk of loss of privacy from the potential exposure of patient information when the AI tool is used.

“When Ontarians see their doctor, they need to share intimate information about their health, their bodies and their personal lives to receive proper care,” the report said.

“Ontarians expect this extremely personal information to be kept private and confidential. Using AI to assist in providing health care must not come at the cost of compromising privacy.”

Read More

  1. Ontario pharmacists to administer more vaccines, treat more conditions

  2. Ontario government all ears on expanding hearing aid access
