The Perils of Relying on AI: A Lesson from a Local

31st Mar 2026

In the age of AI, tools like ChatGPT can feel like an instant encyclopedia at our fingertips. They provide answers quickly, summarize information, and even craft stories or quizzes. But a recent exchange highlights that AI is only as accurate as the data it draws from, and mistakes can propagate easily.
Getting on the bandwagon a while ago, I decided to see whether AI might be helpful in creating a quiz about our local history. One of the questions it produced intrigued me.
“In 1921, Guyra made national headlines when a young girl went missing. What was her name?”
The answer was Shirley Collins. Who is Shirley Collins, you might ask. So did I.

Here’s the story:
The Disappearance of Shirley Collins (1921)
• Who she was: Shirley Collins was a 4-year-old girl living with her family on a property just outside of Guyra.
• The disappearance: On January 27, 1921, Shirley went missing from her home. The country around Guyra is rugged and heavily timbered in parts, and fears quickly grew that she had become lost in the bush.
• The search: A huge search party was organised. Hundreds of locals joined in, along with police, trackers, and volunteers from neighbouring towns. Newspapers across Australia followed the story closely.
• The discovery: After being missing for four nights, Shirley was found alive by a local farmer and his dog. She was weak, scratched, and hungry but otherwise unharmed. Her survival at such a young age in the bush for that long was considered miraculous.
• Aftermath: Shirley became a kind of national symbol of endurance and hope. The story was retold in papers, school readers, and even song. The event was often referred to simply as “The Guyra Mystery.” Shirley later grew up, married, and lived much of her life in the local region.
There was just one problem. To the best of my knowledge, it wasn't true. As a local, I pointed out that I had never heard of such an event. Upon investigation, it became clear that two separate stories had somehow merged into one to create this false narrative.
It appears that the AI had somehow blended elements of the Guyra Ghost mystery and the story of "Little Boy Lost" and presented them as fact - a perfect example of how a model can generate plausible-sounding but highly inaccurate information.
This Guyra example underscores a vital point: AI is a tool, not a truth machine. It can mislead as easily as it can inform, particularly in nuanced or localised topics. Users must remain sceptical, verify carefully, and remember that human knowledge and local expertise are irreplaceable.
Just in case: if you have ever heard of Shirley Collins, drop us a line. We'd love to hear from you.