NewsBite

Using ChatGPT instead of the GP? Here’s what you need to know

It's the new "Dr Google". But there are things parents need to be aware of before uploading any details for advice. 

I feel like I’m showing my age here, but back in the day I had patients saying “my grandmother said it was such-and-such”. Then it morphed into “Dr Google”. Now, it’s “Dr AI”.

You know the drill: a weird pain, an odd rash, or a sudden “what on earth is this?” moment.

Instead of booking an appointment with their friendly GP, people are now opening up their phone, jumping into an AI tool, and asking the modern oracle: “What’s wrong with me?”

And AI will happily give you an answer - or ten.

But before you start relying on it instead of your local GP, let’s have a proper chat about what’s really going on here.


Image: Supplied

The big shift

Only a few years ago, people typed their symptoms into search engines. You’d get a list of websites ranging from medical journals to random blog posts, with the challenge of figuring out which were legit. Then, you'd need to sift through various conditions and see which one was the "best fit".

Now, AI skips the messy search results and gives you a neat, confident, personalised-sounding response - in mere seconds.

It feels like a conversation with someone who knows. And because it sounds convincing, many people assume it is correct. (Kinda reminds me of the overly vocal celebrity anti-vaxxer...)

Why this is tempting

I get it. Booking a GP appointment takes time and effort. AI is instant, available at 2am, and doesn’t give you the side-eye when you describe something awkward. It can help you organise your thoughts, explain possible causes, and even suggest what questions to ask at the clinic.

In some cases, that’s actually helpful, especially if it encourages you to see a doctor sooner, or gives you the confidence to bring up something you’ve been too embarrassed to talk about.

The problem with medical advice and AI

AI doesn’t actually understand you. It’s not a human. It doesn’t know your medical history or family background. It’s not sitting in the doctor’s room observing subtle clues in your tone and body language. It can’t tell if there’s a hidden agenda, nor can it see the big festering sore on your leg that’s actually the bigger issue.

AI is simply playing the odds, predicting text based on patterns in data - and that data can be out-of-date, biased, or simply wrong.

This means AI can:

  • Miss rare but serious conditions
  • Over-diagnose scary things that aren’t actually there
  • Give advice that isn’t safe for your specific situation
  • Suggest treatments that don’t work in Australia, aren’t available here, or are outright dangerous

The scariest part? AI says everything with the same confident tone, whether it’s right, wrong, or completely made up.


The dangers of skipping the GP

Let me be clear: I’ve seen AI-assisted self-diagnosis cause real harm.

Delayed treatment. A patient convinced they “just had a virus” when AI agreed, but it was actually appendicitis.

Unnecessary anxiety. Another patient thought a harmless skin spot was melanoma because AI told them it could be, and they spent weeks in a panic.

Unsafe medication advice. AI suggested a drug dosage based on US guidelines, which would’ve been wrong - and risky - in Australia.

Doctors don’t just hand out diagnoses. We take a history, examine you, order the right tests, weigh up multiple possibilities, and importantly - rule out the dangerous ones first. AI can’t do that.

So, is there any benefit?

Used wisely, yes. AI can help you describe your symptoms more clearly before seeing your GP. This is particularly useful for people who struggle to articulate themselves, to understand their body, or to describe their symptoms, and for those who don’t speak English as their first language.

It can also give you general background information about a condition you’ve already been diagnosed with, and suggest credible questions to ask your doctor.

The key is using AI as a starting point, not the finish line.

Image: iStock


How to use AI safely for health advice

If you’re going to ask AI about your health, here’s how to keep yourself safe:

  1. Never rely on it alone. If your symptoms are new, worsening, or worrying, see a real doctor.
  2. Check the source. Ask AI to tell you where it got the information. If it can’t give a credible, reputable source, be cautious.
  3. Be location-specific. Medical guidelines differ between countries. Make sure the advice applies in Australia.
  4. Don’t self-medicate based on AI advice. Always double-check with your GP or pharmacist.
  5. Use it for learning, not replacing. Let it help you understand, but don’t let it diagnose you in isolation.

The bottom line

AI is here to stay, and it’s going to become even more integrated into our daily lives. It can be an incredible tool for education and empowerment. But it’s not your doctor, and it never will be.

When it comes to your health, you still need a trained human who can look you in the eye, examine you, order the right tests, and - yes - sometimes tell you something you weren’t expecting.

So by all means, chat with “Dr AI”... just make sure you also see the real doctor. Your health and your peace of mind are worth it.

Originally published as Using ChatGPT instead of the GP? Here’s what you need to know

Original URL: https://www.themercury.com.au/lifestyle/using-chatgpt-instead-of-the-gp-heres-what-you-need-to-know/news-story/d921970df05bf69b5bbe139fb949d0e0