ChatGPT Health promises to personalise health information. It comes with many risks
- Written by Julie Ayre, Post Doctoral Research Fellow, Sydney Health Literacy Lab, University of Sydney

Many of us already use generative artificial intelligence (AI) tools such as ChatGPT for health advice. They give quick, confident and personalised answers, and the experience can feel more private than speaking to a human.
Now, several AI companies have unveiled[1] dedicated “health and wellness[2]” tools. The most prominent is ChatGPT Health[3], launched by OpenAI earlier this month.
ChatGPT Health promises to generate more personalised answers by allowing users to link medical records and wellness apps, upload diagnostic imaging, and have the tool interpret test results.
But how does it really work? And is it safe?
Most of what we know about this new tool comes from the company that launched it, and questions remain about how ChatGPT Health would work in Australia. Currently, users in Australia can sign up for a waitlist to request access.
Let’s take a look.
AI health advice is booming
Data from 2024 shows[4] 46% of Australians had recently used an AI tool.
Health queries are popular. According to OpenAI, one in four[5] regular ChatGPT users worldwide submit a health-related prompt each week.
Our 2024 study[6] estimated almost one in ten Australians had asked ChatGPT a health query in the previous six months.
This was more common for groups that face challenges[7] finding accessible health information, including:
- people born in a non-English speaking country
- those who spoke another language at home
- people with limited health literacy.
Among those who hadn’t recently used ChatGPT for health, 39% were considering using it soon.
How accurate is the advice?
Independent research consistently shows[8] generative AI tools do sometimes give unsafe health advice, even when[9] they have access to a medical record.
There are several high-profile examples of AI tools giving unsafe health advice, including when ChatGPT allegedly encouraged suicidal thoughts[10].
Recently, Google removed several AI Overviews on health topics – summaries which appear at the top of search results – after a Guardian investigation found the advice about blood test results[11] was dangerous and misleading.
This was just one health prompt the investigation examined. There could be much more advice the AI is getting wrong that we don’t know about yet.
So, what’s new about ChatGPT Health?
The AI tool has several new features[12] aimed at personalising its answers.
According to OpenAI, users will be able to connect their ChatGPT Health account with medical records and smartphone apps such as MyFitnessPal. This would allow the tool to use personal data about diagnoses, blood tests, and monitoring, as well as relevant context from the user’s general ChatGPT conversations.
OpenAI emphasises[13] information doesn’t flow the other way: conversations in ChatGPT Health are kept separate from general ChatGPT, with stronger security and privacy. The company also says ChatGPT Health data won’t be used to train foundation models.
OpenAI says it has worked with more than 260 clinicians in 60 countries[14] (including Australia), to give feedback on and improve the quality of ChatGPT Health outputs.
In theory, all of this means ChatGPT Health could give more personalised answers compared to general ChatGPT, with greater privacy.
Read more: Can you say no to your doctor using an AI scribe?[15]
But are there still risks?
Yes. OpenAI openly states[16] ChatGPT Health is not designed to replace medical care and is not intended for diagnosis or treatment.
It can still make mistakes. Even if ChatGPT Health has access to your health data, there is very little information about how accurate and safe the tool is, and how well it has summarised the sources it has used.
The tool has not been independently tested. It’s also unclear whether ChatGPT Health would be considered a medical device[17] and regulated as one in Australia.
The tool’s responses may not reflect Australian clinical guidelines or our health systems and services, and may not meet the needs of our priority populations. These include First Nations people, those from culturally and linguistically diverse backgrounds, people with disability and chronic conditions, and older adults.
We don’t know yet if ChatGPT Health will meet data privacy and security[18] standards we typically expect for medical records in Australia.
Currently, many Australians’ medical records are incomplete due to patchy uptake of My Health Record. This means that even if you upload your medical record, the AI may not have the full picture of your medical history.
For now, OpenAI says[19] medical record and some app integrations are only available in the United States.
So, what’s the best way to use ChatGPT for health questions?
In our research[20], we have worked with community members to create short educational materials[21] that help people think about the risks that come with relying on AI for health advice, and to consider other options.
Higher risk
Health questions that would usually require clinical expertise to answer carry more risk of serious consequences. These could include:
- finding out what symptoms mean
- asking for advice about treatment
- interpreting test results.
AI responses can often seem sensible – and increasingly personalised – but that doesn’t necessarily mean they are correct or safe. So, for these higher-risk questions, the best option is always to speak with a health professional.
Lower risk
Other health questions are less risky. These tend to be more general, such as:
- learning about a health condition or treatment option
- understanding medical terms
- brainstorming what questions to ask during a medical appointment.
Ideally, AI is just one of the information sources you use.
Where else can I get free advice?
In Australia we have a free 24/7 national phone service, where anyone can speak with a registered nurse about their symptoms: 1800 MEDICARE[22] (1800 633 422).
Symptom Checker[23], operated by healthdirect, is another publicly funded, evidence-based tool that will help you understand your next steps and connect you with local services.
AI tools are here to stay
For now, we need clear, reliable, independent, and publicly available information[24] about how well the current tools work and the limits of what they can do. This information must be kept up-to-date as the tools evolve.
Purpose-built AI health tools could transform how people gain knowledge, skills and confidence to manage their health. But these need to be designed with communities and clinicians, and prioritise accuracy, equity and transparency.
It is also essential to equip our diverse communities with the knowledge and skills to navigate this new technology safely.
References
- ^ have unveiled (www.anthropic.com)
- ^ health and wellness (openai.com)
- ^ ChatGPT Health (openai.com)
- ^ shows (digitalinclusionindex.org.au)
- ^ one in four (cdn.openai.com)
- ^ study (doi.org)
- ^ face challenges (www.health.gov.au)
- ^ shows (doi.org)
- ^ even when (doi.org)
- ^ encouraged suicidal thoughts (www.bbc.com)
- ^ advice about blood test results (www.theguardian.com)
- ^ several new features (openai.com)
- ^ emphasises (openai.com)
- ^ more than 260 clinicians in 60 countries (openai.com)
- ^ Can you say no to your doctor using an AI scribe? (theconversation.com)
- ^ openly states (openai.com)
- ^ medical device (www.tga.gov.au)
- ^ data privacy and security (www.oaic.gov.au)
- ^ says (openai.com)
- ^ our research (doi.org)
- ^ short educational materials (sydneyhealthliteracylab.org.au)
- ^ 1800 MEDICARE (www.1800medicare.gov.au)
- ^ Symptom Checker (www.healthdirect.gov.au)
- ^ information (www.oaic.gov.au)
