Governments ‘flying blind’ on Closing the Gap due to outdated data, new report warns
A gap in the data is a problem for the bipartisan agreement on Indigenous disadvantage, according to a new report on the risks and opportunities of using AI to improve the health and wellbeing of Aboriginal and Torres Strait Islander Australians.
Governments are “flying blind” in some of their attempts to lift the health, education and employment status of Aboriginal and Torres Strait Islander Australians because too much of the information that should be guiding decisions is old or unknown, according to analysis published by the philanthropic Paul Ramsay Foundation.
Wiradjuri doctor Kyle Turner’s new report identifies the risks and benefits of ramping up the use of artificial intelligence to meet targets in the national agreement on Closing the Gap, the key bipartisan policy for reducing disparity between Indigenous and non-Indigenous Australians.
Dr Turner’s report – The Future Impact of Artificial Intelligence on First Nations Communities – finds potential for better results and savings if governments and Indigenous communities jointly develop AI tools for tasks such as health screening, early identification of patterns of abuse, and support for victims of domestic violence.
However, the report, completed as part of a fellowship with the Paul Ramsay Foundation, also finds that out-of-date or nonexistent data is a problem for the national agreement on Closing the Gap.
The Productivity Commission published its most recent annual Closing the Gap update to an online dashboard in July, showing no new information for three of the 17 targets. For example, the last known rates of domestic violence in Indigenous families were published in the 2018-19 financial year, making it difficult to know the effects of any policies and programs introduced or repealed since the Closing the Gap agreement began five years ago.
“At the moment, too much of the Closing the Gap picture is still missing,” Dr Turner told The Australian on Sunday.
“When key data doesn’t appear on the dashboard, or arrives years late, governments are essentially flying blind.
“You can’t improve what you don’t measure.”
It is not clear whether data has ever existed for other measures in the Closing the Gap agreement; for example, how many Indigenous households have essential services such as electricity and running water, part of the housing target.
Other figures guiding the Closing the Gap agreement are a few years old or come with a warning that they are based on limited information, or both. For example, the 2025 Closing the Gap update shows the proportion of Indigenous babies born a healthy weight has improved since 2017 but that is based on the most recent figure – 89.2 per cent – from 2022.
Dr Turner has a PhD in epidemiology and created a virtual screening tool for gums and teeth that has been used by thousands of people who do not live near a dentist. His knowledge of AI and the burden of chronic disease has made him hopeful about AI’s potential to help Australia’s most disadvantaged groups, particularly Aboriginal and Torres Strait Islander Australians, who have “historically been the last to benefit from advancements in tech”.
“I am an optimist about the role AI can play in health, especially for mob, but only if it’s done the right way,” Dr Turner said.
“In the report, I argue that AI can improve access to care, support earlier diagnosis and relieve some of the pressure on overstretched services – particularly through telehealth and predictive tools – but only when it’s built on strong infrastructure and co-designed with First Nations communities.
“Without that, the same tools can very easily deepen existing health inequities rather than close them.”
Dr Turner said child protection systems overseas had used predictive models such as the Allegheny Family Screening Tool to assess the risk of future harm when a report is made about a child.
In Australia, he said, there had been work on predictive and simplified risk models for family violence and child maltreatment.
Dr Turner said these tools could help services with limited capacity, but they were highly controversial because they often reproduced existing biases in welfare and justice data.
“The lesson for First Nations communities is that any AI used for ‘early warning’ must be co-designed with families and services, and tested for racial bias from the start – not bolted on afterwards,” Dr Turner said.
