Ullrich Ecker from the University of Western Australia is Australia’s top researcher in cognitive science.
Ullrich Ecker, a misinformation researcher, had a tragic early encounter with what happens when a loved one clings to misguided beliefs.
His father had a scientific mindset, but Ecker’s mother was raised by parents who believed herbal teas could cure nearly all ailments.
“My mum died of cancer when I was quite young,” says Ecker, a professor of cognitive psychology at the University of Western Australia who is named in The Australian’s 2026 Research magazine as the top researcher in the field of cognitive science.
“She abandoned traditional medicine at some point, and died not long after relying solely on alternative therapies.
“Even as a child, I felt it wasn’t right for someone to recommend herbal tea for a life-threatening illness. So I’ve always been interested in why people sometimes struggle to let go of ideas that have been disproven.”
This question has become the central focus of his research, which investigates why misinformation continues to influence people’s thinking even after it has become clear that the information is false or misleading.
His early work included a PhD on the cognitive neuroscience of memory at Saarland University in Germany.
On moving to Australia in 2008, he began researching the social and cognitive mechanisms that lead people to believe false ideas, and how best to counteract their influence.
Misinformation comes in many forms, Ecker says. It can range from something as simple as incorrectly recalling an event to entrenched false beliefs about public health, such as the claim that vaccines cause autism.
One reason people hang on to misinformation is that they prefer to believe things that align with their world view or bolster their social identity, Ecker says.
It’s also difficult for people to change their beliefs because of how memory operates. “For us to remember things and plan for the future, we want stable representations, but if the world changes, we also need to have some flexibility, and that’s a difficult tension for memory to resolve,” Ecker says.
Countering misinformation requires repeating accurate information over and over, because repetition is one of the strongest drivers of belief.
“Even if it comes from just a single source, repetition has an influence because the more familiar a piece of information becomes, the more we tend to believe it,” Ecker says.
While misinformation has existed since humans began communicating, recent developments have exacerbated the issue.
“One of these is the advent of social media, which serves as an accelerant,” he says.
“Fifty years ago, if you believed the Earth was flat, you would have been the village idiot. Today, you can find many others online who think the same and now you’re the enlightened one.”
The other change has been in the way leaders communicate and form policy.
“Politicians have always twisted the truth, but there has been a measurable shift away from speaking about evidence towards justifying decisions based merely on beliefs,” Ecker says. Now, with the emergence of generative AI and deepfakes, sifting truth from misinformation is becoming ever more challenging.
Ecker’s recent research examines how AI-generated deepfake videos can influence people’s judgments even when viewers know a video is fake.
“As this technology matures, we will no longer be able to tell the difference between AI-generated and real content,” Ecker says.
“One implication is that I’m less concerned about how deepfakes will change people’s beliefs and more concerned that people will no longer trust any information.”