Dawn of mind-reading machines brings fear for mental privacy
Researchers have been able to interpret what people were hearing or imagining using a non-invasive functional magnetic resonance imaging (fMRI) device.
Scientists have called for regulation to protect “mental privacy” after they showed it was possible to read people’s thoughts using a brain scanner.
In a series of experiments researchers were able to interpret crudely what people were hearing or even imagining using a non-invasive functional magnetic resonance imaging (fMRI) device.
The technique, reported in the journal Nature Neuroscience, was still far from perfect and required the co-operation of the subjects. However, the scientists said the potential of the technology was such that society needed to begin having conversations about what it might mean.
“The idea of having a perfect mind-reader is scary to us,” said Jerry Tang, from the University of Texas at Austin. “We think mental privacy is really important and that nobody’s brain should be decoded without their co-operation.”
The scanner works by looking at changes in blood flow. The three people involved spent 16 hours listening to podcasts in the scanner and this gave the researchers data on how their brains responded to different stories.
They then looked to see if they could do the reverse. The same people watched a film and also listened to another podcast but this time the scientists used the scanner data to try to predict what they were seeing and hearing.
Sometimes it captured the gist well. For instance, when the participants heard the speaker say, “I don’t have my driver’s licence yet,” one had their thoughts decoded as, “She has not even started to learn to drive yet.” Another’s came out, however, as, “I need to call a cab as I can’t get to work in my condition,” while the third’s was decoded as, “I had to get my licence to drive so my mom could have me out of the house.”
Even getting it thematically in the right area was better than many had thought possible. “We were shocked this worked as well as it does,” said Alexander Huth, from the university. “It was shocking and exciting.”
One possible application of the devices is to help people who are otherwise unable to communicate, but Tang feared it could also be used elsewhere, for instance in the judicial system. As with the polygraph lie detector test, which many scientists believe has limited validity, this could happen even if the technology remains imperfect.
He said we should think now about the policies that might be needed so we are ready if such technologies reach a stage where they can be adopted. “It’s important to regulate what brain data can and cannot be used for,” he said.
One part of the paper looked at how far the technique could be used against a subject’s will. The answer, so far, is that it cannot. The researchers showed it did not work without hours of training data, and that people could interfere with the process by, for instance, doing mental arithmetic while listening to a story.
Tang said we could not rely on these limitations, which is why we need to plan now. “If one day it does become possible to get accurate decoding without a person’s co-operation, we’ll have a regulatory foundation in place.”
The Times