
Technology: facial recognition to eye scans and thought control

Devices that scan your eyes, glean your mood and act on signals from your brain are no longer science fiction.

Intel’s Mooly Eden reveals the RealSense camera at the 2014 Consumer Electronics Show.

You stare at a ceiling light and it switches on. The same applies when you stare at the coffee machine or focus your eyes on the button showing your preferred washing machine cycle. You refocus on the “on” button and away it goes.

Looking intently at the television switches it on and you watch a streaming channel. The ads all look appealing. Somehow the TV knows which ads you like from your mood when watching earlier ones. And your home robot slinks around the corner, out of sight, having discerned you are in a filthy mood.

This isn’t telepathy. It isn’t the distant future. It’s part of how we are about to communicate with electronic devices. It’s potentially our most intimate interaction with machines.

Devices that scan your eyes, judge where you look, glean your mood and act on electrical signals from your brain are not science fiction. Versions exist now and they’re coming to your phone, tablet, notebook and PC.

Even basic face recognition software is becoming more sophisticated and reaching new markets.

In Israel, a Tel Aviv start-up called Faception has reportedly developed technology that identifies character traits and mood, including spotting potential terrorists, by analysing a person’s face.

In Russia, start-up NTechLab has launched an app called FindFace that reportedly matches photos you take in the street against the profiles of members of Russian social network Vkontakte.

News site Digital Trends reports that so far FindFace has performed about one quadrillion photo comparisons using images from Vkontakte, which has about 200 million profiles.
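Under the hood, systems like this typically reduce each face to a numeric “embedding” vector and then hunt for the most similar stored vector. Here is a minimal sketch of that matching step in Python; the function names, data layout and threshold are assumptions for illustration, not anything from FindFace:

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity of two face-embedding vectors (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(query, profiles, threshold=0.75):
    """Return the profile id whose stored embedding is most similar to
    the query face, or None if nothing clears the (made-up) threshold."""
    best_id, best_score = None, threshold
    for profile_id, embedding in profiles.items():
        score = cosine_similarity(query, embedding)
        if score > best_score:
            best_id, best_score = profile_id, score
    return best_id
```

At Vkontakte’s scale, a linear scan over 200 million profiles would be far too slow; production systems use approximate nearest-neighbour indexes for this step.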

That’s just the start. Companies that specialise in eye tracking, that harness cameras and artificial intelligence to read and interpret emotions, or that build devices to pick up the electrical signals the brain sends to muscles form a growing multi-billion-dollar economy. And the big tech companies are investing in them.

In 2012, Intel pumped $US21 million into Swedish eye-tracking tech firm Tobii, taking about a 10 per cent stake.

Medical diagnostics and assessment, lie detection, attention monitoring, training and rehabilitation are among the applications Tobii lists for the technology.

These technologies can be godsends for people with disabilities. Tobii’s tracking system lets physically disabled people control a Windows 10 system with their eyes; it is designed for those with spinal cord injury, motor neurone disease, Rett syndrome or muscular dystrophy.
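Eye-control systems of this kind commonly rely on “dwell” selection: if your gaze rests inside a button’s bounds for long enough, the system treats it as a click. A rough sketch of that idea follows; the coordinates, timing and helper names are assumptions, not Tobii’s API:

```python
import time

DWELL_SECONDS = 1.0  # how long the gaze must rest on a target to "click" it

def dwell_select(gaze_stream, targets):
    """Yield a target id each time the gaze rests inside its bounds for
    DWELL_SECONDS. gaze_stream yields (x, y) screen points; targets maps
    an id to an (x0, y0, x1, y1) rectangle."""
    current, since = None, None
    for x, y in gaze_stream:
        hit = next((tid for tid, (x0, y0, x1, y1) in targets.items()
                    if x0 <= x <= x1 and y0 <= y <= y1), None)
        if hit != current:                      # gaze moved to a new target
            current, since = hit, time.monotonic()
        elif hit is not None and time.monotonic() - since >= DWELL_SECONDS:
            yield hit                           # dwell complete: a "click"
            current, since = None, None         # reset so it fires only once
```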

Eye tracking can also be used to evaluate the user experience of a device and to gather responses in infant and child research.

Use has spread quickly to the mainstream. At last January’s Consumer Electronics Show in Las Vegas, gaming company MSI showed off a laptop with integrated Tobii eye tracking called the GT72S Tobii.

According to MSI, “Both game characters and environments react to your gaze, focus and attention.” “Eye tracking gaming is the next evolution in PC gaming,” it claims.

Apple has invested in technology that assesses mood by buying San Diego-based artificial intelligence firm Emotient. Marketers can gauge how receptive you are to their message by observing your reaction via the device’s camera.

Many will see this as a gross invasion of privacy.

But Apple’s use of Emotient’s technology could be cast differently. Using Apple’s ResearchKit framework, North Carolina’s Duke University has developed an Autism & Beyond program that gauges emotion for research purposes.

An app uses the front-facing HD camera in an iPhone, along with facial recognition algorithms, to analyse reactions to videos in children as young as 18 months.

The iPhone’s camera records emotional responses as each child watches four videos.
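In outline, an app like this scores each camera frame for a set of emotions and aggregates the scores over the course of each video. A hypothetical sketch, with an assumed classify_frame function standing in for the real facial-coding model:

```python
from collections import Counter

def summarise_reactions(frames, classify_frame):
    """Tally the dominant emotion in each camera frame captured while a
    video plays. classify_frame is assumed to return a dict mapping
    emotion names to probabilities; the model itself is not shown."""
    tally = Counter()
    for frame in frames:
        scores = classify_frame(frame)            # e.g. {"joy": 0.8, ...}
        tally[max(scores, key=scores.get)] += 1   # count the top emotion
    total = sum(tally.values()) or 1
    return {emotion: count / total for emotion, count in tally.items()}
```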

“We hope that this technology may one day be used to screen young children in their homes for autism and mental health challenges, such as anxiety or tantrums,” the project’s website says.

News reports noted in March that Apple’s acquisition of Emotient had thwarted Intel’s plans to incorporate Emotient mood sensing in its RealSense 3-D camera. Intel’s RealSense camera has been pivotal in Microsoft’s successful rollout of Windows Hello, which uses facial recognition to log users into Windows 10.

The Intel RealSense camera is now used by Windows Hello for face recognition.

On its website, Emotient lists three key performance indicators it measures. There’s attention (whether advertising is getting noticed), engagement (whether people are responding emotionally) and sentiment (whether they show positive, negative or no emotion).

Mood recognition systems generally analyse facial expressions to detect fear, anger, sadness, joy, disgust, surprise and contempt.
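One plausible way to derive those three indicators from per-frame emotion scores is sketched below; the grouping of emotions into positive and negative, and the cut-off values, are assumptions for illustration rather than Emotient’s published definitions:

```python
POSITIVE = {"joy", "surprise"}
NEGATIVE = {"fear", "anger", "sadness", "disgust", "contempt"}

def frame_kpis(face_detected, scores):
    """Map one frame's emotion probabilities onto the three indicators.
    scores maps each of the seven emotions to a 0-1 probability."""
    attention = 1.0 if face_detected else 0.0      # was the ad even seen?
    engagement = sum(scores.values())              # any emotional response?
    balance = (sum(scores[e] for e in POSITIVE) -
               sum(scores[e] for e in NEGATIVE))
    if balance > 0.1:
        sentiment = "positive"
    elif balance < -0.1:
        sentiment = "negative"
    else:
        sentiment = "neutral"
    return {"attention": attention, "engagement": engagement,
            "sentiment": sentiment}
```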

Emotient is not the only player in this space; there are dozens. One of them, Affectiva, has a website where you can try out its Affdex technology: it measures your emotional response to a series of videos and presents the results in graphical form.

As well as eye tracking and mood recognition, there is technology that measures the electric signals from the brain. A paralysed person may not be able to move their leg or their finger but their brain still can signal their intention to do so. This signal can be repurposed to represent, say, an instruction to type a letter on a keyboard or to move a wheelchair forward.

Simply trying to move the muscle can be enough to generate a signal.
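At its simplest, turning such a signal into a usable switch is a thresholding problem: watch the amplitude of the electrical activity and fire one event each time it spikes well above the resting baseline. A hypothetical sketch, not Control Bionics’ implementation:

```python
def detect_switch_events(samples, baseline, gain=3.0):
    """Yield one event per burst of electrical activity. samples is an
    iterable of signal amplitudes; a burst is any run of samples more
    than `gain` times the resting baseline."""
    in_burst = False
    for amplitude in samples:
        if amplitude > gain * baseline:
            if not in_burst:
                in_burst = True
                yield amplitude        # one switch event per burst
        else:
            in_burst = False
```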

Measuring these electrical signals requires special equipment, which Australian company Control Bionics has built with its NeuroSwitch.

It was developed by chief executive and company founder Peter Ford, a former NBC, CNN and Seven Network newsreader who is also a programmer.

NeuroSwitch interfaces with Apple software called Switch Control, which lets anyone with impaired physical and motor skills navigate through onscreen items.

For example, you can control a MacBook, iOS device or Apple TV, or type a particular character by synchronising a muscle pulse with a light that passes over that character on an onscreen keyboard. You trigger the keypress by thinking about moving the paralysed body part.
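That scanning pattern, in which a highlight steps across the characters and a single switch event selects whichever one is lit, can be sketched in a few lines; the layout and timing here are assumptions, not Apple’s Switch Control code:

```python
import itertools
import time

KEYS = "abcdefghijklmnopqrstuvwxyz "   # a simplified single-row layout
STEP_SECONDS = 0.6                     # how long each key stays highlighted

def scanning_keyboard(switch_pressed):
    """Step a highlight across KEYS in a loop; whenever switch_pressed()
    returns True (say, on a detected muscle pulse), yield the key that
    was highlighted at that moment."""
    for key in itertools.cycle(KEYS):
        deadline = time.monotonic() + STEP_SECONDS
        while time.monotonic() < deadline:
            if switch_pressed():        # the user's single input
                yield key
                break                   # the highlight then keeps moving
```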

The US Veterans Administration has funded the installation of Ford’s NeuroSwitch system for qualifying disabled US veterans since 2007.

The system also has received some local support through the National Disability Insurance Scheme, Ford says.

He says the psychological interpretation of facial images has been around for at least 20 years, but it is now more sophisticated.

“It’s become a security issue and therefore highly funded,” Ford says. “The key to all these interfaces is that a computer is interpreting some kind of muscular movement.”

As well as assisting people with motor neurone disease, cerebral palsy and autism, Control Bionics is seeking to connect able-bodied people’s neural systems to the world via their phone.

So an executive in a meeting could think an action that would quietly trigger a yes or no response to an incoming email. Again, the system would pick up electrical pulses from the brain.

“It’s a huge breakthrough,” Ford says. “This technology that we originally developed for people who could not move or speak has massive applications for everybody.”

I previously trialled Ford’s system, using thoughts of muscle movement in Sydney to drive an Anybots remote presence robot down a street in Santa Clara, California. The technology exists.

Ford, who’s also a former NASA correspondent, tells me he had approached NASA with a plan for a NeuroSwitch user to drive the Mars Rover using this brain-thought technology. There is something futuristic about earthlings controlling objects on other celestial bodies by thought, but he says NASA did not agree, probably because of the cost of the Rover.

Nevertheless, “it will inevitably happen”, he says. “It (neural control) is a lot easier as you don’t have the interfacing of doing one action to move a joystick and have that joystick interpret that electrically. It (the signal) goes straight to your electrical system, the rover or aircraft or whatever you want to control.”

Whatever the case, all these instances show that the age of simulated telepathy that connects man and machine intimately is almost upon us.

