
Eye-tracking technology can use your eyes as a ‘window’ into your mind, study finds

A study published in the journal Frontiers in Human Neuroscience found that technology that tracks a person’s eye movements can predict if a person is agreeable, conscientious, extroverted or neurotic. Creative Commons

When it comes to personality, it could all be in the eyes.

That’s according to a new study published in the journal Frontiers in Human Neuroscience, which found that artificial intelligence can use your eyes as a ‘window’ into some of the most basic parts of your personality.

Researchers from Germany and Australia placed eye-tracking technology on 50 people at Flinders University in Adelaide, South Australia, and used it to monitor their eye movements as they went about their usual days. The scientists also gave the subjects three often-used questionnaires to determine how neurotic, extroverted, agreeable, open and conscientious they were.

Those descriptors are known as the “Big Five” personality traits. The researchers tasked the technology with guessing the personalities of the study’s subjects using algorithms, then compared its predictions with the questionnaire data to determine just how accurate the artificial intelligence could be.

The research, as noted by Forbes, found that the technology “reliably” guessed all of the personality traits except openness.

While far from perfect, the technology was able to gauge people’s personalities “well above chance,” the researchers noted. Researchers also examined subjects during different activities — like “on the way to the shop vs. the way back to the laboratory” — and then examined the predictions the algorithm made for both of the scenarios.

Tobias Loetscher, from the University of South Australia, which participated in the study, said in a press release that the study suggests there could be a future when robots can actually understand the emotions of people they are dealing with.

“People are always looking for improved, personalized services,” he said. “However, today’s robots and computers are not socially aware, so they cannot adapt to non-verbal cues. This research provides opportunities to develop robots and computers so that they can become more natural, and better at interpreting human social signals.”

Olivia Carter, a neuroscientist at the University of Melbourne, told the New Scientist that the findings, while intriguing, could also foreshadow a new privacy problem for those who encounter eye-tracking technology.

“If the same information could be gained from eye recordings or speech frequency,” she said, “then it could easily be recorded and used without people’s knowledge.”

