Women and men look at faces and absorb visual information in different ways, according to a new study which suggests that there is a gender difference in understanding visual cues.
Researchers from Queen Mary University of London (QMUL) in the UK used an eye-tracking device to monitor almost 500 participants over a five-week period while they looked at a face on a computer screen and judged how much eye contact they felt comfortable with.
They found that women looked more at the left-hand side of faces and had a strong left-eye bias, and that they also explored the face much more than men did.
They observed that it was possible to tell a participant's gender from the pattern in which they scanned the face, with nearly 80 per cent accuracy.
Given the very large sample size, the researchers suggest this result is not due to chance.
“This study is the first demonstration of a clear gender difference in how men and women look at faces,” said lead author Antoine Coutrot from QMUL.
“We are able to establish the gender of the participant based on how they scan the actor's face, and we can rule out that this is driven by the culture of the participant, as nearly 60 nationalities have been tested,” said Coutrot.
“We can also eliminate any other observable characteristics like perceived attractiveness or trustworthiness,” Coutrot added.
The participants were asked to judge how comfortable they felt with the amount of eye contact they made with the actor in a Skype-like scenario.
Each participant saw the same actor (one of eight in total) throughout the testing period, which lasted around 15 minutes.
At the end of the session the researchers collected personality information about the participants through questionnaires.
“There are numerous claims in popular culture that women and men look at things differently – this is the first demonstration, using eye tracking, to support this claim that they take in visual information in different ways,” said Isabelle Mareschal from QMUL.
The researchers also suggest that this gender difference in scanning visual information might have implications for many research fields, such as autism diagnosis, and even for everyday behaviours like watching a movie or looking at the road while driving.
The study appears in the Journal of Vision.