Touch Influences the Way We See Faces

By Christine Hsu | Update Date: Sep 05, 2013 01:55 PM EDT

Even though we rarely touch the faces of people we meet, new research reveals that our sense of touch influences the way we see them.

Researchers explain that the way people perceive faces may be similar to the way they perceive everyday non-face objects and events.

"In daily life, we usually recognize faces through sight and almost never explore them through touch," lead researcher Kazumichi Matsumiya of Tohoku University in Japan said in a news release. "But we use information from multiple sensory modalities in order to perceive many everyday non-face objects and events, such as speech perception or object recognition. These new findings suggest that even face processing is essentially multisensory."

The researchers used the "face aftereffect" to test whether the visual system responds to non-visual signals when processing faces. People's perception tends to adapt to a facial expression such as happiness or anger, and this adaptation then causes them to see a subsequent neutral face as having the opposite expression.

The researchers reasoned that if the visual system really does respond to signals from another modality, there should be evidence of face aftereffects transferring from one modality to the other.

In the study, participants were asked to touch face masks hidden below a mirror. After this adaptation period, they viewed a series of faces with varying expressions and classified each face as happy or sad.

The findings revealed that participants' experience of touching the face masks shifted their perception of the faces they saw: the visual faces were perceived as having the opposite expression to that of the face mask.

The researchers also found that the face aftereffect works in the other direction, with sight influencing touch.

While current views on face processing assume that the visual system receives facial signals only from the visual modality, the researchers say the latest findings suggest that face perception may be cross-modal.

"These findings suggest that facial information may be coded in a shared representation between vision and haptics in the brain," said Matsumiya. He added that the study may have implications for improving vision and telecommunication aids for the visually impaired.

The findings are published in the journal Psychological Science.

© 2023 Counsel & Heal All rights reserved. Do not reproduce without permission.
