Emotion expression database a new resource for researchers

Kristie Auman-Bauer
February 27, 2020

UNIVERSITY PARK, Pa. — The ability to understand facial expressions is an important part of social communication. However, little is known about how complex facial expressions signal emotions related to social behavior and inner thoughts. To answer these questions, Penn State researchers developed the Complex Emotion Expression Database (CEED), a digital stimulus set of 243 basic and 237 complex emotional facial expressions.

The database, described in a paper published recently in PLOS ONE, can serve as a resource for researchers interested in studying the developmental, behavioral, and neural mechanisms supporting the perception and recognition of complex emotion expressions.

Conducted by Margaret Benda, project coordinator of the Development of the Adolescent Social Health (DASH) Project, and Suzy Scherf, associate professor of psychology and principal investigator of the Laboratory of Developmental Neuroscience, the CEED was developed to satisfy the requirements of an ongoing research project.

“There are many subcategories of complex expressions, including social self-conscious expressions, such as guilt, and social sexual expressions, such as flirtatiousness,” said Scherf, who is also a Social Science Research Institute cofunded faculty member. “We were especially interested in creating a database of these expressions because they are often underrepresented in the literature and they are central to a hypothesis that we are exploring in our own work.”

The researchers knew from previous research that the ability to perceive and understand expressions emerges very early in caregiver-infant interactions. However, complex expression subtleties in romantic and friendship relationships don’t develop until adolescence.

“We theorize that puberty influences the way adolescents perceive some of these complex expressions,” Scherf said.

According to Benda, the CEED includes six basic expressions — angry, disgusted, fearful, happy, sad and surprised — and nine complex expressions — affectionate, attracted, betrayed, brokenhearted, contemptuous, desirous, flirtatious, jealous and lovesick — that were posed by eight formally trained, young black and white adult actors.

The 480 images in the database were reviewed by a total of 870 people, and each image was rated by at least 50 adults. In order for an image to be included in the database, 50 percent of raters needed to endorse the correct emotion label.
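The inclusion criterion described above can be sketched in a few lines of code. This is a minimal illustration, not the researchers' actual pipeline; the ratings data, function names, and the assumption that "50 percent" means "at least 50 percent" are all hypothetical.

```python
# Hypothetical sketch of the CEED inclusion criterion: an image is kept
# only if at least 50 percent of its raters endorsed the intended label.
# The ratings below are invented for illustration.

def endorsement_rate(ratings, intended_label):
    """Fraction of raters whose chosen label matches the intended one."""
    return sum(r == intended_label for r in ratings) / len(ratings)

def include_image(ratings, intended_label, threshold=0.5):
    """Apply the 50-percent endorsement threshold described in the article."""
    return endorsement_rate(ratings, intended_label) >= threshold

# Example: 50 raters, 30 of whom chose "flirtatious" for a flirtatious pose.
ratings = ["flirtatious"] * 30 + ["happy"] * 12 + ["attracted"] * 8
print(include_image(ratings, "flirtatious"))  # 60% endorsement, so included
```

Because each image was rated by at least 50 adults, a threshold of this kind gives a reasonably stable estimate of whether an expression is recognizable.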

The resulting database is being shared on the Databrary data library, co-founded and co-directed by Rick Gilmore, professor of psychology at Penn State, in coordination with Karen Adolph, professor of psychology and neural science at New York University.

“The database could be useful for other researchers investigating how children and adults perceive various emotional expressions and whether this ability is related to the development of social relationships,” said Benda. “Additionally, we described a clear strategy on how the CEED was developed, making the process replicable for other researchers to develop their own databases.”

The project was funded by the National Institutes of Health and supported by Penn State’s Department of Psychology and the Social Science Research Institute.


Last Updated May 18, 2020