Department of Brain and Cognitive Sciences (BCS)
Thesis Defense

Towards understanding facial movements in real life

Speaker(s)
Tuan Le Mau, Brown Lab
December 14, 2018
8:30 pm - 10:30 pm
Location
Brain and Cognitive Sciences Complex, 43 Vassar Street, McGovern Seminar Room 46-3189, Cambridge MA
Contact
Department of Brain and Cognitive Sciences
Description

It is commonly assumed that there is a reliable one-to-one mapping between a particular configuration of facial movements and the specific emotional state that it supposedly signals. One common way to test this one-to-one hypothesis is to ask people to deliberately pose the facial configurations that they believe they use to express emotions. Participants are randomly sampled, without regard to their emotional expertise, and are given a single emotion word or a single, brief statement to describe each emotion category. They then deliberately pose the facial configuration that they believe they make when expressing instances of that category. Such studies routinely find that participants from different countries show moderate to strong evidence for a one-to-one mapping between an emotion category and a single facial configuration (its presumed facial expression).

In Study 1, we examined the facial configurations posed by emotion experts: famous actors who were provided with a diverse sample of richly described, context-rich scenarios. Participants inferred the emotional meaning of the scenarios, which were then grouped into categories. Systematic coding of the facial poses for each emotion category revealed little evidence for the hypothesis that each category has a diagnostic facial expression. Instead, we observed a high degree of variability among the experts' facial poses for any given emotion category, and little specificity for any pose. Furthermore, an unsupervised statistical analysis discovered 29 novel emotion categories with moderately consistent facial poses.

In Study 2, participants were asked to infer the emotional meaning of each facial pose when presented alone, or when presented in the context of its eliciting scenario.
In fact, the majority of studies designed to test the one-to-one hypothesis ask people from various cultures to judge posed configurations of facial movements, such as a scowl (the proposed facial expression for anger) or a frown (the proposed expression for sadness), on the assumption that these facial configurations, as universal expressions of emotional states, co-evolved with the ability to recognize and read them. These studies routinely show participants one facial configuration posed by multiple posers for each emotion category, and their findings vary with the experimental method used. Our analyses indicated that participants' inferences about the emotional meaning of the facial poses were influenced more by the eliciting scenarios than by the physical morphology of the facial configurations. These findings strongly replicate emerging evidence that the emotional meaning of any set of facial movements may be much more variable and context-dependent than hypothesized by the common one-to-one view, which continues to shape the public understanding of emotion, and hence education, clinical practice, and applications in government and industry.
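The abstract does not describe the unsupervised analysis in detail; as a purely illustrative sketch of the general idea, the snippet below groups posed facial configurations, coded as made-up action-unit (AU) vectors, with a small hand-rolled k-means. The AU vectors, the number of clusters, and the distance measure are all assumptions for this example, not the thesis's actual method.

```python
# Illustrative only: AU vectors, cluster count, and distance are assumptions.
def kmeans(vectors, k, iters=10):
    # Deterministic init: use the first k distinct vectors as centers.
    centers = []
    for v in vectors:
        if list(v) not in centers:
            centers.append(list(v))
        if len(centers) == k:
            break
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in vectors:
            # Assign each pose to the nearest center (squared Euclidean).
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(v, centers[c])))
            clusters[i].append(v)
        for c, members in enumerate(clusters):
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return clusters

# Toy data: two posers "scowling" (AU4+AU7 on) and two "smiling" (AU6+AU12 on).
poses = [(1, 1, 0, 0), (1, 1, 0, 0), (0, 0, 1, 1), (0, 0, 1, 1)]
clusters = kmeans(poses, k=2)
sizes = sorted(len(c) for c in clusters)
print(sizes)  # → [2, 2]
```

In the actual analyses, poses would be high-dimensional codings from many experts, and the number of categories would be discovered by the algorithm rather than fixed in advance.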

Although more ecologically valid research on how people actually move their faces to express emotion is urgently needed, such work has been immensely difficult without tools that support capturing facial data in real life, automatically processing those data, and verifying and analyzing the results. We developed a system of technological tools to support the investigation of facial movements during emotional episodes in naturalistic settings, using dynamic and longitudinal facial data. We then collected, pre-processed, verified, and analyzed data from YouTube with our newly developed tools. In particular, we examined two talk-show hosts and present preliminary answers to questions that were previously very difficult to investigate.
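The tools themselves are not specified in this abstract; the skeleton below is a hypothetical illustration of the four stages it names (capture, automatic processing, verification, analysis), with placeholder implementations operating on synthetic frames rather than real video.

```python
# Hypothetical skeleton: every stage here is a placeholder on synthetic data.
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float      # seconds into the video
    au_intensities: dict  # action-unit id -> intensity in [0, 1]

def capture(video_id):
    # Placeholder for sampling frames from a real video (video_id is unused
    # here; a real implementation would download and decode the footage).
    return [Frame(float(t), {"AU12": 0.8 if t % 2 == 0 else 0.1})
            for t in range(6)]

def verify(frames):
    # Stand-in for the verification step: drop frames whose AU intensities
    # fall outside the valid [0, 1] range.
    return [f for f in frames
            if all(0.0 <= v <= 1.0 for v in f.au_intensities.values())]

def analyze(frames):
    # Longitudinal summary: mean intensity per action unit across frames.
    totals = {}
    for f in frames:
        for au, v in f.au_intensities.items():
            totals.setdefault(au, []).append(v)
    return {au: sum(vs) / len(vs) for au, vs in totals.items()}

frames = verify(capture("example-video-id"))
summary = analyze(frames)
print(round(summary["AU12"], 2))  # → 0.45
```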


The thesis can be read here: https://www.dropbox.com/sh/7j2ly0wlhkyqd71/AAAoN-x77FPTWRrWWnUq4u4oa?dl=0

MIT Department of Brain and Cognitive Sciences
Massachusetts Institute of Technology
77 Massachusetts Avenue, Room 46-2005
Cambridge, MA 02139-4307 | (617) 253-5748
