💗 WIA Emotion AI Standard Ebook | Chapter 1 of 8


💗 Chapter 1: Introduction to Emotion AI

Hongik Ingan (弘益人間)

"Benefit All Humanity"

The WIA Emotion AI Standard is built on the philosophy that understanding human emotions is fundamental to creating technology that truly serves humanity.


1.1 What is Emotion AI (Affective Computing)?

Emotion AI, also known as Affective Computing, is a multidisciplinary field that combines artificial intelligence, computer science, psychology, and cognitive science to develop systems that can recognize, interpret, process, and simulate human emotions.

[i] Definition: Affective Computing is the study and development of systems and devices that can recognize, interpret, process, and simulate human affects (emotions, moods, and attitudes).

1.1.1 The Origin: Rosalind Picard

The field of Affective Computing was established by Dr. Rosalind Picard at the MIT Media Lab in 1995. Her seminal paper "Affective Computing" laid the foundation for what would become a multi-billion dollar industry.

Founder Rosalind Picard, Sc.D.
Institution MIT Media Lab
Year 1995
Key Publication "Affective Computing" (1997 book)
Core Thesis Emotions are essential to human intelligence and decision-making

1.1.2 Why Emotions Matter in Computing

Picard's groundbreaking insight was that emotions are not separate from rational thought; they are essential to it. Research in neuroscience, particularly by Antonio Damasio, showed that people with damage to emotional centers of the brain struggle to make even simple decisions.

Key Insights:

  1. Emotions are not separate from rational thought; they are essential to it.
  2. Damage to the brain's emotional centers impairs even simple decision-making (Damasio).
  3. Machines that recognize and respond to emotion can serve people more effectively.


1.2 Market Size and Growth

The Emotion AI market has experienced explosive growth and is projected to continue expanding rapidly:

Year               Market Size (USD)   Growth Rate
2020               $21.6 billion       -
2021               $24.8 billion       14.8%
2022               $28.5 billion       14.9%
2023               $32.7 billion       14.7%
2026 (projected)   $37.1 billion       CAGR 13.5%

1.2.1 Market Segments

The Emotion AI market spans multiple industry verticals, including healthcare, marketing and consumer research, education, customer service, automotive, gaming and XR, and accessibility (each covered in Section 1.6).


1.3 Emotion Classification Models

1.3.1 Ekman's Discrete Model (6 Basic Emotions)

Dr. Paul Ekman's research in the 1970s identified six universal emotions that are recognized across all human cultures:

Emotion        Description                        Facial Characteristics                       Universal Recognition
😊 Happiness   Positive emotional state of joy    Raised cheeks, crow's feet, lip corners up   93%
😢 Sadness     Emotional pain and sorrow          Inner brow raised, lip corners down          84%
😠 Anger       Strong displeasure response        Lowered brows, tense jaw, narrowed eyes      90%
😨 Fear        Response to perceived threat       Wide eyes, raised brows, open mouth          85%
🤢 Disgust     Revulsion or strong disapproval    Wrinkled nose, raised upper lip              88%
😮 Surprise    Brief reaction to the unexpected   Raised brows, wide eyes, dropped jaw         81%

[i] Note: WIA Emotion AI Standard also includes Neutral as a seventh category, representing the absence of strong emotional expression.
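The six Ekman categories plus Neutral can be represented as a simple enumeration. A minimal sketch in Python; the class name and string values below are illustrative, not part of the standard:

```python
from enum import Enum

class EmotionCategory(Enum):
    """The six Ekman basic emotions plus Neutral (seven categories in total)."""
    HAPPINESS = "happiness"
    SADNESS = "sadness"
    ANGER = "anger"
    FEAR = "fear"
    DISGUST = "disgust"
    SURPRISE = "surprise"
    NEUTRAL = "neutral"
```

Using an enum rather than raw strings keeps category labels consistent across a system's face, voice, and text pipelines.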

1.3.2 The Dimensional Model (Valence-Arousal)

The dimensional model, developed by James Russell, represents emotions on a continuous two-dimensional space:

                     High Arousal
                          |
              Tense       |       Excited
                          |
    Negative -------------+------------- Positive
    Valence               |              Valence
              Bored       |       Content
                          |
                     Low Arousal

Quadrant Mapping:
  High Valence + High Arousal = Excited, Happy, Elated
  High Valence + Low Arousal  = Calm, Relaxed, Serene
  Low Valence  + High Arousal = Angry, Afraid, Stressed
  Low Valence  + Low Arousal  = Sad, Depressed, Bored

Valence: Ranges from -1 (negative) to +1 (positive), representing the pleasantness of an emotion.

Arousal: Ranges from -1 (low activation) to +1 (high activation), representing the intensity or energy level.
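As a minimal sketch, the quadrant mapping above can be expressed as a function of valence and arousal, both in [-1, +1]; the function name and combined labels are illustrative:

```python
def quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) point in [-1, 1] x [-1, 1] to its quadrant label."""
    if not (-1.0 <= valence <= 1.0 and -1.0 <= arousal <= 1.0):
        raise ValueError("valence and arousal must be in [-1, 1]")
    if valence >= 0:
        # Positive valence: excited when energized, calm when not.
        return "excited/happy/elated" if arousal >= 0 else "calm/relaxed/serene"
    # Negative valence: stressed when energized, sad when not.
    return "angry/afraid/stressed" if arousal >= 0 else "sad/depressed/bored"
```

For example, `quadrant(0.8, 0.6)` returns `"excited/happy/elated"`, matching the high-valence, high-arousal quadrant above.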


1.4 FACS - Facial Action Coding System

1.4.1 History and Development

The Facial Action Coding System (FACS) was developed by Paul Ekman and Wallace V. Friesen in 1978. It provides a systematic way to describe facial movements in terms of Action Units (AUs).

1.4.2 Action Units

Action Units represent the contraction or relaxation of specific facial muscles. Each AU is assigned a number and name:

AU     FACS Name              Muscles                                         Description
AU1    Inner Brow Raiser      Frontalis (pars medialis)                       Raises inner portion of eyebrows
AU2    Outer Brow Raiser      Frontalis (pars lateralis)                      Raises outer portion of eyebrows
AU4    Brow Lowerer           Corrugator supercilii, Depressor supercilii     Lowers and draws eyebrows together
AU5    Upper Lid Raiser       Levator palpebrae superioris                    Raises upper eyelid
AU6    Cheek Raiser           Orbicularis oculi (pars orbitalis)              Raises cheeks, creates crow's feet
AU7    Lid Tightener          Orbicularis oculi (pars palpebralis)            Tightens eyelids
AU9    Nose Wrinkler          Levator labii superioris alaeque nasi           Wrinkles nose
AU10   Upper Lip Raiser       Levator labii superioris                        Raises upper lip
AU12   Lip Corner Puller      Zygomaticus major                               Pulls lip corners up (smile)
AU15   Lip Corner Depressor   Depressor anguli oris                           Pulls lip corners down (frown)
AU17   Chin Raiser            Mentalis                                        Raises chin
AU20   Lip Stretcher          Risorius                                        Stretches lips horizontally
AU23   Lip Tightener          Orbicularis oris                                Tightens lips
AU24   Lip Pressor            Orbicularis oris                                Presses lips together
AU25   Lips Part              Depressor labii inferioris, relaxed Mentalis    Parts lips
AU26   Jaw Drop               Masseter, Temporalis                            Drops jaw, opens mouth

1.4.3 Emotion-AU Mapping

Each basic emotion can be described by a combination of Action Units:

Emotion     Typical AU Combination                Description
Happiness   AU6 + AU12                            Cheek raiser + Lip corner puller (Duchenne smile)
Sadness     AU1 + AU4 + AU15                      Inner brow raise + Brow lower + Lip corner depress
Anger       AU4 + AU5 + AU7 + AU23                Brow lower + Upper lid raise + Lid tighten + Lip tighten
Fear        AU1 + AU2 + AU4 + AU5 + AU20 + AU26   Inner/outer brow raise + Brow lower + Upper lid raise + Lip stretch + Jaw drop
Disgust     AU9 + AU15 + AU16                     Nose wrinkle + Lip corner depress + Lower lip depress
Surprise    AU1 + AU2 + AU5 + AU26                Inner/outer brow raise + Upper lid raise + Jaw drop
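The mapping above can be used as a lookup table: given the Action Units detected in a face, find the emotion whose typical AU combination overlaps them most. A toy sketch using Jaccard similarity; production systems use trained classifiers rather than this heuristic:

```python
# Typical AU combinations from the table above (AU numbers only).
EMOTION_AUS = {
    "happiness": {6, 12},
    "sadness": {1, 4, 15},
    "anger": {4, 5, 7, 23},
    "fear": {1, 2, 4, 5, 20, 26},
    "disgust": {9, 15, 16},
    "surprise": {1, 2, 5, 26},
}

def closest_emotion(detected_aus: set[int]) -> str:
    """Return the emotion whose AU template best overlaps the detected AUs,
    scored by Jaccard similarity. A toy heuristic, not a production classifier."""
    def jaccard(a: set[int], b: set[int]) -> float:
        return len(a & b) / len(a | b) if (a | b) else 0.0
    return max(EMOTION_AUS, key=lambda e: jaccard(EMOTION_AUS[e], detected_aus))
```

For example, detecting AU6 + AU12 yields `"happiness"`, and AU1 + AU2 + AU5 + AU26 yields `"surprise"` (surprise matches all four exactly, while fear has two extra unmatched AUs).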

1.5 Input Modalities

Emotion AI systems can analyze emotions through multiple input channels:

1.5.1 Facial Expression Analysis

1.5.2 Voice/Speech Analysis

1.5.3 Text Sentiment Analysis

1.5.4 Biosignal Analysis
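One common way to combine the modalities above is late fusion: each channel produces a probability distribution over emotion categories, and the distributions are combined with per-modality weights. A minimal sketch; the weights and category names in the example are illustrative assumptions, not values from the standard:

```python
def fuse_modalities(predictions: dict[str, dict[str, float]],
                    weights: dict[str, float]) -> dict[str, float]:
    """Weighted late fusion: combine per-modality emotion probability
    distributions into a single normalized distribution."""
    fused: dict[str, float] = {}
    # Normalize over only the modalities actually present.
    total_weight = sum(weights[m] for m in predictions)
    for modality, dist in predictions.items():
        w = weights[modality] / total_weight
        for emotion, p in dist.items():
            fused[emotion] = fused.get(emotion, 0.0) + w * p
    return fused

face = {"happiness": 0.7, "neutral": 0.3}
voice = {"happiness": 0.5, "neutral": 0.5}
fused = fuse_modalities({"face": face, "voice": voice},
                        {"face": 0.6, "voice": 0.4})
# happiness: 0.6*0.7 + 0.4*0.5 = 0.62; neutral: 0.38
```

Because the weights are renormalized over the modalities present, the same function works when a channel (e.g. biosignals) is unavailable for a given user.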


1.6 Major Use Cases

1.6.1 Healthcare

1.6.2 Marketing and Consumer Research

1.6.3 Education

1.6.4 Customer Service

1.6.5 Automotive

1.6.6 Gaming and XR

1.6.7 Accessibility


1.7 WIA Philosophy: Hongik Ingan

弘益人間 (Hongik Ingan)

"Benefit All Humanity"

This ancient Korean philosophy guides the WIA Emotion AI Standard: technology that understands human emotions must be built to benefit all humanity, not a select few.


1.8 Chapter Summary

[OK] Key Takeaways:

  1. Definition: Emotion AI (Affective Computing) enables machines to recognize and respond to human emotions
  2. Origin: Founded by Rosalind Picard at MIT Media Lab in 1995
  3. Market: Projected to reach $37.1 billion by 2026
  4. Models: Ekman's discrete model (6 emotions) and dimensional model (Valence-Arousal)
  5. FACS: Systematic coding of facial expressions using Action Units
  6. Modalities: Face, voice, text, and biosignals
  7. Applications: Healthcare, marketing, education, automotive, gaming

1.9 Review Questions

  1. Who founded the field of Affective Computing and when?
  2. Name the six basic emotions identified by Paul Ekman.
  3. What are the two dimensions in the Valence-Arousal model?
  4. What is FACS and what are Action Units?
  5. List four input modalities for emotion recognition.
  6. Describe three use cases for Emotion AI in healthcare.

1.10 Looking Ahead

In Chapter 2, we will explore the current challenges in Emotion AI, including cultural differences, privacy concerns, accuracy limitations, and the critical need for standardization that the WIA Emotion AI Standard addresses.


Chapter 1 Complete | Approximate pages: 16

Next: Chapter 2 - Current Challenges


WIA - World Certification Industry Association

Hongik Ingan - Benefit All Humanity

https://wiastandards.com