Our work aims to understand the correlates of emotion that a computer can potentially identify; these are primarily behavioral and physiological expressions of emotion. Because we can measure physical events but cannot read a person's thoughts, research in recognizing emotion is limited to correlates of emotional expression that a computer can sense, including physiology, behavior, and even word selection when talking. Emotion modulates not just memory retrieval and decision-making (things that are hard for a computer to know), but also many sense-able actions, such as the way you pick up a pencil or bang on a mouse (things a computer can begin to observe).
In assessing a user's emotion, one can also measure the individual's self-report of how they are feeling. However, self-report is often considered unreliable, since it varies with thoughts and situations such as "What is it appropriate to say I feel in the office?" and "How do I describe how I feel right now, anyhow?" Many people have difficulty recognizing and/or verbally expressing their emotions, especially when the emotions are mixed or nondescript. In many situations it is also inappropriate to interrupt the user for a self-report. Nonetheless, if a user wants to tell a system verbally about their feelings, we think the system should facilitate this. We are interested in emotional expression through verbal as well as non-verbal means: not just how something is said, but how word choice might reveal an underlying affective state.
We begin by looking at physiological correlates, measured both in lab situations designed to elicit emotional responses, and in ordinary (non-lab) situations, the latter via affective wearable computing.
There are a number of groups conducting research on emotion theory and physiology, which form a useful resource for informing our research.
Our first efforts toward affect recognition have focused on
detecting patterns in physiology that we receive from sensing devices. To this end, we are
designing and conducting experiments to induce particular affective
responses. One of our primary goals is to be able to determine which
signals are related to which emotional states -- in other words, how
to find the link between the user's emotional state and its
corresponding physiological state. We are hoping to use, and build upon,
some of the work done by others on coupling physiological information
with affective states. Several projects in
affective pattern recognition are listed below.
Current efforts are focused on physiological sensing.
Consider the case where a subject is given the following three tasks to perform:
The following figure shows the GSR response of a test subject. Each of
the tasks described above is followed by a period of rest before the subject
is engaged in the next task.
Graph of Galvanic Skin Response (GSR).
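The task-rest-task structure of such a recording can be analyzed by segmenting the trace into windows. Below is a minimal sketch; the sampling rate, window lengths, and signal values are hypothetical illustrations, not data from the experiment described above.

```python
# Sketch: segment a GSR recording into alternating task and rest
# windows, and compute the mean skin conductance during each task.
# All parameters and values here are hypothetical.

def mean_gsr_per_task(gsr, fs, task_s, rest_s, n_tasks):
    """Average GSR over each task window, skipping the rest period
    that follows it. `gsr` is a list of conductance samples and
    `fs` is the sampling rate in Hz."""
    task_len = int(task_s * fs)
    rest_len = int(rest_s * fs)
    means = []
    start = 0
    for _ in range(n_tasks):
        window = gsr[start:start + task_len]
        means.append(sum(window) / len(window))
        start += task_len + rest_len   # advance past task + rest
    return means

# Toy trace: three "tasks" of elevated conductance separated by rests
# (2 s of task, 2 s of rest, sampled at 4 Hz).
fs = 4
trace = ([5.0] * 8 + [2.0] * 8) * 3
print(mean_gsr_per_task(trace, fs, task_s=2, rest_s=2, n_tasks=3))
# -> [5.0, 5.0, 5.0]
```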
There is no definitive model of emotions. Psychologists have been debating
for years how to define them. The pattern recognition problem consists of
sorting observed data into a set of states (classes or categories)
which correspond to several distinct (but possibly overlapping, or "fuzzy") emotional
states. Which tools are most suitable to accomplish this depends on the
nature of the signals observed. An overview of different models used for affective
signals can be found in Chapter 6 of
Affective Computing (MIT Press, 1997).
One particular way we can model affective states is as a set of discrete states with defining characteristics that the user can transition to and from. The following diagram illustrates the idea:
A diagram of a Markov model for affective states.
Each emotional state in the diagram is defined by a set of features.
Features may be just about anything we can measure or compute: the
rise time of a response, the frequency range of a peak interval, and so on.
Therefore, an important part of the pattern recognition process consists
of identifying functions of these features which differentiate one state
from another. Each state in this model is integrated into a larger scheme
which includes other affective states the user can move to and from. The
transitions in this (Markov) model are defined by transition probabilities.
For instance, if we believe that a user in the affective state labeled as
"Anger" is more likely to make a transition to a state of "Rage"
than he is to a state of "Sadness", we need to adjust the conditional
probabilities to reflect that. A model like this one is trained on observations
of suitable sentic signals (physiological or other signals through which affective
content is manifested) to be able to characterize each state and estimate
the transition probabilities.
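Estimating transition probabilities from observations can be sketched as follows. The state names and the training sequence are hypothetical; a real system would obtain such labels from sentic signals rather than by hand.

```python
# Sketch: estimate the transition probabilities of a discrete Markov
# model of affective states from a labeled state sequence.
from collections import defaultdict

def estimate_transitions(sequence):
    """Count state-to-state transitions and normalize each row into
    conditional probabilities P(next state | current state)."""
    counts = defaultdict(lambda: defaultdict(int))
    for cur, nxt in zip(sequence, sequence[1:]):
        counts[cur][nxt] += 1
    probs = {}
    for cur, row in counts.items():
        total = sum(row.values())
        probs[cur] = {nxt: n / total for nxt, n in row.items()}
    return probs

# Hypothetical training sequence of affective states over time.
seq = ["Anger", "Rage", "Anger", "Rage", "Anger", "Sadness",
       "Anger", "Rage", "Rage", "Anger"]
P = estimate_transitions(seq)
print(P["Anger"])   # here, "Rage" is the most likely successor of "Anger"
```

In this toy sequence, a user in "Anger" moves to "Rage" more often than to "Sadness", so the estimated conditional probabilities reflect exactly the kind of asymmetry described above.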
Furthermore, the modeling of affective states can be adapted to reflect a particular user's affective map. Therefore, the notion of systems that learn from user interaction can be imported into the affective pattern recognition problem to develop robust systems. The following diagram offers an overview of the stages of the recognition process.
A diagram of the affect pattern recognition module.
Most current research is directed at the feature extraction stage:
determining which affective-signal features are relevant to submit to the
learner. Since the learner operates on the extracted features and not on
the signals themselves, it is possible, for instance, to take learners
developed for the visual domain and apply them to the processing of
affective signals.
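The feature extraction stage can be sketched as a function that reduces a raw signal window to a small feature vector for the learner. The particular features below (rise time, amplitude range, mean) are illustrative choices drawn from the examples above, not the group's actual feature set.

```python
# Sketch: reduce one raw affective-signal window to a feature vector.
# The feature set here is a hypothetical illustration.

def extract_features(window, fs):
    """Return (rise_time_s, amplitude_range, mean) for one signal
    window sampled at `fs` Hz. Rise time here is the time from the
    window's minimum sample to its subsequent maximum."""
    i_min = window.index(min(window))
    tail = window[i_min:]
    i_max = i_min + tail.index(max(tail))
    rise_time = (i_max - i_min) / fs
    return (rise_time,
            max(window) - min(window),
            sum(window) / len(window))

# Toy window: the signal rises from 1.0 to 4.0 over 3 samples at 2 Hz.
features = extract_features([1.0, 2.0, 3.0, 4.0, 3.5], fs=2)
print(features)   # -> (1.5, 3.0, 2.7)
```

Downstream, the learner sees only such feature vectors, which is what makes it possible to reuse learners built for other domains.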