MLBytes Workshop: Building Emotionally Intelligent AI: From Sensing to Synthesis
Emotions play an important role in our everyday lives: they influence memory, decision-making, and well-being. To advance the fundamental understanding of human emotions, build smarter affective technology, and ultimately help people, we need to perform research in situ. Leveraging advances in machine learning, it is now possible to quantify emotional responses at scale using webcams and microphones in everyday environments. I will present novel methods for physiological and behavioral measurement via ubiquitous hardware. I will then present state-of-the-art approaches to emotion synthesis that can be used to create rich human-agent and human-robot interactions. Finally, I will show examples of new human-computer interfaces and autonomous systems that leverage behavioral and physiological signals, including emotion-aware natural language conversation systems and vehicles with intrinsic emotional drives.
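To give a flavor of how a webcam can measure physiology, the sketch below shows the core idea behind imaging-based photoplethysmography: each heartbeat causes a subtle periodic color change in facial skin, which appears as a dominant frequency in the mean green-channel signal of a face region. This is a minimal illustration with a synthetic trace, not the speaker's actual method; real pipelines add face tracking, detrending, and motion-robust source separation.

```python
import numpy as np

def estimate_heart_rate(green_trace, fps):
    """Estimate heart rate (BPM) from a mean green-channel trace.

    Minimal sketch: remove the DC component, take the FFT, and pick
    the dominant frequency within the plausible human pulse band.
    """
    x = np.asarray(green_trace, dtype=float)
    x = x - x.mean()                          # remove DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)    # 42-240 BPM
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Synthetic 10-second trace at 30 fps: a 72 BPM pulse plus sensor noise.
np.random.seed(0)
fps, bpm = 30, 72
t = np.arange(0, 10, 1.0 / fps)
trace = 100 + 0.5 * np.sin(2 * np.pi * (bpm / 60.0) * t) \
        + 0.1 * np.random.randn(len(t))
print(round(estimate_heart_rate(trace, fps)))  # → 72
```

With a 10-second window at 30 fps the frequency resolution is 0.1 Hz (6 BPM), which is why practical systems trade off window length against responsiveness.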
Daniel McDuff is a Researcher at Microsoft, where he leads research and development of affective computing technology, with a focus on scalable tools that enable the automated recognition and analysis of emotions and physiology. Daniel completed his PhD in the Affective Computing Group at the MIT Media Lab in 2014 and holds bachelor's and master's degrees from Cambridge University. His work on noncontact physiological measurement helped spawn the new field of imaging-based photoplethysmography.