When machines know what you’re feeling

Affectiva, a nearly two-year-old startup that grew out of MIT’s Media Lab, is teaching machines to understand what you feel. The company has developed technology that lets computers recognize human emotions from facial cues or physiological responses, and it uses that emotion-recognition capability to help brands improve their advertising and marketing messages. For example, Affectiva might train a webcam on viewers while they watch ads, tracking their smirks, smiles, frowns and furrowed brows to measure how surprised, amused or confused they are at each point in a commercial, then compare those responses across different demographics. Affectiva also makes a wearable biosensor that monitors the wearer’s emotional state through signals measured at the skin.
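
To make the "track responses throughout a commercial and compare across demographics" idea concrete, here is a minimal sketch, not Affectiva's actual pipeline: it assumes some upstream facial-expression classifier has already produced a per-frame smile score for each viewer, and the group labels, punchline timing and function names are all hypothetical.

```python
# Illustrative sketch: aggregate per-frame smile scores across viewers and
# demographics while an ad plays, then check whether smiles rise at the
# intended punchline. The per-frame scores are assumed inputs from some
# face-expression classifier; nothing here is Affectiva's real API.
import numpy as np

def demographic_smile_curves(scores, groups):
    """Average smile intensity over time for each demographic group.

    scores: array of shape (n_viewers, n_frames), smile intensity in [0, 1].
    groups: one label per viewer (e.g. "18-24", "25-34").
    Returns a dict mapping group label -> mean smile curve over the ad.
    """
    scores = np.asarray(scores, dtype=float)
    groups = np.asarray(groups)
    return {g: scores[groups == g].mean(axis=0) for g in np.unique(groups)}

def punchline_lift(curve, punchline_frame, window=15):
    """Lift in mean smile intensity shortly after the punchline vs. before."""
    before = curve[max(0, punchline_frame - window):punchline_frame].mean()
    after = curve[punchline_frame:punchline_frame + window].mean()
    return after - before

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_viewers, n_frames, punchline = 100, 300, 200
    scores = rng.uniform(0.0, 0.2, size=(n_viewers, n_frames))
    # Simulate the first demographic smiling after the punchline.
    scores[:50, punchline:punchline + 30] += 0.5
    groups = ["18-24"] * 50 + ["25-34"] * 50
    for group, curve in demographic_smile_curves(scores, groups).items():
        print(group, round(punchline_lift(curve, punchline), 3))
```

If one demographic shows a clear lift at the punchline and another shows none, that is exactly the kind of signal an ad creator could use to recut or retarget the spot.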
Which industries are interested in applying Affectiva’s emotion-recognition technology to their work?
The closest use case is measuring responses to media, whether you’re watching an advertisement, a movie trailer, a movie, a TV show or an online video. If an ad is supposed to be funny, but we look at 100 participants and none of them laugh, then we know it’s not really effective. The idea is to enable media creators to optimize their content. Political polling is another big area for us: measuring people’s responses to a political debate. There are applications in games, in all things social, and in health, too.

We can read your heart rate from a webcam without you wearing anything; we can just use the light reflected off your face, which reveals blood flow. Imagine having a camera on all the time monitoring your heart rate so that it can tell you if something’s wrong, if you need to get more fit, or if you’re furrowing your brow all the time and you need to relax.
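
The heart-rate claim rests on the idea that blood flow subtly changes the color of facial skin, so the average brightness of the face region in video oscillates at the pulse rate (often called remote photoplethysmography). The sketch below is a minimal, assumed version of that idea using a plain FFT; the frame rate, the face-region signal and the pulse band are illustrative, and this is not Affectiva's implementation.

```python
# Illustrative sketch: estimate heart rate from a per-frame signal of mean
# green-channel brightness over a detected face region. Face detection and
# video capture are assumed to have happened upstream.
import numpy as np

def estimate_heart_rate(green_means, fps, lo_bpm=45, hi_bpm=180):
    """Estimate heart rate in beats per minute from a 1-D brightness signal."""
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()                    # remove DC offset
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)  # frequencies in Hz
    power = np.abs(np.fft.rfft(signal)) ** 2
    # Keep only frequencies in a plausible human pulse band.
    band = (freqs >= lo_bpm / 60.0) & (freqs <= hi_bpm / 60.0)
    peak_hz = freqs[band][np.argmax(power[band])]
    return peak_hz * 60.0                              # Hz -> bpm

if __name__ == "__main__":
    fps, seconds, true_bpm = 30, 20, 72
    t = np.arange(fps * seconds) / fps
    # Synthetic face signal: a weak 72-bpm pulse buried in noise.
    trace = 0.5 * np.sin(2 * np.pi * (true_bpm / 60.0) * t) + 120.0
    trace += np.random.default_rng(1).normal(0.0, 1.0, t.size)
    print(f"Estimated heart rate: {estimate_heart_rate(trace, fps):.1f} bpm")
```

A real system would have to cope with head motion, lighting changes and compression noise, but the core signal-processing step is roughly this simple: isolate the face, track its color over time, and find the dominant frequency in the pulse band.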