Smart Eye Extends Use of Affectiva Emotion AI for Qualitative Research with Conversational Engagement and Valence Metrics

These market-first capabilities use facial analysis to provide deeper insight into consumer responses to content in online qualitative research.

Smart Eye announces new capabilities in its category-defining Affectiva Emotion AI that provide deeper insights for online qualitative research than were previously available. This release adds conversational engagement and valence metrics that use facial expression analysis to understand the emotional states and reactions of participants speaking in online qualitative research studies, such as focus groups and verbatim video feedback.

The new conversational engagement and valence metrics augment the “human touch” of study moderators, who can now gain additional insight more quickly and effectively during online studies. This has become critically important during the pandemic, when research studies that were previously conducted in person moved online.

Affectiva’s Emotion AI technology is used by 70 percent of the world’s largest advertisers and 28 percent of Fortune Global 500 companies to understand viewers’ emotional reactions to content and experiences, maximizing brand ROI. With the help of Emotion AI, clients can test the unbiased and unfiltered emotional responses that consumers have to brand content, such as video ads and longer TV programming. Affectiva’s technology offers validated, exclusive measures that give confidence in market performance and provide companies with clear guidance on the emotional role of their brand. The device-agnostic system runs across mobile, tablet, desktop and physical environments, and works with optical sensors, standard webcams, and near-infrared and RGB cameras. Today’s announcement expands the capabilities and uses of the company’s market-leading Affectiva Emotion AI.


Conversational Metrics: Bringing the “human touch” of deep analysis back to a virtual research environment

While Affectiva’s Emotion AI for Media Analytics has traditionally analyzed people watching content and expressing emotional responses to it, today’s announcement brings an interactive element to the technology. In a world that has moved online, and is expected largely to stay there, it is important for focus group moderators to be able to pick up on the subtle discrepancies between what is said and what is felt in a virtual environment.

The Conversational Engagement and Conversational Valence metrics have been developed to support the research community whose business it is to talk directly to users of services. Built on deep learning, the new metrics account for the distortions in facial expression produced when people speak, allowing more accurate inference of responses based on those expressions. An adaptive version of the metrics is particularly suited to focus group discussion videos with multiple participants. Using a speech detector, it applies the new conversational metrics only to those sequences and individuals who are speaking, so the optimal measures are deployed whether people are speaking or reacting to the speech of others, as demonstrated in this video.
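As a rough illustration of how such speech-gated analysis can work in principle, the sketch below switches between a conversation-aware scoring function and a standard one depending on a speech detector’s output. All class names, weights and thresholds here are hypothetical placeholders, not part of the Affectiva SDK or its actual models.

```python
# Minimal, hypothetical sketch of speech-gated conversational metrics.
# Nothing here is a real Affectiva API; the weights and thresholds are
# illustrative only.

from dataclasses import dataclass
from typing import List


@dataclass
class Frame:
    participant_id: str
    timestamp_ms: int
    smile: float           # raw expression score, 0..1
    brow_furrow: float      # raw expression score, 0..1
    voice_activity: float   # 0..1, output of a speech detector


def conversational_valence(frame: Frame) -> float:
    """Valence estimate that discounts mouth-driven distortion while speaking."""
    # Hypothetical re-weighting: rely less on mouth cues during speech.
    return 0.3 * frame.smile - 0.7 * frame.brow_furrow


def standard_valence(frame: Frame) -> float:
    """Valence estimate for non-speaking (viewing or listening) frames."""
    return 0.7 * frame.smile - 0.3 * frame.brow_furrow


def score_session(frames: List[Frame], speech_threshold: float = 0.5) -> List[float]:
    """Apply the conversational metric only where the speech detector fires."""
    scores = []
    for frame in frames:
        if frame.voice_activity >= speech_threshold:
            scores.append(conversational_valence(frame))
        else:
            scores.append(standard_valence(frame))
    return scores


if __name__ == "__main__":
    session = [
        Frame("p1", 0, smile=0.8, brow_furrow=0.1, voice_activity=0.9),   # speaking
        Frame("p1", 40, smile=0.6, brow_furrow=0.1, voice_activity=0.1),  # listening
    ]
    print(score_session(session))
```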

Smart Eye customers that augment their qualitative research with this Affectiva technology can validate the “gut feel” their moderators have when working with participants. Conversational engagement and valence measurements offer an efficient way to analyze study data and quickly identify emotional moments. These metrics can also provide more compelling evidence in debriefs of the emotional power of the topics tested.


These metrics can be applied to most areas of qualitative research, and no other facial expression analysis tool yet offers this capability. While the focus remains on facial expressions, the metrics add a unique additional dimension designed to be used alongside respondents’ verbal answers for a more complete picture of their reactions.

“Affectiva has been helping market researchers for over a decade now understand how consumers emotionally respond to all sorts of content,” said Dr. Rana el Kaliouby, former Co-Founder and CEO of Affectiva, now Deputy CEO of Smart Eye. “The addition of conversational engagement and valence to our Emotion AI provides an even more robust way to detect viewer engagement and excitement when they are discussing ideas and concepts, rather than simply viewing content. I am excited that we are now able to bring this unique capability to our customers who have eagerly anticipated it.”

“Access to this kind of analysis has been a game changer for our moderators,” Affectiva Smart Eye customer Sarah Gorman, Director at Two Ears One Mouth, commented. “It’s exciting not only to see the next iteration of this innovative Emotion AI technology, but to collaborate with a partner that understands and prioritizes working with us to solve for the inevitable new challenges we face today and in the future.”

Conversational Engagement and Conversational Valence will be made available to Affectiva Emotion AI customers this September in both the Affectiva SDK and the core media analytics product.
