What is the Emotion Recognition Model?
Sentiment is frequently constrained to three broad classifications: positive, neutral, or negative. But your customers experience a much wider range of feelings, and understanding those emotions will allow you to improve their interactions with your brand. Our new SmartAI™ emotion recognition model lets you extract key experience emotions from textual data. The model works at the sentence level to detect and visualize seven core emotions your customers may experience: anger, disgust, fear, happiness, sadness, surprise, and trust.
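To illustrate what "sentence level" means here, the toy sketch below splits a verbatim into sentences and tags each one independently. This is not the SmartAI™ model itself; the lexicon, function name, and matching rule are all illustrative assumptions standing in for a trained classifier.

```python
# Illustrative toy only: the production model is a trained classifier.
# This sketch just shows the sentence-level idea with a hypothetical
# keyword lexicon (every name below is an assumption, not the product API).
import re

# Hypothetical lexicon mapping cue words to one of the 7 core emotions.
LEXICON = {
    "furious": "anger", "hate": "anger",
    "gross": "disgust",
    "worried": "fear", "scared": "fear",
    "love": "happiness", "great": "happiness",
    "disappointed": "sadness", "sad": "sadness",
    "unexpected": "surprise", "wow": "surprise",
    "reliable": "trust", "dependable": "trust",
}

def detect_emotions(verbatim: str) -> list[tuple[str, list[str]]]:
    """Return (sentence, detected emotions) pairs for a verbatim."""
    # Naive sentence split on terminal punctuation.
    sentences = re.split(r"(?<=[.!?])\s+", verbatim.strip())
    results = []
    for sentence in sentences:
        words = re.findall(r"[a-z']+", sentence.lower())
        found = sorted({LEXICON[w] for w in words if w in LEXICON})
        results.append((sentence, found))
    return results

# Each sentence is tagged on its own, so one review can carry
# several different emotions.
review = "The support team is reliable. I was disappointed by the delay."
for sentence, emotions in detect_emotions(review):
    print(sentence, "->", emotions)
```

Because each sentence is scored separately, a single review can surface both positive and negative emotions, which is exactly what a three-bucket sentiment score flattens away.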
Available to all current customers with a Standard or Admin seat.
When should this model be used?
You’ll want to use this model to build on the existing sentiment analysis of your customer reviews, feedback surveys, or social media interactions. Because of the complexity of verbal interactions, this model isn’t yet ready for contact center or call data.
Applying the Emotion Recognition Model
In your chosen workspace, open the Settings tab by clicking the gear icon.
In the Deploy Models wizard, expand the How are our customers feeling? menu by clicking the arrow.
From the available models, choose Emotion Recognition by clicking the plus icon (+).
Confirm your text field selection and click Start Analysis.
Or, for additional options, such as translation services or the model configuration type (surveys vs. social media), switch to advanced setup.
It may take some time for the stream to finish processing.
In the widget editor, select the Emotion Recognition dimension under How are our customers feeling?. You can choose from sentence level emotions or top level emotions.
To finish creating your widget, choose other dimensions as desired and select a visualization format. You can also customize the coloration. In this example, we’ll use a pie visualization. Be sure to Save your widget selections.
The widget will be available in your workspace.
To view all of the emotions detected within a verbatim, navigate to the Data tab and select the Emotion Recognition dimension. Then, click into a specific verbatim for a detailed breakdown.
Still have questions?
We're here to help! Don't hesitate to contact us via chat or submit a ticket for further assistance.