Get Your Face Read!

[Image: Facial expressions analyzed by Affectiva]
Back in November, I mentioned that Affectiva, a firm in our neuromarketing companies list, was working on facial expression analysis using the webcams connected to most computers. Now, if you want to see how this works, you can watch a selection of recent Super Bowl ads with your camera turned on – you’ll see how your expressions stack up against those of other viewers.
Here’s what mine looked like for a Chevy ad:
[Image: My Affectiva results for the Chevy ad]
The system reports on a variety of metrics, including “smile” and the more cryptic “valence.” According to Affectiva:

Bayesian machine learning processes are used to combine the facial and head movements in order to recognize positive and negative displays of emotion as well as complex states such as interest and confusion.

[Image: Affectiva emotion plot]
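Affectiva hasn’t published the details of its model, but to make that quote a little more concrete, here’s a toy sketch of the general idea: treat a few per-frame facial and head-movement features as evidence, and combine them under a naive Bayes model to get a posterior over positive versus negative states. Every feature name and parameter below is invented for illustration; this is not Affectiva’s actual algorithm.

```python
# Toy sketch of Bayesian combination of facial/head-movement features.
# NOT Affectiva's model; feature names and numbers are invented.
import math

# Hypothetical per-frame features extracted from webcam video:
# (lip_corner_pull, brow_raise, head_nod_rate), each normalized to [0, 1].
frame = (0.8, 0.3, 0.5)

# Assumed Gaussian likelihoods P(feature | state) for two states;
# the means and the shared standard deviation are made-up numbers.
means = {
    "positive": (0.7, 0.4, 0.5),
    "negative": (0.2, 0.6, 0.3),
}
sigma = 0.25
prior = {"positive": 0.5, "negative": 0.5}

def log_likelihood(features, state):
    """Naive-Bayes log-likelihood: features treated as independent Gaussians."""
    return sum(
        -((x - mu) ** 2) / (2 * sigma ** 2)
        for x, mu in zip(features, means[state])
    )

# Posterior over states via Bayes' rule, normalized in log space.
logs = {s: math.log(prior[s]) + log_likelihood(frame, s) for s in prior}
z = max(logs.values())
total = sum(math.exp(v - z) for v in logs.values())
posterior = {s: math.exp(v - z) / total for s, v in logs.items()}

# "Valence" here is simply the signed difference between the two posteriors.
valence = posterior["positive"] - posterior["negative"]
print(posterior, round(valence, 3))
```

The real system presumably uses far richer features and learned parameters; the point is only how a Bayesian combination turns several weak facial cues into a single valence-style score.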
The firm strikes a cautious note in describing the limitations of the technique:

Although some people claim that their face-reading technologies “recognize emotions,” it’s important to note that the state of the technology is such that it recognizes outward expressions, which may or may not correspond to true feelings.
So, Neuromarketing readers, here’s your assignment:
  1. Go to the Affectiva face-reading demo (www.affectiva.com/affdex/?#pane_tryit) and view an ad or two with your webcam on.
    2. Come back here and let everyone know how it worked for you.
I’m not sure how I felt about the accuracy of my data. At one point, for example, I found the volume was too low and had to hunt for the controls; I can’t imagine that my expressions at that point had much to do with the ad. Plus, I’m not sure I emote very much when watching a typical ad. On the other hand, with a large enough sample, interruptions and irrelevant expressions affecting any one viewer should be canceled out by the mass of data (the sketch below illustrates the idea). And because it’s a web-based solution that runs on standard user equipment, it should scale easily. Let us know what you think!
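For what it’s worth, that averaging intuition is easy to check with a quick simulation: give every simulated viewer the same underlying “smile” response to the ad plus their own independent noise (the facial equivalent of hunting for the volume knob), and watch the averaged trace converge to the shared signal as the panel grows. This is just a toy model, not Affectiva’s aggregation method.

```python
# Toy illustration of why a large sample smooths out individual glitches.
# Simulate per-second "smile" traces: a common ad-driven signal plus
# independent per-viewer noise. All numbers are invented.
import random

random.seed(1)
seconds = 30
# Shared response: baseline 0.5, with a gag at seconds 10-15.
true_signal = [0.5 + 0.4 * (10 <= t <= 15) for t in range(seconds)]

def viewer_trace():
    # Each viewer sees the same signal but adds their own noise/interruptions.
    return [max(0.0, min(1.0, s + random.gauss(0, 0.3))) for s in true_signal]

for n in (1, 10, 1000):
    traces = [viewer_trace() for _ in range(n)]
    avg = [sum(t[i] for t in traces) / n for i in range(seconds)]
    # Mean absolute error between the averaged trace and the shared signal.
    err = sum(abs(a - s) for a, s in zip(avg, true_signal)) / seconds
    print(f"n={n:5d}  mean abs error = {err:.3f}")
```

The error of the averaged trace shrinks roughly as 1/sqrt(n), which is why a single distracted viewer matters little once the panel is large.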
