New Scientist posted an interesting article, "Face-reading software to judge the mood of the masses" (tech, 28 May 2012). This is essentially a "many faces at once" version of the MIT Affective Computing Lab's Mindreader API, which has the typical MIT traits of being cutting-edge, exciting, high-performance, and agonisingly unusable outside that small research group's glass walls.
Most of the reason for keeping this amazing tech hidden is probably to protect its commercialisation potential (and to hedge the uncertainty over where the commercial value lies), as hinted at by the related spin-out Affectiva and its flagship product Affdex. Affdex is built to answer a troublesome question: does your product's viral video really make your potential customers express the facial emotions you briefed, and paid the advertising agency big bucks, to achieve?
In any case, the Mindreader API really is amazing and (in its current trained form) can classify, via a Dynamic Bayesian Network, these five emotions from a facial image: Smile, Attention, Surprise, Confusion, Dislike. I will resist the urge for a philosophical rant about what is and is not an emotion, but to the credit of the researchers involved, they are attempting to combine top-down (emotional state categories) and bottom-up (computer-detectable facial marker sets) approaches. The clash between the two will inevitably yield something somewhat arbitrary. And who knows, maybe these five are exactly the golden geese for measuring the emotional affect of product brand videos; the jury is out.
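To make the Dynamic Bayesian Network idea concrete: in its simplest form a DBN over a single hidden variable is just a hidden Markov model, where noisy per-frame evidence from facial markers is smoothed over time into a belief about the current emotion. The sketch below is a toy illustration of that filtering step only; it is not the actual Mindreader API, and the detector likelihoods, the five-way state space, and the `stay_prob` transition parameter are all made-up assumptions for demonstration.

```python
# Toy sketch (NOT the Mindreader API): forward filtering over the five
# emotion states, the simplest instance of a Dynamic Bayesian Network.
# All probabilities below are invented for illustration.

EMOTIONS = ["Smile", "Attention", "Surprise", "Confusion", "Dislike"]

def forward_filter(frame_likelihoods, stay_prob=0.8):
    """Fuse per-frame likelihoods P(markers | emotion) with a "sticky"
    transition model; returns P(emotion | all frames seen so far)."""
    n = len(EMOTIONS)
    switch = (1.0 - stay_prob) / (n - 1)   # prob. of hopping to another state
    belief = [1.0 / n] * n                 # uniform prior over emotions
    for lik in frame_likelihoods:
        # Predict: push the belief through the transition model.
        predicted = [
            sum(belief[j] * (stay_prob if j == i else switch) for j in range(n))
            for i in range(n)
        ]
        # Update: weight by this frame's observation likelihoods, renormalise.
        belief = [p * l for p, l in zip(predicted, lik)]
        z = sum(belief)
        belief = [b / z for b in belief]
    return dict(zip(EMOTIONS, belief))

# Three frames from a hypothetical marker detector: mostly "Smile",
# with one noisy middle frame that momentarily favours "Surprise".
frames = [
    [0.7, 0.1, 0.1, 0.05, 0.05],
    [0.2, 0.1, 0.5, 0.1, 0.1],   # noisy frame
    [0.7, 0.1, 0.1, 0.05, 0.05],
]
posterior = forward_filter(frames)
print(max(posterior, key=posterior.get))  # → Smile (temporal smoothing wins)
```

The point of the temporal model is visible in the noisy middle frame: a per-frame classifier would flip to "Surprise" for that instant, while the DBN's sticky transitions keep the overall belief on "Smile".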