January 2011

If you were listening to NPR’s “All Things Considered” broadcast on January 18, you might have heard a brief report on research that reveals regional differences (“dialects”) in word usage, spellings, slang, and abbreviations in Twitter postings.  For example, Northern and Southern California use the spelling variants koo and coo to mean “cool.”

Finding regional differences in these written expressions is interesting in its own right, but I’ve just finished reading the paper describing this research and there’s a lot more going on here than simply counting and comparing expressions across different geographic regions.  The paper is an excellent example of what market researchers might do to analyze social media.

The study authors–Jacob Eisenstein, Brendan O’Connor, Noah A. Smith, and Eric P. Xing–are affiliated with the School of Computer Science at Carnegie Mellon University (Eisenstein, who was interviewed for the ATC broadcast, is a postdoctoral fellow).  They set out to develop a latent variable model that predicts an author’s geographic location from the characteristics of text messages.  As they point out, their work is unique in that it uses raw text data (although “tokenized”) as input to the modeling.  They develop and compare several models, including a “geographic topic model” that captures the interaction between base topics (such as sports) and an author’s geographic location, along with additional latent variable models:  a “mixture of unigrams” (which assumes a single topic per author) and “supervised Latent Dirichlet Allocation.”  If you have not yet figured it out, these models use statistical machine learning methods.  That means some of the terminology may be unfamiliar to market researchers, but the algorithm described for the geographic topic model resembles the hierarchical Bayesian methods using the Gibbs sampler that have come into fairly wide use in market research (especially for choice-based conjoint analysis).
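Of the comparison models, the mixture of unigrams is the easiest way to see what the Gibbs-sampling machinery actually does.  As a rough illustration only (this is my own minimal sketch in Python, not the authors’ implementation; the function name, the priors, and the toy data are all invented for the example), here is a collapsed Gibbs sampler that assigns each document a single latent topic and resamples that assignment conditional on all the others:

```python
import math
import random
from collections import defaultdict

def gibbs_mixture_of_unigrams(docs, n_topics, alpha=1.0, beta=0.1,
                              n_iter=200, seed=0):
    """Collapsed Gibbs sampler for a mixture of unigrams: each
    document gets exactly one latent topic, resampled in turn
    given the current assignments of all other documents."""
    rng = random.Random(seed)
    V = len({w for doc in docs for w in doc})  # vocabulary size

    # Random initialization and sufficient statistics.
    z = [rng.randrange(n_topics) for _ in docs]
    topic_docs = [0] * n_topics                       # docs per topic
    topic_words = [defaultdict(int) for _ in range(n_topics)]
    topic_total = [0] * n_topics                      # tokens per topic
    for d, doc in enumerate(docs):
        topic_docs[z[d]] += 1
        for w in doc:
            topic_words[z[d]][w] += 1
            topic_total[z[d]] += 1

    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            # Remove document d's counts before resampling its topic.
            k_old = z[d]
            topic_docs[k_old] -= 1
            for w in doc:
                topic_words[k_old][w] -= 1
                topic_total[k_old] -= 1

            # Log-probability of each topic, conditioning on the
            # other documents (Dirichlet-smoothed word counts).
            log_w = []
            for k in range(n_topics):
                lw = math.log(topic_docs[k] + alpha)
                seen = defaultdict(int)  # handles repeated words
                for i, w in enumerate(doc):
                    lw += math.log((topic_words[k][w] + seen[w] + beta)
                                   / (topic_total[k] + i + V * beta))
                    seen[w] += 1
                log_w.append(lw)

            # Sample a new topic from the normalized weights.
            m = max(log_w)
            weights = [math.exp(lw - m) for lw in log_w]
            k_new = rng.choices(range(n_topics), weights=weights)[0]

            z[d] = k_new
            topic_docs[k_new] += 1
            for w in doc:
                topic_words[k_new][w] += 1
                topic_total[k_new] += 1
    return z
```

Run on a handful of tiny “documents”–say, two dominated by koo and two by coo–the sampler will usually settle into two clean clusters.  The geographic topic model in the paper is far richer, layering regional variation in word use on top of this kind of basic topic structure.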

This research is important for market research because it demonstrates a method for estimating characteristics of individual authors from the characteristics of their social media postings.  While we have not exhausted the potential of simpler methods (frequency and sentiment analyses, for example), this looks like the future of social media analysis for marketing.

Copyright 2011 by David G. Bakken.  All rights reserved.


There’s an interesting article by Jonah Lehrer in the Dec. 13 issue of The New Yorker, “The Truth Wears Off:  Is there something wrong with the scientific method?”  Lehrer reports that a growing number of scientists are concerned about what psychologist Joseph Banks Rhine termed the “decline effect.”  In a nutshell, the decline effect is the tendency for the size of an observed effect to shrink over the course of studies attempting to replicate it.  Lehrer cites examples from studies of the clinical outcomes for a class of once-promising antipsychotic drugs as well as from more theoretical research.  This is a scary situation given the inferential nature of most scientific research.  Each set of observations represents an opportunity to disconfirm a hypothesis, and as long as subsequent observations don’t lead to disconfirmation, our confidence in the hypothesis grows.  The decline effect suggests that replication is more likely, over time, to disconfirm a hypothesis than not.  Under those circumstances, it’s hard to develop sound theory.

Given that market researchers apply much of the same reasoning as scientists in deciding what’s an effect and what isn’t, the decline effect is a serious threat to creating customer knowledge and making evidence-based marketing decisions.

Last week Starbucks CEO Howard Schultz unveiled a new logo for the brand.  By now we should know that any attempt to mess with a popular brand (formula, name or logo) is going to elicit at least a few expressions along the lines of “what are they thinking?”

Schultz tried to communicate just what they were thinking in an online video, saying that the new design respects the brand’s heritage but also looks toward a future for the firm that’s not just coffee.  And on the Starbucks website one of the company’s creative managers, calling the logo change “the project of a lifetime,” says:  “From the start, we wanted to recognize and honor the important equities of the iconic Starbucks logo.”

Firms do change logos without the flak that The Gap generated a couple of months ago, but that change (abandoned in the wake of criticism) and Starbucks’ new logo may offer important lessons about when and why to change logos, and what not to change.  As I’ve expressed in a previous post, I’m skeptical about neuromarketing, but logos are one area where I think companies might do well to invest in a little neuroscience before making big changes.  Logos provide a sort of cognitive shortcut, and we humans are cognitive misers.  The more “iconic” a logo is, the more effective the cognitive shortcut.  To be iconic, a logo needs to be distinctive (not easily confused with other symbols) and consistently associated with a specific set of brand experiences.  Moreover, the iconic elements of a logo are usually basic properties like shape, color, typeface and spatial relationships within the logo.  The true test of an iconic element is the extent to which you can minimize or degrade the image (showing just a small part, for example) and still evoke the complete logo or the associated brand.  For example, in the hit CBS drama NCIS the main characters drink coffee from cups that have some sort of green and black circular logo with white lettering–just enough to suggest Starbucks.  I think one problem with The Gap’s proposed logo was that it changed several iconic elements at once (background color, typeface, and spatial relationships within the logo).

Having been on the inside of a few re-positioning efforts–some involving new logos or package designs–I think that we often over-intellectualize the meaning of the elements in a logo.  Starbucks appears to believe that the mermaid (or siren) in the center of the logo is the symbol of the brand, but some of the comments posted to their website indicate that many customers never really noticed that the element in the center of the logo was a mermaid.

I don’t know whether Starbucks (or The Gap, for that matter) did any consumer research in the course of developing their new logos.  However, the typical research methods for this type of thing rely on verbal responses, while logos operate mostly at a nonverbal level.  Perceptual research methods are more appropriate for this type of problem than focus groups and surveys.

Many companies have modified their logos, of course, but those that succeed seem to recognize that it’s more important to understand the Gestalt of the logo than any potential symbolic meaning of the logo or individual elements.  You can tweak a typeface, for example, to look more modern, or play with the saturation and tint of the color scheme, but if several elements change at the same time, or the spatial relationships change, you can expect the kind of feedback that The Gap and Starbucks have experienced.

Starbucks may well stick by its new logo.  In that case, I suspect customers will still be able to find a store when they want one, or recognize Starbucks coffee on the supermarket shelf.  My main issue with these logo changes is that they are superficial.  Consumers don’t buy “Starbucks” or “the siren”–they buy a particular constellation of experiences and solutions to problems.  If you get the latter right and you’re lucky enough to have an iconic logo, why abandon it?

Copyright 2011 by David G. Bakken.  All rights reserved.