A featured article by Elizabeth Sullivan in the February 28 issue of Marketing News reports on a trend among some companies to incorporate crowdsourcing into their innovation processes.  The basic idea behind “crowdsourcing-led innovation” is that a large and diverse pool of people is both more efficient and more effective at solving problems or generating ideas than any single individual or small group.

Crowdsourcing has its origins in open source software platforms that allow communities of programmers to develop applications, but the idea took on new meaning with “Web 2.0” and the growth of user-generated Internet content.  The concept got another big boost from the publication of The Wisdom of Crowds by James Surowiecki.  Surowiecki surveyed a lot of research to make his argument that sometimes crowds do a better job of solving problems than even the most gifted individuals.  And crowdsourcing seems to have found its way into brand positioning.  Microsoft has been airing a series of ads, most prominently during the Olympics, in which “ordinary” PC users claim credit for Windows 7 because of suggestions they made to Microsoft.

Because innovation is both important to most firms’ success and hard to do, it’s not surprising that any new approach that might improve the process would get a lot of attention.  Whether crowdsourcing will work for your business most likely will depend on the way you approach both crowdsourcing and innovation.

First, it’s probably worth mentioning that innovation comes in different varieties, with differing challenges.  Most innovation involves small incremental improvements to existing solutions.  These improvements may reflect small technical achievements or address “problems in use” expressed by current customers.  Sullivan cites the example of Dell Computer responding to customer requests to offer Linux as a pre-loaded operating system option.

One step up from small incremental improvements are innovations that require solutions to technical problems or contradictions.  For services, these may be “business process innovations.”  In products, these are more likely to involve some technological advance.  The development of superabsorbent gels, for example, changed disposable diapers, and Kodak found a technical solution in lens making that made a big improvement in the image quality produced by one-time-use (disposable) cameras.

Above these lower difficulty levels, innovation may require significant technological breakthroughs or even new scientific discovery.

Crowdsourcing comes in different varieties as well.  One form might be called the “web suggestion box.”  This form is somewhat unstructured, relying on text mining and other tools to find content on the web that reflects ideas, suggestions, and consumer needs or desires that might lead to new products.  Prediction markets are another form of crowdsourcing.  Challenges or contests comprise the third form of crowdsourcing.

The web suggestion box will most likely produce small incremental improvements in existing solutions rather than game-changing innovation.  Most consumers are good at identifying problems in use and dissatisfaction with existing solutions, but unable to come up with really new ideas.

Prediction markets are not idea generators so much as tools for evaluating ideas.  Some firms have employed internal prediction markets as part of the filtering process for new ideas, and Ely Dahan of UCLA has developed prediction markets to estimate the value of different product features (in effect, an alternative to conjoint analysis).  If the short life of Predictify is an indication, the early enthusiasm around prediction markets may be waning.  Predictify launched in 2007 as a service that allowed users to vote on potential outcomes for current news stories.  The site shut down in August 2009.
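The post doesn’t go into the mechanics, but many prediction markets run on an automated market maker such as Hanson’s logarithmic market scoring rule (LMSR), where the price of a “yes” share serves as the market’s implied probability of the outcome.  A minimal sketch, assuming a two-outcome market; the event, share counts, and liquidity parameter b are purely illustrative:

```python
import math

def lmsr_prices(quantities, b=100.0):
    """Instantaneous prices under the logarithmic market scoring rule:
    p_i = exp(q_i / b) / sum_j exp(q_j / b)."""
    exps = [math.exp(q / b) for q in quantities]
    total = sum(exps)
    return [e / total for e in exps]

def lmsr_cost(quantities, b=100.0):
    """Cost function C(q) = b * ln(sum_j exp(q_j / b)); a trade moving
    the market from q to q' costs the trader C(q') - C(q)."""
    return b * math.log(sum(math.exp(q / b) for q in quantities))

# Hypothetical market on "feature X ships next year": traders hold
# 60 'yes' shares and 20 'no' shares so far.
prices = lmsr_prices([60.0, 20.0])
# prices[0] is the market's implied probability that the feature ships.
```

The liquidity parameter b controls how much prices move per share traded, which is why this mechanism can double as an idea-evaluation tool: the aggregated price summarizes many participants’ beliefs in a single number.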

Netflix’s Cinematch prize contest has been perhaps the most visible example of challenge crowdsourcing.  Other examples can be found at the website of InnoCentive, which acts as a sort of clearinghouse, matching organizations that have problems to solve with individuals willing to take a crack at them.  This form of crowdsourcing is far different from both the web suggestion box and prediction markets.  For one thing, many (but not all) of the challenges offer some sort of monetary reward.  Second, challenges are usually targeted at specific skill sets.  In order to even think about competing for the Netflix prize, competitors needed a lot of math aptitude, some programming skills, and experience with (or the ability to master) machine learning and statistical modeling.

Challenge crowdsourcing seems to be a way to add a lot of headcount for specific projects without paying much for it.  Netflix offered a prize of $1 million for achieving a 10% improvement in the performance of a Cinematch algorithm.  If, as Elizabeth Sullivan reports, 50,000 “scientists” entered the competition, that works out to $20 per scientist, most of whom presumably spent at least a few hours working on the problem.
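The per-head figure is simple division, and it’s worth seeing how cheap the implied labor gets once you assume even modest effort per entrant.  A quick back-of-the-envelope check; the entrant count is the figure Sullivan reports, and the hours-per-entrant number is purely hypothetical:

```python
# Back-of-the-envelope economics of the Netflix prize, using the
# numbers cited in the post (50,000 entrants is Sullivan's figure;
# hours_each is a hypothetical illustration, not a reported number).
prize = 1_000_000                 # grand prize, in dollars
entrants = 50_000                 # reported number of "scientists" who entered
per_entrant = prize / entrants    # 20.0 dollars per entrant

hours_each = 5                                # hypothetical effort per entrant
implied_hourly = per_entrant / hours_each     # 4.0 dollars per hour
print(per_entrant, implied_hourly)            # prints: 20.0 4.0
```

Even at five hours apiece, the implied rate is a few dollars an hour, which is the sense in which challenge crowdsourcing adds headcount “without paying much for it.”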

The Netflix prize seems to have been particularly well designed to encourage collaboration and sharing, which may be the key to effective crowdsourcing.  Challenge crowdsourcing is most likely to work for fairly well-defined problems where “parallel processing” in the form of many reasonably skilled people working on the problem increases the likelihood of a solution.  For less well-defined problems, such as “reinventing the male grooming category,” I’m not sure that crowdsourcing will be as effective as more structured consumer insights approaches.

I doubt that most firms will abandon more traditional approaches to innovation, but those that embrace crowdsourcing may find that it does not quite live up to the promise.  After all, research on the effectiveness of brainstorming by groups of “average” individuals shows that, at best, it produces the same number of high quality ideas as one or a few very smart or specialist individuals would produce.  And let’s not forget the “auteur” model of innovation embodied by Steve Jobs.

Copyright 2010 by David G. Bakken.  All rights reserved.