Mob wisdom means business

So-called 'crowdsourcing' lets companies create massive focus groups, garner fresh ideas, and even predict the future
  • Lena West (InfoWorld)
  • 04 January, 2008 07:14

Crowdsourcing, mob wisdom, interactive ideation, artificial artificial intelligence -- call it what you will, but the idea of tapping external collaborators to develop or enhance products and services is far from revolutionary.

What is revolutionary about this Web-era twist on traditional BI tools such as focus groups and customer surveys is the breadth and depth of intelligence that can be gathered, the ease with which such projects can be undertaken, and the scale of returns that can be achieved with little upfront investment.

Yet opening up corporate conundrums to crowds in search of answers is not for the faint of heart. A targeted approach is best for those seeking to cash in on what blogger and Wired Contributing Editor Jeff Howe calls the "application of Open Source principles to fields outside of software."

Buying in to mob wisdom

What differentiates crowdsourcing from focus groups is that focus groups typically bring a select number of people together in one location for a set period of time, whereas most crowdsourcing initiatives are open-ended, soliciting feedback from a swath of geographically dispersed participants who share little in common other than an interest in the topic at hand.

Also, as opposed to relying on traditional recording devices, checklists, and the famous two-way mirror, crowdsourcing aggregates feedback using customized Web-based databases that enable stakeholders to slice and dice data in ways heretofore unimaginable. Crowdsourcing's promise lies in its ability to uncover keen competitive insights in many directions at once.

Yet with corporate professionals still struggling for approval to launch a mere blog, those who see value in tapping the wisdom of crowds face an uphill battle in launching an initiative. After all, divulging corporate challenges to millions of people is not something many C-suite executives have the stomach for.

What's more, lingering disappointment with the results of old-guard methodologies has cast a pall over external collaboration as a viable BI-gathering technique. Focus groups and customer surveys have been lambasted for creating an uncommonly favorable set of circumstances for participants, resulting in little bottom-line insight, mostly because what people say and what they do are usually two different things.

James Surowiecki, author of "The Wisdom of Crowds: Why the Many Are Smarter than the Few and How Collective Wisdom Shapes Business, Economies, Societies, and Nations," points out another internal hurdle to launching a crowdsourcing initiative at most organizations.

"The barriers there tend to be more institutional or maybe even psychological," he says. "There might be a wariness on behalf of management of hearing what people actually think or the effects of that information on morale or strategy."

But if you think crowdsourcing is alive only among the hip, latte-drinking, highly funded Web 2.0 startup crowd, think again. Major brands such as Dell, Eli Lilly, Procter & Gamble, Google, and Best Buy are already reaching their corporate hands into the collective wisdom pot in hopes of extracting knowledge-laden honey.

The crowdsourcing conundrum

When it comes to launching a crowdsourcing initiative, knowing which corporate problems can withstand public scrutiny is key.

As Lloyd Tabb, CTO of LiveOps, says, "It's all about the value of the solution versus the cost of revealing the problem. So you have to evaluate on an individual basis: 'Is it worth revealing this problem so that I can get a solution?' "

Spearheading arguably one of the largest distributed call centers, with more than 16,000 active service agents, Tabb has deep roots in crowdsourcing. One of the company's founders, Tabb also previously worked as the information architect of Netscape Communicator, where he was a key player in establishing Open Directory -- a project that set the stage for products such as Wikipedia.

Kicked off in June 1998, with 5,000 people working from home to create and edit a Web-based content directory, Open Directory was one of the first crowdsourcing initiatives. Supplying its content freely to the public, Open Directory eventually became the indexing backbone for many of today's popular search engines.

"If a problem is that hard to solve," Tabb says, "then no one else is solving it either. If your competitors have figured it out, then you can copy them. But if no one has figured it out, then [the problem is] worth revealing."

Netflix, for example, wanted to improve its ability to predict whether customers would like a particular movie recommendation, based on their past preferences and the selections of similar individuals. To improve its chances, the company launched the Netflix Prize Web site, offering US$1 million for the best solution. The project, which has yet to award a grand prize, opened the industry leader's customer data sets to the public -- a significant competitive risk that Netflix believes will reap substantial rewards.
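The prize-winning entries were far more sophisticated than anything shown here, but the core idea -- predicting one customer's rating from the tastes of similar customers -- can be sketched with simple user-based collaborative filtering. The names and ratings below are made up for illustration.

```python
from math import sqrt

def cosine_sim(a, b):
    """Cosine similarity over the movies two users have both rated."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    dot = sum(a[m] * b[m] for m in common)
    na = sqrt(sum(a[m] ** 2 for m in common))
    nb = sqrt(sum(b[m] ** 2 for m in common))
    return dot / (na * nb) if na and nb else 0.0

def predict_rating(ratings, user, movie):
    """Similarity-weighted average of other users' ratings for `movie`."""
    num = den = 0.0
    for other, their in ratings.items():
        if other == user or movie not in their:
            continue
        w = cosine_sim(ratings[user], their)
        num += w * their[movie]
        den += w
    return num / den if den else None

# Hypothetical 1-to-5-star ratings from three customers.
ratings = {
    "ann": {"Heat": 5, "Alien": 4, "Big": 1},
    "bob": {"Heat": 5, "Alien": 5, "Big": 2, "Up": 4},
    "cam": {"Heat": 1, "Big": 5, "Up": 2},
}

# Ann's taste tracks Bob's, so her predicted rating for "Up" lands
# much closer to Bob's 4 stars than to Cam's 2.
print(round(predict_rating(ratings, "ann", "Up"), 2))  # → 3.44
```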

"It's about taking a Net-based economy, letting people operate freely within that economy, and getting information from that," Tabb explains, speaking of crowdsourcing's value proposition. "The underlying premise is that the market of the whole will predict better than any of the individuals in the market."

Proof of the crowdsourcing concept

For many organizations, crowdsourcing has become synonymous with predictive markets, which create and tap a community of people to help predict the outcomes of certain scenarios, such as a presidential election. Such markets have the potential to provide invaluable data that goes well beyond what can be gleaned from focus groups, especially when linked with granular demographic information about the participants.

Predictive markets are but one form of crowdsourcing. Another popular mode is "human computing," in which companies create online games for people to play; the outcome of the game is information. Google, for example, sought to index billions of photos. Rather than have folks on staff devote years to tagging and categorizing images, the company launched Google Image Labeler in September 2006.

Google's labeling game is based on the ESP Game created by Luis von Ahn at Carnegie Mellon University. Two randomly selected players are paired online, shown the same image, and asked to label it. The moment both type the same term, Google's system connects that term with that image. The reasoning is that if two people use the same term to describe an image, others likely will as well. The gaming element comes in because each player earns points and a ranking; top pairs and all-time top contributors are recognized on the Google Image Labeler home page.
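The matching mechanic, sometimes called "output agreement," is simple enough to sketch in a few lines. This is a stripped-down illustration of the rule, not Google's implementation; the label streams are hypothetical. The real game also declares some already-agreed labels "taboo" to push pairs toward more specific tags, which the sketch mimics with a `taboo` parameter.

```python
def play_round(labels_a, labels_b, taboo=()):
    """Return the first label both players submit for an image, in
    submission order; taboo (already-agreed) labels are ignored."""
    seen_a, seen_b = set(), set()
    for a, b in zip(labels_a, labels_b):
        if a not in taboo:
            seen_a.add(a)
        if b not in taboo:
            seen_b.add(b)
        # Agreement the instant one player's label matches anything
        # the partner has already typed.
        if a in seen_b and a not in taboo:
            return a
        if b in seen_a and b not in taboo:
            return b
    return None  # round timed out with no match

# Hypothetical label streams for one image of a dog chasing a ball.
print(play_round(["dog", "grass", "ball"], ["puppy", "ball", "dog"]))  # → ball
```

Making "ball" taboo in a later round would force the same pair to converge on the next-most-obvious shared label, "dog."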

Other initiatives forgo play for pay

Labeling its foray into crowdsourcing "artificial artificial intelligence," Amazon has established the Mechanical Turk project, essentially "piece work" for a knowledge-based economy.

With Mechanical Turk, companies create HITs (Human Intelligence Tasks). Everyday people accept these tasks in exchange for small sums of money. The HITs are tasks that humans can quickly and easily accomplish, but would take many hours of programming for computers to carry out -- such as examining a scanned receipt and pulling out specific pieces of data. Mechanical Turk helps companies answer business-essential rank-and-file questions, and the folks answering those questions get paid.
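The economics of the HIT model -- many small tasks, each paid on completion -- can be sketched with a toy in-memory queue. To be clear, this is an illustration of the piece-work pattern, not Amazon's actual Mechanical Turk API; all names and amounts are invented.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HIT:
    """One Human Intelligence Task: a question plus a small cash reward.
    (Illustrative only; not Amazon's real API.)"""
    question: str
    reward_cents: int
    answer: Optional[str] = None
    worker: Optional[str] = None

class PieceWorkQueue:
    def __init__(self):
        self.hits = []
        self.earnings = {}  # worker -> total cents earned

    def post(self, question, reward_cents):
        """Requester publishes a task with its reward."""
        self.hits.append(HIT(question, reward_cents))

    def accept_and_submit(self, worker, answer_fn):
        """Worker takes the oldest open HIT, answers it, and is paid."""
        for hit in self.hits:
            if hit.answer is None:
                hit.worker = worker
                hit.answer = answer_fn(hit.question)
                self.earnings[worker] = (
                    self.earnings.get(worker, 0) + hit.reward_cents
                )
                return hit
        return None  # no open tasks left

q = PieceWorkQueue()
q.post("Total on scanned receipt #1?", 3)  # 3 cents per receipt
q.post("Total on scanned receipt #2?", 3)
q.accept_and_submit("alice", lambda question: "$12.40")
q.accept_and_submit("bob", lambda question: "$7.15")
print(q.earnings)  # → {'alice': 3, 'bob': 3}
```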

Michael Dell raised eyebrows with the release of Dell IdeaStorm. One of the questions Dell put to this crowdsourcing market was, "What product do you really want us to create?" The overwhelming response: a laptop that ships with either no operating system or Linux preinstalled. Dell listened to its market, with significant results: Reports state that Dell has sold more than 40,000 laptops installed with the Linux-based Ubuntu OS.

Group crowdsourcing is also cropping up. InnoCentive describes its community as "open innovation." It is an alliance of various companies, called Seekers, designed to tap into the creativity and collective intelligence of a global network of subject matter experts, called Solvers. In InnoCentive's Open Innovation Marketplace, Solvers are handsomely rewarded with prizes reaching $100,000 for their solutions.

Ensuring optimal results

Crowdsourcing success very much depends on the quality and quantity of participation. The best way to maximize participation is to have a firm grasp on the desired outcome and to reward people for contributing. If the desired outcome is to get as much information about a particular topic as possible, then the rules of the "game" need to reward participants who deliver quality information about that topic and penalize those who don't.

It's easy to see how crowdsourcing, by its nature, can quickly get out of hand. It's helpful to staff your crowdsourcing team with people who can think like both computer scientists and economists -- able to see 10 moves ahead. The ability to look for unintended as well as intended outcomes and to determine how to manage those facets for the betterment of the project is a critical skill.

Once your initiative has been launched, keeping a close eye on crowd engagement is essential to success. This means more than just noting whether folks are signing up and participating. You must continually assess whether your ability to forecast outcomes has improved as a result of your crowdsourcing data or, qualitatively speaking, whether the opportunities surfaced are gravitating toward solutions that make sense to implement.

Here, time is an important factor. In the short run, flukes are possible, but long-term improvement tracking is a key component of crowdsourcing success. Establish policies for analyzing results with the long term in mind.
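One concrete way to track long-run forecast quality, offered here as a suggestion rather than anything the companies in this story are known to use, is a proper scoring rule such as the Brier score (lower is better), averaged over a rolling window of resolved predictions. The forecast history below is fabricated for illustration.

```python
def brier(prob, outcome):
    """Squared error between a forecast probability and the 0/1 outcome."""
    return (prob - outcome) ** 2

def rolling_brier(history, window=4):
    """Mean Brier score over the most recent `window` resolved forecasts."""
    recent = history[-window:]
    return sum(brier(p, o) for p, o in recent) / len(recent)

# (forecast probability, actual outcome) pairs, oldest first -- made up.
history = [(0.9, 0), (0.4, 1), (0.7, 1), (0.8, 1), (0.9, 1), (0.85, 1)]

early = rolling_brier(history[:4])  # score over the first four forecasts
late = rolling_brier(history[2:])   # score over the last four
print(early > late)  # crowd's forecasting improved over time → True
```

Comparing windows like this filters out the short-run flukes: a single lucky call barely moves a four-forecast average, but a sustained drop in the score is real improvement.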

The value of predictions

The most popular use of crowdsourcing by far is the prediction market, in which participants are given fake money or stock credits to use in trading-style transactions.

Prediction markets use the flow of the crowd's currency to predict the occurrence of future events. Accurate predictions are important because they help companies allocate resources more effectively and benefit financially from projections.
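One standard mechanism for turning currency flow into a probability, not necessarily the one any company named in this article uses, is Hanson's logarithmic market scoring rule (LMSR). In the two-outcome sketch below, the instantaneous price of a YES share is the market's implied probability of the event; all parameters are illustrative.

```python
from math import exp, log

class LMSRMarket:
    """Two-outcome play-money prediction market using the logarithmic
    market scoring rule. `b` controls liquidity: larger b means prices
    move less per share traded."""

    def __init__(self, b=100.0):
        self.b = b
        self.q_yes = 0.0  # outstanding YES shares
        self.q_no = 0.0   # outstanding NO shares

    def _cost(self, q_yes, q_no):
        return self.b * log(exp(q_yes / self.b) + exp(q_no / self.b))

    def price_yes(self):
        """Current YES price = market's implied probability of the event."""
        e_y = exp(self.q_yes / self.b)
        e_n = exp(self.q_no / self.b)
        return e_y / (e_y + e_n)

    def buy(self, outcome, shares):
        """Return the play-money cost of buying `shares` of YES or NO."""
        before = self._cost(self.q_yes, self.q_no)
        if outcome == "yes":
            self.q_yes += shares
        else:
            self.q_no += shares
        return self._cost(self.q_yes, self.q_no) - before

m = LMSRMarket(b=100)
print(round(m.price_yes(), 2))  # → 0.5 before any trading
m.buy("yes", 50)                # traders spend currency backing the event
print(round(m.price_yes(), 2))  # implied probability rises above 0.5
```

The automated market maker always quotes a price, so thin participation never stalls trading, which is one reason this rule is popular for small internal markets.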

Best Buy used an internal prediction market to determine when it would open its first store in China. The winning employees were rewarded with gift cards, and Best Buy was rewarded with near-prescient knowledge of when to prepare for its China debut.

For the past two years, Google has operated a very successful internal crowdsourcing project open only to its employees. The project forecasts product launch dates, new office openings, and other strategic corporate decisions. Google's employees have accurately predicted the probability of more than 200 separate events, which might be one of the reasons Google is able to gain and maintain such wide competitive margins.

Crowdsourcing at your service

Several software development firms make it easier for companies to engage in predictive markets by selling or leasing a hosted, turnkey version of predictive market software. Two such companies are Inkling and Predictify.

Inkling has developed a product called Inkling Markets that helps customers tap the collective wisdom of partners, employees, and customers.

"In most companies, people work in one department, and although they have been 'narrowcasted' to focus on one position, they probably have a good idea of what's going on in other areas of the company," says Adam Siegel, CEO and co-founder of Inkling. "And they probably have informed opinions about all sorts of topics. Before prediction markets, there wasn't really a good way to capture the collective intelligence within a corporation."

ABC7, the San Francisco affiliate of the ABC network, has implemented Inkling Markets on the station's Web site as the ABC7 Futures Market. ABC7 asks its viewers questions about local and international news items to generate both short- and long-range predictions. Questions have ranged from "Who will be Barbara Walters' most fascinating person of 2007?" to "Will the price of oil top US$100 a barrel in 2007?"

Ellen Conlan, vice president of station marketing and research at ABC7, says, "We refer to the prediction market results in our newscasts. It performs the function of a poll, but it gives us a better indication not of what the predictors want to happen -- as is the case with a poll -- but rather what they think is most likely to happen. So it tends to be very accurate."

Predictify offers the same type of service, except that, in addition to working with companies, it opens its prediction markets to the public. Anyone can pose a question to the market at any time.

"We also collect demographic information about those users, and we allow the question-askers to filter the data based on those demographic attributes. So you can understand not just what the crowd is saying but who they are, which allows you to see that women are more optimistic than men about a certain topic, and evaluate what that means for you," says Parker Barrile, co-founder and CEO of Predictify.

Aggregation and scarcity in predictive markets

Predictive markets are known to be extremely effective in forecasting political elections. They also help candidates see which issues they lead on, as well as the areas where their messaging needs work.

Smaller civic organizations can also benefit. The University of Iowa Democrats, the largest Democratic club in the state of Iowa, uses Predictify's tool to pull greater insights out of political public opinion polls.

Atul Nakhasi, a junior at the University of Iowa and president of the University of Iowa Democrats, is amazed at the insights.

"We're going to have 120,000 caucus-goers in Iowa on the evening of Jan. 3, and if you just have two or three people making decisions, it's not going to be reflective of the whole group. But now, if we can expand this throughout our entire membership and organization, we're going to start reflecting the thoughts of the state of Iowa and coming up with a more accurate picture than what the polls are saying," Nakhasi says.

The key to success in using predictive markets is to create the right amount of scarcity. If the online "market" that a company creates has too much of what it values as currency, participants' "purchases" don't predict anything because everyone is buying everything simply because they have the means to do so.

For most b-to-c companies, it's much easier to predict outcomes by studying the average budget-conscious consumer than it is to study multibillionaires.

The cautious approach to collective wisdom

Several factors are contributing to the rise of crowdsourcing in today's competitive business landscape. Foremost is the fact that organizations can gain hard-to-obtain information without a lot of investment, since the economic barrier to entry for crowdsourcing initiatives is very low.

Also, there are fewer moving parts in crowdsourcing than in traditional external collaboration projects, such as focus groups and customer surveys. Once the project is launched, participation is not likely to require screening committees or active monitoring. The data-rich nature of the Web -- when coupled with a well-designed database and UI -- takes care of that automatically.

But the real carrot for companies looking into crowdsourcing is the knowledge that, when it comes to forecasting, wisdom is collective.

But for all its upsides, crowdsourcing does not work for every project or company. As a rule, crowdsourcing best fits structured transactions -- such as assigning keywords to an image, buying and selling stock, taking phone calls -- as opposed to more amorphous, customized tasks such as developing a marketing plan or corporate strategy. Let's not forget, opening the gates to the masses does open your company to the usual risks.

Netflix's predictive recommendations project, for example, almost burned corporate shorts when two computer scientists from the University of Texas demonstrated that, for movies outside the 100 most popular, user ratings and the dates of those ratings, when coupled with reviews found elsewhere on the Internet, could be used to identify customers who were supposed to remain anonymous.

Yet for many organizations, there is just too much untapped knowledge within the company walls to forgo giving crowdsourcing at least an in-house chance.

As "The Wisdom of Crowds" author Surowiecki says, "Set aside the question of trying to reach outside the organization. One of the things that companies need to do a better job of [is] tapping the collective knowledge of the people inside their organizations. Just doing that would be an important first step."