Just as different authors define crowdsourcing in different ways, they also classify the types of crowdsourcing initiatives differently.
In an article published with my PhD supervisor in the journal “El Profesional de la Información”, entitled “Task-based classification of crowdsourcing initiatives”, we analyzed the classifications proposed by various authors (Howe; Brabham; Kleemann et al.; Geerts; Reichwald & Piller; Burger-Helmchen & Pénin) to develop a new, integrated typology. All these typologies share the same classification criterion: the type of task to be performed by the crowd.
Here is the typology we developed:
- Crowdcasting. In this type of crowdsourcing initiative, an individual, company or organization proposes a problem or task to the crowd, rewarding whoever solves it first or provides the best solution. A paradigmatic example is InnoCentive, a platform that hosts tasks such as developing a treatment to reduce the coefficient of friction in stainless steel parts, rewarded with $10,000. In these tasks, the crowd brings specific expertise in a particular area, solving a problem individually (Doan et al., 2011).
- Crowdcollaboration. This type covers initiatives in which communication takes place among the individuals in the crowd, while the company that initiated the process stays in the background. In these tasks, individuals contribute their knowledge to solve problems or generate ideas collaboratively (Doan et al., 2011), and there is usually no financial reward. We find two subtypes that differ in their ultimate goal:
- Crowdstorming. These initiatives are online brainstorming sessions in which ideas are proposed and the crowd gets involved through comments and votes, as on the IdeaJam platform.
- Crowdsupport. In this case, customers or users themselves solve the questions or problems of other users, without having to resort to an after-sales service. The difference from crowdstorming is that crowdsupport seeks to help, as in the case of Get Satisfaction.
- Crowdcontent. In these tasks, the crowd uses its labor and knowledge to create or find content of various kinds (Doan et al., 2011). It differs from crowdcasting in that crowdcontent is not a competition: each individual works separately, and at the end the contributions of all the participants are brought together. Three subtypes can be distinguished by their relation to the content involved:
- Crowdproduction. Here the crowd has to create content, for example by translating short text fragments or tagging images in some of the tasks proposed on Amazon Mechanical Turk.
- Crowdsearching. In this case, participants search for content available on the Internet for some purpose. Although there are whole projects based on this kind of task, such as Peer-to-Patent review, there are also smaller tasks, like some of those proposed on Amazon Mechanical Turk, for example finding the email addresses of certain kinds of companies.
- Crowdanalyzing. This case is similar to crowdsearching, with the difference that the search is performed not on Internet text documents but on multimedia documents such as images or videos. An example is Stardust@home, in which anyone can look for stardust by analyzing 3D images from the Stardust spacecraft.
- Crowdfunding. In this type of initiative, an individual, organization or company seeks funding from the crowd in exchange for some reward. For example, the Spanish film “El Cosmonauta” was financed this way: the producers offered funders merchandising or an appearance in the credits, among other rewards. In the world of sports, MyFootballClub is a soccer team funded in this way.
- Crowdopinion. In this case, the goal is to gather users’ views on a topic or product. A clear example is ModCloth, an online clothing shop where any registered user can review products that have not yet gone on sale, giving the company information about their potential market acceptance. Here the crowd offers its opinion or judgment to make assessments (Doan et al., 2011). Sometimes the users’ views are expressed not through votes but by buying and selling shares linked to an outcome, such as a presidential election. These crowdvoting initiatives rely on specialized platforms called “online prediction markets”, such as Intrade or Inkling Markets.
- Estellés-Arolas, Enrique; González-Ladrón-de-Guevara, Fernando. “Clasificación de iniciativas de crowdsourcing basada en tareas”. El Profesional de la Información, 2012 (in press).
- Doan, A.; Ramakrishnan, R.; Halevy, A. Y. “Crowdsourcing systems on the World-Wide Web”. Communications of the ACM, 2011, v. 54, n. 4, pp. 86-96.