In recent posts, I’ve discussed the advantages of using crowdsourcing and its drawbacks and complications in the business world. This analysis is fine, but if a company doesn’t find real value in crowdsourcing, if it doesn’t identify a task that crowdsourcing can solve, all of this remains anecdotal.
When companies hear about crowdsourcing, it is explained with clear examples, but these examples are often far removed from the company’s real situation. For instance, people usually talk about InnoCentive, where the tasks to solve are usually related to R&D; Threadless, where the task is to design T-shirts; or other platforms such as those focused on designing logos.
The problem is that for a company that already has a logo, whose business objective isn’t designing T-shirts, and which has to work hard to survive each month rather than to improve its R&D department (if it has one), crowdsourcing appears as a set of applications whose goals are far removed from its real, present objectives.
However, crowdsourcing is much more than R&D, T-shirt design and logo design, and it has a clear application in companies of any size. In this sense, one type of task that any company can carry out through crowdsourcing affordably, with virtually no risk and often with great benefit, is the microtask.
Microtasks are created by dividing a project into smaller, clearly defined tasks that can be performed independently.
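To make this concrete, here is a minimal sketch in Python of how a larger job (say, a text to be translated) could be split into independent microtasks. The function name, the sentence-based splitting rule and the dictionary shape are illustrative assumptions, not part of any platform’s API:

```python
import re

def split_into_microtasks(text, fragments_per_task=1):
    """Split a text into sentence-level fragments and bundle them into
    small, clearly defined microtasks that workers can do independently.
    (Illustrative sketch; names and structure are assumptions.)"""
    # Naive sentence split: break after '.', '!' or '?' followed by whitespace.
    sentences = [s.strip()
                 for s in re.split(r'(?<=[.!?])\s+', text.strip())
                 if s.strip()]
    # Each microtask groups a few consecutive sentences, so any worker
    # can complete it without seeing the rest of the project.
    return [
        {"task_id": idx, "fragment": " ".join(sentences[i:i + fragments_per_task])}
        for idx, i in enumerate(range(0, len(sentences), fragments_per_task))
    ]

tasks = split_into_microtasks(
    "Crowdsourcing is broad. It fits small firms. Microtasks are cheap."
)
for t in tasks:
    print(t)
```

Each resulting dictionary could then be published as one unit of work; raising `fragments_per_task` trades finer-grained parallelism for fewer, slightly larger tasks.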
One of the most popular platforms for these tasks is Amazon Mechanical Turk (AMT). On this platform, microtasks are called HITs (Human Intelligence Tasks), implying that they cannot be done by a computer or a particular piece of software but need the participation and intelligence of a person. In a recent webinar, it was noted that microtasks are those that take no more than an hour to complete (although this criterion may vary between companies).
In this webinar, Sharon Chiarella, Vice President of AMT, said that on this platform microtasks generally fall into four groups:
- Translation tasks. For example, a text is divided into smaller fragments (sentences or paragraphs), and the microtask is the translation of one or more of these fragments.
- Transcription tasks. The crowd is asked to put into writing what is said in an audio or video file, or even the text shown in an image.
- Research tasks. These tasks ask the crowd to answer, for example, questionnaires for psychological or sociological studies.
- Classification tasks. For example, tagging images by describing what can be seen in them, or categorizing a set of Twitter users.
Obviously, this doesn’t mean that other kinds of tasks can’t be proposed to the crowd. In fact, if you take a look at the HIT list, you can see other kinds of tasks:
- Resize images
- Create textual content by writing descriptions of 40 to 150 words
- Identify whether two images correspond to the same place
- Read a given text and record it as an audio file with correct pronunciation
Among the advantages of microtasking are the diversity and flexibility of the crowd. Moreover, these platforms have very large online communities that regularly perform microtasks, which means a large pool of experienced workers.