Monday, April 29, 2013

Crowdsourcing Caught the Boston Bombers

I think it is interesting that crowdsourcing caught the bombers within about four days. Yes, there were other factors, like the bombers not getting very far from the scene and not leaving the country. But now that crowdsourcing has proven so effective, I think others in the U.S. and around the world will also use it to quickly capture terrorists and criminals in this way.

However, it also concerns me, because some countries might not tell the truth about people, or there might be erroneous information out there, and the wrong people could be brought to justice by a misinformed mob. Like many other tools, crowdsourcing can be used for good ends or misused, so I think you will see it used both ways in the future. It is something to use very carefully, so that only good results come from whatever crowdsourcing you participate in.

Crowdsourcing - Wikipedia, the free encyclopedia

en.wikipedia.org/wiki/Crowdsourcing

Crowdsourcing

From Wikipedia, the free encyclopedia
Crowdsourcing is, according to the Merriam-Webster Dictionary, the practice of obtaining needed services, ideas, or content by soliciting contributions from a large group of people, and especially from an online community, rather than from traditional employees or suppliers.[1] Often used to subdivide tedious work or to raise funds for startup companies and charities, this process can occur both online and offline.[2] The general concept is to combine the efforts of crowds of volunteers or part-time workers, where each one contributes a small portion that adds up to a relatively large or significant result. Crowdsourcing differs from ordinary outsourcing in that the task or problem is outsourced to an undefined public rather than to a specific, named group.
Although the word "crowdsourcing" was coined in 2006, it can apply to a wide range of activities.[3] Crowdsourcing can involve division of labor for tedious tasks split to use crowd-based outsourcing, but it can also apply to specific requests, such as crowdvoting, crowdfunding, a broad-based competition, and a general search for answers, solutions, or a missing person.

Definitions

Jeff Howe, contributing editor at Wired Magazine, posited the first definition of "crowdsourcing" in a companion blog post to his June 2006 Wired magazine article:
"Simply defined, crowdsourcing represents the act of a company or institution taking a function once performed by employees and outsourcing it to an undefined (and generally large) network of people in the form of an open call. This can take the form of peer-production (when the job is performed collaboratively), but is also often undertaken by sole individuals. The crucial prerequisite is the use of the open call format and the large network of potential laborers."[4]
The first written use of the word "crowdsourcing" was by Steve Jurvetson in February 2006, to describe a collective effort to manage an online discussion forum on flickr.
Daren C. Brabham was the first to define "crowdsourcing" in the scientific literature in a February 1, 2008, article:
"Crowdsourcing is an online, distributed problem-solving and production model."[5]
In the classic use of the term, problems are broadcast to an unknown group of solvers in the form of an open call for solutions. Users—also known as the crowd—submit solutions which are then owned by the entity which broadcasted the problem—the crowdsourcer. In some cases, the contributor of the solution is compensated monetarily, with prizes, or with recognition. In other cases, the only rewards may be kudos or intellectual satisfaction. Crowdsourcing may produce solutions from amateurs or volunteers, working in their spare time, or from experts or small businesses which were unknown to the initiating organization.[2]
Crowdsourcers are primarily motivated by its benefits. One of these includes the ability to gather large numbers of solutions and information at a relatively inexpensive cost. Users are motivated to contribute to crowdsourced tasks by both intrinsic motivations, such as social contact, intellectual stimulation, and passing time, and by extrinsic motivations, such as financial gain.
Because the limits of crowdsourcing are blurred, many collaborative activities are considered crowdsourcing even when they are not. Another consequence of this situation is the proliferation of definitions in the scientific literature.[6] Different authors give different definitions of crowdsourcing according to their specialties, and in doing so lose sight of the global picture of the term.
After studying more than 40 definitions of crowdsourcing in the scientific and popular literature, Enrique Estellés-Arolas and Fernando González Ladrón-de-Guevara developed a new integrating definition:
"Crowdsourcing is a type of participative online activity in which an individual, an institution, a non-profit organization, or company proposes to a group of individuals of varying knowledge, heterogeneity, and number, via a flexible open call, the voluntary undertaking of a task. The undertaking of the task, of variable complexity and modularity, and in which the crowd should participate bringing their work, money, knowledge and/or experience, always entails mutual benefit. The user will receive the satisfaction of a given type of need, be it economic, social recognition, self-esteem, or the development of individual skills, while the crowdsourcer will obtain and utilize to their advantage that what the user has brought to the venture, whose form will depend on the type of activity undertaken".[6]
Henk van Ess emphasizes the need to "give back" the crowdsourced results to the public on ethical grounds. His non-scientific, non-commercial definition is widely cited in the popular press:
"Crowdsourcing is channeling the experts’ desire to solve a problem and then freely sharing the answer with everyone" [7]
Crowdsourcing systems are used to accomplish a variety of tasks. For example, the crowd may be invited to develop a new technology, carry out a design task (also known as community-based design[8] or distributed participatory design), refine or carry out the steps of an algorithm (see human-based computation), or help capture, systematize, or analyze large amounts of data (see also citizen science).

History

The term "crowdsourcing" is a portmanteau of "crowd" and "outsourcing," coined by Steve Jurvetson online and then published by Jeff Howe in a June 2006 Wired magazine article "The Rise of Crowdsourcing".[2] It has been argued that crowdsourcing can only exist on the Internet and is thus a relatively recent phenomenon;[5] however, long before modern crowdsourcing systems were developed, there were a number of notable examples of projects that utilized distributed people to help accomplish tasks.

Historical examples

The Oxford English Dictionary

The Oxford English Dictionary (OED) may provide one of the earliest examples of crowdsourcing. An open call was made to the community for contributions by volunteers to index all words in the English language and example quotations of their usages for each one. They received over 6 million submissions over a period of 70 years. The making of the OED is detailed in The Surgeon of Crowthorne, by Simon Winchester.[9]

Crowdsourcing in genealogy research

Genealogical research was using crowdsourcing techniques long before personal computers were common. Beginning in 1942 members of The Church of Jesus Christ of Latter-Day Saints (LDS Church) encouraged members to submit information about their ancestors. The submitted information was gathered together into a single collection. In 1969 in order to encourage more people to participate in gathering genealogical information about their ancestors, the church started the three-generation program. In this program church members were asked to prepare documented family group record forms for the first three generations. The program was later expanded to encourage members to research at least 4 generations and became known as the four-generation program.[10]
Institutes that have records of interest to genealogical research have used crowds of volunteers to create catalogs and indexes to records.

Early crowdsourcing competitions

Crowdsourcing has often been used in the past as a competition to discover a solution. The French government proposed several of these competitions, often rewarded with Montyon Prizes, created for poor Frenchmen who had done virtuous acts.[11] These included the Alkali Prize, which rewarded a process for producing alkali from common salt and led to the Leblanc process, and Fourneyron's turbine, the first commercial hydraulic turbine.[12]
In response to a challenge from the French government, Nicolas Appert won a prize for inventing a new way of food preservation that involved sealing food in airtight jars.[13] The British government offered a similar reward, the Longitude Prize, for an easy way to determine a ship's longitude. During the Great Depression, out-of-work clerks tabulated higher mathematical functions in the Mathematical Tables Project as an outreach project.[14]

Modern methods

Today, crowdsourcing has transferred mainly to the Internet. The Internet provides a particularly good venue for crowdsourcing since individuals tend to be more open in web-based projects where they are not being physically judged or scrutinized and thus can feel more comfortable sharing. This ultimately allows for well-designed artistic projects because individuals are less conscious, or maybe even less aware, of scrutiny towards their work. In an online atmosphere, more attention can be given to the specific needs of a project, rather than spending as much time in communication with other individuals.[15]
Crowdsourcing can either take an explicit or an implicit route. Explicit crowdsourcing lets users work together to evaluate, share, and build different specific tasks, while implicit crowdsourcing means that users solve a problem as a side effect of something else they are doing.
With explicit crowdsourcing, users can evaluate particular items like books or webpages, or share by posting products or items. Users can also build artifacts by providing information and editing other people's work.
Implicit crowdsourcing can take two forms: standalone and piggyback. Standalone allows people to solve problems as a side effect of the task they are actually doing, whereas piggyback takes users' information from a third-party website to gather information.[16]

Types of crowdsourcing

In coining the term "crowdsourcing", Jeff Howe also indicated some common categories of crowdsourcing that can be used effectively in the commercial world. Some of these web-based crowdsourcing efforts include crowdvoting, wisdom of the crowd, crowdfunding, microwork, creative crowdsourcing, and inducement prize contests. Although this may not be an exhaustive list, it covers the current major ways in which people use crowds to perform tasks.[17]
According to a definition by Henk van Ess that has been widely cited in the popular press,
"The crowdsourced problem can be huge (epic tasks like finding alien life or mapping earthquake zones) or very small ('where can I skate safely?'). Some examples of successful crowdsourcing themes are problems that bug people, things that make people feel good about themselves, projects that tap into niche knowledge of proud experts, subjects that people find sympathetic or any form of injustice."[18]

Crowdvoting

Crowdvoting occurs when a website gathers a large group's opinions and judgment on a certain topic. The Iowa Electronic Market is a prediction market that gathers crowds' views on politics and tries to ensure accuracy by having participants pay money to buy and sell contracts based on political outcomes.[19]
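
As a rough sketch of how such a prediction market aggregates crowd opinion, assume winner-take-all contracts that pay $1 if an outcome occurs; the traded price can then be read as the crowd's implied probability of that outcome. The prices below are hypothetical, not real IEM data.

    def implied_probabilities(prices):
        # Treat each contract price (in dollars, for a $1 payoff) as the
        # crowd's implied probability of that outcome, then normalize so
        # the probabilities sum to 1.
        total = sum(prices.values())
        return {outcome: price / total for outcome, price in prices.items()}

    # Hypothetical last-trade prices for two mutually exclusive outcomes.
    prices = {"Candidate A wins": 0.62, "Candidate B wins": 0.41}
    for outcome, p in implied_probabilities(prices).items():
        print(f"{outcome}: {p:.1%}")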
Threadless.com selects the t-shirts it sells by having users provide designs and vote on the ones they like, which are then printed and made available for purchase. Despite the company's small size, thousands of members provide designs and vote on them, making the website's products truly created and selected by the crowd rather than by the company alone.[5] Some of the most famous examples have made use of social media channels: Domino's Pizza, Coca-Cola, Heineken, and Sam Adams have thus crowdsourced a new pizza, song, bottle design, and beer, respectively.[20]

Crowdsourcing creative work

Creative crowdsourcing spans creative projects such as graphic design, architecture, apparel design, writing, and illustration.

Crowdfunding

Crowdfunding is the process of funding a project through a multitude of people, each contributing a small amount, in order to attain a certain monetary goal.[21] Goals may be for donations or for equity in a project. The dilemma for equity crowdfunding in the USA at present is how the SEC is going to regulate the process. Rules and regulations are still being refined by the SEC, which has until January 1, 2013 to tweak the fundraising methods. The regulators are on edge because they are already overwhelmed trying to regulate Dodd-Frank and all the other rules and regulations involving public companies and the way they trade. Advocates of regulation claim that crowdfunding will open the floodgates for fraud, have called it the "wild west" of fundraising, and have compared it to the 1980s days of penny stock "cold-call cowboys." The process allows up to $1 million to be raised without many of the usual regulations being involved. Companies under the current proposal will have many exemptions available and will be able to raise capital from a larger pool of persons, with much lower thresholds for investor criteria, whereas the old rules required that the person be an "accredited" investor. Contributors are often recruited from social networks, and the funds can be acquired through an equity purchase, loan, donation, or pre-order. The amounts collected have become quite high, with requests of over a million dollars for software such as Trampoline Systems, which used crowdfunding to finance the commercialization of its new software.[22]
A well-known crowdfunding tool is Kickstarter, which is the biggest website for funding creative projects. It has raised over $100 million, despite its all-or-nothing model which requires one to reach the proposed monetary goal in order to acquire the money. UInvest is another example of a crowdfunding platform that was started in Kiev, Ukraine in 2007. Crowdrise brings together volunteers to fundraise in an online environment.[23]
Most recently, the adult industry gained its own site in the way of Offbeatr. Offbeatr allows the community to cast votes on projects they would like to see make it to the funding phase.[24]

"Wisdom of the crowd"

Wisdom of the crowd is another type of crowdsourcing that collects large amounts of information and aggregates them to gain a complete and accurate picture of a topic, based on the idea that a group of people is, on average, more knowledgeable than an individual. This idea of collective intelligence proves particularly effective on the Internet because people from diverse backgrounds can contribute in real-time within the same forums.[5]
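As a rough sketch of this aggregation idea, the simulation below generates many noisy individual guesses around a true value and compares the crowd's average with a typical individual's error; all numbers are simulated and purely illustrative.

    import random

    random.seed(42)
    TRUE_VALUE = 1000  # e.g. the number of jellybeans in a jar
    # 500 independent, noisy individual guesses around the true value.
    crowd = [random.gauss(TRUE_VALUE, 300) for _ in range(500)]

    crowd_average = sum(crowd) / len(crowd)
    average_individual_error = sum(abs(g - TRUE_VALUE) for g in crowd) / len(crowd)

    print(f"crowd average: {crowd_average:.0f} (error {abs(crowd_average - TRUE_VALUE):.0f})")
    print(f"average individual error: {average_individual_error:.0f}")
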
iStockPhoto provides a platform for people to upload photos and purchase them for low prices. Clients can purchase photos through credits, giving photographers a small profit. Again, the photo collection is determined by the crowd's voice for very low prices.[5]
In February 2012, a stock picking game called Ticker Picker Pro was launched, using crowdsourcing to create a hedge fund that would buy and sell stocks based on the ideas coming out of the game. These crowdsourced ideas, coming from so many people, could help one pick the best stocks based on this idea that collective ideas are better than individual ones.[25]

Microwork

Microwork is a form of crowdsourcing in which users complete, for small amounts of money, tasks that computers lack the aptitude to do. Amazon's popular Mechanical Turk has created many different projects for users to participate in, where each task requires very little time and offers a very small payment.[2] The Chinese versions of this, commonly called Witkey, are similar and include sites such as Taskcn.com and k68.cn. When choosing tasks, since only certain users "win", users learn to submit later and pick less popular tasks in order to increase the likelihood of getting their work chosen.[26] An example of a Mechanical Turk project is when users searched satellite images for a boat in order to find the lost researcher Jim Gray.[16]

Inducement prize contests

Web-based idea competitions, or inducement prize contests, often consist of generic ideas, cash prizes, and an Internet-based platform to facilitate easy idea generation and discussion. An example of these competitions is IBM's 2006 "Innovation Jam", attended by over 140,000 international participants and yielding around 46,000 ideas.[27][28] Another example is the Netflix Prize, awarded in 2009. The idea was to ask the crowd to come up with a recommendation algorithm more accurate than Netflix's own. The grand prize of US$1,000,000 was awarded to the BellKor's Pragmatic Chaos team, which bested Netflix's own algorithm for predicting ratings by 10.06%.
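For readers curious how a figure like "10.06% better" is computed, the snippet below shows the standard relative RMSE improvement calculation; the two error scores are approximate values used only for illustration.

    def percent_improvement(baseline_rmse, new_rmse):
        # Relative reduction in root-mean-square error, as a percentage.
        return 100.0 * (baseline_rmse - new_rmse) / baseline_rmse

    cinematch_rmse = 0.9525  # approximate score of Netflix's own system
    winner_rmse = 0.8567     # approximate score of the winning team
    print(f"improvement: {percent_improvement(cinematch_rmse, winner_rmse):.2f}%")
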
Another example of competition-based crowdsourcing is the 2009 DARPA balloon experiment, in which DARPA placed 10 balloon markers across the United States and challenged teams to compete to be the first to report the locations of all the balloons. A collaboration of efforts was required to complete the challenge quickly, and in addition to the competitive motivation of the contest as a whole, the winning team (MIT, in less than nine hours) established its own "collaborapetitive" environment to generate participation.[29] A similar challenge was the Tag Challenge, funded by the US State Department, which required locating and photographing individuals in five cities in the US and Europe within 12 hours based only on a single photograph. The winning team managed to locate three suspects by mobilizing volunteers worldwide using an incentive scheme similar to the one used in the Balloon Challenge.[30]
Open innovation platforms are a very effective way of crowdsourcing people's thoughts and ideas to do research and development. The company InnoCentive is a crowdsourcing platform for corporate research and development where difficult scientific problems are posted for crowds of solvers to discover the answer and win a cash prize, which can range from $10,000 to $100,000 per challenge.[5] InnoCentive, of Waltham, MA and London, England is the leader in providing access to millions of scientific and technical experts from around the world. The company has provided expert crowdsourcing to international Fortune 1000 companies in the US and Europe as well as government agencies and nonprofits. The company claims a success rate of 50% in providing successful solutions to previously unsolved scientific and technical problems. IdeaConnection.com challenges people to come up with new inventions and innovations and Ninesigma.com connects clients with experts in various fields. The X PRIZE Foundation creates and runs incentive competitions where one can win between $1 million and $30 million for solving challenges. Local Motors is another example of crowdsourcing. A community of 20,000 automotive engineers, designers and enthusiasts competes to build offroad rally trucks.[23]

Implicit crowdsourcing

Implicit crowdsourcing is less obvious because users do not necessarily know they are contributing, yet can still be very effective in completing certain tasks. Rather than users actively participating in solving a problem or providing information, implicit crowdsourcing involves users doing another task entirely where a third party gains information for another topic based on the user's actions.[5]
A good example of implicit crowdsourcing is the ESP game, where users guess what images are and then these labels are used to tag Google images. Another popular use of implicit crowdsourcing is through reCAPTCHA, which asks people to solve Captchas in order to prove they are human, and then provides Captchas from old books that cannot be deciphered by computers in order to try and digitize them for the web. Like Mechanical Turk, this task is simple for humans but would be incredibly difficult for computers.[16]
Piggyback crowdsourcing is seen most frequently on websites such as Google that data-mine users' search histories and websites in order to discover keywords for ads, spelling corrections, and synonyms. In this way, users are unintentionally helping to modify existing systems, such as Google AdWords.[31]

Crowdsourcers

There are a number of motivations for businesses to use crowdsourcing to accomplish tasks, find solutions for problems, or to gather information. These include the ability to offload peak demand, access cheap labor and information, generate better results, access a wider array of talent than might be present in one organization, and undertake problems that would have been too difficult to solve internally.[32] Crowdsourcing allows businesses to submit problems on which contributors can work, such as problems in science, manufacturing, biotech, and medicine, with monetary rewards for successful solutions. Although it can be difficult to crowdsource complicated tasks, simple work tasks can be crowdsourced cheaply and effectively.
Crowdsourcing also has the potential to be a problem-solving mechanism for government and nonprofit use. Urban and transit planning are prime areas for crowdsourcing. One project to test crowdsourcing's public participation process for transit planning in Salt Lake City was underway from 2008 to 2009, funded by a U.S. Federal Transit Administration grant.[33] Another notable application of crowdsourcing to government problem solving is the Peer to Patent Community Patent Review project for the U.S. Patent and Trademark Office.[34]
Researchers have used crowdsourcing systems, in particular Mechanical Turk, to aid research projects by crowdsourcing some aspects of the research process, such as data collection, parsing, and evaluation. Notable examples include using the crowd to create speech and language databases,[35][36] and using the crowd to conduct user studies.[31] Crowdsourcing systems provide these researchers with the ability to gather large amounts of data. Additionally, using crowdsourcing, researchers can collect data from populations and demographics they may not have had access to locally, which improves the validity and value of their work.[37]
Artists have also utilized crowdsourcing systems. In his project the Sheep Market, Aaron Koblin used Mechanical Turk to collect 10,000 drawings of sheep from contributors around the world.[38] Sam Brown (artist) leverages the crowd by asking visitors of his website explodingdog to send him sentences that he uses as inspirations for paintings.[39] Art curator Andrea Grover argues that individuals tend to be more open in crowdsourced projects because they are not being physically judged or scrutinized.[15] As with other crowdsourcers, artists use crowdsourcing systems to generate and collect data. The crowd also can be used to provide inspiration and to collect financial support for an artist's work.[40]
Additionally, crowdsourcing from 100 million drivers is being used by INRIX to collect users' driving times to provide better GPS routing and real-time traffic updates.

Demographics

The crowd is an umbrella term for people who contribute to crowdsourcing efforts. Though it is sometimes difficult to gather data about the demographics of the crowd, a study by Ross et al. surveyed the demographics of a sample of the more than 400,000 registered crowdworkers using Amazon Mechanical Turk to complete tasks for pay. While a previous study in 2008 by Ipeirotis found that users at that time were primarily American, young, female, and well-educated, with 40% having incomes >$40,000/yr, in 2009 Ross found a very different population. By Nov. 2009, 36% of the surveyed Mechanical Turk workforce was Indian. Two-thirds of Indian workers were male, and 66% had at least a Bachelor’s degree. Two-thirds had annual incomes less than $10,000/yr, with 27% sometimes or always depending on income from Mechanical Turk to make ends meet.[41]
The average US user of Mechanical Turk earned $2.30 per hour for tasks in 2009, versus $1.58 for the average Indian worker. While the majority of users worked less than 5 hours per week, 18% worked 15 hours per week or more. This is less than minimum wage in either country, which Ross suggests raises ethical questions for researchers who use crowdsourcing.
The demographics of http://microworkers.com/ differ from those of Mechanical Turk in that the US and India together account for only 25% of workers. 197 countries are represented among users, with Indonesia (18%) and Bangladesh (17%) contributing the largest share. However, 28% of employers are from the US.[42]
Another study of the demographics of the crowd at iStockphoto found a crowd that was largely white, middle- to upper-class, higher educated, worked in a so-called "white collar job" and had a high-speed Internet connection at home.[43]
Studies have also found that crowds are not simply collections of amateurs or hobbyists. Rather, crowds are often professionally trained in a discipline relevant to a given crowdsourcing task and sometimes hold advanced degrees and many years of experience in the profession.[43][44][45][46] Claiming that crowds are amateurs, rather than professionals, is both factually untrue and may lead to marginalization of crowd labor rights.[47]

Motivations

Many scholars of crowdsourcing suggest that there are both intrinsic and extrinsic motivations that cause people to contribute to crowdsourced tasks, and that these factors influence different types of contributors.[43][44][46][48][49][50] For example, students and people employed full-time rate Human Capital Advancement as less important than part-time workers do, while women rate Social Contact as more important than men do.[48]
Intrinsic motivations are broken down into two categories: enjoyment-based and community-based motivations. Enjoyment-based motivations refer to motivations related to the fun and enjoyment that the contributor experiences through their participation. These motivations include: skill variety, task identity, task autonomy, direct feedback from the job, and pastime. Community-based motivations refer to motivations related to community participation, and include community identification and social contact.
Extrinsic motivations are broken down into three categories: immediate payoffs, delayed payoffs, and social motivations. Immediate payoffs, through monetary payment, are the immediately received compensations given to those who complete tasks. Delayed payoffs are benefits that can be used to generate future advantages, such as training skills and being noticed by potential employers. Social motivations are the rewards of behaving pro-socially, such as altruistic motivations. Chandler and Kapelner found that US users of the Amazon Mechanical Turk were more likely to complete a task when told they were going to “help researchers identify tumor cells,” than when they were not told the purpose of their task. However, of those who completed the task, quality of output did not depend on the framing of the task.[48]
Another form of social motivation is prestige or status. The International Children's Digital Library recruits volunteers to translate and review books. Because all translators receive public acknowledgment for their contributions, Kaufman and Schulz cite this as a reputation-based strategy to motivate individuals who want to be associated with institutions that have prestige. The Amazon Mechanical Turk uses reputation as a motivator in a different sense, as a form of quality control. Crowdworkers who frequently complete tasks in ways judged to be inadequate can be denied access to future tasks, providing motivation to produce high-quality work.[51]

Criticisms

There are two major categories of criticisms about crowdsourcing: (1) the value and impact of the work received from the crowd, and (2) the ethical implications of low wages paid to crowdworkers. Most of these criticisms are directed towards crowdsourcing systems that provide extrinsic monetary rewards to contributors, though some apply more generally to all crowdsourcing systems.

Impact of crowdsourcing on product quality

Crowdsourced work is susceptible to faulty results caused by targeted, malicious work efforts. Since crowdworkers completing microtasks are paid per task, there is often a financial incentive to complete tasks quickly rather than well. Verifying responses is time-consuming, so requesters often depend on having multiple workers complete the same task to correct errors. However, having each task completed multiple times increases time and monetary costs.[52]
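As a rough sketch of this redundancy-based error correction, the snippet below assigns the same microtask to several workers and keeps the majority answer; the task IDs and worker answers are hypothetical.

    from collections import Counter

    def majority_answer(answers):
        # Return the most common answer and the fraction of workers who gave it.
        counts = Counter(answers)
        answer, votes = counts.most_common(1)[0]
        return answer, votes / len(answers)

    # Hypothetical responses: each task was completed by three workers.
    responses = {
        "task-001": ["cat", "cat", "dog"],
        "task-002": ["spam", "spam", "spam"],
    }
    for task_id, answers in responses.items():
        answer, agreement = majority_answer(answers)
        print(f"{task_id}: {answer} (agreement {agreement:.0%})")

The trade-off is explicit in the data layout: three assignments per task triples the cost of each unit of work.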
Crowdworkers are a nonrandom sample of the population. Many researchers use crowdsourcing in order to quickly and cheaply conduct studies with larger sample sizes than would be otherwise achievable. However, due to low worker pay, participant pools are skewed towards poor users in developing countries.[52][53]
There is also an increased likelihood that a crowdsourced project will fail due to a lack of monetary motivation or too few participants. Crowdsourcing markets are not a first-in-first-out queue: tasks that are not completed quickly may be forgotten, buried by filters and search procedures so that workers do not see them. This results in a long-tail, power-law distribution of completion times.[54] Additionally, low-paying research studies online have higher rates of attrition, with participants not completing the study once started.[37] Even when tasks are completed, crowdsourcing does not always produce quality results. When Facebook began its localization program in 2008, it encountered some criticism for the low quality of its crowdsourced translations.[55]
One of the problems of crowdsourcing products is the lack of interaction between the crowd and the client. Usually there is little information about the final desired product, and there is often very limited interaction with the final client. This can decrease the quality of product because client interaction is a vital part of the design process.[56]
A crowdsourced project is usually expected to be unbiased because it incorporates a large population of participants with diverse backgrounds. However, most crowdsourcing work is done by people who are paid or who directly benefit from the outcome (e.g., much of the open-source work on Linux). In many other cases, the end product is the outcome of a single person's endeavor, who creates the majority of the product, while the crowd only participates in minor details.[57]

Concerns for crowdsourcers

There are ethical concerns. Because crowdworkers are considered independent contractors rather than employees, they are not guaranteed a minimum wage. In practice, workers using the Amazon Mechanical Turk generally earn less than the minimum wage, even in India.[41][58] Some researchers who have considered using Mechanical Turk to recruit participants for research studies have argued that the wage conditions might be unethical.[37][59]
With below-market wages, the average US user of Mechanical Turk earned $2.30 per hour for tasks in 2009, versus $1.58 for the average India-based worker. While the majority of users worked less than 5 hours per week, 18% worked 15 hours per week or more, and 27% of Indian users said income from Mechanical Turk is sometimes or always necessary for their main source of income. This is less than minimum wage in either country, which Ross et al. suggest raises ethical questions for researchers who use crowdsourcing.[41] When Facebook began its localization program in 2008, it received criticism for using crowdsourcing to obtain free labor.[55]
Typically, no written contracts, non-disclosure agreements, or employee agreements are made with crowdsourced employees. For users of the Amazon Mechanical Turk, this means that requestors have final say over whether users' work is acceptable; if not, they will not be paid.[60] Critics claim that crowdsourcing arrangements exploit individuals in the crowd, and there has been a call for crowds to organize for their labor rights.[47]
Collaboration among crowd members can also be difficult, especially in the context of competitive crowdsourcing. The crowdsourcing site InnoCentive allows organizations to solicit solutions to scientific and technological problems; only 10.6% of respondents report working in a team on their submission.[44]

See also

References

  1. ^ http://www.merriam-webster.com/dictionary/crowdsourcing
  2. ^ a b c d Jeff Howe (2006). "The Rise of Crowdsourcing". Wired.
  3. ^ Bruno, Elena (14). "Smithsonian Crowdsourcing Since 1849!". The Bigger Picture. Smithsonian Institution Archives. Retrieved 6 January 2013.
  4. ^ Howe, Jeff (June 2, 2006). "Crowdsourcing: A Definition". Crowdsourcing Blog. Retrieved January 2, 2013.
  5. ^ a b c d e f g Brabham, Daren (2008), "Crowdsourcing as a Model for Problem Solving: An Introduction and Cases", Convergence: The International Journal of Research into New Media Technologies 14 (1): 75–90
  6. ^ a b Estellés-Arolas, Enrique; González-Ladrón-de-Guevara, Fernando (2012), "Towards an Integrated Crowdsourcing Definition", Journal of Information Science 38 (2): 189–200
  7. ^ Maurice Claypole, "Learning through crowdsourcing is deaf to the language challenge", The Guardian.
  8. ^ David Whitford (January 8, 2010). "Crowd Sourcing Turns Business On Its Head". CNN. Retrieved February 27, 2012.
  9. ^ Nate Lanxon (January 31, 2011). "How the Oxford English Dictionary started out like Wikipedia". Retrieved April 4, 2012.
  10. ^ "What Is the Four-Generation Program?". The Church of Jesus Christ of Latter-Day Saints. Retrieved January 30, 2012.
  11. ^ "Antoine-Jean-Baptiste-Robert Auget, Baron de Montyon". New Advent. Retrieved February 25, 2012.
  12. ^ "It Was All About Alkali". Chemistry Chronicles. Retrieved February 25, 2012.
  13. ^ "Nicolas Appert". John Blamire. Retrieved February 25, 2012.
  14. ^ "9 Examples of Crowdsourcing, Before ‘Crowdsourcing’ Existed". MemeBurn. Retrieved February 25, 2012.
  15. ^ a b DeVun, Leah (November 19, 2009). "Looking at how crowds produce and present art.". Wired News. Retrieved February 26, 2012.
  16. ^ a b c Doan, A; Ramarkrishnan, R; Halevy, A (2011), "Crowdsourcing Systems on the World Wide Web", Communications of the ACM 54 (4): 86–96
  17. ^ Howe, Jeff (2008), "Crowdsourcing: Why the Power of the Crowd is Driving the Future of Business", The International Achievement Institute.
  18. ^ Ess, Henk van "Crowdsourcing: how to find a crowd", ARD ZDF Akademie 2010, Berlin, p. 99,
  19. ^ Robson, John (February 24, 2012). "IEM Demonstrates the Political Wisdom of Crowds". Canoe.ca. Retrieved March 31, 2012.
  20. ^ "4 Great Examples of Crowdsourcing through Social Media".
  21. ^ "What Is Crowdfunding And How Does It Benefit The Economy". accessed November 27, 2012.
  22. ^ Belleflame, Paul (2011), "Crowdfunding: Tapping the Right Crowd", Core Discussion Paper
  23. ^ a b "Beyond XPrize: The 10 Best Crowdsourcing Tools and Technologies". February 20, 2012. Retrieved March 30, 2012.
  24. ^ "Offbeatr - A Kickstarter for Porn". The Huffington Post.
  25. ^ Rulison, Larry (February 14, 2012). "A Winning App? They Hope So". Times Union (Albany).
  26. ^ Yang, J; Adamic, L; Ackerman, M (2008), "Crowdsourcing and Knowledge Sharing: Strategic User Behavior on Taskcn", Proceedings of the 9th ACM Conference on Electronic Commerce
  27. ^ Leimeister, J.M.; Huber, M; Bretschneider, U; Krcmar, H (2009), "Leveraging Crowdsourcing: Activation-Supporting Components for IT-Based Ideas Competition", Journal of Management Information Systems 26 (1): 197–224
  28. ^ Ebner, W; Leimeister, J; Krcmar, H (2009), "Community Engineering for Innovations: The Ideas Competition as a method to nurture a Virtual Community for Innovations", R&D Management 39 (4): 342–356
  29. ^ "DARPA Network Challenge". DARPA Network Challenge. Retrieved November 28, 2011.
  30. ^ "Social media web snares 'criminals'". New Scientist. Retrieved April 4, 2012.
  31. ^ a b Kittur, A; Chi, E.H.; Sun, B (2008), "Crowdsourcing user studies with Mechanical Turk", CHI 2008
  32. ^ Noveck, Beth Simone (2009), Wiki Government: How Technology Can Make Government Better, Democracy Stronger, and Citizens More Powerful, Brookings Institution Press
  33. ^ Federal Transit Administration Public Transportation Participation Pilot Program, U.S. Department of Transportation
  34. ^ Peer-to-Patent Community Patent Review Project, Peer-to-Patent Community Patent Review Project
  35. ^ Callison-Burch, C; Dredze, M (2010), "Creating Speech and Language Data With Amazon’s Mechanical Turk", Human Language Technologies Conference: 1–12
  36. ^ McGraw, I; Seneff, S (2011), "Growing a Spoken Language Interface on Amazon Mechanical Turk", Interspeech: 3057–3060
  37. ^ a b c Mason, W; Suri, S (2010), "Conducting Behavioral Research on Amazon’s Mechanical Turk", Behavior Research Methods
  38. ^ Koblin, A (2008), "The sheep market", Creativity and Cognition
  39. ^ Explodingdog
  40. ^ Linver, D. (2010), Crowdsourcing and the Evolving Relationship between Art and Artist
  41. ^ a b c Ross, J; Irani, L; Silberman, M.S.; Zaldivar, A; Tomlinson, B (2010). "Who are the Crowdworkers? Shifting Demographics in Mechanical Turk". CHI 2010.
  42. ^ Hirth, M; Hoßfeld, T; Train-Gia, P (2011), Human Cloud as Emerging Internet Application – Anatomy of the Microworkers Crowdsourcing Platform
  43. ^ a b c Brabham, Daren C. (2008). "Moving the Crowd at iStockphoto: The Composition of the Crowd and Motivations for Participation in a Crowdsourcing Application". First Monday.
  44. ^ a b c Lakhani et al. (2007). The Value of Openness in Scientific Problem Solving. Retrieved February 26, 2012.
  45. ^ Brabham, Daren C. (2012). "Managing Unexpected Publics Online: The Challenge of Targeting Specific Groups with the Wide-Reaching Tool of the Internet". International Journal of Communication.
  46. ^ a b Brabham, Daren C. (2010). "Moving the Crowd at Threadless: Motivations for Participation in a Crowdsourcing Application". Information, Communication & Society.
  47. ^ a b Brabham, Daren C. (2012). "The Myth of Amateur Crowds: A Critical Discourse Analysis of Crowdsourcing Coverage". Information, Communication & Society.
  48. ^ a b c Kaufmann, N; Schulze, T; Viet, D (2011). "More than fun and money. Worker Motivation in Crowdsourcing – A Study on Mechanical Turk". Proceedings of the Seventeenth Americas Conference on Information Systems.
  49. ^ Brabham, Daren C. (2012). "Motivations for Participation in a Crowdsourcing Application to Improve Public Engagement in Transit Planning". Journal of Applied Communication Research.
  50. ^ Lietsala, Katri; Joutsen, Atte (2007). "Hang-a-rounds and True Believers: A Case Analysis of the Roles and Motivational Factors of the Star Wreck Fans". MindTrek 2007 Conference Proceedings.
  51. ^ Quinn; Bederson (2010). "Human Computation: A Survey and Taxonomy of a Growing Field". CHI 2011.
  52. ^ a b Ipeirotis; Provost; Wang (2010). Quality Management on Amazon Mechanical Turk.
  53. ^ Hirth; Hoßfeld; Tran-Gia (2011), Human Cloud as Emerging Internet Application - Anatomy of the Microworkers Crowdsourcing Platform
  54. ^ Ipeirotis (2010). "Analyzing the Amazon Mechanical Turk Marketplace". XRDS: Crossroads, The ACM Magazine for Students - Comp-YOU-Ter (ACM) 17 (2). Retrieved February 26, 2012.
  55. ^ a b Tomoko A. Hosaka (April 2008). "Facebook asks users to translate for free". MSNBC.
  56. ^ Darice Britt. "Crowdsourcing: The Debate Roars On". Retrieved 4 December 2012.
  57. ^ Dan Woods (28 September 2009). "The Myth of Crowdsourcing". Retrieved 4 December 2012.
  58. ^ "Fair Labor Standards Act Advisor". Retrieved 28 February 2012.
  59. ^ Norcie (2011). Ethical and Practical Considerations for Compensation of Crowdsourced Research Participants.
  60. ^ Paolacci, Chandler (2010). "Running Experiments on Amazon Mechanical Turk". Judgement and Decision Making 5 (5): 411–419. Retrieved February 26, 2012.
 
 
end quote from Wikipedia under the heading "Crowdsourcing"
