Exploring Crowdsourced Work in Low-Resource Settings

  • Manu Chopra
  • Indrani Medhi Thies
  • Joyojeet Pal
  • Colin Scott
  • Bill Thies
2019 Human Factors in Computing Systems

Published by ACM


While researchers have studied the benefits and hazards of crowdsourcing for diverse classes of workers, most work has focused on those with high familiarity with both computers and English. We explore whether paid crowdsourcing can be inclusive of individuals in rural India who are relatively new to digital devices and literate mainly in local languages. We built an Android application to measure the accuracy with which participants can digitize handwritten Marathi/Hindi words; the tasks were based on the real-world need to digitize handwritten Devanagari-script documents. Results from a two-week, mixed-methods study show that participants achieved 96.7% accuracy in digitizing handwritten words on low-end smartphones. A crowdsourcing platform employing these users performs comparably to a professional transcription firm. Participants showed overwhelming enthusiasm for completing tasks, so much so that we recommend imposing limits to prevent overuse of the application. We discuss the implications of these results for crowdsourcing in low-resource areas.