MTurk workers’ use of social media

Don’t be misled by the title of this article. It is called “Crowdsourcing Social Media for Military Operations,” but it really isn’t about that. It is a report of a survey of almost 800 Turkers on which social media they use.

Almost all (93.1%) use Facebook, about 60% use Twitter, and so on down the list until we learn that about 12% use Yahoo Answers, which I’d never even heard of. Interestingly, Snapchat (the kids’ choice) was left off the list.

The study makes some connections to military operations, but it really is more of a summary of how Turkers use social media. It isn’t a great article by any means (no stats, among other things), but it’s an interesting snapshot of a community.

Comparing MTurk to PA, CF and CBDR

How does MTurk compare to Prolific Academic, CrowdFlower, and the Carnegie Mellon subject pool (also known as CBDR)?

This new article tells you.

CBDR was the ‘control’, so to speak, and comparing MTurk to PA and CF shows:

“In two studies, we found that participants on both platforms (PA and CF) were more naïve and less dishonest compared to MTurk participants. Across the three platforms, CF provided the best response rate, but CF participants failed more attention-check questions and did not reproduce known effects replicated on ProA and MTurk. Moreover, ProA participants produced data quality that was higher than CF’s and comparable to MTurk’s. ProA and CF participants were also much more diverse than participants from MTurk.”

So it looks like a thumbs up to Prolific Academic.

Nice job from some very expert MTurk researchers!

Eyal Peer, Laura Brandimarte, Sonam Samat, and Alessandro Acquisti, “Beyond the Turk: Alternative Platforms for Crowdsourcing Behavioral Research,” Journal of Experimental Social Psychology, Volume 70, May 2017, Pages 153–163, ISSN 0022-1031.

Resisting exploitation of gig workers

Here’s a great piece, published in the UK, by Mark Graham and Alex Wood.

Every word in this is excellent, but here is the most important thing (IMHO):


“…because almost all large online work platforms are currently privately owned firms, they rarely have the best interests of workers at heart. They capture large rents – often 20 per cent of wages – by simply providing a platform that allows clients to meet workers. There is no reason that platforms cannot instead be run by and for workers, as cooperatives, in order to allow workers to capture more of the value that they are creating.”

New Chapter on Law and Crowdwork

This abstract came across my feed today, and it looks like it could be a great read about novel thinking in the legal protection of crowdwork. The citations (which aren’t behind a paywall) include lots of MTurk material, and if you’re looking for a good ‘primer’ on law and crowdwork, this bibliography does a great job of providing it.


Prassl, Jeremias, and Martin Risak. “The Legal Protection of Crowdworkers: Four Avenues for Workers’ Rights in the Virtual Realm.” In Policy Implications of Virtual Work, pp. 273-295. Springer International Publishing, 2017.

In which I compare MTurk to Qualtrics and Students and more

Well, it isn’t just me, and there isn’t much ‘more’, but a recent article I co-authored is available for free download through May. Just go here.


Data collection using Internet-based samples has become increasingly popular in many social science disciplines, including advertising. This research examines whether one popular Internet data source, Amazon’s Mechanical Turk (MTurk), is an appropriate substitute for other popular samples utilized in advertising research. Specifically, a five-sample between-subjects experiment was conducted to help researchers who utilize MTurk in advertising experiments understand the strengths and weaknesses of MTurk relative to student samples and professional panels. In comparisons across five samples, results show that the MTurk data outperformed panel data procured from two separate professional marketing research companies across various measures of data quality. The MTurk data were also compared to two different student samples, and results show the data were at least comparable in quality. While researchers may consider MTurk samples as a viable alternative to student samples when testing theory-driven outcomes, precautions should be taken to ensure the quality of data regardless of the source. Best practices for ensuring data quality are offered for advertising researchers who utilize MTurk for data collection.

And for your amusement: I have a colleague who hates MTurk. He wrote an anti-Turk article, which is available for free too, along with our response to him. You can find those, along with several other pieces on different types of methodology, here:

I was just in Boston, where I organized a panel about MTurk. It turns out the Colleague/Hater said that he ‘used to represent Qualtrics’ at academic conferences. I’m not sure exactly what he meant by that, but it sounds like there’s a bit of – oh, I don’t know – bias there.


Aussies look at ‘new tool’: MTurk

The Conversation has – what? an opinion piece? – about MTurk.

Not sure exactly what we should call this – it isn’t an article in an academic sense, as there are no citations for some of the facts (and allegations).

And calling MTurk ‘new’ is – oh, what? just silly? – because it has been around for 11 years or so, and thousands of studies – real academic studies – have been published using it.

The scant academic literature included via links is old – a few pubs from 2011, one from 2015 – plus an odd link to an MTurk Grind forum discussion.