{"id":6024,"date":"2017-05-06T07:41:47","date_gmt":"2017-05-06T07:41:47","guid":{"rendered":"http:\/\/peerproduction.net\/editsuite\/?page_id=6024"},"modified":"2017-05-22T13:52:24","modified_gmt":"2017-05-22T13:52:24","slug":"reviews","status":"publish","type":"page","link":"http:\/\/peerproduction.net\/editsuite\/issues\/issue-10-peer-production-and-work\/peer-reviewed-papers\/crowdsourcing-citizen-science-exploring-the-tensions-between-paid-professionals-and-users\/reviews\/","title":{"rendered":"Reviews (Crowdsourcing Citizen Science)"},"content":{"rendered":"
\n

Review A<\/h2>\n
\n

Reviewer:\u00a0Anonymous<\/strong><\/p>\n

This is an interesting paper on citizen science and its relationship to crowdsourcing models. It is based on rich and extensive empirical material, and is well written and argued. Yet I would argue that it needs substantial changes in order to be considered for publication.

My main concern is that the paper is quite shallow on the theoretical side. While the topic and material seem to call for a thick analysis of processes of citizen participation in science, the paper fails to frame its empirical findings within a strong theoretical background. The authors state that their key aim is to "explore the tensions and contradictions" of citizen science. But how, and in which direction?

First, I believe the authors should focus their paper more clearly: what is the main question, and how does it contribute to knowledge in this area? This surfaces in the paper but would benefit from a stronger focus.

Second, and linked to my first point, the authors need to engage a more substantial body of literature. For example, the topic of the paper brought to mind a number of debates:

  • The relation between "strong" and "weak" forms of digital participation in media studies (for example Carpentier, "The concept of participation: If they have access and interact, do they really participate?", or Hyde et al., "What is collaboration anyway?")
  • The incredibly rich tradition of expertise studies within science and technology studies (STS). This spans twenty years of scholarship in the field and cannot be limited to Lewenstein's work.
  • The political economy of open and collaborative science, with its ambivalences in terms of power and profit
  • The continuous construction of the social boundaries of the scientific enterprise, another important thread in STS
  • The increasingly dense conversation on digital labour in media studies and STS. For example, the authors base their paper on a non-scholarly piece (Scholz's) while neglecting work such as Lilly Irani's ethnographic analysis of microwork ("The cultural work of microwork", 2015).

Now, I am not suggesting that the authors work on all of these threads, but careful theoretical positioning would add depth to a paper in which journalistic and technical literature seems to prevail over the social sciences.

Without this, I find it difficult to make sense of this article as a scholarly contribution. Another example: in the conclusions the authors mention the distinction between non-expert involvement and democratic inclusion as an unresolved tension within the Zooniverse. This point is at the core of the STS tradition, and the authors seem well positioned to contribute to the debate. I believe that a clearer angle and a stronger theoretical framing might help them do so.

On a minor, formal note: the abstract seems far too long and should probably be cut in half.

Review B

Reviewer: Sara Tocchetti

1. Is the subject matter relevant?

Yes, both with respect to the special issue theme and, more generally, the question of the "peer economy" as brought forward by JOPP.

2. Is the treatment of the subject matter intellectually interesting? Are there citations or bodies of literature you think are essential to which the author has not referred?

The subject matter is intellectually interesting, although some bodies of literature could be revisited. There is also, in my opinion, a structural issue with the paper: it is not always clear, at least to me, what question the authors are pursuing and how they are answering it.

3. Are there any noticeable problems with the author's means of validating assumptions, interpreting data or making judgments?

The major problem, in my opinion, is that the focus is not always clear: is it unpaid/paid labour in crowdsourced citizen science, citizen science as a "democratisation of science", or black-boxing and hidden work?

4. Is the article well written?

There are a couple of phrases that are not clear to me, but otherwise the text is well written in terms of grammar and sentence structure; but see comment C.

5. Are there portions of the article that you recommend be shortened, excised or expanded?

Not specifically. I would suggest the use of footnotes, or giving some more details, but I am not aware of the length and publication constraints, so what I am saying in this respect might not be relevant.

Review C

Reviewer: Anonymous

1. Is the subject matter relevant?

The subject is relevant to the special issue. The article concerns a citizen science crowdsourcing platform with paid and unpaid contributors. The author presents the organisation and the different motivations of both categories in seven parts.

The introduction presents the case study. The author insists on the difference between citizen science with contributors working for free on micro-work and citizen science as a peer community with a common decision-making process.

The second part is dedicated to a literature review of crowdsourcing. The author defines crowdsourcing and compares the Zooniverse with Amazon Mechanical Turk. The review focuses on contributors' motivations and the mix of paid and unpaid people in micro-work activities.

The third part presents the methodology, based on qualitative data. The author uses a large set of interviews with paid and unpaid contributors and carried out ethnographic fieldwork.

The fourth part explores the history of the platform and explains how a single-project platform grew into one hosting some fifty shared projects. The relationships between distant research teams and the open-source ideology have shaped the organisation of the website and the process of sharing tasks.

The fifth part is about the distinction between paid and unpaid labour. The author insists on the innovation process enabled by the participation of citizens in scientific work.

The sixth part analyses the empirical findings about individual motivations in citizen science projects. The most frequently cited motivations are research and fun. This part also highlights some limits of the organisation: the tasks lose their meaning because of the large number of contributions ("does my contribution really make much of a difference?") and because it is hard to understand the origin of the material and its place in the scientific process.

The conclusion insists on the difficulty of maintaining participation in citizen science projects. The author underlines the possible control citizens might have over this shared research process.

2. Is the treatment of the subject matter intellectually interesting? Are there citations or bodies of literature you think are essential to which the author has not referred?

The literature review of online platforms and crowdsourcing is substantial. The paper is written in a normative way and presents some strengths and weaknesses of crowdsourcing platforms. In other words, crowdsourcing is presented as a positive innovation in scientific activities, even though some limits appear concerning individual motivations.

3. Are there any noticeable problems with the author's means of validating assumptions or making judgments?

I felt the lack of a broader literature in economics or sociology about work and organisation. The only general reference is Karl Marx's theory of labour. Maybe this reference is underused because the case studied deeply questions the capitalist production paradigm. Theoretically, the capital owner uses work to obtain value and realise a further production cycle. In the Zooniverse case, what is the value? What is the capital? What is the next production step? The link between the innovation process and money is unclear. Is there no money, nor any rules of property, in the black box?

In the conclusion, the only resources seem to be individual motivations, but it appears that other resources are being managed. For example, the role of the grants is unclear. The paragraph on page 8 ("The management of the Zooniverse can be initially…") implies that the Zooniverse website depends on a few people with grants, whereas it is structured by research teams. What is the role of these teams and their means in the use of the Zooniverse?

There is a kind of contradiction in a paragraph on page 7 ("The organisational culture of the Zooniverse"). On the one hand, the Oxford University environment was important for starting work on the platform; on the other hand, the platform needs nothing to start and run. In other words, it is hard to understand what kinds of tangible and intangible resources are enlisted by the Zooniverse.

I did not understand why the findings on motivation concern only users and not also researchers. Moreover, the motivations of research participation and fun could be examined more deeply, especially if these motivations wane. Presenting contributors' careers (in a sociological sense) or the evolution of work in a specific project (perhaps with some online metrics) would give additional perspectives on the motivations.

4. Is the article well written?

The article is clear and well written. The only difficulty (for a non-native English speaker like me) is the term "user" to refer to contributors. After some research in the dictionary, I understood that "user" in a technological context can mean an advanced user and not only a person who passively uses something.

However, "user" carries a connotation of passivity in online communities such as open-source projects. Moreover, there is no clear distinction between paid users and unpaid users. Is it possible to hold both statuses, or are there bridges between them? Terms such as volunteers, unpaid contributors, or external participants seem to me to express the different statuses more explicitly.

5. Are there portions of the article that you recommend be shortened, excised or expanded?

Maybe the use of Karl Marx could be developed more in the introduction and recalled in the conclusion. It is hard to tell to what extent the observed situation is transitory or institutionalised. Micro-work in citizen science platforms questions the configuration between capital and work. The paper does not give a clear idea of whether the Zooniverse may persist as an alternative way of producing science. In other words, is it a challenger to the other ways of making science?

Maybe you could give some ideas about why there is no project in sociology, economics, translation, or contemporary history. Is it because these researchers are not linked into the same social network, because these materials are not fun, or because of technical and ethical limitations in the micro-work process?

In conclusion, the subject is rich and very interesting. My remarks are not negative but reflect the curiosity awakened by the paper.
