Citizen science: Is it for me?

The UKEOF-funded report Understanding Motivations for Citizen Science included a useful summary of the barriers and challenges to stakeholder participation in citizen science projects. Here we have adapted the original table to reveal the key barriers faced by science, policy and practice stakeholders. Whilst these summaries are not exhaustive, this information will be useful for anyone looking to collaborate with other stakeholder groups on a citizen science or crowdsourcing initiative, as well as to identify potential challenges in advance.

Key:

SCI icon: Science-related stakeholder

POL icon: Policy & evidence stakeholder

PRA icon: Practice-based stakeholder

Data quality and biases

  • Inadequate equipment (e.g. low quality sensors) SCI icon
  • Mistrust data from non-professionals SCI icon PRA icon
  • Biases influencing decision to participate SCI icon
  • Scalability of data SCI icon
    Example: Quality of data is not at a level for use on a wide scale
  • Partnerships with local authorities SCI icon
    Example: Local authorities can lack manpower to assist
  • Patchiness of data SCI icon
    Example: Statistical techniques are available to even out patchiness (see the sketch after this list)
  • Specific evidence need beyond the scope of citizen science SCI icon
    Example: “it wasn’t our direct need, and with low resources that wasn’t our priority” (Scientist, monitoring, policy)
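The report does not name a specific statistical technique for evening out patchy coverage. As a purely illustrative sketch, assuming scattered volunteer counts with geographic coordinates, the Python below shows one common way to fill gaps: inverse-distance weighting of sparse records onto a regular grid. All coordinates, counts and function names here are hypothetical.

```python
# Illustrative only: inverse-distance weighting (IDW) to interpolate
# patchy citizen-science observations onto a regular grid.
# All coordinates and counts below are hypothetical.
import numpy as np

def idw_interpolate(obs_xy, obs_values, grid_xy, power=2.0, eps=1e-9):
    """Estimate values at grid points from scattered observations."""
    estimates = []
    for gx, gy in grid_xy:
        d = np.hypot(obs_xy[:, 0] - gx, obs_xy[:, 1] - gy)
        w = 1.0 / (d ** power + eps)          # nearer records get more weight
        estimates.append(np.sum(w * obs_values) / np.sum(w))
    return np.array(estimates)

# Hypothetical sightings: (easting, northing) and counts reported by volunteers
obs_xy = np.array([[0.0, 0.0], [1.0, 0.5], [0.2, 1.3], [2.0, 2.0]])
obs_values = np.array([3, 0, 5, 1])

# A coarse 3 x 3 grid covering the same area, including unvisited cells
grid_xy = np.array([(x, y) for x in (0.0, 1.0, 2.0) for y in (0.0, 1.0, 2.0)])

print(idw_interpolate(obs_xy, obs_values, grid_xy))
```

More rigorous approaches exist (for example, occupancy models that account for uneven recorder effort), but the underlying idea is the same: gaps in coverage are estimated from nearby records rather than left blank.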

Peer review / mistrust

  • Peer reviewer reservations during publication process SCI icon
  • Citizen science frowned upon by colleagues SCI icon
    Example: A university scientist was told they “should be doing proper science”. Another explained: “We as scientists are often a bit snobbish about our ability in comparison to other people’s ability, and you see that in the response to citizen science from a lot of the policymaking community. Citizen science equals poor data, that’s their starting point” (Scientist, monitoring, policy).
  • Institutional reservations about citizen science SCI icon
    Example: Need to get board members on side

Requirement of specialist equipment / knowledge

  • Training required SCI icon PRA icon
    Example: Scientist indicates specialist training is required due to challenges of identification: “Gone are the days where you used to have a huge visible injury on vegetation has gone, because there’s been acute exposure to everything.” (Scientist, policy, monitoring)
  • Difficulties of recruitment and commitment of volunteers PRA icon
    Example: “And then continued engagement, the enthusiasm barrier, because you get a drop off, an exponential drop off of participation as time goes on. So how do you keep the exponential drop off … as low as possible?” (Practitioner, science, engagement)
  • Inaccessible sites SCI icon
    Example: Linked to patchiness of data, “It’s difficult to tell people to go to a site that they think will be rubbish as well, I think if you want to see lots of dragonflies you go to a good dragonfly site rather than just anywhere” (Scientist, policy, monitoring)
  • Unable to keep up with technological developments PRA icon
    Example: Once technology is in place, it must be maintained and updated
  • Getting people to use technology SCI icon PRA icon
    Example: “a survey that started off on paper, it’s actually very hard to move people over” (Practitioner, science, engagement)
  • Crowded marketplace for citizen science projects in certain areas SCI icon PRA icon
    Example: Technology and online data options are flooding the market with similar citizen science projects

Time consuming / resource issues

  • Promoting citizen science SCI icon POL icon
    Example: Launching apps requires time and resources for promotion and maintenance
  • Communication SCI icon
    Example: No time to explain key scientific ideas (e.g. recording absence is as important as presence)
  • Slow process POL icon
    Example: Policymakers want answers yesterday
  • Individual interactions SCI icon
    Example: Individual requests for support are time-consuming
  • Mobilising and maintaining citizen science project SCI icon POL icon PRA icon
    Example: Time spent validating, verifying, selecting appropriate technology and calibrating sensors (see the sketch after this list); lack of funding, short-termism of funding, inability to prove concept, no funding for essential technical development
  • Volunteers threaten job opportunities/security of professionals SCI icon
  • Lack of interpretation may lead to poorly-informed public demands SCI icon
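Sensor calibration, mentioned in the “Mobilising and maintaining” item above, is one of the more concrete technical costs in this list. As a minimal, purely illustrative sketch (not taken from the report), the Python below fits a simple linear correction for a low-cost sensor against co-located reference readings; all readings and names are hypothetical.

```python
# Illustrative only: fitting a linear correction for a low-cost sensor
# against a co-located reference instrument. All readings are hypothetical.
import numpy as np

# Paired readings taken at the same times and place
low_cost = np.array([12.0, 18.5, 25.1, 30.4, 41.2])   # e.g. a cheap particulate sensor
reference = np.array([10.2, 16.0, 22.8, 27.5, 38.9])  # regulatory-grade monitor

# Least-squares fit: reference ≈ slope * low_cost + intercept
slope, intercept = np.polyfit(low_cost, reference, deg=1)

def calibrate(raw):
    """Apply the fitted correction to new low-cost readings."""
    return slope * raw + intercept

print(f"slope={slope:.2f}, intercept={intercept:.2f}")
print(calibrate(np.array([20.0, 35.0])))   # corrected estimates
```

Even a simple correction like this has to be repeated as sensors drift, which is part of why maintenance appears here as an ongoing resource cost rather than a one-off task.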

Institutional politics

  • Liability of organisers if they don't act on citizen data PRA icon
  • Unaware of using citizen science data SCI icon
    Example: Scientists often use published data sets unaware that they are citizen science data
  • Nobody championing citizen science at a high level PRA icon
    Example: Projects should identify someone high level to champion their project institutionally
  • Differing science and engagement objectives PRA icon
    Example: “you have to decide where you sit along the spectrum for the mass engagement versus data quality question” (Practitioner, science, engagement)
  • Activities require regulatory approval SCI icon
    Example: Challenges arise when species become scientific instruments; this is regulated by the Home Office
  • Lack of interest in engaging the public through citizen science SCI icon

Unaware of audience

  • Lack of attention to needs and expectations of the citizen science audience POL icon
    Example: Must understand that the demands on participants’ time and the project’s requirements may differ
  • Little acknowledgement of different types of volunteer PRA icon
    Example: Volunteers differ in type (e.g. paper-based volunteers); projects don’t want to lose them over data quality
  • Need more volunteers POL icon
    Example: There aren’t enough volunteers participating
  • Survey design by committee, by professionals only PRA icon
    Example: Too much discussion of small issues, but a decision eventually has to be taken

Survey design / implementation issues

  • Lack of clear research question PRA icon
    Example: Avoid “reverse engineering to a question”, instead “[be] led by a question” (Practitioner, science, engagement)
  • Survey is inaccessible PRA icon
    Example: Participants must be able to understand questions
  • Language barrier (scientific and linguistic) PRA icon
    Example: Avoid over-complication
  • Assumption that people have access to the internet and to a mobile phone SCI icon PRA icon
    Example: Not everyone has a mobile phone on a data plan, nor access to the internet
  • Assumption that people are comfortable with technology SCI icon PRA icon
    Example: Technological literacy should never be assumed
  • Over-reliance on web-based solutions SCI icon PRA icon
    Example: Web-based solutions can be exclusionary
  • Designing a ‘boring’, yet scientifically important, survey SCI icon
    Example: “Well, it could be that it’s a really … important square, and when you go there and count your … your butterflies, you might only, you might see none, you might see one. And that’s a really boring day out” (Scientist, policy, monitoring)