Much can be at stake depending on the choice of words used to describe citizen science, because terminology impacts how knowledge is developed. Citizen science is a quickly evolving field that is mobilizing people's involvement in information development, social action and justice, and large-scale information gathering. Currently, a wide variety of terms and expressions are being used to refer to the concept of 'citizen science' and its practitioners. Here, we explore these terms to help provide guidance for the future growth of this field. We do this by reviewing the theoretical, historical, geopolitical, and disciplinary context of citizen science terminology; discussing what citizen science is and reviewing related terms; and providing a collection of potential terms and definitions for 'citizen science' and people participating in citizen science projects. This collection of terms was generated primarily from the broad knowledge base and on-the-ground experience of the authors, who recognized the potential issues associated with various terms. While our examples may not be systematic or exhaustive, they are intended to be suggestive and an invitation to further consideration. In our collective experience with citizen science projects, no single term is appropriate for all contexts. In a given citizen science project, we suggest that terms should be chosen carefully and their usage explained; direct communication with participants about how terminology affects them and what they would prefer to be called should also occur. We further recommend that a more systematic study of terminology trends in citizen science be conducted.
Reproducibility and reusability of research results are important concerns in scientific communication and science policy. A foundational element of reproducibility and reusability is the open and persistently available presentation of research data. However, many common approaches for primary data publication in use today do not achieve sufficient long-term robustness, openness, accessibility, or uniformity, nor do they permit comprehensive exploitation by modern Web technologies. This has led to several authoritative studies recommending uniform direct citation of data archived in persistent repositories. Data are to be considered first-class scholarly objects, and treated in many ways like cited and archived scientific and scholarly literature. Here we briefly review the most current and widely agreed set of principle-based recommendations for scholarly data citation, the Joint Declaration of Data Citation Principles (JDDCP). We then present a framework for operationalizing the JDDCP, along with a set of initial recommendations on identifier schemes, identifier resolution behavior, required metadata elements, and best practices for realizing programmatic machine actionability of cited data. The main target audience for the common implementation guidelines in this article consists of publishers, scholarly organizations, and persistent data repositories, including technical staff members in these organizations, but ordinary researchers can also benefit from these recommendations. The guidance provided here is intended to help achieve widespread, uniform human and machine accessibility of deposited data, in support of significantly improved verification, validation, reproducibility, and re-use of scholarly/scientific data.
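The recommendations above center on citing data via a persistent identifier that resolves to both human- and machine-readable metadata. The following is a minimal sketch of what a machine-actionable citation record might look like; the field names, the example DOI, and the repository name are illustrative assumptions, not the exact JDDCP-required element set (only the `https://doi.org/` resolver prefix is standard practice).

```python
# Sketch of a machine-actionable data citation record.
# Field names and example values are illustrative assumptions,
# not the authoritative JDDCP metadata element set.

from dataclasses import dataclass


@dataclass
class DataCitation:
    author: str
    title: str
    repository: str
    identifier: str  # a persistent identifier, e.g. a DOI
    version: str
    year: int

    def resolver_url(self) -> str:
        """Return a globally resolvable URL for the persistent identifier."""
        return f"https://doi.org/{self.identifier}"

    def render(self) -> str:
        """Render a human-readable citation that embeds the resolvable PID."""
        return (
            f"{self.author} ({self.year}). {self.title} "
            f"(Version {self.version}) [Data set]. {self.repository}. "
            f"{self.resolver_url()}"
        )


citation = DataCitation(
    author="Doe, J.",                      # hypothetical author
    title="Example survey data",           # hypothetical dataset
    repository="Example Data Repository",  # hypothetical repository
    identifier="10.1234/example.5678",     # hypothetical DOI
    version="1.0",
    year=2015,
)
print(citation.render())
```

Because the identifier is a DOI-style string rather than a bare URL, the same record can be rendered for human readers or dereferenced programmatically, which is the sense of "machine actionability" the guidelines aim at.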
A scientific publication is fundamentally an argument consisting of a set of ideas and expectations supported by observations and calculations that serve as evidence of its veracity. An argument without evidence is only a set of assertions. Consider the difference between the statement “The hairy woodpecker population is declining in the northwest region of the United States” and the statement “Hairy woodpecker populations in the northwest region of the United States have declined by 11% between 1992 and 2003, according to data from the Institute for Bird Populations (http://www.birdpop.org/).” Both or neither of these statements could be true, but only the second one can be verified. Scientific papers do, of course, present specific data points as evidence for their arguments, but how well do papers guide readers to the body of those data, where the data's integrity can be further examined? In practice, a chasm may lie across the path of a reviewer seeking the source data of a scientific argument.