Consumers have grown increasingly aware of the impact of packaging on the environment. Interest has therefore grown in more environmentally friendly packaging, but we wondered how consumers recognize ‘green’ as distinct from ‘grey’ when evaluating packaging. We asked over 3,000 respondents from Germany, France and the United States how they recognize environmentally friendly packaging. To solicit responses that we may not have anticipated, we used an open‐ended format, which we then followed with a closed‐ended format so that we could compare the two sets of responses. Not surprisingly, in both sets of responses labelling emerged as the attribute consumers rely upon most; we also found evidence of misleading labels. Consumers in Germany and the United States relied on information printed on the packaging and named searching for information as one of their preferred ways to decide whether packaging is environmentally friendly. French consumers seemed less trusting of published information and more trusting of the look and feel, especially the material, of the package. Our results point to the importance of cultural influences in the acquisition of perceptual cues by the consumer.
Combining surveys and digital trace data can enhance the analytic potential of both data types. We present two studies that examine factors influencing the data sharing behaviour of survey respondents for different types of digital trace data: Facebook, Twitter, Spotify and health app data. Across those data types, we compared the relative impact of four factors on data sharing: data sharing method, respondent characteristics, sample composition and incentives. The results show that data sharing rates differ substantially across data types. Two particularly important factors predicting data sharing behaviour are incentive size and data sharing method, both of which are directly related to task difficulty and respondent burden. In sum, the paper reveals systematic variation in the willingness to share additional data that needs to be considered in research designs linking surveys and digital traces.
Citizen scientists play an increasingly important role in biodiversity monitoring. Most of the data, however, are unstructured—collected by diverse methods that are not documented with the data. Insufficient understanding of the data collection processes presents a major barrier to the use of citizen science data in biodiversity research. We developed a questionnaire to ask citizen scientists about their decision-making before, during and after collecting and reporting species observations, using Germany as a case study. We quantified the greatest sources of variability among respondents and assessed whether motivations and experience related to any aspect of data collection. Our questionnaire was answered by almost 900 people, with varying taxonomic foci and expertise. Respondents were most often motivated by improving species knowledge and supporting conservation, but there were no linkages between motivations and data collection methods. By contrast, variables related to experience and knowledge, such as membership of a natural history society, were linked with a greater propensity to conduct planned searches, during which typically all species were reported. Our findings have implications for how citizen science data are analysed in statistical models; highlight the importance of natural history societies; and provide pointers to where citizen science projects might be further developed.
More survey results are available today than ever before. This increase in survey data has been accompanied by growing concerns about their quality. With the present study, we aim to investigate to what extent the public draws on survey quality information when evaluating the trustworthiness of survey results. We implemented a vignette experiment in an online panel survey (N = 3,313), in which respondents each received four different survey descriptions with varying methodological information. Compared with respondent characteristics, survey quality information had only a minor effect on perceptions of trustworthiness. However, trust in the survey results was significantly influenced by sample size and sample balance. Finally, the relevance of survey quality information increased with the cognitive ability of the respondent.
As our modern world has become increasingly digitalized, various types of data from different data domains are available that can enrich survey data. To link survey data to other sources, consent from the survey respondents is required. This article compares consent to data linkage requests for seven data domains: administrative data, smartphone usage data, bank data, biomarkers, Facebook data, health insurance data, and sensor data. We experimentally explore three factors of interest to survey designers seeking to maximize consent rates: consent question order, consent question wording, and incentives. The results of the study using a German online sample (n = 3,374) show that survey respondents have a relatively high probability of consent to share smartphone usage data, Facebook data, and biomarkers, while they are least likely to share their bank data in a survey. Of the three experimental factors, only the consent question order affected consent rates significantly. Additionally, the study investigated the interactions between the three experimental manipulations and the seven data domains, of which only the interaction between the data domains and the consent question order showed a consistent significant effect.
Modern social and marketing research relies heavily on surveys to collect data. At the same time, it is well established that survey responses are influenced by response style biases that vary across individuals, countries and cultures. Investigating such biases, we focused on Mexico and South Korea, two emerging markets largely neglected in response style research. Data came from a survey instrument of 28 questions focusing on environmental attitudes, individual responsibilities and green packaging characteristics, administered to 500 Mexican and 525 South Korean respondents. We computed response style metrics and compared these to predictions made using scores on Hofstede and Minkov's quantitative cultural research scale. The predictions made using this scale were largely confirmed through the response style metrics. While respondents in both countries preferred answering items with "Agree" or "Strongly Agree," respondents in Mexico were about twice as willing to "Disagree" or "Strongly Disagree" as those in South Korea. Overall, respondents in Mexico showed a bias toward extreme responses, while those in South Korea showed a response bias toward mid-point values. Our approach can be used to assist survey design and to interpret the significance of survey results. Data captured from Mexican and South Korean respondents are now available to add to the general body of knowledge on response styles.
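The abstract above mentions computing response style metrics from Likert-scale answers but does not spell out the formulas. A minimal sketch of commonly used indices (proportions of extreme, mid-point, agreeing and disagreeing responses per respondent) might look like the following; the function name and the specific set of indices are illustrative assumptions, not the paper's actual measures:

```python
def response_style_metrics(responses, scale_max=5):
    """Illustrative response style indices for one respondent.

    responses: Likert answers coded 1..scale_max.
    Returns the proportion of extreme, mid-point, agreeing and
    disagreeing answers (assumed definitions, not the paper's own).
    """
    n = len(responses)
    mid = (scale_max + 1) / 2  # 3 on a 5-point scale
    return {
        # extreme response style: endpoints of the scale
        "extreme": sum(r in (1, scale_max) for r in responses) / n,
        # mid-point response style: the neutral category
        "midpoint": sum(r == mid for r in responses) / n,
        # acquiescent vs. disacquiescent tendencies
        "agree": sum(r > mid for r in responses) / n,
        "disagree": sum(r < mid for r in responses) / n,
    }
```

Aggregating such per-respondent proportions by country would then allow comparisons like the one reported, e.g. whether Mexican respondents choose extreme categories more often than South Korean respondents choose the mid-point.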