Purpose - Items that ask the respondent for a comment in his/her own words are becoming increasingly popular in online employee surveys, but research on such comments is scarce. The purpose of this paper is to analyze, theoretically and empirically, what kind of comments are generated in employee surveys by respondents who differ in terms of job satisfaction and commitment.

Design/methodology/approach - The data studied here are from an online employee survey conducted in 2004 in a multinational IT organization headquartered in Germany. Some 24,000 employees generated about 75,000 comments across 15 topic fields. The comments were additionally coded for their tone, using a computerized dictionary approach developed especially for this purpose. Frequencies, wordiness, and the tone of different types of comments are measured. The statistical relationship of comments to job satisfaction and to organizational commitment is analyzed.

Findings - Some 40 per cent of the respondents provide comments. Most comments have a negative tone. Negative comments are wordier than positive ones. The likelihood of writing comments is inversely related to the respondents' job satisfaction and to their organizational commitment. Dissatisfied employees and employees with low commitment also write more negative and wordier comments.

Practical implications - The study sets benchmarks on what to expect when using open-ended comment fields in employee surveys. A methodology for analyzing huge text files with respect to their tone is presented.

Originality/value - This is the first paper to investigate a realistically large data set of comments. It also shows how to use computer methods to relate frequencies, wordiness, and tone of comments to standard variables such as job satisfaction or commitment.
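The dictionary approach described above codes each comment's tone by matching its words against lists of positive and negative terms, and wordiness is simply the comment's word count. A minimal sketch of such a coder is shown below; the word lists, function names, and tie-breaking rule are illustrative assumptions, not the authors' actual dictionary.

```python
import string

# Hypothetical tone dictionaries -- the paper's actual word lists are not published here.
POSITIVE = {"good", "great", "helpful", "excellent", "satisfied", "fair"}
NEGATIVE = {"bad", "poor", "unfair", "problem", "frustrating", "slow"}

def code_tone(comment: str) -> str:
    """Classify a comment as positive, negative, or neutral by counting dictionary hits."""
    words = [w.strip(string.punctuation) for w in comment.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"  # assumed tie-breaking rule

def wordiness(comment: str) -> int:
    """Measure wordiness as the raw word count of the comment."""
    return len(comment.split())
```

Applied to a full comment file, per-respondent tone and wordiness scores could then be correlated with job satisfaction and commitment scales, as the paper does.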
This article investigates item nonresponse in open-ended survey questions, where item nonresponse is much higher than in closed questions. The difference results from the higher cognitive burden placed on the respondent. To study item nonresponse, we manipulate different questionnaire design characteristics, such as the size of the answer box and the inclusion of motivation texts, as well as respondent-specific characteristics, in a randomized web experiment using a student sample. The results show that a motivation text increases the frequency of responses to open-ended questions for both small and large answer boxes. However, large answer boxes produce higher item nonresponse than small answer boxes regardless of the use of a motivation text. In addition, gender and the respondent's field of study affected the answering of open-ended questions: being a woman or studying social sciences increased the frequency of a response. As the major finding, and in contrast to previous results, our findings indicate that particularly large answer boxes should be avoided, because they reduce respondents' willingness to respond.