This study investigates how relative performance information (RPI) affects employee performance and the allocation of effort across tasks in a multi-task environment. Based on behavioral theories, we predict that the social comparison process inherent in RPI induces both a motivation effect, which increases effort, and an effort distortion effect, which shifts effort allocations across tasks away from the firm-preferred allocations. We also predict that both effects are magnified when the RPI is public rather than private. We argue that although the motivation effect will generally benefit performance, the effort distortion effect may be detrimental to it. We design an experiment that isolates these two effects. Consistent with our predictions, we find that RPI induces both motivation and effort distortion effects and that both effects are magnified when the RPI is public rather than private. Although the motivation effect increases performance, we demonstrate that the effort distortion effect can decrease performance. By isolating the motivation and effort distortion effects, our study provides insights into the costs and benefits of RPI in a multi-task environment. As such, it informs accountants about the design of information systems and about when tasks should be aggregated or disaggregated across employees.
Data Availability: Data are available from the authors upon request.
This study investigates the effects of relative performance feedback and incentive compensation method on performance. We examine whether the presence and the content of relative performance feedback have different effects on performance when participants are compensated via a tournament or an individual incentive scheme. Our experimental results show a disordinal interaction between incentive scheme and feedback. Specifically, providing relative performance feedback improves the mean performance of participants compensated under an individual incentive scheme, regardless of the precision or specific content of the feedback. In contrast, providing relative performance feedback worsens the mean performance of participants compensated under a tournament incentive scheme, but only if the feedback is sufficiently precise. Supplementary analysis suggests that this deterioration in performance is due to ineffective task strategies rather than reduced effort. We also find that in the absence of relative performance feedback, participants compensated under a tournament incentive scheme perform better, and their performance improves to a greater extent over time, compared to participants compensated under an individual incentive scheme. These results have implications for the design of accounting, control, and reporting systems in firms.
When using a tournament in multi-period settings, firms have discretion in selecting the tournament horizon. For example, firms can use a single tournament (a grand tournament) or a sequence of multiple tournaments, each with a shorter horizon than a grand tournament (a repeated tournament). Firms have also begun to use a combination of the two, in which a repeated tournament is embedded within a grand tournament (a hybrid tournament). Using an experiment, we investigate whether the effect of tournament horizon on performance depends on the dynamic complexity of the task, which reflects the potential for effort in one period to influence the link between effort and performance in future periods. When dynamic task complexity is low, we find that performance is greatest in the hybrid tournament, followed by the repeated and then the grand tournament. In contrast, when dynamic task complexity is high, we find that performance is greatest in the repeated tournament, followed by the grand and hybrid tournaments, which perform similarly. More generally, the results of our experiment suggest that the effect of tournament horizon on performance depends on dynamic task complexity. These results can help firms make better decisions when designing their tournaments by reinforcing the need to align the tournament horizon with the task.