Abstract: We conducted a systematic review of 143 empirical studies of advice-based decision making published in management or psychology between 2006 and 2020. We identified two distinct streams of the literature. The first, behavioral research, features experimental research on advice-based decisions conducted in laboratories. The second, organizational research, features observational field research on advice-based decisions in organizations. We organized the findings from the two research streams around three sequen…
Advice taking and related research are dominated by deterministic weighting indices such as ratio-of-differences-based formulas for investigating informational influence. They are intuitively simple but entail various measurement problems and restrict research to a certain paradigmatic approach. As a solution, we propose process-consistent mixed-effects regression modeling for specifying how strongly people’s judgments are influenced by external information. Our formal derivation of the proposed weighting measures is accompanied by a detailed elaboration on their most important technical and statistical subtleties. Essentially, the approach differentiates between components of endogenous (i.e., final judgment) and exogenous (e.g., initial judgment and advice) nature by relying on accordingly specified multilevel models. Corresponding mixed-effects regression coefficients of various exogenous sources of information hence also reflect individual weighting but are based on a conceptually consistent representation of the endogenous judgment process. We use this modeling approach to revisit empirical findings from sequential collaboration and advice taking paradigms. Specifically, whereas we do not obtain evidence for systematic order effects in sequential collaboration, we document recency effects in the weighting of sequentially sampled advice. We argue that process-consistent modeling of information sampling and utilization has the potential to increase the replicability of our science and opens up new avenues for innovative research. Moreover, the proposed method is relevant beyond sequential collaboration and advice taking. Mixed-effects regression weights can also inform research on related cognitive phenomena such as multidimensional belief updating, anchoring effects, hindsight biases, or attitude change.
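The core idea of the abstract above — modeling the endogenous final judgment as a function of exogenous cues (initial judgment and advice) in a multilevel model, so that regression coefficients play the role of weighting indices — can be illustrated with a minimal sketch. The data below are simulated under hypothetical assumptions (participant-specific self-weights around 0.6, advice weights around 0.4); the model specification uses statsmodels' MixedLM and is not the authors' exact implementation.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

# Simulate a judge-advisor task: each participant gives an initial
# estimate, sees advice, then reports a final judgment. Hypothetical
# data-generating weights: ~0.6 on one's own estimate, ~0.4 on advice,
# varying across participants.
n_participants, n_trials = 40, 20
rows = []
for p in range(n_participants):
    w_self = float(np.clip(rng.normal(0.6, 0.1), 0.0, 1.0))
    for t in range(n_trials):
        truth = rng.uniform(50, 150)
        initial = truth + rng.normal(0, 15)
        advice = truth + rng.normal(0, 15)
        final = w_self * initial + (1 - w_self) * advice + rng.normal(0, 2)
        rows.append(dict(pid=p, initial=initial, advice=advice, final=final))
df = pd.DataFrame(rows)

# Mixed-effects regression: the endogenous final judgment is regressed on
# the exogenous cues; a random slope on advice per participant captures
# individual differences in weighting. The fixed-effect coefficients are
# the population-level weighting estimates.
model = smf.mixedlm("final ~ initial + advice", df,
                    groups=df["pid"], re_formula="~advice")
fit = model.fit()
print(fit.params[["initial", "advice"]])
```

Unlike a ratio-of-differences index, this specification remains well-defined when the final judgment falls outside the initial-judgment/advice interval, and it extends naturally to more than one exogenous cue.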
“…The streams of research are categorized by methodology; behavioral research uses “the controlled environment of the laboratory” (Kämmer et al., 2022, p. 6), while organizational research uses field studies and surveys. Both methodologies have tradeoffs that constrain what can be studied.…”
“…Both methodologies have tradeoffs that constrain what can be studied. The influence of advice solicitation, the advisor’s perspective, and the social relationship between decision-maker and advisor are understudied in the behavioral stream of research because, as Kämmer et al. (2022) note, “many studies provide unsolicited advice to decision-makers” (p. 19). This is largely because the experimental method makes it difficult to create situations in which people would spontaneously solicit advice.…”
“…Pennebaker (2022) notes that as the methods have expanded, “we are asking a series of new theoretical and practical questions” (p. 576). Kämmer et al. (2022) note that a challenge for research on advice is focusing on the process of advising and how advisors communicate their advice. Process and interaction data that were painstakingly difficult to collect, transcribe, and analyze 15–20 years ago (van Swol & Kane, 2019) are now readily available.…”
This short commentary explores how methodology shaped the two lines of research identified in Kämmer et al. (2022). Behavioral research has generally relied on lab experiments and focused on advice utilization and whether advice can improve decision-making, but this line of research has been less able to study advice solicitation or the advising relationship. Organizational research has generally used survey methodology and examined solicitation of advice but, due to the longitudinal nature of organizational decision-making, has focused less on advice utilization. However, given trends toward online big data and natural language processing, methods in the field of advice research are likely to change substantially in the coming decade.
When making complex decisions, such as a medical diagnosis, decision makers typically gather, analyze, and synthesize (integrate) information. In a previous study, we showed that delegating such complex decisions to collaborating pairs increases decision quality substantially compared to that of individuals, without requiring different information gathering. Given the higher costs associated with teamwork, however, it is of great practical interest to understand when in the process the performance benefits of teams may arise, so that particular subtasks can be delegated to teams when most appropriate. We thus conducted an experimental study in which fourth‐year medical students (n = 109) worked either in pairs or alone on two separate subtasks of the diagnostic process: (1) analyzing diagnostic test results (e.g., X‐rays) and (2) integrating previously interpreted test results into diagnoses. Linear mixed‐effects models revealed a small benefit of collaborating pairs over individuals in both subtasks. We conclude that collaborating with a peer may pay off both when analyzing information and when integrating it into a diagnosis as it provides the opportunity to correct each other's errors and to make use of a greater knowledge base. These findings encourage the strategic use of collaboration with a colleague when making complex decisions. Further research into the underlying processes is needed.
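The analysis described above — linear mixed-effects models comparing collaborating pairs with individuals across diagnostic cases — can be sketched as follows. Everything here is illustrative: the simulated performance scores, the assumed collaboration benefit of 0.1, and the case-level random intercept are placeholder assumptions, not the study's actual data or model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

# Hypothetical data: performance scores on diagnostic cases, produced by
# units working either alone (pair=0) or as a collaborating pair (pair=1).
# Cases differ in difficulty, which a case-level random intercept absorbs.
n_cases, n_units = 12, 60
case_difficulty = rng.normal(0, 0.5, n_cases)
rows = []
for u in range(n_units):
    pair = u % 2  # alternate individuals and pairs
    for c in range(n_cases):
        score = 0.6 + 0.1 * pair + case_difficulty[c] + rng.normal(0, 0.3)
        rows.append(dict(unit=u, case=c, pair=pair, score=score))
df = pd.DataFrame(rows)

# Fixed effect of `pair` estimates the collaboration benefit, while the
# random intercept for case accounts for shared case difficulty.
fit = smf.mixedlm("score ~ pair", df, groups=df["case"]).fit()
print(fit.params["pair"])
```

Because every case is seen under both conditions, the case-level random intercept removes between-case difficulty variance from the comparison, which is what makes even a small collaboration benefit detectable.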