This study investigates how the timing of the consideration of Big Data visualizations affects an auditor's evaluation of evidence and professional judgments. In addition, we examine whether the use of an intuitive processing mode, as compared to a deliberative processing mode, influences an auditor's use and evaluation of Big Data visualizations. We conduct an experiment with 127 senior auditors from two Big 4 firms and find that auditors have difficulty recognizing patterns in Big Data visualizations when viewed before more traditional audit evidence. Our findings also indicate that auditors who view Big Data visualizations containing patterns that are contrary to management assertions after they view traditional audit evidence have greater concerns about potential misstatements and increase budgeted hours more. Overall, our results suggest that Big Data visualizations used as evidential matter have fewer benefits when they are viewed before auditors examine more traditional audit evidence.
Audit data analytics (ADAs) allow auditors to analyze the entire population of transactions, which has measurable benefits for audit quality. However, auditors caution that ADAs do not incrementally increase the level of assurance on the financial statements. We examine whether the testing methodology and the type of ICFR opinion issued affect jurors' perceptions of auditor negligence. We predict and find that when auditors issue an unqualified ICFR opinion, jurors make higher negligence assessments when auditors employ statistical sampling than when they employ ADAs. Further, when auditors issue an adverse ICFR opinion, jurors attribute less blame to auditors and more blame to the investor for an audit failure. Additionally, jurors perceive the use of ADAs as an indicator of higher audit quality and are less likely to find auditors negligent. However, jurors do not perceive a difference in the level of assurance provided when auditors use ADAs versus sampling as a testing method.
SUMMARY: On December 17, 2015, the International Auditing and Assurance Standards Board (IAASB) issued an Invitation to Comment entitled Enhancing Audit Quality in the Public Interest: A Focus on Professional Skepticism, Quality Control and Group Audits (hereafter, the ITC). The ITC highlights the IAASB's discussions regarding three separate but related topics: professional skepticism, quality control, and group audits, in order to solicit feedback on these topics from various stakeholders. The ITC also discusses potential standard-setting activities the IAASB could undertake to enhance audit quality. The comment period ended on May 16, 2016. This commentary summarizes the contributors' views on selected questions posed in the ITC. Data Availability: The Invitation to Comment (as of May 23, 2016) is available at: https://www.ifac.org/system/files/publications/files/IAASB-Invitation-to-Comment-Enhancing-Audit-Quality.pdf
Two experiments examine the effects of visualizing uncertainty on attention, cognitive arousal, and incorporation of uncertainty information into judgments. The first experiment employs psychophysiological measurements to understand how different presentations of uncertainty information influence decision-making processes. Results indicate that participants attend more to uncertainty information when uncertainty is incorporated directly into a visualization. Pupillometry and eye tracking analyses indicate that participants exhibit greater attention to uncertainty information, fixate more on the bounds of uncertainty, and spend more time examining uncertainty information when uncertainty is visualized, compared to when uncertainty is depicted textually (i.e., not visually). In addition, the decisions of participants who view visualizations directly depicting uncertainty better integrate the level of uncertainty in the underlying data. The second experiment reveals that experienced auditors are more likely to appropriately use uncertainty information when it is visualized.
We investigate the effects of audit evidence in the form of big data visualizations on jurors' decisions. Using an experiment with mock juror participants (n = 582), the study examines how visualization design features and audit evidence reliability affect jurors' negligence assessments. We find evidence of interactive effects of visualization design and evidence reliability: mock jurors make higher negligence likelihood judgments when audit evidence reliability is higher and visualizations are more vivid. Mediation results indicate that the combination of more vivid visualizations and more reliable audit evidence produces stronger negative emotional responses toward the auditor defendant, and these negative emotional responses increase the likelihood of finding the auditor negligent. Overall, we find that data visualization techniques that can improve audit quality may expose auditors to increased litigation risk. Our study informs academics, auditors, and regulators about the potential effects of audit evidence visualization choices on lay evaluators' judgments.