We report two experiments that investigated the regulation of memory accuracy with a new regulatory mechanism: the plurality option. This mechanism is closely related to the grain-size option but involves control over the number of alternatives contained in an answer rather than the quantitative boundaries of a single answer. Participants were presented with a slideshow depicting a robbery (Experiment 1) or a murder (Experiment 2), and their memory was tested with five-alternative multiple-choice questions. For each question, participants were asked to generate two answers: a single answer consisting of one alternative and a plural answer consisting of the single answer and two other alternatives. Each answer was rated for confidence (Experiment 1) or for the likelihood of being correct (Experiment 2), and one of the answers was selected for reporting. Results showed that participants used the plurality option to regulate accuracy, selecting single answers when their accuracy and confidence were high, but opting for plural answers when they were low. Although accuracy was higher for selected plural than for selected single answers, the opposite pattern was evident for confidence or likelihood ratings. This dissociation between confidence and accuracy for selected answers was the result of marked overconfidence in single answers coupled with underconfidence in plural answers. We hypothesize that these results can be attributed to overly dichotomous metacognitive beliefs about personal knowledge states that cause subjective confidence to be extreme.
The confidence-accuracy relationship has primarily been studied through recognition tests and correlation analysis. However, cued recall is more ecologically valid from a forensic perspective. Moreover, there may be more informative ways of analysing the confidence-accuracy relationship than correlations. In the present study, participants viewed a video of a bank robbery and were asked cued recall questions covering both general knowledge and the video itself. Confidence ratings were collected, and correlation, calibration, and discrimination measures were calculated. All measures indicated a strong confidence-accuracy relationship that was better for general knowledge questions than for eyewitness memory questions. However, there were no differences in confidence ratings for correct answers, suggesting that the differences could be limited to the evaluation of incorrect answers. We concluded that confidence may be a good marker for accuracy with cued recall, but that further research using ecological tests and more informative data analysis techniques is needed.

Confidence in answers has been studied as a possible predictor of actual performance. Research on the confidence-accuracy relationship has been conducted using both general knowledge and eyewitness memory questions. Studies involving general knowledge questions have primarily been aimed at testing hypotheses and theories about how metamemory judgments work, whereas studies posing eyewitness memory questions have focused on application, based on the hypothesis that in a forensic context, confidence in answers helps distinguish between correct and incorrect information in a police interrogation.

However, research conducted to date on the confidence-accuracy relationship has some important shortcomings. First, it has primarily made use of recognition memory tests (e.g.
Two experiments are reported that investigate the impact of misinformation on memory accuracy and metacognitive resolution. In Experiment 1, participants viewed a series of photographs depicting a crime scene, were exposed to misinformation that contradicted details in the slides, and later took a recognition memory test. For each answer, participants were required to indicate whether they were willing to testify to (report) their answer in court and to rate their confidence. Misinformation impaired memory accuracy but had no effect on resolution, regardless of whether resolution was indexed with confidence-rating measures (γ correlation and mean confidence) or a report-option measure (type-2 discrimination: d′). In Experiment 2, a similar accuracy-confidence dissociation was found, and the misinformation effect occurred mostly with fine-grained responses, suggesting that responding was based on recollected details. We argue that the results support source-monitoring (SM) accounts of accuracy and resolution rather than accounts based on trace strength. Copyright © 2010 John Wiley & Sons, Ltd.
Research on conversational exchanges shows that people attempt to optimise the relevance of their responses when they know the correct answer with certainty (e.g., "What time is it?"). However, such certainty is often unavailable, even though speakers may still be under social pressure to provide an answer. We investigated how social context influences the level of informativeness when answering questions under uncertainty. In three experiments, participants answered difficult general-knowledge questions placed in different social contexts (formal vs. informal). Participants generated their answers, were then presented with a given context, and decided on the number of alternative responses they wanted to provide (single, with one alternative, vs. plural, with several alternatives) and whether the answer should be reported or withheld (report option). Participants reported more answers in the informal context. In the formal context, single answers were preferred, and they were more frequently reported. We conclude that social context influences the level of informativeness in a conversation, affecting achievable accuracy. Our results also show the joint influence of confidence and social context on willingness to share information.