This meta-analysis has two aims: (a) to address the main effects of problem-based learning (PBL) on two categories of outcomes, knowledge and skills; and (b) to address potential moderators of the effect of PBL. We selected 43 articles that met the criteria for inclusion: empirical studies on problem-based learning in tertiary education conducted in real-life classrooms. The review reveals a robust positive effect of PBL on the skills of students. This is shown by the vote count as well as by the combined effect size, and no single study reported negative effects. A tendency towards negative results is discerned when considering the effect of PBL on the knowledge of students. The combined effect size is significantly negative; however, this result is strongly influenced by two studies, and the vote count does not reach a significant level. It is concluded that the combined effect size for the effect on knowledge is non-robust. As possible moderators of PBL effects, methodological factors, the expertise level of students, the retention period and the type of assessment method were investigated. This moderator analysis shows that, for both knowledge- and skills-related outcomes, the expertise level of the student is associated with the variation in effect sizes. Nevertheless, the results for skills give a consistently positive picture. For knowledge-related outcomes, the results suggest that the differences encountered in the first and second year disappear later on. A last remarkable finding, related to the retention period, is that students in PBL gained slightly less knowledge but retained more of the knowledge they acquired.
This meta-analysis investigated the influence of assessment on the reported effects of problem-based learning (PBL) by applying Sugrue's (1995) model of cognitive components of problem solving. Three levels of the knowledge structure that can be targeted by assessment of problem solving are used as the main independent variables: (a) understanding of concepts, (b) understanding of the principles that link concepts, and (c) linking of concepts and principles to conditions and procedures for application. PBL had the most positive effects when the focal constructs being assessed were at the level of understanding the principles that link concepts. The results suggest that the implications of assessment must be considered in examining the effects of problem-based learning, and probably in all comparative education research. KEYWORDS: assessment, meta-analysis, problem-based learning.
In problem-based learning (PBL), implemented worldwide, students learn by discussing professionally relevant problems, which enhances the application and integration of knowledge and is assumed to encourage a deep approach to learning, in which students are intrinsically interested and try to understand what is being studied. This review investigates: (1) the effects of PBL on students' deep and surface approaches to learning; and (2) whether and why these effects differ across (a) the context of the learning environment (single-course vs. curriculum-wide implementation) and (b) study quality. We searched for studies dealing with PBL and students' approaches to learning; twenty-one studies were included. The results indicate that PBL does enhance deep learning, with a small positive average effect size of .11 and a positive effect in eleven of the 21 studies. Four studies show a decrease in deep learning and six studies show no effect. PBL does not seem to have an effect on surface learning, as indicated by a very small average effect size (.08) and eleven studies showing no increase in the surface approach. Six studies demonstrate a decrease and four an increase in surface learning. It is concluded that PBL does seem to enhance deep learning and has little effect on surface learning, although more longitudinal research using high-quality measurement instruments is needed to support this conclusion with stronger evidence. The differences cannot be explained by study quality, but a curriculum-wide implementation of PBL has a more positive impact on the deep approach (effect size .18) than an implementation within a single course (effect size −.05). PBL is assumed to enhance active learning and students' intrinsic motivation, which in turn enhances deep learning. A high perceived workload, and assessment that is perceived as not rewarding deep learning, are assumed to enhance surface learning.
The purpose of this paper is to gain insight into the relationships between hands-on experience with formative assessment, students' assessment preferences and their approaches to learning. The sample consisted of 108 first-year university Bachelor's students studying criminology. Data were obtained using the Revised two-factor Study Process Questionnaire (R-SPQ-2F) and the Assessment Preferences Inventory (API). The study shows that differences in assessment preferences are correlated with differences in approaches to learning. Students' preferences for assessment methods with higher-order thinking tasks were significantly lower after actual experience with a formative assessment. Moreover, students also changed their approaches to learning after hands-on experience with a formative mode of assessment. Surprisingly, this change was towards a more surface approach to learning. Nevertheless, this is in line with other recent research findings. The paper ends with some possible explanations, and new directions for research are proposed.
The purpose of the present study is to gain more insight into the relationship between students' approaches to learning and their quantitative learning outcomes, as a function of the different components of problem-solving measured within the assessment. Data were obtained from two sources: the Revised two-factor Study Process Questionnaire (R-SPQ-2F) and students' scores on their final multiple-choice exam. Using a model of cognitive components of problem-solving translated into specifications for assessment, the multiple-choice questions were divided into three categories, corresponding to three aspects of the knowledge structure that can be targeted by assessment of problem-solving: understanding of concepts; understanding of the principles that link concepts; and linking of concepts and principles to application conditions and procedures. The 133 second-year law students in our sample had slightly higher scores for the deep approach than for the surface approach to learning. Plotting students' approaches to learning indicated that many students had low scores for both the deep and the surface approach. Correlational analysis showed no relationship between students' approaches to learning and the components of problem-solving measured within the multiple-choice assessment. Several explanations are discussed.