A 3-step strategy is proposed for cognitive/information processing task analysis which may lead to practical procedures for task analysis and instructional design. The three steps are: (1) concept hierarchy analysis, (2) analysis of example sets to teach relations among concepts, and (3) analysis of problem sets to build a progressively larger schema for the problem space. The strategy avoids the extremely detailed information flow analysis performed in much descriptive research on human information processing. The research basis of the strategy is outlined by offering a descriptive model of human performance which identifies four dimensions: (1) Knowledge System, (2) Cognitive/Information Processing System, (3) Physiological System, and (4) Motivational/Emotional System. The Knowledge Dimension is analyzed as consisting of problem-solving tasks, because of the basic cognitive process of perception. Problem-solving is structured by expectancies, which are in turn structured by schemata. Hence, the strategy proposed for task analysis focuses on identifying the structure of the schema underlying a problem space. Implications of the strategy for design of instruction to teach the schema are discussed. Contrasts are drawn with conventional (Gagné-style) methods. Examples from aviation training are presented.
Patriot High School (PHS) adopted a remediation strategy to help its 10th-grade students at risk of failing the Math portion of MCAS, the state's end-of-year competency exam. The centerpiece of that strategy was a computer-based instructional (CBI) course. PHS used a commercially available CBI product to align the course content with the competencies covered on the MCAS exam. This case study examines the overall effectiveness of the PHS strategies and, in particular, the role of CBI. Participant MCAS scores and CBI performance (measured by module-mastery data) are analyzed, and an interview with the course instructor is summarized. Finally, PHS scores were compared to the overall state MCAS scores for the same years. Overall scores of all 10th graders increased significantly compared to their 8th-grade scores, and students who participated in the CBI course improved more than the students who did not. The passing rate at PHS improved from 40% in 1999 to 84% in 2001, compared to a statewide improvement from 47% to 75%. A significant correlation was identified between the MCAS scores and the program usage data, with student CBI module mastery correlated with higher MCAS scores. Overall, the instructor was positive about the impact of the course and believed that it gave many under-performers a chance to succeed when more traditional methods had failed. It seems likely that CBI contributed to PHS's success. Although we report herein on just one case, we argue that CBI might play an important role in the high-stakes test environment in the USA and elsewhere.
The Public Health Service policy, Animal Welfare Act regulations, and the Guide for the Care and Use of Laboratory Animals all require that institutions provide training for personnel engaged in animal research. Most research facilities have developed training programs to meet these requirements but may not have developed ways of assessing the effectiveness of these programs. Omission of this critical activity often leads to training that is ineffective, inefficient, or unnecessary. Evaluating the effectiveness of biomedical research and animal care training should involve a combination of assessments of performance, competence and knowledge, and appropriate tests for each type of knowledge, used at appropriate time intervals. In this article, the hierarchical relationship between performance, competence, and knowledge is described. The discussion of cognitive and psychomotor knowledge includes the important distinction between declarative and procedural knowledge. Measurement of performance is described and can include a variety of indirect and direct measurement techniques. Each measurement option has its own profile of strengths and weaknesses in terms of measurement validity, reliability, and costs of development and delivery. It is important to understand the tradeoffs associated with each measurement option, and to make appropriate choices of measurement strategy based on these tradeoffs arrayed against considerations of frequency, criticality, difficulty of learning, logistics, and budget. The article concludes with an example of how these measurement strategies can be combined into a cost-effective assessment plan for a biomedical research facility.