2017
DOI: 10.19030/jaese.v4i1.9973
Development Of The EGGS Exam Of GeoloGy Standards To Measure Students’ Understanding Of Common Geology Concepts

Abstract: Geoscience education researchers have considerable need for criterion-referenced, easy-to-administer, easy-to-score conceptual surveys for undergraduates taking introductory science survey courses, so that faculty can monitor the learning impacts of innovative teaching. In response, this study establishes the reliability and validity of a 28-item, multiple-choice, pre- and post- EGGS Exam of GeoloGy Standards. EGGS addresses 11 concepts derived from a systematic analysis of the overlapping ideas from natio…

Cited by 7 publications (5 citation statements) · References 49 publications · Citing publications span 2018–2022
“…Thus, geologists interpret these features and make hypotheses about their formation in a geological framework. Despite being central practices in geology, research demonstrates that rock classification is challenging for students (Dove, 1996; Francek, 2013; Frøyland, Remmen, & Sørvik, 2016; Guffey, Slater, & Slater, 2017; Happs, 1982; Kusnick, 2002; Stofflett, 1993, 1994). Instead of observing features of rocks, students memorise rocks they have seen earlier, leading to a 'namedropping' strategy that does not align with scientific practice (Dove, 1998; Frøyland et al., 2016).…”
Section: Research On Students' Use Of Observation When Classifying Rocks
confidence: 99%
“…These examples demonstrate that the incorrect use of observation (e.g. noticing ambiguous features or features that are irrelevant in the situation) can lead to incorrect rock classification (Dove, 1998; Ford, 2005; Frøyland et al., 2016; Guffey et al., 2017; Happs, 1982; Kortz & Murray, 2009). Connecting features to scientific theories of rock formation is another challenge for students, and a number of non-scientific and erroneous ideas about rock formation have been identified (Kusnick, 2002; Stofflett, 1993, 1994).…”
Section: Research On Students' Use Of Observation When Classifying Rocks
confidence: 99%
“…In response, Slater (2014) developed the Test Of Astronomy STandards, TOAST. Before developing test questions for the TOAST, the team needed to identify the core concepts that students were expected to know and conceptually understand (Guffey et al., 2017). At the same time, the test needed to be short and manageable.…”
Section: Background And Context
confidence: 99%
“…The development of the CFA drew on the development methods of several other instruments: the Science Teaching through its Astronomical Roots (STAR) instrument (Sadler, 1998), the Astronomy Diagnostic Test (ADT; Zeilik, 2002), the Test Of Astronomy STandards (TOAST; Slater, 2014), and the Exam of GeoloGy Standards (Guffey, Slater, & Slater, 2017). The validation and reliability of the CFA are discussed in a previous paper (Pundak, 2016).…”
Section: The CFA Assessment Instrument
confidence: 99%