Scientific Reasoning and Argumentation (2018)
DOI: 10.4324/9780203731826-1

The Roles of Domain-Specific and Domain-General Knowledge in Scientific Reasoning and Argumentation

Cited by 6 publications (7 citation statements). References 1 publication.

“…A digital platform (Mindomo) was used for the presentation of the arguments. In line with previous studies (Katharina et al., 2018; Valero Haro et al., 2019a) the students learned how to construct an argument before engaging in a computer-supported collaborative argumentation platform. The control group received assignments that included reading academic material and addressing related questions.…”
Section: Activity Design
Citation type: supporting
confidence: 81%
“…Based on this preliminary validation of the U.S. assessment in Germany, we modified the conceptual framework (Section Theoretical Components of COR) to accommodate for the close relationship between COR and generic critical thinking, multiple-source comprehension, scientific reasoning and informal argumentation approaches (Walton, 2006; Fischer et al., 2014, 2018; Goldman and Brand-Gruwel, 2018; Jahn and Kenner, 2018), and expanded the U.S. assessment framework to cover all online sources that students use for learning. We developed the scoring rubrics accordingly to validly measure the critical online reasoning (COR) ability of higher education students of all degree programs in Germany in accordance with our construct definition (Section Construct Definition of Critical Online Reasoning).…”
Section: Critical Online Reasoning Assessment (CORA)
Citation type: mentioning
confidence: 99%
“…We focus here on a more general set of CQs, but these various approaches are complementary. Also, scientific reasoning has both domain-general and domain-specific components (Engelmann et al., 2018), and the CQMAA is not meant to downplay the importance of domain knowledge, as domain knowledge is needed to address the CQs. The CQMAA is therefore complementary to other tools that help relinquish teacher control, such as texts, student experiments, or evidence cards, which are meant to enhance domain knowledge.…”
Section: Completeness
Citation type: mentioning
confidence: 99%