2023
DOI: 10.1111/medu.15067

Whose problem is it anyway? Confronting myths of ‘problems’ in health professions education

Abstract: Introduction: The growing interest in knowledge translation and implementation science, both in clinical practice and in health professions education (HPE), is reflected in the number of studies that have sought to address what are believed to be evidence-practice gaps. Though this effort may be intended to ensure practice improvements are better aligned with research evidence, there is a common assumption that the problems researchers explore and the answers they generate are meaningful and applicable to pract…

Cited by 1 publication (1 citation statement) · References 37 publications
“…To achieve this, prompt engineering, parameter fine-tuning, and reinforcement learning from human feedback (RLHF) techniques are used to meet specific needs [4]. Prompt engineering is the process of designing effective prompts or inputs for language models to generate desired outputs, particularly for those based on the Transformer architecture, such as GPT (Generative Pre-trained Transformer) models [19]. Language models like GPT are trained to generate text based on the input they receive, making the quality and specificity of the input…”
Section: Phases of Generative AI Project Life Cycle (citation type: mentioning)
confidence: 99%
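The quoted statement makes a simple technical point: GPT-style models generate text conditioned on their input, so the quality and specificity of the prompt shape the output. The sketch below is a minimal illustration of that idea only; the `call_llm` stub, the prompt wording, and the constraints are hypothetical placeholders rather than an API or method from the cited papers.

```python
# Minimal illustration of prompt engineering for a GPT-style model.
# `call_llm` is a hypothetical stub standing in for any text-generation API;
# it is not a real library call.

def call_llm(prompt: str) -> str:
    """Placeholder for a request to a generative language model."""
    return f"[model output for a {len(prompt)}-character prompt]"

# A vague prompt leaves the model to guess the audience, scope and format.
vague_prompt = "Explain knowledge translation."

# An engineered prompt adds a role, context, constraints and an output format,
# which is the kind of input specificity the cited statement refers to.
engineered_prompt = (
    "You are a health professions education researcher.\n"
    "Task: explain 'knowledge translation' to first-year medical educators.\n"
    "Constraints: at most 150 words; define any jargon in one line.\n"
    "Format: a short paragraph followed by three bullet-point takeaways."
)

if __name__ == "__main__":
    for name, prompt in [("vague", vague_prompt), ("engineered", engineered_prompt)]:
        print(f"--- {name} ---")
        print(call_llm(prompt))
```

In practice the same contrast applies whichever model or interface is used: the engineered version fixes the audience, length and structure up front, so less is left for the model to infer.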