Proceedings of the 2023 Conference on Innovation and Technology in Computer Science Education V. 2 (ITiCSE 2023)
DOI: 10.1145/3587103.3594206

Transformed by Transformers: Navigating the AI Coding Revolution for Computing Education: An ITiCSE Working Group Conducted by Humans

Abstract: The recent advent of highly accurate and scalable large language models (LLMs) has taken the world by storm. From art to essays to computer code, LLMs are producing novel content that until recently was thought only humans could produce. Recent work in computing education has sought to understand the capabilities of LLMs for solving tasks such as writing code, explaining code, …
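The abstract mentions tasks such as "explaining code". As a purely illustrative sketch (not taken from the working group report), the snippet below shows how such a task might be posed to an LLM via the OpenAI Python SDK; the model name, prompt wording, and example function are all assumptions.

```python
# Illustrative sketch of the "explain code" task the abstract describes.
# Assumes the OpenAI Python SDK (openai >= 1.0) and an OPENAI_API_KEY
# environment variable; the model name is an assumption.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

snippet = """
def mystery(xs):
    return [x for x in xs if x % 2 == 0]
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any chat-capable model works
    messages=[
        {
            "role": "user",
            "content": f"Explain what this Python function does:\n{snippet}",
        },
    ],
)
print(response.choices[0].message.content)
```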

Cited by 23 publications (8 citation statements)
References 11 publications (10 reference statements)
“…The GLLMMs can also generate creative content such as artwork, music, poetry, etc. [11] Because they are also typically pretrained on very large data sets, GLLMMs can be quickly applied to other fields or tasks without requiring training for the agent from scratch, so they are much more quickly deployable to specific tasks. The kinds of tasks to which GLLMMs can be applied include pre-training in content creation (such as text, audio tracks, images, or videos) that become part of a larger undertaking such as designing art, composing music, or telling a story.…”
Section: A Brief Primer on the Concepts and Nomenclature of AI
confidence: 99%
“…Furthermore, as suggested by some of our interviewees, identifying core elements everyone should be learning and starting to characterise a subject-specific pedagogy would significantly contribute to the acceptance of CE in the context(s) of undergraduate CS programmes. Perhaps these types of changes will be increasingly feasible and tractable in a world in which we are trying to better understand the emerging impact and ramifications of generative AI [41,91] as part of a post-COVID "new normal" [23,27], where there is a renewed focus on high-quality learning, teaching and assessment, and specifically what this means for effective pedagogy and practice in CS and CE [106,107]. Ultimately, we hope that an open international community of practice will form around this work.…”
Section: Conclusion and Next Steps
confidence: 99%
“…The advent of large language models (LLMs) that can generate code is having a rapid and significant impact on computing education practice, particularly at the introductory level [24]. Traditional pedagogical approaches have focused on helping students learn how to write code.…”
Section: Introduction
confidence: 99%