2022
DOI: 10.1017/pds.2022.185
Generative Pre-Trained Transformer for Design Concept Generation: An Exploration

Abstract: Novel concepts are essential for design innovation and can be generated with the aid of data stimuli and computers. However, current generative design algorithms focus on diagrammatic or spatial concepts that are either too abstract to understand or too detailed for early phase design exploration. This paper explores the uses of generative pre-trained transformers (GPT) for natural language design concept generation. Our experiments involve the use of GPT-2 and GPT-3 for different creative reasonings in design…
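The abstract describes using GPT-2 and GPT-3 to generate natural-language design concepts. As a rough illustration of that general idea only (not the paper's actual pipeline), a minimal sketch using the Hugging Face transformers library might look like the following; the prompt wording and sampling parameters are assumptions.

```python
# Minimal sketch: prompting an off-the-shelf GPT-2 model for design-concept text.
# This illustrates the general technique, not the paper's actual setup; the prompt
# text and sampling settings below are assumptions.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Design problem: reduce plastic waste in food packaging. Design concept:"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=60,
    do_sample=True,          # sample rather than greedy-decode for more varied concepts
    top_p=0.92,
    temperature=0.9,
    num_return_sequences=3,
    pad_token_id=tokenizer.eos_token_id,
)

for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```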

Cited by 37 publications (16 citation statements)
References 38 publications
“…Zhu and Luo (2021) fine-tune GPT-2 to map design problems (including their categories) to solutions, using problem-solution data obtained from RedDot [54]. They also explore GPT-3's capability to support design-by-analogy, generating text descriptions when source- and target-domain labels are provided as inputs.…”
Section: Discussion (citation type: mentioning)
confidence: 99%
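The statement above refers to fine-tuning GPT-2 to map design problems to solutions. A minimal sketch of that kind of causal-LM fine-tuning with the transformers Trainer is given below; the example problem-solution pair, concatenation format, and hyperparameters are assumptions and do not reflect Zhu and Luo's (2021) actual RedDot data or training setup.

```python
# Minimal sketch of causal-LM fine-tuning on problem -> solution text pairs.
# Data format and hyperparameters are assumptions for illustration only.
import torch
from transformers import (GPT2LMHeadModel, GPT2Tokenizer,
                          Trainer, TrainingArguments)

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical problem-solution pair; real data would come from a design corpus.
pairs = [
    ("Category: mobility. Problem: last-mile delivery congestion.",
     "Solution: a foldable electric cargo trike shared between local couriers."),
]

class PairDataset(torch.utils.data.Dataset):
    def __init__(self, pairs):
        # Concatenate problem and solution into one training sequence.
        self.examples = [
            tokenizer(p + " " + s, truncation=True, max_length=256,
                      padding="max_length", return_tensors="pt")
            for p, s in pairs
        ]
    def __len__(self):
        return len(self.examples)
    def __getitem__(self, i):
        item = {k: v.squeeze(0) for k, v in self.examples[i].items()}
        item["labels"] = item["input_ids"].clone()  # standard causal-LM objective
        return item

args = TrainingArguments(output_dir="gpt2-problem-solution",
                         num_train_epochs=3, per_device_train_batch_size=2)
Trainer(model=model, args=args, train_dataset=PairDataset(pairs)).train()
```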
“…BERT only encodes, so an encoder alone is sufficient to build its language model. Pre-trained models such as BERT [14], GPT [17], and ELECTRA [18] have a powerful ability to learn and understand natural-language representations.…”
Section: Related Work: Pre-trained Language Models (citation type: mentioning)
confidence: 99%
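The statement above contrasts encoder-only pre-training (BERT) with other pre-trained language models. Purely to illustrate what an encoder-only representation looks like in practice, here is a minimal sketch of extracting a sentence embedding from a pre-trained BERT encoder; the mean-pooling choice is an assumption, not a method taken from any cited paper.

```python
# Minimal sketch: obtaining a fixed-size sentence representation from an
# encoder-only model (BERT). Mean pooling over token states is one common
# choice, used here as an assumption.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

sentence = "A lightweight drone concept for inspecting wind turbine blades."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    hidden = model(**inputs).last_hidden_state    # (1, seq_len, 768)

mask = inputs["attention_mask"].unsqueeze(-1)     # ignore padding positions
embedding = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(embedding.shape)                            # torch.Size([1, 768])
```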
“…Furthermore, the hybrid model has obtained greater accuracy. Recently, large language models based on transformers [26] have achieved remarkable success in learning language representations by exploiting vast amounts of data. Prominent examples of such models are OpenAI GPT [27], BERT [28], and RoBERTa [29]. These models have demonstrated exceptional performance when fine-tuned for various downstream tasks such as text classification, question answering, and natural language inference.…”
Section: Mandal and Sen (citation type: mentioning)
confidence: 99%
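The statement above notes that pre-trained transformers such as GPT, BERT, and RoBERTa are fine-tuned for downstream tasks like text classification. A minimal sketch of that fine-tuning pattern with a sequence-classification head follows; the checkpoint, example texts, and labels are illustrative assumptions.

```python
# Minimal sketch: a pre-trained transformer with a sequence-classification head,
# the downstream fine-tuning pattern mentioned above. Checkpoint, texts, and
# labels are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained("roberta-base",
                                                           num_labels=2)

texts = ["This concept is feasible with current materials.",
         "The proposal ignores basic material constraints."]
labels = torch.tensor([1, 0])   # hypothetical binary labels

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
outputs = model(**batch, labels=labels)   # returns loss and per-class logits

# One gradient step of fine-tuning; in practice this runs over a full dataset.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs.loss.backward()
optimizer.step()
print(outputs.logits.shape)     # torch.Size([2, 2])
```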