Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence 2020
DOI: 10.24963/ijcai.2020/381
Toward a neuro-inspired creative decoder

Abstract: Creativity, a process that generates novel and meaningful ideas, involves increased association between task-positive (control) and task-negative (default) networks in the human brain. Inspired by this seminal finding, in this study we propose a creative decoder within a deep generative framework, which involves direct modulation of the neuronal activation pattern after sampling from the learned latent space. The proposed approach is fully unsupervised and can be used off-the-shelf. Several novelty me…
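The decoding step the abstract describes — sample a latent vector, then directly modulate the hidden activation pattern before producing the output — can be sketched as follows. This is a minimal illustration, not the paper's trained model: the weights, layer sizes, boosted units, and `gain` parameter are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins for a trained VAE decoder's weights.
W1 = rng.normal(size=(8, 32)) * 0.5    # latent (8-d) -> hidden (32 units)
W2 = rng.normal(size=(32, 784)) * 0.5  # hidden -> output (e.g. a 28x28 image)

def decode(z, boost_units=None, gain=3.0):
    """Decode a latent sample, optionally modulating hidden activations.

    boost_units: indices of hidden neurons whose post-nonlinearity
    activation is amplified -- a hypothetical simplification of the
    paper's direct activation modulation.
    """
    h = np.tanh(z @ W1)                      # hidden activations
    if boost_units is not None:
        h = h.copy()
        h[boost_units] *= gain               # direct activation modulation
    return 1.0 / (1.0 + np.exp(-(h @ W2)))   # sigmoid pixel intensities

z = rng.normal(size=8)                       # sample from the latent prior
baseline = decode(z)
creative = decode(z, boost_units=[0, 5, 9])
```

Because the modulation happens after sampling, the same latent code yields a family of output variants; sweeping `gain` or the boosted subset trades fidelity against novelty.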

Cited by 3 publications (7 citation statements)
References 7 publications
“…We test this hypothesis through group-based subset scanning over the activation space that encodes groups of artifacts that may appear anomalous when analyzed together. Given that the proposed approach is model agnostic, we test it under two generative models, a creative VAE Decoder and a Creative Generator variant of ArtGAN architectures [Das et al., 2020; Tan et al., 2019]. We scan both the pixel and activation space.…”
Section: Methods
confidence: 99%
“…We scan both the pixel and activation space. We also validate the proposed approach across multiple datasets, including images from MNIST, Fashion-MNIST (FMNIST) [Xiao et al., 2017], Combo [Das et al., 2020], which is a combination of the previous two datasets, and WikiArt (with ArtGAN) [Tan et al., 2019].…”
Section: Methods
confidence: 99%
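The group-based subset scanning the citing work describes can be illustrated with a small nonparametric scan. This is a generic sketch under stated assumptions, not the cited implementation: each node gets an empirical p-value against a background set of "normal" activations, and — by the linear-time subset scanning property — only prefixes of the p-value-sorted nodes need to be scored (here with a Berk-Jones-style statistic), rather than all 2^n subsets.

```python
import numpy as np

def scan_activations(acts, background, alpha_max=0.5):
    """Find the most anomalous subset of nodes via subset scanning.

    acts:       (n,) activations for one generated artifact
    background: (m, n) activations from reference ("normal") samples

    Generic sketch: one-sided empirical p-values per node, then score
    only p-value-sorted prefixes with a Berk-Jones style scan statistic.
    """
    m, n = background.shape
    # empirical p-value: fraction of background exceeding the observation,
    # clamped away from zero so the log below stays finite
    pvals = np.maximum((background >= acts[None, :]).mean(axis=0),
                       1.0 / (m + 1))
    order = np.argsort(pvals)
    best_score, best_subset = -np.inf, order[:1]
    for k in range(1, n + 1):
        p = pvals[order[k - 1]]
        if p >= alpha_max:
            break
        obs = k / n                          # fraction of nodes in subset
        score = obs * np.log(obs / p)        # Berk-Jones KL divergence
        if obs < 1.0:
            score += (1 - obs) * np.log((1 - obs) / (1 - p))
        score *= n
        if score > best_score:
            best_score, best_subset = score, order[:k]
    return best_score, best_subset

# Toy usage: two nodes (3 and 7) fire far above the background distribution.
rng = np.random.default_rng(1)
background = rng.normal(size=(500, 20))
acts = rng.normal(size=20)
acts[[3, 7]] = 5.0
score, subset = scan_activations(acts, background)
```

The same routine applies unchanged to pixel space or to any hidden layer's activations, which is what makes the approach model-agnostic in the quoted sense.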