2016
DOI: 10.1109/taslp.2016.2531286
Exploiting Turn-Taking Temporal Evolution for Personality Trait Perception in Dyadic Conversations

Cited by 22 publications (12 citation statements)
References 29 publications
“…With the wider adoption of deep learning techniques and the growing availability of data and computing resources, novel studies emerged. In [31], Su et al. used LIWC to produce grammatical annotations and applied them to dialogue transcripts. Deep learning methods translate text into a vector space by computing the distribution of words within textual documents.…”
Section: Lexical Hypothesis and NLP in Personality Estimation
confidence: 99%
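As an illustration of that last point, here is a minimal sketch of learning a vector-space representation of words from their distribution in documents, using gensim's Word2Vec on a hypothetical toy corpus of dialogue turns; the corpus and dimensions are placeholders, not data from the cited work.

```python
# Minimal sketch: words mapped to dense vectors based on their
# distribution in a (hypothetical) corpus of tokenized dialogue turns.
from gensim.models import Word2Vec

turns = [
    ["i", "really", "enjoyed", "the", "concert", "last", "night"],
    ["the", "concert", "was", "a", "bit", "loud", "last", "night"],
]

model = Word2Vec(turns, vector_size=50, window=2, min_count=1, seed=0)
print(model.wv["concert"].shape)  # (50,) -- each word becomes a dense vector
```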
“…It was only in recent years that researchers moved towards deep learning. Kalghatgi et al. [28] and Su et al. [29] fed neural networks a number of meticulously hand-crafted features, the former concerning syntax and social behavior, the latter grammar and LIWC annotations extracted from dialogue. Majumder et al. [30] use a CNN to derive a fixed-length feature vector from word2vec word embeddings [31], which they extend with eighty-four additional features from Mairesse's library [32].…”
Section: Related Work
confidence: 99%
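A hedged sketch of the fusion pattern that statement attributes to Majumder et al.: a CNN-derived fixed-length sentence vector concatenated with hand-crafted features before classification. The eighty-four-feature count follows the statement; the embeddings and feature values below are random placeholders, and this PyTorch snippet is illustrative, not the authors' implementation.

```python
# Sketch: CNN over word embeddings, max-pooled to a fixed-length vector,
# concatenated with hand-crafted (Mairesse-style) features.
import torch
import torch.nn as nn

EMBED_DIM, SEQ_LEN, N_FILTERS, N_HANDCRAFTED = 300, 50, 128, 84

conv = nn.Conv1d(EMBED_DIM, N_FILTERS, kernel_size=3)
classifier = nn.Linear(N_FILTERS + N_HANDCRAFTED, 2)  # binary trait label

# Hypothetical inputs: pretrained embeddings for one document (channels-first)
# plus its hand-crafted feature vector.
embeddings = torch.randn(1, EMBED_DIM, SEQ_LEN)
handcrafted = torch.randn(1, N_HANDCRAFTED)

sentence_vec = conv(embeddings).max(dim=2).values   # max-pool over time
logits = classifier(torch.cat([sentence_vec, handcrafted], dim=1))
print(logits.shape)  # torch.Size([1, 2])
```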
“…Recurrent networks are well suited to capturing language dependencies. [92] make use of vanilla RNNs for modeling short-term temporal evolution in conversation. A 255-dimensional linguistic feature vector is extracted per turn, and a coupled Hidden Markov Model (C-HMM) is employed to detect the personalities of the two speakers across speaker turns in each dialog, exploiting long-term turn-taking temporal evolution and cross-speaker contextual information.…”
Section: Text
confidence: 99%
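To make the temporal-modeling idea in that statement concrete, here is a minimal sketch of a vanilla RNN consuming one 255-dimensional linguistic feature vector per speaker turn, so the hidden state accumulates turn-taking context across the dialog. It illustrates the idea only: it is not the paper's coupled HMM, and the five-trait output (assuming Big Five labels) and all inputs are hypothetical.

```python
# Sketch: an RNN over a sequence of per-turn linguistic feature vectors;
# the final hidden state summarizes the dialog for trait prediction.
import torch
import torch.nn as nn

FEAT_DIM, HIDDEN, N_TURNS, N_TRAITS = 255, 64, 10, 5  # 5 = assumed Big Five

rnn = nn.RNN(FEAT_DIM, HIDDEN, batch_first=True)
head = nn.Linear(HIDDEN, N_TRAITS)

dialog = torch.randn(1, N_TURNS, FEAT_DIM)  # one dialog, 10 turns (random)
_, h_last = rnn(dialog)                     # final hidden state
trait_scores = head(h_last.squeeze(0))
print(trait_scores.shape)  # torch.Size([1, 5])
```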