2021
DOI: 10.2139/ssrn.3971286

Compensation Disclosure: A Study via Semantic Similarity


Cited by 2 publications (1 citation statement)
References 45 publications
“…The authors compare their model with seven baseline models and report that their model produces the highest accuracy of document classification. Gaulin and Peng (2021) use the Doc2Vec word embedding model, which extends Word2Vec by adding document vectors to generate a vector representation for each document. The authors use these vectors as inputs to the cosine similarity model to capture semantic similarity of executive compensation disclosures across firms.…”
Section: Word Embedding Methods (mentioning)
confidence: 99%
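
The statement above outlines a concrete pipeline: train Doc2Vec to obtain one vector per document, then compare documents by the cosine of their vectors. The snippet below is a minimal sketch of that idea, assuming gensim's Doc2Vec implementation; the toy disclosure snippets, tags, and hyperparameters are illustrative placeholders, not the corpus or settings used by Gaulin and Peng (2021).

# A minimal sketch of the described pipeline, assuming gensim's Doc2Vec.
# The toy disclosure snippets and hyperparameters below are illustrative
# placeholders, not the authors' corpus or settings.
from gensim.models.doc2vec import Doc2Vec, TaggedDocument
import numpy as np

# Toy stand-ins for executive compensation disclosure text from two firms.
disclosures = [
    "base salary and annual bonus tied to earnings per share targets",
    "performance stock units vest based on relative total shareholder return",
]

# Doc2Vec extends Word2Vec with a per-document vector, so each disclosure gets a tag.
tagged = [TaggedDocument(words=text.split(), tags=[str(i)])
          for i, text in enumerate(disclosures)]

# Train the embedding model (vector_size, window, epochs are placeholder values).
model = Doc2Vec(tagged, vector_size=100, window=5, min_count=1, epochs=40)

# Retrieve the learned document vectors and compare them with cosine similarity.
vec_a, vec_b = model.dv["0"], model.dv["1"]
cosine = np.dot(vec_a, vec_b) / (np.linalg.norm(vec_a) * np.linalg.norm(vec_b))
print(f"cosine similarity between the two disclosures: {cosine:.3f}")

gensim's built-in model.dv.similarity("0", "1") returns the same value; the explicit formula is written out only to mirror the description in the citation statement.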