2021
DOI: 10.48550/arxiv.2109.09605
Preprint
JobBERT: Understanding Job Titles through Skills

Abstract: Job titles form a cornerstone of today's human resources (HR) processes. Within online recruitment, they allow candidates to understand the contents of a vacancy at a glance, while internal HR departments use them to organize and structure many of their processes. As job titles are a compact, convenient, and readily available data source, modeling them with high accuracy can greatly benefit many HR tech applications. In this paper, we propose a neural representation model for job titles, by augmenting a pre-tr…
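The abstract's core idea, augmenting title representations with skill co-occurrence information from postings, can be illustrated with a minimal, self-contained sketch. This is not the JobBERT implementation (which builds on a pre-trained language model); it only shows the underlying intuition that titles sharing extracted skill labels should end up close together. All postings and skills below are invented for demonstration.

```python
# Illustrative sketch, not the JobBERT model: represent each job title by
# the skills it co-occurs with across postings, then compare titles by the
# cosine similarity of their skill-count vectors.
from collections import Counter
from math import sqrt

# Hypothetical (title, extracted skill labels) pairs from job postings.
postings = [
    ("software engineer", ["python", "git", "testing"]),
    ("software developer", ["python", "git", "ci"]),
    ("data scientist", ["python", "statistics", "ml"]),
    ("hr manager", ["recruiting", "payroll"]),
]

def skill_profile(title: str) -> Counter:
    """Aggregate skill co-occurrence counts for one job title."""
    profile = Counter()
    for t, skills in postings:
        if t == title:
            profile.update(skills)
    return profile

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse skill-count vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

sim_dev = cosine(skill_profile("software engineer"),
                 skill_profile("software developer"))
sim_hr = cosine(skill_profile("software engineer"),
                skill_profile("hr manager"))
print(sim_dev > sim_hr)  # → True: titles sharing skills score higher
```

Under this toy setup, "software engineer" and "software developer" overlap on two of three skills (similarity 2/3), while the HR title shares none (similarity 0), which is the kind of signal the paper's skill-label supervision injects into the title encoder.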

Cited by 6 publications (9 citation statements) · References 6 publications (12 reference statements)
“…Furthermore, Decorte et al. [17] proposed a neural representation model for job titles, called JobBERT, by supplementing the pre-trained language model with co-occurrence information from skill labels extracted from job postings. The model leads to significant improvements over the use of generic sentence encoders for the job title normalization task, for which we publish a new evaluation benchmark.…”
Section: Related Work
confidence: 99%
“…In addition, they presented an encoder-decoder-based fusion method to obtain a unified representation from the multi-view representations. JobBERT is the latest effort in this trajectory: the authors of [39] presented a neural representation model for job titles, augmenting a pre-trained language model with co-occurrence information from skill labels extracted from job vacancies.…”
Section: B. Representation Learning
confidence: 99%
“…Although job titles generally follow a certain structure [1], they are often freely worded and can contain some degree of bias. Some work has been carried out aiming to normalize job titles and relate their free form to the standardized job title form [1], [4], [5].…”
Section: Introduction
confidence: 99%
“…However, [4] points out that only 65% of job titles can be linked to their corresponding normalized taxonomy entry without ambiguity. This leaves 35% of job ads for which it is of utmost importance to analyze the requirements specified in the job description section in order to map the job to the taxonomy.…”
Section: Introduction
confidence: 99%
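The two-stage mapping this passage implies, link by title when unambiguous, otherwise fall back to requirements from the job description, can be sketched as follows. The taxonomy, titles, and skill sets here are hypothetical; the real taxonomy linking in [4] is more involved.

```python
# Hedged sketch of two-stage job-ad-to-taxonomy mapping: try a direct
# title match first; when the title is ambiguous or unmatched, resolve
# using skills extracted from the job description. Toy data throughout.

TAXONOMY = {
    "software developer": {"python", "git"},
    "web developer": {"javascript", "css"},
    "payroll clerk": {"payroll", "excel"},
}

def link_job(raw_title, description_skills):
    """Return the taxonomy entry for a job ad, or None if unresolvable."""
    title = raw_title.lower().strip()
    # Stage 1: direct title match (substring in either direction).
    direct = [t for t in TAXONOMY if t in title or title in t]
    if len(direct) == 1:
        return direct[0]  # unambiguous, the ~65% case
    # Stage 2: the ambiguous remainder, resolved via skill overlap
    # between the description and each candidate entry.
    candidates = direct or list(TAXONOMY)
    best = max(candidates,
               key=lambda t: len(TAXONOMY[t] & description_skills))
    return best if TAXONOMY[best] & description_skills else None

print(link_job("Senior Software Developer (m/f)", set()))
print(link_job("Developer", {"javascript", "css"}))
```

Here "Senior Software Developer (m/f)" resolves in stage 1, while the bare title "Developer" matches two taxonomy entries and needs the description skills to pick "web developer", mirroring the 65%/35% split the citing paper describes.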