2018
DOI: 10.1109/access.2018.2869585
Query Intent Recognition Based on Multi-Class Features

Cited by 15 publications (9 citation statements). References 15 publications.
“…(survey rows: reference; challenge addressed; approach)
- (reference cut off at snippet start): adversarial training method for multi-task and multi-lingual joint modelling
- [Mohasseb et al 2018], grammar feature exploration: grammar-based framework with 3 main features
- [Xie et al 2018], short text; semantic feature expansion: semantic tag-empowered combined features
- [Qiu et al 2018], potential consciousness information mining: a similarity calculation method based on LSTM and a traditional machine learning method based on multi-feature extraction
- (reference missing in snippet), OOD utterances: multi-task learning
- [Cohan et al 2019], utilisation of naturally labelled data: multitask learning based on joint loss
- [Shridhar et al 2019], OOV issue; small/lack of labelled training data: subword semantic hashing
- (reference missing in snippet), learning of deep semantic information: hybrid CNN and bidirectional GRU neural network with pretrained embeddings (Char-CNN-BGRU)
- [Lin and Xu 2019], emerging intents detection: maximise inter-class variance and minimise intra-class variance to get the discriminative feature
- [Ren and Xue 2020], similar utterance with different intent: triples of samples used for training
- [Yilmaz and Toraman 2020], OOD utterances: KL divergence vector for classification
[Costello et al 2018] developed a novel multi-layer ensembling approach that ensembles both different model initialisations and different model architectures to determine how multi-layer ensembling improves performance on multilingual intent classification. They constructed a CNN with character-level embedding and a bidirectional CNN with attention mechanism.…”
Section: Paper (mentioning)
confidence: 99%
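The multi-layer ensembling described in the citation above can be illustrated with a brief sketch. This is not the code of [Costello et al 2018]; it is a minimal PyTorch example, assuming a toy character-level CNN and simple softmax averaging, showing how predictions from several initialisations (and, by extension, several architectures) could be combined.

```python
# A minimal sketch (not the authors' code) of the multi-layer ensembling idea:
# predictions from several random initialisations of an intent classifier are
# averaged. Model class, dimensions, and the averaging scheme are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CharCNNIntent(nn.Module):
    """Character-level CNN intent classifier (hypothetical configuration)."""
    def __init__(self, n_chars=100, emb_dim=32, n_filters=64, n_intents=10):
        super().__init__()
        self.emb = nn.Embedding(n_chars, emb_dim)
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size=3, padding=1)
        self.fc = nn.Linear(n_filters, n_intents)

    def forward(self, char_ids):                # char_ids: (batch, seq_len)
        x = self.emb(char_ids).transpose(1, 2)  # (batch, emb_dim, seq_len)
        x = F.relu(self.conv(x)).max(dim=2).values
        return self.fc(x)                       # logits: (batch, n_intents)

def ensemble_predict(models, char_ids):
    """Average softmax outputs over all models in the ensemble."""
    with torch.no_grad():
        probs = torch.stack([F.softmax(m(char_ids), dim=-1) for m in models])
    return probs.mean(dim=0).argmax(dim=-1)

# Usage: two initialisations of one architecture stand in for the larger
# multi-architecture ensemble discussed in the citing paper.
models = [CharCNNIntent(), CharCNNIntent()]
batch = torch.randint(0, 100, (4, 50))          # 4 queries, 50 characters each
print(ensemble_predict(models, batch))
```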
“…Rather than temporal information, [Qiu et al 2018] proposed the construction of multiple features from user metadata, regex extraction of named entities, and a probabilistic context-free grammar of composite entities.…”
Section: Contextual/Temporal Information Modelling (mentioning)
confidence: 99%
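A short sketch of the kind of multi-class feature construction this citation describes: regex-extracted entity indicators combined with user-metadata features into a single feature dictionary. The patterns, field names, and feature set below are illustrative assumptions, not those of [Qiu et al 2018].

```python
# A minimal sketch, assuming hypothetical regex patterns and metadata fields,
# of combining regex-extracted named-entity indicators with metadata features.
import re

ENTITY_PATTERNS = {                       # illustrative patterns, not the paper's
    "phone": re.compile(r"\b1\d{10}\b"),
    "id_number": re.compile(r"\b\d{17}[\dXx]\b"),
    "date": re.compile(r"\b\d{4}-\d{2}-\d{2}\b"),
}

def extract_features(query, user_metadata):
    """Return a flat feature dict: regex entity indicators plus metadata features."""
    features = {f"has_{name}": int(bool(pattern.search(query)))
                for name, pattern in ENTITY_PATTERNS.items()}
    features["query_length"] = len(query)
    features["user_region"] = user_metadata.get("region", "unknown")
    return features

print(extract_features("find records for 13812345678 on 2018-05-01",
                       {"region": "east"}))
```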
“…Improving the effectiveness of a search engine at returning relevant data can be achieved by providing a search function that understands the user's intent. According to Qiu et al (2018), research on information retrieval technology based on user intent has been conducted extensively since the web search taxonomy was formulated by Broder (2002). One direction of this research is the application of query classification in search engines.…”
Section: Introduction (unclassified)
“…In the study by Qiu et al (2018), a query classification method was built using LSTM similarity and a time sequence model to recognise the intent of queries in individual data search. Subsequently, Bortnikova et al (2019) built a neural network model integrated with a search engine to classify queries into five categories.…”
Section: Introduction (unclassified)
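The LSTM similarity idea referenced in this citation can be sketched as a shared LSTM encoder whose outputs are compared with cosine similarity, so a query can be matched against intent templates. The dimensions, encoder design, and similarity measure below are assumptions for illustration, not the configuration used by Qiu et al (2018).

```python
# A minimal sketch (assumptions: shared encoder, cosine similarity, toy sizes)
# of LSTM-based query similarity for intent matching.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LSTMEncoder(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=64, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True)

    def forward(self, token_ids):             # (batch, seq_len)
        _, (h, _) = self.lstm(self.emb(token_ids))
        return h[-1]                           # final hidden state: (batch, hidden)

def query_similarity(encoder, query_a, query_b):
    """Cosine similarity between two encoded queries (one score per pair)."""
    return F.cosine_similarity(encoder(query_a), encoder(query_b), dim=-1)

encoder = LSTMEncoder()
a = torch.randint(0, 1000, (2, 12))            # two queries, 12 tokens each
b = torch.randint(0, 1000, (2, 12))            # two candidate intent templates
print(query_similarity(encoder, a, b))
```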
“…Popular methods include the recurrent neural network (RNN) [26], the joint model for ID and SF [27], and the attention-based model [28]. Most of these studies focus on simple query statements, but the performance improvement in task understanding for service robots is not significant [29]. The reason may be that common query task descriptions, such as ''What are the flights from Tacoma to San Jose,'' differ from service language, which usually contains verbs carrying core instructional information, such as ''carry'' in ''Could you please help me carry this book to the bedroom.''…”
Section: Related Work (mentioning)
confidence: 99%
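As a rough illustration of the attention-based intent detection mentioned in this citation ([28]), the sketch below encodes an utterance with a bidirectional GRU and pools the hidden states with a learned attention layer, so informative words such as instruction-carrying verbs can receive higher weight. All names and dimensions are assumptions, not the cited model.

```python
# A minimal sketch of attention-pooled RNN intent classification.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionIntentClassifier(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=64, hidden=128, n_intents=8):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)    # scores each time step
        self.fc = nn.Linear(2 * hidden, n_intents)

    def forward(self, token_ids):               # (batch, seq_len)
        h, _ = self.rnn(self.emb(token_ids))    # (batch, seq_len, 2*hidden)
        weights = F.softmax(self.attn(h), dim=1)
        pooled = (weights * h).sum(dim=1)       # attention-weighted sum
        return self.fc(pooled)                  # intent logits

model = AttentionIntentClassifier()
print(model(torch.randint(0, 1000, (3, 10))).shape)   # torch.Size([3, 8])
```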