2017 IEEE 28th Annual International Symposium on Personal, Indoor, and Mobile Radio Communications (PIMRC) 2017
DOI: 10.1109/pimrc.2017.8292587
A convolutional neural network for search term detection

Abstract: Pathfinding in hospitals is challenging for patients, visitors, and even employees. Many people have experienced getting lost due to a lack of clear guidance, the large footprint of hospitals, and the confusing array of hospital wings. In this paper, we propose Halo, an indoor navigation application based on voice-user interaction that provides directions to users without the assistance of a localization system. The main challenge is accurate detection of origin and destination search terms. A custom convolutional neural…

Cited by 9 publications (6 citation statements); references 22 publications.
“…RNN is a class of artificial neural networks specially designed to process sequential data, such as time series, text, and speech (Salehinejad et al, 2017). Unlike traditional feedforward neural networks (like CNNs), RNNs have connections that form directed cycles, allowing them to maintain internal memory and capture temporal dependencies within the input data.…”
Section: Recurrent Neural Network
confidence: 99%
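The directed-cycle structure described in the statement above can be sketched as a toy Elman-style recurrent step. This is an illustrative sketch only; the weights, dimensions, and function names are assumptions, not taken from the cited work:

```python
import math

def rnn_step(x, h_prev, W_xh, W_hh, b):
    """One Elman-style RNN step: h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b).

    The W_hh term is the 'directed cycle': the previous hidden state is fed
    back in, which is how the network keeps an internal memory of the sequence.
    """
    pre = [
        sum(W_xh[i][j] * x[j] for j in range(len(x)))
        + sum(W_hh[i][k] * h_prev[k] for k in range(len(h_prev)))
        + b[i]
        for i in range(len(b))
    ]
    return [math.tanh(v) for v in pre]

# Toy weights, chosen arbitrarily for illustration.
W_xh = [[0.5, -0.3], [0.1, 0.2]]
W_hh = [[0.4, 0.0], [0.0, 0.4]]
b = [0.0, 0.0]

h = [0.0, 0.0]  # initial hidden state
for x_t in [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]:
    h = rnn_step(x_t, h, W_xh, W_hh, b)  # memory carried across steps
```

Because each step's output depends on the previous hidden state, the same weights are reused across time steps — this is what distinguishes the cell from a feedforward layer applied independently to each input.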
“…The LSTM network, a type of recurrent neural network (RNN), employs a cyclic structure that transfers the hidden layer's output back to itself, allowing it to capture temporal features and information from previous time series signals [37]. Proposed by Hochreiter and Schmidhuber [38], LSTM was designed to address the vanishing gradient problem through the use of memory cells [21].…”
Section: Long Short-Term Memory
confidence: 99%
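The memory-cell mechanism mentioned in the statement above can be sketched as a scalar (1-D) LSTM step. All weights and names here are assumed for illustration and do not come from the cited papers:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, w):
    """One scalar LSTM step; w holds per-gate weights and biases."""
    f = sigmoid(w["wf_x"] * x + w["wf_h"] * h_prev + w["bf"])    # forget gate
    i = sigmoid(w["wi_x"] * x + w["wi_h"] * h_prev + w["bi"])    # input gate
    g = math.tanh(w["wg_x"] * x + w["wg_h"] * h_prev + w["bg"])  # candidate memory
    o = sigmoid(w["wo_x"] * x + w["wo_h"] * h_prev + w["bo"])    # output gate
    # The memory cell is updated additively (f * c_prev + i * g), which is
    # what lets gradients flow across many time steps without vanishing.
    c = f * c_prev + i * g
    h = o * math.tanh(c)
    return h, c

# Toy weights, chosen arbitrarily for illustration.
w = {k: 1.0 for k in ("wf_x", "wi_x", "wg_x", "wo_x")}
w.update({k: 0.0 for k in ("wf_h", "wi_h", "wg_h", "wo_h",
                           "bf", "bi", "bg", "bo")})

h, c = 0.0, 0.0
for x_t in [1.0, 0.5, -1.0]:
    h, c = lstm_step(x_t, h, c, w)
```

The key design point is the additive cell update: unlike a plain RNN, where repeated multiplication through `tanh` shrinks gradients, the gated sum lets information persist or be deliberately forgotten.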
“…A large amount of work has been devoted in the previous years to accelerate RNNs [53]. One of the most commonly applied approaches is to accelerate RNNs by using GPU accelerators [40].…”
Section: Related Work
confidence: 99%