2021
DOI: 10.3390/app112311514
Time-Aware and Feature Similarity Self-Attention in Vessel Fuel Consumption Prediction

Abstract: Accurate vessel fuel consumption prediction is essential for constructing a ship route network and for vessel management, leading to efficient sailing. In addition, ship data from monitoring and sensing systems accelerate fuel consumption prediction research. However, ship data have three properties (sequential order, irregular time intervals, and varying feature importance) that make the prediction problem challenging. In this paper, we propose Time-aware Attention (TA) and Feature-similarity Attention (FA) applied to bi…
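The truncated abstract names two attention variants but not their exact formulations. As a rough illustration only, the two ideas can be sketched as score adjustments in standard dot-product attention; the decay term, the feature-axis treatment, and all function names below are assumptions for illustration, not the paper's method:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def time_aware_attention(x, timestamps, decay=0.1):
    """Self-attention over time steps (rows of x) whose scores are
    penalized by the elapsed time between observations, so steps that
    are far apart in time receive less weight (hypothetical form)."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                      # (T, T) dot-product similarity
    gaps = np.abs(timestamps[:, None] - timestamps[None, :])
    weights = softmax(scores - decay * gaps, axis=-1)  # each row sums to 1
    return weights @ x                                 # (T, F) re-weighted sequence

def feature_similarity_attention(x):
    """Attention over the feature axis (columns of x): features that are
    similar to many others receive larger weights (hypothetical form)."""
    f = x.T                                            # (F, T): one row per feature
    sim = f @ f.T / np.sqrt(f.shape[-1])               # (F, F) feature similarity
    weights = softmax(sim, axis=-1)
    return (weights @ f).T                             # back to (T, F)
```

Both sketches keep the input shape `(T, F)` (time steps by features), so they could in principle be stacked before a recurrent layer, as the abstract's "applied to bi…" suggests, though the actual integration is not visible in this excerpt.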

Cited by 4 publications (2 citation statements)
References 34 publications
“…In the time-series prediction domain, [37]-[39] demonstrated that Transformer achieved better performance than previous time-series models. [39] showed that Transformer models achieved higher performance than LSTM models in vessel fuel consumption prediction, and [37], [38] demonstrated the superiority of Transformer by comparing their methods with both statistical methods and RNNs in influenza and multi-horizon time-series prediction, respectively. As previous studies have proven the success of Transformer in time-series prediction, researchers have proposed Transformer-based time-series models that satisfy both efficiency and performance [40]-[43].…”
Section: B. Deep Learning Models for Time-Series Prediction
confidence: 99%
“…Owing to the added advantage of an attention mechanism that alleviates the vanishing gradient problem, the sequential feature extraction ability of Transformer helps achieve state-of-the-art performance in diverse domains such as pre-trained language models [32], image-relevant tasks [33]-[35], and a multi-modal task [36]. In the time-series prediction domain, [37]-[39] demonstrated that Transformer achieved better performance than previous time-series models. [39] showed that Transformer models achieved higher performance than LSTM models in vessel fuel consumption prediction, and [37], [38] demonstrated the superiority of Transformer by comparing their methods with both statistical methods and RNNs in influenza and multi-horizon time-series prediction, respectively.…”
Section: B. Deep Learning Models for Time-Series Prediction
confidence: 99%