International Conference on Cloud Computing, Internet of Things, and Computer Applications (CICA 2022), 2022
DOI: 10.1117/12.2642583
Frequency adjustable energy harvesting for autonomous wireless sensor nodes

Cited by 9 publications (15 citation statements)
References 6 publications
“…Given the prevailing preference for transformer-based models in time series forecasting, this study strongly recommends exploring autocyclic within optimized transformer models such as informer [42], fedformer [43], or autoformer [38]. Such exploration aims to establish superior benchmarks, emphasizing the crucial role of autocorrelation in enhancing the effectiveness of autocyclic.…”
Section: E. Testing on a Different Model
Mentioning confidence: 99%
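The statement above points to autocorrelation as the ingredient that makes Autoformer-style forecasting blocks effective. As an illustrative aside, the minimal Python sketch below estimates a series' autocorrelation with an FFT (Wiener-Khinchin) and reads off its dominant lags; it is a generic sketch, not the cited papers' implementation, and the function name `autocorrelation` and the synthetic daily-cycle series are assumptions for demonstration only.

```python
import numpy as np

def autocorrelation(x):
    """Normalized autocorrelation of a 1-D series via FFT (Wiener-Khinchin)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    # Zero-pad to length 2n so the circular convolution does not wrap around.
    spectrum = np.fft.rfft(x, n=2 * n)
    acf = np.fft.irfft(spectrum * np.conj(spectrum))[:n]
    return acf / acf[0]  # normalize so lag 0 equals 1

if __name__ == "__main__":
    # Hypothetical hourly series with a 24-step cycle plus noise (assumed data).
    rng = np.random.default_rng(0)
    t = np.arange(512)
    series = np.sin(2 * np.pi * t / 24) + 0.3 * rng.standard_normal(512)
    acf = autocorrelation(series)
    dominant = np.argsort(acf[1:])[::-1][:3] + 1  # strongest non-zero lags
    print("dominant lags:", dominant)  # expected near multiples of 24
```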
“…The resulting products are still being used by many power companies today. Later, a large body of literature has developed and applied deep learning models including LSTM-RNN [21], RBM [31], CNN [57] and more recently, transformer-based network [10,60]. Despite the strong fitting power, deep learning architectures often suffer from overfitting due to multiple layers [46] and not well-interpretable to human.…”
Section: Related Work
Mentioning confidence: 99%
“…2022 and FEDformer Zhou et al. 2022), and popular ones (e.g., TCN Bai et al. 2018 and LSTM Hochreiter and Schmidhuber 1997); (2) Traditional machine learning models such as CatBoost Dorogush et al.…”
Section: Eforecaster Platform
Mentioning confidence: 99%