Transfer Learning for Non-Intrusive Load Monitoring
Preprint, 2019
DOI: 10.48550/arxiv.1902.08835
Cited by 3 publications (8 citation statements) | References 34 publications
“…The model proposed in [19] achieves an excellent balance between complexity and performance, outperforming both FHMM and previous DL models. This algorithm has been tested on several public datasets; in particular, its generalization capability was tested in [23] by evaluating the system on homes from datasets other than the one used for training. Thus, in this work, DL models are used to demonstrate the feasibility of a solution based on a small, low-power microcontroller for real-time energy consumption monitoring.…”

Section: I2MTC Paper ID: 1570780806 (mentioning)
Confidence: 99%
“…a problem in DL called overfitting [29], where the model becomes too tightly fitted to the training data, hindering its ability to perform well on new data. For each appliance, validation is performed on the entire dataset of a single house, following the methodology established in [23] when the model was originally presented. The training data are preprocessed before being used.…”

Section: B. Training Settings (mentioning)
Confidence: 99%
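
The validation scheme described in this statement, holding out the entire dataset of a single house per [23], amounts to a leave-one-house-out split. The short Python sketch below illustrates the idea only; the house names, data shapes, and the helper function are hypothetical, not taken from the cited papers.

import numpy as np

# Hypothetical per-house data: house_id -> (aggregate readings X, appliance readings y).
houses = {
    f"house_{i}": (np.random.rand(1000, 1), np.random.rand(1000, 1))
    for i in range(1, 6)
}

def leave_one_house_out(houses, val_house):
    """Concatenate all houses except `val_house` for training; the
    held-out house's entire dataset becomes the validation set."""
    X_train = np.concatenate([X for h, (X, _) in houses.items() if h != val_house])
    y_train = np.concatenate([y for h, (_, y) in houses.items() if h != val_house])
    X_val, y_val = houses[val_house]
    return (X_train, y_train), (X_val, y_val)

train, val = leave_one_house_out(houses, "house_3")
print(train[0].shape, val[0].shape)  # (4000, 1) (1000, 1)

Because the validation house contributes no samples to training, a small gap between training and validation error is evidence that the model has not overfitted to the training homes.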
“…Two transfer learning schemes for NILM, appliance transfer learning (ATL) and cross-domain transfer learning (CTL), are proposed in [25]. The authors investigate the transferability of Deep Neural Networks for NILM and find that sequence-to-point learning is transferable, in the sense that this technique can be applied to test data without fine-tuning, given that the training and test data are in a similar domain.…”

Section: Related Work (mentioning)
Confidence: 99%
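
As background to the sequence-to-point learning discussed in this statement, the Python/Keras sketch below outlines such a network. It is an assumed, minimal reconstruction in the spirit of the seq2point literature that [25] builds on; the window length, layer sizes, and training setup are illustrative rather than the authors' exact configuration.

import tensorflow as tf

WINDOW = 599  # length of the input mains window (illustrative value)

def build_seq2point(window=WINDOW):
    """Sequence-to-point CNN: maps a window of aggregate (mains) power
    readings to a single appliance power value at the window midpoint."""
    model = tf.keras.Sequential([
        tf.keras.layers.Conv1D(30, 10, activation="relu", padding="same",
                               input_shape=(window, 1)),
        tf.keras.layers.Conv1D(30, 8, activation="relu", padding="same"),
        tf.keras.layers.Conv1D(40, 6, activation="relu", padding="same"),
        tf.keras.layers.Conv1D(50, 5, activation="relu", padding="same"),
        tf.keras.layers.Conv1D(50, 5, activation="relu", padding="same"),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(1024, activation="relu"),
        tf.keras.layers.Dense(1),  # appliance power at the window midpoint
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Cross-domain transfer (CTL) in the sense quoted above: train the model on
# windows from one dataset, then evaluate it directly on another dataset's
# houses without fine-tuning.
model = build_seq2point()

The "transferable without fine-tuning" finding quoted above corresponds to reusing such a trained model unchanged on a new dataset, provided the new domain (sampling rate, appliance behavior) is similar to the training domain.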