ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2019.8682486
Transferability of Neural Network Approaches for Low-rate Energy Disaggregation

Cited by 101 publications (103 citation statements)
References 23 publications
“…The integration of domain knowledge further enriches the design of CNN architectures. An on/off state classification subnetwork can be added in parallel to the regression sub-network so that the model can learn from on/off state information directly [2,18]. The work in this paper adopts the structure of subtask gated network (SGN) [2] as a starting point.…”
Section: Introductionmentioning
confidence: 99%
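The subtask gating idea quoted above — running an on/off state classification subnetwork in parallel with the regression subnetwork and multiplying the two outputs — can be sketched in a few lines. This is a minimal NumPy illustration of the gating operation only, not the SGN paper's actual network; `regression_out` and `state_logits` are hypothetical stand-ins for the two subnetworks' outputs:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgn_output(regression_out, state_logits):
    """Gate the regression branch's power estimate with the
    classification branch's on/off probability, so that the
    final estimate is forced toward zero when the appliance
    is predicted to be off."""
    on_prob = sigmoid(state_logits)
    return regression_out * on_prob

# Appliance predicted on: estimate passes through almost unchanged.
print(sgn_output(np.array([100.0]), np.array([50.0])))
# Appliance predicted off: estimate is suppressed toward zero.
print(sgn_output(np.array([100.0]), np.array([-50.0])))
```

The gate lets the state information learned by the classification subtask directly shape the regression output, which is the motivation cited in [2, 18].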
“…Some NILM algorithms [15], [16] require priors to build a general model and then tune it to a specific house. These approaches have shown some success with houses in the same country/region.…”
Section: A General Model Tuningmentioning
confidence: 99%
“…However, this has not been proven successful across countries/regions; i.e., a model of a dishwasher trained in the UK will not work for disaggregating a dishwasher in the USA. This method is often referred to as transfer learning [16].…”
Section: A General Model Tuningmentioning
confidence: 99%
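The general-model-then-tune workflow described in these two statements can be illustrated with a toy example: pretrain a model on plentiful "source region" data, then continue training on a handful of labelled samples from the "target" house. This is plain gradient descent on a one-parameter linear model; the power levels and data are invented for illustration and stand in for a real NILM network:

```python
import numpy as np

rng = np.random.default_rng(0)

def train(w, X, y, lr=0.01, steps=200):
    """Gradient descent on mean squared error (illustrative only)."""
    for _ in range(steps):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# "Source region": appliance draws ~2 kW per unit of the input feature.
X_src = rng.normal(size=(200, 1))
y_src = 2.0 * X_src[:, 0]
w = train(np.zeros(1), X_src, y_src)

# "Target house": same appliance type but a different power level (~3 kW).
# Fine-tuning the pretrained weights on 10 labelled samples adapts the
# general model, rather than training from scratch.
X_tgt = rng.normal(size=(10, 1))
y_tgt = 3.0 * X_tgt[:, 0]
w_tuned = train(w, X_tgt, y_tgt, lr=0.05, steps=500)
```

The same recipe — reuse pretrained weights, then tune on limited target data — is what the quoted statements mean by building a general model with priors and tuning it to a specific house.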
“…AMPd [23]). Approaches based on convolutional neural networks (CNNs) [24][25][26], recurrent neural networks (RNNs) [27,28] and long short-term memory networks (LSTMs) [27,29] have been proposed in the literature, while denoising autoencoders (dAEs) [30] and gated recurrent units (GRUs) [26] have also been used. Approaches with SS are based on single-channel source separation algorithms (e.g.…”
Section: Introductionmentioning
confidence: 99%