2024
DOI: 10.1109/tii.2023.3328436

Knowledge Distillation for Scalable Nonintrusive Load Monitoring

Giulia Tanoni,
Lina Stankovic,
Vladimir Stankovic
et al.

Abstract: Smart meters allow the grid to interface with individual buildings and extract detailed consumption information using nonintrusive load monitoring (NILM) algorithms applied to the acquired data. Deep neural networks, which represent the state of the art for NILM, are affected by scalability issues, since they require high computational and memory resources, and by reduced performance when the training and target domains are mismatched. This article proposes a knowledge distillation approach for NILM, in particular for …
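The knowledge-distillation idea the abstract refers to can be illustrated with a minimal sketch. This is a generic example of the technique (Hinton-style soft-target distillation), not the authors' specific NILM method: a compact student model is trained to match a larger teacher's temperature-softened output distribution, which reduces the computational and memory footprint at inference time.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between teacher and student soft targets.
    # The T**2 factor keeps gradient magnitudes comparable across
    # temperatures, as is conventional in soft-target distillation.
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student predictions
    return float((p * np.log(p / q)).sum(axis=-1).mean() * T**2)
```

In a full training loop this term is typically combined with an ordinary supervised loss on the ground-truth labels, weighted by a mixing hyperparameter; the loss is zero when the student exactly reproduces the teacher's logits and grows as the two distributions diverge.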

Cited by 1 publication (1 citation statement)
References 35 publications
“…Future works will extend the method by considering criteria based on explainability [65,66] to select the subset of data to be labelled by the users. Moreover, advanced neural network techniques [67][68][69] will be included to improve the effectiveness and efficiency of the method.…”
Section: Discussion (mentioning, confidence: 99%)