2020
DOI: 10.48550/arxiv.2009.07756
Preprint

Exploring Bayesian Surprise to Prevent Overfitting and to Predict Model Performance in Non-Intrusive Load Monitoring

Richard Jones,
Christoph Klemenjak,
Stephen Makonin
et al.

Abstract: Non-Intrusive Load Monitoring (NILM) is a field of research focused on segregating constituent electrical loads in a system based only on their aggregated signal. Significant computational resources and research time are spent training models, often using as much data as possible, perhaps driven by the preconception that more data equates to more accurate models and better performing algorithms. When has enough prior training been done? When has a NILM algorithm encountered new, unseen data? This work applies …
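
The abstract frames Bayesian surprise as a way to decide when additional training data stops adding information. A minimal sketch of that idea follows, assuming surprise is measured as the KL divergence from prior to posterior over a model parameter; the conjugate Gaussian model, the function names, and the synthetic data are illustrative assumptions for this sketch, not the paper's actual NILM models or metrics.

```python
import numpy as np

def gaussian_kl(mu_q, var_q, mu_p, var_p):
    """KL divergence KL( N(mu_q, var_q) || N(mu_p, var_p) ) between two 1-D Gaussians."""
    return 0.5 * (np.log(var_p / var_q) + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)

def bayesian_surprise(prior_mu, prior_var, data, noise_var):
    """Surprise of one data batch: KL(posterior || prior) under a conjugate
    Gaussian model with known observation noise (an illustrative choice)."""
    n = len(data)
    post_prec = 1.0 / prior_var + n / noise_var          # precision adds under conjugacy
    post_var = 1.0 / post_prec
    post_mu = post_var * (prior_mu / prior_var + np.sum(data) / noise_var)
    surprise = gaussian_kl(post_mu, post_var, prior_mu, prior_var)
    return surprise, (post_mu, post_var)

# Toy usage: surprise shrinks as successive batches of similar data are seen,
# suggesting a natural point to stop feeding the model more training data.
rng = np.random.default_rng(0)
mu, var = 0.0, 10.0                                      # broad prior over a scalar load parameter
for batch in range(5):
    data = rng.normal(3.0, 1.0, size=50)                 # synthetic aggregate readings
    s, (mu, var) = bayesian_surprise(mu, var, data, noise_var=1.0)
    print(f"batch {batch}: surprise = {s:.4f}")
```

In this toy run the first batch produces a large surprise (the posterior moves far from the broad prior) and later batches produce progressively smaller values, which is the qualitative behaviour the abstract appeals to when asking "when has enough prior training been done?".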


Cited by 1 publication (2 citation statements)
References: 38 publications

“…In other perspective, the Bayesian approach has the advantage of incorporating information outside of the data. Jones et al [11] explore the Bayesian approach and improve the overall performance by using the posterior.…”
Section: Overfitting in Linear Regression (mentioning, confidence: 99%)
“…To overcome this problem, researchers formulated many solutions. To name a few instances: improving the data used to build the model [6], Bayesian statistics approach [10,11], and regularization [12]. These solutions have also been used to reduce overfitting in applied statistics.…”
Section: Introduction (mentioning, confidence: 99%)