2018
DOI: 10.1016/j.buildenv.2018.04.034

Occupancy prediction through Markov based feedback recurrent neural network (M-FRNN) algorithm with WiFi probe technology

Abstract: Accurate occupancy prediction can improve building control and energy efficiency. In recent years, WiFi signals inside buildings have been widely adopted in occupancy and building energy studies. However, WiFi signals are easily disturbed by building components and the connections between users and WiFi signals are unstable. Meanwhile, occupancy information is often characterized stochastically and varies with time. To overcome such limitations, this study utilizes WiFi probe technology to actively scan the Wi…

Cited by 104 publications (49 citation statements)
References 56 publications (61 reference statements)
“…With the Bayesian network algorithm, they reported an occupancy status estimation accuracy of 96.7%. Wang et al [21,22] used Wi-Fi probe data, indoor environmental measurement data, and a camera for ground truth from an open office space. To estimate the number of occupants, an M-FRNN (Markov based feedback recurrent neural network) algorithm was proposed and then compared with several other algorithms, such as ANN, kNN, and SVM.…”
Section: Necessity and Purpose of Study
confidence: 99%
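As a hedged illustration of the comparison described in the statement above, the sketch below estimates occupant counts from Wi-Fi-derived features with two of the named baselines (kNN and SVM regressors). The synthetic device-count/RSSI features and the ground-truth rule are assumptions made for illustration only and are not taken from the cited study.

    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor
    from sklearn.svm import SVR
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_error

    rng = np.random.default_rng(0)
    n = 500
    # Hypothetical features per time interval: detected Wi-Fi devices and mean RSSI
    devices = rng.integers(0, 40, size=n)
    rssi = -60.0 + rng.normal(0.0, 5.0, size=n)
    X = np.column_stack([devices, rssi])
    # Hypothetical ground truth: occupant count roughly tracks detected devices plus noise
    y = np.clip(np.round(0.8 * devices + rng.normal(0.0, 2.0, size=n)), 0, None)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    for name, model in [("kNN", KNeighborsRegressor(n_neighbors=5)),
                        ("SVM", SVR(kernel="rbf", C=10.0))]:
        model.fit(X_tr, y_tr)
        mae = mean_absolute_error(y_te, model.predict(X_te))
        print(f"{name} mean absolute error: {mae:.2f} occupants")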
“…To capture this time dependency, the Recurrent Neural Network (RNN) has been proposed, taking both the input at the current timestamp t and the state at the previous timestamp t-1 into consideration to predict the output h_t, as illustrated in Figure 7(b). Because of its ability to capture the time dependencies of time-series data, RNN has been applied to a variety of problems: speech recognition, translation, the prediction of occupancy [46], etc. RNN is powerful but insufficient when the prediction tasks have long-term dependencies, meaning that the output at timestamp t is influenced not only by the state at the previous timestamp t-1, but also by the state k timestamps earlier, t-k.…”
Section: From ANN to LSTMs
confidence: 99%
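The recurrence described in this statement can be written as h_t = tanh(W_x x_t + W_h h_{t-1} + b). The minimal NumPy sketch below, with randomly initialised weights and no training, only shows how the current input x_t and the previous state h_{t-1} jointly determine h_t; the dimensions and the placeholder input sequence are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    input_dim, hidden_dim, seq_len = 3, 8, 24   # e.g. hourly features over one day

    # Randomly initialised parameters; training is omitted in this sketch
    W_x = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
    W_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
    b = np.zeros(hidden_dim)

    def rnn_forward(x_seq):
        """Return h_t = tanh(W_x x_t + W_h h_{t-1} + b) for every timestamp."""
        h = np.zeros(hidden_dim)
        states = []
        for x_t in x_seq:
            h = np.tanh(W_x @ x_t + W_h @ h + b)   # current input plus previous state
            states.append(h)
        return np.stack(states)

    x_seq = rng.normal(size=(seq_len, input_dim))   # placeholder time-series input
    print(rnn_forward(x_seq).shape)                 # (24, 8): one hidden state per step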
“…This kind of dropout technique reduces the number of cycles and the computational time of the network. Such internal feedback structures help to reduce the number of training iterations and to reduce overfitting [16].…”
Section: Fig 3: Basic Feedback Loop Connection
confidence: 99%
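As a rough sketch of dropout on a recurrent feedback path, the snippet below randomly masks part of the previous hidden state during training (inverted dropout). The dropout rate, dimensions, and masking scheme are assumptions for illustration and do not reproduce the feedback structure of the cited work.

    import numpy as np

    rng = np.random.default_rng(1)
    input_dim, hidden_dim, seq_len = 3, 8, 24
    p_drop = 0.2   # assumed dropout rate, not taken from the cited paper

    W_x = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
    W_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
    b = np.zeros(hidden_dim)

    def rnn_step_with_dropout(x_t, h_prev, training=True):
        """One recurrent step with dropout applied to the feedback of h_{t-1}."""
        h_fb = h_prev
        if training:
            mask = (rng.random(hidden_dim) >= p_drop) / (1.0 - p_drop)
            h_fb = h_prev * mask   # randomly silence part of the recurrent feedback
        return np.tanh(W_x @ x_t + W_h @ h_fb + b)

    h = np.zeros(hidden_dim)
    for x_t in rng.normal(size=(seq_len, input_dim)):
        h = rnn_step_with_dropout(x_t, h)
    print(h.shape)   # (8,)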