2021
DOI: 10.1016/j.buildenv.2021.108133
A hybrid deep transfer learning strategy for thermal comfort prediction in buildings

Cited by 69 publications (17 citation statements)
References 34 publications
“…For parameter-based TL, the appropriate transfer of weights from the source to the target domain is critical for improving model performance. The lower (convolutional) layers in CNNs capture domain-specific information, and the deeper (dense) layers contribute to the effective learning required for data classification 37, 38. Since the source and target domain datasets vary with respect to environment and load conditions, we hypothesise that the CNN architecture can be adapted to these changes by retraining the weights in only the lower layers, while the weights in the deeper layers can be transferred unmodified for effective fault classification.…”
Section: Results
confidence: 99%
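The freezing scheme this statement hypothesises can be sketched in PyTorch. This is a minimal illustration, not the citing paper's actual code: the model, layer sizes, and names (`SmallCNN`, `conv`, `dense`) are placeholders. All weights are copied from the source-domain model, the deeper dense layers are frozen (transferred unmodified), and only the lower convolutional layers are handed to the optimiser for retraining.

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Toy CNN with lower (conv) and deeper (dense) blocks."""
    def __init__(self, n_classes=4):
        super().__init__()
        self.conv = nn.Sequential(           # lower, domain-specific layers
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.dense = nn.Sequential(          # deeper, decision-making layers
            nn.Flatten(), nn.Linear(16 * 4 * 4, 32), nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x):
        return self.dense(self.conv(x))

source = SmallCNN()                          # stand-in for a trained source model
target = SmallCNN()
target.load_state_dict(source.state_dict())  # transfer all weights

for p in target.dense.parameters():          # deeper layers: frozen as transferred
    p.requires_grad = False

# only the convolutional layers are retrained on target-domain data
optim = torch.optim.Adam(target.conv.parameters(), lr=1e-3)
```

Note that freezing the dense layers does not block gradients from reaching the convolutional layers during backpropagation; it only keeps the frozen weights fixed.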
“…Parameter-based TL allows the reuse of parameter weights to improve the accuracy when data distribution differs from the source domain to the target domain. Lower (convolutional) layers of CNNs capture more domain-specific information concealed within the image by convolving it with a kernel (or filter), while deeper (dense) layers are responsible for learning information that is relevant for making the decision 37 , 38 . In the TL framework presented in this work, the source classifier CNN was trained on the CWRU dataset, and the target classifier (TL-CNN) was then allowed to leverage this learned information of decision-making by transferring weights of certain layers in the CNN, and retraining the remaining layers with data from the target domain.…”
Section: Methods
confidence: 99%
“…Parameter-based TL allows the reuse of parameter weights to improve the accuracy when data distribution differs from the source domain to the target domain. Lower (convolutional) layers of CNNs capture more domain-specific information concealed within the image by convolving it with a kernel (or filter), while deeper (dense) layers are responsible for learning information that is relevant for making the decision 34,35 . In the TL framework presented in this work, the source classifier CNN was trained on the CWRU data set, and the target classifier (TL-CNN) was then allowed to leverage this learned information of decision-making by transferring weights of certain layers in the CNN, and retraining the remaining layers with data from the target domain.…”
Section: Transfer Learning For Machine Fault Diagnosis
confidence: 99%
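The workflow these statements describe (train a source classifier on the CWRU data, transfer the learned weights to the target classifier, then retrain the remaining layers on target-domain data) can be sketched as follows. This is an illustrative stand-in, not the authors' implementation: the model is untrained, the "target-domain batch" is random dummy data, and the layer split is hypothetical.

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Toy CNN standing in for the source/target classifier architecture."""
    def __init__(self, n_classes=4):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.dense = nn.Sequential(
            nn.Flatten(), nn.Linear(16 * 4 * 4, 32), nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x):
        return self.dense(self.conv(x))

source = SmallCNN()                         # pretend: trained on the source (CWRU) domain
tl_cnn = SmallCNN()
tl_cnn.load_state_dict(source.state_dict())  # transfer the learned weights

for p in tl_cnn.dense.parameters():          # decision layers kept as transferred
    p.requires_grad = False

opt = torch.optim.Adam(tl_cnn.conv.parameters(), lr=1e-2)

x = torch.randn(8, 1, 16, 16)                # dummy target-domain batch
y = torch.randint(0, 4, (8,))
loss = nn.functional.cross_entropy(tl_cnn(x), y)
opt.zero_grad()
loss.backward()
opt.step()
# after one step: conv weights have moved, dense weights still match the source
```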
“…For parameter-based TL, the appropriate transfer of weights from the source to the target domain is critical for improving model performance. The lower (convolutional) layers in CNNs capture domain-specific information, and the deeper (dense) layers contribute to the effective learning required for data classification 34, 35. Since the source and target domain data sets vary with respect to environment and load conditions, we hypothesise that the CNN architecture can be adapted to these changes by retraining the weights in only the lower layers, while the weights in the deeper layers can be transferred unmodified for effective fault classification.…”
Section: Impact Of Retraining Individual TL-CNN Layers On Accuracy
confidence: 99%
“…Cao et al [25] reviewed the research related to dynamic thermal comfort, aiming to explain the development, prospects, and necessity of dynamic thermal comfort research. Somu et al [26] used transfer learning and a comprehensive thermal comfort evaluation index to address the scarcity of thermal comfort samples. Korkas et al [27] guaranteed the independence of a microgrid with a supervisory strategy to improve thermal comfort when energy storage and renewable energy sources are used.…”
Section: Introduction
confidence: 99%