2021
DOI: 10.1016/j.patcog.2021.108058

Fused lasso for feature selection using structural information

Cited by 33 publications (7 citation statements)
References 33 publications
“…Additionally, LASSO helps avoid overfitting by preventing the model from becoming excessively complex, so that it generalizes well to unseen data. The LASSO method therefore provides a powerful and efficient approach to feature selection: it handles high-dimensional datasets effectively, promotes interpretability, is robust against multicollinearity, and prevents overfitting [30]. The important factors selected by LASSO for predicting students' final math grade were the first- and second-period grades, quality of family relationships, age, number of school absences, weekend alcohol consumption, current health status, reason for choosing the school, weekly study time, and home-to-school travel time.…”
Section: Discussion
confidence: 99%
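The feature-selection mechanism described in the statement above can be illustrated with a minimal sketch: an L1 penalty shrinks the coefficients of uninformative predictors exactly to zero, so the surviving nonzero coefficients are the selected features. The data, feature indices, and `alpha` value below are purely illustrative assumptions, not taken from the cited study.

```python
# Illustrative sketch of LASSO-based feature selection on synthetic data.
# Features 0 and 3 are constructed to drive the target; the others are noise.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))               # 200 samples, 10 candidate features
y = 3.0 * X[:, 0] + 2.0 * X[:, 3] + rng.normal(scale=0.1, size=200)

model = Lasso(alpha=0.1).fit(X, y)           # L1 penalty zeroes weak coefficients
selected = np.flatnonzero(model.coef_ != 0)  # indices of retained features
```

With a suitable `alpha`, only the truly informative features keep nonzero coefficients, which is the "promoting interpretability" behavior the statement refers to.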
“…where cov(g_k, g_l) is the covariance between g_k and g_l, σ_{g_k} is the standard deviation of g_k, and σ_{g_l} is the standard deviation of g_l. The use of fused LASSO is motivated by several factors [69]. One of its main advantages is its ability to handle high-dimensional data, where the number of genes exceeds the number of samples.…”
Section: Generalized Fused Lasso
confidence: 99%
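The ratio quoted in the statement above, cov(g_k, g_l) / (σ_{g_k} σ_{g_l}), is the Pearson correlation coefficient between two gene expression vectors. A minimal sketch, using synthetic vectors (names and data are illustrative, not from the cited work):

```python
# Sketch: the quoted weight cov(g_k, g_l) / (sigma_{g_k} * sigma_{g_l})
# is the Pearson correlation between two gene expression vectors.
import numpy as np

def gene_correlation(g_k: np.ndarray, g_l: np.ndarray) -> float:
    cov = np.cov(g_k, g_l, ddof=1)[0, 1]  # sample covariance
    return cov / (np.std(g_k, ddof=1) * np.std(g_l, ddof=1))

rng = np.random.default_rng(1)
g_k = rng.normal(size=50)
g_l = 0.8 * g_k + rng.normal(scale=0.3, size=50)  # correlated partner gene
r = gene_correlation(g_k, g_l)
```

Because the ddof terms cancel in the ratio, this agrees with NumPy's built-in `np.corrcoef`, which is the usual way to compute such pairwise weights in practice.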
“…While candidate-feature relevancy is treated as equivalent to selected-feature relevancy in mutual information, some less relevant features may be misinterpreted as salient features. To overcome these issues, the fused LASSO improves the trade-off between the relevancy of each individual feature [35]. However, due to its structured penalties, the fused LASSO is computationally expensive when dealing with EEG datasets.…”
Section: Related Work
confidence: 99%