2017
DOI: 10.1093/geronb/gbx008

Feature Selection Methods for Optimal Design of Studies for Developmental Inquiry

Abstract: Feature selection techniques permit researchers to choose measures that are maximally predictive of relevant outcomes, even when there are interactions or nonlinearities. These techniques facilitate decisions about which measures may be dropped from a study while maintaining efficiency of prediction across groups and reducing costs to the researcher and burden on the participants.
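For readers unfamiliar with the approach, the sketch below illustrates the general idea, not the authors' exact procedure: rank candidate measures by a model-based importance that tolerates interactions and nonlinearities, then check that a reduced measure set predicts nearly as well as the full battery before dropping measures. The synthetic data and the random-forest learner are assumptions for demonstration.

```python
# Generic sketch (not the paper's exact method): rank candidate measures with
# random-forest importances, then compare predictive accuracy of the full
# battery against a reduced set before dropping measures from a study.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=250, n_features=15, n_informative=4,
                       noise=1.0, random_state=1)

forest = RandomForestRegressor(n_estimators=200, random_state=1).fit(X, y)
top = np.argsort(forest.feature_importances_)[::-1][:5]  # 5 best measures

full = cross_val_score(RandomForestRegressor(random_state=1), X, y, cv=5).mean()
reduced = cross_val_score(RandomForestRegressor(random_state=1),
                          X[:, top], y, cv=5).mean()
print(f"full battery R^2 ~ {full:.2f}; reduced (5 measures) R^2 ~ {reduced:.2f}")
```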

Cited by 20 publications (15 citation statements). References 42 publications.
“…There are a number of intelligent feature selection or reduction approaches that have been proposed in the literature, including filter methods, wrapper methods, forward selection, backward selection, embedded methods or regularization, LASSO regression, Ridge regression, matrix factorization methods such as principal component analysis (PCA), independent component analysis (ICA), etc. [56]. In most cases, features are transformed into a new low-dimensional feature space.…”
Section: Introduction
Citation type: mentioning
confidence: 99%
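As a concrete instance of one of the listed approaches, the sketch below performs embedded feature selection with LASSO regression; the synthetic data and regularization strength are assumptions for demonstration, not values from the cited work.

```python
# Minimal sketch of embedded feature selection via LASSO (one of the methods
# listed above). Synthetic data and alpha are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

X, y = make_regression(n_samples=300, n_features=20, n_informative=5,
                       noise=1.0, random_state=0)
X = StandardScaler().fit_transform(X)   # L1 penalties assume comparable scales

lasso = Lasso(alpha=0.5).fit(X, y)
selected = np.flatnonzero(lasso.coef_)  # the L1 penalty zeroes weak predictors
print(f"kept {selected.size} of {X.shape[1]} features:", selected)
```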
“…Note that to prevent variable selection (i.e., ranking) from seeing the data used for model training (i.e., parameter tuning in this study), the training-validation dataset was divided into five disjoint subsets in this recursive selection process, so that at each backward elimination step parameter tuning can be conducted using four of the subsets while variable ranking is performed separately on the remaining subset. This suggested approach follows the principles of variable selection for study design recommended by Brick et al. (2017).…”
Section: Methods
Citation type: mentioning
confidence: 99%
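A minimal sketch of the cross-fitted backward elimination this excerpt describes appears below: in each round, four folds fit the model while the disjoint held-out fold ranks variables, so ranking never sees the tuning data. The estimator, the five-fold split, and the stopping rule are illustrative assumptions, not the cited study's exact configuration.

```python
# Hypothetical sketch of cross-fitted backward elimination: tuning and ranking
# are performed on disjoint folds within each elimination round.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import KFold

X, y = make_regression(n_samples=200, n_features=12, noise=0.5, random_state=0)
active = list(range(X.shape[1]))              # features still under consideration
kf = KFold(n_splits=5, shuffle=True, random_state=0)

while len(active) > 3:                        # stop at an illustrative minimum
    fold_scores = []
    for tune_idx, rank_idx in kf.split(X):    # 4 folds tune, 1 disjoint fold ranks
        model = RandomForestRegressor(n_estimators=100, random_state=0)
        model.fit(X[np.ix_(tune_idx, active)], y[tune_idx])
        imp = permutation_importance(model, X[np.ix_(rank_idx, active)],
                                     y[rank_idx], n_repeats=5, random_state=0)
        fold_scores.append(imp.importances_mean)
    active.pop(int(np.argmin(np.mean(fold_scores, axis=0))))  # drop weakest

print("retained features:", active)
```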
“…For example, a survey might trigger when the weighted sum B1×HeartRate + B2×HRV is higher than a threshold T, where B1, B2, and T have been determined by fitting a logistic regression or support vector machine. Ongoing research has begun to incorporate more advanced models, such as Bayesian sequential updating models [34], vector autoregressive models [35], and automated feature selection results [19]. In brief, these models are kept relatively lightweight by using sequential updating methods whenever possible, by relying on existing OS-level functions when available (e.g., for GPS), and by tuning the temporal precision of modeling whenever possible. For example, the rule set provided in Figure 2 uses a preselected set of identified geofences, that is, areas of the map that have been precomputed as spaces of interest.…”
Section: Real-time Adaptation for Responsive Assessment and Intervention
Citation type: mentioning
confidence: 99%
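The excerpt's weighted-sum trigger rule is easy to instantiate: a logistic regression's coefficients supply B1 and B2, and its intercept implies the threshold T. The sketch below shows this under assumed feature names and synthetic data; the probability cutoff of 0.5 is also an assumption.

```python
# Illustrative sketch of the trigger rule described above: fit a logistic
# regression on heart rate and HRV, then fire a survey when the weighted sum
# B1*HeartRate + B2*HRV crosses the threshold implied by the decision boundary.
# Feature names, data, and the 0.5 cutoff are assumptions for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = np.column_stack([rng.normal(70, 10, 500),   # heart rate (bpm)
                     rng.normal(50, 15, 500)])  # HRV (ms)
y = (0.08 * X[:, 0] - 0.05 * X[:, 1] + rng.normal(0, 1, 500) > 3).astype(int)

clf = LogisticRegression().fit(X, y)
b1, b2 = clf.coef_[0]            # learned weights B1, B2
T = -clf.intercept_[0]           # trigger when B1*hr + B2*hrv > T (p > 0.5)

def should_trigger(hr: float, hrv: float) -> bool:
    """Fire the survey when the weighted physiological sum exceeds T."""
    return b1 * hr + b2 * hrv > T

print(should_trigger(95.0, 30.0), should_trigger(60.0, 70.0))
```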
“…Participant burden can be reduced, for example, via passive sensing tools [18]. Methods also exist to balance data cost with burden, for example, by reducing survey size through feature selection [19] or by modeling adherence propensity [20]. Research is still needed to optimize the interaction between participant and technology to maximize engagement and minimize burden while ensuring data quality.…”
Section: Introduction
Citation type: mentioning
confidence: 99%