Stepwise Searching for Feature Variables in High-Dimensional Linear Regression
We investigate the classical stepwise forward and backward
search methods
for selecting sparse models in the context of linear regression with the
number of candidate variables p greater than the number of observations
n. Two types of new information criteria, BICP and BICC, are proposed to
serve as the stopping rules in the stepwise searches, since traditional
information criteria such as BIC and AIC are designed for the cases with
p smaller than n. The resulting models are also compared with those
selected by the LASSO selector. The consistency of the stepwise search is
investigated when both p and n tend to infinity. We show that a stepwise
forward addition followed by a stepwise backward deletion, both controlled
by a version of BICP, leads to a consistent estimated model under the
sparse Riesz condition.
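
As a rough illustration of the procedure described in the abstract, the sketch below implements a greedy forward-addition pass followed by a backward-deletion pass, each stopped once the criterion no longer improves. The BICP and BICC criteria are not defined in this abstract, so a standard BIC is used purely as a placeholder; all function names and details are illustrative assumptions, not the authors' method.

import numpy as np

def bic(y, X, subset):
    # Placeholder BIC-type criterion: n*log(RSS/n) + k*log(n).
    # The talk's BICP/BICC would replace this with a penalty suited to p > n.
    n = len(y)
    if subset:
        beta, *_ = np.linalg.lstsq(X[:, subset], y, rcond=None)
        resid = y - X[:, subset] @ beta
    else:
        resid = y
    rss = float(resid @ resid)
    return n * np.log(rss / n) + len(subset) * np.log(n)

def stepwise_search(y, X, criterion=bic):
    n, p = X.shape
    selected = []

    # Forward addition: add the single variable that most improves the
    # criterion, stopping when no addition improves it.
    improved = True
    while improved and len(selected) < min(n - 1, p):
        improved = False
        current = criterion(y, X, selected)
        scores = [(criterion(y, X, selected + [j]), j)
                  for j in range(p) if j not in selected]
        best_score, best_j = min(scores)
        if best_score < current:
            selected.append(best_j)
            improved = True

    # Backward deletion: drop variables one at a time while doing so
    # improves the criterion.
    improved = True
    while improved and selected:
        improved = False
        current = criterion(y, X, selected)
        scores = [(criterion(y, X, [k for k in selected if k != j]), j)
                  for j in selected]
        best_score, best_j = min(scores)
        if best_score < current:
            selected.remove(best_j)
            improved = True

    return sorted(selected)

Capping the forward pass at min(n - 1, p) variables keeps every fitted model well-defined when p exceeds n; the backward pass then prunes spuriously added variables, mirroring the addition-then-deletion structure discussed in the talk.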
This talk is part of the Statistics series.