The restricted consistency property of leave-nv-out cross-validation for high-dimensional variable selection

Yang Feng, Yi Yu

Research output: Contribution to journal › Article (Academic Journal) › peer-review



Cross-validation (CV) methods are popular for selecting the tuning parameter in high-dimensional variable selection problems. We show that the misalignment of CV is one possible reason for its over-selection behavior. To address this issue, we propose a version of leave-nv-out cross-validation, CV(nv), that selects the optimal model from a restricted candidate model set for high-dimensional generalized linear models. By using the same candidate model sequence and a proper order of the construction sample size nc in each CV split, CV(nv) avoids potential hurdles in developing theoretical properties. We show that CV(nv) enjoys the restricted model selection consistency property under mild conditions. Extensive simulations and real data analyses support the theoretical results and demonstrate the performance of CV(nv) in terms of both model selection and prediction.
Key words and phrases: Leave-nv-out cross-validation; Generalized linear models; Restricted maximum likelihood estimators; Restricted model selection consistency; Variable selection.
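A minimal sketch of the leave-nv-out idea described in the abstract: in each of several random splits, nv observations are held out for validation, the remaining nc = n - nv form the construction sample, every model in a fixed (restricted) candidate sequence is refit on the construction sample, and the model with the smallest average validation loss is selected. The function name, least-squares fitting, and squared-error loss below are illustrative assumptions; the paper itself works with restricted maximum likelihood estimators for generalized linear models.

```python
import numpy as np

def leave_nv_out_cv(X, y, candidate_models, nv, n_splits=50, seed=0):
    """Sketch of CV(nv) for a linear model with squared-error loss.

    candidate_models: list of column-index lists, a fixed candidate
    model sequence shared across all splits (as in the paper's setup).
    nv: number of observations left out for validation in each split,
    so the construction sample size is nc = n - nv.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    losses = np.zeros(len(candidate_models))
    for _ in range(n_splits):
        perm = rng.permutation(n)
        val, con = perm[:nv], perm[nv:]  # validation / construction split
        for j, S in enumerate(candidate_models):
            # refit candidate model S on the construction sample only
            beta, *_ = np.linalg.lstsq(X[np.ix_(con, S)], y[con], rcond=None)
            pred = X[np.ix_(val, S)] @ beta
            losses[j] += np.mean((y[val] - pred) ** 2)
    # select the candidate with the smallest accumulated validation loss
    return candidate_models[int(np.argmin(losses))]
```

Taking nv to be a large fraction of n (so nc grows at a suitably slow order) is what drives the consistency result: larger models that merely fit noise on the construction sample are penalized on the held-out points.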
Original language: English
Article number: 23
Pages (from-to): 1607-1630
Number of pages: 24
Journal: Statistica Sinica
Issue number: 3
Early online date: 22 Feb 2019
Publication status: Published - Mar 2019




