bioRxiv
New Results

Understanding double descent through the lens of principal component regression

Christine H. Lind, Angela J. Yu
doi: https://doi.org/10.1101/2021.04.26.441538
Christine H. Lind
1Department of Electrical and Computer Engineering, University of California San Diego, La Jolla, CA, USA
  • For correspondence: clind@eng.ucsd.edu
Angela J. Yu
2Department of Cognitive Science & Halicioglu Data Science Institute, University of California San Diego, La Jolla, CA, USA
  • For correspondence: ajyu@ucsd.edu

Abstract

A number of recent papers have studied the double-descent phenomenon: as the number of parameters in a supervised learning model increasingly exceeds the number of data points (“second descent”), the empirical risk curve has been observed not to overfit but instead to decrease monotonically, sometimes to a level even better than that of the best “first-descent” model (one using a subset of features no larger than the number of data points). Understanding exactly when and why this happens is an important theoretical problem. Focusing on the over-parameterized linear regression setting, a commonly chosen case study in the double-descent literature, we present three theoretical results: 1) final second descent (regression using all of the predictor variables) and principal component (PC) regression without dimensionality reduction are equivalent; 2) the PCR risk curve can be expected to lower-bound not only all linearly transformed first-descent models but also all linearly transformed second-descent models (including the elimination of features as a special case); 3) if the smallest singular value of the design matrix is “large enough” (in a sense we define mathematically), final second descent can be expected to outperform any first-descent or second-descent model. These insights have important ramifications for a type of semi-supervised learning problem, a scenario that can explain why a face representation trained on unlabeled faces from one race would serve later supervised-learning tasks better for faces of the same race than for faces of another race. This both provides a scientific explanation for the other-race effect seen in humans and hints at how to mitigate similar issues in the domain of ethical AI.
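The first result, the equivalence of final second descent and PC regression with no dimensionality reduction, can be checked numerically. The sketch below is an illustration under standard assumptions (minimum-norm least squares via the pseudoinverse; PCs taken as the right singular vectors of the design matrix), not the authors' own code:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 50            # over-parameterized: more features (p) than data points (n)
X = rng.normal(size=(n, p))
y = rng.normal(size=n)

# "Final second descent": minimum-norm least-squares fit using all p features
beta_mn = np.linalg.pinv(X) @ y

# PC regression with no dimensionality reduction: rotate the data into the
# PC basis (right singular vectors of X), fit there, rotate coefficients back
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Z = X @ Vt.T                      # scores on all rank(X) principal components
gamma = np.linalg.pinv(Z) @ y     # regression in the PC basis
beta_pcr = Vt.T @ gamma           # map the fit back to the original features

print(np.allclose(beta_mn, beta_pcr))  # True: the two solutions coincide
```

Since X = U diag(s) Vt, the PC scores are Z = U diag(s), so the PCR coefficients reduce algebraically to the pseudoinverse solution, which is why the two vectors agree to machine precision.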

Competing Interest Statement

The authors have declared no competing interest.

Copyright 
The copyright holder for this preprint is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-NC-ND 4.0 International license.
Posted June 06, 2021.

Subject Area

  • Neuroscience