Modified Gradient Boosting in High-dimensional Nonlinear Regression


Abstract/Contents

Abstract
Herein we develop a modification of Freund and Schapire's (1997) AdaBoost algorithm for classification trees and of Friedman's (2001) gradient boosting generalization. This modification not only addresses long-standing open problems concerning convergence and when to terminate the iterative algorithms, but is also shown to attain the convergence rates of computationally infeasible oracle benchmarks. Simulation studies illustrate the key ideas underlying modified gradient boosting and how the method works in practice.
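
For context on the baseline procedure the report modifies, the following is a minimal sketch (in Python) of Friedman-style L2 gradient boosting with an ad hoc validation-based stopping rule. It is not the modified algorithm developed in the report; the function name l2_boost and all tuning parameters (shrinkage, tree depth, patience) are illustrative assumptions, and the toy data generator is purely for demonstration.

# Minimal sketch of generic L2 gradient boosting (Friedman, 2001) with a
# simple validation-based early-stopping rule. This is NOT the modified
# algorithm from the report; it only illustrates the baseline procedure and
# the kind of termination question the report addresses.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def l2_boost(X_train, y_train, X_val, y_val,
             n_rounds=500, shrinkage=0.05, max_depth=2, patience=20):
    """Fit boosted regression trees; stop when validation loss stops improving."""
    f_train = np.full(y_train.shape, y_train.mean())  # initial constant fit
    f_val = np.full(y_val.shape, y_train.mean())
    learners, best_loss, best_m, since_best = [], np.inf, 0, 0
    for m in range(n_rounds):
        residuals = y_train - f_train                 # negative gradient of squared loss
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X_train, residuals)
        f_train += shrinkage * tree.predict(X_train)
        f_val += shrinkage * tree.predict(X_val)
        learners.append(tree)
        val_loss = np.mean((y_val - f_val) ** 2)
        if val_loss < best_loss:
            best_loss, best_m, since_best = val_loss, m + 1, 0
        else:
            since_best += 1
            if since_best >= patience:                # crude early-stopping rule
                break
    return learners[:best_m], best_loss

# Toy usage on a synthetic high-dimensional nonlinear regression problem.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 200))                   # p = 200 predictors
    y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=600)
    models, loss = l2_boost(X[:400], y[:400], X[400:], y[400:])
    print(f"kept {len(models)} boosting rounds, validation MSE {loss:.3f}")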

Description

Type of resource: text
Date created: November 4, 2021
Date modified: November 5, 2021; December 5, 2022
Publication date: November 5, 2021

Creators/Contributors

Author: Lai, T.L.
Author: Xia, T.
Author: Yuan, H.

Subjects

Subject: steepest-descent minimization of loss functions
Subject: orthogonal matching pursuit
Subject: semi-population model
Subject: weak greedy algorithms
Genre: Text
Genre: Technical report

Bibliographic information

Access conditions

Use and reproduction
User agrees that, where applicable, content will not be used to identify or to otherwise infringe the privacy or confidentiality rights of individuals. Content distributed via the Stanford Digital Repository may be subject to additional license and use restrictions applied by the depositor.
License
This work is licensed under a Creative Commons Attribution Non Commercial No Derivatives 4.0 International license (CC BY-NC-ND).

Preferred citation

Lai, T., Xia, T., and Yuan, H. (2021). Modified Gradient Boosting in High-dimensional Nonlinear Regression. Department of Statistics Technical Report, Stanford University. Available from the Stanford Digital Repository at https://purl.stanford.edu/hk301nk9163

Collection

Statistics Department Technical Reports
