Distributed and error resilient convex optimization formulations in machine learning

Abstract
Neural networks have been very successful across many domains in machine learning. Training a neural network typically requires minimizing a high-dimensional non-convex function, which in practice is usually done with stochastic gradient descent and its variants. In this thesis, we describe convex optimization formulations for optimally training neural networks with polynomial activation functions. More specifically, we present semidefinite programming formulations for training neural networks with second-degree polynomial activations and show that their solutions yield globally optimal solutions to the original non-convex training problem. We then extend this strategy to train quantized neural networks with integer weights, and show that the training loss can be globally optimized over the integer weights in polynomial time via semidefinite relaxations and randomized rounding. In the second part of the thesis, we describe a distributed computing and optimization framework for training models, including our convex neural networks. The second-order optimization methods proposed in this part rely on approximating the Hessian matrix via random projections. In particular, we describe how randomized sketches can be employed to reduce the problem dimensions while preserving privacy and improving straggler resilience in asynchronous distributed systems. We present novel approximation guarantees as well as closed-form expressions for debiasing the update directions of the optimization algorithm. Finally, we establish a novel connection between randomized sketching and coded computation; the proposed approach builds on polar codes for straggler-resilient distributed computing.
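
To make the Hessian-sketching idea mentioned in the abstract concrete, the following is a minimal illustrative sketch in Python/NumPy of one approximate Newton step for regularized least squares: the exact Hessian A^T A + lam*I is replaced by the sketched Hessian (SA)^T (SA) + lam*I for a random Gaussian projection S. This is a generic reconstruction of the sketched-Newton technique, not the algorithm from the thesis itself; the function name, sketch type, and all parameters are our own assumptions.

    import numpy as np

    def sketched_newton_step(A, b, x, sketch_dim, lam=1e-3, rng=None):
        # One approximate Newton step for 0.5*||Ax - b||^2 + 0.5*lam*||x||^2.
        # The exact Hessian A.T @ A + lam*I is replaced by the sketched
        # Hessian (S @ A).T @ (S @ A) + lam*I, which costs O(m * d^2) to form
        # instead of O(n * d^2) when the sketch size m is much smaller than n.
        rng = np.random.default_rng() if rng is None else rng
        n, d = A.shape
        # Gaussian sketch, scaled so that E[S.T @ S] = I.
        S = rng.standard_normal((sketch_dim, n)) / np.sqrt(sketch_dim)
        SA = S @ A
        H_sketch = SA.T @ SA + lam * np.eye(d)
        grad = A.T @ (A @ x - b) + lam * x  # the gradient is kept exact
        return x - np.linalg.solve(H_sketch, grad)

    # Hypothetical usage: a few sketched steps on synthetic data.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((10000, 50))
    b = A @ rng.standard_normal(50) + 0.1 * rng.standard_normal(10000)
    x = np.zeros(50)
    for _ in range(10):
        x = sketched_newton_step(A, b, x, sketch_dim=500, rng=rng)

Because only the Hessian is sketched while the gradient stays exact, the iterates still converge to the true regularized solution; the Gaussian projection here could be swapped for faster sketches (e.g., subsampled randomized Hadamard or sparse sketches) without changing the structure of the step.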

Description

Type of resource text
Form electronic resource; remote; computer; online resource
Extent 1 online resource.
Place [Stanford, California]
Publisher [Stanford University]
Copyright date ©2022
Publication date 2022
Issuance monographic
Language English

Creators/Contributors

Author Bartan, Burak
Degree supervisor Pilanci, Mert
Thesis advisor Pilanci, Mert
Thesis advisor Lall, Sanjay
Thesis advisor Wootters, Mary
Degree committee member Lall, Sanjay
Degree committee member Wootters, Mary
Associated with Stanford University, Department of Electrical Engineering

Subjects

Genre Theses
Genre Text

Bibliographic information

Statement of responsibility Burak Bartan.
Note Submitted to the Department of Electrical Engineering.
Thesis Ph.D., Stanford University, 2022.
Location https://purl.stanford.edu/st635tm1500

Access conditions

Copyright
© 2022 by Burak Bartan
License
This work is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported license (CC BY-NC 3.0).
