Optimization-based modeling in investment and data science

Abstract
Optimization has played a key role in numerous fields, including data science, statistics, machine learning, decision science, control, and quantitative investment. Optimization offers a way for users to focus on the modeling step. Convex optimization has been a very successful and powerful modeling framework: by formulating a problem as convex optimization, practitioners can focus on the modeling side without worrying about designing problem-specific optimization algorithms at prototyping time. However, there are hurdles in applying this convex modeling framework. First, many signal processing and machine learning problems are most naturally formulated as non-convex problems. Second, not all convex problems are tractable. Third, it may be hard to encode knowledge of the data into a simple regularizer or constraint and to specify the mathematical form of the optimization problem. In this thesis, we discuss topics in optimization-based modeling, including 1) the distributionally robust Kelly strategy in investment and gambling; 2) convex sparse blind deconvolution; 3) missing data imputation via a new structure called the matrix network; and 4) a neural proximal method for compressive sensing. In these works, I try to expand the boundary of convex-optimization-based modeling by overcoming several hurdles. In the distributionally robust Kelly problem, the original distributionally robust optimization formulation is convex but not tractable; we transform the problem into a tractable form. In the sparse blind deconvolution problem, blind deconvolution has long been perceived as a non-convex problem; we propose a scalable convex formulation and find a phase transition for the convex algorithm. In the missing data imputation problem, we study a slice-wise missing pattern on tensorial data that is beyond the capability of typical tensor completion algorithms. We propose a new type of underlying low-dimensional structure that allows us to impute the missing data.
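To make the Kelly setting concrete, here is a minimal sketch of the classical, non-robust Kelly problem for a binary even-money bet: choose the bet fraction f maximizing the expected log-growth. The distributionally robust version studied in the thesis instead optimizes against a whole family of distributions; the grid search and the closed form f* = 2p − 1 below are only for this simple nominal case, and the function name is my own.

```python
import numpy as np

def kelly_fraction(p, grid_size=100001):
    """Grid-search the expected log-growth p*log(1+f) + (1-p)*log(1-f)
    of a binary even-money bet over bet fractions f in [0, 1)."""
    f = np.linspace(0.0, 0.999, grid_size)
    growth = p * np.log1p(f) + (1.0 - p) * np.log1p(-f)
    return f[np.argmax(growth)]

# For win probability p = 0.6 the optimum is f* = 2p - 1 = 0.2.
f_star = kelly_fraction(0.6)
```

The log objective is concave in f, which is what makes the Kelly problem (and, after reformulation, its robust variant) amenable to convex optimization.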
In the first three topics, we solve these problems via convex optimization formulations. In the last topic, we step out of the safety zone of convexity. On the linear inverse problem, we go beyond sparsity and the ℓ1-norm regularizer for compressive sensing. To model complex structure in natural and medical images, we propose a learning-based idea to parameterize the proximal map of an unknown regularizer. This idea is inspired by the convex optimization modeling framework and learning-based methods, although the result need not correspond to convex optimization.
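For context on what "parameterizing the proximal map" replaces, here is a sketch of the classical proximal-gradient (ISTA) iteration for the ℓ1-regularized linear inverse problem min_x ½‖Ax − y‖² + λ‖x‖₁, where the proximal map of the ℓ1 norm is soft-thresholding. The neural proximal method described in the abstract swaps this hand-designed prox for a learned network; the code below is only the standard ℓ1 baseline, not the thesis's method.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal map of t*||.||_1: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """Proximal-gradient iterations for 0.5*||Ax - y||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        # gradient step on the smooth term, then the l1 prox
        x = soft_threshold(x - step * A.T @ (A @ x - y), step * lam)
    return x
```

With A = I the iteration converges in one step to soft_threshold(y, λ), which makes the role of the prox easy to see.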

Description

Type of resource text
Form electronic resource; remote; computer; online resource
Extent 1 online resource.
Place [Stanford, California]
Publisher [Stanford University]
Copyright date 2019; ©2019
Publication date 2019
Issuance monographic
Language English

Creators/Contributors

Author Sun, Qingyun
Degree supervisor Boyd, Stephen P
Degree supervisor Donoho, David Leigh
Thesis advisor Boyd, Stephen P
Thesis advisor Donoho, David Leigh
Thesis advisor Candès, Emmanuel J. (Emmanuel Jean)
Degree committee member Candès, Emmanuel J. (Emmanuel Jean)
Associated with Stanford University, Department of Mathematics.

Subjects

Genre Theses
Genre Text

Bibliographic information

Statement of responsibility Qingyun Sun.
Note Submitted to the Department of Mathematics.
Thesis Thesis (Ph.D.), Stanford University, 2019.
Location electronic resource

Access conditions

Copyright
© 2019 by Qingyun Sun
License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported license (CC BY-NC-ND 3.0).
