Fast approximate inference: shifting the Pareto frontier via adaptation
Abstract/Contents
- Abstract
- Computing high-dimensional discrete integrals is a fundamental problem that underlies probabilistic inference and appears in fields ranging from machine learning to statistical mechanics. The problem has no efficient exact solution, which has led to the development of many approximation strategies, generally categorized as sampling-based, variational, or randomized-hashing-based inference algorithms. These algorithms exhibit a general trade-off between approximation accuracy and computational efficiency. This thesis focuses on improving that trade-off. The Pareto frontier of algorithms for a particular inference query captures the trade-off: it consists of the state-of-the-art algorithms that no known algorithm improves upon in both accuracy and efficiency for that query. In this thesis we present techniques for shifting the Pareto frontier, yielding more accurate approximations for a fixed runtime budget, or faster runtime at a fixed level of accuracy, than prior work. The techniques are generally based on identifying handcrafted algorithmic components that perform a single, static computation on every input problem instance, and then designing alternative dynamic components that adapt to the specific input instance. First, we hand-design an adaptive sampling algorithm for estimating the permanent of a matrix. Second, we design an adaptive randomized-hashing algorithm for estimating the number of solutions to a Boolean formula. Next, we show that adaptive strategies can be parameterized and learned directly from data using a novel graph neural network architecture that subsumes belief propagation, a popular and general algorithm for variational inference on factor graphs. Finally, we develop a novel inference algorithm that leverages randomized optimization to perform fast inference.
In experiments we consider multi-object target tracking, community detection, estimating the partition function of an Ising model, and approximate model counting. Our proposed techniques frequently yield orders-of-magnitude improvements in approximation accuracy or computational efficiency.
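For background on the permanent-estimation setting the abstract describes, the sketch below shows the classic *non-adaptive* Monte Carlo baseline that the thesis's adaptive sampler improves upon: perm(A) = n! · E[∏ᵢ A[i, σ(i)]] over uniformly random permutations σ. This is an illustrative baseline only, not the thesis's algorithm; function names are our own.

```python
import itertools
import math
import random


def permanent_exact(A):
    """Exact permanent by brute-force enumeration over all n! permutations."""
    n = len(A)
    return sum(
        math.prod(A[i][sigma[i]] for i in range(n))
        for sigma in itertools.permutations(range(n))
    )


def permanent_mc(A, num_samples, rng):
    """Unbiased Monte Carlo estimate of the permanent.

    Uses the identity perm(A) = n! * E[prod_i A[i, sigma(i)]],
    where sigma is a uniformly random permutation of the rows.
    """
    n = len(A)
    rows = list(range(n))
    total = 0.0
    for _ in range(num_samples):
        sigma = rng.sample(rows, n)  # a uniformly random permutation
        total += math.prod(A[i][sigma[i]] for i in range(n))
    return math.factorial(n) * total / num_samples
```

The variance of this naive estimator can be enormous for structured matrices; an adaptive scheme that concentrates samples where the integrand is large is one way to tighten the accuracy/runtime trade-off the abstract refers to.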
Description
Type of resource | text |
---|---|
Form | electronic resource; remote; computer; online resource |
Extent | 1 online resource. |
Place | California |
Place | [Stanford, California] |
Publisher | [Stanford University] |
Copyright date | 2020; ©2020 |
Publication date | 2020 |
Issuance | monographic |
Language | English |
Creators/Contributors
Author | Kuck, Jonathan David |
---|---|
Degree supervisor | Ermon, Stefano |
Thesis advisor | Ermon, Stefano |
Thesis advisor | Ahmadipouranari, Nima |
Thesis advisor | Sabharwal, Ashish |
Degree committee member | Ahmadipouranari, Nima |
Degree committee member | Sabharwal, Ashish |
Associated with | Stanford University, Department of Computer Science |
Subjects
Genre | Theses |
---|---|
Genre | Text |
Bibliographic information
Statement of responsibility | Jonathan Kuck. |
---|---|
Note | Submitted to the Computer Science Department. |
Thesis | Thesis (Ph.D.), Stanford University, 2020. |
Location | electronic resource |
Access conditions
- Copyright
- © 2020 by Jonathan David Kuck
- License
- This work is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported license (CC BY-NC 3.0).