In-database machine learning on reconfigurable dataflow accelerators



Abstract
Stagnant CPU performance is driving an explosion of domain-specific architectures that supplement CPUs for data-intensive workloads in data centers. These accelerators sacrifice generality and programmability beyond their target domain in exchange for higher performance. However, accelerator deployment in data centers remains limited outside the most ubiquitous application domains, such as machine learning: high-performance systems require large dies in advanced process nodes with many supporting resources, like high-bandwidth memory. As a result, to justify the cost of high-performance IC design, designers building non-ML accelerators face a choice: either piggyback on ML accelerators---giving up some efficiency but taking advantage of supporting hardware resources---or build more-efficient but less-advanced bespoke hardware. In this work, we show that reconfigurable dataflow accelerators (RDAs) are a practical alternative to building fixed-function designs per application domain. We extend Plasticine---a previously proposed ML-focused RDA---with low-overhead micro-architectural extensions to support analytic database queries. These extensions increase area by just 4%, and the unified accelerator outperforms a multi-core software baseline by 1500x. Finally, we show how to re-purpose these extensions to implement data structures, such as trees and hash tables, that are critical to asymptotically optimal query plans. We introduce a threading model for vector dataflow accelerators that extracts parallelism from data structures with irregular control flow using fine-grained thread scheduling, outperforming a GPU by 8x.

Description

Type of resource text
Form electronic resource; remote; computer; online resource
Extent 1 online resource.
Place [Stanford, California]
Publisher [Stanford University]
Copyright date ©2021
Publication date 2021
Issuance monographic
Language English

Creators/Contributors

Author Vilim, Matthew
Degree supervisor Olukotun, Oyekunle Ayinde
Thesis advisor Olukotun, Oyekunle Ayinde
Thesis advisor Hennessy, John L.
Thesis advisor Ré, Christopher
Degree committee member Hennessy, John L.
Degree committee member Ré, Christopher
Associated with Stanford University, Department of Electrical Engineering

Subjects

Genre Theses
Genre Text

Bibliographic information

Statement of responsibility Matthew Vilim.
Note Submitted to the Department of Electrical Engineering.
Thesis Ph.D., Stanford University, 2021.
Location https://purl.stanford.edu/br114bf8393

Access conditions

Copyright
© 2021 by Matthew Vilim
License
This work is licensed under a Creative Commons Attribution 3.0 Unported license (CC BY).
