2018
DOI: 10.1145/3277006.3277013
Scalable Linear Algebra on a Relational Database System

Abstract: Scalable linear algebra is important for analytics and machine learning (including deep learning). In this paper, we argue that a parallel or distributed database system is actually an excellent platform upon which to build such functionality. Most relational systems already have support for cost-based optimization, which is vital to scaling linear algebra computations, and it is well known how to make relational systems scale. We show that by making just a few changes to a parallel/distributed relational databa…

Cited by 24 publications (22 citation statements)
References 20 publications
“…Existing works [12,45,50] propose to: (1) abstract the tensor as a set of tensor blocks; (2) encode the local linear algebra computation logic that manipulates a single tensor block or a pair of tensor blocks in user-defined functions (UDFs), also called kernel functions, such as matrix multiplication, matrix addition, etc.; (3) apply relational algebra operators nested with these UDFs to perform linear algebra computations.…”
Section: Background and Related Work 2.1 ML Model Inferences As Queries
confidence: 99%
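The three steps described in that citation statement can be sketched in plain Python: shred each matrix into a relation of (row-block, col-block, block) tuples, join the two relations on the shared block index, apply a kernel UDF per matched pair (here `numpy.matmul`), and group-by/sum the partial blocks. The block size, relation layout, and reassembly step below are illustrative assumptions, not the actual schemas of the cited systems.

```python
import numpy as np

def make_blocks(M, bs):
    """Shred a matrix into a relation of (row_block, col_block, block) tuples."""
    return [(i // bs, j // bs, M[i:i + bs, j:j + bs])
            for i in range(0, M.shape[0], bs)
            for j in range(0, M.shape[1], bs)]

def block_matmul(A_rel, B_rel):
    """Relational-style matmul: join on A.col_block == B.row_block,
    apply the kernel UDF per matched pair, group-by (A.row_block,
    B.col_block), and SUM the partial blocks."""
    partials = {}
    for (ai, ak, ablk) in A_rel:
        for (bk, bj, bblk) in B_rel:
            if ak == bk:                              # join predicate
                key = (ai, bj)                        # group-by key
                part = ablk @ bblk                    # kernel UDF
                partials[key] = partials.get(key, 0) + part  # SUM aggregate
    return partials

A = np.arange(16, dtype=float).reshape(4, 4)
B = np.eye(4)
rel = block_matmul(make_blocks(A, 2), make_blocks(B, 2))
# Reassemble the result relation and check against a plain matmul.
C = np.block([[rel[(0, 0)], rel[(0, 1)]],
              [rel[(1, 0)], rel[(1, 1)]]])
assert np.allclose(C, A @ B)
```

Real systems execute the join and aggregate through the database's parallel operators, so the optimizer can pick block sizes and data placement; the nested loop here stands in for that machinery.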
“…Therefore, as illustrated in Fig. 1, a fully-connected feed-forward network (FFNN) can be represented in relational algebra [34,45]. While the experiments in this work (Sec.…”
Section: Background and Related Work 2.1 ML Model Inferences As Queries
confidence: 99%
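As a concrete illustration of that relational encoding, a single fully-connected layer can be written as a join between an input relation {(i, x_i)} and a weight relation {(i, j, w_ij)}, followed by a group-by on j with a SUM aggregate and an activation UDF. The relation layouts and the ReLU activation below are assumptions for the sketch, not the exact encoding used in [34,45].

```python
def relational_layer(x_rel, w_rel, activation=lambda s: max(s, 0.0)):
    """One FFNN layer as join + group-by aggregate, i.e. roughly:
    SELECT w.j, act(SUM(x.v * w.v)) FROM x JOIN w ON x.i = w.i GROUP BY w.j
    x_rel: list of (i, x_i); w_rel: list of (i, j, w_ij)."""
    sums = {}
    for (xi, xv) in x_rel:
        for (wi, wj, wv) in w_rel:
            if xi == wi:                          # join predicate
                sums[wj] = sums.get(wj, 0.0) + xv * wv  # SUM aggregate
    return [(j, activation(s)) for j, s in sorted(sums.items())]

x = [(0, 1.0), (1, 2.0)]                          # input vector as a relation
W = [(0, 0, 0.5), (0, 1, -1.0), (1, 0, 0.25), (1, 1, 3.0)]
print(relational_layer(x, W))  # [(0, 1.0), (1, 5.0)]
```

Stacking layers is then just composing such queries, with each layer's output relation feeding the next layer's join.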
“…Advanced in-database analytics. To accommodate the exponential growth in data science and machine learning applications, a recent line of work [3,14,24,26,39,41,54,77] focuses on supporting advanced analytics queries that involve linear algebra (LA) operators. TCUDB shares the goal of LevelHeaded [3] in identifying the worst-case optimal join (WCOJ) [62], and of LaraDB's rule-based translation between relational queries and parallel LA queries, but TCUDB additionally translates (parts of) a query into TCU-accelerated matrix multiplication operators, opening a different set of opportunities via the orders-of-magnitude speedup that TCUs deliver on such operations.…”
Section: Related Work
confidence: 99%
“…Despite being originally designed for AI/ML workloads, tensor processors also hold potential performance improvements for database engines. This is due to both the increasing demand for native support of linear algebra queries (e.g., matrix multiplication itself) in SQL DB engines [3,24,26,39,54] and the observation that a large number of regular query operators can be cast into matrix multiplication. For example, one can show that the most commonly used natural joins [5,20] and group-by aggregates can be encoded as matrix multiplication, which enables TCUs to deliver exceptional performance.…”
Section: Introduction
confidence: 99%
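One way to see the join-as-matmul encoding (the exact constructions in [5,20] may differ from this sketch): represent R(a, b) and S(b, c) as 0/1 indicator matrices over their attribute domains. The product then has a nonzero entry at (a, c) exactly when the natural join produces the pair (a, c), and the entry's value is the COUNT of matching b values, i.e. a group-by aggregate comes out for free.

```python
import numpy as np

# R(a, b) and S(b, c) as tuple lists over small integer domains (assumed data).
R = [(0, 1), (0, 2), (1, 2)]
S = [(1, 5), (2, 5), (2, 7)]

def to_matrix(rel, n_rows, n_cols):
    """Encode a binary relation as a 0/1 indicator matrix."""
    M = np.zeros((n_rows, n_cols))
    for r, c in rel:
        M[r, c] = 1.0
    return M

MR = to_matrix(R, 2, 3)
MS = to_matrix(S, 3, 8)
P = MR @ MS   # P[a, c] = number of b with (a, b) in R and (b, c) in S

# Nonzero entries give the projection of the join onto (a, c);
# the entries themselves answer COUNT group-by aggregates.
join_ac = sorted((a, c)
                 for a in range(P.shape[0])
                 for c in range(P.shape[1]) if P[a, c] > 0)
print(join_ac)  # [(0, 5), (0, 7), (1, 5), (1, 7)]
```

In practice these indicator matrices are extremely sparse, which is why the dense-matmul throughput of TCUs only pays off under suitable blocking and domain encodings.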
“…We have briefly discussed enhancements to SimSQL to provide vector and matrix support (see [33] for more detail). Others have also considered incorporating array data types into relational systems.…”
Section: Related Work
confidence: 99%