Published: 2022
DOI: 10.1609/aaai.v36i9.21180

MLink: Linking Black-Box Models for Collaborative Multi-Model Inference

Abstract: The cost efficiency of model inference is critical to real-world machine learning (ML) applications, especially for delay-sensitive tasks and resource-limited devices. A typical dilemma is: in order to provide complex intelligent services (e.g., smart city), we need inference results of multiple ML models, but the cost budget (e.g., GPU memory) is not enough to run all of them. In this work, we study underlying relationships among black-box ML models and propose a novel learning task: model linking. Model linkin…
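As a rough illustration of the idea described in the abstract, the sketch below trains a small mapping network (a "model link") from one black-box model's output space to another's, so that the target model does not have to run at inference time. This is a minimal sketch assuming TensorFlow/Keras and randomly generated paired outputs; the shapes, sizes, and variable names are illustrative assumptions and are not taken from the paper's implementation.

```python
import numpy as np
import tensorflow as tf

# Hypothetical paired outputs collected by running both black-box models
# on the same inputs during a short profiling phase. Shapes are
# illustrative: the source model emits 10-dim outputs, the target 5-dim.
source_outputs = np.random.rand(1000, 10).astype("float32")
target_outputs = np.random.rand(1000, 5).astype("float32")

# A small mapping network ("model link") from the source model's output
# space to the target model's output space. Neither model's internals
# are needed -- only their outputs, which is why they can stay black boxes.
link = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(5),
])
link.compile(optimizer="adam", loss="mse")
link.fit(source_outputs, target_outputs, epochs=10, batch_size=32, verbose=0)

# At inference time only the source model runs; its output is mapped to
# an approximation of the target model's output, freeing the target
# model's share of the cost budget (e.g., GPU memory).
approx_target = link.predict(source_outputs[:4])
print(approx_target.shape)  # (4, 5)
```

The point of the sketch is only the workflow: collect paired outputs, fit a lightweight mapping, then serve the approximation in place of the unaffordable model.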

Cited by 7 publications (2 citation statements) · References 31 publications

Citation statements
“…The comprehensive evaluations show the effectiveness of black-box model linking and the superiority of the MLink compared to other alternative methods. We summarize limitations and future work as follows: (1) When the semantic correlations between source and target models are low, model linking has poor output accuracy. (2) When the number of joined models is very large, pairwise model linking will become unpractical.…”
Section: Discussion (mentioning)
Confidence: 99%
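To put the second limitation in perspective: fully pairwise linking of n deployed models requires on the order of n(n-1) directed links (e.g., 20 models would already need 380 links), which is why the authors flag scalability as future work.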
“…We implemented our designs in Python based on TensorFlow 2.0 [55] as a pluggable middleware for inference systems. We tested the integration on programs implemented with TensorFlow [55], PyTorch [56] and MindSpore [57], with only dozens of lines of code modification, which shows the ease of use of MLink.…”
Section: Methods (mentioning)
Confidence: 99%
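The "pluggable middleware" integration mentioned in the quote above is not spelled out in this excerpt. The following is a minimal sketch under the assumption that the middleware wraps each model's predict call and substitutes a linked approximation whenever the resource budget does not allow running that model directly; all names here (LinkedInference, budget_allows, links) are hypothetical and not MLink's actual API.

```python
# Illustrative middleware sketch, not MLink's real interface.
class LinkedInference:
    def __init__(self, models, links, budget_allows):
        self.models = models                 # name -> callable black-box model
        self.links = links                   # (source, target) -> trained link model
        self.budget_allows = budget_allows   # name -> bool, from the resource budget

    def predict(self, name, x, cache):
        # Run the model directly if the budget allows it, and cache its output
        # so it can serve as a source for other models' links.
        if self.budget_allows(name):
            cache[name] = self.models[name](x)
            return cache[name]
        # Otherwise approximate this model's output from an already-computed
        # source model's output via the corresponding model link.
        for (src, dst), link in self.links.items():
            if dst == name and src in cache:
                return link.predict(cache[src])
        raise RuntimeError(f"No budget and no usable link for model {name!r}")
```

Routing an application's existing predict calls through a thin wrapper of this kind is the sort of change that could plausibly stay within "dozens of lines of code", consistent with the quoted integration claim.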