2018
DOI: 10.1002/pamm.201800368
Data‐driven approximation methods applied to non‐rational functions

Abstract: In this study, four data-driven interpolation-based methods are compared. The aim is to construct reduced-order models for which the corresponding rational transfer function matches the original non-rational one at selected interpolation points. The primary method of this study is the Loewner framework [2], which addresses this problem in a natural and direct way. The other methods studied, vector fitting (VF) [5] and adaptive Antoulas-Anderson (AAA) [4], are instead based on an iterative and adaptiv…
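The adaptive flavour of AAA mentioned in the abstract can be illustrated with a minimal sketch. This is illustrative NumPy code, not the implementation compared in the paper; the test function exp(z), the sample grid, and all tolerances are assumptions. At each step the worst-approximated sample is promoted to a barycentric support point, and the weights come from the smallest right singular vector of a Loewner matrix built from the remaining samples.

```python
import numpy as np

def aaa(F, Z, tol=1e-10, max_terms=20):
    """Minimal AAA sketch: greedily promote the worst-approximated sample to a
    barycentric support point, then choose the weights as the smallest right
    singular vector of the Loewner matrix of the remaining samples."""
    Z = np.asarray(Z, dtype=complex)
    F = np.asarray(F, dtype=complex)
    support = []                                  # indices of support points
    R = np.full_like(F, F.mean())                 # current approximant values
    for _ in range(max_terms):
        support.append(int(np.argmax(np.abs(F - R))))
        zs, fs = Z[support], F[support]
        mask = np.ones(len(Z), dtype=bool)
        mask[support] = False
        C = 1.0 / (Z[mask][:, None] - zs[None, :])      # Cauchy matrix
        L = F[mask][:, None] * C - C * fs[None, :]      # Loewner matrix
        w = np.linalg.svd(L)[2][-1].conj()              # barycentric weights
        R = F.copy()                                    # exact at support points
        R[mask] = (C @ (w * fs)) / (C @ w)              # barycentric evaluation
        if np.max(np.abs(F - R)) <= tol * np.max(np.abs(F)):
            break
    return zs, fs, w

# non-rational sample data: e^z on [-1, 1] (an assumed test case)
Z = np.linspace(-1.0, 1.0, 200)
zs, fs, w = aaa(np.exp(Z), Z)
print(len(zs))  # only a handful of support points are needed
```

The greedy support-point choice is what makes the method adaptive, in contrast to the fixed interpolation points of the plain Loewner construction.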

Cited by 8 publications
(8 citation statements)
References 7 publications
“…$\{E_r, A_r, B_r, C_r\}$, linear Loewner framework [2,4]. (Step 3) $H_{2r}(j\omega_k, j\omega_k) = \underbrace{C_r\,(2j\omega_k E_r - A_r)^{-1}}_{\mathcal{O}:\ \text{gen. observability}}\, N_r\, \underbrace{(j\omega_k E_r - A_r)^{-1} B_r}_{\mathcal{R}:\ \text{gen. controllability}} \approx \frac{2\,Y_{II}(j\omega_k)}{U(j\omega_k)^2}$. (Step 4) Identify the matrix $N_r$ by solving the system $H_{2r}(j\omega_i, j\omega_i) = \mathcal{O}_{(i,:)}\, N_r\, \mathcal{R}_{(:,i)} \;\Rightarrow\; H_{2r}(j\omega_i, j\omega_i) = \big(\mathcal{O}_{(i,:)} \otimes \mathcal{R}_{(:,i)}^{\top}\big)\operatorname{vec}(N_r)$, $i = 1, \dots, n$.…”
mentioning
confidence: 99%
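The vectorisation step quoted above can be sketched as a Kronecker-structured linear system in vec(N_r). This is illustrative NumPy code under assumed random data: O, R, and N_true are synthetic stand-ins for the generalised observability/controllability quantities and the unknown matrix, and the column-major vec convention is used.

```python
import numpy as np

rng = np.random.default_rng(0)
r, n = 3, 9                        # model order; n >= r**2 samples

# synthetic stand-ins for the generalised observability/controllability data
O = rng.standard_normal((n, r))    # row i plays the role of O_(i,:)
R = rng.standard_normal((r, n))    # column i plays the role of R_(:,i)
N_true = rng.standard_normal((r, r))

# measured samples h_i = O_(i,:) N R_(:,i)
h = np.array([O[i] @ N_true @ R[:, i] for i in range(n)])

# vectorise: O_(i,:) N R_(:,i) = (R_(:,i)^T kron O_(i,:)) vec(N)
# (column-major vec convention, hence np.kron(R[:, i], O[i]))
M = np.stack([np.kron(R[:, i], O[i]) for i in range(n)])
vecN = np.linalg.lstsq(M, h, rcond=None)[0]
N_est = vecN.reshape(r, r, order="F")

print(np.max(np.abs(N_est - N_true)))  # ≈ 0 when M has full column rank
```

Each frequency sample contributes one row of the system, so n ≥ r² samples are needed for a unique solution.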
“…Low-rank approximations to … and … may be computed by adaptive cross approximation [4], which is particularly suited for hierarchical matrices, by the CUR decomposition [11] as in [22, 38], or by related schemes. These approaches select a certain number of columns and rows of the original matrices in a greedy fashion based on various heuristics, and a core matrix is utilised to compute a low-rank approximation.…”
Section: Exploiting the Structure Of … And …
mentioning
confidence: 99%
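A minimal greedy CUR sketch along these lines (illustrative NumPy code; the norm-based pivot rule is just one of the heuristics alluded to, and the low-rank test matrix is synthetic):

```python
import numpy as np

def cur_greedy(A, k):
    """Greedy CUR sketch: keep the k columns and k rows of A with the largest
    norms (one possible heuristic), then form the core U = C^+ A R^+ so that
    A is approximated by C @ U @ R."""
    cols = np.argsort(-np.linalg.norm(A, axis=0))[:k]
    rows = np.argsort(-np.linalg.norm(A, axis=1))[:k]
    C, R = A[:, cols], A[rows, :]
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)   # core matrix
    return C, U, R

# synthetic test matrix of exact rank 3
rng = np.random.default_rng(1)
A = rng.standard_normal((60, 3)) @ rng.standard_normal((3, 40))

C, U, R = cur_greedy(A, 3)
print(np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A))  # tiny: A is exactly rank 3
```

Unlike the SVD, the factors C and R consist of actual columns and rows of A, which preserves structure such as sparsity.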
“…Moreover, the memory constraints coming from the allocation of … and … are still present. Alternatively, one can focus on accelerating solely the SVD calculation step by employing Krylov methods (see, e.g., [3, 19, 25, 41], to name a few), by using the randomized SVD [31] to compute the dominant singular triplets instead of the full SVD, or by other types of inexact SVD-type decompositions (adaptive cross approximation [4], particularly suited for hierarchical matrices, or a CUR decomposition [11] as in [22, 38]).…”
Section: Introduction
mentioning
confidence: 99%
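The randomized SVD alternative mentioned here can be sketched as follows (illustrative NumPy code following the Halko-Martinsson-Tropp scheme; the target rank, oversampling parameter, and synthetic test matrix are assumptions):

```python
import numpy as np

def randomized_svd(A, k, oversample=5, seed=0):
    """Randomized SVD sketch: find an approximate range basis from a Gaussian
    sketch of A, then take an exact SVD of the small projected matrix."""
    rng = np.random.default_rng(seed)
    Y = A @ rng.standard_normal((A.shape[1], k + oversample))  # sample range(A)
    Q = np.linalg.qr(Y)[0]                                     # orthonormal basis
    B = Q.T @ A                                                # small (k+p) x n
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vt[:k]

# synthetic test matrix of exact rank 4
rng = np.random.default_rng(2)
A = rng.standard_normal((200, 4)) @ rng.standard_normal((4, 150))

U, s, Vt = randomized_svd(A, 4)
print(np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A))  # tiny for rank-4 A
```

The expensive dense SVD is applied only to the small (k+p) x n matrix B, which is what makes the scheme attractive when only the dominant singular triplets are needed.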
“…This can be done using the so-called Loewner framework [33]. The method has found wide applications, not limited to model order reduction [34–38]. First, let us generalise the problem.…”
Section: Review Of Interpolation-based Reduction
mentioning
confidence: 99%
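The basic Loewner construction referenced here can be sketched in a few lines. This is a minimal single-input single-output NumPy example, not the cited generalisation; the order-2 rational test function and the choice of left/right interpolation points are assumptions. The samples are split into "left" data (mu, v) and "right" data (lam, w), from which the Loewner and shifted-Loewner matrices define a descriptor model that interpolates the data.

```python
import numpy as np

def loewner_pencil(mu, v, lam, w):
    """Build the Loewner / shifted-Loewner matrices from left data (mu, v)
    and right data (lam, w); the descriptor model is E=-L, A=-Ls, B=v, C=w."""
    denom = mu[:, None] - lam[None, :]
    L = (v[:, None] - w[None, :]) / denom                            # Loewner
    Ls = (mu[:, None] * v[:, None] - lam[None, :] * w[None, :]) / denom
    return L, Ls

def eval_model(s, L, Ls, v, w):
    # transfer function of the Loewner model: H_r(s) = w (Ls - s L)^{-1} v
    return w @ np.linalg.solve(Ls - s * L, v)

# assumed order-2 rational test function (recovered exactly from 2+2 samples)
H = lambda s: 1.0 / (s + 1.0) + 2.0 / (s + 3.0)

mu = np.array([0.5, 1.5])    # left interpolation points
lam = np.array([1.0, 2.0])   # right interpolation points
v, w = H(mu), H(lam)
L, Ls = loewner_pencil(mu, v, lam, w)

print(abs(eval_model(5.0, L, Ls, v, w) - H(5.0)))  # ≈ 0: exact recovery
```

For a non-rational H, the same construction yields a rational model that matches H at the chosen left and right points, which is exactly the matching problem posed in the abstract.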