2022
DOI: 10.1016/j.neucom.2022.01.036

Learning sub-patterns in piecewise continuous functions

Cited by 2 publications (3 citation statements)
References 38 publications
“…During an interview, students mentioned that it was not a single function, and their responses showed that there is a gap in the knowledge of sketching piecewise functions and finding the domain. According to different studies, students who do not understand the concept of domain functions may have difficulties understanding piecewise functions (Kratsios & Zamanlooy, 2022;Triutami, Hanifah, Novitasari, Apsari, & Wulandari, 2021).…”
Section: Discussion on Limits and Continuity of Piecewise Functions
confidence: 99%
“…(2018); Gonon et al. (2020); Kratsios and Zamanlooy (2022). Not only is this column most pertinent to transformer networks trained with randomized methods, but it also highlights the potential of transformer approaches to geometric deep learning, applications to stochastic analysis, and mathematical finance.…”
Section: Static Case: Universal Approximation into QAS Spaces
confidence: 99%
“…(2021); Shen et al. (2021b); Kratsios and Zamanlooy (2022). Briefly, such activation functions allow us to build neural networks achieving exponential approximation rates.…”
Section: Introduction
confidence: 99%
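
As context for the paper's title, here is a minimal sketch of a piecewise continuous function with two "sub-patterns" (continuous branches separated by a jump). The branch functions, the breakpoint at 0, and the name `f` are illustrative assumptions, not the construction from the cited paper:

```python
import numpy as np

# A toy piecewise continuous target with two "sub-patterns"
# (illustrative assumption; not the construction from the cited paper).
def f(x):
    # branch 1 on (-inf, 0): smooth sine piece
    # branch 2 on [0, inf): shifted cosine piece, giving a jump at x = 0
    return np.where(x < 0, np.sin(x), np.cos(x) + 2.0)

x = np.linspace(-3.0, 3.0, 7)
print(f(x))               # values straddling the discontinuity at x = 0
print(f(-1e-9), f(0.0))   # left limit ~ 0.0 vs right value 3.0: the jump
```

Each branch is individually continuous, but the function as a whole is not, which is what makes such targets hard for a single smooth approximator and motivates learning the pieces separately.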