2022
DOI: 10.48550/arxiv.2204.11613
Preprint
Machine learning of the well known things

Abstract: Machine learning (ML) in its current form implies that an answer to any problem can be well approximated by a function of a very peculiar form: a specially adjusted iteration of Heaviside theta-functions. It is natural to ask whether the answers to questions we already know can be naturally represented in this form. We provide elementary, yet non-evident, examples showing that this is indeed possible, and suggest looking for a systematic reformulation of existing knowledge in an ML-consistent way. Success or a …
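As a minimal sketch of the form the abstract describes (the function names, weight shapes, and the convention theta(0) = 1 are illustrative assumptions, not taken from the paper), one "layer" of such an answer is the Heaviside theta applied to an affine map, and a deeper answer is an iterated composition of such layers:

```python
import numpy as np

def theta_layer(x, w, b):
    """One layer in the theta-function form: the Heaviside theta applied
    elementwise to the affine map w @ x + b.
    The convention theta(0) = 1 is an assumption made here."""
    return np.heaviside(w @ x + b, 1.0)

# A deeper "answer" is then an iterated composition of such layers, e.g.
# theta_layer(theta_layer(x, w1, b1), w2, b2).
```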

Cited by 2 publications (8 citation statements)
References 7 publications
“…Together with the results of [2], by now we have the following set of network building blocks (which may be used/combined as sub-networks):…”
Section: Arithmetic Operations
confidence: 99%
“…i.e. points in the segment [2,3] belong to class 1, and the rest to class 0. Let's take a level-1 network.…”
Section: Example: 1d
confidence: 99%
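The 1d example quoted above can be sketched concretely (a hypothetical illustration assuming the theta-network form from the abstract; the weights and the convention theta(0) = 1 are chosen by hand, not taken from the cited paper):

```python
import numpy as np

def theta(x):
    # Heaviside step; theta(0) = 1 is a convention chosen here.
    return np.heaviside(x, 1.0)

def classify_segment(x):
    # Indicator of the segment [2, 3] as an iterated theta expression:
    # the inner thetas detect x >= 2 and x <= 3, and the outer theta
    # fires only when both hold (the sum 2 exceeds the 1.5 threshold).
    return theta(theta(x - 2.0) + theta(3.0 - x) - 1.5)
```

Points inside [2, 3] map to class 1 and all others to class 0, reproducing the quoted level-1 construction.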
“…In Refs. [18–22], the authors utilized deep layers satisfying a specific recurrence relation. In this case, experimental or observational data allow us to find the recurrence relation determining the bulk geometry.…”
Section: Introduction
confidence: 99%