2004
DOI: 10.1109/tit.2004.830749

Finite-Memory Universal Prediction of Individual Sequences

Cited by 21 publications (17 citation statements)
References 18 publications
“…This framework we term resource bounded learning, and this is a very rich area in machine learning that needs to be explored both in the single-task learning case and the transfer learning case. Current work in this area includes Feder & Federovski, 1998; Rajwan & Feder, 2000; and Meron & Feder, 2004. The results, particularly the last paper cited, are impressive.…”
Section: Future Work
confidence: 88%
“…(Since the symbol erased from the window is unknown at each step, the symbol counts are recomputed according to a specific rule.) We shall give an overview of such algorithms, described in [2], [4] and [5].…”
Section: Data Compression Using "Sliding Window"
confidence: 99%
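The exact sliding-window counting that the quoted scheme contrasts with can be sketched as follows; this is a generic illustration (the class name, window size, and smoothing are illustrative, not taken from the cited algorithms):

```python
from collections import deque, Counter

class SlidingWindowCounts:
    """Exact sliding-window symbol counts: the window contents are
    stored, so the symbol leaving the window is known and can be
    decremented directly."""

    def __init__(self, window_size):
        self.window = deque()
        self.counts = Counter()
        self.window_size = window_size

    def update(self, symbol):
        self.window.append(symbol)
        self.counts[symbol] += 1
        if len(self.window) > self.window_size:
            old = self.window.popleft()  # exact: we know which symbol leaves
            self.counts[old] -= 1

    def probability(self, symbol, alphabet_size):
        # Laplace-smoothed frequency estimate over the current window
        return (self.counts[symbol] + 1) / (len(self.window) + alphabet_size)
```

The memory cost is the whole window plus the counts; the finite-memory schemes quoted above trade this exactness away to avoid storing the window.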
“…This algorithm is known as the "Imaginary Sliding Window" (ISW). Feder and Meron in [4] gave an extensive review of estimation techniques using finite memory and suggested a method that replaces the randomized finite-state machine with a time-invariant deterministic finite-state machine (TIDFS). In this paper we suggest a way to implement the derandomized method, which we shall call the "virtual sliding window".…”
Section: Introduction
confidence: 99%
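A minimal sketch of the randomized Imaginary Sliding Window idea described above: only the counts are kept, and since the symbol leaving the window is unknown, a symbol is evicted at random with probability proportional to its count (names and the seeding interface are illustrative assumptions, not the cited implementation):

```python
import random

class ImaginarySlidingWindow:
    """Randomized 'imaginary' sliding window: only per-symbol counts
    are stored, never the window itself. When the window is full, one
    occurrence of a randomly chosen symbol is removed, each symbol
    chosen with probability count/total."""

    def __init__(self, window_size, seed=None):
        self.counts = {}
        self.window_size = window_size
        self.rng = random.Random(seed)

    def update(self, symbol):
        total = sum(self.counts.values())
        if total >= self.window_size:
            symbols = list(self.counts)
            weights = [self.counts[s] for s in symbols]
            victim = self.rng.choices(symbols, weights=weights)[0]
            self.counts[victim] -= 1
            if self.counts[victim] == 0:
                del self.counts[victim]
        self.counts[symbol] = self.counts.get(symbol, 0) + 1
```

The total count saturates at the window size, so memory is bounded by the alphabet, at the cost of randomization; the derandomized variants quoted above aim to remove exactly this source of randomness.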
“…In reality an implementation of probability smoothing has to work with finite precision arithmetic. Taking this into account, [65] showed that a K-state FSM approximation of probability smoothing for a binary alphabet has re… Every state has N outgoing edges and an associated probability distribution. An outgoing edge (one for each letter) points to a follow-up state.…”
Section: Relative Frequencies With Smoothing
confidence: 99%
“…Any K-state FSM predictor has redundancy at least Θ(K^(−4/5) · n) [65]. Later, this lower bound was improved [44] and the authors proposed an FSM construction algorithm: given per-bit redundancy R, the algorithm constructs an FSM with per-bit redundancy close to R. Other approaches rely on randomization; the FSM proposed in [66] simulates an Imaginary Sliding Window (see next section) and attains expected redundancy O(…”
Section: Relative Frequencies With Smoothing
confidence: 99%
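As an illustration of the kind of K-state finite-memory predictor these redundancy bounds concern, here is a sketch of a simple saturating-counter FSM for binary sequences; this is a standard textbook construction, not the specific machines of [44], [65], or [66]:

```python
class SaturatingCounterPredictor:
    """K-state FSM predictor for bits: the state is a counter in
    0..K-1 that moves up on a 1 and down on a 0 (saturating at the
    ends); the prediction is 1 when the state is in the upper half."""

    def __init__(self, num_states, initial_state=0):
        self.num_states = num_states
        self.state = initial_state

    def predict(self):
        return 1 if self.state >= self.num_states // 2 else 0

    def update(self, bit):
        if bit == 1:
            self.state = min(self.state + 1, self.num_states - 1)
        else:
            self.state = max(self.state - 1, 0)

def count_errors(bits, num_states=4):
    """Sequential prediction errors of the FSM on a bit sequence."""
    predictor = SaturatingCounterPredictor(num_states)
    errors = 0
    for b in bits:
        errors += (predictor.predict() != b)
        predictor.update(b)
    return errors
```

On a constant sequence this predictor makes only a bounded number of initial errors; the quoted results characterize how much worse such bounded-state machines must be, on worst-case individual sequences, than predictors with unbounded memory.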