2017
DOI: 10.1515/bpasts-2017-0101

FPGA implementation of logarithmic versions of Baum-Welch and Viterbi algorithms for reduced precision hidden Markov models

Abstract: This paper presents a programmable system-on-chip implementation to be used for acceleration of computations within hidden Markov models. The high-level synthesis (HLS) and "divide-and-conquer" approaches are presented for parallelization of the Baum-Welch and Viterbi algorithms. To avoid arithmetic underflow, all computations are performed in logarithmic space. Additionally, in order to carry out computations efficiently, i.e. directly in an FPGA system or a processor cache, we postulate to red…
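As a point of reference, below is a minimal software sketch of a Viterbi pass carried out entirely in logarithmic space, the technique the abstract names for avoiding arithmetic underflow. It is not the paper's HLS/FPGA implementation; the function and parameter names (viterbi_log, log_pi, log_A, log_B) are illustrative assumptions. Products of probabilities become sums of logarithms, so the recursion does not underflow even for long observation sequences.

```python
import numpy as np

def viterbi_log(log_pi, log_A, log_B, obs):
    """Viterbi decoding carried out entirely in log space.

    log_pi : (N,)    log initial state probabilities
    log_A  : (N, N)  log transition matrix, log_A[i, j] = log P(j | i)
    log_B  : (N, M)  log emission matrix,   log_B[i, k] = log P(o_k | i)
    obs    : (T,)    observation indices
    """
    T, N = len(obs), len(log_pi)
    delta = np.empty((T, N))           # best log-probability of a path ending in state i at time t
    psi = np.zeros((T, N), dtype=int)  # back-pointers

    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        # products of probabilities become sums of logs, so no underflow
        scores = delta[t - 1][:, None] + log_A       # (N, N): from-state x to-state
        psi[t] = np.argmax(scores, axis=0)
        delta[t] = scores[psi[t], np.arange(N)] + log_B[:, obs[t]]

    # backtrack the most likely state path
    path = np.empty(T, dtype=int)
    path[-1] = int(np.argmax(delta[-1]))
    for t in range(T - 2, -1, -1):
        path[t] = psi[t + 1, path[t + 1]]
    return path, float(delta[-1, path[-1]])
```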

Cited by 2 publications (1 citation statement)
References 33 publications
“…We want to avoid having to set the HMM parameters manually because it requires expert knowledge; therefore, it is necessary to solve the learning problem of the network security risk assessment model. We use the modified equation of the Baum-Welch algorithm to address this issue [32]: Baum et al. proposed the forward-backward algorithm to reduce the computational complexity of P(O | λ) and defined the forward probability α_t(i) and the backward probability β_t(i). α_t(i) is defined as the probability that the previous observation sequence is O_1, O_2, …”
Section: I-HMM Model Parameters: A. Observation Sequence
confidence: 99%
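For comparison, here is a minimal NumPy sketch of the textbook forward recursion the citing authors refer to, which computes α_t(i) and P(O | λ) in O(N²T) operations instead of enumerating all N^T state paths. The function name forward and its arguments are assumptions for illustration, not code from either paper; note that this plain (non-logarithmic) version underflows for long sequences, which is exactly what motivates the log-space reformulation sketched above.

```python
import numpy as np

def forward(pi, A, B, obs):
    """Textbook forward pass: alpha[t, i] = P(O_1..O_t, q_t = i | lambda).

    pi  : (N,)    initial state distribution
    A   : (N, N)  transition matrix, A[i, j] = P(q_{t+1} = j | q_t = i)
    B   : (N, M)  emission matrix,   B[i, k] = P(O_t = o_k | q_t = i)
    obs : (T,)    observation indices
    """
    T, N = len(obs), len(pi)
    alpha = np.empty((T, N))
    alpha[0] = pi * B[:, obs[0]]                  # initialisation
    for t in range(1, T):
        # induction: sum over all predecessor states, O(N^2) per time step
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    return alpha, alpha[-1].sum()                 # P(O | lambda)
```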