2021 IEEE International Conference on Big Data (Big Data)
DOI: 10.1109/bigdata52589.2021.9672071

Parallelization of Sequential Pattern Sampling

Citation types: 0 supporting, 1 mentioning, 0 contrasting

Years published (citing works): 2022, 2024

Cited by 2 publications (1 citation statement); References 7 publications
“…This modular design not only contributes to the model's scalability but also simplifies the addition of more layers, thereby facilitating the development of larger and more powerful models. The parallelization [38], combined with layer normalization and residual connections, contributes to more stable and faster training, especially for large models. In contrast, training deep RNNs can be a delicate task, requiring careful initialization and regularization to mitigate challenges related to the sequential nature of computation.…”
Section: Transformer (mentioning)
Confidence: 99%
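The quoted statement contrasts the transformer's block structure, where every sub-layer is wrapped in a residual connection and layer normalization, with the step-by-step computation of RNNs. As a minimal illustration (not taken from the cited paper; the class name, hyperparameters, and stack depth below are assumptions), the following Python/PyTorch sketch shows that pattern: self-attention and a feed-forward sub-layer, each followed by residual addition and layer norm, with identical blocks stacked to add depth.

import torch
import torch.nn as nn

class EncoderBlock(nn.Module):
    """One transformer encoder block: each sub-layer is wrapped in a
    residual connection followed by layer normalization (illustrative
    hyperparameter defaults, not from the cited paper)."""

    def __init__(self, d_model=512, n_heads=8, d_ff=2048, dropout=0.1):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads,
                                          dropout=dropout, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model)
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.drop = nn.Dropout(dropout)

    def forward(self, x):
        # Self-attention processes all sequence positions in one batched
        # operation (parallel over the sequence), unlike an RNN step loop.
        attn_out, _ = self.attn(x, x, x)
        x = self.norm1(x + self.drop(attn_out))   # residual + layer norm
        # Position-wise feed-forward, again wrapped the same way.
        x = self.norm2(x + self.drop(self.ff(x))) # residual + layer norm
        return x

# The "modular design" the statement mentions: adding depth is just
# appending more identical blocks.
encoder = nn.Sequential(*[EncoderBlock() for _ in range(6)])
out = encoder(torch.randn(2, 16, 512))  # (batch, seq_len, d_model)

Because the attention and feed-forward computations are batched over all positions at once, gradients flow through the residual paths rather than through a long chain of timesteps, which is the property the statement credits for faster and more stable training relative to deep RNNs.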