2021
DOI: 10.1088/1742-6596/1873/1/012053

A novel linguistic steganalysis method for hybrid steganographic texts

Abstract: Most existing linguistic steganalysis methods focus on detecting steganographic texts generated by embedding secret information into a single type of text medium using a single steganographic algorithm. In practical applications, however, many steganographic texts may be hybrid ones, generated by embedding secret information into different types of text media using different steganographic algorithms. In this paper, inspired by transfer learning, a novel linguistic steganalysis…

Cited by 2 publications (3 citation statements) · References 12 publications
“…Subsequently, Zou et al. [27] employed Bidirectional Encoder Representations from Transformers (BERT) and Global Vectors for Word Representation (GloVe) to capture inter-sentence contextual association relationships, extracted context information using a Bi-LSTM, and finally obtained sensitive semantic features via an attention mechanism for steganographic text detection. Xu et al. [28] employed a pre-trained BERT language model to obtain initial contextually relevant word representations, after which the extracted features were fed into an LSTM with attention to obtain the final sentence representation used to classify the detected texts. In addition, [28] also mixes steganographic texts generated by several steganographic methods.…”
Section: Related Work
Citation type: mentioning (confidence: 99%)
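For orientation, the detection pipeline described for [28] (pre-trained BERT word representations fed into an LSTM with attention, pooled into a sentence vector for binary classification) can be sketched as below. This is a minimal illustration assuming PyTorch and Hugging Face transformers; the model name, hidden size, and additive attention form are assumptions for the sketch, not details confirmed by the paper.

```python
# Hypothetical sketch: BERT word representations -> LSTM -> attention pooling -> classifier.
# All names and hyperparameters are illustrative, not taken from [28].
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class StegoDetector(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", hidden=256):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)   # contextual word representations
        self.lstm = nn.LSTM(self.bert.config.hidden_size, hidden,
                            batch_first=True, bidirectional=True)
        self.att = nn.Linear(2 * hidden, 1)                # per-token attention score
        self.clf = nn.Linear(2 * hidden, 2)                # cover vs. stego logits

    def forward(self, input_ids, attention_mask):
        # Contextual token embeddings from the pre-trained language model
        h = self.bert(input_ids=input_ids,
                      attention_mask=attention_mask).last_hidden_state
        h, _ = self.lstm(h)                                # sequential context features
        scores = self.att(h).squeeze(-1)                   # (batch, seq_len)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        sent = (weights * h).sum(dim=1)                    # attention-weighted sentence vector
        return self.clf(sent)                              # classification logits

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tok(["an example sentence to score"], return_tensors="pt", padding=True)
logits = StegoDetector()(batch["input_ids"], batch["attention_mask"])
```

The attention pooling over Bi-LSTM states is one common way to realize "LSTM with attention"; the cited work may differ in how scores are computed or how the final representation is formed.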