Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1115

Neural Linguistic Steganography

Abstract: Whereas traditional cryptography encrypts a secret message into an unintelligible form, steganography conceals that communication is taking place by encoding a secret message into a cover signal. Language is a particularly pragmatic cover signal due to its benign occurrence and independence from any one medium. Traditionally, linguistic steganography systems encode secret messages in existing text via synonym substitution or word order rearrangements. Advances in neural language models enable previously imprac…

Cited by 66 publications (55 citation statements) | References 17 publications

“…Traditionally, arithmetic coding encodes a string of elements into a bit sequence. To use such a coding for linguistic steganography, we follow (Ziegler et al, 2019) and reverse the encoding order. Namely, we encode a bit sequence (ciphertext) into a string of tokens (cover text) and decode a cover text to its original ciphertext.…”
Section: Arithmetic Coding
confidence: 99%
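
To make the reversed coding direction concrete, here is a minimal sketch in Python. It is a toy under stated assumptions, not the implementation from Ziegler et al. (2019): `toy_lm`, its vocabulary, and the fixed probabilities are hypothetical stand-ins for a neural language model's next-token distribution. The ciphertext bits define an interval in [0, 1); the encoder repeatedly picks the cover token whose probability subinterval contains a point in that interval, and the decoder re-runs the same interval arithmetic on the cover text to recover the bits.

```python
# Toy sketch of "reversed" arithmetic coding for steganography: the ciphertext
# bits pick out an interval in [0, 1), and cover tokens are chosen so that the
# language model's cumulative probability intervals zoom in on it.
# `toy_lm` is a hypothetical stand-in for a neural LM's next-token distribution.

VOCAB = ["the", "cat", "sat", "on", "a", "mat"]

def toy_lm(context):
    """Hypothetical next-token distribution; a real system would query an LM."""
    probs = [0.30, 0.25, 0.15, 0.12, 0.10, 0.08]        # sums to 1.0
    shift = len(context) % len(VOCAB)                    # fake context dependence
    return list(zip(VOCAB[shift:] + VOCAB[:shift], probs))

def partition(low, high, dist):
    """Split [low, high) into per-token subintervals proportional to probs."""
    edges, cum = [], low
    for tok, p in dist:
        nxt = cum + (high - low) * p
        edges.append((tok, cum, nxt))
        cum = nxt
    tok, lo, _ = edges[-1]
    edges[-1] = (tok, lo, high)                          # absorb float drift
    return edges

def encode(bits, max_tokens=64):
    """Bits (ciphertext) -> tokens (cover text), i.e. arithmetic *decoding* of bits."""
    k = len(bits)
    msg_low = int("".join(map(str, bits)), 2) / 2 ** k
    msg_high = msg_low + 2 ** -k
    z = (msg_low + msg_high) / 2                         # target inside the message interval
    low, high, tokens = 0.0, 1.0, []
    while not (msg_low <= low and high <= msg_high):
        if len(tokens) >= max_tokens:
            raise RuntimeError("toy precision limit reached")
        for tok, lo, hi in partition(low, high, toy_lm(tokens)):
            if lo <= z < hi:
                tokens.append(tok)
                low, high = lo, hi
                break
    return tokens

def decode(tokens, num_bits):
    """Tokens (cover text) -> bits, by re-running the same interval arithmetic."""
    low, high = 0.0, 1.0
    for i, tok in enumerate(tokens):
        for t, lo, hi in partition(low, high, toy_lm(tokens[:i])):
            if t == tok:
                low, high = lo, hi
                break
    # `low` lies inside the message interval, so its first num_bits binary
    # digits are exactly the ciphertext.
    bits, x = [], low
    for _ in range(num_bits):
        x *= 2
        bits.append(int(x))
        x -= bits[-1]
    return bits

if __name__ == "__main__":
    secret = [1, 0, 1, 1, 0, 0, 1, 0]
    cover = encode(secret)
    print("cover text:", " ".join(cover))
    assert decode(cover, len(secret)) == secret
```

Under this toy distribution the 8-bit secret fits into a few cover tokens; with a real LM, the number of cover tokens needed per bit depends on the entropy of its next-token distributions.
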
“…Due to the above reasons, people in practice will truncate the LM distribution to include only top K most likely tokens (Ziegler et al, 2019), which leads to the following distribution:…”
Section: Imperceptibility Analysis
confidence: 99%
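
The truncation step quoted above is straightforward to illustrate. The sketch below is a hedged example, not the cited paper's code: it keeps only the K most probable tokens of a next-token distribution and renormalizes, which is the kind of truncated distribution the citing paper refers to; the NumPy usage and the example `probs` vector are illustrative assumptions.

```python
import numpy as np

def truncate_top_k(probs, k):
    """Zero out all but the k largest probabilities and renormalize."""
    probs = np.asarray(probs, dtype=float)
    keep = np.argsort(probs)[-k:]          # indices of the top-k tokens
    truncated = np.zeros_like(probs)
    truncated[keep] = probs[keep]
    return truncated / truncated.sum()

# Example: a 6-token "vocabulary" with K = 3
p = [0.40, 0.25, 0.15, 0.10, 0.06, 0.04]
print(truncate_top_k(p, 3))                # -> [0.5, 0.3125, 0.1875, 0., 0., 0.]
```

Restricting the encoder to the top-K tokens avoids sampling from the unreliable low-probability tail of the LM, at the cost of diverging slightly from the model's true distribution.
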