Data compression is a technique for reducing the size of a file: redundant information is removed, or parts that repeat the same information are stored only once. When the compressed file is decompressed, it retains all the properties of the original file and can be used in the same way. Data compression can be performed with different techniques, including Huffman coding, Lempel-Ziv-Welch (LZW) coding, and the Burrows-Wheeler Transform (BWT). Some techniques simplify the data, while others identify repeated parts and store them separately. Which technique to use depends on the type and size of the data to be compressed. The Huffman, LZW, BWT, and deflate algorithms are among the most commonly used techniques for text compression; each takes a different approach and can yield different compression ratios and performance. In this study, the performance of these data compression techniques was measured on specific datasets, both individually and in stacked pairs. Among the individual algorithms, deflate gave the best result, with a compression ratio of 29.08. Among the stacked pairs, the BWT-Deflate combination gave the best result, with a compression ratio of 57.36. Moreover, when compressing in pairs, the order in which the two algorithms are applied can make a significant difference in the compression ratio. The performance measurements obtained by applying the algorithms in different orders are compared, and suggestions for achieving optimum performance are presented.
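The stacking idea above can be sketched in a few lines: apply one transform, then compress the result, and compare the ratio with compressing directly. This is a minimal illustration, not the study's implementation; it assumes the compression ratio is measured as the percentage reduction in size, uses Python's zlib for deflate, and a deliberately naive O(n² log n) BWT that is only suitable for small demo inputs.

```python
import zlib

def bwt(data: bytes) -> bytes:
    # Naive Burrows-Wheeler Transform: sort all rotations, take last column.
    s = data + b"\x00"  # sentinel byte, assumed absent from the input
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return bytes(rot[-1] for rot in rotations)

def ratio(original: bytes, compressed: bytes) -> float:
    # Percentage size reduction: 100 * (1 - compressed / original).
    return 100.0 * (1 - len(compressed) / len(original))

text = b"banana band bandana banana band bandana " * 50
deflate_only = zlib.compress(text, 9)          # deflate applied alone
bwt_then_deflate = zlib.compress(bwt(text), 9) # BWT first, then deflate
print(f"deflate alone : {ratio(text, deflate_only):.2f}%")
print(f"BWT -> deflate: {ratio(text, bwt_then_deflate):.2f}%")
```

Reversing the order (deflate first, then BWT) would typically hurt: deflate output is close to random, so a rearranging transform like BWT has little repetition left to expose, which is why application order matters.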
A wireless sensor network (WSN) with a tree-like topology is considered, which performs measurements and transmits the results to the consumer. Under interference, the low transmitter power of the WSN nodes makes the transmitted information vulnerable, leading to significant data loss. To reduce data loss during transmission, a noise-immune WSN model is proposed. Having detected the absence of a stable connection between a pair of nodes, such a WSN transfers the interaction between these nodes to a radio channel free from interference. To this end, in addition to forming the network and transferring application data, the model provides for checking communication availability via a keep-alive mechanism and restoring the network with a possible channel change. A distinctive feature of the proposed approach is the ability to restore network connectivity under interference of such power and duration that service messages cannot be exchanged on the channel selected for node interaction. To support the model, algorithms and data structures have been developed, and indicators have been formalized to assess the quality of the anti-jamming system's operation.
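The keep-alive-then-switch behaviour described above can be sketched as follows. This is a simplified illustration, not the paper's algorithm: the class name, the probe interval, the missed-probe threshold, and the channel list are all hypothetical, and real firmware would drive these handlers from radio events and timers.

```python
KEEPALIVE_TIMEOUT = 3  # consecutive missed probes before the link is declared lost (assumed value)

class LinkMonitor:
    """Tracks keep-alive replies from one neighbour and switches channel on link loss."""

    def __init__(self, channels):
        self.channels = channels  # candidate radio channels to fall back to
        self.current = 0          # index of the channel currently in use
        self.missed = 0           # consecutive keep-alive probes left unanswered

    def on_keepalive_reply(self):
        self.missed = 0           # link confirmed alive; reset the counter

    def on_probe_timeout(self):
        self.missed += 1
        if self.missed >= KEEPALIVE_TIMEOUT:
            self.switch_channel()

    def switch_channel(self):
        # Move to the next candidate channel, assumed free of interference.
        self.current = (self.current + 1) % len(self.channels)
        self.missed = 0

mon = LinkMonitor(channels=[11, 15, 20, 25])
for _ in range(3):          # three probes in a row go unanswered
    mon.on_probe_timeout()
print("now on channel", mon.channels[mon.current])  # → now on channel 15
```

Note that this only handles the case where the nodes can still agree on the next channel; the paper's distinctive contribution is recovering even when service messages cannot be exchanged on the jammed channel at all, which this sketch does not capture.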