2015
DOI: 10.1016/j.jvcir.2015.06.001
Digital watermark extraction using support vector machine with principal component analysis based feature reduction

Cited by 55 publications (39 citation statements)
References 38 publications
“…(1) Singular value decomposition: this method works well when the data dimension is high and is often used as a preprocessing step to achieve convergence of fuzzy rough reduction on high-dimensional data sets, but its computational cost is high [13,14]. (2) Principal component analysis: dimensionality is reduced, and the mutual influence between evaluation indexes is eliminated, by replacing the original variables with a few principal components that carry the largest contributions; deleting irrelevant or unimportant attributes is necessary to remove the interference of irrelevant features in high-dimensional data [15]. (3) Effective feature extraction in deep learning: data-driven deep learning has been developed and applied in many fields, and combining multiple processing layers has improved the ability to fit and extract features in a variety of data-analysis tasks [16]. (4) Attribute reduction in rough set theory: an extension of the theory of modeling ambiguity and imprecision [17].…”
Section: Introduction
confidence: 99%
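The PCA-based reduction described in the citation statement above, replacing the original variables with a few high-contribution principal components, can be sketched as follows. The data matrix and component count here are illustrative assumptions, not values from the cited papers:

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto the top-k principal components.

    X: (n_samples, n_features) data matrix.
    Returns (X_reduced, explained_variance_ratio_of_kept_components).
    """
    # Center the data so the components capture variance, not the mean.
    Xc = X - X.mean(axis=0)
    # SVD of the centered matrix; right singular vectors are the principal axes.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    # Project onto the first k axes (ordered by decreasing singular value).
    X_red = Xc @ Vt[:k].T
    # Fraction of total variance explained by each component.
    var = S**2 / np.sum(S**2)
    return X_red, var[:k]

# Toy feature matrix: 6 samples, 4 correlated features (illustrative only).
rng = np.random.default_rng(0)
base = rng.normal(size=(6, 2))
X = np.hstack([base, base + 0.01 * rng.normal(size=(6, 2))])
X_red, ratio = pca_reduce(X, 2)
print(X_red.shape)  # (6, 2): features reduced from 4 to 2
```

Because the four toy features are built from two underlying columns, two components retain nearly all of the variance, which is the "larger contributions" criterion the quoted passage describes.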
“…Verma et al. [33] embedded a binary watermark into the low-pass sub-band of the lifting wavelet transform (LWT) of the original cover image via a quantization process. They proposed a watermark extraction method based on SVM and Principal Component Analysis (PCA).…”
Section: Related Work
confidence: 99%
“…To demonstrate the superiority and robustness advantages of our scheme, a comparative study with state-of-the-art works is indispensable. We therefore compared the robustness of our scheme with six robust watermarking schemes [26,27,35,37-39]. To ensure a fair comparison, the same insertion parameters (PSNR, watermark size) and the same attacks were used.…”
Section: Comparative Study
confidence: 99%
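PSNR, used above as the fixed imperceptibility parameter in the comparison, can be computed as follows for 8-bit images; the pixel arrays are illustrative:

```python
import math

def psnr(original, distorted, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-size images,
    given here as flat sequences of 8-bit pixel values."""
    mse = sum((a - b) ** 2 for a, b in zip(original, distorted)) / len(original)
    if mse == 0:
        return float("inf")  # identical images: no distortion
    return 10 * math.log10(peak**2 / mse)

# Two 4-pixel "images" differing by 1 gray level everywhere: MSE = 1.
print(round(psnr([100, 120, 140, 160], [101, 121, 141, 161]), 2))  # 48.13
```

Fixing PSNR across schemes, as the quoted comparison does, equalizes embedding distortion so that any robustness difference under attack is attributable to the scheme rather than to a lighter embedding.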