Universal steganalysis method based on multi-domain features
Yan, Yan; Li, Liting; Zhang, Qiuyu
Journal | Journal of Information and Computational Science
Publication date | 2013-05-01
Volume | 10, Issue 7, Pages 2177-2185
Keywords | Algorithms; Support vector machines; Correlation coefficient; Difference values; Embedding capacity; Embedding rates; Mean and standard deviations; Multi-domain features; Statistical features; Universal steganalysis
ISSN | 1548-7741
DOI | 10.12733/jics20101693 |
Abstract | A new universal steganalysis method for BMP images with low embedding rates, based on multi-domain features, is proposed in this paper. It provides a way to extract statistical features from multiple domains for universal steganalysis and addresses the problem of low detection rates when only a small amount of data is embedded. Features are extracted from gradient energy differences in the spatial domain, correlation coefficients in the DCT domain, and the mean and standard deviation of the difference-value matrix in the DWT domain. Experimental results show that, compared with existing methods, detection achieves better reliability when the embedding capacity is above 2 KB. © 2013 by Binary Information Press.
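The abstract names three feature sources: spatial-domain gradient energy, DCT-domain correlation coefficients, and mean/standard deviation of DWT difference subbands. The sketch below is a minimal illustration of that multi-domain idea, not the authors' implementation: the exact feature definitions, the block sizes, and the choice of an orthonormal DCT-II basis and a one-level Haar DWT are all assumptions for illustration (the paper's features would then feed a support vector machine classifier).

```python
# Hedged sketch of multi-domain feature extraction for steganalysis.
# NOT the paper's code: the concrete formulas are illustrative stand-ins.
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis matrix (assumed transform; rows are basis vectors).
    k = np.arange(n)
    M = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    M[0] /= np.sqrt(2)
    return M * np.sqrt(2.0 / n)

def gradient_energy(img):
    # Spatial-domain feature: sum of squared forward differences
    # (a simple proxy for the paper's "gradient energy").
    gx = np.diff(img, axis=1)
    gy = np.diff(img, axis=0)
    return float((gx ** 2).sum() + (gy ** 2).sum())

def dct_correlation(img):
    # DCT-domain feature: correlation coefficient between horizontally
    # adjacent DCT coefficients (embedding tends to weaken this correlation).
    n = img.shape[0]                      # assumes a square image
    D = dct_matrix(n)
    C = D @ img @ D.T                     # 2-D DCT via separable basis
    a, b = C[:, :-1].ravel(), C[:, 1:].ravel()
    return float(np.corrcoef(a, b)[0, 1])

def haar_dwt_features(img):
    # DWT-domain features: one-level Haar transform; mean and standard
    # deviation of the detail (difference-value) subbands LH, HL, HH.
    a = (img[0::2, :] + img[1::2, :]) / 2  # vertical average
    d = (img[0::2, :] - img[1::2, :]) / 2  # vertical difference
    LH = (a[:, 0::2] - a[:, 1::2]) / 2
    HL = (d[:, 0::2] + d[:, 1::2]) / 2
    HH = (d[:, 0::2] - d[:, 1::2]) / 2
    det = np.concatenate([LH.ravel(), HL.ravel(), HH.ravel()])
    return float(det.mean()), float(det.std())

def extract_features(img):
    # Concatenate the three domains into one feature vector for a classifier.
    m, s = haar_dwt_features(img)
    return np.array([gradient_energy(img), dct_correlation(img), m, s])
```

In a full pipeline, `extract_features` would be applied to cover and stego training images and the resulting vectors passed to an SVM, matching the abstract's classifier choice.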
Language | English
Publisher | Binary Information Press, Flat F, 8th Floor, Block 3, Tanner Garden, 18 Tanner Road, Hong Kong
Content type | Journal article
Source URL | [http://ir.lut.edu.cn/handle/2XXMBERH/112912]
Collection | School of Computer and Communication
Affiliation | School of Computer and Communication, Lanzhou University of Technology, Lanzhou 730050, China
Recommended citation (GB/T 7714) | Yan, Yan, Li, Liting, Zhang, Qiuyu. Universal steganalysis method based on multi-domain features[J]. Journal of Information and Computational Science, 2013, 10(7): 2177-2185.
APA | Yan, Yan, Li, Liting, & Zhang, Qiuyu. (2013). Universal steganalysis method based on multi-domain features. Journal of Information and Computational Science, 10(7), 2177-2185.
MLA | Yan, Yan, et al. "Universal steganalysis method based on multi-domain features". Journal of Information and Computational Science 10.7 (2013): 2177-2185.