Abstract
To address the current lack of group privacy protection in big data, we propose a big data group privacy protection method based on a bipartite association graph, which protects data privacy at different group privacy levels within that graph. The proposed algorithm achieves group privacy protection for published big data through association graph layering (AGL) and hierarchical group differential privacy (HGDP). AGL groups the nodes and edges of a given association graph; by partitioning the nodes of the bipartite association graph so as to minimize the sensitivity of each level, subgraphs at different levels can be exposed to users with different permissions. In the HGDP process, a sensitivity is selected and the noise variance is computed for each level, noise is aggregated repeatedly to reduce the variance, and noise is injected into each subgraph via the Gaussian mechanism, perturbing the layered association graph so that group privacy is guaranteed at every level. Experimental results show that the proposed method can protect the aggregate sensitive information of group data, and that it achieves stronger privacy protection and higher data utility than other methods.
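As a rough illustration of the noise-injection step the abstract describes, the sketch below applies the standard Gaussian mechanism to per-level aggregate statistics of a bipartite subgraph. The function names, the toy edge-count data, and the calibration formula sigma = S * sqrt(2 ln(1.25/delta)) / epsilon (the classical analytic bound for the Gaussian mechanism) are assumptions for illustration only, not the paper's actual HGDP implementation:

```python
import math
import random

def gaussian_sigma(sensitivity, epsilon, delta):
    """Classical calibration for the Gaussian mechanism:
    sigma = sensitivity * sqrt(2 * ln(1.25 / delta)) / epsilon."""
    return sensitivity * math.sqrt(2.0 * math.log(1.25 / delta)) / epsilon

def perturb_subgraph(edge_counts, sensitivity, epsilon, delta, rng=random):
    """Add independent Gaussian noise to each aggregate statistic
    of one privacy level's subgraph (hypothetical representation:
    a dict mapping an edge label to its count)."""
    sigma = gaussian_sigma(sensitivity, epsilon, delta)
    return {edge: count + rng.gauss(0.0, sigma)
            for edge, count in edge_counts.items()}

# Toy bipartite-subgraph statistics for one level (illustrative data).
counts = {"userA-item1": 3, "userB-item2": 5}
noisy = perturb_subgraph(counts, sensitivity=1.0, epsilon=1.0, delta=1e-5)
```

In the paper's layered setting, each level would use its own minimized sensitivity from the AGL partition, and repeated aggregation of the noisy releases would be used to reduce the effective variance before publication.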
Authors
TIAN Hua (田华)
HE Yi (何翼)
School of Big Data Science, Tongren University, Tongren 554300, P.R. China
Source
Journal of Chongqing University of Posts and Telecommunications (Natural Science Edition)
CSCD; Peking University Core Journal (北大核心)
2020, No. 4, pp. 673-680 (8 pages)
Funding
Major Research Project of the Innovation Group of the Guizhou Provincial Department of Education (黔教合KY字[2016]051).
Keywords
bipartite association graph
big data
association graph layering
hierarchical group differential privacy