Architecture Matters: Investigating the Influence of Differential Privacy on Neural Network Design


One barrier to wider adoption of differentially private neural networks is the accuracy loss they entail. To address this issue, the relationship between neural network architectures and model accuracy under differential privacy constraints needs to be better understood. As a first step, we test whether existing knowledge on architecture design also holds in the differentially private setting. Our findings show that it does not: architectures that perform well without differential privacy do not necessarily do so with differential privacy. Consequently, existing knowledge on neural network architecture design cannot be seamlessly transferred to the differential privacy context. Future research is needed to better understand the relationship between neural network architectures and model accuracy, so that better architecture design choices can be made under differential privacy constraints.

Original title: Architecture Matters: Investigating the Influence of Differential Privacy on Neural Network Design

Original abstract: One barrier to more widespread adoption of differentially private neural networks is the entailed accuracy loss. To address this issue, the relationship between neural network architectures and model accuracy under differential privacy constraints needs to be better understood. As a first step, we test whether extant knowledge on architecture design also holds in the differentially private setting. Our findings show that it does not; architectures that perform well without differential privacy do not necessarily do so with differential privacy. Consequently, extant knowledge on neural network architecture design cannot be seamlessly translated into the differential privacy context. Future research is required to better understand the relationship between neural network architectures and model accuracy to enable better architecture design choices under differential privacy constraints.
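
For readers unfamiliar with how differential privacy enters neural network training, the sketch below illustrates a single DP-SGD update: each example's gradient is clipped to a fixed L2 norm and calibrated Gaussian noise is added before the parameter step. This clipping and noise is the usual source of the accuracy loss the paper discusses. The code is an illustrative PyTorch sketch, not the paper's implementation; the toy model, data shapes, and hyperparameters (max_grad_norm, noise_multiplier, lr) are placeholder assumptions.

```python
# Minimal, illustrative DP-SGD step (per-example clipping + Gaussian noise).
# All names and numbers here are placeholders, not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

def dp_sgd_step(model, batch_x, batch_y, lr=0.1, max_grad_norm=1.0, noise_multiplier=1.0):
    """One DP-SGD update: clip each example's gradient, sum, add noise, step."""
    params = [p for p in model.parameters() if p.requires_grad]
    summed_grads = [torch.zeros_like(p) for p in params]

    # Per-example gradients via microbatching (simple but slow).
    for x, y in zip(batch_x, batch_y):
        loss = F.cross_entropy(model(x.unsqueeze(0)), y.unsqueeze(0))
        grads = torch.autograd.grad(loss, params)

        # Clip the whole per-example gradient to L2 norm <= max_grad_norm.
        total_norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        clip_coef = (max_grad_norm / (total_norm + 1e-6)).clamp(max=1.0)
        for s, g in zip(summed_grads, grads):
            s.add_(g * clip_coef)

    batch_size = len(batch_x)
    with torch.no_grad():
        for p, s in zip(params, summed_grads):
            # Noise scaled to the clipping norm is what provides the privacy guarantee.
            noise = torch.randn_like(s) * noise_multiplier * max_grad_norm
            p.add_(-(lr / batch_size) * (s + noise))

# Toy usage: the update works for any architecture, but the paper's point is that
# the architecture choice itself interacts with clipping and noise.
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10))
x = torch.randn(32, 1, 28, 28)
y = torch.randint(0, 10, (32,))
dp_sgd_step(model, x, y)
```

In practice this mechanism is usually provided by a library (for example, Opacus for PyTorch) rather than hand-written, but the clip-then-noise structure above is what makes accuracy under differential privacy depend on architecture in ways that differ from the non-private setting.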