Scalable Geometric Deep Learning on Molecular Graphs

Posted: 2023-04-18 14:51:51

Deep learning in molecular and materials science has been limited by the lack of integration between applied science, artificial intelligence, and high-performance computing. Bottlenecks in the amount of training data, the size and complexity of model architectures, and the scale of the compute infrastructure are all key factors limiting the scaling of deep learning for molecules and materials. Here we present LitMatter, a lightweight framework for scaling molecular deep learning methods. We train four graph neural network architectures on more than 400 GPUs and study the scaling behavior of these methods. Depending on the model architecture, training-time speedups of up to 60× are observed. Empirical neural scaling relations quantify the model-dependent scaling, enabling optimal allocation of compute resources and the identification of scalable implementations of molecular geometric deep learning models.
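The LitMatter code itself is not reproduced in this post, but the "lightweight framework" idea boils down to wrapping the model in a high-level trainer so that the same script runs on one GPU or on hundreds. Below is a minimal, hypothetical sketch assuming PyTorch Lightning and PyTorch Geometric; the class name MolGNN, the choice of the QM9 dataset, and all hyperparameters are illustrative and are not taken from the paper or from LitMatter's actual API.

```python
# Hypothetical sketch: data-parallel multi-GPU training of a small GNN with
# PyTorch Lightning. Not the LitMatter API; names and settings are illustrative.
import torch
import torch.nn.functional as F
import pytorch_lightning as pl
from torch_geometric.nn import GCNConv, global_mean_pool
from torch_geometric.datasets import QM9
from torch_geometric.loader import DataLoader


class MolGNN(pl.LightningModule):
    """Two-layer GCN regressor on molecular graphs (illustrative only)."""

    def __init__(self, hidden_dim: int = 64, target_idx: int = 0):
        super().__init__()
        self.conv1 = GCNConv(11, hidden_dim)   # QM9 node features have 11 dims
        self.conv2 = GCNConv(hidden_dim, hidden_dim)
        self.readout = torch.nn.Linear(hidden_dim, 1)
        self.target_idx = target_idx

    def forward(self, data):
        h = F.relu(self.conv1(data.x, data.edge_index))
        h = F.relu(self.conv2(h, data.edge_index))
        h = global_mean_pool(h, data.batch)    # graph-level embedding
        return self.readout(h).squeeze(-1)

    def training_step(self, batch, batch_idx):
        pred = self(batch)
        loss = F.mse_loss(pred, batch.y[:, self.target_idx])
        self.log("train_loss", loss, sync_dist=True)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


if __name__ == "__main__":
    dataset = QM9(root="data/qm9")
    loader = DataLoader(dataset, batch_size=128, shuffle=True, num_workers=4)

    # Distributed data parallelism: Lightning launches one process per GPU and
    # shards batches across devices, so scaling out is mostly a matter of
    # raising `devices` / `num_nodes` rather than rewriting the training loop.
    trainer = pl.Trainer(accelerator="gpu", devices=4, num_nodes=1,
                         strategy="ddp", max_epochs=10)
    trainer.fit(MolGNN(), loader)
```

The appeal of this pattern for benchmarking is that the model definition stays fixed while only the `devices`, `num_nodes`, and `strategy` arguments change between runs, which makes it straightforward to measure how training time scales with hardware.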

Original title: Scalable Geometric Deep Learning on Molecular Graphs

Original abstract: Deep learning in molecular and materials sciences is limited by the lack of integration between applied science, artificial intelligence, and high-performance computing. Bottlenecks with respect to the amount of training data, the size and complexity of model architectures, and the scale of the compute infrastructure are all key factors limiting the scaling of deep learning for molecules and materials. Here, we present LitMatter, a lightweight framework for scaling molecular deep learning methods. We train four graph neural network architectures on over 400 GPUs and investigate the scaling behavior of these methods. Depending on the model architecture, training time speedups up to 60× are seen. Empirical neural scaling relations quantify the model-dependent scaling and enable optimal compute resource allocation and the identification of scalable molecular geometric deep learning model implementations.
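The "empirical neural scaling relations" mentioned in the abstract can be thought of as simple power-law fits of training cost against resources. The sketch below fits such a relation with NumPy; the GPU counts and per-epoch times are made-up placeholders, not measurements from the paper.

```python
# Illustrative sketch: fit an empirical scaling relation t(N) ≈ a * N**(-b),
# where N is the GPU count and t the time per epoch. Data below is invented.
import numpy as np

gpus = np.array([1, 2, 4, 8, 16, 32, 64])                 # hypothetical GPU counts
epoch_time = np.array([640, 330, 172, 95, 54, 33, 22])    # hypothetical seconds/epoch

# A linear fit in log-log space recovers the exponent b and prefactor a:
# log t = log a - b * log N
slope, intercept = np.polyfit(np.log(gpus), np.log(epoch_time), deg=1)
a, b = np.exp(intercept), -slope
print(f"t(N) ~ {a:.1f} * N^(-{b:.2f})")

# The fitted relation can then inform compute budgeting, e.g. estimating how
# many GPUs are needed to reach a target epoch time under this scaling law.
target = 30.0  # seconds per epoch
n_needed = (a / target) ** (1.0 / b)
print(f"~{n_needed:.0f} GPUs needed for {target:.0f} s/epoch under this fit")
```

An exponent b close to 1 indicates near-linear strong scaling, while a smaller exponent signals diminishing returns from adding GPUs, which is the kind of model-dependent behavior the paper quantifies to guide resource allocation.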