Issue 15, 2019

Extensive deep neural networks for transferring small scale learning to large scale systems

Abstract

We present a physically motivated topology of a deep neural network that can efficiently infer extensive parameters (such as energy, entropy, or number of particles) of arbitrarily large systems, doing so with O(N) scaling. We use a form of domain decomposition for training and inference, where each sub-domain (tile) comprises a non-overlapping focus region surrounded by an overlapping context region. The size of these regions is motivated by the physical interaction length scales of the problem. We demonstrate the application of these extensive deep neural networks (EDNNs) to three physical systems: the Ising model and two hexagonal/graphene-like datasets. In the latter, an EDNN was able to make total energy predictions of a 60-atom system, with accuracy comparable to density functional theory (DFT), in 57 milliseconds. Additionally, EDNNs are well suited for massively parallel evaluation, as no communication is necessary during neural network evaluation. We demonstrate that EDNNs can be used to make an energy prediction of a two-dimensional 35.2 million-atom system, over 1.0 μm² of material, at an accuracy comparable to DFT, in under 25 minutes. Such a system exists on a length scale visible with optical microscopy and larger than some living organisms.
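The tiling scheme described in the abstract can be illustrated with a short sketch. The following Python example is a minimal, illustrative rendering (not the authors' implementation): it decomposes a periodic 2D lattice into non-overlapping focus regions, pads each with an overlapping context region, and sums independent per-tile predictions to obtain an extensive quantity. The tile sizes and the stand-in `tile_model` callable are assumptions for illustration only.

```python
# Minimal sketch (not the authors' code) of the EDNN tiling scheme:
# a periodic 2D lattice is split into non-overlapping focus regions,
# each padded with an overlapping context region, and per-tile model
# outputs are summed to give an extensive quantity such as total energy.
# All names and sizes here are illustrative assumptions.

import numpy as np

def extract_tiles(lattice, focus, context):
    """Split a square periodic lattice into (focus + 2*context)^2 tiles."""
    L = lattice.shape[0]
    assert L % focus == 0, "lattice size must be a multiple of the focus size"
    # Periodic padding supplies the context surrounding each focus region.
    padded = np.pad(lattice, context, mode="wrap")
    tiles = []
    for i in range(0, L, focus):
        for j in range(0, L, focus):
            tiles.append(padded[i:i + focus + 2 * context,
                                j:j + focus + 2 * context])
    return np.stack(tiles)

def ednn_total(lattice, tile_model, focus=4, context=2):
    """Sum per-tile predictions; each tile is evaluated independently,
    so this step is trivially parallel and scales linearly with system size."""
    tiles = extract_tiles(lattice, focus, context)
    return sum(tile_model(t) for t in tiles)

if __name__ == "__main__":
    # Stand-in "model": count spin-up sites in the focus region of each tile
    # (an extensive quantity), just to exercise the decomposition.
    rng = np.random.default_rng(0)
    spins = rng.choice([-1, 1], size=(16, 16))
    focus, context = 4, 2
    count_up = lambda t: np.sum(t[context:-context, context:-context] == 1)
    print(ednn_total(spins, count_up, focus, context))  # equals (spins == 1).sum()
```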

Graphical abstract: Extensive deep neural networks for transferring small scale learning to large scale systems

Article information

Article type
Edge Article
Submitted
14 Oct 2018
Accepted
28 Feb 2019
First published
20 Mar 2019
This article is Open Access

All publication charges for this article have been paid for by the Royal Society of Chemistry
Creative Commons BY license

Chem. Sci., 2019, 10, 4129-4140

Extensive deep neural networks for transferring small scale learning to large scale systems

K. Mills, K. Ryczko, I. Luchak, A. Domurad, C. Beeler and I. Tamblyn, Chem. Sci., 2019, 10, 4129, DOI: 10.1039/C8SC04578J

This article is licensed under a Creative Commons Attribution 3.0 Unported Licence. You can use material from this article in other publications without requesting further permissions from the RSC, provided that the correct acknowledgement is given.

