In September, Alphabet’s DeepMind published a paper in the journal Physical Review Research detailing the Fermionic Neural Network (FermiNet), a new neural network architecture well-suited to modeling the quantum state of large collections of electrons. The FermiNet, which DeepMind claims is one of the first demonstrations of AI for computing atomic energy, is now available as open source on GitHub, and it ostensibly remains one of the most accurate methods to date.

In quantum systems, particles like electrons don’t have exact locations. Their positions are instead described by a probability cloud. Representing the state of a quantum system is challenging, because probabilities have to be assigned to possible configurations of electron positions. These are encoded in the wavefunction, which assigns a positive or negative number to every configuration of electrons; the wavefunction squared gives the probability of finding the system in that configuration.
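The sign-versus-probability relationship can be sketched numerically. This is a toy illustration (an assumption for exposition, not from the paper): a made-up one-electron "wavefunction" evaluated on a 1D grid of positions, where squaring turns signed amplitudes into non-negative probabilities.

```python
import numpy as np

# Hypothetical 1D wavefunction on a grid of positions (illustrative only).
x = np.linspace(-5.0, 5.0, 201)
psi = x * np.exp(-x**2 / 2)        # amplitudes take positive and negative values
prob = psi**2                      # wavefunction squared
prob /= prob.sum()                 # normalize into a discrete distribution

assert psi.min() < 0 < psi.max()   # amplitudes carry a sign...
assert (prob >= 0).all()           # ...but probabilities never do
```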

The space of possible configurations is enormous: represented as a grid with 100 points along each dimension, the number of possible electron configurations for the silicon atom would be larger than the number of atoms in the universe. Researchers at DeepMind believed AI could help in this regard. They surmised that, given that neural networks have historically fit high-dimensional functions in artificial intelligence problems, they could be used to represent quantum wavefunctions as well.
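A back-of-envelope check of that claim, under common assumptions (silicon's 14 electrons, 3 spatial coordinates each, 100 grid points per coordinate, and a rough ~10^80 estimate for the number of atoms in the observable universe):

```python
# Assumed figures: 14 electrons x 3 coordinates, 100 grid points per coordinate.
electrons = 14
coordinates = 3 * electrons            # 42 dimensions
configurations = 100 ** coordinates    # 10^84 grid points
atoms_in_universe = 10 ** 80           # common order-of-magnitude estimate

assert configurations > atoms_in_universe
```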

Above: Simulated electrons sampled from the FermiNet move around a bicyclobutane molecule.

By way of a refresher, neural networks contain neurons (mathematical functions) arranged in layers that transmit signals from input data while slowly adjusting the synaptic strength, i.e., the weights, of each connection. That is how they extract features and learn to make predictions.
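A minimal sketch of that loop (illustrative only; the layer size, learning rate, and target are all made up): one layer of neurons maps an input to an output, and repeated gradient steps slowly adjust the weights until the prediction approaches a target.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)                    # input signal
W = rng.normal(size=(2, 3))               # synaptic strengths (weights)
target = np.array([0.5, -0.5])            # hypothetical training target

def forward(W, x):
    return np.tanh(W @ x)                 # each neuron: weighted sum + nonlinearity

err0 = np.abs(forward(W, x) - target).max()
for _ in range(500):
    y = forward(W, x)
    grad = ((1 - y**2) * (y - target))[:, None] * x[None, :]  # chain rule by hand
    W -= 0.1 * grad                       # nudge each weight downhill

assert np.abs(forward(W, x) - target).max() < err0  # error shrank
```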

Because electrons are a type of particle known as fermions, which include the building blocks of most matter (e.g., protons, neutrons, quarks, and neutrinos), their wavefunction has to be antisymmetric. (If you swap the positions of two electrons, the wavefunction gets multiplied by -1, meaning that if two electrons are on top of each other, the wavefunction and the probability of that configuration will be zero.) This led the DeepMind researchers to develop a new type of neural network that is antisymmetric with respect to its inputs, the FermiNet, and that maintains a separate stream of information for each electron. In practice, the FermiNet averages together information from across the streams and passes this average to each stream at the next layer. This way, the streams have the right symmetry properties to create an antisymmetric function.
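The stream-and-average idea can be sketched in a few lines. This is an illustrative toy, not DeepMind's implementation; all shapes and weights here are hypothetical. Each electron gets a feature stream, a layer mixes every stream with the average over all streams (so relabeling electrons merely relabels streams), and a determinant over the per-electron outputs produces the sign flip that antisymmetry requires.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3                                   # number of electrons
h = rng.normal(size=(n, 4))             # one feature stream per electron
W1 = rng.normal(size=(8, 5))            # layer weights: (stream + average) -> features
W2 = rng.normal(size=(5, n))            # features -> n "orbital" values per electron

def layer(h, W):
    # Average across streams, then hand the average back to every stream.
    mean = np.broadcast_to(h.mean(axis=0), h.shape)
    return np.tanh(np.concatenate([h, mean], axis=1) @ W)

def psi(h):
    orbitals = layer(h, W1) @ W2        # (n, n) matrix of per-electron orbitals
    return np.linalg.det(orbitals)      # determinant enforces antisymmetry

# Swapping two electrons multiplies the wavefunction by -1.
assert np.isclose(psi(h[[1, 0, 2]]), -psi(h))
```

The determinant is what turns permutation *equivariance* (swapping inputs swaps rows) into the required *antisymmetry* (swapping rows negates the determinant).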

Above: The FermiNet’s architecture.

The FermiNet picks a random selection of electron configurations, evaluates the energy locally at each arrangement of electrons, and adds up the contributions from each arrangement. Since the wavefunction squared gives the probability of observing an arrangement of particles in any location, the FermiNet can generate samples from the wavefunction directly. In effect, the inputs used to train the neural network are generated by the neural network itself.
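The sample-then-average procedure can be demonstrated on a toy 1D analogue (an assumption for illustration, not the paper's system): a trial wavefunction psi(x) = exp(-a x^2) for a harmonic oscillator. Metropolis sampling draws positions with probability proportional to psi squared, and the energy estimate is the average of a "local energy" evaluated at each sample.

```python
import numpy as np

rng = np.random.default_rng(2)
a = 0.4                                      # trial parameter (deliberately not optimal)

def log_psi(x):
    return -a * x**2

def local_energy(x):
    # E_L = -psi''/(2*psi) + x^2/2 for this trial function.
    return a - 2 * a**2 * x**2 + x**2 / 2

x, samples = 0.0, []
for step in range(20000):
    prop = x + rng.normal(scale=1.0)
    # Accept with probability psi(prop)^2 / psi(x)^2.
    if np.log(rng.uniform()) < 2 * (log_psi(prop) - log_psi(x)):
        x = prop
    if step > 1000:                          # discard burn-in
        samples.append(x)

energy = np.mean([local_energy(s) for s in samples])
exact = a / 2 + 1 / (8 * a)                  # analytic energy for this trial function
```

Because the samples come from the wavefunction itself, no external training data is needed, which is the point the paragraph above makes.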

“We think the FermiNet is the start of great things to come for the fusion of deep learning and computational quantum chemistry. Most of the systems we’ve looked at so far are well-studied and well-understood. But just as the first good results with deep learning in other fields led to a burst of follow-up work and rapid progress, we hope that the FermiNet will inspire lots of work on scaling up and many ideas for new, even better network architectures,” DeepMind wrote in a blog post. “We have … just scratched the surface of computational quantum physics, and look forward to applying the FermiNet to tough problems in material science and condensed matter physics as well. Mostly, we hope that by releasing the source code used in our experiments, we can inspire other researchers to build on our work and try out new applications we haven’t even dreamed of.”

The release of the FermiNet code comes after DeepMind demonstrated its work on an AI system that can predict the movement of glass molecules as they transition between liquid and solid states. (Both the techniques and the trained models, which were also made available in open source, could be used to predict other qualities of interest in glass, DeepMind said.) Beyond glass, the researchers asserted that the work yielded insights into transitions in substances more generally, as well as in biological systems, and that it could lead to advances in industries like manufacturing and medicine.
