DeepMind’s AI can understand the unusual atomic structure of glass

Glass is among the materials whose atomic structures are highly complex and difficult to understand. In a recently published paper, DeepMind researchers describe an AI system they created that can predict the movement of glass molecules as they transition between liquid and solid states.

This new DeepMind research could deepen our insight into the nature of glass and our understanding of the structural changes that occur near the glass transition. On a practical level, these findings may help answer questions about the mechanical constraints of glass (e.g., where glass will break).

Beyond glass, the researchers assert the work yields insights into substance and biological transitions more generally, and that it could lead to advances in industries like manufacturing and medicine.

“Machine learning is well placed to investigate the nature of fundamental problems in a range of fields,” DeepMind said. “We will apply some of the techniques proven and developed through modeling glassy dynamics to other central questions in science, with the aim of revealing new things about the world around us.”

The researchers ran the model several times to account for the various combinations of particles and neighboring particles, and to model how an entire piece of glass would react to different conditions.

The DeepMind researchers’ AI system simulates how atomic particles in a piece of glass respond to different temperatures and pressures. The techniques and trained models have now been open-sourced, and could be used to predict other qualities of interest in glass, DeepMind says.

There are still countless unknowns about the nature of glass formation, such as whether it corresponds to a structural phase transition and why viscosity increases by a factor of a trillion during cooling.

The DeepMind team trained the AI system to predict glassy dynamics using a graph neural network, a type of AI model that operates directly on a graph: a non-linear data structure consisting of nodes and edges.

They first created an input graph where the nodes and edges represented particles and interactions between particles, respectively, such that a particle was connected to its neighboring particles within a certain radius.
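
To make that construction concrete, here is a minimal sketch of how such a radius graph could be built from particle positions; the particle count, box size, and cutoff radius are illustrative assumptions rather than values from the paper.

```python
# Sketch of the input-graph construction: particles become nodes, and edges
# connect particles within a cutoff radius. The particle count, box size, and
# cutoff value are illustrative assumptions.
import numpy as np

def build_radius_graph(positions, cutoff=2.0):
    """Return (senders, receivers) index arrays, one entry per directed edge."""
    deltas = positions[:, None, :] - positions[None, :, :]
    dists = np.linalg.norm(deltas, axis=-1)
    # Connect every pair of distinct particles closer than the cutoff.
    senders, receivers = np.nonzero((dists < cutoff) & (dists > 0.0))
    return senders, receivers

# Example: 100 particles placed at random in a 10 x 10 x 10 box.
rng = np.random.default_rng(0)
positions = rng.uniform(0.0, 10.0, size=(100, 3))
senders, receivers = build_radius_graph(positions)
```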

Two encoder models then embedded the labels (i.e., translated them into mathematical objects the AI system could understand). Next, the edge embeddings were iteratively updated, each time based on their previous embeddings and the embeddings of the two nodes they connected.

After all of the graph’s edges were updated in parallel using the same model, another model refreshed the nodes based on the sum of their neighboring edge embeddings and their previous embeddings.

This process was repeated several times to allow local information to propagate through the graph, after which a decoder model extracted a mobility (a measure of how much a particle typically moves) for each particle from the final embedding of the corresponding node.
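
The encode/update/decode loop described above might look roughly like the following sketch, which builds on the graph from the previous snippet (it reuses positions, senders, receivers, and rng). Plain NumPy with random weights stands in for the trained encoder, update, and decoder networks, and the embedding size and number of propagation rounds are assumptions made for illustration.

```python
# Schematic message passing: encode particle labels, alternately update edges
# and nodes for several rounds, then decode a per-particle mobility.
# Random weights stand in for the trained networks; dimensions and the number
# of rounds are illustrative assumptions.
import numpy as np

dim = 32          # embedding size (assumed)
num_steps = 7     # propagation rounds (assumed)

def layer(in_dim, out_dim):
    """Stand-in for a learned network: one random linear map plus ReLU."""
    w = rng.normal(scale=0.1, size=(in_dim, out_dim))
    return lambda x: np.maximum(x @ w, 0.0)

node_encoder = layer(3, dim)          # embeds particle positions
edge_encoder = layer(3, dim)          # embeds relative offsets between particles
edge_update = layer(3 * dim, dim)     # edge + its two endpoint nodes
node_update = layer(2 * dim, dim)     # node + sum of incoming edges
decoder = layer(dim, 1)               # final node embedding -> mobility

def predict_mobility(positions, senders, receivers):
    nodes = node_encoder(positions)
    edges = edge_encoder(positions[receivers] - positions[senders])
    for _ in range(num_steps):
        # Update every edge from its previous embedding and its two endpoints.
        edges = edge_update(
            np.concatenate([edges, nodes[senders], nodes[receivers]], axis=-1))
        # Sum incoming edge embeddings per node, then refresh the nodes.
        incoming = np.zeros_like(nodes)
        np.add.at(incoming, receivers, edges)
        nodes = node_update(np.concatenate([nodes, incoming], axis=-1))
    # Decode a scalar mobility for each particle from its final embedding.
    return decoder(nodes)[:, 0]

mobilities = predict_mobility(positions, senders, receivers)
```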

The team validated their model by constructing several data sets corresponding to mobility predictions on different time horizons at different temperatures. After applying graph networks to the simulated 3D glasses, they found that the system “strongly” outperformed both existing physics-inspired baselines and state-of-the-art AI models.

They say the network was “extremely good” at short times and remained “well-matched” up to the relaxation time of the glass (which would be up to thousands of years for actual glass), achieving a 96% correlation with the ground truth at short times and a 64% correlation at the relaxation time of the glass.

In the latter case, that’s an improvement of 40% compared with the previous state of the art.

In a separate experiment, to better understand the graph model, the team explored which factors were important to its success. They measured the sensitivity of the prediction for a central particle when another particle was modified, enabling them to judge how large an area the network used to extract its prediction.

This provided an estimate of the distance over which particles influenced each other in the system.
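
As a rough illustration of such a probe, the sketch below (reusing build_radius_graph, predict_mobility, positions, and rng from the earlier snippets) displaces a single particle, rebuilds the graph, and measures how much the prediction for a chosen central particle shifts; the displacement size is an assumed value.

```python
# Sensitivity probe (sketch): displace one particle, rebuild the graph, and
# measure the change in the prediction for a chosen central particle.
# Reuses build_radius_graph, predict_mobility, positions, and rng from above;
# the displacement size eps is an illustrative assumption.
import numpy as np

def sensitivity(positions, central, probe, eps=0.1, cutoff=2.0):
    senders, receivers = build_radius_graph(positions, cutoff)
    baseline = predict_mobility(positions, senders, receivers)[central]

    perturbed = positions.copy()
    perturbed[probe] += eps * rng.normal(size=3)   # small random displacement
    senders, receivers = build_radius_graph(perturbed, cutoff)
    shifted = predict_mobility(perturbed, senders, receivers)[central]
    return abs(shifted - baseline)

# Averaging this quantity over probe particles at a given distance from the
# central particle estimates how far information propagates through the glass.
print(sensitivity(positions, central=0, probe=1))
```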

They report there’s “compelling evidence” that growing spatial correlations are present upon approaching the glass transition, and that the network learned to extract them. “These findings are consistent with a physical picture where a correlation length grows upon approaching the glass transition,” DeepMind wrote. “The definition and study of correlation lengths is a cornerstone of the study of phase transitions in physics.”

“Graph networks may not only help us make better predictions for a range of systems,” wrote DeepMind, “but indicate what physical correlates are important for modeling them, suggesting that machine learning systems might eventually be able to assist researchers in deriving fundamental physical theories, ultimately helping to augment, rather than replace, human understanding.”


