What Display Daily thinks: AI methods like machine learning and deep learning have shown early promise for predicting materials properties, identifying candidates, and guiding experiments. This can drastically expand the search space and speed up development timelines.
However, current AI models are limited by the availability of high-quality, structured data matched to experimental results; the shortage of data for training and evaluation makes reliability a concern. Generalizability to diverse material classes and undiscovered chemistries also needs improvement: models developed on known materials may not translate to completely new systems.
Still, the trajectory suggests that the probability of success will increase over time as AI platforms get better at getting it right. Although AI is unlikely to be a standalone solution, the pace of development is extraordinary, meaning AI materials discovery could be on an exponential curve toward practical use. The question remains: how much of this will reach the bottom line of display technology development, and when can we expect to see ROI? It could be sooner than we imagine. It is now a numbers game, heading toward a point where the probability of success is good enough to warrant investment in materials theoretically developed by machines.
Scaling Up Materials Discovery with AI
Materials science is undergoing a revolution driven by artificial intelligence. In a recent breakthrough, researchers at Google DeepMind have used advanced machine learning techniques to discover over 2 million new stable inorganic crystal structures – an order of magnitude more than previously known. Their findings, published in Nature, demonstrate the vast untapped potential of AI to accelerate materials innovation.
At the heart of the achievement is a technique called deep learning, where neural networks are trained on large datasets to make predictions. The common challenge in materials discovery is that while the space of possible materials is astronomically large, the number of known stable materials has been relatively small – around 50,000. Without sufficient data, previous deep learning models have struggled to effectively explore new materials.
The DeepMind team overcame this through an iterative process they call GNoME (graph networks for materials exploration). GNoME uses graph neural networks – a type of deep learning architecture well-suited to molecules and crystals – to predict the stability of materials. The predicted candidates are then evaluated using the gold standard of quantum chemistry simulations. The newly verified stable structures are added back into the dataset, allowing the neural networks to be retrained on an ever-growing catalogue.
This data flywheel enabled the GNoME models to explore materials space with far greater accuracy than before. Within only six rounds of prediction and quantum chemistry evaluation, over 2 million new stable inorganic crystal structures were uncovered – a staggering increase.
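The flywheel described above can be pictured as a simple active-learning loop. The sketch below is purely illustrative: every function is a hypothetical stand-in for GNoME's real components (the graph neural network, the quantum chemistry verification, and the retraining step), not DeepMind's actual code.

```python
import random

# Illustrative active-learning "flywheel" for materials discovery.
# All functions here are hypothetical placeholders, not GNoME's real API.

def predict_stability(model, candidate):
    # A graph neural network would score the candidate's stability;
    # here the "model" is just a callable returning True/False.
    return model(candidate)

def quantum_verify(candidate):
    # Stand-in for an expensive quantum chemistry (DFT) calculation
    # that confirms whether a predicted structure is actually stable.
    return candidate % 3 == 0  # pretend roughly a third pass verification

def retrain(dataset):
    # Retraining on the grown dataset; a larger dataset yields a
    # (pretend) more permissive, better-calibrated model.
    threshold = max(0.1, 0.5 - 0.001 * len(dataset))
    return lambda c: random.random() > threshold

dataset = list(range(10))                 # seed set of known stable structures
model = lambda c: random.random() > 0.5   # initial, weak model

for round_num in range(6):                # six rounds, as in the paper
    candidates = [random.randrange(10_000) for _ in range(100)]
    promising = [c for c in candidates if predict_stability(model, c)]
    verified = [c for c in promising if quantum_verify(c)]
    dataset.extend(verified)              # verified structures grow the catalogue
    model = retrain(dataset)              # the model improves each round

print(f"Catalogue grew from 10 to {len(dataset)} entries")
```

The key design point is that expensive verification is only spent on candidates the model already likes, and every verified result feeds back into training, so prediction accuracy compounds across rounds.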
A key innovation was the generation of highly diverse candidate structures. This included new techniques like symmetry-aware partial substitutions, allowing strategic exploration of under-searched transition metal and rare earth compounds. Combined with improvements in neural network architectures, GNoME achieved unprecedented predictive accuracy. By the final round, high fidelity simulations verified the stability of over 80% of GNoME’s proposed structures.
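One way to picture partial substitution: take a known stable prototype composition and swap a fraction of one species for a chemically similar one. The toy sketch below only manipulates formula counts and ignores the crystal symmetry analysis GNoME actually performs; the prototype and substitution table are illustrative examples, not data from the paper.

```python
# Toy candidate generation by partial substitution in a known prototype.
# Real GNoME applies symmetry-aware substitutions to crystal structures;
# here we only swap element symbols in a composition for illustration.

prototype = {"Li": 7, "La": 3, "Zr": 2, "O": 12}          # garnet-like LLZO
substitutions = {"Zr": ["Hf", "Sn"], "La": ["Y", "Gd"]}   # similar-ion swaps

def substitute(formula, site, new_element, fraction=0.5):
    """Replace a fraction of `site` atoms with `new_element`."""
    n_swap = int(formula[site] * fraction)
    if n_swap == 0:
        return None                      # nothing to substitute
    new_formula = dict(formula)
    new_formula[site] -= n_swap
    new_formula[new_element] = new_formula.get(new_element, 0) + n_swap
    return new_formula

candidates = []
for site, options in substitutions.items():
    for elem in options:
        cand = substitute(prototype, site, elem)
        if cand is not None:
            candidates.append(cand)

print(len(candidates), "candidate compositions generated")
```

Each candidate would then be passed to the stability model; generating diverse candidates this way is what lets the pipeline reach under-explored transition metal and rare earth chemistries.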
The newly discovered materials significantly expand the known inorganic crystal structure library. Not only has the number of known stable structures increased 10-fold, but 45,500 entirely new structure prototypes have also been revealed. This includes thousands of complex quaternary and quinary systems – chemical spaces noted as difficult for previous methods.
With access to such a vast catalogue, researchers can now rapidly screen materials for desirable properties. As a proof of concept, the DeepMind team used GNoME’s output to identify new layered materials for electronics, solid electrolytes for improved batteries, and lithium-ion conductors. The recognition of over 700 matches with recently synthesized materials further supports the validity of these AI-generated crystals.
An unexpected bonus finding was the enabling of highly accurate neural network interatomic potentials. By leveraging the huge dataset of quantum chemistry relaxations generated during materials screening, the researchers pretrained models that can simulate interactions between atoms in inorganic crystals. Remarkably, these simulations matched expensive quantum chemistry calculations in accuracy, while being up to a million times faster.
The potentials exhibited strong zero-shot transferability, accurately modeling never-before-seen material compositions and states. As a demonstration, the researchers used them to screen solid-state electrolytes and identify promising lithium-ion conductors. Such transferability has been a notorious challenge for previous neural network potentials.
The work represents a breakthrough in combining advances in AI and quantum simulation to profoundly accelerate materials science. While significant research remains to experimentally realize these structures, the computational pipelines established by projects like GNoME bring that ultimate goal tantalizingly closer. With continued scaling, the DeepMind team believes such data-driven platforms could one day become a universal materials explorer – able to accurately predict any property for any chemical system.
If successful, the impact would be profound. Mastering matter through modeling and simulation would allow rapid discovery and optimization of advanced materials for catalysis, energy storage, quantum computing, and yes, displays. This research provides a glimpse into a very near future, where AI and computing unlock material innovations once only imagined in science fiction.
Reference
Merchant, A., Batzner, S., Schoenholz, S. S., Aykol, M., Cheon, G., & Cubuk, E. D. (2023). Scaling deep learning for materials discovery. Nature. https://doi.org/10.1038/s41586-023-06735-9