Electrons are elementary particles of fundamental importance. Their quantum mechanical interactions with each other and with atomic nuclei give rise to a multitude of phenomena observed in chemistry and materials science. Understanding and controlling the electronic structure of matter provides insight into the reactivity of molecules, the structure of and energy transport within planets, and the failure mechanisms of materials.
Scientific challenges are increasingly being addressed through computational modeling and simulation, leveraging the capabilities of high-performance computing. However, a significant obstacle to achieving realistic simulations with quantum accuracy is the lack of a predictive modeling technique that combines high accuracy with scalability across different length and time scales. Classical atomistic simulation methods can handle large and complex systems, but their neglect of the quantum electronic structure limits their applicability. In contrast, simulation methods that do not rely on assumptions such as empirical modeling and parameter fitting (first-principles methods) provide high accuracy but are computationally demanding. For example, density functional theory (DFT), a widely used first-principles method, scales cubically with system size, which restricts its predictive power to small scales.
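To give a rough sense of what cubic scaling means in practice, the following sketch (purely illustrative, not taken from the publication) compares how the relative cost of a cubically scaling method grows with system size versus a linearly scaling one:

```python
# Illustrative only: relative cost of simulating N atoms if a method scales
# as O(N^3) (conventional DFT-like) versus O(N) (a local, learned model).
def relative_cost(n_atoms: int, reference: int = 1_000, exponent: int = 3) -> float:
    """Cost of an n_atoms calculation relative to a reference-size calculation."""
    return (n_atoms / reference) ** exponent

for n in (1_000, 10_000, 100_000):
    cubic = relative_cost(n, exponent=3)   # cubic scaling with system size
    linear = relative_cost(n, exponent=1)  # linear scaling with system size
    print(f"{n:>7} atoms: cubic ~{cubic:>10,.0f}x, linear ~{linear:>6,.0f}x")
```

The numbers are hypothetical; the point is simply that a cubically scaling method becomes a million times more expensive when the system grows a hundredfold, while a linearly scaling one only becomes a hundred times more expensive.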
Hybrid approach based on deep learning
A team of researchers has now presented a novel simulation method called the Materials Learning Algorithms (MALA) software stack. In computer science, a software stack is a collection of algorithms and software components combined into a software application for solving a specific problem. Lenz Fiedler, a Ph.D. student and key developer of MALA at CASUS, explains: «MALA integrates machine learning with physics-based approaches to predict the electronic structure of materials. It follows a hybrid approach, using an established machine learning method called deep learning to accurately predict local quantities, supplemented by physics-based algorithms to compute global quantities of interest.»
The MALA software stack takes the arrangement of atoms in space as input and generates fingerprints known as bispectrum components, which encode the spatial arrangement of atoms around a point on a Cartesian grid. The machine learning model in MALA is trained to predict the electronic structure based on this atomic neighborhood. A significant advantage of MALA is that its machine learning model is independent of the total system size, allowing it to be trained on data from small systems and deployed at any scale.
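A minimal, schematic sketch of this two-stage workflow is shown below. It is not MALA's actual API; the functions, the toy descriptor, and the linear "model" are hypothetical stand-ins for the bispectrum components, the trained deep learning model, and the physics-based post-processing described above.

```python
import numpy as np

def local_fingerprint(atom_positions: np.ndarray, grid_point: np.ndarray,
                      cutoff: float = 5.0, n_features: int = 8) -> np.ndarray:
    """Toy descriptor: radial histogram of atoms within a cutoff of one grid
    point (a crude placeholder for the bispectrum components)."""
    distances = np.linalg.norm(atom_positions - grid_point, axis=1)
    hist, _ = np.histogram(distances[distances < cutoff],
                           bins=n_features, range=(0.0, cutoff))
    return hist.astype(float)

def predict_local_quantity(fingerprint: np.ndarray, weights: np.ndarray) -> float:
    """Toy 'model': a linear map standing in for the trained neural network
    that predicts a local electronic quantity from the fingerprint."""
    return float(fingerprint @ weights)

def global_observable(local_values: np.ndarray, grid_volume: float) -> float:
    """Physics-style aggregation: integrate local predictions over the grid
    to obtain a global quantity of interest."""
    return float(local_values.sum() * grid_volume)

# Usage sketch with random toy data: each grid point is handled independently,
# which is what makes the approach insensitive to the total system size.
rng = np.random.default_rng(0)
atoms = rng.uniform(0.0, 10.0, size=(200, 3))
grid = np.stack(np.meshgrid(*[np.linspace(0, 10, 10)] * 3), -1).reshape(-1, 3)
weights = rng.normal(size=8)  # stands in for trained model parameters

local = np.array([predict_local_quantity(local_fingerprint(atoms, g), weights)
                  for g in grid])
print("toy global observable:", global_observable(local, grid_volume=1.0))
```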
In their publication, the team of researchers showcased the remarkable effectiveness of this strategy. They achieved speedups of more than 1,000x for smaller systems of up to a few thousand atoms, compared with conventional algorithms. Furthermore, the team demonstrated MALA's ability to accurately perform large-scale electronic structure calculations involving more than 100,000 atoms. Notably, this was accomplished with modest computational effort, at a scale that exposes the limitations of conventional DFT codes.
Attila Cangi, Acting Department Head of Matter under Extreme Conditions at CASUS, explains: «As the system size increases and more atoms are involved, DFT calculations become impractical, whereas MALA's speed advantage continues to grow. MALA's key breakthrough lies in its ability to operate on local atomic environments, enabling accurate numerical predictions that are minimally affected by system size. This groundbreaking achievement opens up computational possibilities once considered unattainable.»
Boost to applied research expected
Cangi aims to push the boundaries of electronic structure calculations by leveraging machine learning: «We anticipate that MALA will bring about a transformation in electronic structure calculations, as we now have a method for simulating significantly larger systems at unprecedented speeds. In the future, researchers will be able to tackle a wide range of societal challenges on a substantially improved basis, including developing new vaccines and new materials for energy storage, conducting large-scale simulations of semiconductor devices, studying material defects, and exploring chemical reactions for converting the greenhouse gas carbon dioxide from the atmosphere into climate-friendly minerals.»
Furthermore, MALA’s approach is particularly well-suited to high-performance computing (HPC). As system size grows, MALA splits the computational grid it uses into portions that can be processed independently, making efficient use of HPC resources, especially graphics processing units (GPUs). Siva Rajamanickam, a staff scientist and expert in parallel computing at Sandia National Laboratories, explains: «MALA’s algorithm for computing the electronic structure maps very well onto modern HPC systems with distributed accelerators. The ability to break down the work and execute computations for different grid points on different accelerators makes MALA an ideal match for scalable machine learning on HPC resources, resulting in unparalleled speed and efficiency in electronic structure calculations.»
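Because each local prediction depends only on its own atomic neighborhood, grid points can be distributed across accelerators without communication between them. The following sketch is purely illustrative (it is not MALA's actual parallelization code) and simply shows how grid-point indices might be partitioned across a number of GPUs:

```python
def partition_grid(n_grid_points: int, n_devices: int) -> list[range]:
    """Split grid-point indices into near-equal contiguous chunks, one per device."""
    base, extra = divmod(n_grid_points, n_devices)
    chunks, start = [], 0
    for d in range(n_devices):
        size = base + (1 if d < extra else 0)
        chunks.append(range(start, start + size))
        start += size
    return chunks

# Hypothetical example: 1,000,000 grid points spread over 8 GPUs. Each chunk
# can be evaluated independently, since predictions only need local neighborhoods.
for device, chunk in enumerate(partition_grid(1_000_000, 8)):
    print(f"GPU {device}: grid points {chunk.start}..{chunk.stop - 1}")
```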
In addition to its development partners HZDR and Sandia National Laboratories, MALA is already being employed by institutions and companies such as the Georgia Institute of Technology, North Carolina A&T State University, SambaNova Systems Inc., and Nvidia Corp.