- A recipe for scalable attention-based MLIPs: unlocking long-range accuracy with all-to-all node attention. arXiv:2603.06567, published Mar 6.
- The Importance of Being Scalable: Improving the Speed and Accuracy of Neural Network Interatomic Potentials Across Chemical Domains. arXiv:2410.24169, published Oct 31, 2024.