
The Neural Network Potential in Parallel #179

Open
LiMahappy opened this issue May 24, 2024 · 2 comments
@LiMahappy

Hi @mjwen,
I am currently developing a neural network potential for a PtRh alloy. I trained it with KLIFF on a dataset of 2000 AIMD steps, but the resulting potential is very inaccurate and essentially unusable. My input file is attached below. I also noticed on the official website that only physics-based potentials can be trained in parallel. How can I train a neural network potential in parallel?
nn_ptrh.txt
Awaiting your reply.

@mjwen
Collaborator

mjwen commented May 28, 2024

Hi @LiMahappy,

The quality of a model depends on multiple factors; you may want to generate a dataset that better covers the physics of the problem you are interested in studying.

What do you mean by running in parallel: training the model in parallel, or using the trained model in parallel for simulation? If the former, you can use PyTorch DistributedDataParallel (PyTorch is what KLIFF uses internally for neural network models). If the latter, the simulator (e.g., LAMMPS or GULP) provides the parallelization.
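For the training case, a minimal sketch of the DistributedDataParallel pattern is shown below. This is not KLIFF's own training loop; the `Sequential` model and random data are placeholders standing in for KLIFF's descriptor-based network and fingerprint dataset, just to illustrate the mechanics (process group setup, `DistributedSampler` sharding, gradient all-reduce on `backward()`):

```python
# Hedged sketch of PyTorch DistributedDataParallel (DDP) training.
# The model and data here are placeholders, not KLIFF's actual NN model.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset, DistributedSampler


def train_ddp(rank: int, world_size: int, epochs: int = 2) -> float:
    # Each process joins the same group; "gloo" works on CPU-only machines.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    # Placeholder regression model (stand-in for the NN potential).
    model = torch.nn.Sequential(
        torch.nn.Linear(8, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1)
    )
    ddp_model = DDP(model)  # gradients are averaged across ranks

    # Synthetic data standing in for per-configuration fingerprints/energies.
    dataset = TensorDataset(torch.randn(64, 8), torch.randn(64, 1))
    # DistributedSampler gives each rank a disjoint shard of the dataset.
    sampler = DistributedSampler(dataset, num_replicas=world_size, rank=rank)
    loader = DataLoader(dataset, batch_size=8, sampler=sampler)

    opt = torch.optim.Adam(ddp_model.parameters(), lr=1e-3)
    loss_fn = torch.nn.MSELoss()
    loss = torch.tensor(0.0)
    for epoch in range(epochs):
        sampler.set_epoch(epoch)  # reshuffle the shards each epoch
        for xb, yb in loader:
            opt.zero_grad()
            loss = loss_fn(ddp_model(xb), yb)
            loss.backward()  # DDP all-reduces gradients here
            opt.step()

    dist.destroy_process_group()
    return float(loss)


if __name__ == "__main__":
    # Single-process run (world_size=1) just to exercise the mechanics;
    # for real parallelism launch with, e.g., torchrun --nproc_per_node=4.
    final_loss = train_ddp(rank=0, world_size=1)
    print(f"final batch loss: {final_loss:.4f}")
```

In practice you would launch one process per GPU/CPU slot with `torchrun`, which sets the rank and world size for you.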

@LiMahappy
Author

Hi @mjwen,
Sorry, I may not have made myself clear: I was asking about training a neural network potential. Physics-based potentials can be trained in parallel with Python's multiprocessing module or with mpi4py. Can neural network potentials likewise be trained in parallel by changing only the input file or the submission command?
