
PYTORCH_MODEL with CUDA 11.2 #1154

Open
alexberlaga opened this issue Nov 8, 2024 · 2 comments

Comments

@alexberlaga

Dear Plumed Team,

I am using PLUMED 2.10b with CUDA 11.2 and I am attempting to enable libtorch in order to use the PyTorch module. I have tried several versions of LibTorch, but one of two things happens:

(1) With the cxx11 ABI build, ./configure cannot enable libtorch.
(2) With the pre-cxx11 ABI build, configure succeeds but the subsequent "make" step fails. I suspect this is an incompatibility between my cluster and the pre-cxx11 ABI rather than a PLUMED issue.

My question is: is there any version of LibTorch that is compatible with all three of PLUMED 2.10b, the cxx11 ABI, and CUDA 11.2? Thanks!
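As a quick sanity check, which ABI a given LibTorch download was actually built with can be read off its exported symbols; a minimal sketch, assuming LIBTORCH points at the unpacked LibTorch directory and that lib/libtorch_cpu.so is present:

# Count exported symbols mangled with "__cxx11": a nonzero count indicates
# a cxx11-ABI build, while zero suggests a pre-cxx11 ABI build.
nm -D "$LIBTORCH/lib/libtorch_cpu.so" | grep -c "__cxx11"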

@luigibonati
Contributor

Hi Alex,
can you attach the configure log files in the two cases?
Furthermore, which LibTorch versions did you try? In principle there should not be compatibility issues beyond cluster-specific ones. Note that LibTorch > 2.1 requires C++17 support, but that is already enforced if you are using PLUMED > 2.10.

Luigi

@alexberlaga
Author

Hi, thanks for your help! I'll share log files here:

./configure --enable-libtorch --enable-modules=pytorch
config.log

./configure --enable-libtorch --enable-modules=pytorch CXXFLAGS="-D_GLIBCXX_USE_CXX11_ABI=0"
config2.log
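For reference, when ./configure cannot find LibTorch on its own, the include and library paths are usually passed explicitly; a minimal sketch, assuming LIBTORCH points at the unpacked LibTorch directory (exact include paths and linked libraries may differ between LibTorch versions and CUDA builds):

# hypothetical invocation with explicit LibTorch include/library paths
./configure --enable-libtorch --enable-modules=pytorch \
            CPPFLAGS="-I$LIBTORCH/include/torch/csrc/api/include/ -I$LIBTORCH/include/" \
            LDFLAGS="-L$LIBTORCH/lib -ltorch -lc10 -ltorch_cpu -Wl,-rpath,$LIBTORCH/lib"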
