MPI tests using > 15 GB memory #626
Found while troubleshooting #624.

The failing tests are both continuity_6:
asgard/src/distribution_tests.cpp Line 647 in 028b142
asgard/src/time_advance_tests.cpp Line 678 in 028b142

Did a few tests; I think 32GB will be enough, but that is still too much for CI. For some reason, when we create a 6D problem, we use an unreasonable amount of memory somewhere. I wonder if this is an issue with the hash map taking too much space. If I'm right, this will be an issue across the board; it just shows up here because the test runs 4 copies of the problem (it uses 4 MPI ranks), and all of our workstations have far more memory than that.
Tested again on develop, and it goes nowhere near 16GB. I suspect the problem was resolved in #743.
Describe the bug
To Reproduce
Steps to reproduce the behavior:
asgard-unit-mpi-gxx and asgard-unit-mpi-gxx-scalapack currently run out of memory on the CI machines. I increased the memory size from 15GB to 144GB as a (temporary?) workaround; I suspect either something is over-allocating memory, or a test should be scaled down or moved to a different label that runs less often.
Expected behavior
MPI tests pass on a container with 15GB RAM.
System:
module list
Additional context
Reproduced locally with
docker run -m 15000m -it cpu /bin/bash