Memory Leaks in pmix4x & mpool #6242
FWIW: the PMIx community has been working on eliminating memory leaks in PMIx, but that work is done in the PMIx repository. It only gets reflected in OMPI once someone updates the OMPI-embedded version of the code. This happens infrequently. You might have better luck if you configure OMPI against an external copy of PMIx so you can keep up with the upstream changes.
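A hedged sketch of that approach, assuming a build from source with hypothetical install paths, could look like this:

```shell
# Hedged sketch (paths are assumptions): build PMIx from its master branch
# into a separate prefix...
git clone https://github.com/pmix/pmix.git
cd pmix && ./autogen.pl && ./configure --prefix=$HOME/opt/pmix && make -j install

# ...then point Open MPI's configure at that external copy instead of the
# embedded pmix4x component.
cd /path/to/ompi
./configure --with-pmix=$HOME/opt/pmix && make -j install
```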
I tried to find the PMIx repo. Is it public, and does it have an issue tracker? Where is mpool/hugepage developed?
Ah, here it is: https://github.com/pmix/pmix/issues
I can try, but the problem is that you are looking at a rather stale version of PMIx and so those lines may not even exist any more. If you build OMPI against the PMIx master branch, then any memory leaks you find will at least reference current code and can be dealt with accordingly.
mpool/hugepage is developed in OMPI, so those issues belong here.
Commits referencing this issue:

- Refs. open-mpi#6242. Signed-off-by: Gilles Gouaillardet <[email protected]>
- refresh to openpmix/openpmix@dc53a84. Refs. open-mpi#6242. Signed-off-by: Gilles Gouaillardet <[email protected]>
- refresh to openpmix/openpmix@dc53a84 and add commits from openpmix/openpmix#1031 and openpmix/openpmix#1034 !!! NOT FROM THE OFFICIAL PMIx repository !!! Refs. open-mpi#6242. Signed-off-by: Gilles Gouaillardet <[email protected]>
Background information
What version of Open MPI are you using?
OpenMPI master, as of 3a4a1f9

Describe how Open MPI was installed
via spack with:
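Only the general shape of such a Spack invocation is sketched below; the actual spec, its variants, and the pin to commit 3a4a1f9 are assumptions not taken from this report:

```shell
# Hypothetical invocation showing the general form of the install command;
# the real spec used for the report (pinning the master branch) is not shown.
spack install openmpi
```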
Performing the test below with Clang, e.g. v6.0.0, shows the same results.
Please describe the system on which you are running
Details of the problem
I am trying to run CI on downstream software (libraries and applications) with address and memory sanitizers constantly enabled. Unfortunately, OpenMPI reports a lot of (probably valid) memory leaks, tested with recent versions of GCC and Clang.
Use the following snippet:
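A minimal sketch of such a snippet, assuming nothing beyond MPI initialization and finalization is needed to trigger the reports (the file name leak_test.c is a placeholder):

```c
/* leak_test.c - minimal reproducer sketch: initializes and finalizes MPI and
 * does nothing else, so any leaks reported by the sanitizers come from the
 * MPI library itself rather than from user code. */
#include <mpi.h>

int main(int argc, char *argv[])
{
    MPI_Init(&argc, &argv);
    MPI_Finalize();
    return 0;
}
```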
And compile it with:
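A sketch of such a compile line, assuming GCC via the mpicc wrapper with AddressSanitizer enabled (which bundles LeakSanitizer); the file names are placeholders:

```shell
# Compile with ASan and debug info so the leak reports carry readable stack traces.
mpicc -g -O1 -fsanitize=address -fno-omit-frame-pointer leak_test.c -o leak_test
```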
When running it with:
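A sketch of the corresponding run line (the rank count is an assumption; LeakSanitizer prints its report when each rank exits):

```shell
# Run the reproducer; the leak summary is printed at process exit.
mpirun -np 1 ./leak_test
```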
We will see the following issues in pmix_control.c:72 and mpool_hugepage_component.c:150.
The memory leak is visible in: