[Yt-dev] mpi4py on an altix
    John Wise
    jwise at astro.princeton.edu
    Sun Nov  8 17:39:45 PST 2009
    
    
  
Hi Matt,
> What does ldd /home/astro/jwise/local/lib/python2.6/site-packages/mpi4py/MPI.so
> report?  And can you try running with the python2.6-mpi executable,
> and see if that helps?
Ah, ldd was what I was thinking about!  I did some more digging, and  
ldd showed that MPI.so was correctly linked to /usr/lib/libmpi.so.  
Then I found out what's going on: running "nm -o /usr/lib/libmpi.so"  
showed that MPI_Comm_get_name wasn't in libmpi.so at all.
SGI's libmpi (Propack 5, I think ... based on kernel 2.6.5) is still  
an MPI-1 implementation.  The current svn version of mpi4py keeps its  
config headers in src/config, and sgimpi.h is probably set up for a  
newer SGI MPI.
So I had to modify it to mark the routines missing from SGI's MPI.  
These included:
MPI_Comm_get_name
MPI_Comm_set_name
MPI_MAX_OBJECT_NAME
MPI_MAX_PORT_NAME
MPI_Open_port
MPI_Type_get_name
MPI_Win_get_name
MPI_Close_port
(plus some more that I may have forgotten)
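For the record, the additions to sgimpi.h ended up looking roughly  
like this (one define per absent symbol, following the  
PyMPI_MISSING_XXX pattern; the exact macro names and list here are my  
reconstruction, so check against your own nm output):

    /* Sketch of the additions to src/config/sgimpi.h: one define per
     * routine or constant that SGI's MPI-1 libmpi doesn't provide,
     * so that mpi4py substitutes its own fallbacks instead. */
    #define PyMPI_MISSING_MPI_MAX_OBJECT_NAME 1
    #define PyMPI_MISSING_MPI_MAX_PORT_NAME 1
    #define PyMPI_MISSING_MPI_Comm_get_name 1
    #define PyMPI_MISSING_MPI_Comm_set_name 1
    #define PyMPI_MISSING_MPI_Type_get_name 1
    #define PyMPI_MISSING_MPI_Win_get_name 1
    #define PyMPI_MISSING_MPI_Open_port 1
    #define PyMPI_MISSING_MPI_Close_port 1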
I discovered these one at a time: add the "#define PyMPI_MISSING_XXX 1"  
for whichever symbol was causing the ImportError, recompile, and try  
importing mpi4py.MPI again.  If another ImportError came up, repeat,  
until the module imports cleanly.
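(Why a define fixes an ImportError: without it, MPI.so references the  
nonexistent symbol and the dynamic loader refuses to load the module.  
With the define, mpi4py can compile a stub in its place.  A simplified  
illustration of the mechanism, with a hypothetical stub name, not  
mpi4py's actual source:)

    /* Illustration only: a PyMPI_MISSING_* define can gate a local
     * stub for the absent MPI-2 routine, so the extension never
     * references the missing symbol at load time. */
    #include <mpi.h>

    #ifdef PyMPI_MISSING_MPI_Comm_get_name
    static int PyMPI_Comm_get_name(MPI_Comm comm, char *name, int *rlen)
    {
      (void)comm; (void)name; (void)rlen;  /* stub: arguments unused */
      return MPI_ERR_OTHER;                /* report "not supported" */
    }
    #define MPI_Comm_get_name PyMPI_Comm_get_name
    #endif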
Once I accounted for all of the missing routines, I ran into the same  
error you described in your mpi4py posts, where it segfaults in  
MPI_Finalized().  Switching to the python2.6-mpi executable completed  
the fix.
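(For anyone curious why the executable helps: python2.6-mpi is  
essentially a small C wrapper that initializes MPI before the  
interpreter starts and finalizes it after Python exits, so nothing  
queries the library in a half-torn-down state.  A sketch of the idea,  
not mpi4py's actual source:)

    /* Sketch of a python-mpi style launcher: bring MPI up before
     * Python and take it down afterwards, so the init/finalize
     * ordering is well defined for modules like mpi4py.MPI. */
    #include <mpi.h>
    #include <Python.h>

    int main(int argc, char **argv)
    {
      int status;
      MPI_Init(&argc, &argv);        /* MPI first */
      status = Py_Main(argc, argv);  /* run the interpreter */
      MPI_Finalize();                /* MPI last */
      return status;
    }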
I'm not sure if it will help anyone else, but I wanted to let people  
know about the solution!
Cheers,
John
    
    