simple MPI program fails

This is a discussion on simple MPI program fails within the C++ Programming forums, part of the General Programming Boards category.

  1. #1
    Registered User
    Join Date
    May 2010
    Posts
    245

    simple MPI program fails

    Here's the program:

    Code:
    #include <stdio.h>
    #include <mpi.h>
    
    int main(int argc, char **argv)
    {
      int node;
    
      MPI_Init(&argc,&argv);
      MPI_Comm_rank(MPI_COMM_WORLD, &node);
    
      printf("Hello World from Node %d\n",node);
    
      MPI_Finalize();
    }
    Here's how I compile it

    Code:
    mpiCC -o simple simple.cpp -lstdc++
    Here's how I run it:

    Code:
    mpirun -np 2 simple
    Here's the error I get:

    Code:
    [macbook:24160] *** Process received signal ***
    [macbook:24160] Signal: Segmentation fault (11)
    [macbook:24160] Signal code: Address not mapped (1)
    [macbook:24160] Failing at address: 0x440000b0
    [macbook:24160] [ 0] 2   libSystem.B.dylib                   0x00007fff871371ba _sigtramp + 26
    [macbook:24160] [ 1] 3   libSystem.B.dylib                   0x00007fff70c545e0 __stack_chk_guard + 0
    [macbook:24160] [ 2] 4   simple                              0x0000000100000eb6 main + 42
    [macbook:24160] [ 3] 5   simple                              0x0000000100000e84 start + 52
    [macbook:24160] *** End of error message ***
    [macbook:24161] *** Process received signal ***
    [macbook:24161] Signal: Segmentation fault (11)
    [macbook:24161] Signal code: Address not mapped (1)
    [macbook:24161] Failing at address: 0x440000b0
    [macbook:24161] [ 0] 2   libSystem.B.dylib                   0x00007fff871371ba _sigtramp + 26
    [macbook:24161] [ 1] 3   libSystem.B.dylib                   0x00007fff70c545e0 __stack_chk_guard + 0
    [macbook:24161] [ 2] 4   simple                              0x0000000100000eb6 main + 42
    [macbook:24161] [ 3] 5   simple                              0x0000000100000e84 start + 52
    [macbook:24161] *** End of error message ***
    mpirun noticed that job rank 0 with PID 24160 on node macbook.local exited on signal 11 (Segmentation fault). 
    1 additional process aborted (not shown)
    Anyone have ANY suggestions? I honestly have no idea.

  2. #2
    and the Hat of Guessing tabstop's Avatar
    Join Date
    Nov 2007
    Posts
    14,185
    Does it still fail if you give it command line arguments?

  3. #3
    Registered User
    Join Date
    May 2010
    Posts
    245
    Quote Originally Posted by tabstop View Post
    Does it still fail if you give it command line arguments?
    I just tried it, and it still fails with the same errors.

    I did

    Code:
    mpirun -np 4 testmpi sdf

  4. #4
    and the Hat of Guessing tabstop's Avatar
    Join Date
    Nov 2007
    Posts
    14,185
    I don't know enough about MPI to know if this is a problem: you have C code. You have specified the C++ compiler and the C++ standard library. Does it work if you change the compiler/libraries to the C versions? Does it work if you change the code to match the C++ bindings (MPI namespace, etc.)?
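    For reference, here is a sketch of what the C++-bindings version of the same hello world might look like, assuming an MPI-2 implementation that still ships the C++ API (it was deprecated in MPI-2.2 and removed in MPI-3, so the C API is generally preferred):

    ```cpp
    // Sketch: the same hello world via the MPI-2 C++ bindings.
    // Requires an MPI implementation that provides the MPI namespace.
    #include <cstdio>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI::Init(argc, argv);                  // initialize the MPI runtime
        int node = MPI::COMM_WORLD.Get_rank();  // rank of this process
        std::printf("Hello World from Node %d\n", node);
        MPI::Finalize();                        // shut the runtime down
        return 0;
    }
    ```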

  5. #5
    Registered User
    Join Date
    May 2010
    Posts
    245
    Quote Originally Posted by tabstop View Post
    I don't know enough about MPI to know if this is a problem: you have C code. You have specified the C++ compiler and the C++ standard library. Does it work if you change the compiler/libraries to the C versions? Does it work if you change the code to match the C++ bindings (MPI namespace, etc.)?
    Unfortunately, no. This was just a simple example I got from online and tried to run. The code I was actually trying to test is C++, and I get the exact same errors.

    I was able to log in to a cluster and run this on the login nodes, and the hello world example does indeed work. There must be something goofed up with my configuration.

    I'd like to know what got goofed up. I've run sample MPI apps on my MacBook before with no problem.

  6. #6
    Registered User
    Join Date
    Jan 2010
    Posts
    412
    __stack_chk_guard sounds like some kind of stack corruption protector. What happens if you compile with -fno-stack-protector? Does it crash somewhere else then?
    Also try compiling with -g to see the line number of the crash instead of just "main + 42"
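    The suggested debug build would look roughly like this, assuming the mpiCC wrapper forwards unrecognized flags to the underlying compiler (Open MPI's wrappers do):

    ```shell
    # Rebuild with debug symbols and the stack protector disabled;
    # the wrapper passes both flags through to g++.
    mpiCC -g -fno-stack-protector -o simple simple.cpp -lstdc++
    mpirun -np 2 simple
    ```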

  7. #7
    Registered User
    Join Date
    May 2010
    Posts
    245
    Quote Originally Posted by _Mike View Post
    __stack_chk_guard sounds like some kind of stack corruption protector. What happens if you compile with -fno-stack-protector? Does it crash somewhere else then?
    Also try compiling with -g to see the line number of the crash instead of just "main + 42"
    I compiled it with -g and -fno-stack-protector, and I get the exact same error (main +42). Perhaps mpiCC doesn't understand those options?

    Edit: I compiled with -g once and ran it, then with -fno-stack-protector once and ran it. Same error both runs.

  8. #8
    Registered User
    Join Date
    Jan 2010
    Posts
    412
    Quote Originally Posted by dayalsoap View Post
    I compiled it with -g and -fno-stack-protector, and I get the exact same error (main + 42). Perhaps mpiCC doesn't understand those options?
    It should. As far as I know, mpiCC is just a wrapper for g++, but I have never written any code on Mac OS before, so I don't know whether -g is supported there or you have to use something else.
    I don't have any experience with Open MPI, but I did try your code under Mandriva 2010.2 x64 with Open MPI 1.4.1, and it works fine for me, so the problem is probably not in your code.
    Do Macs have Valgrind? Maybe that will give some more information about the crash.

  9. #9
    Registered User
    Join Date
    May 2010
    Posts
    245
    Yes, I can get it to work on Linux. I'm going to reinstall some things and see what happens.

  10. #10
    Registered User
    Join Date
    May 2010
    Posts
    245
    OK, I installed the latest version of Open MPI, and it works. The only difference is that instead of mpiCC, I had to use mpic++. That's it.
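    A minimal sketch of the working build, assuming a current Open MPI install where mpic++ is the C++ wrapper compiler:

    ```shell
    # mpic++ is Open MPI's C++ wrapper; it adds the MPI include paths
    # and libraries automatically, so no -lstdc++ is needed.
    mpic++ -o simple simple.cpp
    mpirun -np 2 simple
    ```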

  11. #11

    Join Date
    May 2005
    Posts
    1,041
    Congrats on getting that to work. Were you experimenting with this, or did you have a particular application in mind? I've got the same example you posted working in C++, but I'm looking to use IMSL/ScaLAPACK Fortran MPI to parallelize a panel code... if I ever get around to it, which might take exactly a billion years.

    Anyway, perhaps you'd like to keep in touch.

    Cheers.
    I'm not immature, I'm refined in the opposite direction.


