Thread: exec a bash script

  1. #16
    Registered User
    Join Date
    May 2011
    Posts
    116
    Quote Originally Posted by MK27 View Post
    Is it necessary to put the echo in the while loop, or could you concatenate all the data there instead and then echo it all at once afterward?
    In the bash script, all the echo commands are stored in a file, let's say its name is "aaa". Then, for every line I read from that file in the script, I have to pass it into the pipe so that my .cpp program can read what the script sends. So in the script I have:


    Code:
    while read line
    do
        echo $line >> $2
    done < 'aaa'
    where line is every line of the file aaa and $2 is the name of the pipe the line has to be written to.

    Then in my .cpp program I do:


    Code:
    while (child.getline(line, LINESIZE)) {
        cout << line << endl;
        sleep(1);
    }
    where child is the ifstream with which the .cpp program reads the pipe.
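
    For reference, a stripped-down, self-contained sketch of that kind of reader would look something like this (the pipe name coming in as argv[1] and the LINESIZE value here are just placeholders, not necessarily what my real program uses):

    Code:
    #include <fstream>
    #include <iostream>
    #include <unistd.h>      // sleep()

    const int LINESIZE = 256;    // placeholder buffer size

    int main(int argc, char *argv[])
    {
        if (argc < 2) return 1;

        // opening a FIFO for reading blocks until the script opens it for writing
        std::ifstream child(argv[1]);
        char line[LINESIZE];

        while (child.getline(line, LINESIZE)) {
            std::cout << line << std::endl;
            sleep(1);
        }
        return 0;
    }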

    Is there another way to write my data from the script to the pipe?
    It's the simplest way I can think of.

  2. #17
    spurious conceit MK27's Avatar
    Join Date
    Jul 2008
    Location
    segmentation fault
    Posts
    8,300
    The problem is that in the reader, you are calling getline(), which clears the pipe that echo was waiting on, and then there is a basic "race condition" between the processes: if the bash script manages to add another line in its loop before the reader loop calls getline() again, then you are in luck. If not, there is no line to get, so getline() fails and the while loop exits.

    You could solve this problem a few different ways. You could try using & with echo in the bash loop, but that is still not a guarantee it will keep up, and if it does, it may create an undesirable number of forks.

    The simplest would be to just feed the whole file at once into the pipe:

    Code:
    cat 'aaa' > $2
    If there is a reason you need to loop through the file one line at a time, this is what I meant by concatenating the data first and then sending it:

    Code:
    data=""
    while read line
    do
        data=$data$line"\n"
    done < 'aaa'
    echo -e "$data" > $2
    The "\n" is because read chomps the newline. The -e is important otherwise echo will output a literal slash-small-n and not a newline.

    Finally, if you absolutely have to feed a line at a time into the pipe, then you can do something like this:

    Code:
    while read line
    do
        echo $line >> $2
    done < 'aaa'
    echo "***END***" >> $2
    And the reader:

    Code:
    #include <cstring>

    char line[LINESIZE];  // I'm assuming something like this
    while (1) {
        child.getline(line, LINESIZE);
        if (!strcmp(line, "***END***")) break;
        cout << line << endl;
    }
    The only pitfall with this is if the bash script exits prematurely and never sends ***END***, because in terms of error checking in the reader (via .good(), .eof(), .fail(), .bad()) there may be no perceivable difference between that and simply having to wait for input from the pipe.

    It also may not work if the 'child' fstream closes, or has the failbit or eofbit set, when getline() returns from an empty pipe. In that case you could try checking or resetting those. It may end up that you have to re-open the stream each time.

    Finally, it's also a busy loop, but probably not too bad a one.
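
    If you do try the reset route, this is the kind of thing I mean (untested sketch; it assumes the pipe name is passed as argv[1] like the script's $2, and the LINESIZE value is just a placeholder):

    Code:
    #include <cstring>
    #include <fstream>
    #include <iostream>
    #include <unistd.h>

    const int LINESIZE = 256;          // placeholder buffer size

    int main(int argc, char *argv[])
    {
        if (argc < 2) return 1;

        std::ifstream child(argv[1]);  // argv[1] = the pipe, i.e. the script's $2
        char line[LINESIZE];

        while (1) {
            if (child.getline(line, LINESIZE)) {
                if (!std::strcmp(line, "***END***")) break;
                std::cout << line << std::endl;
            } else {
                // getline failed -- most likely the pipe was just empty, so
                // clear eofbit/failbit and try again instead of bailing out.
                // If clearing alone is not enough, close and re-open the
                // stream here instead.
                child.clear();
                sleep(1);              // keep this from being a tight busy loop
            }
        }
        return 0;
    }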

    WRT adding small delays: don't use them as the sole means of synchronization, but it is fine to combine one with the more reliable techniques above if you think it will smooth things out, or to take the potential teeth out of a busy loop. You can get delays smaller than one second on Linux using nanosleep():

    SourceForge.net: POSIX timers - cpwiki

    Beware the caveat about "granularity" there; don't bother with gaps less than 10ms (meaning, your 500 line file will take at least 5 seconds).
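
    For instance, a millisecond-scale sleep helper might look like this (just a sketch; the msleep name and the 50 ms figure are only for illustration):

    Code:
    #include <time.h>   // nanosleep(), struct timespec (POSIX)

    /* Sleep for roughly ms milliseconds.  Actual resolution depends on the
       kernel's timer granularity, so don't expect much below ~10ms. */
    static void msleep(long ms)
    {
        struct timespec delay;
        delay.tv_sec  = ms / 1000;
        delay.tv_nsec = (ms % 1000) * 1000000L;
        nanosleep(&delay, NULL);
    }
    In the reader loop you would then call something like msleep(50) instead of sleep(1).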
    C programming resources:
    GNU C Function and Macro Index -- glibc reference manual
    The C Book -- nice online learner guide
    Current ISO draft standard
    CCAN -- new CPAN like open source library repository
    3 (different) GNU debugger tutorials: #1 -- #2 -- #3
    cpwiki -- our wiki on sourceforge

  3. #18
    Registered User
    Join Date
    May 2011
    Posts
    116
    Quote Originally Posted by MK27 View Post
    The problem is that in the reader, you are calling getline(), which clears the pipe that echo was waiting on, and then there is a basic "race condition" between the processes: if the bash script manages to add another line in its loop before the reader loop calls getline() again, then you are in luck. If not, there is no line to get, so getline() fails and the while loop exits.

    You could solve this problem a few different ways. You could try using & with echo in the bash loop, but that is still not a guarantee it will keep up, and if it does, it may create an undesirable number of forks.

    The simplest would be to just feed the whole file at once into the pipe:

    Code:
    cat 'aaa' > $2
    Thank you so much for your answer!
    There's absolutely no reason to read the file line by line; I just want to pass the whole file into the pipe.
    Learned a lot from your post!

