Really weird file I/O problem

This is a discussion on Really weird file I/O problem within the C Programming forums, part of the General Programming Boards category.

  1. #1
    Registered User
    Join Date
    May 2002
    Posts
    208

    Really weird file I/O problem

    Hey all,

    I have been attempting to write the following simple program. I want to take a file that looks like this

    Code:
    Header stuff not needed
    more useless header
    5.88 3.55 1.22
    6.77 8.99 2.33
     * * *
     * * *
    and make it look like this....

    Code:
    5.88,3.55,1.22
    6.77,8.99,2.33
     * ,* ,*
     * , * , *
    Easy enough, right? Well, file 1 has 4002 lines and file 2 will have 4000, and I think this is where the problem lies. I wrote the following code.

    Code:
    #include <stdio.h>
    
    #define NUMLINES 4000
    
    int main(){
    
    char name[20],outname[20];
    double x[NUMLINES],y[NUMLINES],z[NUMLINES],dummy;
    FILE *infile, *outfile;
    
    for(int i=0;i<NUMLINES;i++){
    	x[i]=-999.999;
    	y[i]=-999.999;
    	z[i]=-999.999;
    }
    
    printf("name?   ");
    scanf("&#37;s",name);
    
    printf("outname? ");
    scanf("%s",outname);
    
    infile = fopen(name,"r");
    outfile = fopen(outname,"w");
    
    if(infile && outfile){
    fscanf(infile,"%lf %lf %lf %lf %lf\n",&dummy,&dummy,&dummy,&dummy,&dummy);
    fscanf(infile,"%lf %d\n",&dummy,(int)&dummy);
    
     for (int i=0; i<NUMLINES;i++){
    	fscanf(infile,"%lf %lf %lf \n", &x[i],&y[i],&z[i]);    
    	printf("%f %f %f",x[i],y[i],z[i]);
           fprintf(outfile,"%lf,%lf,%lf\n",x[i],y[i],z[i]);
     }
     
     fclose(infile);
     fclose(outfile);
     }
    else{printf("\n\n Something went awry\n");}
     return 0;
    
    }
    But it was not giving me anything in the arrays. They were all still -999.999 when output. I could not locate the bug in this code for the life of me so I started again from scratch. I wrote this code.

    Code:
    #include <stdio.h>
    #include <stdlib.h>
    
    int main(void)
    {
    
       FILE *fp;
       double x[4000],y[4000],z[4000];
    
    
       if((fp=fopen("rest","r"))==NULL)
        {
          printf("Cannot open file \n");
          exit(0);
        }
    
         for (int k=0;k<4000;k++)
        {
          fscanf(fp,"%lf %lf %lf \n", &x[k],&y[k],&z[k]);
          printf("%lf %lf %lf\n",x[k],y[k],z[k]);
        }
        
        fclose(fp);
    
        return 0;
    
    }
    and unless I read only the first 40 or so lines, I get garbage printed to the screen.

    I realize this is a simple issue but I am baffled as to what could be going on. I have had other co-workers look at the code to no avail as well. Any help someone could provide would be greatly appreciated.
    Jeff Paddon
    Undergraduate Research Assistant
    Physics Department
    St. Francis Xavier University

  2. #2
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    How about adding a check to verify that the return value from fscanf is actually 3, and if not, print the line you're on. fscanf returns the number of input items it successfully converted.

    For skipping the first line (in your first code-sample), why not use either "fgets()", or just a
    Code:
    while (fgetc(file) != '\n');
    You may want to remove the "\n" from your fscanf format too - in a format string it just matches any run of whitespace, which I don't think is what you want.

    --
    Mats

  3. #3
    Chinese pâté foxman's Avatar
    Join Date
    Jul 2007
    Location
    Canada
    Posts
    404
    You should use fgets for reading from the file, and sscanf to extract the information from the read line. From there you shouldn't have any real problems if you are careful (that is, the buffer into which you read the line must be large enough to hold one complete line, or it might get ugly without proper handling).

    Reading with fscanf is a bit silly, because it won't necessarily discard the '\n' at the end of the line, so the next time you read, well, things can go wrong (mostly, fscanf matches nothing and stores no value in your variable). I'm not quite sure of the details, but I know things get ugly heh.

  4. #4
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    Quote Originally Posted by foxman View Post
    You should use fgets for reading from the file, and sscanf to extract the information from the read line. From there you shouldn't have any real problems if you are careful (that is, the buffer into which you read the line must be large enough to hold one complete line, or it might get ugly without proper handling).

    Reading with fscanf is a bit silly, because it won't necessarily discard the '\n' at the end of the line, so the next time you read, well, things can go wrong (mostly, fscanf matches nothing and stores no value in your variable). I'm not quite sure of the details, but I know things get ugly heh.
    I agree that using fgets() for all lines is a good thing - and perhaps check that the last char read, s[strlen(s)-1], is '\n', so that you're sure the line actually fit. Either that, or make sure the string is PLENTY long enough - and that can be hard if someone is intentionally trying to disrupt things with bad data, or if there is some "funny" data. Say the output is in "fixed" format (rather than scientific) and someone has some REALLY large numbers in there: that can produce a REALLY long line. A double can represent numbers with a magnitude up to about 10 to the power of +/-308, so unless your string is several hundred chars long, you may be seeing data that is technically valid but doesn't fit in your string! And there's of course no rule against supplying huge numbers of "unneeded" decimals or leading zeros.


    --
    Mats

  5. #5
    Registered User hk_mp5kpdw's Avatar
    Join Date
    Jan 2002
    Location
    Northern Virginia/Washington DC Metropolitan Area
    Posts
    3,806
    Code:
    #define NUMLINES 4000
    
    int main(){
    
    char name[20],outname[20];
    double x[NUMLINES],y[NUMLINES],z[NUMLINES],dummy;
    96,000+ bytes (assuming 8-byte doubles) seems a lot for a program's stack. There is no need to store all the values while writing to the file (unless you plan on really doing something with them later on). Just read 3 doubles from the input file, write those 3 doubles to the output file, rinse & repeat.
    "Owners of dogs will have noticed that, if you provide them with food and water and shelter and affection, they will think you are god. Whereas owners of cats are compelled to realize that, if you provide them with food and water and shelter and affection, they draw the conclusion that they are gods."
    -Christopher Hitchens

  6. #6
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    In a modern OS on a decent machine, it's quite possible to use megabytes of stack, but I agree, there's no need to store all the values if all you're doing is writing them back out again.
    In fact, if the file format were identical, there would be no need to read them in as doubles at all: just read and discard the first two lines, then read and write a chunk at a time (a chunk could be as little as a single char, or a whole line of 2000+ chars). Just make sure, if you read a line at a time, that the buffer is big enough - and if it isn't, that you can deal with it in some way that produces a predictable result:

    - bail out with an error message like "Line 1234 is too long - aborting".
    - fix it up by outputting what you got - WITHOUT A NEWLINE - and continue reading the rest of the line as before. This can be done as many times as needed.

    --
    Mats

  7. #7
    Registered User whiteflags's Avatar
    Join Date
    Apr 2006
    Location
    United States
    Posts
    7,762
    fscanf would work okay here. You could make a case for fgets and sscanf, but fscanf should do the job, simply because the file has a simple format.

    > for (int k=0;k<4000;k++)
    Okay, but the file could be a lot smaller or a lot bigger than 4000 rows. A way to handle that reality is to store the values in an array on the free store (or some other data structure) and let it grow as needed.

    > fscanf(fp,"%lf %lf %lf \n", &x[k],&y[k],&z[k]);
    Like mats was saying though, you're going to have to check the return value of fscanf. And if the file you're reading contains commas, don't ignore them in the format string - they're part of the input and will make the conversions fail.

  8. #8
    Chinese pâté foxman's Avatar
    Join Date
    Jul 2007
    Location
    Canada
    Posts
    404
    Yeah, in fact I just thought about it: using fscanf would be a good solution if you have a way of discarding the '\n' character that fscanf doesn't consume but fgets does (in the case where the buffer is large enough to hold a complete line). Like making a small macro
    Code:
    #define FPURGE(file) while (fgetc(file) != '\n')
    or just writing it inline in your code (but macros look cooler).

    If you do have large values stored in a non-scientific format (like matsp talked about), it would in fact be better to use this solution, I guess, since it takes fewer operations to do the same thing.

    And like everybody said, you should check the value returned by fscanf.

  9. #9
    and the hat of int overfl Salem's Avatar
    Join Date
    Aug 2001
    Location
    The edge of the known universe
    Posts
    32,852
    > while (fgetc(file) != '\n')
    More hackery.
    If this gets an EOF before it gets a '\n', you're sunk in an infinite loop.
    If you dance barefoot on the broken glass of undefined behaviour, you've got to expect the occasional cut.
    If at first you don't succeed, try writing your phone number on the exam paper.
    I support http://www.ukip.org/ as the first necessary step to a free Europe.

  10. #10
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    Quote Originally Posted by Salem View Post
    > while (fgetc(file) != '\n')
    More hackery.
    If this gets an EOF before it gets a '\n', you're sunk in an infinite loop.
    And not everyone puts a newline before the end of the file, especially if the application generating the output crashed or otherwise "failed to complete". In that case, the next application hanging and eating 100% CPU until someone notices isn't going to make things better!

    --
    Mats

  11. #11
    Chinese pâté foxman's Avatar
    Join Date
    Jul 2007
    Location
    Canada
    Posts
    404
    I agree, this solution is only viable if you know the last character before the end of the file is a '\n'. How would you improve it? I only see this solution:

    Code:
    #define FPURGE(file) while ((fgetc(file) != '\n') && (!feof(file)))

  12. #12
    CSharpener vart's Avatar
    Join Date
    Oct 2006
    Location
    Rishon LeZion, Israel
    Posts
    6,484
    Code:
    #define FPURGE(file) {int c; while ( ((c = fgetc(file)) != EOF) && (c != '\n'));}
    The first 90% of a project takes 90% of the time,
    the last 10% takes the other 90% of the time.


