how do i scan an entire line of a text file and store it into an array and then move on to the next line and store that into another array?
Any information would help, thanks.
Make a 2-d array, and use fgets(). (that's one way).
When all else fails, read the instructions.
If you're posting code, use code tags: [code] /* insert code here */ [/code]
Is there a way to make fgets() go to a certain line? I am making a program that needs to read a huge text file and store the information in a linked list. My first step is to get fgets() to jump to certain lines of the text file.
I am not sure if that makes any sense... I don't know how else to explain it.
>>to have fgets() go to certain lines according to the text file
If you've opened in text mode, the only way to find a specific line is to read every line in, and keep going until you hit the one you want. Once you've found your line, you can go back to it again using fseek(), provided you've stored the start location using ftell().
>>i dont know how to explain
Show a sample of the input, and the code you've built so far. Tell us what you want your program to do, and what you're having trouble with. If you're getting really stuck, break the problem down into small sections to make life easier.
Hi,
Here is some sample code I wrote for reading lines from a file's buffer:
Code 1.1: Using readFilebyLines() to break lines in a buffer
[code]
#include <stdio.h>
#include <stdlib.h>

int readFilebyLines(char *buffer, char string[][1024])
{
    int b, j, l, z = 1;
    int newLine[128];
    char *pch;
    static char *p;

    /*
    ** Get each line
    */
    newLine[0] = 0;
    pch = strchr(buffer, '\n');
    if (pch == NULL) {
        string[0][0] = '\0';
        return false;
    }
    while (pch != NULL) {
        newLine[z] = pch - buffer + 1;
        pch = strchr(pch + 1, '\n');
        z++;
    }
    newLine[z] = (int)strlen(buffer) + 1;

    for (l = 0; l < z; l++) {
        b = 0;
        for (j = newLine[l]; j < newLine[l + 1] - 1; j++) {
            string[l][b] = buffer[j];
            b++;
        }
        string[l][b] = '\0';
    }
    return z;
}

void main()
{
    FILE *file;
    int lineCount;
    char *buffer;
    char fileLines[128][1024];
    unsigned int bytesRead, lSize;

    file = fopen("C:\\test.txt", "r");
    if (file != NULL) {
        fseek(file, 0, SEEK_END);
        lSize = ftell(file);
        rewind(file);

        buffer = new char[(lSize + 1)];
        if (!buffer)
            return;

        bytesRead = fread(buffer, 1, lSize, file);
        buffer[bytesRead] = '\0';

        // Here is where each line is broken
        lineCount = readFilebyLines(buffer, fileLines);

        delete [] buffer;

        for (int i = 0; i < lineCount; i++) {
            printf("%s\n", fileLines[i]);
        }

        fclose(file);
    } else {
        printf("File not found.");
    }
}
[/code]
Hope this helps,
Stack Overflow
Segmentation Fault: I am an error in which a running program attempts to access memory not allocated to it and core dumps with a segmentation violation error. This is often caused by improper usage of pointers, attempts to access a non-existent or read-only physical memory address, re-use of memory if freed within the same scope, de-referencing a null pointer, or (in C) inadvertently using a non-pointer variable as a pointer.
Hurray, I have gotten fgets() to read the text file properly. This is the code that I have gotten to work...
The next problem is to allocate enough memory to read a text file of any size... any ideas?
[code]
#include <stdio.h>
#include <string.h>
#include <windows.h>

char subsection[1000];
char discription[1000];
char discriptiontemp[1000];
int points = 0;
int numoflines = 0;
int i = 0;
int num_of_entries = 0;
int outerloop = 0;

int main()
{
    FILE *inputfile;
    FILE *inputfile2;

    inputfile = fopen("Rune2.txt", "r");
    inputfile2 = fopen("rune.txt", "r");

    if (inputfile == NULL) {
        printf("error opening file");
    } else {
        printf("File opened successfully. Loading data...\n\n");
        while (fgets(discriptiontemp, 100, inputfile) != NULL) {
            strcat(subsection, discriptiontemp);
            i++;
        }
        printf("%s\n\n", subsection);
    }
}
[/code]
malloc() or calloc() for dynamic memory allocation.
If you insist on using an array, consider keeping track of the length of the text in said array inside your loop, and in your loop's control statement. Couple that with strncpy to make sure you stay in bounds.
Quzah.
Hope is the first step on the road to disappointment.
>void main()
main still returns int. Maybe I wasn't clear last time though. If you use void main then one of two things will happen on your system:
1) Undefined behavior, which turns your program into a time bomb.
2) Proper execution, which encourages the use of void main by the uninformed. (Or you may be on a freestanding implementation, where your code is still nonstandard because it uses stdio.h and stdlib.h.)
One of two things will happen here:
1) You will be corrected.
2) You will be ignored or not taken seriously because anyone who doesn't care enough to learn the language properly probably takes shortcuts elsewhere too.
Around here, void main is an indication of ignorance. No matter how talented of a programmer you are, if we see void main in your code then we immediately stop reading on the assumption that the rest of the program is of too poor quality to waste our time with.
>fseek(file, 0, SEEK_END);
This has no meaning for a text file. Not everyone uses a Unix spinoff where text and binary files are the same. Even if you open the file as binary, an fseek to SEEK_END isn't required to be meaningful, so you're relying on implementation-defined behavior.
>lSize=ftell(file);
There is a (very small) chance that this will not produce the size you want.
>buffer = new char[ (lSize + 1) ];
I think you meant to use malloc.
>if( !buffer )
new throws a bad_alloc exception. It doesn't return a null pointer anymore.
>delete [] buffer;
You mean free.
I'll assume that readFilebyLines works as you say without looking too closely at it because your code has been logically correct in the past. You're welcome.
My best code is written with the delete key.
Sorry about the "int main()" thing... I had never heard of that before. I have only been programming a few weeks. Thanks for your help, it is greatly appreciated.
Ah, yes.
Thanks again for pointing that out, Prelude. I was too busy trying to convert a C++ example I wrote over to C, and I didn't take the time to fully turn it around.
One thing I did see is that if I removed fseek(), the file wouldn't read after I used ftell(). I could remove it, but things seem to go awry without it. I also updated the main() code to a more presentable standard:
Code 1.1: Fixed/Better Implementation
[code]
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[])
{
    FILE *file;
    int lineCount;
    char *buffer;
    char fileLines[128][1024];
    unsigned int bytesRead, lSize;

    file = fopen("C:\\test.txt", "r");
    if (file != NULL) {
        // Get size (bytes) of file
        fseek(file, 0, SEEK_END);
        lSize = ftell(file);
        rewind(file);

        // Allocate memory to hold bytes
        buffer = (char *) malloc(lSize + 1);
        if (buffer == NULL)
            return 0;

        // Write file to buffer
        bytesRead = fread(buffer, 1, lSize, file);
        buffer[bytesRead] = '\0';

        // Here is where each line is broken
        lineCount = readFilebyLines(buffer, fileLines);

        // Free the buffer's memory
        free(buffer);

        // Display each line through a loop
        for (int i = 0; i < lineCount; i++) {
            printf("%s\n", fileLines[i]);
        }

        // Close the file
        fclose(file);
    } else {
        // If file is null, then the file doesn't exist
        printf("File not found.");
    }

    return 0;
}
[/code]
Also, I may have messed up again, I'm not sure. I just tried making everything C-like, etc...
Hope this helps,
Stack Overflow
>I could remove it, but seems things go awry without it.
That's true with your current implementation. However, since all you're doing is storing the lines of the file and the line count, you don't need to know the size of the file beforehand. You have the option of just assuming an upper limit on the lines (which you're basically doing with the fileLines array). So you could forgo dynamic allocation altogether and use static arrays for a simple program:
[code]
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define LINESIZE BUFSIZ
#define MAXLINES 128

int main ( void )
{
  FILE *in = fopen ( "test.txt", "r" );
  char fileLines[MAXLINES][LINESIZE];
  char buf[LINESIZE];
  int nlines = 0;
  int line = 0;

  if ( in == NULL ) {
    perror ( NULL );
    exit ( EXIT_FAILURE );
  }
  while ( fgets ( buf, sizeof buf, in ) != NULL ) {
    char *newline = strrchr ( buf, '\n' );

    if ( newline != NULL )
      *newline = '\0';
    strcpy ( fileLines[line++], buf );
    nlines++;
  }
  fclose ( in );
  for ( line = 0; line < nlines; line++ )
    printf ( "Line %d: \"%s\"\n", line + 1, fileLines[line] );
  printf ( "Total Lines: %d\n", nlines );

  return 0;
}
[/code]
The better option for any realistic program is to use a dynamic data structure that grows instead of limiting itself to a certain size. A linked list is ideal for this simple operation:
[code]
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define LINESIZE BUFSIZ

struct node {
  char line[LINESIZE];
  struct node *next;
};

int main ( void )
{
  FILE *in = fopen ( "test.txt", "r" );
  struct node *fileLines = NULL;
  struct node *walk, *save;
  char buf[LINESIZE];
  int nlines = 0;
  int line = 0;

  if ( in == NULL ) {
    perror ( NULL );
    exit ( EXIT_FAILURE );
  }
  while ( fgets ( buf, sizeof buf, in ) != NULL ) {
    char *newline = strrchr ( buf, '\n' );

    if ( newline != NULL )
      *newline = '\0';
    walk = fileLines;
    if ( fileLines == NULL ) {
      /* First node */
      walk = malloc ( sizeof *walk );
      if ( walk == NULL )
        break;
      strcpy ( walk->line, buf );
      walk->next = fileLines;
      fileLines = walk;
    }
    else {
      /* Tail insertion */
      while ( walk->next != NULL )
        walk = walk->next;
      walk->next = malloc ( sizeof *walk->next );
      if ( walk->next == NULL )
        break;
      strcpy ( walk->next->line, buf );
      walk->next->next = NULL;
    }
    nlines++;
  }
  fclose ( in );
  for ( walk = fileLines; walk != NULL; walk = save ) {
    save = walk->next;
    printf ( "Line %d: \"%s\"\n", ++line, walk->line );
    free ( walk );
  }
  printf ( "Total Lines: %d\n", nlines );

  return 0;
}
[/code]
Another improvement would be to use dynamically resized strings so that each line can be arbitrarily long (critical resources notwithstanding, of course).
The point is that there's no need to write crumbly code to allocate enough memory to hold a file when you can just as easily (or more easily in this case) deal with lines as they come. By doing so you end up with a more robust, solid program that doesn't rely on implementation-defined features.