Manipulating 60MB files

This is a discussion on Manipulating 60MB files within the C++ Programming forums, part of the General Programming Boards category; Help me, I need to find out how to manipulate a 60MB file using C++ I need to cast it ...

  1. #1
    Registered User
    Join Date
    May 2002
    Posts
    1

    Question Manipulating 60MB files

    Help me: I need to find out how to manipulate a 60MB file using C++.
    At some point I need to cast it to a variable.
    Is it even possible?

  2. #2
    Guest Sebastiani's Avatar
    Join Date
    Aug 2001
    Location
    Waterloo, Texas
    Posts
    5,659
    Don't be silly. The only restriction is the particular system's performance. Just treat it like any other chunk of memory...



    ITSA
    Socket Library!

  3. #3
    the hat of redundancy hat nvoigt's Avatar
    Join Date
    Aug 2001
    Location
    Hannover, Germany
    Posts
    3,139
    >I need to cast it to a variable at some point.

    You can open it using the FILE structure functions. You can then read it into memory, manipulate that memory like you would any other piece of memory, and write it back to disk.

    If you have problems with a specific part, please post some code.
    hth
    -nv

    She was so Blonde, she spent 20 minutes looking at the orange juice can because it said "Concentrate."

    When in doubt, read the FAQ.
    Then ask a smart question.
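    The read-it-all-in approach described above can be sketched like this (the file name and the byte-flip step are just illustrative; a 60 MB buffer is fine on any reasonably modern machine):

    ```cpp
    #include <cstdio>
    #include <vector>

    // Read an entire file into memory using the C FILE functions.
    std::vector<unsigned char> read_whole_file(const char* path) {
        std::vector<unsigned char> buf;
        FILE* f = std::fopen(path, "rb");
        if (!f) return buf;
        std::fseek(f, 0, SEEK_END);     // find the size by seeking to the end
        long size = std::ftell(f);
        std::fseek(f, 0, SEEK_SET);
        buf.resize(size);
        if (size > 0 && std::fread(buf.data(), 1, buf.size(), f) != buf.size())
            buf.clear();                // short read: report failure as empty
        std::fclose(f);
        return buf;
    }

    void write_whole_file(const char* path, const std::vector<unsigned char>& buf) {
        FILE* f = std::fopen(path, "wb");
        if (!f) return;
        std::fwrite(buf.data(), 1, buf.size(), f);
        std::fclose(f);
    }

    int main() {
        // "demo.bin" is a hypothetical name; make a tiny stand-in file here.
        std::vector<unsigned char> data = {'a', 'b', 'c'};
        write_whole_file("demo.bin", data);

        std::vector<unsigned char> loaded = read_whole_file("demo.bin");
        for (unsigned char& b : loaded) b ^= 0xFF;  // manipulate it like any memory
        for (unsigned char& b : loaded) b ^= 0xFF;  // undo, so the round trip matches
        write_whole_file("demo.bin", loaded);
        return loaded == data ? 0 : 1;
    }
    ```

    The same pattern works at 60 MB; only the `resize` gets bigger.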

  4. #4
    Just because ygfperson's Avatar
    Join Date
    Jan 2002
    Posts
    2,493
    btw, is the FILE* pointer a typecast of another type of pointer, like a void* or a char*?

  5. #5
    Registered User
    Join Date
    Apr 2002
    Posts
    249

    Guys, is there any difference?

    Is there any difference if the file is very big or very small?

    If so, we will have a problem in our future programming, because there is no database file as small as 1 MB, and so on...

    I think the file that you have has some error, or the program that you wrote has some error. If so, can you post it?
    Thanks
    C++
    The best

  6. #6
    Registered User Azuth's Avatar
    Join Date
    Feb 2002
    Posts
    236
    I'm not sure I quite follow the preceding post. My understanding would be that there is no need to read an entire database into memory in order to access it; we just need to read through it to find the information we want, then load that into memory and manipulate it. We streamline the process by caching, sorting and re-indexing (in no particular order or relation).

    The database could take up half our hard disk, but with an efficient index we would not have much trouble finding the record(s) we require with reasonable efficiency.

  7. #7
    Registered User Liam Battle's Avatar
    Join Date
    Jan 2002
    Posts
    114
    haha, no offense, but you guys must be new to binary access...
    just learn how to use fstream and you have already answered your question...
    LB0: * Life once school is done
    LB1: N <- WakeUp;
    LB2: N <- C++_Code;
    LB3: N >= Tired : N <- Sleep;
    LB4: JMP*-3;

  8. #8
    Evil Member
    Join Date
    Jan 2002
    Posts
    638
    Well, I think he does pose a valid concern; he just didn't ask very well.

    There isn't any reason you can't work with a huge file using standard constructs, BUT there are some dos and don'ts when it comes to working with a lot of data. There are a few limits to what a machine can do, specifically stack size and file pointers. Beyond that, a bigger file means a bigger time hit for non-O(1) operations, but there is no reason you can't do it. So, if you plan on making a program scale to very large files, you should shy away from any kind of recursive algorithm (like most unoptimised quicksorts), because they use stack space, and if N is on the order of thousands or tens of thousands, you will quickly run out of stack space. That would be bad.

    Oh yeah, and dynamic memory too. Almost forgot. If you make a char array using new[] and read your entire hard drive into it, you will have problems.

    A good idea in this case would be to work with your file a little at a time.

  9. #9
    Registered User Liam Battle's Avatar
    Join Date
    Jan 2002
    Posts
    114
    exactly...
    that's why I told him to read up on fstream using BINARY MODE...

    you can analyse small amounts of data at a time, as much as you want... and you have EXCELLENT control over the file itself...
    LB0: * Life once school is done
    LB1: N <- WakeUp;
    LB2: N <- C++_Code;
    LB3: N >= Tired : N <- Sleep;
    LB4: JMP*-3;
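    The chunk-at-a-time approach the last two posts describe might look like this with fstream in binary mode (the file name and the 4 KB chunk size are arbitrary choices for illustration):

    ```cpp
    #include <fstream>
    #include <vector>

    // Stream through a file in fixed-size chunks, so memory use stays small
    // and constant no matter how large the file is.
    long long total_bytes_chunked(const char* path) {
        std::ifstream in(path, std::ios::binary);
        std::vector<char> buf(4096);              // 4 KB window into the file
        long long total = 0;
        while (in.read(buf.data(), buf.size()) || in.gcount() > 0) {
            std::streamsize got = in.gcount();    // bytes actually read this pass
            total += got;                         // any per-chunk work goes here
        }
        return total;
    }

    int main() {
        // "big.bin" is an illustrative name; create a small stand-in here.
        {
            std::ofstream out("big.bin", std::ios::binary);
            for (int i = 0; i < 10000; ++i) out.put(static_cast<char>(i & 0xFF));
        }
        return total_bytes_chunked("big.bin") == 10000 ? 0 : 1;
    }
    ```

    Note the loop condition: the final, partial chunk makes `read` fail, but `gcount()` still reports the bytes it delivered, so nothing is dropped.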

  10. #10
    Has a Masters in B.S.
    Join Date
    Aug 2001
    Posts
    2,267
    maybe consider reading and manipulating small chunks of the file at a time; it would probably be faster and more efficient!

    >btw, is the FILE* pointer a typecast of another type of pointer, like a void* or a char*?

    No, actually it's a structure.

    Here's CodeWarrior's FILE:

    Code:
    struct _FILE 
    {
    	__file_handle		handle;
    	__file_modes		mode;
    	__file_state		state;
    #ifndef _No_Disk_File_OS_Support							/* mm 981007 */
    	unsigned char       is_dynamically_allocated;			/* mm 981007 */
    #endif  /* not _No_Disk_File_OS_Support */					/* mm 981007 */
    	unsigned char		char_buffer;
    	unsigned char		char_buffer_overflow;
    	unsigned char		ungetc_buffer[__ungetc_buffer_size];
    #ifndef __NO_WIDE_CHAR										/* mm 980204 */
    	wchar_t				ungetwc_buffer[__ungetc_buffer_size];
    #endif /* not __NO_WIDE_CHAR */								/* mm 980204 */
    	unsigned long		position;
    	unsigned char *		buffer;
    	unsigned long		buffer_size;
    	unsigned char *		buffer_ptr;
    	unsigned long		buffer_len;
    	unsigned long		buffer_alignment;
    	unsigned long		saved_buffer_len;
    	unsigned long		buffer_pos;
    	__pos_proc			position_proc;
    	__io_proc			read_proc;
    	__io_proc			write_proc;
    	__close_proc		close_proc;
    	__idle_proc			idle_proc;
    #ifndef _No_Disk_File_OS_Support							/* mm 981007 */
    	struct _FILE *      next_file_struct;					/* mm 981007 */
    #endif  /* not _No_Disk_File_OS_Support */					/* mm 981007 */
    };
    And here's MSVC's:

    Code:
    struct _iobuf {
            char *_ptr;
            int   _cnt;
            char *_base;
            int   _flag;
            int   _file;
            int   _charbuf;
            int   _bufsiz;
            char *_tmpfname;
            };
    ADVISORY: This users posts are rated CP-MA, for Mature Audiences only.

  11. #11
    Registered User
    Join Date
    Feb 2002
    Posts
    114
    Create a virtual file system (VFS) to handle large files. A VFS is a class handling all access to your large file (you create a file system within the large file, kind of). Now you can index your file, have files within that file, and keep a file table (which data is where in the file). This way you have great control over the file, and you only need to read in the part of the file that you need. Combined with a resource handler, this is a powerful way of handling large data. The resource handler sees to it that data is only loaded once, and then returns a pointer to that data whenever it's needed. This is ideal for textures, sound and game data in a game.

    Oh, and the VFS uses fstream in binary mode (or any other file handler really, depending on the kind of data).
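    A minimal sketch of that index idea, with hypothetical names (`PackFile`, `add_entry`, `load`); a real VFS would store the table inside the file itself rather than register entries by hand:

    ```cpp
    #include <cstddef>
    #include <fstream>
    #include <map>
    #include <string>
    #include <vector>

    // A toy "virtual file system": one big file plus an in-memory table
    // recording where each named blob lives inside it.
    struct Entry { std::streamoff offset; std::size_t length; };

    class PackFile {
        std::map<std::string, Entry> table_;
        mutable std::ifstream in_;
    public:
        explicit PackFile(const std::string& path)
            : in_(path, std::ios::binary) {}

        void add_entry(const std::string& name, std::streamoff off, std::size_t len) {
            table_[name] = Entry{off, len};
        }

        // Load only the slice that belongs to `name`, never the whole file.
        std::vector<char> load(const std::string& name) const {
            std::vector<char> buf;
            auto it = table_.find(name);
            if (it == table_.end()) return buf;
            in_.clear();                        // reset eof/fail from earlier reads
            in_.seekg(it->second.offset);       // jump straight to the blob
            buf.resize(it->second.length);
            in_.read(buf.data(), buf.size());
            return buf;
        }
    };

    int main() {
        // Build a tiny two-blob pack file for demonstration.
        { std::ofstream out("pack.bin", std::ios::binary); out << "helloworld"; }
        PackFile pack("pack.bin");
        pack.add_entry("greeting", 0, 5);       // "hello"
        pack.add_entry("target",   5, 5);       // "world"
        std::vector<char> v = pack.load("target");
        return std::string(v.begin(), v.end()) == "world" ? 0 : 1;
    }
    ```

    The resource-handler layer the post mentions would sit on top of this, caching each loaded slice so repeated requests return the same buffer.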
