Help me, I need to find out how to manipulate a 60MB file using C++
I need to cast it to a variable at some point.
Is it even possible?
Don't be silly. The only restrictions are the particular system's performance. Just treat it like any other chunk of memory...
Code:
#include <cmath>
#include <complex>

bool euler_flip(bool value)
{
    return std::pow(std::complex<float>(std::exp(1.0)),
                    std::complex<float>(0, 1)
                        * std::complex<float>(std::atan(1.0) * (1 << (value + 2)))
               ).real() < 0;
}
>I need to cast it to a variable at some point.
You can open it using the FILE structure functions. You can then read it into memory, manipulate the memory like you would manipulate any other piece of memory and write it back to disk.
If you have problems with a specific part, please post some code.
hth
-nv
She was so Blonde, she spent 20 minutes looking at the orange juice can because it said "Concentrate."
When in doubt, read the FAQ.
Then ask a smart question.
btw, is the FILE* pointer a typecast of another type of pointer, like a void* or a char*?
Is there any difference if the file is very big or very small?
If so, we will have a problem in our future programming, because there is no database file as small as 1 MB, and so on...
I think the file that you have has some error, or the program that you wrote has some error. If so, can you post it?
Thanks
C++
The best
I'm not sure I quite follow the preceding post. My understanding would be that there is no need to read an entire database into memory in order to access it; we just need to read through it to find the information we want, then load that into memory and manipulate it. We streamline the process by caching, sorting and re-indexing (in no particular order or relation).
The database could take up half our hard disk, but with an efficient index we would not have much trouble finding the record(s) we require with reasonable efficiency.
haha, no offense but you guys must be new to binary access...
just learn how to use the fstream and you have already answered your question...
LB0: * Life once school is done
LB1: N <- WakeUp;
LB2: N <- C++_Code;
LB3: N >= Tired : N <- Sleep;
LB4: JMP*-3;
Well, I think he does pose a valid concern, he just didn't ask very well.
There isn't any reason you can't work with a huge file using standard constructs, BUT there are some dos and don'ts when it comes to working with a lot of data. There are a few limits to what a machine can do, specifically stack size and file pointers. Beyond that, a bigger file means a bigger time hit for non-O(1) operations, but there is no reason you can't do it. So, if you plan on making a program scalable to very large files, you should shy away from any kind of recursive algorithm (like most unoptimised quicksorts), because they use stack space, and if N is on the order of thousands or tens of thousands, you will quickly run out of stack space. That would be bad.
Oh yeah, and dynamic memory too. Almost forgot. If you make a char array using new[] and read your entire hard drive into it, you will have problems.
A good idea in this case would be to work with your file a little at a time.
exactly...
thats why i told him to read up on the fstream using BINARY MODE...
you can analyse small amounts of data at a time as much as you want... and you have EXCELLENT control over the file itself...
maybe consider reading and manipulating small chunks of the file at a time; it would probably be faster and more efficient!
>btw, is the FILE* pointer a typecast of another type of pointer, like a void* or a char*?
No, actually it's a structure.
heres CodeWarrior's FILE:
Code:
struct _FILE {
    __file_handle handle;
    __file_modes  mode;
    __file_state  state;
#ifndef _No_Disk_File_OS_Support                        /* mm 981007 */
    unsigned char is_dynamically_allocated;             /* mm 981007 */
#endif /* not _No_Disk_File_OS_Support */               /* mm 981007 */
    unsigned char char_buffer;
    unsigned char char_buffer_overflow;
    unsigned char ungetc_buffer[__ungetc_buffer_size];
#ifndef __NO_WIDE_CHAR                                  /* mm 980204 */
    wchar_t ungetwc_buffer[__ungetc_buffer_size];
#endif /* not __NO_WIDE_CHAR */                         /* mm 980204 */
    unsigned long  position;
    unsigned char *buffer;
    unsigned long  buffer_size;
    unsigned char *buffer_ptr;
    unsigned long  buffer_len;
    unsigned long  buffer_alignment;
    unsigned long  saved_buffer_len;
    unsigned long  buffer_pos;
    __pos_proc     position_proc;
    __io_proc      read_proc;
    __io_proc      write_proc;
    __close_proc   close_proc;
    __idle_proc    idle_proc;
#ifndef _No_Disk_File_OS_Support                        /* mm 981007 */
    struct _FILE  *next_file_struct;                    /* mm 981007 */
#endif /* not _No_Disk_File_OS_Support */               /* mm 981007 */
};

and heres MSVC's:
Code:
struct _iobuf {
    char *_ptr;
    int   _cnt;
    char *_base;
    int   _flag;
    int   _file;
    int   _charbuf;
    int   _bufsiz;
    char *_tmpfname;
};
ADVISORY: This user's posts are rated CP-MA, for Mature Audiences only.
Create a virtual filesystem to handle large files. A VFS is a class that handles all access to your large file (you create a filesystem within the file, in a way). You can then index the file, store files inside it, and keep a file table recording what data lives where. This gives you great control over the file, and you only ever need to read in the part of it that you need. Combined with a resource handler this is a powerful way of handling large data: the resource handler sees to it that data is only loaded once, and then returns a pointer to that data whenever it is needed. This is ideal for textures, sound and game data in a game.
Oh, and the VFS uses fstream in binary mode (or any other file handler really, depending on the kind of data).