Actually, in the meantime I've found my own solution! ^^
I've learned about unions and used this method to solve my problem:
Code:
FILE *ptrRAWread;
long lSize;
size_t result;
char *buffer;

union twobytes
{
    int intvalue;
    char byte[sizeof (int)];
} bytes;

ptrRAWread = fopen ("testinput.raw", "rb");
if (ptrRAWread == NULL) { fputs ("\nOpen error\n", stderr); exit (1); }

/* determine the file size, then go back to the start */
fseek (ptrRAWread, 0, SEEK_END);
lSize = ftell (ptrRAWread);
rewind (ptrRAWread);

buffer = malloc (lSize);   /* sizeof(char) is always 1; missing ';' fixed */
if (buffer == NULL) { fputs ("\nMemory error\n", stderr); exit (2); }

result = fread (buffer, 1, lSize, ptrRAWread);
if (result != (size_t) lSize) { fputs ("\nReading error\n", stderr); exit (3); }

/* first 16-bit sample: low byte, high byte, upper bytes zeroed
   (note: zeroing the upper bytes gives the unsigned value; a
   negative sample such as 0xFFFF would need sign extension) */
bytes.byte[0] = buffer[0];
bytes.byte[1] = buffer[1];
bytes.byte[2] = 0;
bytes.byte[3] = 0;

printf ("\nbytes.intvalue: %i\n", bytes.intvalue);

free (buffer);
fclose (ptrRAWread);
Obviously I'll use a "for" loop to step through the buffer two bytes at a time, both when reading and when writing values to files. Is that good? =)
The good thing about this method, if I haven't misunderstood little/big endian, is that I don't need to invert the byte order: on a little-endian system the integer is stored low byte first, the same direction as the values in my RAW file.
So if the RAW file says "FF 7F", I read byte[0] = 0xFF and byte[1] = 0x7F, and the integer value comes out as 32767. It's perfect!!! ^0^
Just one doubt, now:
I took the "char *buffer" approach from a file-reading example, but my RAW files can easily reach sizes of 512 MB... for all I know, I might never get such a large contiguous block of free memory (despite my 12 GB of RAM).
Is there an alternative, something halfway between continuously reading from the file and storing the whole thing in memory? Or maybe reading files byte by byte is not as bad as it looks? =o
PS: I use Qt Creator to compile... I hope that doesn't offend your tastes; a friend suggested it to me. =|