Problem reading 2-byte signed integers from a file into a value.
I'm reading a file whose format calls for certain values to be interpreted as 2-byte signed integers. So I declared a short int and read the values into it, but what I get is completely different from what the hex editor I'm using shows. Here is how the file marker is described in the specification, followed by the routine I wrote to read the elevation values.
ALTW stands for 'Altitude in 16-bit Words'. After the "ALTW" marker, the following appear in order:
HeightScale, a 2-byte signed integer value.
BaseHeight, a 2-byte signed integer value.
Elevations, a sequence of 2-byte signed integers.
There are (xpts * ypts) elevation integers, where xpts and ypts will have been set earlier in the "SIZE" chunk or the "XPTS" and "YPTS" chunks. The elevations are ordered such that the first row (y = 0) is read first from left to right, then the second (y = 1), and so on. The values in Elevations are not absolute altitudes. The absolute altitude of a particular point (in the same scale as x and y) is equal to BaseHeight + Elevation * HeightScale / 65536.
The elevations array is an array of short ints and just holds the data in a "square" (row-major) grid:
short int a = header.BaseHeight;   // BaseHeight from the spec
short int b = header.heightScale;  // HeightScale from the spec
// reads in each 2-byte elevation and computes the absolute altitude
for (int y = 0; y < header.ypts; ++y) {
    for (int x = 0; x < header.xpts; ++x) {
        short int tempHeight;  // 2 bytes, matching the spec
        // file is the std::ifstream the data comes from
        file.read(reinterpret_cast<char*>(&tempHeight), sizeof(tempHeight));
        short int calcHeight = a + tempHeight * b / 65536;
        elevations[(y * header.xpts) + x] = calcHeight;
    }
}
Now, at the point where I start reading that data, the hex editor shows the first value as 0x799D, which as a signed 16-bit integer is 31,133. But when I debug it, the value read is 0x9D79, which is -25,223. The two bytes come out in the opposite order.