This is what I did.
Code:
int h = 0;
int b = 0;
char sortBuffer[m][n / m];
char **bitPtr = malloc(fSize * 8 * sizeof(char *)); /* was malloc(fSize * 8): too small for an array of pointers */
for (i = 0; i < fSize * 8; i++) {
    bitPtr[i] = malloc(9);
    memset(bitPtr[i], 0, 9);
}
printf("\n");
// Read bit values into sortBuffer 2D array
for (i = 0; i < fSize * 8; i++) {
    sortBuffer[b][h] = getBitVal(inBuffer, i);
    b++;
    if (b == m) {  /* was (i % m == 0 && i != 0), which let b reach m and write out of bounds */
        b = 0;
        h++;
    }
}
for (h = 0; h < k; h++) {
    for (i = 0; i < m; i++) {
        memcpy(bitPtr[h] + i, &sortBuffer[i][h], 1);
    }
}
So now each binary representation is saved in bitPtr[h], and I can print it with a small function:
Code:
void printVal(char *bitPtr) {
    int i;
    for (i = 0; i < 9; i++) {
        fprintf(stdout, "%i", bitPtr[i]);
    }
}
From there I can convert that to a decimal.
Code:
void convertToDec(char *bitPtr) {
    int decimal_val = 0;
    int base = 1;
    int rem;
    int num = *(int *) bitPtr;
    while (num > 0) {
        rem = num % 10;
        decimal_val = decimal_val + rem * base;
        num = num / 10;
        base = base * 2;
    }
    fprintf(stdout, "%i", decimal_val);
}
But that is not right; the value computed from bitPtr is wrong.
Code:
printVal(bitPtr[0]);
printf("\n");
convertToDec(bitPtr[0]);
printVal is right, but convertToDec is wrong. I don't know how to convert bitPtr[0] into an int equal to the digits stored in it: bitPtr[0] actually holds 111000101, but the decimal conversion produces a different number.