I'm trying to write an optimal function that converts a blob into a hexadecimal character string. It's written in C++, but I believe it's also valid C.
Code:
const unsigned char *pBytes = (const unsigned char *)p;
char *pOut = (char *)malloc(nLength*2 + 1);
for (size_t i = 0; i < nLength; i++) {
    unsigned int byte = (unsigned int)pBytes[i];
    assert(byte <= 0xFF);
    verify(snprintf(&pOut[2*i], 2, "%02X", byte) == 2);
}
pOut[nLength*2] = '\0';
fprintf(stderr, "strlen was: %d and nLength*2: %d\n", (int)strlen(pOut), (int)(2*nLength));
assert(strlen(pOut) == nLength*2);
That little debug output reveals that strlen is 1 while nLength*2 is 96. My best guess is that snprintf() doesn't work the way I think it's supposed to.
Obviously, the program then aborts at the assert in the last line.
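To check whether my mental model of snprintf() is wrong, here is a minimal standalone test I'd expect to reproduce the behavior (the buffer size and the 0xAB test byte are just placeholders for illustration):

#include <stdio.h>
#include <string.h>

int main(void) {
    char buf[8];
    /* Ask snprintf to format two hex digits into a buffer of size 2. */
    int ret = snprintf(buf, 2, "%02X", 0xABu);
    /* Print the return value and what actually landed in the buffer. */
    printf("ret=%d, buf=\"%s\", strlen=%zu\n", ret, buf, strlen(buf));
    return 0;
}

If snprintf() truncates its output to the given size but still returns the length it would have written, this should print ret=2, buf="A", strlen=1, which would line up with the strlen of 1 I'm seeing above.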