Hi,
I have a "template" string into which I insert values, typically XML content.
For instance:
Code:
static const char *template = "<value>%s</value>";
const char *value = "blabla";
char buf[256];
int ret = sprintf(buf, template, value);
This has been working fine but now all of a sudden I need to insert a value that is encoded in UTF-16BE. So my value string "Hello" for instance looks something like hex: 00 48 00 65 00 6c 00 6c 00 6f
It appears that the 00 bytes stop the string from being read any further. Maybe they are being treated as the terminating \0? I say this because for test purposes I tried little-endian (UTF-16LE) instead, and I got the 'H' but nothing after that.
As a Java developer trying to fix an old piece of C code, I'm a bit confused about how to handle this.
Any help would be appreciated, thanks!