First, read Click_here's link.

Second, you'll need to understand how the CPU works to develop this low-level intuition. To the CPU, a signed and an unsigned number with the same bit pattern are stored identically; the only difference is which interpretation the instructions operating on it assume. The compiler is much stricter about the distinction, which is why mixing signed and unsigned values is a common source of bugs. When mixing them is required, you take on the responsibility of ensuring the conversion makes sense, so the knowledge you're seeking is valuable.
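
As a minimal illustration (assuming a two's complement machine with 32-bit `int`, which is nearly universal today), the same bit pattern prints and compares differently depending only on the interpretation you ask for:

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* One 32-bit pattern, stored once: all bits set. */
    uint32_t bits = 0xFFFFFFFFu;

    /* The bits don't change; only the interpretation does.
       (Converting to int is implementation-defined, but on a
       two's complement machine it yields -1.) */
    printf("as unsigned: %u\n", (unsigned int)bits); /* 4294967295 */
    printf("as signed:   %d\n", (int)bits);          /* -1 */

    /* Comparisons show why the distinction matters: the CPU
       uses different instructions/flags for each. */
    int          s = -1;
    unsigned int u = 1u;
    printf("signed:   %s\n", s < (int)u ? "-1 < 1" : "-1 >= 1");
    printf("unsigned: %s\n", (unsigned int)s > u ? "0xFFFFFFFF > 1" : "<= 1");

    return 0;
}
```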

This is also a subject of study in its own right, and not one a post here can fully cover; a complete treatment would fill a book.

However, your last snippet contains a serious bug. A char is 8 bits, but the unsigned integer conversion you specified in printf tells it to fetch a wider integer (how wide depends on the platform you're targeting). You've essentially told printf to interpret more bytes than the char's value occupies, so what it hands back is garbage.
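
Since the original snippet isn't reproduced here, the following is only a sketch of the kind of mismatch described, assuming a char passed where the format string promises a wider unsigned type (the exact conversion in your code may differ):

```c
#include <stdio.h>

int main(void)
{
    char c = 'A';

    /* BUG (sketch of the problem described above): %lu promises
       printf an unsigned long, typically 4 or 8 bytes, but only a
       char's worth of meaningful value was supplied. The mismatch
       is undefined behavior and usually prints garbage. */
    printf("%lu\n", c);

    /* Fixes: match the conversion to what is actually passed.
       A char promotes to int in a variadic call, so %d is right;
       for an unsigned view, convert explicitly first. */
    printf("%d\n", c);                 /* 65 */
    printf("%u\n", (unsigned char)c);  /* 65, as an unsigned value */

    return 0;
}
```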

printf can be "dangerous" in this way. Some compilers examine the format string and check whether your arguments match it; others don't. If the compiler warned you and you ignored the warning, make a habit of responding to warnings before they turn into crashes.
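
As an aside, GCC and Clang can do this checking because printf is declared in the system headers with a format attribute; you can attach the same attribute to your own printf-style wrappers so `-Wall`/`-Wformat` checks their callers too. A GCC/Clang-specific sketch (the function name `log_msg` is just an example):

```c
#include <stdarg.h>
#include <stdio.h>

/* GCC/Clang check printf-style calls because printf's declaration
   carries a format attribute much like this one. Attaching it to
   your own wrapper lets -Wall / -Wformat check callers too.
   (The attribute syntax is GCC/Clang-specific.) */
__attribute__((format(printf, 1, 2)))
static void log_msg(const char *fmt, ...)
{
    va_list ap;
    va_start(ap, fmt);
    vfprintf(stderr, fmt, ap);
    va_end(ap);
}

int main(void)
{
    log_msg("ok: %d\n", 42);        /* no warning */
    /* log_msg("bad: %lu\n", 'x');     would now be caught at compile time */
    return 0;
}
```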