Interesting, and I stand corrected.
But it seems the C standard implies that char is always one byte, whether that byte is 8, 16, or 32 bits, since sizeof measures the size of every type in units of char, and sizeof(char) is defined to be 1.
Then we have in section 3.6:

> byte
> addressable unit of data storage large enough to hold any member of the basic character set of the execution environment

which is basically what char is. It seems that the standard uses char and byte more or less interchangeably, even if char (and therefore C's "byte") can be much larger than what has ever historically been called a "byte" (I only know of "byte" being defined as 6, 7, or 8 bits at various points in time).