1) Why is sizeof() reporting that bitset<16> consumes 4 bytes, when it should be 2 bytes?
2) Why is it reporting that bitset<40> consumes 8 bytes, when it should be 5 bytes?
3) Why is it reporting that bitset<5> consumes 4 bytes, when 5 bits aren't even enough to fill a single byte?
My assumption is that it is also including the bytes allocated for the bitset<> object itself?
Code:

#include <iostream>
#include <bitset>

using namespace std;

int main()
{
    cout << "The number of bytes bitset<16> is consuming: " << sizeof(bitset<16>) << endl;
    return 0;
}
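For comparison, here is a minimal extension of the code above that prints all three sizes asked about. The exact numbers are implementation-defined; results may vary by platform and compiler.

#include <iostream>
#include <bitset>

using namespace std;

int main()
{
    // Each result is typically a multiple of the implementation's
    // internal storage word size (e.g. 4 bytes here), rather than
    // the minimum number of bytes needed to hold the bits.
    cout << "sizeof(bitset<5>):  " << sizeof(bitset<5>)  << endl;
    cout << "sizeof(bitset<16>): " << sizeof(bitset<16>) << endl;
    cout << "sizeof(bitset<40>): " << sizeof(bitset<40>) << endl;
    return 0;
}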