I found this piece of the libstdc++ bitset header:
Code:
#define _GLIBCXX_BITSET_BITS_PER_WORD numeric_limits<unsigned long>::digits
#define _GLIBCXX_BITSET_WORDS(__n) \
((__n) < 1 ? 0 : ((__n) + _GLIBCXX_BITSET_BITS_PER_WORD - 1)/_GLIBCXX_BITSET_BITS_PER_WORD)
namespace _GLIBCXX_STD
{
/**
* @if maint
* Base class, general case. It is a class invariant that _Nw will be
* nonnegative.
*
* See documentation for bitset.
* @endif
*/
template<size_t _Nw>
struct _Base_bitset
{
typedef unsigned long _WordT;
/// 0 is the least significant word.
_WordT _M_w[_Nw];
I think that proves my point:
Code:
typedef unsigned long _WordT;
/// 0 is the least significant word.
_WordT _M_w[_Nw];
It declares an array of words. Note that each word is an unsigned long, so its size is platform-dependent: 4 bytes on typical 32-bit targets (and on 64-bit Windows, which is LLP64), but 8 bytes on LP64 platforms such as 64-bit Linux.