The expression sizeof(a) / sizeof(a[0]) can be used to calculate the number of elements in an array. The expression sizeof(a) / sizeof(t), where t is the type of a's elements, would also work, but it's considered an inferior technique. Can anyone explain why this approach is considered inferior?
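For concreteness, here is a minimal sketch of the two expressions side by side; the array a and its element type double are just placeholder choices to make the comparison tangible:

#include <stdio.h>

int main(void) {
    double a[10];

    /* First form: element size is taken from the array itself */
    size_t count_from_element = sizeof(a) / sizeof(a[0]);

    /* Second form: element type (double) is spelled out by hand */
    size_t count_from_type = sizeof(a) / sizeof(double);

    printf("%zu %zu\n", count_from_element, count_from_type);  /* prints "10 10" */
    return 0;
}

Both lines compute the same value, 10, so the question is about which one holds up better, not about correctness in this snippet.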