So a very talented mathematician showed me how this can actually be done.

The main idea is to repeatedly apply a calculation that halves the gap between a value and 0.5. Once that gap becomes smaller than what the type's fixed precision can represent, the addition rounds the result back and the value stops shrinking, and the number of successful halvings is the number of mantissa bits. That's about the best I can describe it anyway, sorry!

Here's the gist of it:

Code:
 int mantissa_bits = 0;
 long double sample = 1;
 for(;;)
 {
  // Each pass halves the gap between sample and 0.5. Once that gap
  // drops below the precision of long double, check stops getting
  // smaller and the loop ends.
  long double check = sample / 2 + 0.25;
  if(check >= sample)
   break;
  sample = check;
  ++mantissa_bits;
 }
It seems to work across the board too.
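
To convince myself, here's a self-contained version I put together that runs the same loop for float, double and long double and prints the counts next to the <cfloat> macros (FLT_MANT_DIG and friends). The template wrapper and the comparison against the macros are my own additions, not part of the original trick, and the counts assume the default round-to-nearest mode; a compiler that keeps intermediates in extended x87 precision could also skew them.

Code:
 #include <cfloat>
 #include <iostream>

 // Same loop as above, just wrapped in a template so it can be run
 // against any floating point type.
 template <typename T>
 int count_mantissa_bits()
 {
  int bits = 0;
  T sample = 1;
  for(;;)
  {
   T check = sample / 2 + T(0.25);
   if(check >= sample)
    break;
   sample = check;
   ++bits;
  }
  return bits;
 }

 int main()
 {
  // The *_MANT_DIG macros count every significand digit, including
  // the implicit leading bit on formats that have one.
  std::cout << "float:       " << count_mantissa_bits<float>()
            << " (FLT_MANT_DIG = " << FLT_MANT_DIG << ")\n";
  std::cout << "double:      " << count_mantissa_bits<double>()
            << " (DBL_MANT_DIG = " << DBL_MANT_DIG << ")\n";
  std::cout << "long double: " << count_mantissa_bits<long double>()
            << " (LDBL_MANT_DIG = " << LDBL_MANT_DIG << ")\n";
 }
On a typical x86-64 setup I'd expect 24 / 53 / 64, or 24 / 53 / 53 where long double is just double (MSVC, for example), matching the macros in each case.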