So a very talented mathematician showed me how this can actually be done.

The main idea is to repeatedly apply a calculation so that, at some iteration, the result "falls off the end" of a given fixed-precision floating point type — the tiny increment gets rounded away — and counting the iterations until that happens reveals the number of mantissa bits. That's about the best I can describe it anyway, sorry!

Here's the gist of it:

    long double sample = 1;
    int bits = 0;                        /* counts the iterations */
    for (;;) {
        long double check = sample / 2 + 0.25;
        if (check >= sample) break;      /* increment rounded away: done */
        sample = check;
        ++bits;
    }
It seems to work across the board too.