Here's something I've encountered in some code I'm writing.

Given:

p, q are two real values in the range [0, 1],

m is the average of p and q, i.e. m = (p+q)/2.

Can it ever be the case that:

p^p * q^q < m^(p*q)

where '^' denotes exponentiation (not XOR)?
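Before hunting for a proof, one cheap way to probe this is a brute-force scan of the unit square. This is just a sketch, not a proof — the grid resolution is arbitrary, the function names are mine, and it leans on Python's convention that 0**0 == 1 for the boundary cases:

```python
def relation_holds(p, q):
    """True if p^p * q^q < m^(p*q), where m = (p+q)/2.

    Note: Python evaluates 0**0 as 1, which matches the usual
    convention for expressions like p^p near the boundary.
    """
    m = (p + q) / 2
    return p**p * q**q < m**(p * q)

def search(steps=100):
    """Scan a (steps+1) x (steps+1) grid over [0,1]^2 and
    collect every (p, q) pair where the relation holds."""
    hits = []
    for i in range(steps + 1):
        for j in range(steps + 1):
            p, q = i / steps, j / steps
            if relation_holds(p, q):
                hits.append((p, q))
    return hits
```

Running `search()` and eyeballing which region of the square (if any) comes back non-empty should quickly tell you whether chasing a "never true" proof is worthwhile, and near which (p, q) values any counterexamples cluster.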

Anyway, the point is: if the above relation can NEVER hold, then I can make an enormous optimization somewhere in my code.

I'm sure somebody in the information-theory literature has already proved or disproved this, but I haven't found it yet.