Is there a theoretical basis for this being a bad thing?
O_o
Yes. Of course, the research is focused on cryptography.
Yes. I know that you don't care about the strength of the function.
I only said I don't like the idea.
In this case, I wouldn't say the higher bits are excluded from changing the computed hash. If they were different, the computed hash would be different, so they had just as much of a role in determining the hash as the lower bits.
No. The higher-order bits would not play any role in determining the hash in the situation I described.
You should really just read the explanation again if you don't understand the situation I described, but I will offer an example.
Code:
// ...
Table[1][0] = 641671;
// ...
Table[2][0] = 751345;
// ...
Table[3][0] = 179318;
// ...
Code:
Value = Value & 0xffff;
// ...
Word[0] = Value & 0xffff; // Word[0] = Value;
Word[1] = (Value >> 16) & 0xffff; // Word[1] = 0;
Word[2] = (Value >> 32) & 0xffff; // Word[2] = 0;
Word[3] = (Value >> 48) & 0xffff; // Word[3] = 0;
// ...
Fragment = Table[1][Word[1]] ^ Table[2][Word[2]] ^ Table[3][Word[3]]; // Fragment = 0;
// ...
Hash = Fragment ^ Table[0][Word[0]]; // Hash = Table[0][Word[0]];
Code:
// ...
Table[3][(1 << 15)] = 179318;
// ...
Code:
Value = Value & 0xffff;
// ...
Value |= 1ULL << 63; // 1 << 63 overflows a plain int; the shift must be done in a 64-bit type
// ...
Word[0] = Value & 0xffff; // Word[0] = lower 16 bits, unchanged by the high bit;
Word[1] = (Value >> 16) & 0xffff; // Word[1] = 0;
Word[2] = (Value >> 32) & 0xffff; // Word[2] = 0;
Word[3] = (Value >> 48) & 0xffff; // Word[3] = 1 << 15;
// ...
Fragment = Table[1][Word[1]] ^ Table[2][Word[2]] ^ Table[3][Word[3]]; // Fragment = 0;
// ...
Hash = Fragment ^ Table[0][Word[0]]; // Hash = Table[0][Word[0]];
Code:
Value1 = Value & 0xffff;
Value2 = Value1 | (1ULL << 63);
Hash(Value1) == Hash(Value2);
As I said, the circumstances would be unlikely.
I still don't like the idea that any bit could ever be excluded from changing the computed hash.
Soma