I'm a bit confused about this behaviour (gcc 4.1.2, Linux): I have a function that takes "char *" arguments. The function is basically my own version of str(case)cmp(), and I'm trying to optimize it for my locale and catch up with str(case)cmp(). I don't want to compile with any optimization flags.

I found out that any extra argument given to the function slows it down (one more argument gives a ~20% slowdown, tested by comparing ~15000 strings against each other). Another thing that causes a slowdown is any explicit cast (like signed to unsigned). Since I use tables for fast conversion from uppercase to lowercase when the case-insensitive version is called, I just write "c = TABLE[c]" (c is the current character). Example:
(I'll ignore the slowdown caused by the "cs" argument for now...)

int _f(char *s1, char *s2, int cs)
{
    // cs = case sensitivity (0 = off)
    int i = 0;      // index in string(s)  (register?)
    int c1, c2;     // current char in 1st and 2nd strings

    for (;;)
    {
        c1 = s1[i];
        ...
        if (!cs) c1 = TABLE[c1];
        ...
    }
}
The thing that bugs me is that c1 CAN become negative (I've checked), which is expected since no cast is done and plain char is signed here, but then when I do TABLE[c1] it seems to be silently treated as unsigned when used as an index (at least no crash occurs). Is this normal, or am I missing something?
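For reference, this is roughly the check I did to confirm that c1 goes negative (the 0xE9 byte is just an arbitrary example of a character above 127 from my locale):

#include <stdio.h>

int main(void)
{
    char s[2] = { (char)0xE9, 0 };       /* arbitrary byte above 127 */
    int c1 = s[0];                       /* no cast, same as in _f() */

    printf("%d\n", c1);                  /* prints -23 here: plain char is signed */
    printf("%d\n", (unsigned char)s[0]); /* prints 233: the explicit cast keeps the index in 0..255 */
    return 0;
}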
EDIT: Oh yes, TABLE is a plain "int TABLE[256]", initialized before the _f() call...
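The initialization is essentially this (a sketch; the real table also has my locale's extra uppercase-to-lowercase mappings for characters above 127, and the init_table name is just for illustration):

#include <ctype.h>

int TABLE[256];

void init_table(void)               /* hypothetical name, called once before _f() */
{
    int i;
    for (i = 0; i < 256; i++)
        TABLE[i] = tolower(i);      /* identity except for uppercase letters */
}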