Annoying compilers



anon
11-26-2008, 04:21 PM
This is just a rant (and the code is not the real code).

Recently I ran into a runtime problem because of the following scenario (which resulted from a typo).



class A
{
    int x;
public:
    A() : x(x)   // the typo: x is "initialized" with its own indeterminate value
    {
    }
};

int main()
{}


Why oh why can neither MinGW nor Visual C++ 2005 Express even warn that a member is being initialized with itself (or am I missing some compiler flags, or are there any cases where this obviously isn't an error)!?
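
(For what it's worth, newer GCC and Clang releases do warn about this kind of self-initialization - Clang under -Wuninitialized, GCC under -Winit-self/-Wuninitialized - though I can't vouch for what the 2008-era compilers offered. A sketch of the repaired class, assuming plain zero-initialization was the intent:)

class A
{
    int x;
public:
    // With the original typo, clang++ -Wall reports roughly:
    //   "field 'x' is uninitialized when used here [-Wuninitialized]"
    // The repaired initializer, if zero was the intent:
    A() : x(0)
    {
    }
};

int main()
{}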

Another recent annoyance is with Visual Express 2005. I couldn't figure out the compiler error until I compiled the same file with MinGW, because the error message is so misleading that I was looking intently at the wrong place (and at all the files included from this header, to see if there was a stupid error like a missing ; or a mistyped inclusion guard, etc.):


class A
{
    int n;
public:
    A(float, n m) { n = m; }   // the typo: 'n' here was meant to be a type name
};

int main()
{}


MSVC error diagnostics:


main.cpp(5) : error C2062: type 'float' unexpected


WTF! How is float (in my case another type - so I suspected circular dependencies) unexpected?

whereas MinGW says


main.cpp:5: error: `n' is not a type

bling
11-26-2008, 05:21 PM
messing up syntax with templates is much more annoying. WADAYAMEAN I HAVE 5924 errors!?!?

bling
11-26-2008, 06:13 PM
interesting, with VS2005 pro i get this for your 2nd annoyance:


Error 1 error C2061: syntax error : identifier 'n'

zacs7
11-26-2008, 06:13 PM
> or are there any cases where this obviously isn't an error
Function pointer for a recursive algorithm or something?

lruc
11-26-2008, 07:00 PM
VC++ is a walk in the park compared to the errors Dev-C++ gives you.

zacs7
11-26-2008, 08:50 PM
> Dev-C++ gives you.
That is MinGW

lruc
11-27-2008, 08:01 AM
> Dev-C++ gives you.
That is MinGW

I knew that. I was just testing you.

anon
11-27-2008, 11:26 AM
What do you think about VC++'s warning level 4? It spits out a ton of warnings for any piece of code, so does it have any practical use? :)

laserlight
11-27-2008, 11:43 AM
What do you think about VC++'s warning level 4? It spits out a ton of warnings for any piece of code, so does it have any practical use?
Once you disable the "secure" warnings, the rest should actually be useful, though often benign, in the sense that they just require a legitimate type cast to silence them.
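
For example, assuming the noise is mostly the C4996 deprecation warnings, something like this keeps /W4 workable (the define has to come before the CRT headers):

#define _CRT_SECURE_NO_DEPRECATE   // silence the "deprecated, use xxx_s instead" warnings
#include <stdio.h>

int main()
{
    char buf[32];
    sprintf(buf, "%d", 42);   // no C4996 warning now; the genuinely useful /W4 warnings remain
    return 0;
}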

CornedBee
11-27-2008, 12:01 PM
I think that VC++'s error recovery is a disaster. No matter what you do wrong, you always get two or three inane follow-up errors that make it completely impossible to fix more than one error per compilation.
Its saving grace is that it compiles fast.

VirtualAce
11-27-2008, 12:47 PM
MSVS does have awful error recovery, and since I'm using 2003 at work it also crashes or just exits a lot. It does not seem to do well with multi-threaded apps, and often one bad thread will keep the GUI busy to the point where you have to end task on it. 2003's function browser is also near non-existent b/c half the time you can click on the function and nothing happens. Then the focus gets stuck in the function drop-down and you have to press the middle mouse button just to get focus back to the code window. 2003 also has a big bug where it throws up some weird window that stretches to my second monitor, and yet you cannot click on it or remove it.
2003 also gets confused if you are in a remote debugging session and the client system goes down for some reason. 2003 will basically say the program is still running even though your second system is rebooting. It won't let you stop debugging or anything and will force you to end task on dev.exe. 2003 frequently exits the IDE but does not shut down the process. So the next time you fire up 2003 you will get an NCB error since the other existing dev.exe still has it locked. This forces you to shut down your second 2003 instance, end task on the first, and then restart 2003.

That said, 2005 is far better than 2003. I have not experienced any major issues with 2005 save for the very annoying side-by-side linker error, which is hard to fix and diagnose. I only have 2005 Standard so I don't know much about how Professional acts. The so-called deprecated CRT is extremely annoying in 2005 but luckily can be turned off by defining _CRT_SECURE_NO_DEPRECATE in the preprocessor. I know why Microsoft 'deprecated' it but they created a bigger mess when they renamed every CRT function with a trailing _s or _n. Now the MS version of the CRT is confusing at best. 2005's Intellisense and function browsing is far superior to 2003's. Also 2005 has much more flexibility when it comes to separating errors and warnings than 2003 does. The one thing I completely dislike about 2005 is that it only searches the current open document for key user-defined comments like HACK, FIX LATER, etc. 2003 used to search the entire program, which made it very nice when you wanted to go in and finally properly fix the 'HACK' sections of code. 2005 has made this feature unusable.

But overall I think MSVS is a very good compiler and it compiles very fast which is a definite plus. Neither 2003 nor 2005 seems to use more than one thread when compiling, which brings my quad-core processor at work to its knees and maxes out 1 core all the time. Not sure why they could not fire off multiple threads for the actual compile process. Perhaps synchronization would have been too difficult. Again I do not know for sure how many threads are being used but it appears from the task manager that only 1 is used. If more were used I would expect to see some activity out of the other cores and yet they are usually flatlined or very close to it during compile.

I've also noticed on all MSVS versions that bringing up help internally is three times as slow as bringing it up externally. Not sure why this is. This seems to be even worse in 2005. I normally bring up my DirectX help and MSDN outside of the IDE b/c inside causes tons of problems. I do love the search ability in MSVS and it is extremely fast. Go to function declaration and function definition seem to be nearly broken in 2003. Half the time it cannot figure out the difference between declaration and definition. I wish there was a find option in the context menu b/c it's much easier than pressing CTRL SHIFT F. Same holds true for search and replace.

Warning level 4 is what I always use for compiles but it is almost useless if you are using the STL in MSVS. xtree.h throws up about 50 warnings from inside the STL which is just ridiculous. Does not make me feel very good when my STL is puking tons of warnings.

CornedBee
11-27-2008, 12:52 PM
Not sure why they could not fire off multiple threads for the actual compile process. Perhaps synchronization would have been too difficult.
I don't believe it. Launching multiple compilers in parallel must be among the simplest and most efficient ways of using multiple hardware threads.
But I don't have any idea why they don't do parallel builds either.

Elysia
11-27-2008, 02:18 PM
MSVS does have awful error recovery, and since I'm using 2003 at work it also crashes or just exits a lot. It does not seem to do well with multi-threaded apps, and often one bad thread will keep the GUI busy to the point where you have to end task on it. 2003's function browser is also near non-existent b/c half the time you can click on the function and nothing happens. Then the focus gets stuck in the function drop-down and you have to press the middle mouse button just to get focus back to the code window. 2003 also has a big bug where it throws up some weird window that stretches to my second monitor, and yet you cannot click on it or remove it.
2003 also gets confused if you are in a remote debugging session and the client system goes down for some reason. 2003 will basically say the program is still running even though your second system is rebooting. It won't let you stop debugging or anything and will force you to end task on dev.exe. 2003 frequently exits the IDE but does not shut down the process. So the next time you fire up 2003 you will get an NCB error since the other existing dev.exe still has it locked. This forces you to shut down your second 2003 instance, end task on the first, and then restart 2003.
Well, wow. All I can say is that Microsoft do make buggy software.
Thankfully, I don't have 2003 anymore.


The so-called deprecated CRT is extremely annoying in 2005 but luckily can be turned off by defining _CRT_SECURE_NO_DEPRECATE in the preprocessor.
I find it annoying that no one uses the safer versions and everyone sticks to the old, messy crap versions.


I know why Microsoft 'deprecated' it but they created a bigger mess when they renamed every CRT function with a trailing _s or _n. Now the MS version of the CRT is confusing at best.
So what should they have done? Replaced the old functions with their newer, safer ones? Now that would break a lot of code...


But overall I think MSVS is a very good compiler and it compiles very fast which is a definite plus. Neither 2003 nor 2005 seems to use more than one thread when compiling, which brings my quad-core processor at work to its knees and maxes out 1 core all the time. Not sure why they could not fire off multiple threads for the actual compile process. Perhaps synchronization would have been too difficult. Again I do not know for sure how many threads are being used but it appears from the task manager that only 1 is used. If more were used I would expect to see some activity out of the other cores and yet they are usually flatlined or very close to it during compile.
Now that is very odd, because I know that the IDE does indeed utilize multiple cores, at least as of 2005.
It always spawns two threads when compiling for me, maximizing my dual core. It actually compiles multiple source files in parallel.


Warning level 4 is what I always use for compiles but it is almost useless if you are using the STL in MSVS. xtree.h throws up about 50 warnings from inside the STL which is just ridiculous. Does not make me feel very good when my STL is puking tons of warnings.
Warning level 4 is something I cannot live without. It provides huge amounts of valuable warnings:
unused variables, potentially unused variables, non-standard extensions, conversion warnings, among others that I can remember.
Not sure why a header would spit out a lot of warnings, though. Never did for me, but then again, I never included xtree.h.


I don't believe it. Launching multiple compilers in parallel must be among the simplest and most efficient ways of using multiple hardware threads.
But I don't have any idea why they don't do parallel builds either.
But it does...
At least it does for me.

matsp
11-27-2008, 02:31 PM
I find it annoying that no one uses the safer versions and everyone sticks to the old, messy crap versions.


So what should they have done? Replaced the old functions with their newer, safer ones? Now that would break a lot of code...


No, they should have found a way to get consensus from all compiler/OS vendors on how to make the standard library safe - yes, that would take a lot longer. But as it stands, if you want to write code that is portable, you will have to write your own safe functions (assuming there is no safe standard function to use).
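
For example, a portable stand-in is only a few lines (the name and exact behaviour here are my own invention, not any standard's):

#include <string.h>

// Copies src into dst (capacity dstsz), always terminating. Returns 0 on
// success, nonzero if src did not fit - a portable stand-in for strcpy_s.
static int safe_strcpy(char* dst, size_t dstsz, const char* src)
{
    size_t len = strlen(src);
    if (dstsz == 0)
        return 1;
    if (len >= dstsz) {
        dst[0] = '\0';
        return 1;
    }
    memcpy(dst, src, len + 1);
    return 0;
}

int main()
{
    char buf[8];
    return safe_strcpy(buf, sizeof buf, "portable enough");   // nonzero: it doesn't fit
}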

I also find some of the MS versions of "safe" functions annoyingly strange (as in, they pass the arguments in a funny order in some cases - can't think of a particular case right now - just remember seeing something and thinking: "What the h*** were they thinking?").

--
Mats

Elysia
11-27-2008, 02:34 PM
No, they should have found a way to get consensus from all compiler/OS vendors on how to make the standard library safe - yes, that would take a lot longer. But as it stands, if you want to write code that is portable, you will have to write your own safe functions (assuming there is no safe standard function to use).
I am in the same boat as Microsoft here, however.
They did it quickly - added what the industry needed.
Now they can keep trying to persuade the ones in charge of the standard in hopes of getting something standardized, but at the time they certainly did the right thing, if you ask me. The shortcoming is the standard's.


I also find some of the MS versions of "safe" functions annoyingly strange (as in, they pass the arguments in a funny order in some cases - can't think of a particular case right now - just remember seeing something and thinking: "What the h*** were they thinking?").
I don't really share that sentiment, but then again, when I first moved from being a VB dev to a C++ dev, I wondered why the heck they put the destination argument first - because in VB, the source argument was always first.
It's all a matter of taste. But I got used to it. Dst first, src later.

CornedBee
11-27-2008, 03:13 PM
I find it annoying that no one uses the safer versions and everyone sticks to the old, messy crap versions.
Well, those "old, messy crap versions" happen to be part of an international standard.

Those shiny, new, safe versions are ... well, MS added them in 2005. Nobody else supports them. Nobody else will, anytime soon. Probably. No legacy code uses them, obviously. They're not attractive from a code maintenance viewpoint, from a "that's what I've always used" viewpoint, from a portability viewpoint.

That said, I do use them in a recent definitely-Windows-only project. Mostly because I don't want to go to the trouble of shutting up the warnings. For many of them, I don't see how they are any safer. What the hell is the difference between these?

int fprintf(FILE* f, const char* fmt, ...);
int fprintf_s(FILE* f, const char* fmt, ...);

Yeah, I know.

These functions differ from the non-secure versions in that the format string itself is also validated. If there are any unknown or badly formed formatting specifiers, these functions generate the invalid parameter exception.
Slow extra validation in the _s version. But not that slow. (Or I don't see why it would be.) I really don't think it's justified to create an extra function for this.
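
In other words (a sketch, MSVC-only, going by the quoted docs rather than by testing):

#include <stdio.h>

int main()
{
    // With the plain function, a malformed specifier is simply undefined
    // behaviour; the _s version is documented to validate the format string
    // and hand "%y" to the invalid parameter handler at run time.
    fprintf(stdout, "%y\n");
    fprintf_s(stdout, "%y\n");
    return 0;
}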

And yet, MS still hasn't managed to implement static parameter validation for the printf family (for compile-time-known format strings, obviously), something that GCC has had forever. Great fun! I can pass a CString to a formatter and never know until it crashes. But only if it wasn't the last parameter, because otherwise it just happens to work, since the pointer is a CString's first member.
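
(For reference, the GCC checking I mean is switched on per function with the format attribute; a minimal sketch, where log_msg is a made-up wrapper and not any real CRT or MFC function:)

#include <cstdarg>
#include <cstdio>

// The attribute tells GCC to type-check the variadic arguments against the
// printf-style format string at compile time (format string is parameter 1,
// the checked arguments start at parameter 2).
void log_msg(const char* fmt, ...) __attribute__((format(printf, 1, 2)));

void log_msg(const char* fmt, ...)
{
    va_list args;
    va_start(args, fmt);
    std::vfprintf(stderr, fmt, args);
    va_end(args);
}

int main()
{
    log_msg("%s\n", 42);   // g++ -Wall warns: format '%s' expects 'char *', but argument 2 has type 'int'
    return 0;
}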


Me, ranting? :)

Elysia
11-27-2008, 03:18 PM
Well, those "old, messy crap versions" happen to be part of an international standard.

Those shiny, new, safe versions are ... well, MS added them in 2005. Nobody else supports them. Nobody else will, anytime soon. Probably. No legacy code uses them, obviously. They're not attractive from a code maintenance viewpoint, from a "that's what I've always used" viewpoint, from a portability viewpoint.
I would dearly love them to be part of the new standard. Clearly, a good thing would be for the standard to add them and deprecate the old functions, meaning that in new projects, the new functions should be used.
But then again, perhaps they won't because of the extra overhead. C was not really meant for modern applications. It is nowadays better for certain embedded platforms where such extra checks might be too costly or not necessary.


And yet, MS still hasn't managed to implement static parameter validation for the printf family (for compile-time-known format strings, obviously), something that GCC has had forever. Great fun! I can pass a CString to a formatter and never know until it crashes. But only if it wasn't the last parameter, because otherwise it just happens to work, since the pointer is a CString's first member.
Anything that helps point out bugs and errors in the code is welcome by me. I would love such support in Visual C++. Get cracking, Microsoft! :)

CornedBee
11-27-2008, 03:43 PM
I think programmers see these safe versions as "hey, that's the job of a debug mode". For example, the invalid format string checking of printf_s. At the same time, they're warning that you should never ever use user input as the format string. (Very good advice, that.)
So the only source of format strings is the people involved in developing the application: programmers and translators (assuming you're inexperienced enough to think that printf is an even remotely adequate localization tool). But if that is so, then all invalid format strings should be catchable before the application leaves the developers, i.e. in debug mode.
Why, then, would I pay for the extra validation in release mode?

Others, such as strtok_s, are always useful. But then, strtok_s is actually in the POSIX standard - under the name strtok_r (for reentrant).
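
A minimal sketch of the reentrant interface (POSIX strtok_r, so this builds on a POSIX system; MS's strtok_s takes the same three arguments in the same order, as far as I can tell):

#include <stdio.h>
#include <string.h>

int main()
{
    char line[] = "one,two,three";
    char* save = NULL;
    // strtok_r keeps its state in 'save' instead of a hidden static buffer,
    // so it is safe across threads and nested tokenizing loops.
    for (char* tok = strtok_r(line, ",", &save); tok; tok = strtok_r(NULL, ",", &save))
        printf("%s\n", tok);
    return 0;
}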

VirtualAce
11-27-2008, 03:46 PM
I could see utilizing the 'safe' versions in security applications but I don't see the benefit in using them in much else. They are wholly confusing and the naming convention is just awful.

Elysia
11-27-2008, 03:47 PM
Catching buffer overflows is very, very useful too. And it's part of strcpy_s, strcat_s, among others.
And not just security applications, but any desktop (ie PC/Mac) application.
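
A sketch of what I mean (MSVC-only, describing the behaviour as I understand it):

#include <string.h>   // MSVC declares strcpy_s and errno_t here

int main()
{
    char buf[8];
    const char* input = "a string that is clearly longer than eight bytes";

    // strcpy(buf, input) would silently write past the end of buf.
    // strcpy_s is told the destination size; on overflow it refuses the copy
    // and reports it (by default MS's CRT routes this through the
    // invalid-parameter handler) rather than corrupting the stack.
    errno_t err = strcpy_s(buf, sizeof buf, input);
    return err != 0;
}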

CornedBee
11-27-2008, 03:52 PM
And it's part of strcpy_s, strcat_s, among others.
Only if you actually get the buffer size right. That's still a very real issue.

VirtualAce
11-27-2008, 03:58 PM
I can understand some of the idea behind the safe functions, but the fact is... they are only as safe as the programmer using them... which brings me to the question: how are they any different from the unsafe ones, which rely on the same fact?

As of right now I have no use for them since my current job is on embedded systems. I do not create desktop applications.

matsp
11-27-2008, 04:08 PM
And not just security applications, but any desktop (ie PC/Mac) application.

Well, of course it is, from a usability perspective. But in 99% of cases, a buffer overflow will achieve nothing more than a weird crash (and thus a loss of data, if you hadn't saved your file two seconds ago). There are cases where buffer overflows allow privilege escalation. And of course, if the application is listening to arbitrary network traffic, then an outsider could send a packet that allows the application to become the host of all sorts of nasty stuff.

But if the application that I'm using overflows because I wrote a too-long line in my text file, all that is going to happen is a crash. Which is not great. But it's not a security issue in any way, shape, or form - unless someone ELSE can convince me to open THEIR FILE with some weird data in it (that allows the app to perform some operations I didn't really want it to perform).

--
Mats

Perspective
11-27-2008, 04:09 PM
>I couldn't figure the compiler error...


As a general statement, I'd say it's not fair to complain about bad compiler errors until you've tried to write your own compiler. It's not always as easy as you think to come up with a nice, user-friendly explanation of what went wrong.

In this case it's pretty bad though. A lot of IDEs warn you about these types of errors; I guess the lesson is that code compilation and code validation are inherently different tasks.

adeyblue
11-27-2008, 05:25 PM
And yet, MS still hasn't managed to implement static parameter validation for the printf family (for compile-time-known format strings, obviously)


To be fair it seems to be only %s arguments that aren't validated, although that's little consolation considering %s is probably the most widely used.



int val = 0;
char* str = 0;
printf("%p", val); // C4313: 'printf' : '%p' in format string conflicts with argument 1 of type 'int'
sprintf(NULL, "%d", str); // C4313: 'sprintf' : '%d' in format string conflicts with argument 1 of type 'char *'
fprintf(NULL, "%u", str); // C4313: 'fprintf' : '%u' in format string conflicts with argument 1 of type 'char *'
printf("%s\n", val); // nothing


Looks like I missed something, compiling in VS2008 pro with /analyze gives


warning C6066: Non-pointer passed as parameter '2' when pointer is required in call to 'printf'
warning C6273: Non-integer passed as parameter '3' when integer is required in call to 'sprintf': if a pointer value is being passed, %p should be used
warning C6309: Argument '1' is null: this does not adhere to function specification of 'sprintf'
warning C6273: Non-integer passed as parameter '3' when integer is required in call to 'fprintf': if a pointer value is being passed, %p should be used
warning C6309: Argument '1' is null: this does not adhere to function specification of 'fprintf'
warning C6067: Parameter '2' in call to 'printf' must be the address of the string
warning C6387: 'argument 1' might be '0': this does not adhere to the specification for the function 'sprintf': Lines: 7, 8, 9, 10
warning C6387: 'argument 1' might be '0': this does not adhere to the specification for the function 'fprintf': Lines: 7, 8, 9, 10, 11




Neither 2003 nor 2005 seems to use more than one thread when compiling, which brings my quad-core processor at work to its knees and maxes out 1 core all the time. Not sure why they could not fire off multiple threads for the actual compile process.

/MP (http://msdn.microsoft.com/en-us/library/bb385193.aspx) was introduced (though undocumented) in VS2005
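
For reference, it goes on the cl command line, something like (going by that documentation page):

cl /MP4 /c a.cpp b.cpp c.cpp d.cpp

The optional number caps how many cl.exe processes are spawned; if I read the docs right, leaving it off uses one per effective processor.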

CornedBee
11-27-2008, 06:09 PM
Well, we use 2005 at work, and I don't get any such warnings. Also, I believe the analyze option is not available in our edition.

Elysia
11-28-2008, 03:52 AM
Only if you actually get the buffer size right. That's still a very real issue.

Buffer size is (usually) easy to get right, however, if that's any consolation. I think it's better than nothing. But they could be better, yes.

As for /analyze:

/analyze is only available in Enterprise (team development) versions for x86 compilers.
:(
That seems stupid. I only have access to Professional versions...

zacs7
11-28-2008, 04:05 AM
> Catching buffer overflows is very, very useful too. And it's part of strcpy_s, strcat_s, among others.
That's where our friend "Mr. Virtual Machine" comes in. But I forgot you don't like them ;)

> Slow extra validation in the _s version. But not that slow. (Or I don't see why it would be.)
Perhaps a benchmark is in order?

Elysia
11-28-2008, 04:13 AM
> Catching buffer overflows is very, very useful too. And it's part of strcpy_s, strcat_s, among others.
That's where our friend "Mr. Virtual Machine" comes in. But I forgot you don't like them ;)
Eh, but I'd rather have functions that tell me about overflows instead of me having to test for them.
And virtual machines are slow -_-

bling
11-28-2008, 11:27 AM
Does anyone know why MS chose to use strcpy_s, strcat_s instead of the preprocessor?

Elysia
11-28-2008, 12:03 PM
What? Preprocessor? How?

VirtualAce
11-28-2008, 12:19 PM
I think he means that MS could have gone the route where, if you define a flag in the preprocessor, the standard CRT functions would become the safe ones. However, this is not possible, since most of the 'safe' CRT functions require an explicit count parameter for copying strings/data.
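
For illustration, the plain and _s functions have different arities, so a blind #define can't bridge them; the size can only be recovered automatically in C++, and only when the destination is a real array. This sketch is my own, loosely imitating what MS's optional C++ "secure overloads" do, if I remember right:

#include <stddef.h>
#include <string.h>

// char* destinations carry no size information, so a preprocessor rename of
// strcpy to strcpy_s cannot supply the extra count argument. A template can
// deduce it, but only for true arrays:
template <size_t N>
void copy_string(char (&dst)[N], const char* src)
{
    strncpy(dst, src, N - 1);   // truncating copy, always leaves room for the terminator
    dst[N - 1] = '\0';
}

int main()
{
    char buf[16];
    copy_string(buf, "hello");   // N deduced as 16 at compile time
    return 0;
}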

Mad_guy
11-28-2008, 01:28 PM
Functions like this are not going to protect you from everything; there are far more subtle bugs that can appear, particularly relating to integers, that can bite you no matter what (in almost all languages). Stack overflows in particular are becoming harder to pull off with OS-level security constructs, and while people are finding smarter ways around them, things like W^X are particularly hard to beat, so people must search elsewhere for exploitation.

(And while we're on this subject, I personally believe language-enforced security is better than OS-enforced security, e.g. through a type system. You can encode security invariants in your types that *must* be checked by the compiler, and this ensures correctness. Techniques like this have been used in code that, for example, statically checks that all pointers are on word-aligned boundaries, or that no array access can go beyond the bounds of the array, and these things are determined at compile time - this is something you can be sure is correct, thanks to some very important properties of type systems that I won't go into here.)

While I assume most of you are quite up-to-date on the standards, things like this can be easily overlooked:

#include <string.h>

#define MAX_LEN 1024

int func(int i) {
    char buf[MAX_LEN];

    if(i >= MAX_LEN)
        return 1;

    memcpy(buf, 0, i);
    return 0;
}

(The fix is obviously to replace "i >= MAX_LEN" with "i >= sizeof(buf)".)

Of course, this is an extremely simple example, but very subtle bugs like this have crept up in e.g. OpenBSD at the kernel level. These are things that are not as easily checked statically, especially in a language such as C.

Because I don't use Windows at all, I really can't use these safe functions. Regardless of that, though, I can't really see how much they buy you in general; while they may perform some 'extra' checks, and that's certainly useful, many things are still in your hands, and problems like the above can't be checked by the compiler so that it can warn you. But they very well could help a lot - all I'm saying, I guess, is don't invest too much in them (especially because they aren't standard).


And virtual machines are slow -_-
Red herring.

pheres
12-11-2008, 04:54 PM
I don't believe it. Launching multiple compilers in parallel must be among the simplest and most efficient ways of using multiple hardware threads.
But I don't have any idea why they don't do parallel builds either.



/MP (http://msdn.microsoft.com/en-us/library/bb385193.aspx) was introduced (though undocumented) in VS2005

and it is broken (at least in VS2005). From time to time the program database files get corrupted while /MP is enabled, and creation of those PDB files seems to be the thing that complicates parallel builds.

Under Tools/Options/Build settings one can find a documented switch to use more than one thread, but unlike /MP this applies only to compilation units from different projects that don't depend on each other.

here is a free VS plugin which promises to do better:
http://www.todobits.es/mpcl.html

Mario F.
12-12-2008, 10:54 AM
I personally believe language-enforced security is better than OS-enforced security, e.g. through a type system.

I partially agree. But language-enforced security can only achieve so much in a programming language that one wants compiled, generic, and portable.

On the other hand a debug mode operating system is what we are missing.

...

Meanwhile, my gripe with MS's "deprecated" features is the "deprecate" wording. The "_s" doesn't bother me. The functions themselves are mostly useless. They help catch a few errors that should be in the mind of the programmer anyway, and not much else. Useful maybe if you tend to get sloppy, or are in a hurry, and you are coding for Windows. Useless if you plan on using MSVS for portable code.

I'd be very impressed indeed if these functions would make their way into the C++ Standard in anything more meaningful than a couple of additions.

matsp
12-12-2008, 10:56 AM
But as the implementor of the C library, MS are allowed to use _x names - that is exactly who those names are for, so that they do not collide with user-provided code.

--
Mats

Mario F.
12-12-2008, 11:01 AM
We agree on that. I think you misread, although I may not have been clear enough either.

What bothers me is the "deprecate" wording. As if Microsoft could deprecate the C++ standard on which it must base its compiler. The "_s" prefix meanwhile doesn't bother me at all.

CornedBee
12-12-2008, 11:10 AM
It's a suffix. As such, actually, it's not within MS's rights to use those names. But I can't blame them too much for not caring - it's not like the POSIX standard ever cared about adding to the C standard headers.

laserlight
12-12-2008, 08:22 PM
While I assume most of you are quite up-to-date on the standards, things like this can be easily overlooked:
...
(The fix is obviously to replace "i >= MAX_LEN" with "i >= sizeof(buf)".)
Okay, I admit, I do not get it. As far as I can tell MAX_LEN == sizeof(buf) since sizeof(char) is guaranteed to be 1, so what is wrong with the original code?

Mad_guy
12-14-2008, 02:55 AM
But language-enforced security can only achieve so much in a programming language that one wants compiled, generic, and portable.

A type system and a garbage collected language can really go a long way to help a *lot* of things though - something I have never quite understood is why implementers of high level languages don't target something that's everywhere, like ISO C? You get compiler-verified static type checks and the resulting code works everywhere. Currently I'm working on a compiler for a HLL (Haskell in particular) that does in fact target ISO C, meaning you can run applications it compiles just about anywhere. Having a defining feature like this is really nice, because it becomes far more feasible to write perhaps critical code and get good static guarantees at the same time, since the generated code will run anywhere (think embedded devices, or even a kernel!)

But this isn't some kind of wailing on C for having a fairly weak type system or anything, and I think this little rant is off topic anyway (I just mentioned it as an aside.)


Okay, I admit, I do not get it. As far as I can tell MAX_LEN == sizeof(buf) since sizeof(char) is guaranteed to be 1, so what is wrong with the original code?

By default, ints are treated as signed integers - meaning they can be negative. If you pass a negative integer to the original code, it will bypass the >= MAX_LEN check: MAX_LEN expands to the plain int constant 1024, so the comparison is a signed one and any negative i passes it. The code will then call memcpy with i as the size parameter, which converts it to an unsigned integer (memcpy's 3rd parameter is a size_t), and in the conversion the resulting value will be *huge*, so memcpy copies way too much data (and yes, in practice things like this can be feasibly exploitable).

The fix works because sizeof(buf) yields a size_t, which is unsigned - the standard defines that a comparison between a signed and an unsigned integer (where the unsigned type has at least the same rank, as size_t does here) proceeds by converting the signed operand to unsigned and then comparing. The negative value will become unsigned (and huge), and thus the function will just 'return 1'.
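
A two-line demonstration of the conversion (a sketch; the exact huge value depends on the width of size_t):

#include <cstddef>
#include <iostream>

int main()
{
    int i = -1;
    std::size_t limit = 1024;
    // In (i >= limit) the signed i is converted to size_t and becomes huge,
    // so the check correctly rejects it; in (i >= 1024) the comparison stays
    // signed and -1 slips straight through to memcpy.
    std::cout << (i >= limit) << "\n";                  // prints 1
    std::cout << static_cast<std::size_t>(i) << "\n";   // e.g. 4294967295 with a 32-bit size_t
    return 0;
}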

Elysia
12-14-2008, 03:00 AM
A type system and a garbage collected language can really go a long way to help a *lot* of things though - something I have never quite understood is why implementers of high level languages don't target something that's everywhere, like ISO C?

Perhaps because C wasn't designed for it, and was designed only as a low-level language?
For C++, on the other hand, it might be possible and/or worth it, since it is a higher level language.

laserlight
12-14-2008, 03:23 AM
By default, ints are treated as signed integers - meaning they can be negative. If you pass a negative integer to the original code, it will bypass the >= MAX_LEN check: MAX_LEN expands to the plain int constant 1024, so the comparison is a signed one and any negative i passes it. The code will then call memcpy with i as the size parameter, which converts it to an unsigned integer (memcpy's 3rd parameter is a size_t), and in the conversion the resulting value will be *huge*, so memcpy copies way too much data (and yes, in practice things like this can be feasibly exploitable).
Ah yes. However, I think that the real fix is to change the parameter to be an unsigned int instead, or perhaps more appropriately, size_t.

Mad_guy
12-14-2008, 04:30 AM
Perhaps because C wasn't designed for it, and was designed only as a low-level language?
For C++, on the other hand, it might be possible and/or worth it, since it is a higher level language.

What does C being a low level language have to do with a compiler that goes from HLL -> C? Assembly is pretty low level too, but it's not stopping anybody from targeting it. And why would C++ being higher level make it easier to write a compiler from HLL -> C++?


Ah yes. However, I think that the real fix is to change the parameter to be an unsigned int instead, or perhaps more appropriately, size_t.
Yes, but that was simply an illustration of the point that really nasty bugs like this can crawl around, and 'safe functions' like this can't do anything to help you; in many cases, no analysis the compiler implements can help you either (like I said, bugs of this vein have crept into the worst places, like the OpenBSD and Linux kernels).