Everyone knows how to avoid buffer overruns, thank you. That doesn't mean buffer overruns didn't plague the Windows environment for over two decades in software... *cough*... built in well-developed countries by expert software developers working for specialized companies adopting strict development methodologies.
So let's cut to the chase and openly admit that this has nothing to do with how much one knows or how well they are organized. Bugs are bugs, are bugs.
Regardless of the possibility that a piece of complex software can come out of a software house with zero errors -- which I could believe in, if such programs weren't lone needles in the haystack of undefined behavior -- what is being presented here is not yet another educative measure, another book, another blog on Yet Again How To Avoid Buffer Overruns. And it definitely isn't about how ignorant we are for missing a buffer overrun. (That's still better than being the sort who happens to think they will never miss a buffer overrun.)
This is just an offered partial solution to the problem. Take it or leave it.
Originally Posted by brewbuck:
Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.
That's because M$ has so many coders, and not all of them are competent.
Competent coders can manage to avoid flaws. Really, how many buffer overflows have there been in apache's core? I don't believe there have been any.
Let's face it, using some easy techniques, security flaws can be avoided by any sane programmer. It's just that coders are either stupid or lazy.
I'm not talking about bugs. They can happen. I'm talking about exploitable vulnerabilities. They shouldn't happen.
Finally, there was even an OS on Slashdot a few days ago that was proven correct. If the proof is actually right, then that means: no security flaws. So I guess someone did it...
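To make that claim concrete: the "easy techniques" being referred to amount to checking sizes and pointers before writing anything. A minimal sketch in C (the function name and error convention here are my own, not from any particular library):

```c
#include <stddef.h>
#include <string.h>

/* Copy src into dest only if it provably fits, always NUL-terminating.
 * Returns 0 on success, -1 on any precondition failure -- the
 * "check the buffers, check the input, check for NULL" steps done
 * before a single byte is written. */
int checked_strcpy(char *dest, size_t dest_size, const char *src)
{
    if (dest == NULL || src == NULL || dest_size == 0)
        return -1;              /* reject NULL or empty destination */
    size_t len = strlen(src);
    if (len >= dest_size)
        return -1;              /* would overflow: refuse, don't truncate */
    memcpy(dest, src, len + 1); /* now provably in bounds */
    return 0;
}
```

Nothing here is clever; the point of the post above is precisely that this much discipline is enough to stop the classic overrun.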
So, are we to assume that a programmer that is too stupid to check that the length will fit into the array would not be too stupid to put the correct length of the array into yet another parameter?

Originally Posted by man memcpy
I don't get it.
If you code alone without any code review, I think that it is quite likely that any non-trivial program that you write will have a security vulnerability somewhere.

Originally Posted by EVOEx
I would not be so harsh when talking about security flaws in general. For example, your advice to "check the buffers. Check the input. Check for NULL variables" can help to avoid buffer overflows and incomplete mediation, but is useless against time-of-check, time-of-use errors due to race conditions that may be difficult to detect.

Originally Posted by EVOEx
Security vulnerabilities are bugs that should not happen.

Originally Posted by EVOEx
Look up a C++ Reference and learn How To Ask Questions The Smart Way

Originally Posted by Bjarne Stroustrup (2000-10-14)
The _s functions make nothing safer. All these functions do is add a parameter where the caller must specify the destination buffer size. Nothing prevents you from passing the wrong value there. Unfortunately, people seem to believe that by using these functions they are making themselves immune to buffer overflow.
In my opinion the situation is worse than before, because now people think they can be lazy because they're using "safe" functions.
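To illustrate the point above: since memcpy_s and friends are optional and absent from most C libraries, here is a stand-in with the same checking semantics (the name and return convention are mine). Notice that the check only compares two caller-supplied numbers, so a caller who passes the wrong destination size overflows exactly as before:

```c
#include <stddef.h>
#include <string.h>

/* Stand-in for the memcpy_s-style contract: fail instead of writing
 * past dest_size. (memcpy_s itself is non-standard/optional, so this
 * sketch mimics its semantics rather than calling it.) */
int memcpy_s_like(void *dest, size_t dest_size, const void *src, size_t n)
{
    if (dest == NULL || src == NULL || n > dest_size)
        return -1;          /* refuse the copy */
    memcpy(dest, src, n);   /* "safe" only if dest_size was truthful */
    return 0;
}
```

A call like `memcpy_s_like(small, 64, src, 6)` on a 4-byte buffer sails straight through the check and overflows anyway -- the "safety" is only ever as good as the size argument the caller supplies.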
Code:
//try
//{
	if (a) do { f( b); } while(1);
	else   do { f(!b); } while(1);
//}
Isn't that a highly theoretical instance? Generally, significant vulnerabilities are the ones that exist in widely distributed software that is not developed by one person -- or, at least, where there are ample opportunities for others to look for vulnerabilities in it. After all, for a vulnerability to be exploitable, it must be found by someone with some knowledge and skills.
Again, it seems to me there are two significant factors here:
- the insecurities inherent in the Windows operating system, which MS cannot go back in time and redesign, so they are stuck with them
- malicious programmers (who may even work for MS) who will intentionally include an exploit and then covertly publicize it
The second one may seem "crazy" or "paranoid", but I believe it -- the only truly bizarre element is the motive, but IMO there is no point in analysing human behavior as if it could be rationalized, etc. Obviously, people write viruses; why? Those people are as likely to be MS employees as anything else. It probably counts as easy pickings.
well who can't agree with that! But going with my assertion in #2, maybe when you lie down on the table you will want more than a friendly smile to reassure you that your medical practitioner isn't, occasionally, a psychopath in some subtle way, or at least that there are some regulations, etc, in place to discourage such behaviour.
Considering that the risk of injury from bad software is slim to none, I imagine the temptation is much greater. My point is that this almost certainly has to do with intentional exploitation, not mistakes, so there is no point discussing it as if it were just about "being clumsy" or not. Of course, there is no way that MS can present it that way; that would be disastrous PR and would probably hurt morale for "the coach".
C programming resources:
GNU C Function and Macro Index -- glibc reference manual
The C Book -- nice online learner guide
Current ISO draft standard
CCAN -- new CPAN like open source library repository
3 (different) GNU debugger tutorials: #1 -- #2 -- #3
cpwiki -- our wiki on sourceforge
Microsoft advertises a considerable reduction in disclosed vulnerabilities after the application of the SDL (which includes a vast number of other replacements, extending also into many of Microsoft's own API functions).
Personally, I find these numbers... debatable
The truth is that this offers no possibility of static checking, as you well point out. However, I believe that by forcing the inclusion of the target buffer size it at least moves the brain to focus on the sizes of both buffers, and thus can help eliminate errors(*). By improving function signatures we can, for instance, help guide users toward a more correct usage of the libraries we design. This is, at least, a conscious design decision we are sometimes confronted with. Or so I hear.
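As an example of what such a signature can look like in a library of one's own design (the function here is hypothetical; the convention is the one snprintf already uses): the destination size is part of the contract, and the return value reports the length that was needed, so a too-small buffer becomes detectable rather than overflowed.

```c
#include <stdio.h>

/* Hypothetical library routine illustrating the signature style
 * described above: caller supplies the destination size, and the
 * return value is the length the full result required. */
size_t make_greeting(char *dest, size_t dest_size, const char *name)
{
    /* snprintf never writes more than dest_size bytes (including the
     * terminator) and returns the length the untruncated result
     * would have had. */
    int needed = snprintf(dest, dest_size, "Hello, %s!", name);
    return needed < 0 ? 0 : (size_t)needed;
}
```

A caller can then write `if (make_greeting(buf, sizeof buf, name) >= sizeof buf)` to detect truncation, instead of silently corrupting memory -- the signature itself nudges the caller into thinking about both sizes.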
It's of no use to argue against the immaturity of coders falling prey to buffer overruns. The fact is that after many years of being given free rein, we haven't proved ourselves worthy. On the other hand, the SDL is really not about the programmers. It is about the users of the applications we develop. They are the victims of our decisions (or lack thereof), and obviously throughout the years we didn't give a damn about how damaging our egos could be to the stability and security of their systems.
(*) It's possible that they may also help locate buffer overruns originating from these functions more quickly.
EDIT: I feel I need to clarify something concerning the second paragraph of this post. I'd like to point out that the SDL Process Template sets in motion an application development lifecycle that does allow for static analysis during implementation, and for other methodologies during the Design and Verification phases that apparently take great advantage of these function replacements. Alone, these functions do little more than force the programmer to give the problem some thought. There's no other apparent advantage. It's probably possible to argue they may even add to the problem, since they may give a false sense of security. But they are first and foremost a part of the SDL, where their context becomes a lot more obvious.
Last edited by Mario F.; 09-09-2009 at 02:08 PM.
If Microsoft's intentions are as innocent as they claim to be (all about security), they should make it clear, in the warning the compiler generates when you use an "unsafe" function, that the suggested replacement is non-standard and will only work for their compiler.
Admittedly, GCC is not doing so well here either. It only points out uses of GNU extensions if you pass the "-ansi" flag (which makes it check for standard conformance), but at least it won't suggest that the programmer ditch a standard function for a non-standard one, and make it sound like the non-standard one is in all ways better than the standard function (including standard conformance). Is there a "standard compliant" option in VS?
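For what it's worth, C11 later made these functions an optional extension (Annex K), gated behind a feature-test macro, so portable code can at least detect whether they exist instead of assuming one compiler. A sketch (the wrapper name is mine; __STDC_WANT_LIB_EXT1__ and __STDC_LIB_EXT1__ are the macros the standard specifies):

```c
/* Request the optional Annex K names before any include, as the
 * standard requires. Most C libraries (glibc among them) do not
 * provide Annex K, so a fallback is mandatory for portable code. */
#define __STDC_WANT_LIB_EXT1__ 1
#include <stdio.h>
#include <string.h>

/* Use the _s function only where the optional Annex K really exists;
 * everywhere else, fall back to strictly standard C. */
void bounded_copy(char *dest, size_t dest_size, const char *src)
{
#if defined(__STDC_LIB_EXT1__)
    strcpy_s(dest, dest_size, src);           /* Annex K variant */
#else
    snprintf(dest, dest_size, "%s", src);     /* standard, also bounded */
#endif
}
```

Note that MSVC's own *_s functions predate Annex K and differ from it in places, which is part of the complaint above: the compiler's suggested replacements are presented as if they were portable.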
I will make a bold prediction:
M$ deprecates unsafe code, GCC follows 6-12 months later, and the changes are incorporated into the next C standard revision.
You heard it here first.
I love deprecated functions; that's just another way of saying it's stable and won't get broken by a future release.
Agreed.
There's nothing deprecated about these functions. The Microsoft wording is particularly pernicious, as if the C++ programming language existed only to serve the Microsoft Windows operating system. This is my one and only beef with the whole affair.
I will make another bold prediction:
You don't know what you are talking about. Your statement is as hilarious as me saying M$ will open source Windows soon because they will realize it's a better software model.
Fanboy-ism like this is very dangerous. The computing world is much more than just Microsoft. You are being blinded by all your Microsoft love. Loving Microsoft is okay, of course, but try to go without Microsoft for a few months - Linux, OpenOffice, Firefox, GCC, open standards... You don't have to like it, but IMHO you should try it. Even if you end up still loving Microsoft, you won't be taking everything they say as the golden un-biased truth, and you will see many more faces to issues like the one at hand, and be able to stay neutral and objective.
BTW, yes, I am biased towards Linux/GCC/open source stuff, but I have tried Windows extensively also, for many years. Can you say the same?
Except that he's probably right, except it won't be until 201x. It seems one way or another the next C standard will have at least some "safer" replacements. (TR 24731).
Apparently they don't have total confidence in it...

Originally Posted by Mario F.
Oops, there are 30 or so violations of the ban from the first three categories alone in modules new to Win7.

Originally Posted by The MS Banlist
Introducing StrSafe, which by our definition is no safer than what you've got now. I guess they forgot it's built on top of the "banned" CRT functions. Well, unless you happen to know the magical mystery define that's not documented anywhere.

Originally Posted by The MS Banlist