Closed Source Software
Does all closed source software have to undergo some kind of testing before it is released?
Some test that checks the software for malicious content... Or is there an authority that can forcibly obtain the source code of a closed source program to verify that it contains nothing malicious?
Let me first preface this with IANAL (I Am Not A Lawyer).
There is absolutely no legal requirement for any of that. Closed source software can (and sometimes will) contain anything the producer likes. What testing is done is completely up to that producer - they may just compile it, or run 100 hours of tests, or whatever else they feel is "appropriate".
Obviously, it may be against the law in some countries to, for example, spy on which web pages you visit, so if you know (or strongly suspect) that some software is doing this, you can take the company to court for breach of that particular law. If you pursue it under criminal law, it would be your job to convince the prosecution in the relevant country that the software is in breach of such a law. It would then be the prosecutor's task to find further evidence of that breach, and the company would have the right to defend itself.
In civil law, you could file a lawsuit against the company instead - the burden of proof is perhaps lower, but it's also less clear-cut what is and isn't allowed.
Neither would be an easy case.
In either case, you personally would almost certainly not get to see the source code; instead, an independent specialist of some sort would be called in to inspect it and give an opinion on whether it does what you claim it does.
I would think that all software goes through some testing before being released, but that testing can vary from ad hoc checks by a lone developer on their own product to a project with a dedicated QA team and beta testers from the target audience. If you are talking about an external code audit to verify that the software is not malicious... I think the developers surely have their reputation and possible legal backlash to consider, and anti-virus software and the like can help to detect (and possibly stop) malicious behaviour anyway.
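Even the low end of that spectrum can catch real bugs. As a purely hypothetical illustration (the function and test names here are invented, not from any real product), a lone developer's ad hoc testing might be nothing more than a handful of unit tests:

```python
import unittest

def parse_version(text):
    """Split a dotted version string like "1.2.3" into a tuple of ints."""
    parts = text.split(".")
    if not parts or not all(p.isdigit() for p in parts):
        raise ValueError(f"not a version string: {text!r}")
    return tuple(int(p) for p in parts)

class TestParseVersion(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(parse_version("1.2.3"), (1, 2, 3))

    def test_rejects_garbage(self):
        # Malformed input should fail loudly, not return nonsense.
        with self.assertRaises(ValueError):
            parse_version("not-a-version")
```

Run with `python -m unittest thefile.py`. A dedicated QA team would wrap the same idea in much more machinery (coverage targets, regression suites, beta channels), but the basic unit is the same.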
Besides, as Ken Thompson's Reflections on Trusting Trust shows, having the source code does not necessarily help one detect malicious code.
I can argue against some of that. Not all software is genuine: malicious code or spyware can be hidden inside a real, free or commercial product. Sometimes the EULA admits this and sometimes not (Vista spies on you, for example, but it's mentioned in the EULA).
There was also FlashGet, I believe, which spied on its users.
Some release very high quality software. Some release very poor quality software (read: Microsoft).
There are no standards to follow, even if the company claims it holds itself to a higher quality bar (again: Microsoft).
You cannot always trust closed software and, unfortunately, that's how it's going to remain.
> Does all closed source software have to undergo some kind of testing before it is released?
Most likely. It's called quality assurance.
> Some test which checks the software for malicious content
People do the testing. Since flaws in code tend to get found and exploited, the primary defense is making sure that the code is written with security in mind and that it runs securely on the host environment. Sometimes this works great, other times not so much. That still doesn't excuse you from reviewing your code, though.
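One concrete thing that "written with security in mind" means in a review is checking whether untrusted input can reach an interpreter unescaped. A minimal self-contained sketch (the table and function names are invented for illustration) of the classic pattern a security review flags, next to the fix:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

def find_user_unsafe(name):
    # Flagged in review: attacker-controlled input is spliced into the SQL,
    # so name = "' OR '1'='1" matches every row (SQL injection).
    return conn.execute(
        f"SELECT name FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(name):
    # The fix: a parameterized query; the driver handles quoting,
    # so the injection string is treated as a literal (non-matching) name.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)).fetchall()
```

The unsafe version passes any naive functional test you throw a normal username at, which is exactly why code review, and not just testing, is needed.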
> Or is there an authority that can forcibly obtain the source code of a closed source program to verify that it contains nothing malicious?
I suppose the government could subpoena versions of the code, if for no other reason than to accrue evidence and then settle a legal dispute. Cyberterrorism cases over software don't exactly seem to be happening, though.
Mostly no and mostly no.
If you download an executable from the web, you have to trust the source.
If you buy boxed software, you have to trust the source, too, of course. But it's considerably harder to make a store sell a faked package.
If someone intentionally writes malicious software, they can be sued. On the web, the hard part is tracking down the author; that's easier with boxed software, obviously.
Also, software that you buy usually comes with at least partial liability. If the software is so buggy that it destroys your data or lets attackers in, you can try to sue the vendor, too. The outcome of the battle will depend on local law, but in general you'll have to prove that the vendor was criminally negligent in releasing the software in that state.
In such a legal battle, the vendor may have to open the source to the court. But everyone who actually gets to see it is also bound by non-disclosure agreements.
OK, but there are special cases, of course. For example, some companies offer to show you the source in exchange for an additional fee, so you can do your own security audit. Other companies have policies for software that works together with their own. Windows driver licensing is one example: in order to get your driver signed (and thus usable under Vista), it has to pass a certification process from Microsoft, and that means you have to give MS access to your source.
Edit: four replies while I was writing this. Must be a record.
I don't think you need to give the source to MS - the testing is all done by the provider of the driver, and the results (with checksums of the results and the to-be-licensed executable file(s) etc) are then sent to MS for approval. You don't actually need to submit the source itself (at least that was how it worked about 3-4 years ago).
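The checksum part of that submission flow can be sketched in a few lines. This is only a generic file-hashing illustration (SHA-256 here; I don't know which algorithm the actual certification process uses):

```python
import hashlib

def file_sha256(path):
    """Return the hex SHA-256 digest of a file, read in 64 KiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Submitting a digest of the test results and the to-be-licensed binaries lets the receiving side verify that the files it approves are byte-for-byte the ones that were actually tested, without ever seeing the source.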
Originally Posted by CornedBee
Obviously, as mentioned, if you have a big enough business reason (e.g. you buy many millions of copies of a software product), you can probably ask for an audit of the company's product. You may not be allowed to do this yourself, but an independent third party can certainly be allowed to do so.
Oh, well, I don't really know how the certification works. I'm really surprised they have black-box tests that are extensive enough, though.
The graphics suite contains quite a lot of tests - on modern graphics hardware, it takes about a day to run them all. When we were running on simulated hardware, it took a day just to run one class of tests... There are also "import library checks" to verify that you don't use DLLs that you are not supposed to, and such. So the tests are pretty good - but, like everything, they could probably be a little better...
Originally Posted by CornedBee
Elysia, contracts are legally binding, and if you do a reality check, reading any contract is important - it doesn't matter what name it goes by (EULA or otherwise). Not everything needs to be disclosed in the EULA either, because customer feedback programs are explicitly opt-in or opt-out programs that can only take place after you agree to the license terms, and you have no legal recourse for willful ignorance of your rights after that. What is important, though, is that you have to be sure the contract is not breaking your country's law.
If you find that it is, you have legal recourse after the fact, but you have to convince a judge. The simplest thing to do, however, is not to agree to the terms in the first place. I'm pretty sure you are talking about the fact that Vista does data mining on your computer. This is the same type of data mining (as far as I'm aware) that tells Microsoft that bold, copy and paste are the most used features in Microsoft Word. It's aggregate statistics being misconstrued as spying, which Microsoft uses to figure out its next big thing (in the context of the example, the next interface for Microsoft Word).
Another example: TiVo mines data about what you watch, mostly so that it can recommend other shows for you. Amazon does this with things you purchase. This type of thing has exploded all over the place. In my legal opinion the practice is far too ubiquitous to be constitutionally challenged as an invasion of privacy and have it mean anything, unless the response is just to levy a bunch of stupid fines. I'd like to see a case like that, though, just so we're sure where the law stands on the issue.
I should mention that, to avoid legal issues, any data collected through "spying" becomes aggregate data, so it can't be narrowed down to finding out what Elysia bought yesterday, and so forth. "Buyer beware," though. It could get ugly one day.