Quote:
I believe it was Reagan who used this concept quite often. It's a nice thought at first, but think about it on a smaller, more historically provable scale. On December 7th, 1941, America was attacked. Clearly there were people who wanted to do harm to America and had the power to do so, but there were still racists. There were still left-wingers and right-wingers. There were still dishonest journalists, people who thought we should give negotiations with warring countries more time, and people who felt we should go about the war in different ways. Aliens wouldn't change things much. Sure, we'd all be under a common threat, but people would start all sorts of "told you so" arguments, and there would be people who refuse to fight because they believe that if it's God's will that we survive, we'll survive, and we shouldn't have to kill to do so. Then people who believe we should really do something to blow the aliens out of the universe would start wars to stop anti-war demonstrators.
Maybe it would improve our outlook on each other a little (definitely in the short term), but it's not going to get us to world peace.
Obviously you've never seen "Independence Day" then. Why would Hollywood lie about something like that?