I have been in the workforce since the late '70s. Call me a whining whinger if you like - I really don't care.
I was an employee of a Danish firm that was bought out by an American corporation, asset-stripped, then closed with the loss of all but a few jobs (the remaining people were forced to relocate to other countries).
My question is: why is the "quality of life" in the US more important than the same anywhere else? I am not making a personal comment against any one person, but it does appear that employees in the US are uncomfortable about being on the receiving end of something their businesses routinely hand out around the globe.