Until Americans are back to work there is no honor to restore. If corporations won’t do it, then the government should.
That’s right, I said it. The government should get into the business of hiring. For generations, we’ve been socialized to believe that it is the economic sector, dominated by corporations, that provides the jobs. The government’s role, we are told, is to create an atmosphere in which businesses will continue to hire: low taxes, lax regulation, and nothing to say about wages and benefits. Let the invisible hand of the free market take care of all that, and all will be right with the world.