I am really and truly sick and tired of being called anti-business because I lean a little to the left. Why can't a company be expected to pay decent wages to its employees? Why should a company be allowed to skip offering affordable, good health insurance, leaving the states to pick up the slack?
Back when I was downsized a couple of times in a row, I looked into getting health insurance from one of the temporary companies for which I worked. They wanted nearly $200 a month, I would have had a $500 deductible, and half of the medications I was on at the time were not covered. Neither were some doctor visits. And yet the company did "provide" health insurance — just not good health insurance.
So why shouldn't Democrats have a problem with Wal-Mart? The company is owned by one of the richest families in America, yet it pays some of the lowest wages. And it isn't just Wal-Mart. Numerous companies post high profits yet hand out low pay raises to their employees. Someone's getting rich while the middle class just gets dumped on and taxed to death. The Republicans have had since 1994 to come up with a better plan. They added the White House to the mix in 2001, and yet the middle class is shrinking and the upper 1% is just getting richer.
Exactly how can they think their way is the RIGHT way when the nation is not headed in the right direction?