Friday, January 6, 2012

Why is it an employer's responsibility to provide healthcare to its employees?

I understand that during WWII there was a wage freeze, and employers started adding healthcare benefits to entice good employees. But wouldn't it make more sense to have a nationalized healthcare plan, so businesses could focus on business (healthcare costs are killing them) and our taxes could go toward helping all Americans?
