Actually telling it like it is gets you downvoted. I mean, I'm not a fan of Republicans at all, but to think that Obama isn't bought is super fucking weird.
He basically gave you the shitty version of single-payer healthcare so that health insurance companies could still profit on the backs of Americans. He had the chance to give you universal healthcare (there were massive demands and protests). In his words, the American way is to get health insurance through your job.
Is it mandatory for women?