When did President Obama cease to be the President of the United States and become only the President of the Democrats? And when did the Bush presidency cease to be in the collective consciousness? When did it become OK to hope that the country fails?
When did it become OK to refer to other Americans as un-American because their views didn’t match yours? Or because they loved someone of the same sex?
When did it become acceptable to spew hate and vitriol about someone’s religion in the guise of religion? Or to claim that God was on your side, because you believe a certain way?
When did it become all right to denigrate someone for standing up for their rights as citizens and human beings in this great nation? When did it become OK to spit on the poor, the helpless, the sick, or the elderly?
When did it become appropriate to claim that even education was an entitlement?
When did it become proper to question the medical decisions of anyone, but especially strangers? Especially women?
This country has become something unrecognizable. When did it become that way?