Perhaps I have lived away from America too long. Perhaps I have forgotten that the country was always this way. Perhaps so, but I think not. There was certainly this level of vitriol in the years before I was born, and its effect lingered into my early childhood in the early sixties. But it was not like this during my adult years in America, before I moved away in 1993. When did it change? I haven't been back to the States in nine years, and maybe it is only distance that has altered my perspective.
Where has the humanity gone? When did we cease treating each other like human beings?