A great deal is said about "black thugs" robbing and looting businesses, and about the landslide of blanket statements that follows from the white-owned media.
However, nothing is ever said about our "nature". White people's "nature".
In my opinion, we're one of the most ruthless killing machines in the history of humankind, and pointing the finger at other races is a monumental fail.
Yet we engage in countless acts of finger-pointing, as though we, the white race, have never hurt anyone or anything, and as though every one of us had been instrumental in some groundbreaking discovery that carried human evolution a step further.
Huh?