I don't mean to disrespect anyone's nation, and I apologize if it comes across that way, but I do agree that the US places more emphasis on race and race relations than other nations do. I have lived in the US, Europe, and the Caribbean, and the difference is striking.