Political correctness, the promotion of multiculturalism, and bleeding-heart lefties are slowly destroying the West. Much of Europe is already gone, and the US is starting down the same slippery slope.
If things were the other way around, and it was Westerners going out to the Middle East and Africa and imposing their "culture," do you think it would be tolerated? Hmmm...