In North America and elsewhere, unless you are Native American/Indian, you/we are all f*ckin immigrants. Don't let the bigots you call your parents lead you to believe otherwise.
Ironic, isn't it, that the lands whites colonized by pillaging, raping, and murdering the natives/aboriginals (terms often used interchangeably) are suddenly theirs to call their own?
They never are!
And some of you want a vote on whether whitey should let certain races into the country, as if whites are somehow the natural-born authority on these lands?
F*ck this hypocritical, sentimental, idiotic shit about protecting the white race!
Who the fuck said the lands whites conquered are their own, dipshit? Colonization has ended, and countries like India and China are rightfully ruling themselves again. This thread is about a country in EUROPE, not the USA; do you know the difference?
Haha, the old "we're all immigrants" line. In that case, Turkey has to give back its entire country to the Byzantine Empire, herp derp.
Jesus Christ, you're a fuckwit. It seems to have passed over your head that the American Indian example is used to highlight the dangers of allowing so many foreigners onto your shores.
You act as if whites were the only people to colonize other parts of the world. Newsflash, dumbfuck: the most powerful and developed civilizations will ALWAYS do that to weaker ones.
If Muslims had had enough power from the 15th century onwards, they would've conquered Europe by force.
You say that natives losing their land is a bad thing, yet when it happens to modern whites it's suddenly acceptable? Whites aren't allowed to decide who gets let into their OWN countries?
Epic contradiction!!! Now fuck off.