How do you feel about British tourists coming to places like Spain, Turkey, Greece, Cuba, Malaysia, Hong Kong, etc. to escape the terrible British weather and staying at resorts set up especially for them, where they can sit by the pool and eat fish and chips without the burden of having to experience any of the local culture? I am genuinely curious.
My husband's sister and brother-in-law are actually expats working in Hong Kong. They're lovely, but they pretty much stay inside the expat community, only make friends with white people, don't speak the local language and eat mostly food imported from Britain. They make a lot of money by working in Hong Kong, though.
It's not the same thing. No laws are made to give expats extra rights.
In fact, if you've paid attention, several countries take advantage of expats through land grabs. We've seen this especially with Spain and Mexico.
I'm one of those liberals who hates hypocrites.
Other countries haven't pandered to Western immigrants because the Western immigrants simply killed everyone around them and overran the places. Australia, New Zealand, Canada, the USA… but the best example of it is probably apartheid in South Africa.
A lot of those Mexicans you mention have Native American ancestry. Many Trump supporters in the USA are having nightmares at the thought of whites losing their dominance/privilege, and that's just hilarious.
So you are crying about historical things. OK, let's play that game. Where's your whine about all the Western borders that have changed?
Where's your whine about Palestine?
Where's your whine about all the legally bought property that was then "land grabbed" by the government? Spain and Mexico are well known for doing this to foreigners.
Oh yeah, and I take it you believe that no non-Western (white) people have ever taken over someone else's land. Read the Bible and the Quran and you will see that is not true. Both books brag about their peoples going on rampages through the Middle East, killing everyone they come across. Africa and the Middle East have always been a hotbed of fighting over land.
South Africa and most other African countries are very hostile to whites. If we are honest, the colonized African countries were running well when the Western governments left them. Now they are shitholes. Uganda, for example, went to shit less than a decade after independence. This is the pattern for all the African countries.
Like most liberals, you probably blame whites for slavery, but we have records showing it has existed since 6,000+ BC, and it's still going on today.