If America is such a racist, vile place for brown people, why do they keep coming here?
Title says it all.
The media and Democrats want us to believe the country was founded on slavery, that we have a racist president, and that the country is awful.
So why do brown people keep wanting to come here?
Perhaps it's not, they don't see it that way, and the Democrats and their shills in the media are lying?