Is "fairness" uniquely American?
I have quite a few non-American friends, and they seem to consider the idea that people should be treated equally, regardless of gender, race, etc., as almost... childish.
My Indian friends, for example, have pretty set-in-stone roles for males and females in their culture.
British males have attitudes that would be considered misogynistic... I mean, they talk about women like they're sex objects. These same Brits also love to chastise "Yanks" over any improprieties, but I wonder if that's because they're SJWs, or if they know Americans are obsessed with this stuff and just like to tease us about it.
I mean, didn't the English starve the Irish? Yet there's no shame over it. The Irish and English get along just fine.
Really, I can't think of many other cultures that take their social politics so seriously. At least, none that I know of.