Originally posted by S_W_LeGenD
IMO, patriarchy is a culture which primarily empowers males. Males are tasked with greater responsibilities; leadership, protection, bread-winning, fighting and so on. Males have more rights than females. I know that it sounds extreme but every culture has its pros and cons. Matriarchy is the opposite of patriarchy. And the USA is in a transition phase, from patriarchy to matriarchy. Maybe in a few decades, this transition will be complete.
I don't disagree with your definition. However, I would say that, while some aspects of patriarchy have definitely changed and rigid gender norms have been somewhat loosened, this is still the type of society we live in. Almost all CEOs of Fortune 500 companies are, and have been, men. All presidents and a majority of politicians are, and have been, men. These facts alone indicate that leadership and control are still squarely in the hands of men (mostly rich, white, straight men, to be exact).
Some progress has been made to make this less the case; however, we are nowhere near a gender-equal society, and definitely not on the way to something matriarchal.
Patriarchy hurts both men and women (women more, however), and men's rights activists make some good points. But rather than working to eliminate these problems, they side with the system that creates and upholds them and demonize feminists, who have done a hell of a lot to liberate both women and men from rigid gender roles.
Although both Digi and Oliver North are correct that not all feminists are for equality, not all feminists are smart, and not all feminists can separate patriarchy from men, that's a completely different issue. Imo, feminism as a movement has had nothing but positive effects on both genders.