It's sad to think that, in pre-Christian days, women were once an integral part of society. Sex was once held to be a sacred act, and a man was said not to be complete until he was paired sexually with a woman. But after Christianity rose in the world, women with power and status were disenfranchised and demonized, sex was demonized, and everything about women became dirty and wrong, all because of what the proverbial 'Eve' did in Genesis.
Women have finally (almost) regained what they had in ancient times, except within the Christian faith. Damn all Christians and their bigoted ignorance.