Hey guys, so I know this is a touchy topic for a lot of people but I really wanted a thread where we could discuss modern-day feminists and feminist ideology, something I've recently become really interested in. I hope we can keep things civil and not get too riled up about differing opinions. By the way, I'm not talking about third world countries, I want to try to keep the discussion to the FIRST world, mainly the west.
Mini essay coming, sorry for wall of text.
Recently I've decided that I'm not a feminist (btw, I'm a girl). And before you say feminism is about ~equality~, yes, that's the textbook definition of feminism. But more often than not, ideologies play out differently in practice than they do on paper. I honestly think that a lot of feminist ideas are pretty old and don't have much relevance when you compare them to people's actual experiences. I actually hate the term feminism; I think it alienates men from the discussion, causes people in general to focus only on women's issues, and leads to bias. I think feminism should be replaced with gender equality or humanism, two terms that are much broader and more encompassing than just 'feminism'.
My main reason for turning away from feminism was doubt. I didn't feel oppressed as a woman, and I just didn't see the woman-hating world they were presenting to me. If anything, I think western society is very biased towards women in general. There are so many studies out there debunking the wage gap (I can post links to at least a dozen), and many studies saying that women are actually twice as likely to be hired as men because people are trying desperately to achieve gender equality in the workplace, even if the female candidate is less qualified. More women than men attend post-secondary studies and graduate from college and university. My whole life, my family and friends and teachers encouraged me to do well in school, to play sports, to pursue a science degree, to get a good career and take care of myself. I never felt that society was 'out to get me' or that I was being held down because of my gender, except on an individual basis when I came across a prejudiced or sexist individual. I never felt like society or men as a whole were stopping me from moving forward.
If you are unaware of issues facing men I'll list a few issues where I often see men being treated unfairly (note: these aren't true 100% of the time, just some trends I've noticed):
- Although men make up 40% of domestic violence cases, you almost never hear about programs for men who are domestically abused. Men who are abused or raped are often laughed at when they come out about it. There are so many battered women's shelters out there, but almost none for men.
- From what I've observed, if a woman is the victim, she is almost ALWAYS believed, no matter what.
- If a guy gets a girl pregnant by accident and she decides to keep the baby, he has ZERO choice in the matter and must pay child support no matter what.
- Men get longer prison sentences than women.
- Man gets sexually harassed at work = laughed at, told to man up, or told that he's gay. Girl gets sexually harassed = taken extremely seriously. Man flirts at work = gets a harassment charge. Girl flirts at work = completely tolerated.
- More men commit suicide than women.
- Men also suffer from breast cancer, but you never hear about it.
- Prostate cancer is just as much of a killer as breast cancer, yet it doesn't get nearly as much funding.
- Nearly 70% of all homeless people are men.
- Men almost always lose custody battles.
- Masculinity is damaging to men.

Don't worry, I know there are lots of women's issues out there too. I'm just pointing out the fact that both genders face different issues in western society.
To me, the main cause of these kinds of problems and biases is gender roles. I think both men and women suffer from different things, and the only way to combat that is not to blame men but to reject the idea that men are tough and women are soft; do that, and you'll see a lot of prejudices and inequalities disappear. I think feminism tends to perpetuate the idea of masculinity vs femininity. I've seen so many feminists imply that men are just savage beasts that love to fight and have sex, while women are emotional, empathetic, creative beings being held down by masculinity. I feel like they focus on gender so much that they create prejudices themselves. Does that make any sense?
I don't truly hate feminism or feminists; I just really don't like where feminism is going these days, and I honestly think it's unnecessary. All of the problems we have today can be addressed through humanism. I think feminism is just way too biased. What do you guys think?