
Feminism

What comes to mind when you hear the word feminism? Some thoughts might lead to stereotypes: aggressive women who hate men, feminists who are unfeminine, women who look down on those who lead a normal lifestyle. This might be true for some, but you don’t have to fit those stereotypes in order to be a feminist. In fact, feminism is not about any of the stereotypes stated. Feminism is not about hating men. It's not just for women, and it's not a bad word.

Why has feminism turned into an uncomfortable word? Why have people chosen not to identify themselves as feminists? By the definition in Webster's Dictionary, feminism is the belief in equal rights for women and men. So it’s a silly misconception when people assume feminists hate men. Simply put, feminism is the belief in the social and economic equality of both genders.

Women and men should be paid the same amount for the same job. A woman should be able to make decisions about her own body. Both genders should be given the same respect. In a speech for the HeForShe movement, Emma Watson stated, “sadly, I can say that there is no one country in the world where all women can expect to receive these rights. No country can say they achieved gender equality.”

In some countries, girls don’t even get the opportunity to go to school but are expected to stay home and care for their siblings. Malala Yousafzai is a young Pakistani girl whose strong belief in education nearly cost her her life. In America, you could say we are lucky because we have the right to learn. We don’t have to worry about getting hurt for going to school in order to get an education.

If gender equality is a problem that needs to change, it can’t just be women in the fight. Women are just half the world; men are the other half. Society hasn’t locked only women under stereotypes. Men have been locked under them too, and to achieve social equality, gender-based stereotypes need to end. Both genders need to be included in order to change these inequalities. Women are oppressed and held back in many, many ways, but gender roles can and do negatively affect both men and women. Feminism will help and improve the lives of both men and women in the world. The gender role of a man is not to be too feminine and to be expected to play sports. This is pressuring, because a man may be judged or looked down upon if he doesn’t play into these stereotypes. Feminism will diminish the pressures on men to be aggressive, on women to be submissive, and all the other roles genders are “expected” to play nowadays. Finally, maybe all genders will be able to be themselves instead of being caged into the unrealistic roles they are supposed to play.

If you don’t like the word feminism, don’t worry about it. It’s not the word that matters; it’s the movement and the importance behind it. Many people refuse to identify themselves as feminists for fear of the unattractiveness that has been paired with the word. Don’t stop fighting gender-based roles just because of a word you don’t like. The movement behind feminism will allow women to be paid the same amount as men and to be seen not as objects, but as equal human beings. This is so important. Women should be afforded the same rights as men; that’s what the word feminism means. How could a word that could positively impact so many negative areas in the world be a bad thing?