Feminism, by definition, is the belief that men and women should have equal rights and opportunities. Unfortunately, the term has acquired many negative connotations, and feminists are often perceived as man-hating and overly aggressive.
So what, really, is feminism?
In reality, feminists just want equality and equity between the sexes. This comes at no detriment to men, and carries no inherent hatred towards them.
Feminism isn’t hating men or even hating chivalry, it’s hating chauvinism. Feminism isn’t about women being superior to men, it’s not about women treating men the way some men have historically treated women. It’s about the sexes being treated equally, by themselves and each other.
The term feminist has been around since the mid-1800s and gained prominence internationally during the 19th-century women’s rights movements, which many recognise as the first wave of feminism.
Feminism has been the driving force behind many momentous societal changes.
In the UK, the suffragettes are famous for their feminist campaigning. Their most influential protests helped win the vote for women (initially those over 30) in 1918. Another noteworthy moment for the feminist movement came in 1973 with the “Battle of the Sexes”. The women’s tennis world number one, Billie Jean King, accepted a challenge from Bobby Riggs after he had taunted his female counterparts. She won all three sets and Riggs demanded a rematch. He never got one, but he did admit that he had underestimated King’s abilities.
Feminism is believing that it is idiotic and destructive that one group of people should have more power over another, simply by virtue of their gender. Feminism is believing that anyone, regardless of their gender, should have equal opportunities and equal rights.
Feminism is wanting to see an improvement in the way children view the world. Young girls are now being taught that they can be firefighters or footballers. They’re being taught that those are not male professions and women can do any job they want.
We tell these girls that women can do any job, yet we still see a lack of female CEOs: in 2018, fewer than 5% of Fortune 500 companies had a female CEO. Additionally, women who do work in traditionally male roles often find they are underpaid for their work. Feminism is believing that women should receive equal pay to their male counterparts for the same work.
The movement is often wrongly perceived as women wanting to take away men’s rights, but that’s simply not true. Feminism is about helping all of society, not only about improving life for women.
Feminism is wanting to change the social norm. It is seeing the need to move towards a society where men are not afraid to be vulnerable and show emotions the way women can. Feminism is recognising that suicide is the biggest cause of death for young men in the UK, and acknowledging that society needs to accept that men can have mental health issues – issues often linked to the pressure men, particularly young men, feel to be unemotional and always strong.
Feminism tells boys it’s perfectly okay to want to be a nurse or a childminder – jobs traditionally seen as female.
Benedict Cumberbatch, John Legend, Justin Trudeau and many other famous men have publicly advocated feminism. Despite this, only 32% of men surveyed in 2018 said they identified as feminists. Perhaps more shockingly, only 41% of women did.
People need to accept that there is nothing scary about being a feminist, or about feminism. Feminism just calls for a society where gender doesn’t have an impact on lives.
And for anyone still doubting the benefits of equality of opportunity (and unfortunately there are still such people), bear in mind the proven links between the treatment and education of girls and economic wellbeing. Countries which traditionally treat women and girls as second-class citizens are much more likely to suffer poverty.
A reporter for the FA, Media Officer for a National League football team, and a journalism student.