7 signs you’re a feminist, even if you don’t know it

Feminism is a movement that seeks equal rights and opportunities for women. The feminist movement has changed the world by making it socially unacceptable to sexually harass or discriminate against women and by pushing for equal pay. The word “feminist” carries a negative connotation for many people, who associate it with radical feminists. However, feminism …