Bryan Martin
Dec 25, 2021

Feminism is about gender equality, that much is true. It is the belief that society favors men, and it aims to achieve gender equality through the advocacy of women's rights. While individual feminists may genuinely care about men's rights, feminism itself does not.
