Feminism

Feminism is a social and political movement that advocates for women's rights on the basis of gender equality. It does not deny biological differences between men and women, but it demands equality in social, political, and economic spheres. Most academics credit feminist efforts with important historical advancements in women's empowerment.…