The word "feminism" has had positive associations since the beginning of the movement for women's rights in the 19th century. People usually comprehend it as the teaching that men and women should be equal in all aspects of life. Women should have equal...