Does feminism give a much-needed voice to women in a patriarchal world? Or is the world not really patriarchal? Has feminism begun to level the playing field in a world in which women are paid less at work and more often abused at home? Or are women paid equally for the same work and not abused more at home? Does feminism support equality in education and in the military, or does it discriminate against men by ignoring such issues as the male-only draft...