I find the twisted perspective of some women who currently call themselves feminists to be absolutely disgusting.
Although I have met some genuinely interesting feminists who are truly in favor of equality between men and women, many women who currently call themselves feminists are some of the most sexist people I have ever met. Most "feminists" of this variety consider women superior to men and waste no opportunity to tell you about the inferiority of the opposite sex. They work actively toward a culture of division and stereotype, pushing for greater stratification between genders rather than less. These "pseudo-feminists" actively strive to put men at the bottom of the totem pole, apparently to "get back" at the male gender for the "oppression" of the past (little of which they actually experienced themselves).
I cannot conceptualize strong enough language to express how grossly sick this brand of sexism makes me feel.
There are places in the world where feminism is still exceptionally relevant. There are many areas where cultures have been groomed to exclude and demean women, and where women need to organize and stand strong together in resisting tyranny to create a more level playing field. There are places where religion is used as an excuse to reduce women to a slave class. And yes, some of these places still exist in the United States and Europe. For the most part, however, women fortunate enough to live in the United States no longer need feminism in this form, as, in most cases, women are actually given an artificial advantage over men based solely upon their gender (as any man who has ever attempted to get a small business loan could likely attest). Women have been given the Constitutional right to equal protection under the law, and those women who act in such a sexist manner toward men, still pushing for feminine superiority rather than equality, are totally blind to the damage they are doing to the image of the feminist on both a cultural and an international level.
As an example of this, women in America currently have amazing power over the marketplace. Advertising from the past decade or so reflects this fact, as even manufacturers known for catering to formerly "male dominated" venues, such as Nike, began redirecting their advertising dollars toward catching the female eye. Women are now the center of the American universe, portrayed as goddesses and "Super Moms," while men are often portrayed in television and other media as unintelligent, lazy, and single-minded (unless they happen to be gay). These stereotypes feed preconceived notions that foster division, rather than unity, between the sexes. This exploitation of stereotypes wouldn't bother me so much if I didn't see so much evidence that people firmly believe these unfair portrayals are ultimately true.
Feminism is not a weapon to be used to take out one's aggression for having been "damaged" by some heartless creature who happened to be male. It is not some ridiculous primate dominance game used to lord one's perceived superiority over others of one's species. Feminism was, and should always be, about creating a society of equality and achievement based upon merit rather than an accident of X and Y chromosomes.
I would be grateful if every female reading these words would take a long, hard look at herself and ask whether she is being unfair and sexist to the other half of the species. Sexism goes both ways, and it is well beyond time for women to recognize when they are being hypocritical, dividing rather than uniting the human species.
Copyright 2005 S.L. Olson
