After reading someone's opinion on men and their societal roles, which also happened to be very biased, a really strange thought came over me. What if women ruled the world? What would the world be like if women were in power?
My personal opinion is that things would really be no different. In fact, I think that even if women somehow did manage to bring a sense of peace to the world we live in, the backstabbing, the hormones, and everything else that makes us men go "What the fuck?" would be the downfall of women. Just because somebody feels that women are somehow better than men, or vice versa, doesn't mean it's necessarily true. What it boils down to is the same thing I see every other day on the news: a race-related crime, or somebody whining about how they were discriminated against. Racism is just bullshit anyway. Somebody who thinks they're better than someone else just because their skin color is different, or because they have different chromosomes, *stares at Eire*, is just ignorant. Yes, I do have a dick, and yes, I'm proud that I can pee on anything I want, but does that make me better than a woman?
- Listening to: Megadeth