Happy People Are Hiding Something
Sometimes I feel like I wasn't programmed right, like there's a bunch of information out there that everyone else knows, but I somehow missed that day of class.
I get the strangest looks from people when I tell them I don't own a car and never have; it's like I just killed a baby in front of them, or I'm some sort of immigrant who doesn't know the ways of Western culture. I would be the first to admit that I am the least "socially trained" person I know. Not that I lack class or culture, I just don't give in to the "this is the way things are supposed to be" bullshit that pervades our culture. Example: if a man chooses to stay at home with the kids, he's considered lazy, unemployable, a gigolo at the far end, or just not a man; but if a woman stays home to look after the kids, she's considered normal. Now, I know I'm glossing over thousands of years of male-dominated society, and women were the primary caregivers for most of that time, but it seems we've never left that mindset, even in our supposedly "modern" age. I realize this is a very broad and generalized statement, and doesn't apply in all cases, but if I don't own a car, does that make me less of a person? Yes! According to society it does. If I choose not to pollute the only mother we have, does that make me a loser? Yes! Again, according to society it does. Because I don't want a credit card or a cellphone, does that make me a bad person? Yes! According to society it does.
When did judging someone by their character, instead of their possessions, stop being the norm? Or was it never the norm in the first place, and we just say shit to make ourselves feel better about our own selfish lives?
I hang out with new people all the time, and it seems people, more and more, are walling themselves off, especially gender-wise. I feel like there are two clubs, men and women, but according to society you can't belong to both; you have to choose one. And since 90-95% of my friends are women, I have to choose the women's club. So, when I talk to people, whether old friends or new, and I tell them most of my friends are women, their first question is always, "Are you gay?" Now, I should point out that it's not usually asked in a negative context, which is wonderful and hopefully the subject of a later post, but out of curiosity. But why the question in the first place?
Does a man have to be gay to hang out with women? Is this a reflection of a media-driven society? When I state that no, I am not gay, the questioner almost seems relieved, which raises the question of why they're relieved, but I'll leave that alone for now. Just for the record: I am not gay.
We are satisfied that if everyone has their role, then we can progress. Which is a wonderful thought, but can't people have multiple roles within their own "worlds" without being labeled as anything but "human"? I grow orchids; I like to cook and bake; I like classical music; I like to get people out of their comfort zones at parties, just to see how they react, and whether they learn anything about themselves from their reaction; I like fashion, musicals, and antiques. But I also like sports, heavy metal and punk, getting dirty, shooting guns, drinking beer, going to the strippers, working on and looking at vehicles, and building things in general. Why should I limit myself from any of the above activities based on my gender and sexual preference? Is it fear that drives these people? Then who is really the pussy now? You are!!!
It seems like in our pursuit of equality and progress, we've actually drawn more lines in the sand, created new categories for things that shouldn't exist, and are going backwards. But this time it isn't a group of neo-Nazis, homophobes, or religious fanatics; it's normal people who are the problem, every goddamn one of you!!!