Now, before anyone gets up my ass about this, I am not trying to defend men in relation to women, especially white men. I realize we wield more power than anyone (even though I have the "disadvantage" of being gay, that is not something that is quite apparent....as long as I don't speak or move) and I'm not trying to deny that. This post is more about the twisted mores and folkways of society, a topic that is very near and dear to my heart.
Anyway. I couldn't help but notice that almost every woman I see wears make-up, and almost every man I see does not. What are the reasons behind this? First I would say tradition, because that is usually the main suspect in cases like these, but that's not it; there are several precedents of men having worn make-up. What else? Well, I'm not positive, but I would probably have to say that, in our society, it's not exactly "butch" to care about what you look like. Sure, there was that whole stupid "metrosexual" fad, but that fizzled away quickly. It's just not common for men to wear make-up, and I find that somewhat appalling. What's the big deal if I wear green eyeshadow or pink lipstick? Maybe it wouldn't be flattering on my skin tone, but I don't see why there should be any other objection. Women can put all kinds of color on their faces, but men cannot. And whose fault is this? Well, everybody's. Everybody who's not the least bit progressive in attitude, anyway. You're told it's not right, and you believe it. It's not something you see commonly, so it's weird.
I'm not sure what I can do about it, really. I don't have any real desire to wear make-up (my skin is flawless, and I think it's horribly fake to wear make-up anyway), and if I did, people would think I'm transgender (which I'm not). So I guess I will resign myself to coping with the inequities of society.
Damn. I wish my first post wasn't so superficial and pessimistic. But, hey, go with what you know, I guess.