Why Does America Hate Women?
I probably got your attention with the headline here, and it may feel a tad overdramatic. But Dear America: as a woman born, raised, and educated here, I’m not feeling the love.
I was talking last evening with a couple of law school girlfriends when one of them posed the question, "Why does America all of a sudden hate women?" I thought it was a really good question. And I personally feel the hatred: in the news, in the media, and especially in our government and among our policymakers.
Not sure how to answer her question, I tried to reason that it started as a way for one political party to polarize itself from the other, but it has instead splintered both parties. I've ranted about this before: I'm a woman without a political party right now. I'm fiscally conservative (I detest debt of any kind), and I'm socially liberal. I also don't like having rights taken away from me, rights I thought were settled for most of my life and time on this planet. Now? I'm not so sure I'll keep them, given the current climate of hatred toward women in this country.
When most other countries have women at the helm of their businesses and governments, or simply offer women better lives in general, I begin to wonder whether I really live in the greatest nation on Earth anymore. I'm disappointed and embarrassed by my country right now. I want my country back: one where people can work hard to get what they deserve, a meritocracy; one where my government isn't involved in every move I make; and one I can actually identify with (because right now, I cannot).
I'll go on asking myself the leading question of this post. And I'll fight to keep my rights as we move forward. And maybe I'll do something even crazier, like dare to represent other women, fighting for peace, love, and womanly awesomeness in our government, our corporate boardrooms, and our country.