OPINION

The 6 Big Ways Liberals Are Destroying America’s Culture

The opinions expressed by columnists are their own and do not necessarily represent the views of Townhall.com.

There was a time when we were a nation that pushed freedom, responsibility, decency, patriotism and hard work. Some of us still believe in those things, but because of liberalism, much more destructive values have seeped into our culture like toxic waste. There are many ways that liberals have degraded our culture, but these are the worst of the worst.

1) The Politicization Of Everything: Want to watch football? There are players protesting the flag. Turn on a Hollywood awards ceremony? There are actors making political statements. Want to go to the bathroom in peace? Well, first, we need to know if you’re okay with sharing it with someone who feels like the opposite gender today. You’re just a regular person telling an off-color joke you heard? Better be careful; that could turn into a front-page scandal if the wrong group gets offended. Want to buy a Halloween costume for your kid? Well, you better make sure the character is the same race as your child or people will be upset. Want to go on a date? If you only date people of your own race, you may be called racist. Ready to marry your honey? Then you better support gay marriage or you’re a homophobe. I can remember a time in America when you could just live your life without paying attention to politics at all if you wanted, and that was a good thing. Remember that old saying? Never discuss race, religion or politics in polite company? Well, because of liberals, you don’t have a choice anymore.

2) Political Correctness: I hate the idea that some normal person can tell a joke on Twitter and have his life destroyed for it by the liberal fun police. I think it’s disgusting to see that liberals have embraced fascism to such an extent that they can’t tolerate a contrary idea on their college campus. It’s like the whole country is in a relationship with someone who has borderline personality disorder. Everyone’s walking on eggshells because some buttercup might get horribly upset at, well… just about anything. Whatever happened to the idea that if some ordinary thing triggers you, YOU have the problem and it’s something YOU need to work on? Oh, yeah, liberals happened to it.

3) Victimhood: Only in America do people get so excited about the idea of being a victim that they will even fake hate crimes against themselves to get that status. Only in America are there women publicly crying that they were sexually assaulted and traumatized because a 93-year-old, wheelchair-bound President pinched their butt. Only in America do we have to come up with things like white privilege and institutional racism because there’s not enough real racism happening to allow liberal minorities to feel like victims. There was a time in America when people wanted to feel strong, capable and able to handle their own problems instead of being victims. There are still people like that today, and they’re called conservatives.

4) Liberal Feminism: The original “Women should have the same rights as men” feminism has been so widely accepted in society that it has made feminism irrelevant. So, liberal feminists reinvented feminism as a combination of man-hating and victimization. Look out, it’s the patriarchy and rape culture! Don’t you dare hold the door for me! Stop mansplaining! Stop saying, “Not all men!” You just sit there in silence thinking about how you’ve oppressed women! Liberal feminism falsely makes women think they could have it all if those awful men weren’t getting in their way, and it makes many guys unsure of what reaction they’ll get from women when they behave like men. Forget about the old “Women should be women and men should be men” philosophy; liberal feminism is about women being men and men being shamed.

5) Tribalism: Liberals work incessantly to split Americans into ever smaller groups that are at each other’s throats. If you want to get a sense of how bad it has gotten, we’re having ferocious public debates about transsexuals who, depending on how you define it, make up just 0.25% to 0.75% of the population. Increasingly, the attitude is moving from the annoying “You just can’t understand because of your race/color/gender” to “You HATE ME and I HATE YOU” because of differences that are often unchangeable. This is incredibly dangerous to our future as a country because you can’t hold any group of people, including a nation, together long term when people no longer believe they share the same goals and values as their neighbors. Our nation’s motto is E pluribus unum (Out of many, one), but what happens when liberals insist that the many never become one?

6) “Non-Judgmentalism”: We’ve heard so many liberals talk about how important it is to be non-judgmental that it has seeped into the culture at large. When there’s a choice between what’s best for society and what’s best for a particular individual who has done something wrong, we always default to protecting that individual. How’s that working out for us? Since the stigma around getting a divorce, having a child out of wedlock or having an abortion has faded, are we better or worse off? Are the kids who are aborted or who get raised by a single parent instead of a family better off? Are the guys who are extremely hesitant to get married because they fear divorce better off? Are we better off because people no longer wonder whether doing something horrible in public will hurt their good name or ruin their family’s reputation? The very fact that we are so unwilling to draw a line in the sand and say “That’s right” or “That’s wrong” has mired our culture in degeneracy. Maybe if there were more people who felt bad when they did things that “wouldn’t make their mothers proud,” there would be a lot more decent human beings.
