Wednesday, September 22, 2010

Why Has the U.S. Shifted to the Far Left and Far Right Politically? (Opinion)

Question: Why does the United States have to be Right or Left politically?

In our current environment, egged on by the mainstream media, we have entered an age in which only the political left and right get any attention.

To be left you must be an atheistic ultra-liberal, and to be right you must be a Christian ultra-conservative.  Whatever happened to being a centrist?  The United States, like it or not, is a country that was founded on Christian principles.  This of course makes those on the right happy, but it may upset the left.  I personally don't think there is anything wrong with having Christian religious beliefs inform our morality; they should guide us to do the "right thing."  You don't have to believe in the religion, but the moral principles alone certainly won't hurt anyone.

The left spends a great deal of time pushing the idea of individual freedom, which within reason is admirable, but it does so at the expense of our moral obligations as a nation and as world citizens.  The "anything goes" nonchalance with which the left views public policy is not always in the best interest of the average citizen.  The right, on the other hand, is just as guilty of blindness to the average American's interests.  As an American you are free to practice the religion of your choosing and to speak what is on your mind, as guaranteed by the 1st Amendment.  These things are not always pleasant, but they are part of the freedoms we enjoy as Americans.  That does not mean you are entitled to act irresponsibly.  Because both ultra-liberals and ultra-conservatives are essentially unwavering in their beliefs, I don't believe either side is right for America.

Why is it that only these two groups get the attention?  I think it is because they are "glamorous," or at least exciting and newsworthy, while centrists are boring.  A centrist, whom you might consider an Independent in American politics, may lean slightly left or slightly right, but is typically more open to the ideas of others, may have some religious leanings whether or not they practice, and, I believe, genuinely views our politics as neither a left nor a right proposition.  That is the very definition of an Independent: neither a Republican nor a Democrat.

I think it is time to put the "ultras" aside and look at what is truly best for America.  There are so many problems to be solved that left/right bickering is a pure waste of time.

What are your thoughts?  I want to know.
