When I read articles like this, it breaks my heart. I think to myself, "What kind of country am I leaving to my children?" Why does America, the greatest nation on this Earth, need to lose its influence in the world?
I purposefully used the word "need" in the previous sentence. It is my opinion that much of this loss of influence is deliberate, brought about by corrupt politicians (both Republican and Democrat) and corrupt special interest groups.
Why is it that my children will not get to realize the American Dream like so many have before them? This, to me, is unacceptable. Americans are better than this. We know that we have been lied to. We are not falling for it anymore. It is time to clean house in our government: at the local, state, and federal levels.
NOVEMBER 3rd, 2010 - Get ready for some real change, America. A restorative change.
Link - US Losing Influence