America is not the greatest nation on the planet.
Let's see... the US Government is a totally crooked big-brother government; it's a very violent country; in most states the laws and rules are incredibly strict; the economics are designed to fuck over the poor and create a slave class out of them; throughout much of the USA there's an ignorant, boneheaded culture of morons; opportunities to make big money are not plentiful anymore; and the people are often fairly cold and withdrawn, with a "leave me alone" kind of attitude...
Most people who say America is the greatest country on Earth are not well traveled. They're just local yokels, and all they know is the US.
Why exactly is the US the best country?