America
As a non-American, I appreciate that some may think that I am attacking their country and thus feel protective of it. It is not intended to be an attack. Honest.
So, ahem...
American politicians often refer to America as the greatest force for good in the world, and it doubtless has many strengths (despite the right-wing scum in power, ordinary Americans are free compared to their counterparts in many other countries, for example).
However, some also see America as corrupt, and as interfering where its influence is neither needed nor wanted.
My question to you is thus:
America: force for good or ill?
I don't mean the American people in general; I mean the political and military force that is America: the government, the army, and so on. Is America the land of the free, or is George W. Bush merely a dictator over the "free" world?