I’m not sure how to start this thread, but here is a summary of the international events that have prompted me to write it:
1) Increased and open aggression by China towards Vietnam, Philippines and Japan over disputed territories.
2) China’s increasing trade influence in the Middle East.
3) Russia’s annexation of Crimea.
4) The growing humanitarian crisis in the Middle East.
5) The West’s response or lack thereof to all of the above.
I’m slowly coming to the conclusion that the West is losing it. We’ve convinced ourselves that Western aggression has been the cause of many of the world’s problems, a position I’ve generally agreed with myself, particularly with regard to the Western world’s tendency to interfere in the internal affairs of other countries. But now I’m not so sure, and I think this is why:
1) The Arab uprising has been disastrous.
2) China’s embrace of the free market hasn’t resulted in its democratisation. Instead, it has become more openly aggressive.
3) The collapse of the USSR has resulted in a Russia that no longer fears the US, let alone the rest of the West.
It seems that while Western countries prefer their leaders to be democratic and fair in how they govern, much of the rest of the world seems to prefer an iron fist. More disconcerting, though, is the idea that these events would occur regardless of what the West does. Perhaps I’m just an old white guy who dislikes change, but I can’t shake the feeling that what we are witnessing, and I know I’m not the first person to think this, is the rise of a new world order.