But the West's moral authority was important - if only for our own self-esteem - and it's gone. Whoosh!
Previously we felt the West meant civilisation and glorious empires - more recently, morality and freedom.
Now the West represents nothing much except economic expertise and military force, and even those are questionable.
Gone to Nicaea