The world military topic got me thinking that we haven't really talked much about war in ED, which is odd. I haven't brought up a topic in here for ages (it's been pretty slim all round, actually), so how about this:
Is war human nature?
I don't think it is. Sometimes you have to go to war in self-defence, so in some cases the defender must fight (unless they are suicidal), but the aggressor always has the option of not attacking in the first place. People can choose not to fight; we ALWAYS have the choice.
I don't believe that humans are aggressive and greedy by nature or instinct or anything like that, and in fact I think people who spread this fallacy are the very cause of war and should be severely punished. Saying we are a violent species gives people an excuse to be violent. They think "oh we can't help it, it's human nature." Quite frankly I think that's bollocks, and evil bollocks to boot.
Virtually everything about humans is learnt, I reckon, and I think children could easily be taught that war is evil and never justified. We could easily make it taboo, so that anyone who suggested it was shunned by everyone else. All we would have to do is exercise some discipline and brainwash our kids into hating violence the same way we (hopefully) teach them to hate prejudice, slavery and torture.
For example, in the past it was assumed by everyone that women were inferior to men. One day, after a lot of hard work from pioneering feminists, society as a whole woke up to the idea that this was absurd. Everyone always knew it was a lie; it was obvious to every married man (if he was honest with himself) that his wife was no stupider than he was, but society made people blind to the truth. Of course change is a slow process, but nowadays if a man says that women (or black people or whatever) are inherently inferior, he is regarded at best as a kook. To each successive generation of children it is more and more obvious that the equality of the sexes is self-evident. Eventually it will be so obvious that to question it will elicit disgust and fear, and anyone who does so will be considered insane.
What I think is that we could do the same with war. It's not natural and could be overcome, and it must be if we are to truly call ourselves civilized. I mean, some cultures teach cannibalism or marry adults to children; you can't say that's human nature, though it seems so to the people who come from those cultures. I think war is the same sort of thing.
It would take a while, but eventually we should get to the stage where anyone who suggests war as the solution to a problem is regarded with horror as an evil pervert. Perhaps it's already begun, and the fact that I can conceive of war as an unnatural perversion of human society rather than an integral part of human nature means that war is doomed. I hope so.
What does everyone think of my weird idea? For it to work properly we need globalization, but that seems pretty much inevitable anyway given the current capitalistic model of world economics. Cultures are becoming more and more interchangeable, albeit slowly.