#1
Sic itur ad astra
Hey guys, I thought about posting a CT thread about this, but never got around to it.
Well, as the title states, my question is about the one and only: war. Dictionary.com defines war as "a conflict carried on by force of arms, as between nations or between parties within a nation; warfare, as by land, sea, or air." What I ask you people tonight is: is it worth it?

In my view, there is NO point to war. Even if you defeat your enemy, even if they are weakened, in the end it accomplishes nothing. After the war, the enemy will probably hold an even bigger grudge against you than they did at the start. I also think that once a war starts, it doesn't truly end until both sides are dead, or until they realize their faults and try to negotiate. For example, most wars end because both sides do not want to lose any more people, or to force themselves into poverty.

Quoted from Wikipedia:

The political and economic circumstances in the peace that follows war usually depend on the "facts on the ground". Where evenly matched adversaries decide that the conflict has resulted in a stalemate, they may cease hostilities to avoid further loss of life and property. They may decide to restore the antebellum territorial boundaries, redraw boundaries at the line of military control, or negotiate to keep or exchange captured territory. Negotiations between parties involved at the end of a war often result in a treaty, such as the Treaty of Versailles of 1919, which ended the First World War.

I think you should not even begin a war, because in the end you wind up worse off than when you started fighting. So I ask you people: what is your take on this? Do you believe war is a worthwhile thing that resolves issues? Or do you think it is meaningless, and that it just causes more trouble? Please respond.

-Net
__________________
RIP Steve Van Ness <3