Roy Schestowitz <newsgroups@xxxxxxxxxxxxxxx> espoused:
> __/ [ Marshall ] on Tuesday 15 August 2006 13:47 \__
>
>> Roy Schestowitz wrote:
>>> Open source project adds "no military use" clause to the GPL
>>>
>>> ,----[ Quote ]
>>> | That's intriguing enough, but the really interesting thing about GPU is
>>> | the license its developers have given it. They call it a "no military
>>> | use" modified version of the GNU General Public License (GPL).
>>> `----
>>>
>>> http://www.newsforge.com/article.pl?sid=06/08/14/1438204
>>
>> I'm a practical person living in a world full of rogue predatory sharks,
>> and what helps to keep those sharks at bay is my country's military, which
>> I trust. As much as I want to see OSS succeed, I will not for the
>> foreseeable future agree with the stance of those who fixate on an
>> unrealistic and badly timed ideal.
>>
>> I have not forgotten that the military is served by flesh and blood
>> people that have families that they want to go home to in one piece.
>> Everything that helps to that end is what I want them to have.
>
> I have always feared the day when robots (not necessarily
> human lookalikes) will replace humans on the battlefield,
> but also kill the people who run them (think Terminator and
> classic apocalypse films). If the robots are autonomous,
> there is also the possibility of accidents--robots running
> amok, shooting innocent bystanders. All in all, I hope some
> legislation bans military robots, but temptation leaves
> little chance of this ever becoming a reality. Think, for
> example, about nuclear treaties and the end of the Cold War.
> Despite it all, there are many countries that attempt to
> harness the power of the hydrogen bomb. And returning to the
> subject of fighting robots, I believe that the Japanese have
> done some work in the area and maybe have some prototypes.
> But I can't recall for sure... smart bombs are half-way
> there. And for those who can't afford /smart/ bombs, there is
> artificial intelligence -- a suicide bomber with a 'trigger'.
One of the arguments Asimov used in his seminal Robot novels was that
the very presence of the robots, governed by his three laws, actually
put an end to petty violence, as the robots would always prevent it,
whilst minimising harm to the involved parties.
It was interesting to see "Data" in Star Trek: TNG, as the Enterprise is
basically a fighting ship (the Prime Directive sounds great, but there are
still phasers and photon torpedoes!), so the impact on him of contributing
to the death of people of some kind might've been an interesting area
to explore. In one of Asimov's books, the definition of "human" was so
narrow that robots could actually kill people, because the people they
were killing fell outside that definition. This would suit our racists
here very well indeed, of course, and is not all that far from the
pervading attitudes of 50-100 years ago.
So who would police the "definition" of a human?
--
| Mark Kent -- mark at ellandroad dot demon dot co dot uk |
After your lover has gone you will still have PEANUT BUTTER!