

MAVs: The Future of Domestic Surveillance

http://www.youtube.com/watch?v=h2ZA-ecdtfo

Air Force Completes Assassinator Robot Wasp Project

 



Robots go to war: American insect Terminators

http://www.youtube.com/watch?v=hP7FtwEejVI

Packs Of Robots Will Hunt Uncooperative Humans

Dragonfly or Insect Spy? Scientists at Work on Robobugs

Flying Taser Saucer To Become A Reality

A.I. War Machines a “Threat to Humanity”

 



Should we fear neuro-war more than normal war?

FP
September 7, 2009

A new opinion piece in Nature (ungated version via a somewhat dubious website) takes biologists to task for allowing the militarization of their work for the development of neuro-weapons — chemical agents that are weaponized in spray or gas form to induce altered mental states.

The Russian military’s use of fentanyl to incapacitate Chechen terrorists — and kill 120 hostages in the process — during the 2002 Nord-Ost siege was something of a wakeup call in this area. It’s no secret that the U.S. and other militaries are interested in these potential weapons (I wrote about a 2008 DoD-commissioned study on cognitive enhancement and mind control last November). According to the Nature story, some companies are now marketing oxytocin based on studies showing that in spray form, it can increase feelings of trust in humans, an application discussed in the 2008 study.

Blogger Ryan Sager wonders what would have happened if the Iranian government had had such a weapon during this summer’s protests. He continues:

Now, some would argue that the use of non-lethal agents is potentially desirable. After all, the alternative is lethal measures. But the author of the opinion piece, Malcolm Dando, professor of International Security in the Department of Peace Studies at Bradford University in the UK, doesn’t see it that way:

At the Nord-Ost siege, for instance, terrorists exposed to the fentanyl mixture were shot dead rather than arrested. Likewise, in Vietnam, the US military used vast quantities of CS gas — a ‘non-lethal’ riot-control agent — to increase the effectiveness of conventional weapons by flushing the Viet Cong out of their hiding places.

While we might want to believe that we would use such weapons ethically going forward, the idea of a dictator in possession of such weapons is rather chilling — moving into science-fiction-dystopia territory.

I suppose. Though I think I’m going to continue to be most worried about them having nuclear weapons. The Iranian regime rigged an election; killed and tortured hundreds of protesters; and coerced opposition leaders into giving false confessions. I don’t think it would have been that much worse if they had had weaponized oxytocin in their hands.

Sager is right that this is a topic worthy of debate, but I find it strange that research on weapons designed to incapacitate or disorient the enemy seems to disturb people a lot more than research on weapons designed to kill them. As for the idea that neurological agents could facilitate other abuses, Kelly Lowenberg writes on the blog of the Stanford Center for Law and the Neurosciences:

Or is our real concern that, by incapacitating, they facilitate brutality toward a defenseless prisoner? If so, then the conversation should be about illegal soldier/police abuse, not the chemical agents themselves.

I think this is right. New technology, as it always does, is going to provoke new debates on the right to privacy, the treatment of prisoners, and the laws of war, but the basic principles that underlie that debate shouldn’t change because the weapons have.

 



Packs Of Robots Will Hunt Uncooperative Humans

New Scientist
October 23, 2008

The latest request from the Pentagon jars the senses. At least, it did mine. They are looking for contractors to provide a “Multi-Robot Pursuit System” that will let packs of robots “search for and detect a non-cooperative human”.

One thing that really bugs defence chiefs is having their troops diverted from other duties to control robots. So having a pack of them controlled by one person makes logistical sense. But I’m concerned about where this technology will end up.
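
To make that control pattern concrete, here is a minimal sketch of the "one operator, many robots" idea: the operator designates a single point, and every robot in the pack closes on it independently. Everything here — the grid world, the Robot class, the greedy movement rule — is an illustrative assumption, not anything taken from the Pentagon solicitation.

```python
# Illustrative only: a toy version of single-operator, multi-robot control.
# The grid world and greedy movement rule are assumptions for the sketch.
from dataclasses import dataclass

@dataclass
class Robot:
    x: int
    y: int

    def step_toward(self, tx: int, ty: int) -> None:
        # Move one cell toward the operator-designated point.
        self.x += (tx > self.x) - (tx < self.x)
        self.y += (ty > self.y) - (ty < self.y)

def pursue(pack: list[Robot], target: tuple[int, int], max_steps: int = 50) -> int:
    """Advance the whole pack toward one designated point; return the
    step at which the first robot reaches it, or -1 on timeout."""
    tx, ty = target
    for step in range(1, max_steps + 1):
        for robot in pack:
            robot.step_toward(tx, ty)
            if (robot.x, robot.y) == (tx, ty):
                return step
    return -1

# One operator command ("search near (5, 5)") drives the whole pack.
pack = [Robot(0, 0), Robot(9, 0), Robot(0, 9)]
print(pursue(pack, target=(5, 5)))  # -> 5
```

The appeal, presumably, is that operator workload stays constant as the pack grows: one designation fans out to any number of robots.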

Given that iRobot last year struck a deal with Taser International to mount stun weapons on its military robots, how long before we see packs of droids hunting down pesky demonstrators with paralysing weapons? Or could the packs even be lethally armed? I asked two experts on automated weapons what they thought; their answers follow below.

Both were concerned that packs of robots would be entrusted with tasks – and weapons – they were not up to handling without making wrong decisions.

Steve Wright of Leeds Metropolitan University is an expert on police and military technologies, and last year correctly predicted this pack-hunting mode of operation would happen. “The giveaway here is the phrase ’a non-cooperative human subject’,” he told me:

“What we have here are the beginnings of something designed to enable robots to hunt down humans like a pack of dogs. Once the software is perfected we can reasonably anticipate that they will become autonomous and become armed.

We can also expect such systems to be equipped with human detection and tracking devices including sensors which detect human breath and the radio waves associated with a human heart beat. These are technologies already developed.”
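
For a sense of how the breath and heartbeat sensing Wright describes can work, here is a minimal sketch of continuous-wave Doppler vital-sign detection, a technique that does exist in the research literature. Every parameter below (carrier, chest displacement, breathing rate, noise level) is an assumption for illustration: chest-wall motion phase-modulates the reflected carrier, and the spectrum of the recovered phase peaks at the breathing rate.

```python
# Minimal sketch (all parameters assumed) of CW Doppler vital-sign sensing.
import numpy as np

fs = 100.0                       # baseband sample rate, Hz
t = np.arange(0, 32, 1 / fs)     # 32 s observation window
f_breath = 0.25                  # chest motion at 15 breaths/min, Hz
amp = 0.005                      # ~5 mm chest displacement, m
wavelength = 0.125               # assumed 2.4 GHz carrier, m

# The reflected carrier picks up a phase of 4*pi*d(t)/lambda (two-way path).
displacement = amp * np.sin(2 * np.pi * f_breath * t)
phase = 4 * np.pi * displacement / wavelength
rng = np.random.default_rng(0)
baseband = np.exp(1j * phase) + 0.05 * (rng.standard_normal(t.size)
                                        + 1j * rng.standard_normal(t.size))

# Demodulate: the unwrapped phase tracks chest motion; an FFT finds its rate.
recovered = np.unwrap(np.angle(baseband))
recovered -= recovered.mean()
spectrum = np.abs(np.fft.rfft(recovered))
freqs = np.fft.rfftfreq(recovered.size, 1 / fs)
print(f"estimated breathing rate: {freqs[spectrum.argmax()] * 60:.0f} breaths/min")
```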

Another commentator often in the news for his views on military robot autonomy is Noel Sharkey, an AI and robotics engineer at the University of Sheffield. He says he can understand why the military want such technology, but also worries it will be used irresponsibly.

“This is a clear step towards one of the main goals of the US Army’s Future Combat Systems project, which aims to make a single soldier the nexus for a large scale robot attack. Independently, ground and aerial robots have been tested together and once the bits are joined, there will be a robot force under command of a single soldier with potentially dire consequences for innocents around the corner.”

What do you make of this? Are we letting our militaries run technologically amok with our tax dollars? Or can robot soldiers be programmed to be even more ethical than human ones, as some researchers claim?

 



DARPA Wants Real-Time Images of Inside Your House

Wired Magazine
October 23, 2008

The Pentagon wants to be able to peer inside your apartment building — picking out where all the major rooms, stairways, and dens of evil-doers are.

The U.S. military is getting better and better at spotting its enemies, when they’re roaming around the streets. But once those foes duck into houses, they become a whole lot harder to spot. That’s why Darpa, the Defense Department’s way-out research arm, is looking to develop a suite of tools for “external sensing deep inside buildings.” The ultimate goal of this Harnessing Infrastructure for Building Reconnaissance (HIBR) project: “reverse the adversaries’ advantage of urban familiarity and sanctuary and provide U.S. Forces with complete above- and below-ground awareness.”

By the end of the project, Darpa wants a set of technologies that can see into a 10-story building with a two-level basement in a “high-density urban block” — and produce a kind of digital blueprint of the place. Using sensors mounted on backpacks, vehicles, or aircraft, the HIBR gear would, hopefully, be able to pick out every room, wall, stairway, and basement in the building — as well as all of the “electrical, plumbing, and installation systems.”

Darpa doesn’t come out and say it openly. But it appears that the agency wants these HIBR gadgets to be able to track the people inside these buildings, as well. Why else would these sensors be required to “provide real-time updates” once U.S. troops enter the building? Perhaps there’s more about the people-spotting tech, in the “classified appendix” to HIBR’s request for proposals.

There are already a number of efforts underway, both military and civilian, to try to see inside buildings. The Army has a couple of hand-held gadgets that can spot people just on the other side of a wall. Some scientists claim they can even catch human breathing and heartbeats beyond a barrier.

Darpa’s Visibuilding program uses a kind of radar to scan structures. The problem isn’t sending the radio frequency (RF) energy in. It’s “making sense of the data produced from all the reflected signals” that come back, Henry Kenyon wrote in a recent Signal magazine article. Besides processing data from inside a structure, the system also must filter a large amount of RF propagation in the form of randomly reflected signals. Although radar technologies exist that can track people in adjacent rooms, it is much more difficult to map an entire building. “Going through one wall is not that bad, but a building is basically an RF hall of mirrors. You’ve got signals bouncing all over the place,” Darpa program manager Dr. Edward J. Baranoski says. Field trials are supposed to get underway this fall.
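
Baranoski’s “hall of mirrors” is easy to reproduce in simulation. The toy sketch below (all parameters are assumptions, not Visibuilding specifics) runs a matched filter over a simulated pulse-echo return: two real walls show up at their true ranges, but a multipath bounce produces an equally convincing ghost at a range where nothing exists — exactly the disambiguation problem the program has to solve.

```python
# Toy "RF hall of mirrors" demo: matched filtering resolves real wall
# echoes, but a multipath bounce appears as a ghost reflector.
import numpy as np
from scipy.signal import find_peaks

c = 3e8                           # propagation speed, m/s
fs = 2e9                          # sample rate, Hz
n = 400                           # 200 ns listening window
tc = np.arange(200) / fs          # 100 ns transmit chirp
chirp = np.sin(2 * np.pi * (50e6 + 2e15 * tc) * tc)

def add_echo(rx, range_m, gain):
    """Add a delayed, attenuated copy of the chirp for a reflector."""
    delay = int(round(2 * range_m / c * fs))  # two-way delay in samples
    rx[delay:delay + chirp.size] += gain * chirp

rx = 0.02 * np.random.default_rng(1).standard_normal(n)
add_echo(rx, 3.0, 1.0)    # front wall
add_echo(rx, 7.0, 0.5)    # interior wall
add_echo(rx, 12.0, 0.3)   # multipath bounce: a ghost, no wall is there

# Matched filter: correlate the return against the transmitted chirp.
profile = np.abs(np.correlate(rx, chirp, mode="valid"))
ranges = np.arange(profile.size) / fs * c / 2
peaks, _ = find_peaks(profile, height=0.25 * profile.max(), distance=20)
for idx in peaks:
    print(f"reflector (real or ghost?) near {ranges[idx]:.1f} m")
```

The matched filter itself cannot tell the third peak is fake; sorting real geometry from bounce paths is where the hard processing lies.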

 



Future Drugs Will Make Troops Want to Fight
Technologies to picture what someone is thinking, drugs that give soldiers super-human power and awareness, robots controlled with the brain, and land-mines that release drugs to incapacitate suspects are all in the works.

Wired
August 13, 2008

Drugs that make soldiers want to fight. Robots linked directly to their controllers’ brains. Lie-detecting scans administered to terrorist suspects as they cross U.S. borders.

These are just a few of the military uses imagined for cognitive science — and if it’s not yet certain whether the technologies will work, the military is certainly taking them very seriously.

“It’s way too early to know which — if any — of these technologies is going to be practical,” said Jonathan Moreno, a Center for American Progress bioethicist and author of Mind Wars: Brain Research and National Defense. “But it’s important for us to get ahead of the curve. Soldiers are always on the cutting edge of new technologies.”

Moreno is part of a National Research Council committee convened by the Department of Defense to evaluate the military potential of brain science. Their report, “Emerging Cognitive Neuroscience and Related Technologies,” was released today. It charts a range of cognitive technologies that are potentially powerful — and, perhaps, powerfully troubling.

Here are the report’s main areas of focus:

  • Mind reading. The development of psychological models and neurological imaging has made it possible to see what people are thinking and whether they’re lying. The science is, however, still in its infancy: Challenges remain in accounting for variations between individual brains, and the tendency of our brains to change over time.

    One important application is lie detection — though one hopes that the lesson of traditional lie detectors, predicated on the now-disproven idea that the physiological basis of lying can be separated from processes such as anxiety, has been learned.

    Mind readers could be used to interrogate captured enemies, as well as “terrorist suspects” passing through customs. But does this mean, for example, that travelers placed on the bloated, mistake-laden watchlist would have their minds scanned, just as their computers will be?

    The report notes that “In situations where it is important to win the hearts and minds of the local populace, it would be useful to know if they understand the information being given them.”

  • Cognitive enhancement. Arguably the most developed area of cognitive neuroscience, with drugs already allowing soldiers to stay awake and alert for days at a time, and brain-altering drugs in widespread use among civilians diagnosed with mental and behavioral problems.

    Improved drug delivery systems and improved neurological understanding could make today’s drugs seem rudimentary, giving soldiers superhuman strength and awareness — but if a drug can be designed to increase an ability, a drug can also be designed to destroy it.

    “It’s also important to develop antidotes and protective agents against various classes of drugs,” says the report. This echoes the motivation of much federal biodefense research, in which designing defenses against potential bioterror agents requires those agents to be made — and that raises the possibility of our own weapons being turned against us, as with the post-9/11 anthrax attacks, which used a military-developed strain.

  • Mind control. Largely pharmaceutical, for the moment, and a natural outgrowth of cognitive enhancement approaches and mind-reading insight: If we can alter the brain, why not control it?

    One potential use involves making soldiers want to fight. Conversely, “How can we disrupt the enemy’s motivation to fight? […] How can we make people trust us more? What if we could help the brain to remove fear or pain? Is there a way to make the enemy obey our commands?”
  • Brain-Machine Interfaces. The report focuses on direct brain-to-machine systems (rather than, for example, systems that are controlled by visual movements, which are already in limited use by paraplegics). Among these are robotic prostheses that replace or extend body parts; cognitive and sensory prostheses, which make it possible to think and to perceive in entirely new ways; and robotic or software assistants, which would do the same thing, but from a distance.

    Many questions surround the safety of current brain-machine interfaces: The union of metal and flesh only lasts so long before things break down. But assuming those problems can be overcome, questions of plasticity arise: What happens when a soldier leaves the service? How might their brains be reshaped by their experience?

Like Moreno said, it’s too early to say what will work. The report documents in great detail the practical obstacles to these aims — not least the failure of reductionist neuroscientific models, in which a few firing neurons can be easily mapped to a psychological state, and brains can be analyzed in one-map-fits-all fashion.

But given the rapid progress of cognitive science, it’s foolish to assume that obstacles won’t be overcome. Hugh Gusterson, a George Mason University anthropologist and critic of the military’s sponsorship of social science research, says their attempt to crack the cultural code is unlikely to work — “but my sense with neuroscience,” he said, “is a far more realistic ambition.”

Gusterson is deeply pessimistic about military neuroscience, which will not be limited to the United States.

“I think most reasonable people, if they imagine a world in which all sides have figured out how to control brains, they’d rather not go there,” he said. “Most rational human beings would believe that if we could have a world where nobody does military neuroscience, we’ll all be better off. But for some people in the Pentagon, it’s too delicious to ignore.”

 

Brain will be battlefield of future, warns US intelligence report

The Guardian
August 14, 2008

Rapid advances in neuroscience could have a dramatic impact on national security and the way in which future wars are fought, US intelligence officials have been told.

In a report commissioned by the Defense Intelligence Agency, leading scientists were asked to examine how a greater understanding of the brain over the next 20 years is likely to drive the development of new medicines and technologies.

They found several areas in which progress could have a profound impact, including behaviour-altering drugs, scanners that can interpret a person’s state of mind and devices capable of boosting senses such as hearing and vision.

On the battlefield, bullets may be replaced with “pharmacological land mines” that release drugs to incapacitate soldiers on contact, while scanners and other electronic devices could be developed to identify suspects from their brain activity and even disrupt their ability to tell lies when questioned, the report says.

“The concept of torture could also be altered by products in this market. It is possible that some day there could be a technique developed to extract information from a prisoner that does not have any lasting side effects,” the report states.

The report highlights one electronic technique, called transcranial direct current stimulation, which involves using electrical pulses to interfere with the firing of neurons in the brain and has been shown to delay a person’s ability to tell a lie.

Drugs could also be used to enhance the performance of military personnel. There is already anecdotal evidence of troops using the narcolepsy drug modafinil, and Ritalin, which is prescribed for attention deficit disorder, to boost their performance. Future drugs, developed to boost the cognitive faculties of people with dementia, are likely to be used in a similar way, the report adds.

Greater understanding of the brain’s workings is also expected to usher in new devices that link directly to the brain, either to allow operators to control machinery with their minds, such as flying unmanned reconnaissance drones, or to boost their natural senses.

For example, video from a person’s glasses, or audio recorded from a headset, could be processed by a computer to help search for relevant information. “Experiments indicate that the advantages of these devices are such that human operators will be greatly enhanced for things like photo reconnaissance and so on,” Kit Green, who chaired the report committee, said.

The report warns that while the US and other western nations might now consider themselves at the forefront of neuroscience, that is likely to change as other countries ramp up their computing capabilities. Unless security services can monitor progress internationally, they risk “major, even catastrophic, intelligence failures in the years ahead”, the report warns.

“In the intelligence community, there is an extremely small number of people who understand the science and without that it’s going to be impossible to predict surprises. This is a black hole that needs to be filled with light,” Green told the Guardian.

The technologies will one day have applications in counter-terrorism and crime-fighting. The report says brain imaging will not improve sufficiently in the next 20 years to read people’s intentions from afar and spot criminals before they act, but it might be good enough to help identify people at a checkpoint or counter who are afraid or anxious.

“We’re not going to be reading minds at a distance, but that doesn’t mean we can’t detect gross changes in anxiety or fear, and then subsequently talk to those individuals to see what’s upsetting them,” Green said.

The development of advanced surveillance techniques, such as cameras that can spot fearful expressions on people’s faces, could lead to some inventive ways to fool them, the report adds, such as Botox injections to relax facial muscles.

Land-mines that release drugs to incapacitate an enemy
http://www.guardian.co.uk/science/2008/aug/13/military.neuroscience

Future Wars To Be Fought With Mind Drugs
http://www.roguegovernment.com/news.php?id=11432

 



A ’Frankenrobot’ with a biological brain

http://www.youtube.com/watch?v=aAtL3d6igjw

Uncle Sam Wants Your Brain
http://blog.wired.com/wiredscience/2008/08/uncle-sam-wants.html

Military Report Touts Brain Altering Drugs, Mind Control To Make Soldiers Want To Fight
http://infowars.net/articles/august2008/140808Soldiers.htm

 



Microwave Gun Makes People Hear Things

Wired
July 7, 2008

The U.S. military bankrolled early development of a non-lethal microwave weapon that creates sound inside your head. But in the end, the gadget may be just as likely to wind up in shopping malls as on battlefields, as I report in New Scientist.

The project is known as MEDUSA – a contrived acronym for Mob Excess Deterrent Using Silent Audio. And it should not be confused with the Long Range Acoustic Device and similar gadgets which simply project sound. This one uses the so-called “microwave auditory effect”: a beam of microwaves is turned into sound by the interaction with your head. Nobody else can hear it unless they are in the beam as well.
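
A toy model shows why a pulsed beam would be perceived as sound with a definite pitch. If each microwave pulse produces one brief thermoacoustic click in the head (the click shape below is purely an assumption), then a pulse train repeated at an audio rate yields a spectral comb whose fundamental — and hence the perceived pitch — sits at the pulse repetition frequency.

```python
# Toy model of the microwave auditory effect's pitch: one click per pulse,
# clicks repeated at an audio-rate PRF, perceived tone at the PRF.
import numpy as np

fs = 44_100            # audio sample rate, Hz
prf = 441.0            # pulse repetition frequency -> perceived pitch, Hz
n = int(fs * 1.0)      # one second of "audio"

# One pulse -> one damped click (the click shape is an assumption).
click = np.exp(-np.arange(200) / fs * 8000.0)

# Superpose a click at every pulse instant.
audio = np.zeros(n)
for start in range(0, n, int(fs / prf)):       # a click every 100 samples
    end = min(start + click.size, n)
    audio[start:end] += click[:end - start]

# The spectrum is a comb whose fundamental sits at the PRF: the pitch.
spectrum = np.abs(np.fft.rfft(audio))
freqs = np.fft.rfftfreq(n, 1 / fs)
band = (freqs > 100) & (freqs < 2000)          # look in the audible band
print(f"strongest line: {freqs[band][spectrum[band].argmax()]:.0f} Hz")  # ~441
```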

The effect has long been a laboratory curiosity, with no application. But, over the years, the military has been intrigued. The idea (dubbed “the telepathic ray gun”) was mentioned in a 1998 US Army study, which turned up in a recent Freedom of Information Act document dump. Five years later, the Navy decided to put some R&D dollars into the project. Now, as I note on the New Scientist website, Dr. Lev Sadovnik of the Sierra Nevada Corporation has provided more details.

There are health risks, he notes. But the biggest issue with the microwave weapon is not the radiation. It’s the risk of brain damage from the high-intensity shockwave created by the microwave pulse. Clearly, much more research is needed on this effect at the sort of power levels that Dr. Sadovnik is proposing. But if it does prove hazardous, that does not mean an end to weapons research in this area: a device that delivered a lethal shockwave inside the target’s skull might make an effective death ray.

Dr. Sadovnik also makes the intriguing suggestion that, instead of being used at high power to create an intolerable noise, it might be used at low power to produce a whisper that was too quiet to perceive consciously but might be able to subconsciously influence someone. The directional beam could be used for targeted messages, such as in-store promotions. Sadovnik even suggests subliminal advertising, beaming information that is not consciously heard (a notion also spotted on the US Army’s voice-to-skull page). While the effectiveness of subliminal persuasion is dubious, I can see there might be some organizations interested in this capability. And if that doesn’t work, you could always point the thing at birds. They seem to be highly sensitive to microwave audio, so it might be used to scare flocks away from wind farms — or shoo pigeons from city streets.

 

US wants sci-fi killer robots for terror fight

Scotsman
July 6, 2008

KILLER robots which can change their shape to squeeze under doors and through cracks in walls to track their prey are moving from the realms of science fiction to the front line in the fight against terrorism.

The US military has signed a £1.6m deal with a technology firm to design robots which are intelligent enough to work out how to wiggle through small spaces to reach their target.

The action film Terminator 2: Judgment Day featured a seemingly unstoppable killer robot played by Robert Patrick. The machine was made from liquid metal and could change its form to slide under doors and walk through iron bars.

America’s Defense Advanced Research Projects Agency (Darpa) and the Army Research Office have awarded the contract to iRobot, which has developed other robots for the military.

They want scientists to come up with a design for a tiny robot able to move under its own power and change shape so it can get through gaps less than half an inch wide.

The US administration has not said what it wants the robot to do, but its specification says: “Often the only available points of entry are small openings in buildings, walls, under doors, etc. In these cases, a robot must be soft enough to squeeze or traverse through small openings, yet large enough to carry an operationally meaningful payload.”

In an effort to inspire creative ideas, the US military has pointed to examples in nature of creatures which are able to squeeze through narrow gaps and change their form.

Helen Greiner, co-founder and chairwoman of iRobot, said: “Through this programme, robots that reconstitute size, shape and functionality after traversal through complex environments will transcend the pages of science fiction to become real tools for soldiers in theatre.”

But Scottish-based experts believe the challenge may be too much even for the US military’s budgets and technology.

Mike Cates, professor of physics at Edinburgh University, said: “There are materials which can change their shapes and then regain them. There are alloys, known as memory metals, which are used in glasses and which can regain their shape. The difficulty in this case is all the other elements which need to be added to a device such as this, such as the circuitry and some form of system to propel it.”

Brian Baglow, of technology firm Indoctrimat, said: “As well as designing the materials for this, the sensor systems will be a problem. It’s not easy for them to work out where the gaps are which they can get through.”

‘Invisible Wars’ of the Future: E-Bombs, Laser Guns and Acoustic Weapons
http://www.globalresearch.ca/index.p..code=20080706&articleId=9522

Army Yanks ’Voice-To-Skull Devices’ Site
http://blog.wired.com/defense/20..y-removes-pa.html

The Other MEDUSA: A Microwave Sound Weapon
http://blog.wired.com/defense/2007/08/the-other-medus.html

US ’Sonic Blasters’ Sold To China
http://blog.wired.com/defense/2008/05/us-sonic-blaste.html

Protesters Panic Over ’Crap Cannon’
http://blog.wired.com/defense/2008/..esters-fear.html

I Was a Sonic Blaster Guinea Pig
http://blog.wired.co../i-was-a-puke-ra.html

Acoustic “Device” or Acoustic Weapon?
http://blog.wired.com/defense/2007/05/acoustic_device.html

 



Parents Who Don’t Send Kids To School Face Jail

Parents face jail for truant kids under new laws

Australia Telegraph
April 1, 2008

PARENTS who fail to send their children to school could be jailed for up to two years, under draconian new anti-truancy laws to be introduced to parliament today.

For the first time, powers will be granted to the Department of Education to ask for court orders forcing parents to enrol their children at school.

And magistrates will be empowered to impose jail sentences for parents of habitual truants and fine them up to $10,000.

Read Full Article Here

 

Crazed cop shoots mother and 8-year-old in traffic dispute

LA Times
March 29, 2008

The Oceanside Police Department on Thursday defended its investigation into an incident in which an off-duty San Diego police officer shot a woman and her 8-year-old son after a traffic altercation.

Oceanside Police Capt. Tom Aguigui said investigators are still trying to figure out what led Officer Frank White to fire five shots at Oceanside resident Rachel Silva’s car in a mall parking lot.

White was not arrested or tested for drugs or alcohol. But he was questioned after the shooting, which occurred about 9:15 p.m. on March 15.

Read Full Article Here

Man Dies After Stun Gun Incident
http://www.wibc.com/News/Story.aspx?ID=87798

Taser death probed in Topeka
http://www.nebraska.tv/Global/story.asp?S=8098669

Video Of Woman Pushed Down Stairs Prompts Calls For Charges Against Officer
http://www.local6.com/news/15757468/detail.html

Surveillance Robots Designed For Police Use
http://rawstory.com/news/2008/S..ots_designed_for_police_0328.html

The shocking picture that shows police will do ANYTHING to hide speed cameras from unsuspecting motorists
http://www.dailymail.co.uk/pages/live/a..51188&in_page_id=1770

 



BigDog: The Future Of Policing In America?

http://www.youtube.com/watch?v=mmVaLp8icoU

 



A.I. War Machines a “Threat to Humanity”

AFP
February 27, 2008


Increasingly autonomous, gun-toting robots developed for warfare could easily fall into the hands of terrorists and may one day unleash a robot arms race, a top expert on artificial intelligence told AFP.

“They pose a threat to humanity,” said University of Sheffield professor Noel Sharkey ahead of a keynote address Wednesday before Britain’s Royal United Services Institute.

Intelligent machines deployed on battlefields around the world — from mobile grenade launchers to rocket-firing drones — can already identify and lock onto targets without human help.

There are more than 4,000 US military robots on the ground in Iraq, as well as unmanned aircraft that have clocked hundreds of thousands of flight hours.

The first three armed combat robots fitted with large-caliber machine guns, manufactured by US arms maker Foster-Miller, were deployed to Iraq last summer and proved so successful that 80 more are on order, said Sharkey.

But up to now, a human hand has always been required to push the button or pull the trigger.

If we are not careful, he said, that could change.

Military leaders “are quite clear that they want autonomous robots as soon as possible, because they are more cost-effective and give a risk-free war,” he said.

Several countries, led by the United States, have already invested heavily in robot warriors developed for use on the battlefield.

South Korea and Israel both deploy armed robot border guards, while China, India, Russia and Britain have all increased the use of military robots.

Washington plans to spend four billion dollars by 2010 on unmanned technology systems, with total spending expected to rise to 24 billion dollars, according to the Department of Defense’s Unmanned Systems Roadmap 2007-2032, released in December.

James Canton, an expert on technology innovation and CEO of the Institute for Global Futures, predicts the deployment within a decade of detachments that will include 150 soldiers and 2,000 robots.

The use of such devices by terrorists should be a serious concern, said Sharkey.

Captured robots would not be difficult to reverse engineer, and could easily replace suicide bombers as the weapon of choice. “I don’t know why that has not happened already,” he said.

But even more worrisome, he continued, is the subtle progression from the semi-autonomous military robots deployed today to fully independent killing machines.

“I have worked in artificial intelligence for decades, and the idea of a robot making decisions about human termination terrifies me,” Sharkey said.

Ronald Arkin of Georgia Institute of Technology, who has worked closely with the US military on robotics, agrees that the shift towards autonomy will be gradual.

But he is not convinced that robots don’t have a place on the front line.

“Robotics systems may have the potential to out-perform humans from a perspective of the laws of war and the rules of engagement,” he told a conference on technology in warfare at Stanford University last month.

The sensors of intelligent machines, he argued, may ultimately be better equipped to understand an environment and to process information. “And there are no emotions that can cloud judgement, such as anger,” he added.

Nor is there any inherent right to self-defence.

For now, however, there remain several barriers to the creation and deployment of Terminator-like killing machines.

Some are technical. Teaching a computer-driven machine — even an intelligent one — how to distinguish between civilians and combatants, or how to gauge a proportional response as mandated by the Geneva Conventions, is simply beyond the reach of artificial intelligence today.

But even if technical barriers are overcome, the prospect of armies increasingly dependent on remotely-controlled or autonomous robots raises a host of ethical issues that have barely been addressed.

Arkin points out that the US Department of Defense’s 230 billion dollar Future Combat Systems programme — the largest military contract in US history — provides for three classes of aerial and three land-based robotics systems.

“But nowhere is there any consideration of the ethical implications of the weaponisation of these systems,” he said.

For Sharkey, the best solution may be an outright ban on autonomous weapons systems. “We have to say where we want to draw the line and what we want to do — and then get an international agreement,” he said.

Killer Robots Coming Soon to a City Near You
http://thought-criminal.org/article/node/1339

Living Neural Networks Could Drive War Machines
http://www.thought-criminal.org/article/node/1335

Robot wars ‘will be a reality within 10 years’
http://www.telegraph.co.uk/earth/m..008/02/27/scirobots127.xml