noworldsystem.com


MAVs: The Future of Domestic Surveillance


http://www.youtube.com/watch?v=h2ZA-ecdtfo

Air Force Completes Assassinator Robot Wasp Project

 



Robots go to war: American insect Terminators


http://www.youtube.com/watch?v=hP7FtwEejVI

Packs Of Robots Will Hunt Uncooperative Humans

Dragonfly or Insect Spy? Scientists at Work on Robobugs

Flying Taser Saucer To Become A Reality

A.I. War Machines a “Threat to Humanity”

 



Should we fear neuro-war more than normal war?


FP
September 7, 2009

A new opinion piece in Nature (ungated version via a somewhat dubious website) takes biologists to task for allowing the militarization of their work for the development of neuro-weapons — chemical agents that are weaponized in spray or gas form to induce altered mental states.

The Russian military’s use of fentanyl to incapacitate Chechen terrorists — and kill 120 hostages in the process — during the 2002 Nord-Ost siege was something of a wakeup call in this area. It’s no secret that the U.S. and other militaries are interested in these potential weapons (I wrote about a 2008 DoD-commissioned study on cognitive enhancement and mind control last November). According to the Nature story, some companies are now marketing oxytocin based on studies showing that in spray form, it can increase feelings of trust in humans, an application discussed in the 2008 study.

Blogger Ryan Sager wonders what would have happened if the Iranian government had had such a weapon during this summer’s protests. He continues:

Now, some would argue that the use of non-lethal agents is potentially desirable. After all, the alternative is lethal measures. But the author of the opinion piece, Malcolm Dando, professor of International Security in the Department of Peace Studies at Bradford University in the UK, doesn’t see it that way:

At the Nord-Ost siege, for instance, terrorists exposed to the fentanyl mixture were shot dead rather than arrested. Likewise, in Vietnam, the US military used vast quantities of CS gas — a ‘non-lethal’ riot-control agent — to increase the effectiveness of conventional weapons by flushing the Viet Cong out of their hiding places.

While we might want to believe that we would use such weapons ethically going forward, the idea of a dictator in possession of such weapons is rather chilling — moving into science-fiction-dystopia territory.

I suppose. Though I think I’m going to continue to be most worried about them having nuclear weapons. The Iranian regime rigged an election; killed and tortured hundreds of protesters; and coerced opposition leaders into giving false confessions. I don’t think it would have been that much worse if they had had weaponized oxytocin on hand.

Sager is right that this is a topic worthy of debate, but I find it strange that research on weapons designed to incapacitate or disorient the enemy seems to disturb people a lot more than research on weapons designed to kill them. As for the idea that neurological agents could facilitate other abuses, Kelly Lowenberg writes on the blog of the Stanford Center for Law and the Neurosciences:

Or is our real concern that, by incapacitating, they facilitate brutality toward a defenseless prisoner? If so, then the conversation should be about illegal soldier/police abuse, not the chemical agents themselves.

I think this is right. New technology, as it always does, is going to provoke new debates on the right to privacy, the treatment of prisoners, and the laws of war, but the basic principles that underlie that debate shouldn’t change because the weapons have.

 



Packs Of Robots Will Hunt Uncooperative Humans


New Scientist
October 23, 2008

The latest request from the Pentagon jars the senses. At least, it did mine. They are looking for contractors to provide a “Multi-Robot Pursuit System” that will let packs of robots “search for and detect a non-cooperative human”.

One thing that really bugs defence chiefs is having their troops diverted from other duties to control robots. So having a pack of them controlled by one person makes logistical sense. But I’m concerned about where this technology will end up.

Given that iRobot last year struck a deal with Taser International to mount stun weapons on its military robots, how long before we see packs of droids hunting down pesky demonstrators with paralysing weapons? Or could the packs even be lethally armed? I asked two experts on automated weapons what they thought.

Both were concerned that packs of robots would be entrusted with tasks – and weapons – they were not up to handling without making wrong decisions.

Steve Wright of Leeds Metropolitan University is an expert on police and military technologies, and last year correctly predicted this pack-hunting mode of operation. “The giveaway here is the phrase ‘a non-cooperative human subject’,” he told me:

“What we have here are the beginnings of something designed to enable robots to hunt down humans like a pack of dogs. Once the software is perfected we can reasonably anticipate that they will become autonomous and become armed.

We can also expect such systems to be equipped with human detection and tracking devices including sensors which detect human breath and the radio waves associated with a human heart beat. These are technologies already developed.”

Another commentator often in the news for his views on military robot autonomy is Noel Sharkey, an AI and robotics engineer at the University of Sheffield. He says he can understand why the military want such technology, but also worries it will be used irresponsibly.

“This is a clear step towards one of the main goals of the US Army’s Future Combat Systems project, which aims to make a single soldier the nexus for a large scale robot attack. Independently, ground and aerial robots have been tested together and once the bits are joined, there will be a robot force under command of a single soldier with potentially dire consequences for innocents around the corner.”

What do you make of this? Are we letting our militaries run technologically amok with our tax dollars? Or can robot soldiers be programmed to be even more ethical than human ones, as some researchers claim?

 



DARPA Wants Real-Time Images of Inside Your House


Wired Magazine
October 23, 2008

The Pentagon wants to be able to peer inside your apartment building — picking out where all the major rooms, stairways, and dens of evil-doers are.

The U.S. military is getting better and better at spotting its enemies, when they’re roaming around the streets. But once those foes duck into houses, they become a whole lot harder to spot. That’s why Darpa, the Defense Department’s way-out research arm, is looking to develop a suite of tools for “external sensing deep inside buildings.” The ultimate goal of this Harnessing Infrastructure for Building Reconnaissance (HIBR) project: “reverse the adversaries’ advantage of urban familiarity and sanctuary and provide U.S. Forces with complete above- and below-ground awareness.”

By the end of the project, Darpa wants a set of technologies that can see into a 10-story building with a two-level basement in a “high-density urban block” — and produce a kind of digital blueprint of the place. Using sensors mounted on backpacks, vehicles, or aircraft, the HIBR gear would, hopefully, be able to pick out every room, wall, stairway, and basement in the building — as well as all of the “electrical, plumbing, and installation systems.”

Darpa doesn’t come out and say it openly. But it appears that the agency wants these HIBR gadgets to be able to track the people inside these buildings, as well. Why else would these sensors be required to “provide real-time updates” once U.S. troops enter the building? Perhaps there’s more about the people-spotting tech in the “classified appendix” to HIBR’s request for proposals.

There are already a number of efforts underway, both military and civilian, to try to see inside buildings. The Army has a couple of hand-held gadgets that can spot people just on the other side of a wall. Some scientists claim they can even catch human breathing and heartbeats beyond a barrier.

Darpa’s Visibuilding program uses a kind of radar to scan structures. The problem isn’t sending the radio frequency (RF) energy in. It’s “making sense of the data produced from all the reflected signals” that come back, Henry Kenyon wrote in a recent Signal magazine article. Besides processing data from inside a structure, the system also must filter a large amount of RF propagation in the form of randomly reflected signals. Although radar technologies exist that can track people in adjacent rooms, it is much more difficult to map an entire building. “Going through one wall is not that bad, but a building is basically an RF hall of mirrors. You’ve got signals bouncing all over the place,” Darpa program manager Dr. Edward J. Baranoski says. Field trials are supposed to get underway this fall.
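The “hall of mirrors” problem Baranoski describes can be sketched with a toy simulation — purely illustrative, with made-up numbers, and unrelated to Visibuilding’s actual signal processing. A radar pulse reflecting off one wall comes back as a single clean echo at a predictable delay; echoes from several closely spaced walls overlap into a smeared return that is much harder to turn into a floor plan:

```python
# Toy illustration of radar multipath: each wall returns a delayed,
# attenuated echo, and the receiver sees only their superposition.
import numpy as np

C = 3e8  # speed of light, m/s


def echo_delays(wall_distances_m):
    """Round-trip delay (seconds) for an echo off each wall."""
    return [2 * d / C for d in wall_distances_m]


def received_signal(wall_distances_m, t, pulse_width=2e-9):
    """Superpose one Gaussian echo per wall, attenuated by 1/d^2."""
    sig = np.zeros_like(t)
    for d, tau in zip(wall_distances_m, echo_delays(wall_distances_m)):
        sig += (1.0 / d**2) * np.exp(-((t - tau) / pulse_width) ** 2)
    return sig


t = np.linspace(0, 200e-9, 4000)  # 200 ns observation window

# One wall 5 m away: a single clean echo peak at ~33 ns round trip.
one_wall = received_signal([5.0], t)

# Three walls 30 cm apart: the echoes overlap into one smeared blob,
# so the individual wall positions are no longer obvious.
three_walls = received_signal([5.0, 5.3, 5.6], t)

peak_time = t[np.argmax(one_wall)]
print(f"single-wall echo peaks at {peak_time * 1e9:.1f} ns")
```

With many walls, floors, and furniture all reflecting at once, the inverse problem — recovering geometry from the summed echoes — is what makes whole-building mapping so much harder than seeing through a single wall.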

 



Future Drugs Will Make Troops Want to Fight

Potential technologies in the works include machines that picture what someone is thinking, drugs that give soldiers super-human power and awareness, robots controlled with the brain, and land-mines that release drugs to incapacitate suspects.

Wired
August 13, 2008

Drugs that make soldiers want to fight. Robots linked directly to their controllers’ brains. Lie-detecting scans administered to terrorist suspects as they cross U.S. borders.

These are just a few of the military uses imagined for cognitive science — and if it’s not yet certain whether the technologies will work, the military is certainly taking them very seriously.

“It’s way too early to know which — if any — of these technologies is going to be practical,” said Jonathan Moreno, a Center for American Progress bioethicist and author of Mind Wars: Brain Research and National Defense. “But it’s important for us to get ahead of the curve. Soldiers are always on the cutting edge of new technologies.”

Moreno is part of a National Research Council committee convened by the Department of Defense to evaluate the military potential of brain science. Their report, “Emerging Cognitive Neuroscience and Related Technologies,” was released today. It charts a range of cognitive technologies that are potentially powerful — and, perhaps, powerfully troubling.

Here are the report’s main areas of focus:

  • Mind reading. The development of psychological models and neurological imaging has made it possible to see what people are thinking and whether they’re lying. The science is, however, still in its infancy: Challenges remain in accounting for variations between individual brains, and the tendency of our brains to change over time.

    One important application is lie detection — though one hopes that the lesson of traditional lie detectors, predicated on the now-disproven idea that the physiological basis of lying can be separated from processes such as anxiety, has been learned.

    Mind readers could be used to interrogate captured enemies, as well as “terrorist suspects” passing through customs. But does this mean, for example, that travelers placed on the bloated, mistake-laden watchlist would have their minds scanned, just as their computers will be?

    The report notes that “In situations where it is important to win the hearts and minds of the local populace, it would be useful to know if they understand the information being given them.”

  • Cognitive enhancement. Arguably the most developed area of cognitive neuroscience, with drugs already allowing soldiers to stay awake and alert for days at a time, and brain-altering drugs in widespread use among civilians diagnosed with mental and behavioral problems.

    Improved drug delivery systems and improved neurological understanding could make today’s drugs seem rudimentary, giving soldiers superhuman strength and awareness — but if a drug can be designed to increase an ability, a drug can also be designed to destroy it.

    “It’s also important to develop antidotes and protective agents against various classes of drugs,” says the report. This echoes the motivation of much federal biodefense research, in which designing defenses against potential bioterror agents requires those agents to be made — and that raises the possibility of our own weapons being turned against us, as with the post-9/11 anthrax attacks, which used a military-developed strain.

  • Mind control. Largely pharmaceutical, for the moment, and a natural outgrowth of cognitive enhancement approaches and mind-reading insight: If we can alter the brain, why not control it?

    One potential use involves making soldiers want to fight. Conversely, “How can we disrupt the enemy’s motivation to fight? […] How can we make people trust us more? What if we could help the brain to remove fear or pain? Is there a way to make the enemy obey our commands?”
  • Brain-Machine Interfaces. The report focuses on direct brain-to-machine systems (rather than, for example, systems that are controlled by visual movements, which are already in limited use by paraplegics). Among these are robotic prostheses that replace or extend body parts; cognitive and sensory prostheses, which make it possible to think and to perceive in entirely new ways; and robotic or software assistants, which would do the same thing, but from a distance.

    Many questions surround the safety of current brain-machine interfaces: the union of metal and flesh only lasts so long before things break down. But assuming those problems can be overcome, questions of plasticity arise: What happens when a soldier leaves the service? How might their brains be reshaped by their experience?

Like Moreno said, it’s too early to say what will work. The report documents in great detail the practical obstacles to these aims — not least the failure of reductionist neuroscientific models, in which a few firing neurons can be easily mapped to a psychological state, and brains can be analyzed in one-map-fits-all fashion.

But given the rapid progress of cognitive science, it’s foolish to assume that obstacles won’t be overcome. Hugh Gusterson, a George Mason University anthropologist and critic of the military’s sponsorship of social science research, says their attempt to crack the cultural code is unlikely to work — “but my sense with neuroscience,” he said, “is a far more realistic ambition.”

Gusterson is deeply pessimistic about military neuroscience, which will not be limited to the United States.

“I think most reasonable people, if they imagine a world in which all sides have figured out how to control brains, they’d rather not go there,” he said. “Most rational human beings would believe that if we could have a world where nobody does military neuroscience, we’ll all be better off. But for some people in the Pentagon, it’s too delicious to ignore.”

 

Brain will be battlefield of future, warns US intelligence report

The Guardian
August 14, 2008

Rapid advances in neuroscience could have a dramatic impact on national security and the way in which future wars are fought, US intelligence officials have been told.

In a report commissioned by the Defense Intelligence Agency, leading scientists were asked to examine how a greater understanding of the brain over the next 20 years is likely to drive the development of new medicines and technologies.

They found several areas in which progress could have a profound impact, including behaviour-altering drugs, scanners that can interpret a person’s state of mind and devices capable of boosting senses such as hearing and vision.

On the battlefield, bullets may be replaced with “pharmacological land mines” that release drugs to incapacitate soldiers on contact, while scanners and other electronic devices could be developed to identify suspects from their brain activity and even disrupt their ability to tell lies when questioned, the report says.

“The concept of torture could also be altered by products in this market. It is possible that some day there could be a technique developed to extract information from a prisoner that does not have any lasting side effects,” the report states.

The report highlights one electronic technique, called transcranial direct current stimulation, which involves using electrical pulses to interfere with the firing of neurons in the brain and has been shown to delay a person’s ability to tell a lie.

Drugs could also be used to enhance the performance of military personnel. There is already anecdotal evidence of troops using the narcolepsy drug modafinil, and Ritalin, which is prescribed for attention deficit disorder, to boost their performance. Future drugs, developed to boost the cognitive faculties of people with dementia, are likely to be used in a similar way, the report adds.

Greater understanding of the brain’s workings is also expected to usher in new devices that link directly to the brain, either to allow operators to control machinery with their minds, such as flying unmanned reconnaissance drones, or to boost their natural senses.

For example, video from a person’s glasses, or audio recorded from a headset, could be processed by a computer to help search for relevant information. “Experiments indicate that the advantages of these devices are such that human operators will be greatly enhanced for things like photo reconnaissance and so on,” Kit Green, who chaired the report committee, said.

The report warns that while the US and other western nations might now consider themselves at the forefront of neuroscience, that is likely to change as other countries ramp up their computing capabilities. Unless security services can monitor progress internationally, they risk “major, even catastrophic, intelligence failures in the years ahead”, the report warns.

“In the intelligence community, there is an extremely small number of people who understand the science and without that it’s going to be impossible to predict surprises. This is a black hole that needs to be filled with light,” Green told the Guardian.

The technologies will one day have applications in counter-terrorism and crime-fighting. The report says brain imaging will not improve sufficiently in the next 20 years to read people’s intentions from afar and spot criminals before they act, but it might be good enough to help identify people at a checkpoint or counter who are afraid or anxious.

“We’re not going to be reading minds at a distance, but that doesn’t mean we can’t detect gross changes in anxiety or fear, and then subsequently talk to those individuals to see what’s upsetting them,” Green said.

The development of advanced surveillance techniques, such as cameras that can spot fearful expressions on people’s faces, could lead to some inventive ways to fool them, the report adds, such as Botox injections to relax facial muscles.

Land-mines that release drugs to incapacitate an enemy
http://www.guardian.co.uk/science/2008/aug/13/military.neuroscience

Future Wars To Be Fought With Mind Drugs
http://www.roguegovernment.com/news.php?id=11432

 



A ‘Frankenrobot’ with a biological brain


http://www.youtube.com/watch?v=aAtL3d6igjw

Uncle Sam Wants Your Brain
http://blog.wired.com/wiredscience/2008/08/uncle-sam-wants.html

Military Report Touts Brain Altering Drugs, Mind Control To Make Soldiers Want To Fight
http://infowars.net/articles/august2008/140808Soldiers.htm