Modern warfare: Death-dealing drones and ... illegal parking?

Military drones may not be the only autonomous weapons we have to fear in the future: Hacked self-driving cars could hurt us, too

A cloud of 3D-printed drones big enough to bring down the latest U.S. stealth fighter, the F35, was just one of the combat scenarios evoked in a discussion of the future of warfare at the World Economic Forum in Davos on Wednesday.

Much of the discussion focused on the changes computers are bringing to the battlefield, including artificial intelligence and autonomous systems -- but also on the way the battlefield is coming to computing, with cyberwar and social media psyops an ever more real prospect.

Former U.S. Navy fighter pilot Mary Cummings, now director of the Humans and Autonomy Lab at Duke University, delivered the first strike.

"The barrier to entry to drone technology is so low that everyone can have one, and if the Chinese go out and print a million copies of a drone, a very small drone, and put those up against an F35 and they go into the engine, you basically obviate what is a very expensive platform," she said.

Drones could not only defeat the F35, on which the U.S. is spending what Cummings called "a ridiculous amount of money," but also replace it, she said.

"ISIS can go out now and print drones with a 3D printer, can print thousands of drones with a 3D printer at very low cost, and arm them with conventional weapons or biological weapons for example, and basically result in much more devastation than an F35 in a surgical strike could cause," she said.

That gave Dutch Minister of Defense Jeanine Hennis-Plasschaert pause for thought. "As I placed an order for I don't know how many F35s, I just wonder if you could advise me whether I should continue or not?" she asked Cummings.

If the perceived value of an F35 is falling, though, so too is its cost. "The price is dropping, as I understood last week from Lockheed Martin," Hennis-Plasschaert said.

In the Netherlands, there is a hot debate on the use of autonomous weapons, according to Hennis-Plasschaert. "It's important that the deployment of such weapons must always involve meaningful human control," she said. On the flip side, future enemies may not feel the same way: "We may face self-learning systems that are able to modify their own rules of conduct, and so there's this ethical question."

That's not the only ethical question governments will need to answer, though.

With war no longer just about territorial control, "we run the risk of cyberspace being the battle space in the future," Hennis-Plasschaert said.

Agreeing on limits to such conflicts will be difficult, as there is insufficient cooperation between governments at the moment.

The Law of the Sea treaty is a nice example, she said, "but to copy this for cyberspace is not easy."

There are other boundaries to set when it comes to drone warfare, too.

"We have fully autonomous defensive weapons today," Cummings said. She wondered why they are OK, while fully autonomous offensive weapons are not.

She raised the question of future autonomous missile technology that might be able to target a person not by their GPS coordinates, as today, but by their photograph. "That missile could do a better job of targeting a bad person than a human could," she said. That scenario would make her reluctant to put a blanket ban on autonomous offensive weapons, she said.

Targeting a specific person through their photo "really is an illustration of the blurring of the line between war and peace," said Jean-Marie Guéhenno, president and CEO of International Crisis Group and a former UN peacekeeper. The traditional way of dealing with that would be through a court or military tribunal, he said.

Airborne drones aren't the only autonomous vehicles that might cause concern, Cummings said.

"When we go to an internet of things for vehicles, we will have a potential worldwide connectivity of terrorism, where terrorists can get into the network and start hacking driverless cars."

Worse still, she said, they could hack a truck. The trucks wouldn't even need explosives on board to cause trouble, she said: Hacking half a dozen trucks in the Washington, D.C., area and stopping them in the right places could bring traffic to a halt and open the way for all sorts of mischief.

But what of social media? "Does the power of social media mean traditional military might is less important?" asked Shirley Ann Jackson, president of Rensselaer Polytechnic Institute.

Social media plays a role, said Lawrence Freedman, emeritus professor of war studies at King's College London. "But I don't think we should consider that new," he said. "If we look back at the strategists of the past, what they called the psychological element was always there, was always important."

So there you have it: In the future, war may not be declared by drones dropping destruction on our heads, but by a spate of unexplained illegal parking downtown.
