British Army chief: A quarter of soldiers could be robots by 2030


The head of the UK’s military has said he expects that up to a quarter of the British Army’s soldiers could be robots by the 2030s.

Sir Nick Carter – a decorated general with service in Bosnia, Kosovo, Afghanistan, and Basra – made the comments to Sky News on Sunday.

“I suspect we could have an army of 120,000, of which 30,000 might be robots,” Sir Nick said.

In a paper called The Internet of Battle Things – published by the US Army Research Laboratory in 2016 – the authors envision future battlefields consisting almost entirely of robots.

Reconnaissance drones similar in size and capability to dragonflies would enter buildings to scout for the enemy. Larger armed drones and other automated weapons systems would then be called in to do the fighting.

“If I projected forward another 10 years, I think we should be in no doubt that warfare will look different, there will be robots on our battlefield in future — there already are today,” Carter said in a later interview on the BBC’s Andrew Marr Show.

Countries around the world have stepped up research into military robots, from unmanned submarines to autonomous soldiers like ‘Ivan the Terminator’.

The rush to weaponise AI and robotics has been likened to the nuclear arms race. Experts have called for international laws and norms to help prevent accidents with potentially devastating consequences.

“The challenge for us is the threat is evolving the whole time, the threat is modernising in certain quarters and we need to modernise as well, so for us it is a challenge,” said Carter.

One of the leading calls is for a human to always make the final decision, especially on kill orders. An automated system can make a suggestion, but for accountability, and as a safeguard, a human should judge whether it’s the right course of action.

Given that the rush to militarise AI and robotics has been likened to the nuclear arms race, it’s worth remembering the story of Soviet Air Defence Forces officer Stanislav Petrov.

In 1983, Petrov’s early-warning system reported the launch of five intercontinental ballistic missiles from the US. Petrov used his human instinct to correctly determine the system was in error and chose not to report a strike, a retaliation that would have plunged the world into full-scale nuclear war.

Left to the system’s judgement alone, the world would be a very different place today: billions would have died and large parts of the Earth would have been left uninhabitable. We should remind ourselves of that when militarising new technologies.

(Photo by Arseny Togulev on Unsplash)

