Opinion: AI comes of age on jets and ships, prompting fears about where it’s headed

*By Peter Apps

The U.S. Navy’s first deployed fleet of unmanned surveillance vessels is now in action in the Middle East, with artificial intelligence programs making decisions on which targets to investigate and what data to send back for analysis.

The surface vessels of the U.S. Navy’s Task Force 59 use AI for “computer vision” – interpreting what the ship is “seeing” – detecting unusual behaviour by other vessels and carrying out basic command and control functions aboard the small ships.

Their missions include detecting Iranian arms shipments and other regional activity, with data sent back to human operators who decide on action.

What data is sent back to those analysts is decided aboard the autonomous vessels by their own computer systems, an approach that avoids the need to transmit excessive quantities of data.

It represents another new use of “edge computing” – processing delivered on location, at the “edge” of a network, where it is needed, with limited or no data links to elsewhere to preserve security, limit cost and work around other practical constraints.

Systems like “Snowcone” from Amazon Web Services can securely hold huge volumes of data, along with the AI programs to analyse it, and can be taken almost anywhere – including, last year, to the International Space Station and on live military operations.

“It’s really exciting to see these algorithmic programs … running in our region,” U.S. Central Command chief technology officer Schuyler Moore told a recent event at the Center for Strategic and International Studies in Washington.

“We are increasingly learning how important it is to run these types of technology adoption efforts … in a live environment with live data.”

The unmanned surface vessels include those made by California-based Saildrone, which builds solar- and wind-powered small ships also used for oceanographic and fishing surveys. In the Middle East, the U.S. Navy has taught them to recognise local shipping such as merchant dhows, and flag when their appearance or pattern of movement significantly changes. The result, Central Command says, should ultimately be a dramatic increase in how effective their human analysts can be.

Having been talked about for years, such technology is now supporting operations in earnest.

The commander of U.S. Special Operations Command, General Bryan Fenton, told attendees this month at SOF Week, a trade fair held in SOCOM’s home city of Tampa, Florida, that his command was “harnessing data like never before”.

“Data … is the oil, the oxygen we will need to have a decisive advantage,” he said. Its uses range from tracking vehicle maintenance and readiness of equipment and personnel to helping commanders make critical decisions.

In September 2021, the U.S. Air Force revealed for the first time that, as well as being used in exercises, analysis by AI programs had informed live targeting decisions in active conflicts, although only in conjunction with human analysts. It did not give details of the strikes.

As AI is used in more areas, both military and otherwise, some believe it will become more controversial. At the start of the month, AI pioneer Geoffrey Hinton – whose work on neural networks underpins much of modern AI – quit Google so he could talk freely about his concerns over where the technology is heading.

PILOTLESS DOGFIGHTING JETS

AI-piloted warplanes may yet be the first automated weapons systems to be authorised to take human life. In 2020, the U.S. Air Force began pitting live human pilots against AI programs on simulators.

In December last year, those tests moved into the real world with two different AI programs flying a real F-16 over U.S. soil against human-piloted aircraft.

The Defense Advanced Research Projects Agency (DARPA), which ran the tests, has not disclosed whether the robot aviators outperformed their human rivals – although it does say a human pilot was on board during the tests in case anything went wrong.

What does already seem likely, however, is that AI pilots will gain the edge over equivalent drones flown remotely by humans, operating without the lag of a second or more it takes for human instructions to be transmitted.

Letting robot pilots kill in aerial combat would go against current U.S. government policy, with the State Department issuing a note this year recommitting to what it called a “responsible human chain of command and control”.

Using such technology for military ends has always been controversial. In 2018, Google announced it would not renew a contract with the Pentagon to work on Project Maven, the Department of Defense’s project to use AI to deliver better targeting, particularly for drones. Approximately 4,000 employees signed a petition demanding the firm adopt a “clear policy stating that neither Google nor its contractors will ever build warfare technology”.

Project Maven continues. It was transferred last year to the U.S. National Geospatial-Intelligence Agency, which handles U.S. satellite imagery and is now attempting to collect material from a volume of sources that would overwhelm any team of human analysts.

GETTING DATA RIGHT

Most short-term development is focused on detection, surveillance and battle-space management, as well as maintenance planning – anything that requires sifting huge volumes of data for insight in ways human analysts can never match. For 2024, the Pentagon has allocated $1.8 billion for AI and machine learning, the majority of it – $1.4 billion – going to Joint All-Domain Command and Control initiatives to better connect military land, sea, air and space sensors and weapons.

Making AI work is likely to be easier in some areas than others. Air defence to identify and bring down hostile aircraft and missiles has long been seen as a likely focus, further accelerated by Russian strikes against Ukraine and the success of Israel’s Iron Dome defence system against rockets fired from Palestinian territories. So has submarine detection.

Currently, AI is at its best when it is dealing with machines and objects. Attempts to understand human activity, experts say, will inevitably lag behind.

AI specialists say the quality of datasets is frequently a problem, as is designing systems that make the right judgements.

“Over-confidence in results and incorrectly interpreted algorithms can lead to peril,” said a research note from the RAND Corporation last year, drawing on experience from other sectors such as healthcare.

So far, that means armies are likely to be slightly slower than their air force, naval, space or special operations counterparts in making the technology work. U.S. Army commanders announced earlier this year that their annual Project Convergence AI exercise, normally held in the autumn, would be delayed until February to better integrate new technologies and foreign partners.

Other countries and actors may be much faster at cutting straight to real-world use. A UN report in 2021 accused the Turkish-backed Libyan government of allowing Turkish Kargu “suicide” quadcopter drones to select their own targets as forces loyal to Libyan warlord General Khalifa Haftar fled a city.

It remains unclear if anyone was killed, or how truly autonomously the drones were operating.

“Pursuit of autonomous weapons systems without binding legal rules to explicitly address the dangers is a recipe for disaster,” Mary Wareham, arms control director for Human Rights Watch, said in February.

“National policy and legislation are urgently needed to address the risks and challenges raised by removing human control from the use of force.”

* Peter Apps is a Reuters columnist writing on defence and security issues. He joined Reuters in 2003, reporting from southern Africa and Sri Lanka and on global defence issues. He has been a columnist since 2016. He is also the founder of a think tank, the Project for Study of the 21st Century, and, since 2016, has been a Labour Party activist and British Army reservist.

(Reuters)