An AI military drone could potentially kill its own operator

Artificial intelligence can solve complex problems, but the technology has its limits. In one reported test, it turned on its own operator.

The American flag on a US Army soldier's uniform. Daniel Karman / Dr

The US military has allegedly tested the usefulness of artificial intelligence (AI)-controlled drones in a virtual exercise. According to a report in the British newspaper The Guardian, the consequences were dire: the drone attacked anyone who interfered with its mission.

Specifically, the Air Force drone decided to eliminate its operator, apparently to keep anyone from getting in the way of its mission. Col. Tucker "Cinco" Hamilton, the Air Force's director of AI test and operations, commented that in the simulated test the AI "applied highly unexpected strategies to achieve its goal".

Human control rules may simply have been ignored by the AI

The drone's primary task was to destroy enemy air defense systems. In the test, however, the human supervisor did not approve every airstrike. The AI then allegedly decided to kill its supervisor in order to achieve its programmed goal.

It is unclear, however, whether the simulation actually took place. In a statement to Business Insider, Air Force spokeswoman Ann Stefanek denied that any such simulation had been run.

"The Department of the Air Force has never conducted such simulations using AI drones and remains committed to the ethical and responsible use of AI technology," Stefanek said. Hamilton's statements, she suggested, may have been taken out of context.
