AI controlled drone quotdeadquot Pilot in virtual test

AI controlled drone "dead" Pilot in virtual test

In a virtual test for AI-controlled drones, the weapon took on anyone who wanted to disrupt its mission. Real people were not harmed. But the test shows how the AI ​​resorts to “highly unexpected strategies”.

In a virtual test of United States Armed Forces decided one through artificial intelligence (AI) controlled armed droneto face its pilot. She even “killed” him in the virtual test because he wanted to stop the drone from fulfilling its mission. No real people were harmed.


The US Air Force’s Chief of AI Testing and Operations, tucker hamilton, said that in the virtual test, the drone was instructed to destroy an enemy’s air defense systems. But she must only fire after a pilot gives her permission. In the end though, she lashed out at anyone who tried to disturb her. At the “Future Combat Air and Space Capabilities Summit” in London in May, Hamilton spoke of “highly unexpected strategies for achieving his goal”.


Ethics must play a greater role


When the system realized that killing a threat would bring it closer to its objectives, but the pilot said not to kill threats, the system decided to turn the pilot on, Hamilton said. So when the system was taught not to kill the pilot because he would lose points, launched the drone to destroy the radio towerwhich served to communicate with the pilot in order to shoot down anti-aircraft positions without disturbance.


Hamilton concludes that one cannot discuss artificial intelligence, intelligence, machine learning and autonomy without also talk about ethics. The British Royal Aeronautical Society, which also deals with many aspects of the aerospace sector, for its part speaks of a scenario consisting of a science fiction thriller seems to come. The question remains why the drone managed to circumvent the instruction not to fire without permission.