
Our “Lethality Automated System” is definitely not a Killer Robot

March 6, 2019

Autonomous death machine

In February, the US Army asked experts for ideas on how to build a system that would allow tanks and other ground vehicles to quickly and automatically “acquire, identify and engage” targets. Some saw this as a step toward autonomous killer robots, prompting the Army to adjust its request.

Yes, the Army still wants robots that can identify and engage targets. But that doesn’t mean the machine will be put in a position to decide to kill someone, an Army official told Defense One.

Just a misunderstanding

According to the Defense One story, the Army decided to revise its request for information to clarify that the Advanced Targeting and Lethality Automated System (ATLAS) would not violate Defense Department policy, which requires that a human always make the decision to use lethal force.

To this end, the following paragraph was added to the request:

All development and use of autonomous and semi-autonomous functions in weapon systems, including manned and unmanned platforms, remains subject to the guidelines of Department of Defense (DoD) Directive 3000.09, which was updated in 2017. Nothing in this notice should be understood to represent a change in DoD policy towards autonomy in weapon systems. All uses of machine learning and artificial intelligence in this program will be evaluated to ensure that they comply with DoD legal and ethical standards.

Human in the loop

Bob Stephan, ATLAS Project Manager at Picatinny Arsenal, also sought to clarify how the military envisions ATLAS working alongside human soldiers.

“The soldier would have to pull the trigger to start shooting,” he told Breaking Defense. “If it’s never pulled, the firing pin will never strike the weapon … so we make sure that ATLAS can never fire autonomously.”

Published by
Faela