In the military, “kill chain” refers to the sequential process of finding, fixing, tracking, targeting, engaging and assessing a target. The US military war machine has long found that its people are not as willing to kill as it would like.
After World War II, research by Brigadier General S.L.A. Marshall showed that only 20% of US infantrymen fired their weapons at the enemy, even when directly threatened. He attributed this to people’s innate resistance to killing. He wrote in his book, Men Against Fire, that “the average and healthy individual has such an inner and usually unrealised resistance to killing his fellow man that … at a vital moment he becomes a conscientious objector”. This led to changes in US military training, such as the use of human silhouette targets and the dehumanisation of the enemy.
Lieutenant Colonel Dave Grossman argued in his 1996 book On Killing: The Psychological Cost of Learning to Kill in War and Society that raising the firing rate of US infantrymen came at a psychological cost, paid in the guilt and trauma of killing.
A 2011 analysis of the National Longitudinal Study of Adolescent Health found that soldiers who believed they had killed someone in combat were at higher risk of PTSD, depression and suicidal thoughts.
For these reasons, the IDF, with help from the US Army and Palantir, has been using AI on the battlefield. Destroying an entire building to kill one person may seem extreme to a human, but not to an AI. It has been admitted that these AI targeting systems murder innocent civilians about 10% of the time, and the AI doesn’t mind that at all.
The Golden Dome missile shield, announced by President Trump, relies on AI-based detection and interception systems. Next up are lethal autonomous weapons.
The Bullfrog is a stand-alone, AI-controlled machine-gun turret developed for the US military. It independently detects, tracks, identifies and engages targets.
Last year, DARPA released footage of its AI-piloted X-62A, a modified F-16, dogfighting autonomously against a human-piloted F-16.
Palmer Luckey, the founder of Oculus VR, is the latest harmless-looking front man for the military-industrial complex. Luckey founded Anduril Industries, a military technology company focused on autonomous military weapon systems, in 2017.

The US Department of Defense’s goal is that “robots, not soldiers, will always make first contact with the enemy”. Robotic systems are being developed to carry out all kinds of surveillance and killing.
With an army of autonomous slaughter robots, ethical qualms are no longer an obstacle. If a government wanted to unleash them against its own people, the machines would have no problem following orders.