NYTIMES: THE UNSEEN SCARS OF THOSE WHO KILL VIA REMOTE CONTROL

 

REDWOOD VALLEY, Calif. — After hiding all night in the mountains, Air Force Capt. Kevin Larson crouched behind a boulder and watched the forest through his breath, waiting for the police he knew would come. It was Jan. 19, 2020. He was clinging to an assault rifle with 30 rounds and a conviction that, after all he had been through, there was no way he was going to prison.

Captain Larson was a drone pilot — one of the best. He flew the heavily armed MQ-9 Reaper, and in 650 combat missions between 2013 and 2018, he had launched at least 188 airstrikes, earned 20 medals for achievement and killed a top man on the United States’ most-wanted-terrorist list.

The 32-year-old pilot kept a handwritten thank-you note on his refrigerator from the director of the Central Intelligence Agency. He was proud of it but would not say what for, because like nearly everything he did in the drone program, it was a secret. He had to keep the details locked behind the high-security doors at Creech Air Force Base in Indian Springs, Nev.

There were also things he was not proud of locked behind those doors — things his family believes eventually left him cornered in the mountains, gripping a rifle.

 

In the Air Force, drone pilots did not pick the targets. That was the job of someone pilots called “the customer.” The customer might be a conventional ground force commander, the C.I.A. or a classified Special Operations strike cell. It did not matter. The customer got what the customer wanted.

And sometimes what the customer wanted did not seem right. There were missile strikes so hasty that they hit women and children, attacks built on such flimsy intelligence that they made targets of ordinary villagers, and classified rules of engagement that allowed the customer to knowingly kill up to 20 civilians when taking out an enemy. Crews had to watch it all in color and high definition.

Captain Larson tried to bury his doubts. At home in Las Vegas, he exuded a carefree confidence. He loved to go out dancing and was so strikingly handsome that he did side work as a model. He drove an electric-blue Corvette convertible and a tricked-out blue Jeep and had a beautiful new wife.

But tendrils of distress would occasionally poke up, in a comment before bed or a grim joke at the bar. Once, in 2017, his father pressed him about his work, and Captain Larson described a mission in which the customer told him to track and kill a suspected Al Qaeda member. Then, he said, the customer told him to use the Reaper’s high-definition camera to follow the man’s body to the cemetery and kill everyone who attended the funeral.

“He never really talked about what he did — he couldn’t,” said his father, Darold Larson. “But he would say things like that, and it made you know it was bothering him. He said he was being forced to do things that went against his moral compass.”

Drones were billed as a better way to wage war — a tool that could kill with precision from thousands of miles away, keep American service members safe and often get them home in time for dinner. The drone program started in 2001 as a small, tightly controlled operation hunting high-level terrorist targets. But during the past decade, as the battle against the Islamic State intensified and the Afghanistan war dragged on, the fleet grew larger, the targets more numerous and more commonplace. Over time, the rules meant to protect civilians broke down, recent investigations by The New York Times have shown, and the number of innocent people killed in America’s air wars grew to be far larger than the Pentagon would publicly admit.

Captain Larson’s story, woven together with those of other drone crew members, reveals an unseen toll on the other end of those remote-controlled strikes.


Drone crews have launched more missiles and killed more people than nearly anyone else in the military in the past decade, but the military did not count them as combat troops. Because they were not deployed, they seldom got the same recovery periods or mental-health screenings as other fighters. Instead they were treated as office workers, expected to show up for endless shifts in a forever war.

Under unrelenting stress, several former crew members said, people broke down. Drinking and divorce became common. Some left the operations floor in tears. Others attempted suicide. And the military failed to recognize the full impact. Despite hundreds of missions, Captain Larson’s personnel file, under the heading “COMBAT SERVICE,” offers only a single word: “none.”

Drone crew members said in interviews that, while killing remotely is different from killing on the ground, it still carves deep scars.

“In many ways it’s more intense,” said Neal Scheuneman, a drone sensor operator who retired as a master sergeant from the Air Force in 2019. “A fighter jet might see a target for 20 minutes. We had to watch a target for days, weeks and even months. We saw him play with his kids. We saw him interact with his family. We watched his whole life unfold. You are remote but also very much connected. Then one day, when all parameters are met, you kill him. Then you watch the death. You see the remorse and the burial. People often think that this job is going to be like a video game, and I have to warn them, there is no reset button.”

In the wake of The Times’s investigations, the Pentagon has vowed to strengthen controls on airstrikes and improve how it investigates claims of civilian deaths. The Air Force is also providing more mental-health services for drone crews to address the lapses of the past, said the commander of the 432nd Wing at Creech, Col. Eric Schmidt.

“We are not physically in harm’s way, and yet at the same time we are observing a battlefield, and we are seeing some scenes or being part of them. We have seen the effects that can have on people,” Colonel Schmidt said. In the past, he said, remote warfare was not seen as real combat, and there was a stigma against seeking help. “I’m proud to say, we have come a long way,” he added. “It’s sad that we had to.”

Captain Larson tried to cope with the trauma by using psychedelic drugs. That became another secret he had to keep. Eventually the Air Force found out. He was charged with using and distributing illegal drugs and stripped of his flight status. His marriage fell apart, and he was put on trial, facing a possible prison term of more than 20 years.


Because he was not a conventional combat veteran, there was no required psychological evaluation to see what influence his war-fighting experience might have had on his misconduct. At his trial, no one mentioned the 188 classified missile strikes or the funeral he had targeted. In January 2020, he was quickly convicted.

Desperate to avoid prison, reeling from what he saw as a betrayal by the military he had dedicated his life to, Captain Larson ran.


Captain Larson grew up in Yakima, Wash., the son of police officers. He was a straight-and-narrow Eagle Scout who went to church nearly every Sunday and once admonished a longtime friend to stay away from marijuana. At the University of Washington, where he was an honors student, he joined R.O.T.C. and the Civil Air Patrol, set on becoming a fighter pilot.

The Air Force had other plans. By the time he was commissioned in 2012, the Pentagon had developed a seemingly insatiable appetite for drones, and the Air Force was struggling to keep up. That year it turned out more drone pilots than traditional fighter pilots and still could not meet the demand.

“He was sobbing when he got the news. So disappointed. He wanted to fly,” his mother, Laura Larson, said in an interview. “But once he started, he enjoyed it. He really felt like he was doing something important.”

Captain Larson was assigned to the 867th Attack Squadron at Creech — a unit that pilots say worked largely with the C.I.A. and Joint Special Operations Command. The drone crews operated out of a cluster of shipping containers in a remote patch of desert. Each crew had three members: a sensor operator to guide the surveillance camera and targeting laser, an intelligence analyst to interpret and document the video feeds, and a pilot to fly the Reaper and push the red button that launched its Hellfire missiles.


The specifics of Captain Larson’s missions are largely a mystery. He kept the classified details hidden from his parents and former wife. His closest friends in the attack squadron and dozens of other current and former crew members did not respond to requests for interviews; secrecy laws and nondisclosure agreements make it a crime to discuss classified details.

But several pilots, sensor operators and intelligence analysts who did the same type of work in other squadrons spoke with The Times about unclassified details and described their struggles with the same punishing workload and vexing moral landscape.

More than 2,300 service members are currently assigned to drone crews. Early in the program, they said, missions seemed well run. Officials carefully chose their targets and took steps to minimize civilian deaths.

“We would watch a high-value target for months, gathering intelligence and waiting for the exact right time to strike,” said James Klein, a former Air Force captain who flew Reapers at Creech from 2014 to 2018. “It was the right way to use the weapon.”

But in December 2016, the Obama administration loosened the rules amid the escalating fight against the Islamic State, pushing the authority to approve airstrikes deep down into the ranks. The next year, the Trump administration secretly loosened them further. Decisions on high-value targets that once had been reserved for generals or even the president were effectively handed off to enlisted Special Operations soldiers. The customer increasingly turned drones on low-level combatants. Strikes once carried out only after rigorous intelligence-gathering and approval processes were often ordered up on the fly, hitting schools, markets and large groups of women and children.


Before the rules changed, Mr. Klein said, his squadron launched about 16 airstrikes in two years. Afterward, it conducted them almost daily.

Once, Mr. Klein said, the customer pressed him to fire on two men walking by a river in Syria, saying they were carrying weapons over their shoulders. The weapons turned out to be fishing poles, Mr. Klein said, and though the customer argued that the men could still be a threat, he persuaded the customer not to strike.

In another instance, he said, a fellow pilot was ordered to attack a suspected Islamic State fighter who was pushing another man in a wheelchair on a busy city street. The strike killed one of the men; it also killed three passers-by.

“There was no reason to take that shot,” Mr. Klein said. “I talked to the pilot after, and she was in tears. She didn’t fly again for a long time and ended up leaving for good.”

Squadrons did little to address bad strikes if there was no pilot error. It was seen as the customer’s problem. Crews filed civilian casualty reports, but the investigative process was so faulty that they rarely saw any impact; often they would not even get a response.


Over time, Mr. Klein grew angry and depressed. His marriage began to crumble.

“I started to dread going in to work,” he said. “Everyone kind of expects you to do that stuff and just be fine, but it ate away at us.”

Eventually, he refused to fire any more missiles. The Air Force moved him to a noncombat role, and a few years later, in 2020, he retired, one of many disillusioned drone operators who quietly dropped out, he said.

“We were so isolated that I’m not sure anyone saw it,” he said. “The biggest tell is that very few people stayed in the field. They just couldn’t take it.”


In her job as a police officer, Captain Larson’s mother conducted stress debriefings after traumatic events. When officers in her department shot someone, they were required to take time off and meet with a psychologist. As part of the healing process, everyone present at the scene was required to sit down and talk through what had happened. She was not aware of any of that happening with her son.

“At one point I pulled him aside and told him, ‘If things start bothering you, you and your friends need to talk about it,’” Ms. Larson said. “He just smiled and said he was fine. But I think he was struggling more than he ever let on.”

The Air Force has no requirement to give drone crews the mental health evaluations mandated for deployed troops, but it has surveyed the drone force for more than a decade and consistently found high levels of stress, cynicism and emotional exhaustion. In one study, 20 percent of crew members reported clinical levels of emotional distress — twice the rate among noncombat Air Force personnel. The proportion of crew members reporting post-traumatic stress disorder and thoughts of suicide was higher than in traditional aircrews.

Several factors contribute — workload, constantly changing shifts, leadership issues and combat exposure. But the most damaging, according to Wayne Chappelle, the Air Force psychologist leading the studies, is civilian deaths.


Seeing just one strike that causes unexpected civilian deaths can increase the risk of PTSD six to eight times, he said. A survey published in 2020, several years after the strike rules changed, found that 40 percent of drone crew members reported witnessing between one and five civilian killings. Seven percent had witnessed six or more.

“After something like that, people can have unresolved, disruptive emotional reactions,” Dr. Chappelle said. “We would assume that’s unhealthy — having intrusive thoughts, intrusive memories. I call that healthy and normal. What do you call someone who is OK with it?”

Having time off to process the trauma is vital, he said. But during the years when America was simultaneously fighting the Taliban, the Islamic State and Al Qaeda, that was nearly impossible.

Starting in 2015, the Air Force began embedding what it called human performance teams in some squadrons, staffed with chaplains, psychologists and operational physiologists offering a sympathetic ear, coping strategies and healthy practices to optimize performance.

“It’s a holistic team approach: mind, body and spirit,” said Capt. James Taylor, a chaplain at Creech. “I try to address the soul fatigue, the existential questions many people have to wrestle with in this work.”

But crews said the teams were only modestly effective. The stigma of seeking help keeps many crew members away, and there is a perception that the teams are too focused on keeping crews flying to address the root causes of trauma. Indeed, a 2018 survey found that only 8 percent of drone operators used the teams, and two-thirds of those experiencing emotional distress did not.

Instead, crew members said, they tend to work quietly, hoping to avoid a breakdown.

Bennett Miller was an intelligence analyst, trained to study the Reaper’s video feed. Working Special Operations missions in Syria and Afghanistan in 2019 and 2020 from Shaw Air Force Base in South Carolina, the former technical sergeant saw civilian casualties “almost monthly.”

“At first it didn’t bother me that much,” he said. “I thought it was part of going after the bad guys.”


Then, in late 2019, he said, his team tracked a man in Afghanistan who the customer said was a high-level Taliban financier. For a week, the crew watched the man feed his animals, eat with family in his courtyard and walk to a nearby village. Then the customer ordered the crew to kill him, and the pilot fired a missile as the man walked down the path from his house. Watching the video feed afterward, Mr. Miller saw the family gather the pieces of the man and bury them.

A week later, the Taliban financier’s name appeared again on the target list.

“We got the wrong guy. I had just killed someone’s dad,” Mr. Miller said. “I had watched his kids pick up the body parts. Then I had gone home and hugged my own kids.”

The same pattern occurred twice more, he said, yet the squadron leadership did nothing to address what were seen as the customer’s mistakes. Two years later, Mr. Miller was near tears when he described the strikes in an interview at his home. “What we had done was murder, and no one seemed to notice,” he said. “We just were told to move on.”

Mr. Miller grew sleepless and angry. “I couldn’t deal with the guilt or the anxiety of knowing that it was going to probably happen again,” he said. “I was caught in this trap where if I care about what is happening, it’s devastating. And if I don’t care, I lose who I am as a person.”

At Shaw, he said, his squadron did not have a human performance team. “We just had a squadron bar.”

In February 2020, he got home from a 15-hour night shift, locked himself in his bedroom, put a cocked revolver to his head and through the door told his wife that he could not take it anymore. He was hospitalized, diagnosed with PTSD and medically retired.

Beyond their modest standard pensions, veterans with combat-related injuries, even injuries suffered in training, get special compensation worth about $1,000 per month. Mr. Miller does not qualify, because the Department of Veterans Affairs does not consider drone missions combat.

“It’s like they are saying all the people we killed somehow don’t really count,” he said. “And neither do we.”


In February 2018, Captain Larson and his wife, Bree Larson, got into an argument. She was angry at him for staying out all night and smashed his phone, she recalled in an interview. He dragged her out of the house and locked her out, barely clothed. The Las Vegas police came, and when they asked if there were any drugs or weapons in the house, Ms. Larson told them about the bag of psilocybin mushrooms her husband kept in the garage.

When she and Captain Larson had met in 2016, she said, he was already taking mushrooms once every few months, often with other pilots. He also took MDMA — known as ecstasy or molly — a few times a year. The drugs might have been illegal, but, he told her, they offered relief.

“He would just say he had a very stressful job and he needed it,” Ms. Larson said. “And you could tell. For weeks after, he was more relaxed, more focused, more loving. It seemed therapeutic.”

A growing number of combat veterans use the psychedelic drugs illicitly, amid mounting evidence that they are potent treatments for the psychological wounds of war. Both MDMA and psilocybin are expected to soon be approved for limited medical use by the Food and Drug Administration.

“It gave me a clarity and an honesty that allowed me to rewrite the narrative of my life,” said a former Air Force officer who said he suffered from depression and moral injury after hundreds of Reaper missions; he asked not to be named in order to discuss the use of illegal drugs. “It led to some self-forgiveness. That was a huge first step.”

In Las Vegas, the civilian authorities were willing to forgive Captain Larson, but the Air Force charged him with a litany of crimes — drug possession and distribution, making false statements to Air Force investigators and a charge unique to the armed forces: conduct unbecoming of an officer. His squadron grounded him, forbade him to wear a flight suit and told him not to talk to fellow pilots. No one screened him for PTSD or other psychological injuries from his service, Ms. Larson said, adding, “I don’t think anyone realized it might be connected.”

As the prosecution plodded forward over two years, Captain Larson worked at the base gym and organized volunteer groups to do community service. He and his wife divorced. Struggling with his mental health, seeking productive ways to cope with the trauma, he read book after book on positive thinking and set up a special meditation room in his house, according to his girlfriend at the time, Becca Triano.

“I don’t know what he saw, what he dealt with,” she said. “What I did see toward the end was him really working hard to try to stay sane.”

The trial finally came in January 2020. His former wife and a pilot friend testified about his drug use. The police produced the evidence. That was all.

After deliberating for a few hours on the morning of Jan. 17, the jury returned with guilty verdicts on nearly every count.


The pilot would be sentenced after a break for lunch. His lawyer told him to be back in an hour. Instead he took off.

He loaded his Jeep with food and clothes and sped away, convinced that he was facing a long prison sentence, Ms. Triano said. Within hours, the Air Force had a warrant out for his arrest.

Captain Larson headed southwest to Los Angeles and stayed the night with a friend, then started heading north. By the afternoon of Saturday, Jan. 18, he was driving by vineyards and redwood groves on U.S. Route 101 in Mendocino County, north of San Francisco, when the California Highway Patrol spotted his Jeep and pulled him over.

Captain Larson stopped and waited calmly for the officer to walk up to his window. Then he gunned it — down the highway and onto a narrow dirt logging road that snaked up into the mountains. After several miles, he pulled off into the trees and hid. The police could not find him, but they knew something he did not: All the roads in the canyon were dead ends, and officers were blocking the only way out.

Night fell. Nothing to do but wait.

In the morning, during a briefing at the bottom of the canyon, records show, Air Force agents explained to the Mendocino County sheriff’s deputies that the wanted man was a deserter who had fled a drug conviction, was probably armed and possibly suicidal.

The officers drove up the canyon and spotted tire tracks on a narrow turnoff. Agents crept up on foot until they spotted the blue Jeep in the trees, but did not risk going farther. The deputies had a better option, something that could get a view of the Jeep without any danger. A small drone soon launched into the sky.

Captain Larson was hiding behind a mossy boulder. There was no phone service deep in the canyon, no way to call for whatever hope or solace he might have conjured. He could only record a video message for his family members. One by one, he told them that he loved them. “I’m sorry,” he said. “I won’t go to prison, so I’m going to end this. This was always the plan.”


There was a lot he did not explain — things that have kept his family and friends wondering in the years since. He did not talk about the hundreds of secret missions or their impact. He did not say what it had felt like to have his commanders stand by quietly as civilian deaths became routine, then stay just as quiet when a decorated pilot was prosecuted for drug possession. He did not talk about the other pilots who had done the same drugs and then avoided him like a virus after he got caught.

Perhaps he was planning to say more, but as he spoke into the phone camera, he was interrupted by an angry buzzing, like a swarm of bees.

“I can hear the drones,” he said. “They’re looking for me.”

Had they found him alive, his pursuers would have been able to tell him this: In the end, the Air Force had decided not to sentence him to prison, only to dismissal.

But now, just as Captain Larson had done countless times, the officers could only study the drone footage and parse the evidence — slumped behind the boulder, shot with his own assault rifle — of another unintended death.

Originally published by The New York Times


If you are having thoughts of suicide, call the National Suicide Prevention Lifeline at 1-800-273-8255 (TALK). You can find a list of additional resources at SpeakingOfSuicide.com/resources.


Stop Drone Crimes, It’s No Accident.

What modern drone warfare means for both civilians and soldiers

“Last week, the Pentagon announced that no one would be disciplined for the U.S. drone airstrike that killed ten Afghan civilians in August.

New reporting suggests that decision follows a pattern. Locals in Afghanistan, Iraq and Syria are killed by U.S. drones and there’s little accountability after. But for many higher ups in the military, the civilian death toll is simply a cost of war. The benefits outweigh the collateral damage.”

GUESTS

Azmat Khan

investigative journalist, The New York Times

Christopher Aaron

former intelligence analyst for the CIA’s drone program

Wayne Phelps

retired Lieutenant Colonel, the Marine Corps; author, “On Killing Remotely: The Psychology of Killing with Drones”

 

What modern drone warfare means for both civilians and soldiers


“…Biden’s withdrawal of U.S. troops from Afghanistan is substantially less meaningful when analyzed in light of his administration’s pledge to mount “over-the-horizon” attacks in that country from afar even though we won’t have troops on the ground.” 

Marjorie Cohn

 

“Our troops are not coming home. We need to be honest about that. They are merely moving to other bases in the same region to conduct the same counterterrorism missions, including in Afghanistan.”

Rep. Tom Malinowski (D-New Jersey)

Drones  

People gather around a crater caused by an air strike in Amran province, northwest of Yemen’s capital Sanaa April 12, 2015. REUTERS/Khaled Abdullah

Ban Killer Drones

In solidarity with struggles for political, cultural, and economic liberation around the world, we are an international grassroots campaign committed to banning aerial weaponized drones and military and police drone surveillance.

Courage to Resist supports the troops who refuse to fight, or who face consequences for acting on conscience, in opposition to illegal wars, occupations, the policies of empire abroad and martial law at home.

Shut Down Creech

A national campaign to “shut down” the criminal U.S. drone terror program.  The campaign is a call for coast to coast mobilization for bi-annual week-long resistance in the spring and fall, at Creech Air Force Base, a principal drone control base in Indian Springs, Nevada, an hour north of Las Vegas. Using the powerful tool of nonviolent Gandhian resistance and peaceful protest, we uncover the lies and misinformation, educate, break the silence and put our bodies on the line for the global defenseless living under the daily terror of  remotely controlled U.S. militarized drones.  We invite other organizations to join this important campaign.

Photos from Shut Down Creech

Drone Papers from the Intercept

A cache of secret documents detailing the inner workings of the U.S. military’s assassination program in Afghanistan, Yemen, and Somalia. 

Air Wars

Tracking, assessing and archiving military actions and related civilian harm claims in conflict zones such as Iraq, Syria and Libya. We also work with militaries, where practicable, to help improve understanding of civilian harm allegations – with the aim of reducing battlefield casualties.


Divest from the Machine

NY Times: Hidden Drones – Azmat Khan

VFP No Drones Working Group – Sign up

 

Killer Robots: Applying arms-control frameworks to autonomous weapons

October 5, 2021  |  Zachary Kallenborn

Brookings

“Mankind’s earliest weapons date back 400,000 years—simple wooden spears discovered in Schöningen, Germany. By 48,000 years ago, humans were making bows and arrows, then graduating to swords of bronze and iron. The age of gunpowder brought flintlock muskets, cannons, and Gatling guns. In modern times, humans built Panzer tanks, the F-16 Fighting Falcon, and nuclear weapons capable of vaporizing cities.

Today, humanity is entering a new era of weaponry, one of autonomous weapons and robotics.

The development of such technology is rapidly advancing and poses hard questions about how their use and proliferation should be governed. In early 2020, a drone may have been used to attack humans autonomously for the first time, a milestone underscoring that robots capable of killing may be widely fielded sooner rather than later. Existing arms-control regimes may offer a model for how to govern autonomous weapons, and it is essential that the international community promptly addresses a critical question: Should we be more afraid of killer robots run amok or the insecurity of giving them up?

The current state of autonomy

The first use of an autonomous weapon to kill is thought to have occurred in March of 2020 in Libya, but what actually happened remains murky. According to a UN report, a Turkish-made Kargu-2 drone is reported to have autonomously “hunted down” members of the Libyan National Army. If the manufacturer’s claims are correct, the Kargu-2 can use machine learning to classify objects, apparently allowing it to “autonomously fire-and-forget.” Turkey denies using the Kargu-2 in this way, though it seems to acknowledge that the Kargu-2 can be used autonomously. Regardless of whether the Kargu-2 was used autonomously in the episode in Libya, the claim that the Kargu-2 can be autonomous is plausible on its face.

The United Nations report about the Kargu-2 caused an uproar. Sensationalist headlines compared the Kargu-2 to a “Terminator-style AI drone” that “hunted down human targets without being given orders.” These stories conjured images of out-of-control, sentient robots killing as they saw fit. To be blunt, that is nonsense. Although artificial intelligence—technically a superintelligent narrow AI—can beat the world’s best human chess and Go players, that is far from a generalized, human-level intelligence like the Terminator. In fact, a sticky note is enough to convince a cutting-edge machine vision system that an apple is an iPod.

But simple autonomy is not that hard, and autonomous weapons have been a feature of warfare for centuries. Autonomy is about machines operating without human control. The weapon just needs a sensor, a way to process sensor information, and a way to activate the harmful payload. During the American Civil War, Confederate forces deployed the “Rains Patent,” a simple landmine made of sheet iron with a brass cap sealed in beeswax to protect the fuse. When Union soldiers put sufficient pressure on the Rains Patent, it exploded.

Modern autonomous weapons play a real but relatively limited role in military operations. The Ottawa Convention banned anti-personnel mines, but anti-vehicle and sea mines are still used. Loitering munitions are somewhere between a drone and a missile, hovering above a battlefield and striking targets that meet various designations. The U.S. Phalanx close-in weapon system, the Israeli Iron Dome, and various active defense systems defend against incoming missiles and other close-range risks with varying degrees of autonomy. Along the demilitarized zone between South and North Korea, South Korea has deployed the SGR A-1 gun turret, which reportedly has an optional fully autonomous mode. This is just the beginning.

Though some dismiss it as hype, artificial intelligence has won over the world’s great military powers as the next great military technology. The U.S. National Security Commission on AI recently concluded that “properly designed, tested, and utilized AI-enabled and autonomous weapon systems will bring substantial military and even humanitarian benefit.” The Chinese People’s Liberation Army believes AI could fundamentally change the character of warfare. Russian President Vladimir Putin’s claim—that the world’s AI leader “will become the ruler of the world”—has become cliché.

Such excitement has translated to new research, prototypes, and increasingly operational autonomous weapons with increasing degrees of sophistication. The Defense Advanced Research Projects Agency—the U.S. military’s high-risk, high-reward research and development center that helped birth the internet and GPS—ran a virtual dogfight between an F-16 Fighting Falcon and an artificial intelligence last year. The AI beat the human in each of five rounds. China is doing the same, with similar results. The United States, China, and Russia are all developing loyal wingman drones: unmanned semi-autonomous or autonomous aircraft that support manned aircraft. The pilot provides strategic decisions, while the artificial intelligence manages the details.

Growing autonomy is closely tied to the rise of unmanned platforms. Numerous states are testing, building, and deploying a wide range of unmanned aircraft, ships, submarines, and tanks. Unmanned platforms require remote operators to achieve their missions. That’s tough when militaries cannot provide the full staff needed and pilots burn out from overwork. Plus, enemies seek to jam, manipulate, or otherwise interfere with the signals between the pilots and the drone. The more that unmanned platforms can operate without human control, the less need for those signals and the people sending them.

States are integrating unmanned platforms into drone swarms, and in May, Israel became the first country to deploy a swarm in combat. In a true drone swarm, the drones communicate and collaborate, forming a single weapons platform. While drone swarms are not necessarily autonomous weapons, no human could control 10,000 drones without the help of artificial intelligence. Israel’s groundbreaking swarm appears to have consisted of an unknown number of small drones equipped with a mixture of sensors and weapons. And it is just the beginning. India tested a 75-drone swarm last year, and earlier this year, South Africa’s Paramount Group revealed a swarming system of 41-kilogram long-range drones that cruise at more than 100 miles per hour. Russia, meanwhile, is designing swarms for anti-submarine warfare. Numerous other states are developing other swarm applications.

Assuming these trends continue, autonomous weapons will increasingly enter the battlefield. For some, that’s terrifying.

In his Christmas address of 1937, the Archbishop of Canterbury posed a prescient question: “Who can think without horror of what another widespread war would mean, waged as it would be with all the new weapons of mass destruction?” Arms control debates are rooted in fear, and treaties to control the spread of weapons have sought to address those fears. Advocates of arms-control treaties fear the consequences of horrific weapons of war spreading widely. Opponents fear what might happen if their adversaries build such weapons but they cannot. These dueling fears animate everything from gun debates around the dinner table to nuclear arms debates at the United Nations. The proliferation of autonomous weapons creates new fears: that an autonomous weapon might misidentify a civilian as a soldier and kill him, or that autonomous weapons will provide an enemy state a decisive edge in war.

In an autonomous weapon, the system decides when to engage by processing environmental stimuli. Landmines, for example, use simple pressure sensors: the sensor’s sensitivity determines whether the heft of a tank or the hands of a child is enough to trigger the explosion. An anti-radar loitering munition, by contrast, homes in on radar signals. The risk of error, and by extension the arms control concern, depends on the type of environmental stimuli, how those stimuli are processed, and the types of decisions made.
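The point about sensor sensitivity can be made concrete with a toy sketch. This is purely illustrative; the threshold and pressure values are hypothetical and bear no relation to any real fuze design. The "decision" is nothing more than a comparison against a threshold, which is exactly why where that threshold sits determines who is at risk.

```python
# Toy illustration of how sensor sensitivity encodes an engagement
# "decision." All values are hypothetical, chosen only to show the
# tradeoff described in the text.

def fuze_triggers(pressure_kg: float, threshold_kg: float) -> bool:
    """A pressure fuze 'decides' by comparing a reading to a threshold."""
    return pressure_kg >= threshold_kg

# An anti-vehicle threshold ignores a footstep but responds to a tank.
assert fuze_triggers(pressure_kg=5000.0, threshold_kg=150.0)    # vehicle
assert not fuze_triggers(pressure_kg=70.0, threshold_kg=150.0)  # adult footstep

# An anti-personnel threshold cannot tell a soldier from a child:
# both readings exceed it, so the "decision" draws no distinction at all.
assert fuze_triggers(pressure_kg=70.0, threshold_kg=10.0)       # adult
assert fuze_triggers(pressure_kg=20.0, threshold_kg=10.0)       # child
```

The same logic generalizes: more complex stimuli and processing, such as radar signatures or learned image classifiers, move that threshold into a space humans cannot easily inspect, which is where the arms control concern sharpens.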

Emerging autonomous weapons that use machine learning process stimuli in more complex ways. Machine learning systems rely on large amounts of data to draw conclusions about what the system observes. But that data dependence also makes them brittle. Color differences, tree branches, or foggy days may confound a system’s ability to correctly identify a target. Although some states may adopt robust verification and testing programs to increase reliability, others may not. As autonomous weapons are deployed in larger numbers, arms control advocates fear a higher likelihood of something going horrifyingly wrong.

As autonomous weapons scale into massive drone swarms, their uncontrollability and potential for mass harm create a new weapon of mass destruction. Imagine 1,000 Slaughterbots flitting about a city, deciding whom to kill. That’s not terribly outlandish: India wants to build a swarm of 1,000 drones operating without human control, and the Naval Postgraduate School is modeling swarms of up to a million drones operating underwater, on the ocean’s surface, and in the air. Particularly nefarious governments might equip the drones with facial recognition to assassinate regime opponents or carry out ethnic cleansing. States have adopted a wide range of policies to reduce similar risks from traditional weapons of mass destruction, including export controls, arms control treaties, and deterrent and coercive threats. If drone swarms are weapons of mass destruction, they deserve similar risk reduction policies.

At the same time, militaries see great value in the development of autonomous weapons. Autonomous weapons offer speed. A typical human takes about 250 milliseconds to react to something they see; an autonomous weapon can respond far faster. Rheinmetall’s Active Defence System, for example, can react to an incoming rocket-propelled grenade in less than one millisecond. According to Gen. John Murray, head of the U.S. Army Futures Command, that speed may be necessary to defend against massive drone swarms. To militaries, these weapons may be the difference between survival and defeat; giving them up in an arms control treaty would be foolish.

Militaries also dispute the risk of error. Humans get tired, frustrated, and overconfident, and that creates mistakes. Autonomous weapons have no such emotions, and advocates of military AI applications argue that a reduced error rate makes pursuing the technology a moral imperative. Plus, artificial intelligence can improve aiming, which reduces collateral harm. Israel, for example, reportedly used an artificial intelligence-assisted machine gun to assassinate an Iranian nuclear scientist without hitting the scientist’s wife inches away. So, in their view, what are arms control advocates really afraid of?

Work on this issue is ongoing. Diplomats have debated autonomous weapons under the United Nations Convention on Certain Conventional Weapons Group of Governmental Experts on Lethal Autonomous Weapons since 2014. These meetings have made little progress toward an international treaty on autonomous weapons, but they have helped clarify state positions, brought greater attention to the topic, and better articulated concerns regarding autonomous weapons. Arms control advocates have called for bans and new treaties, but these vary in scope. Groups like the Campaign to Stop Killer Robots argue all autonomous weapons must be banned. Others, like the International Committee of the Red Cross, take a more nuanced view, focusing on “unpredictable” weapons. Human Rights Watch estimates 30 states have endorsed a complete ban. Great military powers have resisted a new arms-control regime, arguing existing international law is sufficient to cover autonomous weapons. Researchers have also floated alternatives to arms control treaties, such as norms and bilateral and multilateral confidence-building measures.

The global community must now resolve the tension between these dueling fears of arms-control advocates and militaries. That means serious debate on which types of autonomous weapons offer the most military value and which present the most risk to civilians and noncombatants. Weapons that pose high risk to civilians and offer low military value should form the basis of conversations around risk reduction.

Existing arms control treaties offer models for handling these complexities. The Ottawa Convention on Anti-Personnel Landmines narrowly targets anti-personnel mines, excluding anti-vehicle mines that require high pressure to detonate. An autonomous weapons treaty might likewise focus on anti-personnel weapons that use machine learning, given the difficulty of distinguishing farmers from soldiers. A more precise treaty may allow military powers to separate the weapons they fear giving up from the weapons arms control advocates fear proliferating, which may make winning those powers’ agreement easier.

Autonomous weapons might also be tiered based on characteristics that make them more or less risky, akin to the Chemical Weapons Convention’s schedules. The convention divides chemical agents into three schedules based on their historical use as chemical weapons and their legitimate civilian uses, and places different restrictions on each schedule. Autonomous weapons could likewise be tiered by the risk they pose, particularly the harm to civilian populations if the weapon errs and the likelihood of such an error. Defensive turrets used at sea against incoming missiles would likely sit in the lowest tier, while offensive weapons that target people using machine learning would sit higher. Autonomous chemical, biological, radiological, and nuclear weapons pose the highest risk and should never be used.

Debate is needed on the best policy approaches to stem the proliferation of the highest-risk weapons and reduce broader global risks. Existing global discussion has focused on whether international treaties should ban the weapons, but that’s just a start. Whether autonomous weapons end up banned in whole, in part, or not at all, governments must consider how to ensure they are not inadvertently exported to states outside any ban. Restricting access by terrorist groups is a further, distinct problem, because simple autonomous weapons can be built as a classroom project. And if a new international treaty is established, an obvious question is: how can it be given teeth? If a state uses a banned autonomous weapon, should it suffer retaliatory diplomatic or economic sanctions? When, if ever, should the United Nations Security Council endorse military action?

The era of killer robots is here. What comes next is up to the world.

Zachary Kallenborn is a research affiliate with the Unconventional Weapons and Technology Division of the National Consortium for the Study of Terrorism and Responses to Terrorism, a policy fellow at the Schar School of Policy and Government, and a U.S. Army Training and Doctrine Command “Mad Scientist.” 

 


War Criminals: Obama, Trump, every President (since WWII), A few reasons why…

KnowDrones was founded in 2012 to inform the American public about the illegality, immorality and dreadful human consequences of U.S. drone attacks in order to bring about: (1) a complete halt to drone attacks; and (2) an international ban on weaponized drones and military and police drone surveillance.

America’s Afghan War: A Defeat Foretold? by Adam Nossiter

August 21, 2021  |  Adam Nossiter  |  New York Times

 

Adam Nossiter is the Kabul bureau chief for the NY Times.

 

original link …



Intercept: Top Defense [War Crime] Profits

Open Secrets: Last Afghan Contracts

“It was 8 a.m. and the sleepy Afghan sergeant stood at what he called the front line, one month before the city of Kunduz fell to the Taliban. An unspoken agreement protected both sides. There would be no shooting.

That was the nature of the strange war the Afghans just fought, and lost, with the Taliban.

President Biden and his advisers say the Afghan military’s total collapse proved its unworthiness, vindicating the American pullout. But the extraordinary melting away of government and army, and the bloodless transition in most places so far, point to something more fundamental.

The war the Americans thought they were fighting against the Taliban was not the war their Afghan allies were fighting. That made the American war, like other such neocolonialist adventures, most likely doomed from the start.

Recent history shows it is foolish for Western powers to fight wars in other people’s lands, despite the temptations. Homegrown insurgencies, though seemingly outmatched in money, technology, arms, air power and the rest, are often better motivated, have a constant stream of new recruits, and often draw sustenance from just over the border.

Outside powers fight one war, as visitors and occupiers, while their erstwhile allies, the people who actually live there, fight something entirely different. In Afghanistan, it was not good versus evil, as the Americans saw it, but neighbor against neighbor.

When it comes to guerrilla war, Mao once described the relationship that should exist between a people and troops. “The former may be likened to water,” he wrote, “the latter to the fish who inhabit it.”

And when it came to Afghanistan, the Americans were a fish out of water. Just as the Russians had been in the 1980s. Just as the Americans were in Vietnam in the 1960s. And as the French were in Algeria in the 1950s. And the Portuguese during their futile attempts to keep their African colonies in the ’60s and ’70s. And the Israelis during their occupation of southern Lebanon in the ’80s.

Each time the intervening power announced that the homegrown insurgency had been definitively beaten, or that a corner had been turned, smoldering embers led to new conflagrations.