The Poison Is Here, the Poison Is There: Forever Chemicals

We know the poison is here.  The decision-makers play a shell game, sheltering the government and the corporate manufacturers.

I wish the elected and appointed would speak truth, damn it.  Instead, the corporate mainstream news sources and the politicians with any real power act, most of the time, as a filter serving their corporate and government owners.

I’m no expert, but I believe that if Truax is a polluted Superfund site, they should not be allowed to stir up the pollution with their construction projects.  It seems as though government, especially the federal government, is above the law, and justice is ignored.  They are building a new terminal now and getting ready for more weapons of war[-crimes].

That is not how a republic and representative government work.  In the end, I guess, we get what we allow.  

 
We also allow the rights of human beings around the world to be violated to serve the profits of the connected and powerful. 
 
1)  “Michael Regan, the head of the Environmental Protection Agency, said his agency is taking a series of actions to limit pollution from a cluster of long-lasting chemicals known as PFAS that are increasingly turning up in public drinking water systems, private wells and even food. 
 
The Defense Department said it is moving to assess and clean up PFAS-contaminated sites throughout the country, while the Food and Drug Administration will expand testing of the food supply to estimate Americans’ exposure to PFAS from food. The plan is intended to restrict PFAS from being released into the environment, accelerate cleanup of PFAS-contaminated sites such as military bases and increase investments in research to learn more about where PFAS are found and how their spread can be prevented.”

Associated Press

 


Per- and polyfluoroalkyl substances (PFAS) are a group of man-made chemicals that includes PFOA, PFOS, GenX, and many other chemicals. There is evidence that exposure to PFAS can lead to adverse human health effects.
https://www.safeskiescleanwaterwi.org/pfas/

Oct 26: VIGIL AGAINST DRONES ~ Wisconsin ~

DRONES KILL INNOCENT PEOPLE

VIGIL AGAINST THE DRONES

OUTSIDE THE GATES OF VOLK FIELD

TUESDAY OCTOBER 26, 2021    3:30-4:30 pm

We need YOU there

 

Dear Friends,

Killer drones continue to play a huge role in terrorizing people in the Middle East and in parts of Africa.  Even though Biden has declared the war in Afghanistan ended, he has pledged to mount what have been dubbed “over the horizon” attacks, using drones to continue to kill innocent children, women, and men in Afghanistan, as well as in Iraq, Yemen, Syria, Libya, Somalia, and Niger, says well-known attorney and activist Marjorie Cohn in her article, War in Afghanistan Isn’t Over – It’s Taking the Form of Illegal Drone Strikes.

This article is a must-read, as Cohn outlines several specific ways in which attacks using killer drones are illegal under international law, including the UN Charter, the Geneva Conventions, and the International Covenant on Civil and Political Rights.  These are all treaties that have been ratified by the U.S.

DRONE WARFARE IS ILLEGAL!  And yet Volk Field continues to be a training ground for operating killer drones.

Caskets for the dead are carried toward a gravesite as relatives and friends attend a mass funeral for members of a family that is said to have been killed in a U.S. drone airstrike, in Kabul, Afghanistan, on Aug. 30, 2021. (MARCUS YAM/LOS ANGELES TIMES/TNS)

WE MUST STAND TOGETHER AGAINST THE TERROR OF DRONE WARFARE.

The vigil at Volk Field is a legal vigil where we will be on public property.  As always, it will be a solemn vigil, remembering the victims of US government drone attacks.

DIRECTIONS – To get to the vigil, take the Camp Douglas exit off Interstate 90/94 between Mauston and Tomah.  When you exit take County Rd. C to the northeast.  You will see the base straight ahead, but follow County Rd. C to the right and within a few blocks is a picnic wayside where you can park along the side of the road.  The wayside is closed for the season and bathrooms are not available.

THE VIGIL – We will gather at the wayside around 3:15 for introductions and to review the plan for the vigil, and then process together to the gates of the base where we will hold a solemn vigil for one hour to remember those killed by drones.  Participants can stand in silence or read poems and stories about the effects of drone warfare.  It is important that the voices of the victims be brought to the gates of Volk Field.

Bring posters if you can.

A WORD ABOUT THE WEATHER – If you have questions about the vigil because of the weather, please make sure to call Joy at 608 239-4327 or Bonnie at 608-256-5088 for an update.

CARPOOLING –  If you are interested in carpooling to Volk Field from Madison, please contact Bonnie at 608-256-5088.

We hope to see you at the vigil on October 26.  If you can’t come this time, mark your calendar.  We usually vigil on the 4th Tuesday of every month.  If you have any questions please call or email Joy at 608 239-4327 or [email protected] or Bonnie at 608-256-5088 or [email protected] .

Peace,

Joy and Bonnie

Wisconsin Coalition to Ground the Drones and End the Wars

Killer Robots: Applying arms-control frameworks to autonomous weapons

October 5, 2021  |  Zachary Kallenborn

Brookings

“Mankind’s earliest weapons date back 400,000 years—simple wooden spears discovered in Schöningen, Germany. By 48,000 years ago, humans were making bows and arrows, then graduating to swords of bronze and iron. The age of gunpowder brought flintlock muskets, cannons, and Gatling guns. In modern times, humans built Panzer tanks, the F-16 Fighting Falcon, and nuclear weapons capable of vaporizing cities.

Today, humanity is entering a new era of weaponry, one of autonomous weapons and robotics.

The development of such technology is rapidly advancing and poses hard questions about how their use and proliferation should be governed. In early 2020, a drone may have been used to attack humans autonomously for the first time, a milestone underscoring that robots capable of killing may be widely fielded sooner rather than later. Existing arms-control regimes may offer a model for how to govern autonomous weapons, and it is essential that the international community promptly addresses a critical question: Should we be more afraid of killer robots run amok or the insecurity of giving them up?

The current state of autonomy

The first use of an autonomous weapon to kill is thought to have occurred in March of 2020 in Libya, but what actually happened remains murky. According to a UN report, a Turkish-made Kargu-2 drone is reported to have autonomously “hunted down” members of the Libyan National Army. If the manufacturer’s claims are correct, the Kargu-2 can use machine learning to classify objects, apparently allowing it to “autonomously fire-and-forget.” Turkey denies using the Kargu-2 in this way, though it seems to acknowledge that the Kargu-2 can be used autonomously. Regardless of whether the Kargu-2 was used autonomously in the episode in Libya, the claim that it can be autonomous is plausible on its face.

The United Nations report about the Kargu-2 caused an uproar. Sensationalist headlines compared the Kargu-2 to a “Terminator-style AI drone” that “hunted down human targets without being given orders.” These stories conjured images of out-of-control, sentient robots killing as they saw fit. To be blunt, that is nonsense. Although artificial intelligence—technically a superintelligent narrow AI—can beat the world’s best human chess and Go players, that is far from a generalized, human-level intelligence like the Terminator. In fact, a sticky note is enough to convince a cutting-edge machine vision system that an apple is an iPod.

But simple autonomy is not that hard, and autonomous weapons have been a feature of warfare for centuries. Autonomy is about machines operating without human control. The weapon just needs a sensor, a way to process sensor information, and a way to activate the harmful payload. During the American Civil War, Confederate forces deployed the “Rains Patent,” a simple landmine made of sheet iron with a brass cap sealed in beeswax to protect the fuse. When Union soldiers put sufficient pressure on the Rains Patent, it exploded.

Modern autonomous weapons play a real, but relatively limited role in military operations. The Ottawa Convention banned anti-personnel mines, but anti-vehicle and sea mines are still used. Loitering munitions are somewhere between a drone and a missile, hovering above a battlefield and striking targets that meet various designations. The U.S. Phalanx close-in weapon system, the Israeli Iron Dome, and various active defense systems defend against incoming missiles and other close range risks with varying degrees of autonomy. Along the demilitarized zone between South and North Korea, South Korea has deployed the SGR A-1 gun turret, which reportedly has an optional fully autonomous mode. This is just the beginning.

Though some dismiss it as hype, artificial intelligence has won over the world’s great military powers as the next great military technology. The U.S. National Security Commission on AI recently concluded that “properly designed, tested, and utilized AI-enabled and autonomous weapon systems will bring substantial military and even humanitarian benefit.” The Chinese People’s Liberation Army believes AI could fundamentally change the character of warfare. Russian President Vladimir Putin’s claim—that the world’s AI leader “will become the ruler of the world”—has become cliché.

Such excitement has translated to new research, prototypes, and increasingly operational autonomous weapons with increasing degrees of sophistication. The Defense Advanced Research Projects Agency—the U.S. military’s high-risk, high-reward research and development center that helped birth the internet and GPS—ran a virtual dogfight last year between a human F-16 Fighting Falcon pilot and an artificial intelligence. The AI beat the human in each of five rounds. China is doing the same, with similar results. The United States, China, and Russia are all developing loyal wingman drones: unmanned semi-autonomous or autonomous aircraft that support manned aircraft. The pilot provides strategic decisions, while the artificial intelligence manages the details.

Growing autonomy is closely tied to the rise of unmanned platforms. Numerous states are testing, building, and deploying a wide range of unmanned aircraft, ships, submarines, and tanks. Unmanned platforms require remote orders to achieve their mission. That’s tough when militaries cannot provide the full staff needed and pilots burn out from overwork. Plus, enemies seek to jam, manipulate, or otherwise interfere with the signals from the pilots to the drone. The more that unmanned platforms can operate without human control, the less need for those signals and the people sending them.

States are integrating unmanned platforms into drone swarms, and in May, Israel became the first country to deploy a swarm in combat. In a true drone swarm, the drones communicate and collaborate, forming a single weapons platform. While drone swarms are not necessarily autonomous weapons, no human could control 10,000 drones without the help of artificial intelligence. Israel’s groundbreaking use of a drone swarm appears to have consisted of an unknown number of small drones equipped with a mixture of sensors and weapons. Israel’s swarm use is just the beginning. India tested a 75-drone swarm last year, and earlier this year, South Africa’s Paramount Group revealed a swarming system of 41-kilogram long-range drones that cruise at more than 100 miles per hour. Russia, meanwhile, is designing swarms for anti-submarine warfare. Numerous other states are developing other swarm applications.

Assuming these trends continue, autonomous weapons will increasingly enter the battlefield. For some, that’s terrifying.

In his Christmas address of 1937, the Archbishop of Canterbury posed a prescient question: “Who can think without horror of what another widespread war would mean, waged as it would be with all the new weapons of mass destruction?” Arms control debates are rooted in fear, and treaties to control the spread of weapons have sought to address those fears. Advocates of arms-control treaties fear the consequences of horrific weapons of war spreading widely. Opponents fear what might happen if their adversaries build such weapons but they cannot. These dueling fears animate everything from gun debates around the dinner table to nuclear arms debates at the United Nations. The proliferation of autonomous weapons creates new fears: that autonomous weapons might misidentify a civilian as a soldier and kill them, and that autonomous weapons will give an enemy state a decisive edge in war.

In an autonomous weapon, the system decides when to engage by processing environmental stimuli. Landmines, for example, use simple pressure sensors—the sensor sensitivity determines whether the heft of a tank or the hands of a child is enough to trigger the explosion. An anti-radar loitering munition, by contrast, homes in on radar signals. The risk of error—and by extension the arms-control concern—depends on the type of environmental stimuli, how those stimuli are processed, and the type of decisions made.

Emerging autonomous weapons using machine learning process stimuli in more complex ways. Machine learning systems rely on large amounts of data to draw conclusions about what the system observes. But the data dependence also makes them brittle. Color differences, tree branches, or foggy days may confound the ability of the system to correctly identify a target. Although some states may adopt robust verification and testing programs to increase reliability, others may not. As autonomous weapons are deployed in larger numbers, arms control advocates fear a higher likelihood of something going horrifyingly wrong.

As autonomous weapons scale into massive drone swarms, the uncontrollability and potential for mass harm create a new weapon of mass destruction. Imagine 1,000 Slaughterbots flitting about a city, deciding whom to kill. And that’s not terribly outlandish: India wants to build a swarm of 1,000 drones operating without human control, and the Naval Postgraduate School is modeling swarms of up to a million drones operating underwater, on the ocean’s surface, and in the air. Particularly nefarious governments might equip the drones with facial recognition to assassinate regime opponents or carry out ethnic cleansing. States have adopted a wide range of policies to reduce similar risks from traditional weapons of mass destruction, including export controls, arms control treaties, and deterrent and coercive threats. If drone swarms are weapons of mass destruction, they deserve similar risk reduction policies.

At the same time, militaries see great value in the development of autonomous weapons. Autonomous weapons offer speed. A typical human takes 250 milliseconds to react to something they see. An autonomous weapon can respond far faster—Rheinmetall Defense’s Active Defense System can react to an incoming rocket-propelled grenade in less than one millisecond. According to General George Murray, head of the U.S. Army’s Futures Command, that speed may be necessary to defend against massive drone swarms. These weapons may be the difference between survival and defeat; giving them up in an arms control treaty would be foolish.

Militaries also dispute the risk of error. Humans get tired, frustrated, and over-confident. That creates mistakes. Autonomous weapons have no such emotion, and advocates of military AI applications argue a reduced error-rate makes pursuing the technology a moral imperative. Plus, artificial intelligence can improve aiming. That reduces collateral harm. For example, Israel reportedly used an artificial intelligence-assisted machine gun to assassinate an Iranian nuclear scientist without hitting the scientist’s wife inches away. So, in their view, what are arms control advocates really afraid of?

Work on this issue is ongoing. Diplomats have debated autonomous weapons issues under the United Nations Convention on Conventional Weapons Group of Governmental Experts on Lethal Autonomous Weapons since 2014. These meetings have accomplished little toward an international treaty on autonomous weapons, but they have helped clarify state positions, brought greater attention to the topic, and better articulated concerns regarding autonomous weapons. Arms control advocates have called for bans and new treaties, but these vary in scope. Groups like the Campaign to Stop Killer Robots argue all autonomous weapons must be banned. Others, like the International Committee of the Red Cross, have a more nuanced view, focusing on “unpredictable” weapons. Human Rights Watch estimates 30 states have endorsed a complete ban. Great military powers have resisted a new arms-control regime, arguing existing international law is sufficient to cover autonomous weapons. Researchers have also floated alternatives to arms control treaties, such as norms and bilateral and multilateral confidence-building measures.

The global community must now resolve the tension between the fears of arms-control advocates and those of military advocates. That means serious debate on which types of autonomous weapons offer the most military value and which present the most risk to civilians and noncombatants. Weapons with high risk to civilians and low military value should form the basis of conversations around risk reduction.

Existing arms control treaties offer models to address these complexities. The Ottawa Convention on Anti-Personnel Landmines narrowly focuses on anti-personnel landmines, excluding anti-vehicle landmines that require high pressure to detonate. An autonomous weapon treaty might similarly focus on anti-personnel weapons using machine learning, given the challenges of distinguishing farmers from soldiers. A more precise treaty may allow military powers to separate the weapons they fear giving up from the weapons arms control advocates fear proliferating, which may make winning those powers’ approval easier.

Autonomous weapons might also be tiered based on characteristics that make them more or less risky, akin to the Chemical Weapons Convention’s schedules. The convention divides chemical agents into three schedules, based on their historical use as chemical weapons and use for civilian purposes. Chemicals in each schedule have different restrictions placed upon them. Autonomous weapons could also be tiered based on the risk the weapons pose, particularly the risk to civilian populations if the weapon errs and the likelihood of an error. Defensive turrets used at sea to defend against incoming missiles would likely be of lowest risk, while offensive weapons targeting people using machine learning would be a higher tier. Autonomous chemical, biological, radiological, and nuclear weapons are the highest risk, and should never be used.

Debate is needed on the best policy approaches to stem the proliferation of the highest-risk weapons and reduce broader global risks. Existing global discussion has focused on whether international treaties should ban the weapons, but that’s just a start. Even if autonomous weapons are banned in whole, in part, or not at all, governments must consider how to ensure they are not inadvertently exported to states not party to the ban. Restricting access by terrorist groups is a separate problem, as autonomous weapons are simple enough to be made as a classroom project. And if a new international treaty is established, an obvious question is: How can it be given teeth? If a state uses a banned autonomous weapon, should it suffer retaliatory diplomatic or economic sanctions? When, if ever, should the United Nations Security Council endorse military action?

The era of killer robots is here. What comes next is up to the world.

Zachary Kallenborn is a research affiliate with the Unconventional Weapons and Technology Division of the National Consortium for the Study of Terrorism and Responses to Terrorism, a policy fellow at the Schar School of Policy and Government, and a U.S. Army Training and Doctrine Command “Mad Scientist.” 

 


Every US President Has Committed War Crimes (Since WWII): Here’s Why…

KnowDrones was founded in 2012 to inform the American public about the illegality, immorality and dreadful human consequences of U.S. drone attacks in order to bring about: (1) a complete halt to drone attacks; and (2) an international ban on weaponized drones and military and police drone surveillance.

WI State Journal, Hubbuch: Future noise concerns could scuttle housing along planned transit corridor

original link

 

useful links:

Alders for City of Madison

Dane County Board of Supervisors

Contact other elected officials – Safe Skies Website

“With its strip malls, auto repair shops and used car lots, the stretch of East Washington Avenue between Aberg Avenue and Stoughton Road shows no signs of the revitalization happening a couple of miles to the west, near Downtown.

That could soon change with the addition of a planned bus rapid transit system.

Bill Connors, who heads a coalition of real estate developers, envisions three- and four-story buildings with ground-floor retail stores below apartments, much like those that have sprung up on the Isthmus.

City plans call for high-density housing that would both provide equitable access regardless of income and support a new bus rapid transit (BRT) system that’s expected to begin shuttling commuters between the city’s East and West sides in 2024.

But with the Air National Guard expected to begin flying a fleet of new F-35 fighter jets from nearby Truax Field in 2023, this ¾-mile strip is expected to be subject to noise levels considered too loud for residential development without significant soundproofing.

The conflict has created a dilemma for leaders of a fast-growing city in desperate need of more housing: By allowing the type of high-density development that would support rapid transit, Madison could also subject thousands more people to unhealthy levels of noise.

Connors argues the market will solve the problem, as builders who don’t do enough to muffle the sound will struggle to keep their buildings full.

City Council president Syed Abbas has appointed a council workgroup to explore possible alternatives, including a development moratorium or zoning changes, in an effort to prevent a situation where poor and minority people bear a disproportionate share of the environmental impacts.

“I have to see the situation with the lens of environmental justice,” Abbas said. “If you go historically, the market decided to put all the people of color there — Black and brown folks.”

Military decision

The Air Force last year selected the Wisconsin Air National Guard’s 115th Fighter Wing as one of the first Guard units to fly the military’s new F-35 fighter jets.

There is disagreement on just how much louder the F-35s will be compared to the F-16s that currently fly out of Truax. But there would be more takeoffs and landings, at least initially, which would increase the overall noise exposure for those living near the airport.