
Why Do Oligarchs Ignore United Nations Day? Victor Madeson

This newsletter, from Victor Madeson, is about United Nations Day.


The oligarchs didn’t want you to remember that today is the 76th United Nations Day. They know that the people, united, can never be defeated, so their obvious strategy is to keep us divided. In 1945, after the horror of World War II, 50 governments gathered in San Francisco and drafted the UN Charter, which was adopted 25 June and officially took effect on 24 October. In 1947 the UN General Assembly declared (UN Resolution 168 (II)) 24 October the anniversary date, with the significant statement that UN Day “shall be devoted to making known to the people of the world the aims and achievements of the United Nations and to gaining their support for its work”.

In 1971, the UN General Assembly recommended (Resolution 2782 (XXVI)) that the day be observed as a public holiday by UN member states, to recall how countries came together after World War II to promote peace throughout the world.

Between 1988 and 2000, the number of UN Security Council peacekeeping operations more than doubled, but a lack of U.S. support undermines the UN’s mission. For example, an 18 October 2021 Sanctions Review Report ordered by President Biden shows the US is intensifying its economic warfare despite the COVID-19 pandemic. This warfare is directed against 39 countries (a third of the world’s population), with secondary sanctions against still more countries, some of them US allies. Over half the UN’s members have denounced this use of unilateral economic measures. Pope Francis has appealed to “powerful countries to stop aggression, blockades and unilateral sanctions against any country anywhere on earth,” adding, “No to neo-colonialism.”

The purpose of the UN is described in UN, UNDHR, Oligarchy (2021, 5 pp.), with powerful illustrations that define the philosophy of oligarchy and the case for its removal. That removal was made difficult by the onset of the Cold War, but it’s not too late to start again. For example, changing the word “flag” to “Constitution” in the Pledge of Allegiance might be a good idea. That symbolic imposition was part of the same 1893 Chicago Exposition that honored the brutality of Columbus.


VETERANS: You should know that the U.S. Department of Veterans Affairs (VA) provides veterans hospital and outpatient care with “needed” services intended to promote, preserve, or restore health. The VA system has grown extensively to meet this need: over 400,000 full-time healthcare professionals and support staff work across more than 1,200 healthcare facilities, including 170 VA medical centers. The VA has three main subdivisions, each headed by an Under Secretary:
1. Veterans Health Administration (VHA): responsible for providing health care in all its forms, as well as for biomedical research, Community-Based Outpatient Clinics (CBOCs), Regional Medical Centers (VAMCs), and Readjustment Counseling Services (RCS);
2. Veterans Benefits Administration (VBA): responsible for initial veteran registration, eligibility determination, and 5 entitlement or benefit lines: Vocational Rehab & Employment, Education, Compensation & Pension, Home Loan Guarantee, and Insurance;
3. National Cemetery Administration (NCA): responsible for burial and memorial benefits for eligible veterans and family members. Most Lehigh Valley veterans use Outpatient Clinics under the Wilkes-Barre VA Medical Center (570-824-3521). Others within 65 miles of Allentown: Coatesville, Crescenz, Lebanon, Lyons.
If you haven’t kept up with what the VA offers, please download a copy of the Veterans Welcome Kit (Nov. 2020, 42 pp.). As of late July, hard copies had not reached some local clinics. The blue folder is in two parts. The second part has 13 “Quick Start” guides to VA benefits and services (2 pages each, ~Oct. 2020): Apply for VA Healthcare (p. 17); Mental-Health Services (p. 19); Community Care (p. 21); Accessing Urgent Care (p. 23); Caregiver Support Program (p. 25); Women-Veterans-Health Services (p. 27); Vet-Center-Services (p. 29); Services for 65+ Veterans (p. 31); Apply for Disability Rating (p. 33); If You Disagree with a VA Decision (p. 35); Apply for Survivor Benefits (p. 37); Apply for Education Benefits (p. 39); and Apply for Burials & Memorials (p. 41). More guides are on the VA.gov website: Modernized Decision Review Process; Understand Food & Nutrition Services; Veteran State Benefits/Services; and Whole Health Services. Two of these guides cover benefits for veteran family members. Most VA clerks don’t know how to explain such benefits to veterans.
Simple challenge: go to a Vet Center or clinic as if for the first time. Will you get a complete written description of VA services? If you do, can you quickly find someone to verbally provide an accurate explanation of the relevant items?
Also: (1) A veteran over age 64 can be presumed disabled and may be entitled to a Veterans Pension, but must apply (it is means-tested and won’t do much for veterans with household wealth, but could be a big deal for someone in poverty or in a nursing home).
(2) Consider getting Planning Your Legacy: VA Survivors and Burial Benefits Kit at www.benefits.va.gov/BENEFITS/docs/VASurvivorsKit.pdf (2020, 68 pp.). Veterans and spouses can access 135 national cemeteries.
(3) Service-connected disability claims are best handled through Veterans Service Organizations (VSOs); see the Q&A at https://crsreports.congress.gov/product/pdf/R/R46412.
(4) During a medical or behavioral-health emergency, the VA encourages veterans to seek immediate attention; there is no need to check with the VA before calling an ambulance. See VA FS 20-43 (Apr. 2021, 1 p.).
For now, please share as you feel appropriate. Feedback welcomed.
With liberty and justice for all, 
Victor Madeson (7PA/134)
(VFW, VVA, DAV, TPF)

The Poison Is Here, the Poison Is There: Forever Chemicals

We know the poison is here.  The decision makers play a shell game and shelter the government and corporate manufacturers.

I wish the elected and appointed would speak truth, damn it. Instead, the corporate mainstream news sources and the politicians with any real power act, most of the time, as a filter serving their corporate and government owners.

I’m no expert, but I believe that if Truax is a polluted Superfund site, they should not be allowed to stir up the pollution with their construction projects. It seems as though government, especially the federal government, is above the law, and justice is ignored. They are building a new terminal now and getting ready for more weapons of war [-crimes].

That is not how a republic and representative government are supposed to work. In the end, I guess, we get what we allow.

We also allow the rights of human beings around the world to be violated to serve the profits of the connected and powerful.

“Michael Regan, the head of the Environmental Protection Agency, said his agency is taking a series of actions to limit pollution from a cluster of long-lasting chemicals known as PFAS that are increasingly turning up in public drinking water systems, private wells and even food.

The Defense Department said it is moving to assess and clean up PFAS-contaminated sites throughout the country, while the Food and Drug Administration will expand testing of the food supply to estimate Americans’ exposure to PFAS from food. The plan is intended to restrict PFAS from being released into the environment, accelerate cleanup of PFAS-contaminated sites such as military bases and increase investments in research to learn more about where PFAS are found and how their spread can be prevented.”

Associated Press



Per- and polyfluoroalkyl substances (PFAS) are a group of man-made chemicals that includes PFOA, PFOS, GenX, and many other chemicals. There is evidence that exposure to PFAS can lead to adverse human health effects.
https://www.safeskiescleanwaterwi.org/pfas/

Oct 26: VIGIL AGAINST DRONES ~ Wisconsin ~

DRONES KILL INNOCENT PEOPLE

VIGIL AGAINST THE DRONES

OUTSIDE THE GATES OF VOLK FIELD

TUESDAY OCTOBER 26, 2021    3:30-4:30 pm

We need YOU there


Dear Friends,

Killer drones continue to play a huge role in terrorizing people in the Middle East and in parts of Africa. Even though Biden has declared the war in Afghanistan over, he has pledged to mount what have been dubbed “over the horizon” attacks, using drones to continue to kill innocent children, women, and men in Afghanistan, as well as in Iraq, Yemen, Syria, Libya, Somalia, and Niger, writes well-known attorney and activist Marjorie Cohn in her article, War in Afghanistan Isn’t Over – It’s Taking the Form of Illegal Drone Strikes.

This article is a must-read: Cohn outlines several specific ways in which attacks using killer drones are illegal under international law, including the UN Charter, the Geneva Conventions, and the International Covenant on Civil and Political Rights. These are all treaties that the U.S. has ratified.

DRONE WARFARE IS ILLEGAL!  And yet Volk Field continues to be a training ground for operating killer drones.

[Photo: Caskets are carried toward a gravesite as relatives and friends attend a mass funeral for members of a family said to have been killed in a U.S. drone airstrike, in Kabul, Afghanistan, on Aug. 30, 2021. (Marcus Yam / Los Angeles Times / TNS)]

WE MUST STAND TOGETHER AGAINST THE TERROR OF DRONE WARFARE.

The vigil at Volk Field is a legal vigil where we will be on public property.  As always, it will be a solemn vigil, remembering the victims of US government drone attacks.

DIRECTIONS – To get to the vigil, take the Camp Douglas exit off Interstate 90/94 between Mauston and Tomah.  When you exit take County Rd. C to the northeast.  You will see the base straight ahead, but follow County Rd. C to the right and within a few blocks is a picnic wayside where you can park along the side of the road.  The wayside is closed for the season and bathrooms are not available.

THE VIGIL – We will gather at the wayside around 3:15 for introductions and to review the plan for the vigil, and then process together to the gates of the base where we will hold a solemn vigil for one hour to remember those killed by drones.  Participants can stand in silence or read poems and stories about the effects of drone warfare.  It is important that the voices of the victims be brought to the gates of Volk Field.

Bring posters if you can.

A WORD ABOUT THE WEATHER – If you have questions about the vigil because of the weather, please call Joy at 608-239-4327 or Bonnie at 608-256-5088 for an update.

CARPOOLING –  If you are interested in carpooling to Volk Field from Madison, please contact Bonnie at 608-256-5088.

We hope to see you at the vigil on October 26. If you can’t come this time, mark your calendar: we usually hold a vigil on the 4th Tuesday of every month. If you have any questions, please call or email Joy at 608-239-4327 or [email protected], or Bonnie at 608-256-5088 or [email protected].

Peace,

Joy and Bonnie

Wisconsin Coalition to Ground the Drones and End the Wars

Killer Robots: Applying arms-control frameworks to autonomous weapons

October 5, 2021  |  Zachary Kallenborn

Brookings

“Mankind’s earliest weapons date back 400,000 years—simple wooden spears discovered in Schöningen, Germany. By 48,000 years ago, humans were making bows and arrows, then graduating to swords of bronze and iron. The age of gunpowder brought flintlock muskets, cannons, and Gatling guns. In modern times, humans built Panzer tanks, the F-16 Fighting Falcon, and nuclear weapons capable of vaporizing cities.

Today, humanity is entering a new era of weaponry, one of autonomous weapons and robotics.

The development of such technology is advancing rapidly and poses hard questions about how its use and proliferation should be governed. In early 2020, a drone may have been used to attack humans autonomously for the first time, a milestone underscoring that robots capable of killing may be widely fielded sooner rather than later. Existing arms-control regimes may offer a model for how to govern autonomous weapons, and it is essential that the international community promptly address a critical question: Should we be more afraid of killer robots run amok or of the insecurity of giving them up?

The current state of autonomy

The first use of an autonomous weapon to kill is thought to have occurred in March of 2020 in Libya, but what actually happened remains murky. According to a UN report, a Turkish-made Kargu-2 drone is reported to have autonomously “hunted down” members of the Libyan National Army. If the manufacturer’s claims are correct, the Kargu-2 can use machine learning to classify objects, apparently allowing it to “autonomously fire-and-forget.” Turkey denies using the Kargu-2 in this way, though it seems to acknowledge that the Kargu-2 can be used autonomously. Regardless of whether the Kargu-2 was used autonomously in the episode in Libya, the claim that it can be autonomous is plausible on its face.

The United Nations report about the Kargu-2 caused an uproar. Sensationalist headlines compared the Kargu-2 to a “Terminator-style AI drone” that “hunted down human targets without being given orders.” These stories conjured images of out-of-control, sentient robots killing as they saw fit. To be blunt, that is nonsense. Although artificial intelligence—technically, narrow AI that is superhuman at a single task—can beat the world’s best human chess and Go players, that is far from a generalized, human-level intelligence like the Terminator. In fact, a sticky note is enough to convince a cutting-edge machine vision system that an apple is an iPod.

But simple autonomy is not that hard, and autonomous weapons have been a feature of warfare for centuries. Autonomy is about machines operating without human control: the weapon just needs a sensor, a way to process the sensor information, and a way to activate the harmful payload. During the American Civil War, Confederate forces deployed the “Rains Patent,” a simple landmine made of sheet iron with a brass cap sealed in beeswax to protect the fuse. When Union soldiers put sufficient pressure on the Rains Patent, it exploded.

Modern autonomous weapons play a real but relatively limited role in military operations. The Ottawa Convention banned anti-personnel mines, but anti-vehicle and sea mines are still used. Loitering munitions are somewhere between a drone and a missile, hovering above a battlefield and striking targets that meet various designations. The U.S. Phalanx close-in weapon system, the Israeli Iron Dome, and various active defense systems defend against incoming missiles and other close-range risks with varying degrees of autonomy. Along the demilitarized zone between South and North Korea, South Korea has deployed the SGR-A1 gun turret, which reportedly has an optional fully autonomous mode. This is just the beginning.

Though some dismiss it as hype, artificial intelligence has won over the world’s great military powers as the next great military technology. The U.S. National Security Commission on AI recently concluded that “properly designed, tested, and utilized AI-enabled and autonomous weapon systems will bring substantial military and even humanitarian benefit.” The Chinese People’s Liberation Army believes AI could fundamentally change the character of warfare. Russian President Vladimir Putin’s claim—that the world’s AI leader “will become the ruler of the world”—has become cliché.

Such excitement has translated into new research, prototypes, and increasingly operational autonomous weapons of growing sophistication. The Defense Advanced Research Projects Agency—the U.S. military’s high-risk, high-reward research and development center that helped birth the internet and GPS—ran a virtual dogfight last year between an artificial intelligence and a human F-16 Fighting Falcon pilot. The AI beat the human in each of five rounds. China is doing the same, with similar results. The United States, China, and Russia are all developing loyal wingman drones: unmanned semi-autonomous or autonomous aircraft that support manned aircraft. The pilot provides strategic decisions, while the artificial intelligence manages the details.

Growing autonomy is closely tied to the rise of unmanned platforms. Numerous states are testing, building, and deploying a wide range of unmanned aircraft, ships, submarines, and tanks. Unmanned platforms require remote orders to achieve their mission. That’s tough when militaries cannot provide the full staff needed and pilots burn out from overwork. Plus, enemies seek to jam, manipulate, or otherwise interfere with the signals from the pilots to the drone. The more that unmanned platforms can operate without human control, the less need for those signals and the people sending them.

States are integrating unmanned platforms into drone swarms, and in May, Israel became the first country to deploy a drone swarm in combat. In a true drone swarm, the drones communicate and collaborate, forming a single weapons platform. While drone swarms are not necessarily autonomous weapons, no human could control 10,000 drones without the help of artificial intelligence. Israel’s groundbreaking swarm appears to have consisted of an unknown number of small drones equipped with a mixture of sensors and weapons, and its use is just the beginning. India tested a 75-drone swarm last year, and earlier this year, South Africa’s Paramount Group revealed a swarming system of 41-kilogram long-range drones that cruise at more than 100 miles per hour. Russia, meanwhile, is designing swarms for anti-submarine warfare. Numerous other states are developing other swarm applications.

Assuming these trends continue, autonomous weapons will increasingly enter the battlefield. For some, that’s terrifying.

In his Christmas address of 1937, the Archbishop of Canterbury posed a prescient question: “Who can think without horror of what another widespread war would mean, waged as it would be with all the new weapons of mass destruction?” Arms control debates are rooted in fear, and treaties to control the spread of weapons have sought to address those fears. Advocates of arms-control treaties fear the consequences of horrific weapons of war spreading widely. Opponents fear what might happen if their adversaries build such weapons but they cannot. These dueling fears animate everything from gun debates around the dinner table to nuclear arms debates at the United Nations. The proliferation of autonomous weapons creates new fears: that an autonomous weapon might misidentify a civilian as a soldier and kill them, and that autonomous weapons will give an enemy state a decisive edge in war.

In an autonomous weapon, the system decides when to engage by processing environmental stimuli. Landmines, for example, use simple pressure sensors—the sensor’s sensitivity determines whether the heft of a tank or the hands of a child is enough to trigger the explosion. Conversely, an anti-radar loitering munition homes in on radar signals. The risk of error—and by extension the arms-control concern—depends on the type of environmental stimuli, how the stimuli are processed, and the type of decisions made.

Emerging autonomous weapons using machine learning process stimuli in more complex ways. Machine learning systems rely on large amounts of data to draw conclusions about what the system observes. But the data dependence also makes them brittle. Color differences, tree branches, or foggy days may confound the ability of the system to correctly identify a target. Although some states may adopt robust verification and testing programs to increase reliability, others may not. As autonomous weapons are deployed in larger numbers, arms control advocates fear a higher likelihood of something going horrifyingly wrong.

As autonomous weapons scale into massive drone swarms, their uncontrollability and potential for mass harm create a new weapon of mass destruction. Imagine 1,000 Slaughterbots flitting about a city, deciding who to kill. And that’s not terribly outlandish: India wants to build a swarm of 1,000 drones operating without human control, and the Naval Postgraduate School is modeling swarms of up to a million drones operating underwater, on the ocean’s surface, and in the air. Particularly nefarious governments might equip the drones with facial recognition to assassinate regime opponents or carry out ethnic cleansing. States have adopted a wide range of policies to reduce similar risks from traditional weapons of mass destruction, including export controls, arms control treaties, and deterrent and coercive threats. If drone swarms are weapons of mass destruction, they deserve similar risk reduction policies.

At the same time, militaries see great value in the development of autonomous weapons. Autonomous weapons offer speed: a typical human takes 250 milliseconds to react to something they see, while an autonomous weapon can respond far faster—Rheinmetall Defense’s Active Defense System can react to an incoming rocket-propelled grenade in less than one millisecond. According to General John Murray, head of the U.S. Army Futures Command, that speed may be necessary to defend against massive drone swarms. In this view, these weapons may be the difference between survival and defeat, and giving them up in an arms control treaty would be foolish.

Militaries also dispute the risk of error. Humans get tired, frustrated, and over-confident, and that creates mistakes. Autonomous weapons have no such emotions, and advocates of military AI applications argue a reduced error rate makes pursuing the technology a moral imperative. Plus, artificial intelligence can improve aiming, which reduces collateral harm. For example, Israel reportedly used an artificial intelligence-assisted machine gun to assassinate an Iranian nuclear scientist without hitting the scientist’s wife inches away. So, in their view, what are arms control advocates really afraid of?

Work on this issue is ongoing. Diplomats have debated autonomous weapons under the United Nations Convention on Certain Conventional Weapons Group of Governmental Experts on Lethal Autonomous Weapons since 2014. These meetings have done little to create an international treaty on autonomous weapons, but they have helped clarify state positions, brought greater attention to the topic, and better articulated concerns regarding autonomous weapons. Arms control advocates have called for bans and new treaties, but these vary in scope. Groups like the Campaign to Stop Killer Robots argue all autonomous weapons must be banned. Others, like the International Committee of the Red Cross, have a more nuanced view, focusing on “unpredictable” weapons. Human Rights Watch estimates 30 states have endorsed a complete ban. Great military powers have resisted a new arms-control regime, arguing existing international law is sufficient to cover autonomous weapons. Researchers have also floated alternatives to arms control treaties, such as norms and bilateral and multilateral confidence-building measures.

The global community must now resolve the tension between the fears of arms-control advocates and those of militaries. That means serious debate on which types of autonomous weapons offer the most military value and which present the most risk to civilians and noncombatants. Weapons with high risk to civilians and low military value should form the basis of conversations around risk reduction.

Existing arms control treaties offer models for addressing these complexities. The Ottawa Convention on Anti-Personnel Landmines narrowly focuses on anti-personnel landmines, excluding anti-vehicle landmines that require high pressure to detonate. An autonomous-weapons treaty might similarly focus on anti-personnel weapons that use machine learning, given the challenge of distinguishing farmers from soldiers. A more precise treaty could allow military powers to separate the weapons they fear giving up from the weapons arms control advocates fear proliferating, making it easier to win those powers’ approval.

Autonomous weapons might also be tiered based on characteristics that make them more or less risky, akin to the Chemical Weapons Convention’s schedules. The convention divides chemical agents into three schedules based on their historical use as chemical weapons and their civilian uses, and chemicals in each schedule face different restrictions. Autonomous weapons could likewise be tiered by the risk they pose, particularly the risk to civilian populations if the weapon errs and the likelihood of such an error. Defensive turrets used at sea against incoming missiles would likely sit in the lowest-risk tier, while offensive weapons that target people using machine learning would occupy a higher tier. Autonomous chemical, biological, radiological, and nuclear weapons are the highest risk and should never be used.

Debate is needed on the best policy approaches to stem the proliferation of the highest-risk weapons and reduce broader global risks. Existing global discussion has focused on whether international treaties should ban the weapons, but that’s just a start. Whether autonomous weapons are banned in whole, in part, or not at all, governments must consider how to ensure they are not inadvertently exported to states not party to a ban. Restricting access by terrorist groups is an additional, distinct problem, since autonomous weapons are simple enough to be made as a classroom project. And if a new international treaty is established, an obvious question is: How can it be given teeth? If a state uses a banned autonomous weapon, should it suffer retaliatory diplomatic or economic sanctions? When, if ever, should the United Nations Security Council endorse military action?

The era of killer robots is here. What comes next is up to the world.”

Zachary Kallenborn is a research affiliate with the Unconventional Weapons and Technology Division of the National Consortium for the Study of Terrorism and Responses to Terrorism, a policy fellow at the Schar School of Policy and Government, and a U.S. Army Training and Doctrine Command “Mad Scientist.” 



Every US President Has Committed War Crimes (Since WWII): Here’s Why…

KnowDrones was founded in 2012 to inform the American public about the illegality, immorality, and dreadful human consequences of U.S. drone attacks, in order to bring about (1) a complete halt to drone attacks and (2) an international ban on weaponized drones and on military and police drone surveillance.