Category Archives: T – war

UK Debating Killer Robots

 

Representatives from both sides of the House of Commons in the United Kingdom agree that fully autonomous weapons raise numerous concerns warranting further deliberation, including at the international level. In a parliamentary adjournment debate held late in the evening of 17 June 2013, however, the Parliamentary Under-Secretary of State at the Foreign and Commonwealth Office, Alistair Burt, emphasized that the government does not support the call for a moratorium on these future weapons, described in the debate as “lethal autonomous robotics”, which would select and attack targets without further human intervention.

Burt said the statement that ’robots may never be able to meet the requirements of international humanitarian law’ is “absolutely correct; they will not. We cannot develop systems that would breach international humanitarian law, which is why we are not engaged in the development of such systems and why we believe that the existing systems of international law should prevent their development.” He emphasized that as a matter of policy, “Her Majesty’s Government are clear that the operation of our weapons will always be under human control as an absolute guarantee of human oversight and authority and of accountability for weapons usage.”

 

Ref: United Kingdom debating killer robots – Campaign to Stop Killer Robots

Losing Humanity: The Case against Killer Robots

On November 21, 2012, the US Department of Defense issued its first public policy on autonomy in weapons systems. Directive Number 3000.09 (the Directive) lays out guidelines for the development and use of autonomous and semi-autonomous weapon systems by the Department of Defense. The Directive also represents the first policy announcement by any country on fully autonomous weapons, which do not yet exist but would be designed to select and engage targets without human intervention.

The Directive does not, however, put in place a preemptive ban on fully autonomous weapons. For a period of up to ten years, it allows the Department of Defense to develop or use only fully autonomous systems that deliver non-lethal force, unless department officials waive the policy at a high level. Importantly, the Directive also recognizes some of the dangers to civilians of fully autonomous weapons and the need for prohibitions or controls, including the basic requirement that a human being be “in the loop” when decisions are made to use lethal force. The Directive is in effect a moratorium on fully autonomous weapons with the possibility for certain waivers. It also establishes guidelines for other types of autonomous and semi-autonomous systems.

While a positive step, the Directive does not resolve the moral, legal, and practical problems posed by the potential development of fully autonomous systems. As noted, it is initially valid for a period of only five to ten years, and may be overridden by high-level Pentagon officials. It establishes testing requirements that may be unfeasible, fails to address all technological concerns, and uses ambiguous terms. It also appears to allow for transfer of fully autonomous systems to other nations and does not apply to other parts of the US government, such as the Central Intelligence Agency (CIA). Finally, it lays out a policy of voluntary self-restraint that may not be sustainable if other countries begin to deploy fully autonomous weapons systems, and the United States feels pressure to follow suit.

 

Ref: Review of the 2012 US Policy on Autonomy in Weapons Systems – Human Rights Watch
Ref: Say no to killer robots – The Engineer
Ref: Losing Humanity: The Case against Killer Robots – Human Rights Watch

Campaign to Stop Killer Robots

 

Over the past decade, the expanded use of unmanned armed vehicles has dramatically changed warfare, bringing new humanitarian and legal challenges. Now rapid advances in technology are resulting in efforts to develop fully autonomous weapons. These robotic weapons would be able to choose and fire on targets on their own, without any human intervention. This capability would pose a fundamental challenge to the protection of civilians and to compliance with international human rights and humanitarian law.

Several nations with high-tech militaries, including China, Israel, Russia, the United Kingdom, and the United States, are moving toward systems that would give greater combat autonomy to machines. If one or more chooses to deploy fully autonomous weapons, a large step beyond remote-controlled armed drones, others may feel compelled to abandon policies of restraint, leading to a robotic arms race. Agreement is needed now to establish controls on these weapons before investments, technological momentum, and new military doctrine make it difficult to change course.

Allowing life or death decisions to be made by machines crosses a fundamental moral line. Autonomous robots would lack human judgment and the ability to understand context. These qualities are necessary to make complex ethical choices on a dynamic battlefield, to distinguish adequately between soldiers and civilians, and to evaluate the proportionality of an attack. As a result, fully autonomous weapons would not meet the requirements of the laws of war.

Replacing human troops with machines could make the decision to go to war easier, which would shift the burden of armed conflict further onto civilians. The use of fully autonomous weapons would also create an accountability gap, as it is unclear who would be legally responsible for a robot’s actions: the commander, programmer, manufacturer, or the robot itself? Without accountability, these parties would have less incentive to ensure robots did not endanger civilians, and victims would be denied the satisfaction of seeing someone punished for the harm they suffered.

 

Ref: Campaign to Stop Killer Robots

Machine That Predicts Future

 

An Iranian scientist has claimed to have invented a ‘time machine’ that can predict the future of any individual with 98 per cent accuracy.

Serial inventor Ali Razeghi registered “The Aryayek Time Traveling Machine” with Iran’s state-run Centre for Strategic Inventions, The Telegraph reported.

According to a Fars news agency report, Mr Razeghi, 27, claims the machine uses algorithms to produce a print-out of the details of any individual’s life between five and eight years into their future.

Mr Razeghi, quoted in the Telegraph, said: “My invention easily fits into the size of a personal computer case and can predict details of the next 5-8 years of the life of its users. It will not take you into the future, it will bring the future to you.”

Razeghi is the managing director of Iran’s Centre for Strategic Inventions and reportedly has another 179 inventions registered in his name.

He claims the invention could help the government predict military conflict and forecast fluctuations in the value of foreign currencies and oil prices.

According to Mr Razeghi his latest project has been criticised by his friends and family for “trying to play God”.

 

Ref: Iranian scientist claims to have invented ‘Time Machine’ that can predict the future – The Independent (via DarkGovernment)

Palantir

 

In Afghanistan, U.S. Special Operations Forces use Palantir to plan assaults. They type a village’s name into the system and a map of the village appears, detailing the locations of all reported shooting skirmishes and IED, or improvised explosive device, incidents. Using the timeline function, the soldiers can see where the most recent attacks originated and plot their takeover of the village accordingly. The Marines have spent years gathering fingerprint and DNA evidence from IEDs and tried to match that against a database of similar information collected from villagers. By the time the analysis results came back, the bombers would be long gone. Now field operatives are uploading the samples from villagers into Palantir and turning up matches from past attacks on the spot, says Samuel Reading, a former Marine who works in Afghanistan for NEK Advanced Securities Group, a U.S. military contractor. “It’s the combination of every analytical tool you could ever dream of,” Reading says. “You will know every single bad guy in your area.”
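
The on-the-spot matching Reading describes is, at its core, a lookup of a freshly collected identifier against a store of samples recovered from earlier attacks. The sketch below is a generic, minimal illustration of that idea using SQLite; the table layout, column names, and sample identifiers are invented for the example and are not Palantir’s actual data model or interfaces.

```python
# Toy illustration of the matching step described above: a field-collected
# biometric identifier is checked against records gathered from past IED
# incidents. Generic SQLite sketch; table names, columns, and sample values
# are invented for illustration and do not reflect any real system.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE ied_samples (
                    sample_id TEXT, incident_date TEXT, village TEXT)""")
conn.executemany(
    "INSERT INTO ied_samples VALUES (?, ?, ?)",
    [("print-7f3a", "2011-04-02", "Village A"),
     ("print-91c0", "2011-06-17", "Village B")])

def match_field_sample(sample_id):
    """Return past incidents whose recovered sample matches this identifier."""
    return conn.execute(
        "SELECT incident_date, village FROM ied_samples WHERE sample_id = ?",
        (sample_id,)).fetchall()

# A sample collected from a villager turns up a hit from a past attack.
print(match_field_sample("print-7f3a"))   # [('2011-04-02', 'Village A')]
```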

 

Ref: Palantir, the War on Terror’s Secret Weapon – Bloomberg (via zmaril)

Navy Identifies Software Algorithms As a Major Point of Concern For 2013

Enter the Office of Naval Research. One of its new special program announcements for 2013 identifies software algorithms as a major point of concern: it wants more robust logic tools that play nicely across hardware and software platforms, pre-assembling a mosaic of threats. Don’t bother writing them better search tools for sifting through their data archives: the Navy expressly rules that out. It wants the imaging equivalent of pre-cut vegetables in a salad bag.

[…]

A related effort, called Automated Image Understanding, gets more explicit. It’s about “detection and tracking of objects on water or in urban areas and inferring the threat level they may pose” — sharply enough that the algorithm should be able to pick out “partially occluded objects in urban clutter.” All this has to happen in real time.
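
As a rough illustration of what “detection and tracking of objects … and inferring the threat level they may pose” might look like at toy scale, the sketch below uses OpenCV background subtraction to flag moving blobs in a video and assigns each a crude score. The video path, the area threshold, and the scoring heuristic are all assumptions made for the example; a real system would need far more capable detectors, persistent tracks, and contextual reasoning to handle partial occlusion and urban clutter in real time.

```python
# Minimal "detect and score" sketch over a video stream.
# OpenCV background subtraction stands in for the much more sophisticated
# detectors the ONR announcement envisions; the "threat" heuristic
# (larger blobs score higher) is invented purely for illustration.
import cv2

VIDEO_PATH = "harbor.mp4"   # assumed input file
MIN_AREA = 500              # ignore blobs smaller than this many pixels

cap = cv2.VideoCapture(VIDEO_PATH)
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame_idx += 1

    # Foreground mask: pixels that differ from the learned background model.
    mask = subtractor.apply(frame)
    mask = cv2.medianBlur(mask, 5)

    # OpenCV 4.x returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        area = cv2.contourArea(c)
        if area < MIN_AREA:
            continue
        x, y, w, h = cv2.boundingRect(c)
        # Toy score: normalized blob area. A real system would fuse track
        # history, classification, and context instead of raw size.
        threat = min(1.0, area / 50_000.0)
        print(f"frame {frame_idx}: object at ({x},{y}) size {w}x{h} "
              f"threat={threat:.2f}")

cap.release()
```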

 

Ref: Navy Wants You to Write Algorithms That Automatically ID Threats – Wired

The Overly Documented Life

I’m acting healthier. I walk my ten thousand steps, I pass up my son’s offer of pink ice-cream-filled Oreos.

And yet, sometimes my Mood Panda drops to 3. I feel like I’m getting a preview of a dystopia worthy of a young-adult novel. When we all start extreme recording, we’ll all have to censor ourselves. We’ll all be as careful as politicians, knowing that we risk making our own version of Romney’s 47 percent remark. We’ll all have to worry more about hackers and Big Brother poaching our data. It will be a world with a lot less mystery, which might mean a lot less fun. How do you plan a surprise party when all your friends know exactly where you are at all times?

And yes, you’ll have a full record of life — but will it be the record of a lesser life? Because that’s the problem with reality — it’s not really life. Reality is messy, nuanced, repetitive, and dull.

 

Ref: The Overly Documented Life – Esquire (via Fresser)