Canada asked to help keep 'killer robots' off battlefields

A robot is pictured in front of the Houses of Parliament and Westminster Abbey as part of the Campaign to Stop Killer Robots in London April 23, 2013. REUTERS/Luke MacGregor

Jessica Hume, National Bureau

Last Updated: 11:58 PM ET

OTTAWA — Somewhere deep in a lab in China, scientists are working toward building autonomous military machines that could some day end up on a battlefield.

It's not just China. Russia and Israel are working on their own deadly hardware.

The U.K., U.S. and South Korea have even conducted tests on autonomous weapons in military scenarios.

The potentially profound implications of this developing technology led to the formation of the Campaign to Stop Killer Robots — an international effort to see these machines banned.

The distinction the campaign draws is similar to that between nuclear energy and nuclear weapons: robots are acceptable, but killer robots are not.

Canada isn't known to be developing such weapons, and the Campaign to Stop Killer Robots believes that makes the country well positioned to lead an international treaty against their development.

However, the use of robots to fight our battles has its supporters.

Proponents point to the potential to reduce the number of human soldiers on a battlefield, and with them casualties, much as was argued for landmines decades ago.

Speaking to reporters in Ottawa on Tuesday, Ian Kerr, a University of Ottawa professor and Canada Research Chair in ethics, acknowledged that human error and computer error are both realities, but said human error in military situations is preferable.

"We're asking the question of whether we want to relinquish human control to machines," he said.

"Human frailties are part of the human condition which is part of decision making and human judgment, which is the cornerstone of our justice system."

Machines don't have contextual awareness: the ability to gauge the value of a military target or the civilian lives potentially lost, he said.

"There's no reciprocity if the decision maker is a machine. In no sense is the machine a stakeholder."

That's unlike even the use of drones — already a contentious addition to battlefields.

Washington has used drones to strike and kill terror group members and leaders overseas, raising ethical questions about such attacks.
