Autonomous weaponry is poised to be thrust into the military market and used by the world's leading nations.

The Campaign to Stop Killer Robots has even warned the murderous machines are on the cusp of "mass production".

But an expert in artificial intelligence believes their introduction will come at a truly devastating cost.

Oklahoma State University professor Dr Subhash Kak has warned a flaw in their design could result in a large number of deaths.

Dr Kak told Daily Star Online: "The manufacturers are cognisant of such malfunction or fault, which they will do their best to minimise or eliminate.

"At the same time they would pressure parliament or other legislative bodies to give them exemption from liability.

"There could be a bug in the code of the robot that promotes such behaviour."

He had previously told Daily Star Online: "Killer robots could easily go wrong.

"They may be used by crazed individuals or religious extremists to terrorise and kill people.

"They could go wrong due to a bug, or an unknown coding flaw that showed up as response to an unforeseen or unanticipated environment or situation.

"Or they could be hacked."

Bonnie Docherty, senior arms researcher at Human Rights Watch, which coordinates the Campaign to Stop Killer Robots, previously warned killer robots will "proliferate around the world".

And she believes they would violate ethical and legal standards.

She said: "Permitting the development and use of killer robots would undermine established moral and legal standards.

"Countries should work together to preemptively ban these weapons systems before they proliferate around the world.

"The groundswell of opposition among scientists, faith leaders, tech companies, nongovernmental groups, and ordinary citizens shows that the public understands that killer robots cross a moral threshold.

"Their concerns, shared by many governments, deserve an immediate response."

A number of countries, including the US and Russia, have recently opposed a treaty banning killer robots.

It led roboticist Noel Sharkey to slam the move as "shameful".

He said: "The two main options on the table for next year’s work were binding regulations in the form of a political declaration led by Germany and France and negotiations towards a new international law to prohibit the use and development of autonomous weapons systems led by Austria, Brazil and Chile.

"Cuba was particularly stubborn and would not accept any wording that even hinted that there might be any benefits.

"The others concede in the end with a compromise to take out the word 'risks' although the risks themselves remained.

"It is shameful that a handful of states can prevent the majority from moving towards negotiations that would regulate or prevent the use of these morally reprehensible weapons."

He added: "The campaign strongly objects to permitting the development of weapons systems that, once activated, would be able to select and attack targets without human intervention."