
Roboethics – the big debate

Oct 16, 2019

Isaac Asimov famously proposed three Laws of Robotics:

  • A robot may not injure a human being or, through inaction, allow a human being to come to harm
  • A robot must obey the orders given it by human beings except where such orders would conflict with the First Law
  • A robot must protect its own existence as long as such protection does not conflict with the First or Second Law

Today, Asimov's Laws present roboticists with more problems than they solve, and the debate on machine ethics is ongoing. Who, or what, will be held responsible if an autonomous system malfunctions or harms humans?

Ethics and roboethics

Ethics is the branch of philosophy which studies human conduct, moral assessments, the concepts of good and evil, right and wrong, justice and injustice. 

Roboethics, also called machine ethics, deals with the code of conduct that robot design engineers must implement in a robot's artificial intelligence. Roboticists must guarantee that autonomous systems will be able to exhibit ethically acceptable behaviour in situations where robots, or any other autonomous systems, interact with humans.

Ethical issues continue to rise because we now have two distinct sets of robotic applications: service robots, created to live and interact peacefully with humans, and lethal robots, created to fight on the battlefield as military robots.

Military robots

Military robots are certainly not just a thing of the present. They date back to World War II and the Cold War. Today, military robots are being developed to fire a gun, disarm bombs, carry wounded soldiers, detect mines, fire missiles, fly, and so on. 

However, what kind of roboethics is going to be embedded in military robots, and who is going to decide on it? Asimov's laws cannot be applied to robots that are actually designed to kill humans. The debate continues…

Robots: a greater risk than nuclear weapons?

Tech billionaire Elon Musk co-founded OpenAI in 2015 with the goal of developing artificial general intelligence (AGI) able to learn and master several disciplines. When its bots beat the world’s best human players at the video game Dota 2, the world soon paid attention.

However, Mr Musk has since stepped back from the AI startup due to concerns about the risks artificial intelligence poses to humanity. He claims its development poses a greater risk than nuclear weapons.

He has also issued warnings about artificial intelligence “bot swarms”, which he believes could be the first signs of a robot takeover.

Swarm robotics takes its inspiration from the behaviour of social insects such as ants, where communication between members of the group builds a system of constant feedback. Swarm behaviour involves individuals constantly adjusting what they do in cooperation with others, which in turn shapes the behaviour of the group as a whole.
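As a rough, hypothetical illustration of that feedback loop (not taken from the article), the short Python sketch below has each simulated agent repeatedly nudge its heading towards the average heading of its nearby neighbours, so coordinated group motion emerges with no central controller. All names and numbers in it (NUM_AGENTS, NEIGHBOUR_RADIUS, the 0.3 adjustment rate) are illustrative assumptions.

    # Toy sketch (illustrative only): each agent repeatedly nudges its heading
    # towards the average heading of its nearby neighbours, so coordinated
    # motion emerges from constant local feedback, with no leader.
    import math
    import random

    NUM_AGENTS = 20          # assumed swarm size
    NEIGHBOUR_RADIUS = 5.0   # assumed sensing range
    STEPS = 50               # assumed number of update rounds

    # Each agent is (x, y, heading in radians), scattered at random.
    agents = [(random.uniform(0, 20), random.uniform(0, 20),
               random.uniform(0, 2 * math.pi)) for _ in range(NUM_AGENTS)]

    def step(agents):
        updated = []
        for x, y, heading in agents:
            # Feedback: average the headings of all neighbours within range
            # (the agent itself is always included, so the list is never empty).
            local = [h for (nx, ny, h) in agents
                     if math.hypot(nx - x, ny - y) < NEIGHBOUR_RADIUS]
            avg = math.atan2(sum(math.sin(h) for h in local),
                             sum(math.cos(h) for h in local))
            new_heading = heading + 0.3 * (avg - heading)  # partial adjustment
            updated.append((x + math.cos(new_heading),
                            y + math.sin(new_heading),
                            new_heading))
        return updated

    for _ in range(STEPS):
        agents = step(agents)

    # Headings gradually align: group-level order from purely local rules.
    print("heading spread:", max(a[2] for a in agents) - min(a[2] for a in agents))

Real swarm-robotics controllers are far richer than this, but the principle is the same: constant local feedback between individuals produces the behaviour of the whole group.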

Reaping benefits whilst avoiding pitfalls

Microsoft has invested $1 billion in the venture, and its president, Brad Smith, has separate concerns over the rise of ‘killer robots’, which he feels has become unstoppable.

He said the use of “lethal autonomous weapon systems” posed a host of new ethical questions that governments needed to consider as a matter of urgency. “Robots must not be allowed to decide on their own to engage in combat and who to kill.”

OpenAI CTO Greg Brockman sought to allay fears, saying: “We want AGI to work with people to solve problems, including global challenges such as climate change, affordable and high-quality healthcare, and personalised education. AI has such huge potential, it is vital to research how to reap its benefits while avoiding potential pitfalls.”

Are robots capable of moral decision making?

Roboethics must become increasingly important as we enter an era in which more advanced and sophisticated robots, as well as Artificial General Intelligence (AGI), are becoming an integral part of our daily lives.

Some believe that robots will contribute to building a better world. Others argue that robots are incapable of being moral agents and should not be designed with embedded moral-decision making capabilities. What do you think?

(Extracts taken from “Roboethics: The Human Ethics Applied to Robots”, Interesting Engineering, 22 September 2019, and “Elon Musk’s AI Project to Replicate the Human Brain Receives $1 Billion from Microsoft”, The Independent, 23 July 2019)

Tim's thoughts…

Engineers work to solve a problem, and the task will eventually be solved irrespective of the philosophical discussions that may revolve around it.

We cannot uninvent things; the human mind, let alone AI, works to solve challenges. It's called human nature, or “progress”.

I don't believe this natural trait can (or should) be stopped, but I also accept that those at the coalface are probably not the best-placed people to assess the ethical consequences. So who should?

