ROBOTS VS HUMANS
COULD ROBOTS CROSS THE LINE SOMEDAY?
BY Angelo Fernando
What images come to mind when you think of robots? Robotic arms on assembly lines or humanoids in sci-fi movies? Or perhaps you’re thinking of autonomous vehicles, which are essentially robots on wheels.
As robotics matures and becomes more embedded in life, the looks and roles of robots have evolved and the lines have blurred – I’m talking about the lines between machines that we have built to benefit humanity and those that could work against us.
Robots that could replicate human skills were only the start. Large portions of the working population could be replaced by these machines.
Don’t get me wrong… I am all for the use of precision robots or the emerging field of drone journalism for data gathering, and even exoskeletons to assist people who would otherwise be immobile. But could other applications go too far, given the capacity to combine robotics with AI?
To look at this more objectively, let’s consider two types of robots.
BATTLE ROBOTS
AI experts recently called for a boycott of the Korea Advanced Institute of Science and Technology (KAIST) because they see its work on autonomous weapons as the start of a new arms race – one marching towards autonomous machines of war. Translation: a time when machines would be at war with machines.
The US military has been working with Lockheed Martin to develop autonomous armoured vehicles. And then there’s Google, which has become engaged in this space and is working with the military on artificial intelligence to analyse data gathered by drones.
BENIGN BOTS
A more benign pedigree of robots comes in the form of disc-shaped vacuum cleaners and bots that roam around the office delivering files to cubicle dwellers. Or consider the da Vinci surgical robot that demonstrated its skill in a user-friendly setting at a coffee shop in Alabama.
This gangly four-armed robot performs gall bladder and hernia procedures. Da Vinci is not an autonomous robot; it needs a human operator – a surgeon at a console.
But as we rush to automate, develop machine learning and embrace AI, we could be crossing the line without even realising it.
Are we really ready for this?
LAWS OF ROBOTICS
I teach robotics, so I realise how easy it is to see the technology as a magic bullet while neglecting to frame the field with questions of expediency and ethics. That mindset propels us to develop breakthrough robots simply because we can.
There is such a thing as the Four Laws of Robotics, which were written not by policy makers or ethicists but by science fiction writer Isaac Asimov.
The first law says that a robot may not injure a human being or, through inaction, allow a human being to come to harm. The second states that a robot must obey orders given to it by human beings, except where such orders would conflict with the first law. The third is that a robot must protect its own existence as long as such protection does not conflict with the first or second law. And finally, the fourth law – the ‘zeroth law’ that Asimov added later – decrees that a robot may not harm humanity. Not just a human being but humanity at large.
All this comes into sharp focus today because of developments involving robotics on a global scale. Recently, the UN addressed the topic through the Convention on Certain Conventional Weapons (CCW) and the UN Institute for Disarmament Research. The debate involved what ‘autonomous’ and ‘automated’ imply, and what constitutes human control of these devices.
You know it was serious business when 116 leaders in the field of AI and robotics wrote to the UN last year, urging it to step in and possibly ban the development of autonomous weapons of war – a.k.a. killer robots. Signatories to the 2017 letter included the founders of SpaceX and Google’s DeepMind.
Back in 2013, there had already been calls for the UN to put the brakes on the development of autonomous robots.
It is reported that 123 member states at the CCW conference unanimously agreed to begin formal discussions on autonomous weapons. Yet, only 19 of them called for an outright ban. Read between the lines and the dire implication is clear: agreeing to talk is not the same as agreeing to stop.
HEARTLESS ROBOTS
In April this year, Bonnie Docherty, a senior researcher in the arms division at Human Rights Watch, told a UN gathering that ‘compassionless machines’ do not appreciate the value of human life. “Delegating life-and-death decisions to machines undermines the dignity of their victims,” she cautioned.
The UN should not be the only referee in this game. And however much drone strikes may resemble video games, this is not a game – robots are here to stay. Whether they someday make our coffee or deliver our packages from Amazon, without laws or ethical guardrails these same devices could undermine us or work against us.
So while we cheer on robots that can play table tennis or assist the elderly, let’s proceed with caution before the game moves beyond human control.
The game will change when robots learn empathy and are able to react viscerally to stimuli in the world. Then Asimov’s laws will have to change, for we would be creating a ‘new slavery’ if we don’t make robots full participants in society.
This is a scary thought! Why do we want robots to take over? Is this the end of humanity?
I learnt many lessons from this article, and it has changed my mind about robots and technology that goes beyond human understanding. A few billionaires may be doing this because they think they are having fun, but this is serious business that will affect future generations in a big way.
I agree that we should be watchful of heartless robots, as I think the writer is saying. The world has become a greedy place because we want to do things for the sake of it and experiment just to prove that we are geniuses.
While all of this is going on, millions of human beings are living in poor conditions – so why don’t we divert the money we spend on such high tech to help them? Something is wrong!