Eric Hilgendorf of Würzburg University admitted to the Süddeutsche Zeitung newspaper that the idea of putting a robot on trial was still "pure science fiction."
But he said he was "sure we will live to see the day when a machine is held culpable in a civil suit, or when we demand compensation for damage caused by a robot."
Alongside robotics expert Klaus Schilling, Hilgendorf has been running a research centre for robot law at his university for the past two years. He is now in the final stages of organising the first major international robot law congress, to be held at the Centre for Interdisciplinary Research in Bielefeld on May 7.
One of the most treacherous ethical fields is care for the sick and elderly. "Care robots are already in use in Japan, and will probably start caring for the old and senile in Germany in the next five years," he said.
"A machine stands by the bed and gives a signal when a patient calls for help or when they need to take medication. And the patient can even have rudimentary conversations with it."
"It's possible that the machine mistakes serious breathing problems for snoring and fails to signal for the doctor, and the patient then suffers a stroke," he said.
In a case like this, who is legally liable for the resulting financial costs? The manufacturer, the programmer, the doctor, or even the victim? Or is there a way of making the machine itself responsible?
"Of course, the machine has no assets of its own, but we have considered whether autonomous machines could be legally required to carry insurance," said Hilgendorf. "But then the industry would have to be prepared to offer that kind of insurance policy. A kind of liability insurance."
Next week's congress will bring together an eclectic mix of lawyers, engineers, and philosophers to discuss such issues.
Philosopher Andreas Matthias proposed in 2008 that there should be a criminal law for robots, arguing that at some point robots will be responsible for their own actions, much like a teenager who leaves home.
But Hilgendorf dismisses such discussions for his forum. "It's not about tin comrades who develop consciousness and create problems like in 'Terminator'," he told the Süddeutsche Zeitung.
"It's more about autonomous machines, which will need insurance cover. We need neither new laws nor a new criminal law."
He says insurance liability throws up enough problems to be going on with. "Who is liable if automated parking instructions cause an accident?" the professor pondered.
"At the moment, the driver still makes the final decisions, but with safety measures like speed limits, that won't be the case for long. New systems prevent a driver from racing through a town at 250 kph, but that also takes away his power to make a final decision."
Another example is surgical robots that perform operations. "In Germany there was already a 'Robodoc', but the Federal Court stopped its use because it made mistakes in hip replacement operations. And yet studies have confirmed that human doctors make more mistakes. But we forgive humans, not robots."