The Dana Centre in London is hosting a public debate about the future of robotics and society, and a recent article on BBC.com highlighted one of the major issues to be discussed: the increasing autonomous capabilities of robots and their possible consequences. It's a bit of a stretch, but I think questions of accountability such as "If an autonomous robot kills someone, whose fault is it?" sort of relate to controversies in the realm of copyright and technologies that can infringe on copyright. Should we blame the people who invent a technology for instances where that technology was used for evil (read: either killing people or copyright infringement)? Copyright holders who want to protect their work are put in a pickle when other people use their protected work in subversive ways, and I don't think a clear line can really be drawn between where credit turns into blame.
Obviously it's a bit different because with robots, there is no intervening arbiter...yet. So the onus is on the designer. But "as robots become more autonomous that line of responsibility becomes blurred."
This makes me very uncomfortable, especially when put in the context of military operations. Cool fact (...er): "Samsung has developed a robotic sentry to guard the border between North and South Korea. It is equipped with two cameras and a machine gun."
How apocalyptic is that?