The fear that mankind will become the victim of its own success in developing science and technology has always been with us. All new technologies, from fire to fusion, carry the potential for immense danger as well as immense benefit. Robots and artificial intelligence are just the latest manifestations. So far we have managed to avoid the worst nightmares of annihilation, but there is no shortage of great minds, from Isaac Asimov to Stephen Hawking and Elon Musk, warning of the dangers of Artificial Intelligence (AI) and robotics. Nick Bostrom’s Future of Humanity Institute regards AI as humankind’s greatest existential threat.
Robotethics.co.uk pulls together publicly available material about robots and artificial intelligence, making it more easily accessible to policy makers, developers, researchers, and designers of AI and robotic systems. The aim is to facilitate the consideration of ethical issues at all stages of development.
Video clips, UK Parliament, Artificial Intelligence Committee, parliamentlive.tv, 12 December 2017, under 2 minutes
- Witnesses to the Committee on AI emphasise the importance of teaching the ethics of technology in schools
- Digital Minister Matt Hancock says the UK can lead the world in understanding the ethical implications of AI
- Matt Hancock describes the role of the Centre for Data Ethics and Innovation
See the fuller deliberations of the AI Committee among the videos on the page below.
At present the www.robotethics.co.uk website is structured mainly by media type as follows:
- Audio links (mainly radio)
- Video links (mainly YouTube)
- Written material – reports, articles, blogs and news
- Organisations – working in, or with an interest in, ethics applied to AI and robotics
The site is open access. If you are aware of publicly available materials about robot or artificial intelligence ethics that might be added, want to comment, or otherwise wish to make contact, then please email: email@example.com
(The IEEE consultation on artificial intelligence is now closed, but see the response at: robot ethics response.)