|Course name||Seminar: The Ethics of "Deathbots"|
|Current number of participants||16|
|Home institute||LE Cognitive Science|
|Course type||Seminar in category "Offizielle Lehrveranstaltungen" (official courses)|
|First date||Wednesday, 06.04.2022 14:00 - 16:00|
|Format||In-person sessions without video recording [in person]|
- Wednesday: 14:00 - 16:00, weekly (11x)
- No room preference
- Bachelor of Science Informatik > CS-BWP-PHIL - Philosophy for Cognitive Science, valid from WiSe 2019/20
- Bachelor of Science Mathematik > CS-BWP-PHIL - Philosophy for Cognitive Science, valid from WiSe 2019/20
- Bachelor of Science Cognitive Science > CS-BWP-PHIL - Philosophy for Cognitive Science, valid from WiSe 2019/20
- Master of Science Mathematik > CS-BWP-PHIL - Philosophy for Cognitive Science, valid from WiSe 2019/20
Is it wrong to create deathbots? Or might they, on the contrary, be a welcome development, and if so, why? Should deathbots be permitted, or should their use be restricted? What does the use of deathbots mean for our understanding of the deceased and the bereaved? These are pressing ethical questions as the first deathbots enter the market. The mere possibility of programming deathbots does not mean that they are ethically permissible and should be implemented; at the same time, the fact that they mark a new development does not automatically make them ethically impermissible. This seminar gives an overview of the ethics of deathbots. We will discuss literature on the ethics of chatbots and deathbots, contrast deathbots with other (digital) afterlife presences, and look at the ethics of digital remains more broadly.
The course is part of the admission set "Anmeldung gesperrt (global)" (registration locked globally).
The following admission rules apply:
- Admission locked.
After enrolment, participants will be selected manually.
Potential participants are given additional information before enrolling in the course.