AI ethicists are raising concerns about the potential dangers of digital recreations of deceased individuals, warning that urgent regulation is needed to prevent harm. Researchers from the University of Cambridge have suggested that users could soon upload conversations with their dead relatives to create chatbot versions of them.
These services could be marketed to parents with terminal illnesses or even to healthy individuals who want to preserve their entire lives digitally. However, there are fears that unscrupulous companies and thoughtless business practices could cause lasting psychological harm and disrespect the rights of the deceased.
One of the main concerns is that companies may monetize these digital legacy services through advertising, creating potentially distressing situations for the bereaved and raising privacy concerns. Furthermore, the researchers warn that allowing children to interact with these “deadbots” could do more harm than good by disrupting the normal mourning process.
To address these issues, the researchers suggest best practices such as procedures for sensitively “retiring” deadbots and restricting interactive features to adults. Similar technologies already exist, including chatbots built on GPT-3 and animated videos of ancestors offered by companies such as MyHeritage.
MyHeritage, for instance, has introduced features such as Deep Nostalgia and DeepStory, which let users create animated, talking videos of deceased loved ones. As these technologies become more capable and widespread, it is crucial that regulations and guidelines keep pace so that they are used ethically and responsibly.
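To make the underlying technique concrete, the following is a minimal sketch of how a conversation-based recreation like those described above could be assembled from uploaded messages. It assumes the OpenAI Python SDK, a hypothetical saved_messages.txt transcript file, and a placeholder model name; real “digital legacy” services would likely rely on fine-tuning or retrieval rather than a single prompt.

```python
# Minimal sketch of a conversation-based "deadbot", assuming the OpenAI
# Python SDK (openai>=1.0). The file name and model name are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical export: one saved message per line of chat history.
with open("saved_messages.txt", encoding="utf-8") as f:
    transcript = f.read()

# The transcript is used only as a style reference inside the system prompt.
system_prompt = (
    "You are simulating the conversational style of a specific person, "
    "based solely on the following messages they wrote:\n\n"
    f"{transcript}\n\n"
    "Reply in their voice, tone, and typical phrasing."
)

def reply_as_person(user_message: str) -> str:
    """Generate a response in the style captured by the transcript."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any chat-capable model would do
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

print(reply_as_person("I miss you. How was your day?"))
```

The simplicity of such a sketch underlines the researchers’ point: the technical barrier to building a deadbot is already low, while the safeguards they call for do not yet exist.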