Press 1 for Digital Necromancy

In a moment of boredom as I waited to be called up from the jury duty pool, I was doomscrolling through social media. I came across this gem in my feed from some British news source. At first, I could not decide if this was one of those weird clickbait social media articles written by AI. I have my days where I think three-quarters or more of the stuff (particularly videos) on social media now is completely AI-generated garbage. I poked around the website itself, and it seemed to be a quasi-legit online opinion-editorial website. I am still skeptical, but it did have some thoughtful editorials if you dug around. 

Regardless, the article itself is disturbing. It is not disturbing in a maudlin or intentionally dystopian sense; it is unintentionally dystopian. The author was simply reporting on a particular use of an AI phenomenon that is on the rise: the chatbot. More specifically, what caught my attention enough to read the entire article was what it terms a "deathbot."

Granted, that term was kind of clickbaity. Say what you will about the art of clickbait, but it is effective. You throw out an outrageous or sensationalist keyword, title, or thumbnail image, and it piques people's interest. I am usually fairly immune to clickbait posts, as I don't respond well to sensationalism, but occasionally even I get sucked in. The term "deathbot" certainly made me arch an eyebrow, Mr. Spock style. 

The technology for automated interactions has been around for quite some time. You used to reach the operator on the phone by dialing zero, and though the operator was an actual human working a switchboard, the person, usually a woman, was following a script. For those old enough to remember those days, the conversation went something like this:

"Number, please..."
"Yes, I am trying to reach Margo Adams. The number is Brentwood-4356."
"One moment, sir. I will try to connect your call..."
"Thank you."

I know I am aging myself with this post, but I still remember having a party line when I was a little kid in the early 1980s, before major telecoms brought all the new technology to our small town in rural Appalachia later that decade. If you have no idea what a party line was...basically, several houses on the block shared a single phone line. I remember picking up the phone on Sunday afternoons and hearing Mrs. Campbell across the street talking to her sister in Elizabethton about biscuits. Anyone on the party line could listen in on other people's conversations. It was basically one big extension phone service. But if you happened to pick up the phone and no one was using it, you could dial your number or call the operator. It was a script used by an actual human, but essentially an early analog form of a chatbot. 

Once big telecom came in a decade later, everything eventually went automated. I don't know exactly when pressing "0" for an operator went defunct, but it no longer exists and has not for a long time. Gradually we moved to answering machines and 1-800 call trees where we have to "Press 1 for English" and "To end this call, please press 9" (as if simply hanging up would not end the call). Now, you pretty much have to press 34 different buttons in the vain hope of "speaking with an actual representative," and when you finally do, you find you are speaking to Amir in Armpitistan who, while very polite, knows about 20 English words, knows nothing about how to access your account, and has no authority to do anything. 

This is the point in the blog where "AI enters the group chat." I admit I have a general disdain for AI. It seems that in the last few years, AI has infested every aspect of American life. You can't swing a dead cat(bot) without hitting something with AI in it. There are AI assistants on virtually all major search engines and AI summaries whenever you search for something specific. Teachers are having all kinds of problems with students using AI to write their term papers, and yet teachers are using AI to grade papers or to make lesson plans. We have truly opened a Pandora's Box (or perhaps a pAIndora's Box) that is making people stupid.   

Of course, AI has evolved (if you can really say a computer program 'evolves,' since a program is never more than the sum of its programming) to the point where it can create automated chatbots. Basically, a chatbot is a computer-automated but interactive simulation of a person, using generative artificial intelligence programming that can carry on an ongoing conversation with an actual human. I have experimented with conversations with chatbots (I'm not a complete troglodyte). The early days were pretty limited in terms of what you could talk about. After a short time, the chatbot would basically start repeating itself or saying nonsensical things. 
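The gap between those early, repetitive bots and today's generative ones is easy to see in miniature. Here is a toy sketch (entirely my own illustration, not any real product's code) of the old rule-based approach: match a keyword against a fixed script, and fall back to a canned reply the moment the human wanders off-script, which is exactly why early chatbots seemed to repeat themselves.

```python
# A toy rule-based chatbot in the spirit of the early scripted bots.
# Keywords map to fixed responses; anything off-script gets the same
# canned fallback, over and over. (Purely illustrative names.)

SCRIPT = {
    "hello": "Number, please...",
    "connect": "One moment, sir. I will try to connect your call...",
    "thank": "You're welcome. Goodbye.",
}

FALLBACK = "I'm sorry, I didn't understand that."

def reply(message: str) -> str:
    """Return the first scripted response whose keyword appears in
    the message; otherwise, the canned fallback."""
    text = message.lower()
    for keyword, response in SCRIPT.items():
        if keyword in text:
            return response
    return FALLBACK

if __name__ == "__main__":
    print(reply("Hello, operator?"))      # on-script
    print(reply("Please connect me."))    # on-script
    print(reply("What about biscuits?"))  # off-script -> fallback
```

A modern generative chatbot replaces that fixed lookup table with a statistical model trained on enormous amounts of text, which is why it can stay in a conversation far longer before breaking down.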

Much like computer games, AI programming has evolved exponentially in a short time. For those of us who remember 8-bit Nintendo games like Super Mario Bros., today's massive gaming platforms can reproduce real life in real time in ways that were unimaginable even 30 years ago. Throwing fireballs out Mario's nose at two-dimensional dragons has been replaced by sprawling open-world quests in real time. In the same way, AI chatbots have gone from simple "Number, please" scripts to appearing to emulate human emotions, even recalling anything a human has ever said to them. You even have AI assistants on commerce websites that seem like real tech-support chats, but it's all a simulation. 

The article I opened with talked about using AI as a "deathbot": artificially recreating a deceased person as an interactive virtual simulation so a grieving family member can, in theory, make peace with their real-life deceased relative and say goodbye. At least, that's the take of the original article. It seems to be presented as a moral good, or at least a moral neutral. Something to help a grieving person seems a laudable goal on its face.      

But if you have any sort of ethical, theological, or moral training, you can immediately start figuring out how absolutely dystopian and ghastly such a program would be. Let's call this for what it is: a digital form of necromancy. Necromancy is the ancient practice of magic involving communication with the dead, typically by trying to summon their spirits as a ghostly apparition or vision. The purpose of necromancy was divination: getting the spirit to impart hidden knowledge, either of future events or of other things that cannot be learned by natural means. This is precisely what a deathbot is created to do: to bring back a virtual apparition so as to get knowledge you cannot access in life. 

Theological considerations aside, in terms of psychological practice, this is also morally and ethically wrong. Creating something that tries to do an end run around death, and pretending it is real, can do no good long term. In fact, it has a high probability of doing harm. One never knows what nonsense an AI bot is going to spew out. Imagine a virtual deathbot of your granny telling you something false about your childhood, or offering some unhelpful pop-culture platitude the AI programming comes up with, like granny died because "God needed another angel" or other such well-meaning but false claptrap. In your emotional state, you take it for true. It could even create new problems in a relationship that ended on bad terms. Exposing a client who comes to you for help to something unknown, fake, and potentially detrimental is bad practice.

And there is also the issue of dependence. People in vulnerable emotional states can easily become dependent. Sometimes this is on drugs or some other physical escape. Sometimes it becomes an unhealthy dependence on others, either a spouse or a therapist. A deathbot might well become an escape from the reality of death: the pretense that death is not real. We can just summon Deathbot Granny any time we feel sad or lonely because she (the real Granny) is gone. We can pretend she is not really dead. Once more unto the breach...what's the harm? 

Death is real, no matter how much we wish it otherwise. We can pretend it has not happened. We can mask the fact that one day Death will come for each of us. 

Seeking solace in a world of AI make believe is not the answer.  
