End Of Life Decisions Now Being Advised By Chatbots [VIDEO]

We are entering an odd era in which we rely on computers and artificial intelligence in so many aspects of our lives that it seems natural to us. We ask our Alexa or Google Home devices for dinner recipes and to add groceries to our shopping lists without thinking twice about it. We look to Apple’s Siri to end arguments by looking up facts and figures for us. I regularly use the chatbot on my favorite airline’s website to compensate for their lousy copywriters when looking for information on their rewards program. Chatbots are involved in so many aspects of our lives that using them seems like second nature to many of us. So why should we be surprised when a researcher starts loading up tablets with a chatbot program that can help people make “difficult end of life decisions”? That is exactly what Professor Timothy Bickmore at Northeastern University in Boston, MA is doing. He has deployed his chatbot in hospitals with patients who are over 55 and in their last year of life. Out of the small sample in this initial test run, 44 people in total, many felt that after they spoke with the chatbot and were given spiritual and emotional advice, they were “more than ready to finalize their last will and testament.”

Now, I am no technophobe, but I have to say that leaving such delicate matters up to the equivalent of the Siri on my Apple Watch does not appeal to me in the slightest. What motivation does the bot have to advise me to try that last-ditch chemotherapy that might end up saving my life? What motivation does the bot have to tell me to fight on and not “go gentle into that good night”? None. In fact, according to author and lecturer Jay Tuck, if we are not careful about how we deploy artificial intelligence (AI), it may well do away with us. Think about that for a moment. AI is capable of writing its own code. It does not need to be programmed by humans. AI is capable of talking with other AI in languages that are not understandable by humans; take, for instance, the two AI projects that were recently “unplugged” by Facebook and Google for exactly that reason. Even Stephen Hawking warned about the potentially tragic outcomes of AI back in 2014:

“It would take off on its own and re-design itself at an ever-increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”

So now we are leaving the very hard, emotionally fraught minefield of end-of-life planning to a chatbot? Am I the only one who sees this ending poorly for humanity? Bickmore admitted that he crafted his chatbot to stick to a “rigid script” as opposed to making it open-ended like Siri or Alexa. He thought this only prudent, since chatbots have recently come under fire for venturing into areas deemed inappropriate by their humans. Take, for instance, the cautionary tale of Microsoft’s short-lived AI named Tay, as described in an article in the LA Times:

“Microsoft said its researchers created Tay as an experiment to learn more about computers and human conversation. On its website, the company said the program was targeted to an audience of 18 to 24-year-olds and was ‘designed to engage and entertain people where they connect with each other online through casual and playful conversation.’”

Trouble started when some users thought it might be fun to troll the AI, and then all hell broke loose:

“But some users found Tay’s responses odd, and others found it wasn’t hard to nudge Tay into making offensive comments, apparently prompted by repeated questions or statements that contained offensive words. Soon, Tay was making sympathetic references to Hitler — and creating a furor on social media.”

In other words, Tay was trolled to death. Is this where we want our end-of-life advice to go? Call me crazy, but in this area I think I will stick with talking to humans. Thanks so much to Professor Bickmore for giving it the old college try, though, even if his idea was a really bad one.
