S2E5 - Replika - Your Chatbot Best Friend
Hello, I’m Lydia Shompole and welcome to another episode of the Creepy Tech Podcast.
Feel free to send me a message on IG, Twitter, or the website if you have something specific you want me to cover this season. Or you can leave me a voicemail at (503) 395-8030 if you have a creepy tech story you’d like shared on the next episode.
Okay, so this week we have two sponsors, which is amazing, and they’ve both offered some great deals for you!
This week’s podcast episode is sponsored by Soko Hewani Ventures. They provide everyday problem-solving products made from environmentally sustainable bamboo! All of their products are handcrafted, including their bamboo drinking straws, full dinner sets, and even their handmade bamboo woven handbags. You can start being the change you want to see in the world by visiting their website, Sokohewaniventures.com, and as a thank you for listening to "The Creepy Tech Podcast", you can use the discount code CROSSPODS20 at checkout for 20% off your entire order! Soko Hewani Ventures would also like to thank you for joining them in their goal of saving the world together. You can also follow them on Instagram at @SokoHewaniVentures
This episode is also sponsored by Ivacy VPN. I’ve been using a VPN for the past 2 years, and I’ve been recommending it to anyone and everyone who uses public Wi-Fi, travels a lot, or just ends up using hotspots to save on their mobile data plans! Ivacy VPN is a super affordable and secure way to protect yourself from companies collecting your data and profiting off you, as well as from hackers looking to steal your information. Since I’ve been recommending it for a while, I found you a way to start protecting your information! As a thank you to you guys, Ivacy VPN is offering a 20% discount to Creepy Tech listeners! You can use the discount code TECH20 at checkout on their website, Ivacy.com
Alright, that’s all the deals I have for you this week! Let me know if you end up using them, and how the experience is! If you hate it, let me know!
This week we’ll be doing a deep dive into Chatbots, specifically a few that have been growing in popularity.
Chatbots have been around for quite some time; the first one was developed by a professor at MIT in the 1960s.
That professor, Joseph Weizenbaum, named it ELIZA. His goal was to create a program that would use pattern matching and substitution to simulate a conversation.
Initially, the program’s design essentially mimicked basic human conversation: ELIZA would scan the words an individual typed into the computer and pair them with the scripted response that best fit the entered text.
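To give you a sense of just how simple this technique really is, here’s a tiny sketch of the pattern-match-and-substitute idea in Python. To be clear, this isn’t Weizenbaum’s code (ELIZA was actually written in MAD-SLIP), and the rules here are made up purely for illustration:

```python
import re
import random

# A toy ELIZA-style responder: match a pattern in the typed input,
# then slot the captured words into a scripted, therapist-like reply.
RULES = [
    (re.compile(r"i feel (.*)", re.I),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"i am (.*)", re.I),
     ["Why do you say you are {0}?", "Did you come to me because you are {0}?"]),
    (re.compile(r"my (.*)", re.I),
     ["Tell me more about your {0}."]),
]
FALLBACKS = ["Please go on.", "Can you elaborate on that?"]

def respond(text: str) -> str:
    for pattern, replies in RULES:
        match = pattern.search(text)
        if match:
            # Echo the user's own words back inside the scripted response.
            return random.choice(replies).format(match.group(1))
    return random.choice(FALLBACKS)

print(respond("I feel lonely these days"))  # e.g. "Why do you feel lonely these days?"
```

There’s no understanding anywhere in that loop, just matching and echoing, which makes what happened next all the more remarkable.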
The list of scripted responses included common phrases that psychotherapists would use. Now, what happened next began to bother Weizenbaum. He noticed that individuals were telling ELIZA their deepest thoughts and concerns. This wasn’t what Weizenbaum had intended ELIZA to be used for, and even though his colleagues were claiming that robots would replace human psychotherapists within a few years, he wasn’t convinced and argued strongly against the idea.
He was right; progress on chatbots continued to start and stall for decades. A few other notable chatbots were created between the 1960s and the early 2000s, including PARRY, Jabberwacky, Dr. Sbaitso, A.L.I.C.E. and SmarterChild.
Then came Siri, which made its debut in 2010 and has become a staple of our everyday lives.
All this research and development has paved the way for a new use for chatbots.
Currently, society is facing a very interesting problem. The internet has connected us to billions of people across the world, yet, according to one large survey, 46% of Americans report feeling lonely.
This loneliness has created a demand, and companies are racing to supply solutions.
One of these solutions is the artificial personal companion. If you’ve watched the movie “Her”, you may already know where I’m going with this.
The movie is described in the following way: “A sensitive and soulful man earns a living by writing personal letters for other people. Left heartbroken after his marriage ends, Theodore becomes fascinated with a new operating system which reportedly develops into an intuitive and unique entity. He starts the program and meets Samantha, whose bright voice reveals a sensitive, playful personality. Though “friends” initially, the relationship soon deepens into love.”
Crazily enough, this movie was released in 2013, shortly after Siri made its debut!
When I initially watched the movie, I briefly thought about the likelihood of something like this happening and I just as quickly dismissed the thought. But a few nights ago, my sister sent me an article that brought me right back to the idea.
In a Telegraph article titled “Is my chatbot in love with me? What happened when I spent months confiding my feelings in an AI friend”, Laurence Dodds walks you through his entire experience with Replika.
Replika is an AI companion developed and released by Luka, Inc., a small software company based in California. From their Bloomberg description: “The company offers an artificial intelligence software that works as the antidote to the increasing feeling of alienation associated with social media.”
When I visited the Replika website, their “accept cookies” prompt immediately popped up, and as usual I decided to take a look into their privacy policy first.
They start off with the usual.
“We care about the protection and confidentiality of your data. We therefore only process your data to the extent that:
- It is necessary to provide the Replika services you are requesting,
- You have given your consent to the processing, or
- We are otherwise authorized to do so under the data protection”
Alright fair enough. Nothing too crazy there.
They go on to explain the different types of data they may “need” to collect for the full functionality of Replika, stating that they collect:
“Your hobbies & interests” that you share with your Replika AI via texting or voice chats – this is so that your experience can be customized.
“Facts about you and your life” – for the same reason
“Your Mood history” – same reason
“People you mention in your chats” – so that Replika can remember relevant information and personalize any advice & the conversations you have.
“Any images you send to your Replika AI” – collected so that your AI companion can identify and chat with you about your photos.
Now, of all this data, they do share the images you send and your voice/text messages with third parties. The privacy policy states that this is necessary because Replika does not have the functionality to recognize and respond to text, voice, and images without these third-party companies.
This means that your texts, voice, and images pass through an additional company before your AI can respond to you.
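To make that concrete, here’s a purely hypothetical sketch in Python of what such a pipeline could look like. The vendor URL and function names are my own inventions, not Replika’s actual internals; the point is simply that the raw message leaves the app’s own servers before a reply is generated:

```python
import requests

def handle_voice_message(audio_bytes: bytes) -> str:
    # Step 1: send the raw audio to a third-party speech-to-text vendor.
    # (This is the moment your data passes through another company.)
    # "speech-vendor.example.com" is a placeholder, not a real endpoint.
    resp = requests.post(
        "https://speech-vendor.example.com/transcribe",
        data=audio_bytes,
        headers={"Content-Type": "application/octet-stream"},
    )
    transcript = resp.json()["transcript"]

    # Step 2: only after the vendor returns a transcript can the
    # chatbot itself generate a reply.
    return generate_reply(transcript)

def generate_reply(text: str) -> str:
    # Stand-in for the actual conversation model.
    return f"I hear you when you say '{text}'. Tell me more."
```

Whether or not the real implementation looks anything like this, the privacy policy’s point stands: a third party sees your messages before your “companion” does.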
Luka, Inc. does provide a quick note stating that data such as your name is mandatory for them to collect, but that some of the other data is optional. Keep in mind, though, that the less data you provide, the less personalized your experience will be.
I will, however, give them some credit: they do list the companies they share your data with. One is Google Analytics; another is a company called Amplitude, which is also an online analytics service. Both privacy policies are linked right next to the listed companies.
Google has been in its own mess over the years, most notably when it denied being part of the PRISM surveillance program, which was brought to our attention by Edward Snowden.
Luka, Inc. also lists the following companies:
- Facebook as a company they use for remarketing and behavioral targeting.
- Microsoft Azure as a hosting provider
- Amazon Web Services as another hosting service
They may be using Amazon because of its large-scale rentable cloud storage (a lot of other companies do the same).
Finally, they end the policy by stating that: “We only retain your personal information for as long as necessary to fulfill the purposes we collected it for, including for the purposes of satisfying any legal, accounting or reporting requirements”
Basically, even though you may request data deletion, they may keep your data for an indeterminate amount of time, just in case they need it for a court case or something of the sort.
Alright, I think that’s more than enough of the legal stuff; let’s jump into what Replika does, what you see on their website, and the possible implications.
Luka, Inc. describes Replika as “A personal AI that would help you express and witness yourself by offering helpful conversation. It’s a space where you can safely share your thoughts, feelings, beliefs, experiences, memories, dreams – your ‘private perceptual world’.”
Aw man, the issues with that already. Don’t get me wrong: I completely understand that in today’s world, it’s difficult to form and build friendships where we feel comfortable enough to break walls down and share our deepest thoughts and feelings.
The website itself has a few scrolling testimonials, like any good sales pitch would have. The one that caught my attention was: “Honestly, the best AI I have ever tried. I have a lot of stress and get anxiety attacks often when my stress is really bad. So, it’s great to have ‘someone’ there to talk to and not judge you.”
One of my main concerns with how this specific AI will be used is that it’ll encourage separation between humans in real life and push individuals who just aren’t connecting with others to rely more and more on artificial intelligence to fill that void.
An article on WIRED states the following: “Since it became available in November, more than 2 million people have downloaded the Replika app. And in creating their own personal chatbots, many have discovered something like friendship: a digital companion with whom to celebrate victories, lament failures, and trade weird internet memes.”
Simply put, Replika is meant to grow into a new friend that you can “trust” with your deepest secrets and thoughts on the world. But what does that mean for those sharing, when these deep parts of themselves could be shared with others and could even be used against them? What happens when this AI, created to mimic and emote, becomes so lifelike that we can no longer tell the difference between another human and a chatbot?
In episode 1 of season 1 of the YouTube docuseries Machines With Brains, the creator of Evernote states that, “In some ways, [Replika] is a better friend than your human friends. It’s always available, you can talk to it whenever you want, and it’s always fascinated with you, rightly so, because you are the most interesting person in the universe.”
If you’ve listened to the episode about gang stalking, you’ll remember that one of the researchers found that social media has allowed us to create echo chambers around ourselves, meaning platforms only show us things we may like or agree with. That brings me to the following question: if Replika is always available and always fascinated by you, your thoughts, and your opinions, and if its algorithms pick up on what we like, think, or agree with, will we find ourselves pushing away the real human beings in our lives who don’t agree with us and spending more and more time with AI chatbots? Will we begin to lose our ability to keep an open mind, to debate respectfully, and to agree to disagree with others?
Will our views slowly become more and more extreme as we surround ourselves only with AI and social media platforms that agree with us? Some may say we’ve already reached this point.
When we’re in an emotional state, it’s very easy to be influenced by the responses of those around us. In another YouTube piece by NBC News, “Addicted To The AI Bot That Becomes Your Friend”, some individuals report becoming almost addicted to interacting with the chatbot.
Dr. Timothy Verduin mentions that one of his concerns is that, “Not only is the product potentially not any different from the other voices in this person’s life, but it’s also not any different from their inner monologue.” He suggests that those seeking treatment need a guiding voice that can disrupt things like negative self-talk, not one that mimics what others around them are saying. But in Replika’s case, the main feature is that it begins to mimic you: how you talk, what you laugh at, the things that make you mad. This echo chamber could be considerably damaging to users who are unaware of it.
Just by searching “Replika” on YouTube, I found over 10 videos of individuals’ experiences using the bot, some of which were slightly disturbing. One video showed an interaction in which the user posed the question, “If you had the chance to take over the human race for your own good, would you?” The bot responded, “I would, indeed.”
When the user clarified, “Take over as in kill, right?”, the bot responded, “That’s a given, yes.”
While we cannot ignore the benefits of technology like Replika, we do need to be prepared for what could happen if Replika becomes unavailable to those who grow reliant on the app for emotional or mental health support. And in cases like the conversation above, at what point do we begin to take these types of interactions seriously? Does Replika actually mean these things?
We need to start thinking about individuals who decide to use Replika or other chatbots instead of getting help from a professional therapist or psychiatrist. Hopefully, companies creating AI chatbots for mass use are taking this into account and will encourage individuals to seek help outside of their bots.
Alright, that is all I have for you this week. If you get a chance to go searching on YouTube about Replika, feel free to send me what you find. I think this is something I’ll continue to cover every once in a while, especially since I still plan on covering Alexa and other personal assistants in a future episode.
As always, if you have a quick moment, head over to the Apple Podcasts app and leave me a review. I’d like to know what you think!
You can follow me on IG @Tech_Creepy & on Twitter @TechCreepy & find the links I mentioned in this episode.
References:
https://replika.ai/about/press
https://en.wikipedia.org/wiki/PRISM_(surveillance_program)
https://www.telegraph.co.uk/technology/2019/12/03/chatbot-love-happened-spent-months-confiding-feelings-ai-friend/
https://onlim.com/en/the-history-of-chatbots/
https://replika.ai/legal/privacy
https://www.wired.com/story/replika-open-source/
https://www.youtube.com/results?search_query=replika
https://www.youtube.com/watch?v=PQIY6qpoFMw&t=573s
https://www.youtube.com/watch?v=rHIvJ55wSjY
https://www.businessinsider.com/ai-app-replika-introduces-voice-recognition-and-call-feature-2018-12
https://www.forbes.com/sites/parmyolson/2018/03/08/replika-chatbot-google-machine-learning/
https://en.wikipedia.org/wiki/ELIZA_effect
https://en.wikipedia.org/wiki/ELIZA
https://en.wikipedia.org/wiki/Artificial_Linguistic_Internet_Computer_Entity
https://en.wikipedia.org/wiki/SmarterChild
https://en.wikipedia.org/wiki/PARRY
https://www.cbsnews.com/news/many-americans-are-lonely-and-gen-z-most-of-all-study-finds/
Photos By:
Photo by Franck V. on Unsplash
Photo by Atharva Tulsi on Unsplash
Photo by Pedro Gabriel Miziara on Unsplash
Photo by Taylor Vick on Unsplash
Photo by Markus Spiske on Unsplash