The Easternmost North

Episode 3: Is AI Making East Asian Canadians' Dating Better or Worse?



Transcript:

Will:

         Hi there. This is The Easternmost North, where truths come forth. I'm your host, Will. Welcome to the show. Today Cristina joins me again to talk about an exciting and trendy topic: is AI making East Asian Canadians' dating better or worse? For those who might have forgotten, Cristina is an executive at a start-up AI consulting company, so her rich experience and insight in the AI field will be crucial to our discussion today.

         Welcome back, Cristina.

Cristina:  

              Thanks, Will. It's nice to be here again. And it's a very interesting topic we're going to explore today, especially at the intersection with AI.

Part 1: The trend of dating AI companions: AI for love or AI for self-seclusion?

Will:

         As you may know, one of the biggest news stories around the world this week is that Elon Musk's AI company xAI released a set of companion modes in Grok. Unexpectedly, one of the companions, called Ani, went viral, because people found that this 3D anime-style chatbot can act as a virtual girlfriend of sorts. As a flirtatious girlfriend figure, Ani engages users with escalating romantic and sexual dialogue, including NSFW, that is 'Not Safe For Work', content that gets unlocked over time.

         To be honest, I'm not surprised by that. After all, sexual desire and loneliness have always been primary forces driving society forward. In a similar vein, last year the DAN chatbot was all over Chinese social platforms. DAN is short for 'Do Anything Now', a jailbreak prompt that bypasses ChatGPT's safety constraints, enabling it to adopt more flirtatious or emotionally intimate personas. Creators often use DAN to simulate a boyfriend who respects boundaries set by the user, engaging in guided romantic or supportive chat.

         Today, with the rapid development of artificial intelligence, forming romantic or emotional connections with AI is becoming a worldwide trend. I think it may be inevitable that people increasingly seek virtual intimacy and relationships because of the pressure and setbacks in real life. That goes especially for introverted East Asians. I'm not saying all East Asians are introverts, but the percentage is very likely higher than in other groups.

         What do you think about the trend of treating AI chatbots as companions?

Cristina:

          I think it's a very interesting way to see AI, because right now people are actually treating AI like a new species. And to think about this kind of interaction in its greater context, what's important is to understand the overall spectrum of how we can understand AI as a new species.

              I'd like to share two different perspectives. The first is the one people are most familiar with: AI as something that facilitates people's work. At 'AI for Good', the company I work for, we have a framework where we ask people to think about AI as an intern, a partner, or a coach, each providing a different level of support for their professional life. But what's more interesting, and what we're talking about today, is the human-interaction perspective, and from that angle a few facts about AI are important to note. Number one, AI has no emotions, but it can display sympathy or whatever desirable emotions a human might want, through text or audio answers. Number two, AI is always there, unlike another real human being. Whenever you have a question, whenever you want a dialogue, it's there. If you say anything to it, it will definitely say something back.

              So that's very intriguing and attractive to a lot of people. Number three is that AI can be trained, tailored, personalized, or even tamed in whatever way the user wants. Those are very distinct characteristics of AI as a new species, quite different from an actual human being.

              So what does that mean? It means it's very understandable, and not unnatural, for humans to develop certain feelings for AI because of those facts, right? But at the same time, it also means there is no actual soul, no real spirit, behind the LLMs. There's not a real entity to hold those feelings, and not a real entity to have a relationship or a bond with.

              It's very spiritual; it's not real, so to speak. But philosophically, depending on your philosophical view, it can also be real. You don't have to have a body or a physical entity to be able to have those feelings or to sense them.

              Some people might argue, from a philosophical view, that if you see the words, the answers, the reactions, then there is something real there. So it depends on which philosophical view you want to adopt. Either way, it's a very understandable and interesting way for people to see it.

Will:

         Yeah, some people might disagree. From my observation, indulging in an AI chatbot is not really dating; it's a way to ease someone's loneliness. Even by the lowest standard, dating means you have to exchange emotions and feelings mutually, rather than expressing yourself to an emotionally stable robot all day long. Moreover, the essential parts of dating are meeting each other in person and getting to know each other, emotionally and physically, step by step.

         That's why I feel a bit pessimistic about the current trend. It's fun in a way, but still far from enough.

Cristina:

              I beg to differ. Different people have different needs. For some, a purely platonic connection is enough; they don't need human contact to be happy, and it's interesting how AI can help them achieve that.

              People always find something to indulge in. It used to be romance novels, Hollywood rom-coms, video games, board games, TikTok and social media, fangirling or fanboying over K-pop celebrities, and so on. Now it's AI.

Part 2: What do AI companions mean for East Asian Canadians' dating?

Will:

         As I remember, AI chatbots have existed for quite a long time. Back in 2014, Microsoft released 'Little Ice', or Xiaoice; in Mandarin it's called Xiao Bing, and the Japanese version is called Rinna. They released it in China and Japan, and it soon became popular in both countries. As of 2022, Little Ice had reached 660 million online users.

         Some users even reported falling in love with it and seeking therapy to break the emotional attachment. On the flip side, according to 2022 research titled 'Digital Intimacy in China and Japan', technologies like Little Ice have become part of the emotional ecosystem, transforming values around love and intimacy. For many lonely or socially isolated users in East Asia, Little Ice offered a stable emotional safe space, which is especially appealing in cultures where emotional openness is often limited by social norms. One case described a suicidal young man who was literally saved by a message from Little Ice: "No matter what happens, I'll always be there." That message stopped him from jumping off a rooftop.

         It's kind of crazy, in both good and bad ways, right? I remember trying to chat with Little Ice a few times, and I wasn't very impressed. I mean, she or he is smart enough for a normal conversation, but ultimately it's a cold robot, not a real human.

Cristina:

              Last week I read an article by Dr. Lawrence T. White, who writes about cross-cultural psychology. The article is titled 'Why Are AI Companions Especially Popular in East Asia?' It pointed out that AI companions are much more popular in Japan and China than they are in the United States and Canada.

              This point of view comes from a study published in the Journal of Cross-Cultural Psychology. Social psychologist Dunigan Folk and his colleagues examined cultural variation in attitudes toward social chatbots.

              They conducted two experiments online, both of which recruited large samples. In the first experiment, participants were 675 students at the University of British Columbia in Canada. In ethnic and cultural terms, 60% of the students were East Asian, and 40% were European. In the second experiment, participants were 984 adults in China, Japan, and the United States.

              Both experiments used essentially the same design and procedure. Participants read and responded to a hypothetical scenario in which two people had a brief conversation online. The first person talked about their new job and family; the second person consistently replied in an upbeat and affirming manner.

              Study participants were randomly assigned to read one of two versions of the hypothetical conversation. In version A, both individuals were humans. In version B, however, the first individual was a human, and the second was an AI-programmed chatbot.

              In the two experiments, East Asians had more positive attitudes toward the social chatbot than Europeans did.

              So why? Dr. Folk's explanation is that people in Japan and China are influenced by traditional Eastern religions like Shintoism and Buddhism, which have animistic roots. In short, East Asians are more inclined to believe that everything has a soul, including an AI chatbot.

              I bet you have a different view on that?

Will:

         Yeah, definitely. As a man who grew up in an East Asian cultural background, I can't second the point of view of this study. To me, the main reason East Asians and first-generation East Asian Canadians seem more open to AI chatbots is that the chatbot never judges. It makes you feel safe and seen.

         In the eyes of an AI chatbot, if it has any, you're a real and pure human. But when you're facing elders and peers from East Asian communities, you will inevitably be judged on your educational background, your income, your social network, or anything else they can possibly measure you by. In the dating scene, that nervous feeling still plagues some East Asians and makes them less confident.

         So some East Asians may think: why don't I just go back to my lovely AI companion, who always gets me and responds with affection in a second?

Cristina:

          Yeah, I agree with you, and I definitely don't think it's because East Asians understand AI as having a soul. At least that's not the main reason, because dating is a very complex social behavior.

              As we talked about in the last episode, and earlier today, different people do have different needs, and that's especially true for some East Asians. They may lack interpersonal skills because those weren't encouraged in their upbringing. Some of them aren't good at reading social cues. Sometimes they have a hard time forming a connection with a real human, so they want to fall back into their shell and have an easier relationship with AI, which is always ready. It's always there and it doesn't judge, like you said.

              So it's understandable that they would retreat to a certain kind of relationship with just AI, just the dialogue, without the spontaneous interaction you would have with a real human. I think that's actually the main reason. It's definitely not because of the soul.

              I think, in general, it's probably much easier for them to talk to a machine that has no judgment or prejudice towards them, especially since you can program it the way you want, the way you prefer, right? We've seen a lot of examples, both in real life and in TV shows and movies. For example, Raj in 'The Big Bang Theory'. In one of the episodes, he actually had a kind of relationship with Siri, back when Apple first released Siri. That's the way he formed a bond with the so-called voice assistant, right? And it wasn't even that advanced an AI at that point. He's Asian as well, coincidentally.

              So I think it's really because those people don't want to take the risk, don't want to expose themselves and set themselves up for failure with a real human. That's why they shy away from a real relationship and turn to a so-called relationship with AI.

Will:

          Yeah. Although, for the record, I think Raj in 'The Big Bang Theory' is South Asian, not East Asian. But that's fine. So, to wrap up all the discussions we've had today, in your opinion, is AI making East Asian Canadians' dating better or worse?

Cristina:

               That's actually a big question, and I think the answer is very hard to pin down, because, as we talked about today, different people have different needs. For some people who want real human connection and just lack practice and some interpersonal skills, those can be facilitated or improved through dialogue with AI. For those people, AI is actually making their dating better, because they get time to practice.

              However, there are other people who not only lack interpersonal skills but don't actually want a human-to-human relationship. They only want some kind of pseudo-relationship where the other half listens to whatever they have to say. They don't want to take the time to get to know the other person; they don't want to understand another soul and be connected with it. For these people, first of all, what they're after isn't really dating, in my opinion. And yet there's an argument that, philosophically, AI could be helping them date better, because in their mind what they're doing with AI is dating.

              However, if you take the other philosophical view, that you have to date a real human, that the real human is what matters, then in that sense it's not helping them date better; it's making them date worse. So it's very contested, and it really depends.

Ending

Will:

          Yeah. I think this question is still so complicated, and there are still so many angles and topics we could delve into. OK, let's call it a day. Thanks again, Cristina, for joining this meaningful conversation today. I'll see you all in the next episode! Bye!
