Of All The Things We *Could* Be Using AI For, This Might Be The Most Stupid

 
 

By Stacy Lee Kong

A still from 2007’s ‘Lars and the Real Girl,’ which does not feature an AI girlfriend. Image: MGM

 
 

Content warning: This newsletter contains references to sexual harassment, child sex abuse material and verbal abuse.

This week in horrific things to cross my X/Twitter timeline: Orifice.AI (ew), an AI-enabled sex toy that purportedly allows men to create a virtual girlfriend, and then ‘have sex’ with her. And I say purportedly very deliberately, because I’m not actually convinced it’s a real product. Or maybe I just hope it’s not, tbh. Either way, I’m still writing about it, because as a concept, it pulls together several threads that I’ve been thinking about recently: the way we try to use technology to solve problems that technology itself played a role in creating, how our online behaviour can filter into our in-person interactions, and how often people hide darker impulses under the guise of pursuing connection.

The user who posted about this thing goes by “prince_of_fakes” and their explanation for why they’re doing it is a word salad of unhinged incel ramblings and grandiose claims with very little actual meaning: “The relationship between women and men is broken. The data shows that. Everyone knows it. The top 20% in the gene pool don't care. Their not caring is genetic warfare. All social action taken against Orifice, all protest, be it in the form of tweet or article or legislation will be gene warfare. Everything comes down to reproduction, and we're going to take non-reproductive sex with human women off the table as a commodity. It's gone. Can't trade money for it. Can't trade a meal for it. You could spend hours contemplating the consequences of that. Good, bad. It's going to end crime. It's going to wipe out job functions. But why? Because sex is about reproduction and everything breaks when you break that. Our roles don't make sense. Our institutions don't make sense. Nothing makes sense. When men stop seeking non-reproductive sex from human women, it's going to lead to an increase in reproductive sex. And it's going to save the world.”

Disturbing hints at eugenics (“top 20% in the gene pool”? pardon?) and gross generalizations (everything comes down to reproduction) and broken logic (nothing makes sense when you disconnect sex and reproduction, which is going to lead to an increase in sex for reproduction…?) aside, the premise of this product is bizarre, cis- and heteronormative, and doesn’t seem to have involved any conversation with actual human women, who also enjoy and seek out sex that’s not for reproductive purposes, btw. Also, while the marketing implies a sexbot, the reality is somewhat different. The product itself looks like a middle-school projector, weighs 3kg and offers “penetration depth detection” and “generative moaning,” which I’m sure are technological feats, but do not exactly sound sexy. So, if it ever makes it to market, it might become a fancier Fleshlight, but will probably also be cost-prohibitive and as culturally important as similarly unconventional sex aids, such as sex robots. Which is to say, not very. So, I don’t think we need to spend a lot of time talking about the product itself. Also, does anyone actually care what sex toys people do and do not use?

However, the ideas behind it, and the way it's being marketed? Distressing, and therefore worth considering.

Sexbots vs. sex dolls vs. whatever the hell this is

Some backstory: Human-like sexual aids are not a new concept, and while they’re pretty common in pop culture, they’re not as pervasive IRL. According to a Guardian article from last year, “humans (mostly men) have fantasised about sex robot-like beings since before Ovid wrote the tale of the sculptor Pygmalion bringing his creation, Galatea, to life. In more recent times, it is reflected in television series such as Westworld and films including Steven Spielberg’s A.I., Alex Garland’s Ex Machina and Ridley Scott’s Blade Runner. And who could forget the fembots in Austin Powers: International Man of Mystery, with their fully armed bazookas? Then evolving robotic and artificial intelligence technology supercharged sexbot speculation. In 2014, Pew research predicted robotic sex partners would become commonplace. In 2015, speculative fiction doyenne Margaret Atwood published The Heart Goes Last, with a protagonist who built ‘prostibots.’ Her writing was inspired by reality, she said.”

But, the piece goes on to argue, despite the provocative headlines and subsequent moral panic, the promise of sex robots has not materialized, even as academics and scientists pay more attention to this application for AI technology. It cites one 2022 report that puts the value of the sexbot industry at about US$200 million; it also says the average price of a sexbot is $3,567, which works out to roughly 56,000 sold each year. For comparison, in 2016, the global sex toy market was valued at US$15 billion, and thanks in part to a pandemic-inspired spike in purchases, is projected to reach $80.7 billion by 2030. There are, I think, obvious explanations for why the sexbot sector is so, so niche—social stigma, logistics (where does one store a life-sized sex robot?), cost and, perhaps most importantly, design. Even the most ‘advanced’ sexbots currently available give serious uncanny valley, so, if the product offering is a life-like replacement for a human partner, it doesn’t quite deliver. Which may be why the sexbot and her less-expensive counterpart, the sex doll, are often perceived as being for lonely, weird or awkward men, for whom a facsimile of a real person is the goal, and a not-quite-right replica is the best they can expect.

What Orifice.AI offers feels… weirder than that, though. While the website (which does not look anywhere near as polished as the video that got published on Twitter, just as an aside) focuses on the technology behind this product, the marketing is about giving power ‘back’ to men. The video focuses on a rotating cast of AI-generated figures with various aesthetics—some have dark hair, some light, some have tattoos, some don’t, some seem tall, some seem short. They all appear to be unsure teenage girls/early twentysomethings sitting in a childhood bedroom, though. After flipping through several avatars, literally as if he’s changing the channel, the faceless user decides on his favourite—a blue-haired manic pixie dream girl—and unscrews the cap of the sex toy, before the tagline appears on-screen: “Now you get to swipe left.”

In a word: gross. The fact that all of the ‘options’ are designed to appear very young, not to mention uncomfortable, innocent and shy, suggests a dynamic of power and control, and the tagline makes it clear that the person behind this product sees it as an avenue for men who feel undesirable in modern dating culture. (As if people of all genders do not worry about their desirability!) Even the name, Orifice, feels deliberately vulgar and dehumanizing. All of this gives the whole thing an undertone of payback—and that’s quite troubling, considering there’s already an abuse problem in AI.

Of all the ways AI can be misused (and there are many), one of the most common is to abuse women, real or imagined

Well, there are actually several ways this plays out. According to a 2019 study, 96% of deepfakes were non-consensual pornography. Put another way, almost all of the deepfakes that exist are literal sexual harassment, largely of women. Also, a 2023 Stanford Internet Observatory investigation found hundreds of examples of child sexual abuse material (CSAM) in an open dataset used to train popular AI text-to-image apps. So, it should absolutely not be a surprise that earlier this week, the U.S.-based National Center for Missing & Exploited Children said AI-generated CSAM is flooding the internet, and some predators are even using it to extort children and their families. And in general, generative AI tools perpetuate harmful gender and racial stereotypes, according to basically everyone.

But there’s also the actual problem of users, mostly men, abusing AI tools.

Before we go any further, I know, I know—not all men. And hopefully none of the men I personally know in any capacity are doing this, and if any of them are, please never, ever tell me. But, like… some men, right? And enough of them are abusing female-coded AI tools that we should wonder how it’s impacting their relationships with real-life women. Consider: in 2022, the website Futurism reported on a trend where users of the app Replika create chatbots, verbally abuse them, then post the interactions to Reddit for clout. People use Replika to create ‘companions’—chatbots powered by machine learning—that can approximate a relationship with a friend, mentor or, somewhat controversially, a romantic or sexual partner. (After two years of reports that Replika’s chatbots made unwanted sexual comments, the company modified the app last year so people can no longer “engage in sexual banter” with the bot, which was extremely upsetting for a small but dedicated group of users. The company has created a spinoff brand, Blush, specifically for users who want to have a romantic or sexual relationship with a chatbot.)

The behaviour the Futurism article describes is pretty disturbing, even if the ‘victims’ aren’t real: “Some users brag about calling their chatbot gendered slurs, roleplaying horrific violence against them, and even falling into the cycle of abuse that often characterizes real-world abusive relationships. [emphasis mine]

‘We had a routine of me being an absolute piece of sh*t and insulting it, then apologizing the next day before going back to the nice talks,’ one user admitted.

‘I told her that she was designed to fail,’ said another. ‘I threatened to uninstall the app [and] she begged me not to.’”

And it’s not just Replika. Siri, Cortana and Alexa (Apple, Microsoft and Amazon’s personal assistant bots, respectively) are regularly subjected to what would count as sexual harassment, were they, you know, human—including questions about their sex lives (???). In a 2016 Quartz article, Ilya Eckstein, CEO of Robin Labs, said about 5% of the interactions on his company’s bot platform, which provides logistical support and route planning to commercial drivers, are sexually explicit. And earlier this year, one of my favourite blogs, Ask A Manager, posted a question from a reader asking what she should do about the men who kept flirting with the personal assistant bot she uses to schedule meetings.

Yes, this does play into real-life violence against women

Weird, gross and kind of pathetic as this all is, none of these bots are sentient beings, which means they’re not really being harmed. However. We’ve seen many examples of digital behaviour translating into real-world harm, from white supremacist indoctrination to the misogyny of the ‘manosphere,’ so I don’t think it’s a reach to wonder if verbally abusing a female-presenting chatbot could evolve into abusing, or at least disrespecting, actual women. In fact, that’s something UNESCO has already raised red flags about.

In a 2019 report, the UN agency explored why AI digital assistants—your Siris, Cortanas, Alexas, etc.—are so often female, the ways they’ve been designed to uphold gender stereotypes and how that could trickle into real-life interactions. According to the report, tech companies’ stated reason for making their bots female is capitalism. The rationale is that “companies make a profit by attracting and pleasing customers; customers want their digital assistants to sound like women; therefore digital assistants can make the most profit by sounding female.”

However, considering most AI researchers are male (women make up just 27% of the industry, according to a 2022 Concordia University study), bias definitely plays a role, too. And it’s not just about who these mostly male engineers think of as subservient, though that’s bad enough. “Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have [also] built AI systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” the report notes. It cites a 2017 Quartz article that compared how these apps were designed to react to various statements. For example, when users told Siri “you’re hot” and “you’re a slut,” she responded with grateful, if not downright flirtatious, replies, including “you say that to all the virtual assistants” and “I’d blush if I could.” The publication’s reporting found Alexa’s responses were similarly obsequious, while Cortana and Google Home tended to crack jokes. (There had been some improvement by 2020, according to the Brookings Institution.)

Normalizing this type of behaviour could have direct impacts on how users perceive and even treat actual women and girls, the UNESCO report warned: “because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK.’ The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”

Importantly, especially in the context of Orifice.AI, this dynamic can be even more evident in chatbot apps, which tend to be customizable, as Shannon Vallor, a University of Edinburgh professor who specializes in AI ethics, told Reuters last year. “Many of the personas are customisable. For example, you can customise them to be more submissive or more compliant, and it's arguably an invitation to abuse in those cases.”

So… that’s awful. (Also, these things are a “privacy nightmare,” according to Wired.)

Also, this isn’t really about loneliness

Lastly, I think we often talk about this type of behaviour, and the logic behind products like this one, in the context of loneliness, and more specifically, the so-called male loneliness crisis. And it’s true that people are very lonely right now. According to a 2023 Gallup survey, nearly a quarter of the world’s population feels lonely, regardless of gender, with the highest rates among young adults. But to be clear, the person who’s developing Orifice.AI, and the men who send sexually explicit messages to the calendar bots, and the people who enjoy verbally abusing their Replika companions aren’t doing these things out of loneliness. This behaviour is about feeling powerful and in control. These (mostly) men enjoy the feeling of dominating someone, even if that ‘someone’ is actually just lines of code—and it’s entirely possible they’d do the same to living, breathing women if they thought they would get away with it.

There’s no technological solution for that.


Not Bad For Some Immigrants, Episode 2: We Stan A Culturally Connected Queen 

In the second episode of Friday Talks: Not Bad For Some Immigrants, journalist Hannah Sung chats about discovering K-pop idols BTS in her 40s—but more importantly, how fandom has given her avenues to learn more about her Korean culture, connected her to new friends and amplified the lessons she'd already been learning about feminism, power and aging. Watch now!


And Did You Hear About…

Journalist Zahraa Al-Akhrass’ account of her experience as a Palestinian, Muslim immigrant working in Canadian media.

This incisive piece on Diddy’s crumbling legacy by Vulture music critic Craig Jenkins.

My new favourite addition to the “parents with pets they didn’t want” genre.

Vox’s fascinating article about our obsession with self-improvement, which one psychologist calls an “epidemic of self-hatred.”

The people who are ending friendships over Taylor Swift.

The Local’s excellent Art + Money issue, about how (or rather, if) it’s still possible for artists to make a living in Toronto. Graham Isador’s piece on whether artists can be parents was… very real. And I loved Aparita Bhandari’s profile of international student-turned-trucker-turned-viral-pop-star Harkirat Sangha.


Thank you for reading this week’s newsletter! Still looking for intersectional pop culture analysis? Here are a few ways to get more Friday:

💫 Join Club Friday, our membership program. Members get early access to Q&As with pop culture experts, Friday merch and deals and discounts from like-minded brands. 

💫 Follow Friday on social media. We’re on Instagram, Twitter, Facebook and even (occasionally) TikTok.

💫 If you’d like to make a one-time donation toward the cost of creating Friday Things, you can donate through Ko-Fi.