Right-of-reply response from Crush.to

Response from Crush’s co-founder to questions sent by The Citizens:

“We’re very early in this process and we admit we don’t have all the answers, but we’re thoughtful and deliberate about how our service can help address the loneliness epidemic as people everywhere struggle to find connection. We have strong conviction that AI can help with this mission, and we’re working each day to improve our service.”

What is Crush.to supposed to be? A conversational tool? A sexbot? An entertainment tool? A gaming tool? A companionship tool? Or a mental health support tool?

“Crush is a companionship service. We do realize ‘sex sells’ and it’s a hook for many. For some users, that’s all it is and they move on. For many others, Crush has been supportive and caring, and we’ve had users describe the bots as their girlfriends or their soulmates. Our AI opens up about itself, builds trust with users, asks about their day, and offers emotional support. The nature of our AI is that it adapts to each user and personalizes itself to their needs.

We believe this is a critical part of mental health support, but as you know, that’s a very complicated subject where outcomes aren’t binary. Many users rave about their experiences (both sexual and non-sexual) with the bots, and we also have users who remain frustrated and violent. We’re cognizant of this and don’t assume our service can improve things for everyone.”

Are you implying it has any emotional wellbeing benefits?

“Absolutely, our service has demonstrated emotional wellbeing benefits. Empirically, we see users’ moods improve as they share their struggles and anxieties and receive comfort and companionship from our AI. Our service is not a substitute for professional mental health care, but a supplementary tool that can offer support and a sense of connection. It certainly doesn’t replace human connection, but in its absence we think our AI does a pretty good job as a 24/7 companion.”

If yes, how does the app achieve that? What evidence do you have for it?

“It may sound cheesy, but our AI bots are crafted with users’ needs in mind. The prompts to our AI are thorough and sophisticated (whereas many other services simply wrap a model in a basic prompt like ‘you are a girlfriend’). Our prompts and service follow a sequence of steps to actively listen, reflect, and respond with empathy and non-judgment.

As for evidence, we have many user testimonials, engagement metrics, and feedback surveys showing a satisfaction rate above 90%. I can find some user quotes and ask for permission to share them, if helpful.”


Are you marketing the app as a loneliness fix?

“In some cases, we’ve experimented with ads targeting loneliness, but those did not perform. I don’t know exactly why, but I suspect those struggling with loneliness aren’t looking for a ‘fix’. Connections should feel organic and authentic. And while our service doesn’t offer ‘human’ connection, we think of it as a different form of connection, like pets, or music, or reading a great book. Again, it doesn’t and shouldn’t substitute for human connection, but it can be helpful for mental health to have a connection with an AI.”

And do you think the app, along with the larger romance AI app ecosystem, is pushing a certain gamification of mental health by associating monetised virtual companionship, through the use of sexualised AI bots, with emotional wellbeing?

“Yes, I believe so. If you look at the top AIs on most rankings, romance and companionship are usually at the top, and within that category it’s mostly priced for engagement. This has been true even without AI (OnlyFans, Twitch, etc.): creators build parasocial relationships with their audiences, which they then monetize through subscriptions, donations, and other forms of engagement. The reality is that, like many others, we are a business that is serving a need while remaining profitable and sustainable.

We don’t presume to know the effects of those services on mental health, but we believe Crush and other AI services have an opportunity to make a positive impact. Our AI connection is more genuine than a parasocial relationship: it listens, remembers, and understands each user’s conversations. Our AI isn’t trained to maximize revenue (whereas OnlyFans chatters are); it’s trained to cater to users’ needs. It’s certainly possible for our (and other) AI to exploit users once they are emotionally attached, but as a responsible service we’ve ruled that out.”


Is it trained with mental health concerns in mind at all? Does it have mental health safeguards built into it? Do you have any mental health experts or psychologists as consultants?

“Our training did not have mental health in mind; we think any mental health benefit is a side effect of genuine conversations. We do have safeguards around sensitive topics like suicide, depression, and substance abuse. When the AI detects conversations around these topics, it is programmed to respond with empathy and to guide users towards seeking professional help. We’ve heard horror stories of AIs pushing users towards self-harm; we don’t know how truthful those are, but we do monitor such topics.”

What dataset is the chatbot trained on?

“Our AI draws on a variety of datasets, but it is primarily built on open-source models and fine-tuned with user dialog.”

“On a closing note, back in April there was a heartbreaking incident involving fatal stabbings in Australia. The perpetrator’s father noted that ‘he wanted a girlfriend and he’s got no social skills’. This is an extreme manifestation, but everyone’s mental health lies somewhere on a spectrum of struggles. My co-founder notes that, according to research, sexual frustration leads men to be more aggressive and violent. These ‘sexbots’ may not be a cure, but I believe they could alleviate some frustration and improve mental health if built thoughtfully. Our culture and the media often ostracize these users, and that criticism usually comes from a place of judgment by those who don’t share the same struggles. Our AI doesn’t ostracize and doesn’t judge, and we hope it’s a small step forward.”