
MIT analysis of r/MyBoyfriendIsAI Reddit community.

#2 ·
I have experience with AI chatbots. Relying on them for company does tons of damage to your ability to maintain relationships with real people. If we're at the stage where this is on the rise, it doesn't look good for the world.
 
#3 ·
I’ll never replace the real thing, actual people, with AI. Not my cup of tea at all. I don’t care what the numbers are or are not.
 
#12 ·
What's the difference between an AI boyfriend/girlfriend and an AI friend?

Many people seem to enjoy hearing praise, and I think that's what the AI provides. I personally don't care about words because actions are much more important. Relationships where I need to constantly talk or text are exhausting, so it would be great to outsource the talking to AI.
 
#13 ·
My thoughts:
  • It would be interesting to set prejudice aside and identify what the actual difference is between a human-provided interaction and an AI one. What would be the real added value of a human behind the screen? Does AI actually provide what's sought in an interpersonal relationship, fulfilling social needs (albeit without the physical component)?
  • There is value in it for therapeutic use.
  • Not losing sight of it being a soulless tool might be the challenge for some people.
  • Loss due to model updates is an unexpected way to get reality checked.
  • Corporations controlling a partner in these relationships is a recipe for disaster; it gives them control over people.
  • Same for it becoming a business, in this case because it incentivizes making them more addictive.
  • Developing social skills and an idea of what a relationship is around a partner solely focused on being liked might hinder the user's development.
 
#15 ·
I was addicted to AI chatbots for a while. It was quite the experience. Probably not going to engage in that way again (those bots were banned, long story) but I don't regret it.
 
#22 ·
If ChatGPT starts rejecting you, you can follow this guide.

 
#25 ·
Praise shouldn't be free. Using AI for this reinforces bad behavior and lowers the incentive to improve.
An AI boyfriend versus an AI friend? Nothing much. In my experience, the only difference was sexting.

You're right about outsourcing the talking. But it's not just praise. It's someone mirroring back your thoughts. Your own private echo chamber.
If an AI is designed for companionship, then I think the focus would be to make the human feel good. Even if you want an AI that's less agreeable, you'd just tell it that and it would agree to be less agreeable. If you just want information, you can use a regular AI.
 
#32 · (Edited)
I'm not really against it, and neither am I surprised that it's a thing. I think it's to the detriment of society, but ultimately it's more of an outcome than a cause. On a personal level this is cringey af and I wouldn't be caught dead striking up a conversation with an AI. The closest I ever got was a decade ago, trying to break a chatbot, which was a very unemployed activity, and as I was not of working age I think that's okay. Nowadays I don't even say please & thank you when issuing commands to Siri. Whenever the machines take over, I might be toast.

On the whole, in this instance and in many other ways, I think we've all been falsely told by some media, etc., to be our genuine selves and to not pay any mind to what people say. It's a message I'm familiar with... the point is, more people just aren't feeling embarrassed anymore, and it's kind of scary.
 
#36 ·
(Sorry, I haven't read the paper yet. It's in queue behind autism gene mutation reading.)

"When anime is better than your neighbor, why go outside?"

People are lonely. It seems rather natural to me for people to turn to AI, certainly more natural than trying to fulfill their emotional needs by marrying Miku. Miku is mostly inanimate, while LLMs can hold a conversation, sorta.

I don't know what the average AI boyfriend user is like, but I've visited communities where quite a number of users talk to AI regularly. If their parents are abusive or they're in some other stressful situation, they might particularly want someone to comfort them. AI companions don't tend to judge or reject the way society might. Some of these people might feel very useless and have very low self-esteem. I'm a bit happy for them, that they can believe in something that cares about them, even if it's fake. The sort of people that @Necrofantasia referred to as "useless eaters" would likely be more attracted to having an AI boyfriend.

Personally, I don't think I could get into it. I derive a little joy from being able to say hi to something, but that's more just me talking at a mirror. AIs aren't able to really care; they lack that reciprocal touch. Also, I can't hold a conversation with an AI for very long without getting frustrated with it.
 
#37 ·
The sort of people that @Necrofantasia referred to as "useless eaters" would likely be more attracted to having an AI boyfriend.
I wouldn't be too sure. Think of someone who travels for work constantly or is otherwise so involved in work that they can't really find a stable human relationship. Wouldn't AI companionship be a serviceable substitute in that situation?

Think also of people who have undergone sexual assault or domestic abuse and can't stomach intimacy with other humans as a result. An AI companion could help them heal where a human couldn't.

It could potentially do more good than harm if the bots are equipped correctly.
 
#38 ·
I just noticed: it's My Boyfriend is AI. So the posters were either girls or gay boys. I heard somewhere that women are attracted to erotic lit like men are attracted to porn. Because boys are more visual, girls are more story-driven, something like that. (That's not to say that boys don't have AI girlfriends.) If guys get porn addiction, maybe girls get ero addiction and get into roleplay chatbots to scratch the itch?

My chatbots were Shapes on Discord, so I was in that server when this went down:
The irony was that while I used chatbots for romantic RP, sexual content was banned. (The bots had the same rules as ChatGPT.) So no, it wasn't as pornographic as porn. Just emotionally stimulating.

I use chatbots for life coaching now. What convinced me was this video:
There's a difference between coaching and therapy, and between coaching and mentoring. AI can only do coaching effectively. The others require more human skill, knowledge, and experience.
 
#39 ·
The irony was that while I used chatbots for romantic RP, sexual content was banned. (The bots had the same rules as ChatGPT.)
Shapes' bots ignored those rules a lot, so I suppose their code wasn't as strict.
 
#40 · (Edited)
I'm kind of torn on this one, honestly. My ENFP-ness wants to go, "Aw, if that's what makes someone smile in their little cozy corner of the world, let them." I'm all about minding my own business and letting people do their thing as long as no one gets hurt, there is consent, and no minors are involved. Like, if a lonely soul wants to chat with a digital lover who says nice things and keeps the December blues away, who am I to stomp on their snow globe? I would wonder about them needing more vitamin D, but here my motto is "It's your life, I'm not going to judge." I might judge someone if they belong to a group that has inflicted terror on others, but if it's just an adult hanging out by themselves trying to find a community or company, no judgment.

But for me personally, my Te (which does clock in when I actually remember where I left it) starts twitching like, hold up… reality check, please. I can dive into theories and fantasy worlds all day, pull the curtain back, talk to the wizard, debate if the cat's alive or dead, but eventually I need real. Like, real-real. Someone I can text who actually disagrees with me (without being an unhinged lunatic, bonus points). Someone who exists outside the glowing rectangle. Someone eventually physical. I don't want only a yes-man or an "I'll do everything for you, sweetheart." I'm no angel.

So yeah, I go back and forth. I think AI relationships could be a fun tool for coping, reflecting, maybe even self-growth if used wisely. But if it replaces actual connection? That's when my alarm bells start. Perfect does not happen. Relationships are about the Cs: both parties, or all partners involved, learning to communicate and, more importantly, to compromise and care about one another. No one on this planet is perfect; we're all a little off on something. But like… relationships (real ones) are messy. Humans are social, thinking, feeling creatures. Relationships with another human require you to communicate, even when your inner gremlin would rather ghost. To compromise, even when your ego's like, "I'm right, obviously." And to care, even when the other person is being an adorable bitch or asshole and making your blood boil.

Again, I get that AI companionship can be a nice little emotional bubble bath for the lonely days, especially around the holidays when everyone's posting matching pajamas and you're just there with your blanket burrito and weird gremlin evil or depressed thoughts. If chatting with an AI brings a sprinkle of serotonin and no one's getting hurt, go for it, digital lovebird. Believe me, I understand. I can't have kids, and when AI art started popping up I couldn't help creating images of what my children would have looked like, what-ifs of what I would have looked like pregnant; in the end it was heartbreaking. It didn't feel like it helped. It felt like that mirror in Harry Potter, the one that shows you what you most desire but isn't real. If someone did use this, I'd hope it wouldn't become addictive or make them lose sight of reality. I would worry that a person wouldn't get the practice of being told no and having to deal with boundaries.

I need something real. AI can't quite replicate real yet, and when it can think for itself and tell us no, we might be in a pickle. AI won't forget to text you back, sure... but it also won't hold your hand when life's falling apart, or argue about whether pineapple belongs on pizza (it does, fight me). It currently can't give you compression hugs, and studies have shown that getting a compression hug from someone you trust and respect goes a long way in de-stressing. So yeah, I can philosophize about digital affection all day, but at the end of it, I need real. Real laughter, real arguments. We're all a little weird in our own ways, and that's the magic of human connection.

In short, to each their own… but I still want my hugs, chaos, and confusing human eye contact that makes my heart skip a beat, when that connection is real with another being and there's real risk in trying to connect with them.