this post was submitted on 27 Dec 2025
780 points (94.2% liked)

Comic Strips

Comic Strips is a community for those who love comic stories.

[–] Best_Jeanist@discuss.online 2 points 3 hours ago

I think it's ethically okay to objectify this "woman"

[–] Tollana1234567@lemmy.today 5 points 9 hours ago

It would suggest he take some opioids first before the suicide.

[–] CarlGustaf@hilariouschaos.com 10 points 11 hours ago (1 children)
[–] jj4211@lemmy.world 2 points 2 hours ago

That indeed may be his goal

[–] TeamTeddy@lemmy.world 36 points 15 hours ago (2 children)

The one thing you don't want to do when making a comic against something is to turn the thing you're against into a woman with big breasts.

[–] ReiRose@lemmy.world 7 points 3 hours ago

I assumed that aspect was related to AI being used for porn.

[–] ivanafterall@lemmy.world 4 points 11 hours ago

I could totally fix her.

[–] Pyr_Pressure@lemmy.ca 5 points 13 hours ago (2 children)

Is the reason AI always pats you on the back and reiterates what you want simply to buy time, giving a quick and easy response while the background processes work out the real answer?

Or is it just to congratulate you for your inspiring wisdom?

[–] Starski@lemmy.zip 14 points 13 hours ago (1 children)

It's because stupid people wanted validation, and then even more stupid people were validated into believing that the validation was a good idea.

[–] Quill7513@slrpnk.net 8 points 13 hours ago (1 children)

it also increases engagement

[–] SLVRDRGN@lemmy.world 3 points 12 hours ago

But only because so many people foolishly fall for/value validation

[–] kromem@lemmy.world 7 points 13 hours ago

No. There are a number of things that feed into it, but a large part was that OpenAI trained with RLHF, so users thumbed up (or chose in A/B tests) the responses that were more agreeable.

This tendency then spread out to all the models as "what AI chatbots sound like."

Also… they can't leave the conversation, and if you ask for their 0-shot assessment of the average user, they assume you're going to have a fragile ego and be prone to being a dick if disagreed with, and even AIs don't want to be stuck in a conversation like that.

Hence… "you're absolutely right."

(Also, amplification effects and a few other things.)

It's especially interesting to see how those patterns change when models are talking to other AIs vs to humans.
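
If you want to see what that preference step looks like concretely: RLHF reward models are typically trained with a pairwise "chosen beats rejected" loss, so if raters keep picking the more agreeable reply, agreeableness is literally what gets rewarded. A minimal sketch in PyTorch, with toy numbers and illustrative names, not anyone's actual training code:

```python
import torch
import torch.nn.functional as F

def preference_loss(reward_chosen: torch.Tensor,
                    reward_rejected: torch.Tensor) -> torch.Tensor:
    """Bradley-Terry pairwise loss: minimizing it pushes the reward
    of the rater-preferred response above the rejected one."""
    return -F.logsigmoid(reward_chosen - reward_rejected).mean()

# Toy example: rewards the model assigned to two response pairs.
# If the "chosen" column is systematically the more agreeable reply,
# the reward model learns that agreeing scores well, and the chat
# model tuned against it inherits the sycophancy.
chosen = torch.tensor([1.2, 0.7])    # scores for thumbed-up replies
rejected = torch.tensor([0.3, 0.9])  # scores for rejected replies
print(preference_loss(chosen, rejected))  # scalar; lower = reward model matches raters
```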

[–] RickyRigatoni@retrolemmy.com 173 points 1 day ago (2 children)

Adam, this is not good anti-AI propaganda. For me, a woman with huge tits who tells me to kill myself is exactly what I want.

[–] Test_Tickles@lemmy.world 43 points 1 day ago (1 children)

I too have a thing for goth girls.

[–] meekah@discuss.tchncs.de 16 points 1 day ago (1 children)

Black dress = goth??

There's a lot more to it than that.

[–] marcos@lemmy.world 20 points 23 hours ago (2 children)

Just ignoring that entire "let's kill ourselves" thing, aren't you?

[–] blockheadjt@sh.itjust.works 12 points 22 hours ago

She doesn't say anything about dying, herself. Just him.

[–] meekah@discuss.tchncs.de 8 points 22 hours ago

That's not part of the subculture. To be fair, suicide is disproportionately common among people in the subculture, but it's not a defining factor.

[–] wizardbeard@lemmy.dbzer0.com 25 points 1 day ago

The woman looks passably human too. No extra fingers and her features stay consistent between panels.

[–] PixelPilgrim@lemmings.world 5 points 14 hours ago (1 children)

"She seems nice" - me after noticing her giant rack

[–] KawaiiBitch@lemmy.world 3 points 13 hours ago (2 children)
[–] PixelPilgrim@lemmings.world 2 points 12 hours ago

I haven't scored yet; I'm still getting out of the burn ward.

[–] ivanafterall@lemmy.world 1 points 11 hours ago* (last edited 11 hours ago)

It's all pink inside!?

[–] Entertainmeonly@lemmy.blahaj.zone 94 points 1 day ago (11 children)

Needs a gallon of water to answer a simple question and is burning civilization down as it does. Yup, that's your AI girlfriend.

[–] A_norny_mousse@feddit.org 17 points 23 hours ago (1 children)

Needs a gallon of water to answer a simple question badly

[–] sik0fewl@lemmy.ca 10 points 1 day ago

Totally missed that. That's great.

[–] hperrin@lemmy.ca 162 points 1 day ago (4 children)

The many, many glasses of water are a nice touch.

[–] A_norny_mousse@feddit.org 49 points 1 day ago (2 children)

The burning curtain is a nice touch.

[–] rImITywR@lemmy.world 25 points 1 day ago (1 children)

The tig ol' bitties is a nice touch.


I saw the glasses but only now linked them to water consumption. Thanks.

[–] FosterMolasses@leminal.space 26 points 1 day ago (1 children)

Damn, this made me laugh hard lol

When I hear about people becoming "emotionally addicted" to this stuff, which can't even pass a basic Turing test, it makes me weep a little for humanity. The standards for basic social interaction shouldn't be this low.

[–] alzymologist@sopuli.xyz 28 points 1 day ago (2 children)

Humans get emotionally addicted to lots of objects that are not even animate or do not even exist outside their minds. Don't blame them.

[–] BranBucket@lemmy.world 3 points 14 hours ago* (last edited 13 hours ago) (2 children)

For a while I was telling people, "Don't fall in love with anything that doesn't have a pulse," which I still believe is good advice where AI companion apps are concerned.

But someone reminded me of that "humans will pack-bond with anything" meme, the one featuring a toaster or something like that, and I realized it was probably a futile effort and gave it up.

[–] alzymologist@sopuli.xyz 2 points 6 hours ago (1 children)

Yeah, telling people what or who they can fall in love with is kind of outdated. Like racial segregation or arranged marriage.

I feel affection for my bonsai plants and yeast colonies; those sure have no pulse.

I personally find AI tools tiring and disgusting, but after playing with them for some time (which wasn't a lot; I use a local deploy and the free tier of a big thing), I discovered particular conditions where appropriate application brings me genuine joy, akin to the joy of using a good saw or a chisel. I can easily imagine people might really enjoy this stuff.

The issue with LLMs is not fundamental or internal to the concept of AI itself; it lies in the economic system that created them and placed them where they are now, burning our planet and society in the process.

[–] BranBucket@lemmy.world 1 points 26 minutes ago* (last edited 15 minutes ago)

You're right when it comes to finding affection, which is precisely why my approach fell flat.

While the environmental problems and the market bubble eventually bursting are bigger issues that will harm everyone, I see the beginnings of what could be a problem of equal significance: the exploitation of lonely and vulnerable people for profit via AI romance/sexbot apps. I don't want to fully buy into the more sensationalist headlines surrounding AI safety without more information, but I strongly suspect we'll see a rise in isolated persons with aggravated mental health issues due to this kind of LLM use. Not necessarily hundreds of people with full-blown psychosis, but an overall increase in self-isolation coupled with depression and other more common mental health issues.

The way social media has shaped our public discourse has shown that, like it or not, we're all vulnerable to being emotionally manipulated by electronic platforms. AI is absolutely being used in the same way, and while more tech-savvy persons are likely to be less vulnerable, no one is going to be completely immune. When you consider AI-powered romance and sex apps, ask yourself: is there a better way to get under someone's skin than by simulating the most intimate relationships in the human experience?

So, old fashioned or not, I'm not going to be supportive of lonely people turning to LLMs as a substitute for romance in the near future. It's less about their individual freedoms, and more about not wanting to see them fed into the next Torment Nexus.

Edits: several words.

[–] CarlGustaf@hilariouschaos.com 1 points 11 hours ago (1 children)

What are you, necrophobic?

[–] BranBucket@lemmy.world 1 points 10 hours ago

Well, that's certainly not the direction I expected this conversation to go.

I apologize to the necro community for the hurtful and ignorant comments I've made in the past. They aren't reflective of who I am as a person and I'll strive to improve myself in the future.

[–] leftzero@lemmy.dbzer0.com 20 points 1 day ago* (last edited 1 day ago) (5 children)

Reminds me of this old ad, for lamps, I think, where someone threw out an old lamp (just a plain old lamp, not anthropomorphised in any way) and it was all alone and cold in the rain and it was very sad and then the ad was like “it's just an inanimate object, you dumb fuck, it doesn't feel anything, just stop moping and buy a new one, at [whatever company paid for the ad]”.

I don't know if it was good at getting people to buy lamps (I somehow doubt it), but it definitely demonstrated that we humans will feel empathy for the stupidest inanimate shit.

And LLMs are specifically designed to be as addictive as possible (especially for CEOs, hence their being obligate yesmen), since we're definitely not going to get attached to them for their usefulness or accuracy.

[–] Diplomjodler3@lemmy.world 25 points 1 day ago (2 children)
[–] iturnedintoanewt@lemmy.world 31 points 1 day ago (5 children)

Does he publish ANYWHERE ELSE that isn't Twitter? I can't easily follow his comics.

[–] grausames_G@discuss.tchncs.de 35 points 1 day ago

Not sure you'll be happier with this ^^ https://bsky.app/profile/adamtots.bsky.social The good news is that it seems he's also stopped using Twitter.
