I know the title will trigger people, but it's a short, so please briefly hear her out. I've since given this a try and it's incredibly cool. It's a very different experience and provides much better information, AFAICT.

[–] [email protected] 2 points 5 days ago

> evil play-through in a video game

This reminds me of the case of a parent who let his 6-year-old play GTA. It's a notoriously "crime-based" game, rated 18+... yet the kid kept progressing by just doing the ambulance, firefighter, and police missions. I'd call that quite an indicator of their disposition 😉

> AI isn't quite the same as a fictional setting, but it's potentially closer to that than it is to dealing with a real person.

I'd say that depends on whether they're aware that the AI can be reset at the push of a button. I've already encountered people who don't realize they can "start a new chat" and instead keep talking to the chatbot as if it were a real person, then get angry when it doesn't remember something they told it several days earlier. Modern chatbot LLMs are trained to emulate human conversational style, so they can keep the illusion going long enough for people to forget themselves.
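
For anyone wondering why "start a new chat" wipes the slate: in a typical chatbot app the model itself has no persistent memory; the app just resends the running conversation on every turn, and a new chat starts from an empty list. A rough Python sketch of that pattern (`call_model` here is a made-up stand-in, not any vendor's actual API):

```python
history = []  # the chat's entire "memory" lives here, client-side

def call_model(messages):
    # Placeholder for a real LLM API call; here it only reports how much
    # conversation the model was actually shown this turn.
    return f"(model saw {len(messages)} messages this turn)"

def say(user_text):
    history.append({"role": "user", "content": user_text})
    reply = call_model(history)  # the model sees exactly what gets resent here
    history.append({"role": "assistant", "content": reply})
    return reply

def new_chat():
    history.clear()  # the "push of a button": everything it "knew" is gone

print(say("My name is Ada."))   # model saw 1 message
print(say("What's my name?"))   # model saw 3 messages, incl. the earlier turn
new_chat()
print(say("What's my name?"))   # model saw 1 message again; no memory of "Ada"
```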