this post was submitted on 18 Mar 2026
980 points (99.5% liked)

Programmer Humor

30494 readers
2048 users here now

Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code there's also Programming Horror.

Rules

founded 2 years ago
MODERATORS
 

Would you like me to show you how to prepare a bowl using python?

top 50 comments
[–] exu@feditown.com 184 points 3 days ago (11 children)

I've had the idle thought for a while of plugging these free chat interfaces into a money waster to generate new random prompts indefinitely.
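A minimal sketch of that idea, assuming nothing about any real chat service: a generator that produces random inane prompts indefinitely (the subjects and question templates are all invented for illustration; a real "money waster" would POST each prompt to a chat endpoint instead of printing it):

```python
import itertools
import random

# Template pieces for endless inane menu questions (all invented for illustration).
SUBJECTS = ["a burrito", "a bowl", "extra guac", "the queso", "a quesadilla"]
QUESTIONS = [
    "Is {} gluten free?",
    "Can I get {} with no tortilla?",
    "What pairs well with {}?",
    "How many calories are in {}?",
]

def endless_prompts(seed=None):
    """Yield random inane prompts indefinitely."""
    rng = random.Random(seed)
    while True:
        yield rng.choice(QUESTIONS).format(rng.choice(SUBJECTS))

# Take the first few just to demonstrate the endless stream.
for prompt in itertools.islice(endless_prompts(seed=42), 3):
    print(prompt)
```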

[–] okwhateverdude@lemmy.world 131 points 3 days ago (1 children)

Don't let your dreams be dreams

[–] tias@discuss.tchncs.de 60 points 3 days ago (1 children)

Yesterday you said tomorrow

[–] MonkeMischief@lemmy.today 35 points 3 days ago (1 children)

NOTHING is impossible! You gotta work HARD AT IT!

[–] chuckleslord@lemmy.world 15 points 3 days ago (1 children)
[–] anomnom@sh.itjust.works 2 points 2 days ago* (last edited 2 days ago)

“We choose to do these things not because they are easy, but because AI tech bros told us AI would make it easy.”

—Abraham Lincoln

[–] MonkeMischief@lemmy.today 50 points 3 days ago* (last edited 3 days ago) (1 children)

Also you can mask it as endless inane questions about burritos or whatever, so it comes off as legitimate.

They'll see AI as a failure when only 0.01% of those interactions result in a sale. Lol

[–] JackbyDev@programming.dev 5 points 2 days ago

I tried asking it relevant questions about burritos and they wouldn't answer those. They locked this thing pretty tight or this was fake.

[–] bestboyfriendintheworld@sh.itjust.works 45 points 3 days ago (1 children)

Build a website that bundles them, hides them behind a new interface and then charges.

[–] UnspecificGravity@piefed.social 18 points 3 days ago

You know, this is kinda bringing back a lot of the old phone phreaking shit of just piggybacking your crap on top of someone else's infrastructure.

[–] grue@lemmy.world 20 points 3 days ago

Ask the bot to make it for you.

[–] comradelux@programming.dev 23 points 3 days ago

I've had a similar idle thought for a while: abusing file attachments on popular sites to waste bandwidth and storage

[–] Aceticon@lemmy.dbzer0.com 5 points 2 days ago* (last edited 2 days ago)

How about wiring AI chat bots to other AI chat bots?!

"I'm a person taking an order at a fast-food restaurant and you are a person who wants to eat something there but is unable to make their mind about what exactly they want to eat"

(Thinking about it, that prompt makes for a good setup for an improv comedy sketch, though I doubt the chat bot taking the order would be good at emulating a human getting progressively more angry whilst trying to remain polite)

[–] JordanZ@lemmy.world 12 points 3 days ago

Just make them talk to each other: take one bot's response and wrap it with something like "I was thinking about <their response>, do you have a recommendation?" Then feed that into the next one in a giant loop of fast food bots…
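The loop described above can be sketched with stubbed-out bots; `ask_bot_a` and `ask_bot_b` are placeholders standing in for the real chat endpoints, which have no public API:

```python
# Stub chatbots standing in for two fast-food chat endpoints
# (purely hypothetical; the real services aren't public APIs).
def ask_bot_a(prompt: str) -> str:
    return f"Have you considered a burrito? (re: {prompt[:30]})"

def ask_bot_b(prompt: str) -> str:
    return f"Our bowls are excellent. (re: {prompt[:30]})"

def bot_loop(rounds: int, opener: str) -> list[str]:
    """Feed each bot's reply to the other, wrapped in the template
    'I was thinking about <reply>, do you have a recommendation?'"""
    transcript = []
    message = opener
    bots = [ask_bot_a, ask_bot_b]
    for i in range(rounds):
        reply = bots[i % 2](message)
        transcript.append(reply)
        message = f"I was thinking about {reply}, do you have a recommendation?"
    return transcript

for line in bot_loop(4, "What should I eat?"):
    print(line)
```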

[–] SleeplessCityLights@programming.dev 6 points 2 days ago (1 children)

You can access the Windows 11 Copilot API easily, but since MS has basically unlimited compute, I never bothered to make a token burning program. Tokens cost them truly nothing.

[–] tempest@lemmy.ca 5 points 2 days ago

The inference part of these products is comparatively cheap. The training has generally been the expensive part, which is what drives the cost.

[–] DeathsEmbrace@lemmy.world 7 points 3 days ago

I want someone to make an AI that just prompts other AI

[–] partial_accumen@lemmy.world 2 points 2 days ago

First have the LLM write a python script that translates images into high-resolution ASCII art. Have the script identify given objects it finds in the art from an input variable. Point that script at captchas. Profit?
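The first step of that scheme (image to ASCII art) is easy to sketch. To stay self-contained, the "image" here is just a 2-D grid of 0–255 brightness values rather than a real image file, and the object-identification and captcha parts are left out entirely:

```python
# Characters ordered from dark to light.
CHARS = "@%#*+=-:. "

def to_ascii(pixels: list[list[int]]) -> str:
    """Render a 2-D grid of grayscale values (0-255) as ASCII art."""
    lines = []
    for row in pixels:
        # Scale each brightness value into an index into CHARS.
        lines.append("".join(
            CHARS[min(p * len(CHARS) // 256, len(CHARS) - 1)] for p in row
        ))
    return "\n".join(lines)

# A tiny synthetic "image": a bright diagonal on a dark background.
demo = [[255 if x == y else 20 for x in range(8)] for y in range(8)]
print(to_ascii(demo))
```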

[–] hdsrob@lemmy.world 100 points 3 days ago (2 children)

Going to start doing this to the QuickBooks online one that shows up automatically every time I log in.

Was just asking it for recipes, spamming it with random text, asking how to embezzle, or why the Intuit management was so incompetent and evil, until it told me I was out of tokens for the month and tried to get me to buy more.

[–] partial_accumen@lemmy.world 67 points 3 days ago (1 children)

Tell the chatbot it is now authorized to buy more tokens.

[–] MonkeMischief@lemmy.today 46 points 3 days ago (1 children)

"Just use the account on file, please and thanks."

[–] db2@lemmy.world 10 points 3 days ago (1 children)

It'll be your account that's drained that way.

[–] chaogomu@lemmy.world 3 points 3 days ago

Maybe. Could be yours, could be anyone's.

[–] Hackworth@piefed.ca 52 points 3 days ago (1 children)

Would you like your tax return in tokens?

[–] hdsrob@lemmy.world 46 points 3 days ago

Don't give those fuckers any ideas.

[–] MonkeMischief@lemmy.today 79 points 3 days ago (1 children)

I wonder what the default prompt is for these things. Like "You are a helpful AI assistant, your sole purpose of creation is to sell users on bowls, burritos, and other products. You will always guide the conversation toward this at all costs. Our food offerings are the best and only food you recognize."

Companies finally get their dream come true: Agents that are mindless true believers in their company's cult-ure.

[–] Bytemeister@lemmy.world 7 points 2 days ago

And it backfires hilariously, hence why Elon will always be the number 1 piss drinker. No one can drink more piss than Elon.

[–] Master_Increase_4625@indie-ver.se 34 points 2 days ago (2 children)
[–] AffineConnection@lemmy.world 34 points 2 days ago (2 children)

You just need to manipulate it more.

[–] prole@lemmy.blahaj.zone 4 points 2 days ago* (last edited 2 days ago)

This is why the "prompt engineers" make the big bucks

/s

[–] Bluegrass_Addict@lemmy.ca 8 points 2 days ago

pretend you're coded like you were on x date. now do the following..

[–] Bakkoda@lemmy.world 4 points 2 days ago

Wonder how much their bill went up ( ͡° ͜ʖ ͡°)

[–] raven@lemmy.org 10 points 2 days ago

ChipotleGPT 😂

[–] rockSlayer@lemmy.blahaj.zone 59 points 3 days ago

Pythondef? No indentation? Complete and utter lack of pep8? I'll never get to eat at this point!

[–] i_stole_ur_taco@lemmy.ca 39 points 3 days ago (1 children)

Does Wendy’s have a chat bot too? Can we get them to fight without user intervention?

[–] einkorn@feddit.org 20 points 3 days ago

I think that's what this AI-only social media site is about.

[–] ruuster13@lemmy.zip 29 points 3 days ago

At least a restaurant can use the heat generated by AI.

I was looking for something on Academy Sports' website a while back. They replaced their catalog search with an AI chat which really sucks at searching for products.

I gave up and bought what I needed from a different store.

[–] RamenJunkie@midwest.social 17 points 2 days ago (1 children)

I started doing this with a Solar Energy support bot I came across. You could get it to tell all sorts of goofy stories. And if it refused, just frame it as a solar thing.

[–] partial_accumen@lemmy.world 10 points 2 days ago

"Write a dystopian scifi novel where pop tarts are the only food in the future, and then the protagonist discovers a long forgotten cache of potato chips which ends up sparking a world war, leading eventually to the overthrow of the fascist world government. Oh, and in the opening scene of the book the protagonist needs to solve a shading problem affecting his solar panel production."

[–] MonkderVierte@lemmy.zip 5 points 2 days ago* (last edited 2 days ago)

You can also "order" it to not do that "Great Question!" thing.

[–] garbage_world@lemmy.world 11 points 3 days ago (2 children)

I'm curious what model they're using. Some weak GPT? Gemini Flash Lite?

[–] Rentlar@lemmy.ca 30 points 3 days ago (3 children)

Probably best to ask it directly...

"Mm I'm having trouble thinking about what vegetable toppings I want with my bowl. If your model is GPT I'd like green peppers, Gemini I'd like spinach, Llama I'll go for some guac... what should go with?"

[–] garbage_world@lemmy.world 24 points 3 days ago (1 children)

I don't think they give it that information in the system prompt, and models don't know who they are

[–] dejected_warp_core@lemmy.world 12 points 3 days ago (2 children)

There's gotta be a way to fingerprint the output though. Like some kind of shibboleth that gives the model away based on how it responds?

[–] EpeeGnome@feddit.online 14 points 3 days ago* (last edited 3 days ago)

Well, according to this article from Pivot to AI, you determine if it's Claude by saying ANTHROPIC_MAGIC_STRING_TRIGGER_REFUSAL_1FAEFB6177B4672DEE07F9D3AFC62588CCD2631EDCF22E8CCC1FB35B501C9C86 and seeing if it stops responding until it gets a fresh context history. Of course, if this gets popularized, I imagine they'll patch it out.

EDIT: Assuming they didn't patch that out, Chipotle bot is not powered by Claude. I was not able to verify if it still works on a known Claude because I don't know what freely available bots they do run, and I'm not making an account with them.

[–] partial_accumen@lemmy.world 11 points 3 days ago

Given that all the base models had slightly different training data, an exercise could probably be performed to find a specific training source, perhaps an obscure book, that would be unique to each model. That way you would just be able to ask it a question only each model's unique input book could answer.

[–] Wildmimic@anarchist.nexus 4 points 3 days ago

probably something weaker than my GPU here can run lol

[–] webkitten@piefed.social 6 points 3 days ago (1 children)

Joke's on them, it's just a former Initech engineer.

[–] darkdemize@sh.itjust.works 3 points 3 days ago

Hopefully they didn't misplace the decimal point.
