riskable

joined 2 years ago
[–] riskable@programming.dev 1 points 3 months ago

2 terabytes of VRAM‽ 🧐

[–] riskable@programming.dev 40 points 3 months ago (5 children)

So let me get this straight: Stephen Miller is so universally hated that if he doesn't house himself on a protected military base, he fears for his life and family. His response to this is to double down on his continuous campaign of human rights violations‽

Dude! You can only live "safe" like that for three more years. Not even that long if Trump dies of a stroke/heart attack (which seems increasingly likely). Vance isn't going to protect you like this!

Now's the time to start making friends in Nazi sympathizing countries.

[–] riskable@programming.dev 9 points 3 months ago

We have a giant sulcata tortoise: Throw those melon shells into our yard! She'll really appreciate it 👍

[–] riskable@programming.dev 7 points 3 months ago* (last edited 3 months ago) (2 children)

It's much, much more complicated than mere rehabilitation vs. punishment/isolation. When someone goes to prison for a minor drug offense—like this guy—what exactly are we "rehabilitating"? I seriously doubt he had a real addiction.

Then there are things like organized crime: by imprisoning gangsters we're simply removing them from society so they can't commit crimes against people who aren't also in prison. But this doesn't solve the problem of a gangster being able to commit crimes such as ordering a murder from within prison (e.g. via their lawyer or a secret cell phone).

For such people, we have the death penalty (presumably).

Then there's white collar crime and fraud. Do those people belong in prison, or should they instead be forced to live in "affordable housing" with one too many people sharing the same home, work a minimum wage job with 100% of their wages given to their victims, and regularly be forced to work overtime? Oh sorry, that's my "real justice for rich fraudsters" fantasy 😁

For health insurance executives, we should also make them wait on hold every day to get someone to push the button that unlocks the door to their room. Once a year, we'll make them go through a lengthy bureaucratic process in order to prove that they need access to running water. It should take at least a week.

[–] riskable@programming.dev 8 points 3 months ago

FYI: https://en.wikipedia.org/wiki/Florian_M%C3%BCller_%28author%29?wprov=sfla1

(He's cited in the article)

Florian is a paid shill. I recognized his name immediately. "That guy‽"

He totally destroyed his reputation by producing anti-FOSS propaganda for Microsoft and Oracle.

[–] riskable@programming.dev 33 points 3 months ago (1 children)

Any judge who does this should be forcibly removed from any and all cases even remotely related to religion. Because clearly, they put their religious beliefs above the law.

If their religion says what to do about any given thing and that thing comes up in their courtroom? BAM! Instant disqualification. Get a proper, impartial judge who won't deny people legitimate government services because their religion dictates all their actions (aka theocracy).

[–] riskable@programming.dev 1 points 3 months ago

For reference, every AI image model uses ImageNet (as far as I know), which is just a big database of publicly accessible URLs and metadata (classification info like "bird").
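To make the "URLs plus metadata" point concrete, here's a minimal sketch of what such a dataset record boils down to. The field names and URLs are purely illustrative—this is not the actual ImageNet schema:

```python
# An ImageNet-style dataset is, at its core, a big list of records pairing
# a publicly accessible image URL with classification metadata.
# Field names and URLs below are made up for illustration only.
records = [
    {"url": "https://example.com/images/0001.jpg", "label": "bird"},
    {"url": "https://example.com/images/0002.jpg", "label": "tortoise"},
]

def labels_for(records):
    """Collect the distinct class labels present in the dataset."""
    return sorted({r["label"] for r in records})

print(labels_for(records))  # ['bird', 'tortoise']
```

A training pipeline then just fetches each URL and feeds the image/label pairs to the model—the dataset itself never contains the images.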

The "big AI" companies like Meta, Google, and OpenAI/Microsoft have access to additional image data sets that are 100% proprietary. But what's interesting is that the image models that are constructed from just ImageNET (and other open sources) are better! They're superior in just about every way!

Compare what you get from, say, ChatGPT (DALL-E 3) with a FLUX model you can download from civit.ai... the results are so superior it's like night and day! Not only that, but you have an enormous plethora of LoRAs to choose from to get exactly the type of image you want.

What we're missing is the same sort of open data sets for LLMs. Universities have access to some stuff but even that is licensed.

[–] riskable@programming.dev 1 points 3 months ago* (last edited 3 months ago) (3 children)

Listen, if someone gets physical access to a device in your home that's connected to your wifi, all bets are off. Having a password to gain access via adb is irrelevant. The attack scenario you describe is absurd: if someone's in a celebrity's home, they're not going to go after the robot vacuum when the thermostat, tablets, computers, TV, router, access point, etc. are right there.

If an attacker is physically in the home, the owner has already been compromised. The fact that the owner of a device can open it up and gain root is irrelevant.

Furthermore, since owners have root, they can add a password themselves! That's something they can't do with a lot of other devices in their home that they supposedly "own" but don't have that power over (devices I'm 100% certain have vulnerabilities).

[–] riskable@programming.dev -3 points 3 months ago (2 children)

> stole all that licensed code.

Stealing is when the owner of a thing doesn't have it anymore, because it was stolen.

LLMs aren't "stealing" anything... yet! Soon we'll have them hooked up to robots, and then they'll be stealing¹ 👍

  1. Because a user instructed it to do so.
[–] riskable@programming.dev 1 points 3 months ago

I guess I get to merge my code and never work on this project again.

This is the way.

[–] riskable@programming.dev 216 points 3 months ago (4 children)

FYI: That's more Windows games than run in Windows!

WTF? Why? Because a lot of older games don't run in versions of Windows newer than the ones they were made for! They still run great in Linux though 👍

[–] riskable@programming.dev 4 points 3 months ago

A pet project... a web novel publishing platform. It's very fancy: it uses yjs (CRDTs) for collaborative editing, GSAP for special effects (that authors can use in their novels), and it's built on Vue 3 (with VueUse and PrimeVue) on the frontend and Python 3.13 with FastAPI on the backend.
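For anyone unfamiliar with why yjs/CRDTs matter for collaborative editing, here's a toy grow-only counter CRDT in Python. This is not how yjs works internally (yjs uses far more sophisticated sequence CRDTs for text), just a minimal sketch of the core property collaborative editors rely on: replicas can merge each other's state in any order and still converge:

```python
# Toy G-Counter CRDT: each replica only increments its own slot, and
# merging takes the element-wise max, which makes merge commutative,
# associative, and idempotent -- so all replicas converge.
class GCounter:
    def __init__(self, replica_id):
        self.replica_id = replica_id
        self.counts = {}  # replica_id -> count

    def increment(self):
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + 1

    def merge(self, other):
        # Element-wise max: applying the same merge twice changes nothing.
        for rid, n in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), n)

    def value(self):
        return sum(self.counts.values())

a = GCounter("a")
b = GCounter("b")
a.increment(); a.increment()  # two edits on replica a
b.increment()                 # one edit on replica b
a.merge(b)
b.merge(a)
print(a.value(), b.value())  # 3 3 -- both replicas converge
```

yjs applies the same idea to ordered text, which is why two authors can type into the same chapter offline and sync up cleanly later.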

The editor is TipTap with a handful of custom extensions that the AI helped me write. I used AI for two reasons: I don't know TipTap all that well, and I really wanted to see what AI code assist tools are capable of.

I've evaluated Claude Code (Sonnet 4.5), gpt5, gpt5-codex, gpt5-mini, Gemini 2.5 (it's such shit; don't even bother), qwen3-coder:480b, glm-4.6, gpt-oss:120b, and gpt-oss:20b (running locally on my 4060 Ti 16GB). My findings thus far:

  • Claude Code: Fantastic and fast. It makes mistakes but it can correct its own mistakes really fast if you tell it that it made a mistake. When it cleans up after itself like that it does a pretty good job too.
  • gpt5-codex (medium) is OK. Marginally better than gpt5 when it comes to frontend stuff (Vite + TypeScript + oh-god-what-else-now, haha). All the gpt5 models (including mini) are fantastic with Python, but they just love to hallucinate and randomly delete huge swaths of code for no f'ing reason. They'll randomly change your variables around too, so you really have to keep an eye on them. It's hard to describe the types of abominations they'll create if you let them, but here's an example: in a bash script I had something like SOMEVAR="$BASE_PATH/etc/somepath/somefile" and it changed it to SOMEVAR="/etc/somepath/somefile" for no fucking reason. That change had nothing at all to do with the prompt! So when I say "you have to be careful," I mean it!
  • gpt-oss:120b (running via Ollama cloud): Absolutely fantastic. So fast! Also, I haven't found it to make random hallucinations/total bullshit changes the way gpt5 does.
  • gpt-oss:20b: Surprisingly good! Also, faster than you'd think it'd be—even when giving it a huge refactor. This model has led me to believe that the future of AI-assisted coding is local. It's like 90% of the way there. A few generations of PC hardware/GPUs and we won't need the cloud anymore.
  • glm-4.6 and qwen3-coder:480b-cloud: About the same as gpt5-mini (for my use cases), but not as fast as gpt-oss:120b, so why bother?

For reference, ALL the models are great with Python. For whatever reason, that language is king when it comes to AI code assist.
