Thanks, I always wondered how other people solved this exact problem.
Now I know.
Poorly.
Carry on.
There's a correct way to solve the problem. You can get a GPU dock for like $100-300. Or you can get a long PCIe riser for like $20, hook up an unused PSU, and jump the start pins on the ATX cable with a paperclip. There's a correct way to use multiple PSUs too, but I had a paperclip and didn't have a jumper cable. Anyway, it's stable enough that uptime is at a year and it hasn't started a fire yet.
I'm not saying I have a better solution or a better setup at home. I'm doing something extremely similar. I would 100% say that my solution sucks... but it works.
There's plenty of ways to do it.
I wanted to get an x16 to dual OCuLink x8 connector card with a matching dock, so I wouldn't have to give up any bandwidth while getting high-end graphics going, but I can't find anything that fits that bill. So I have a riser cable bodged in, with the case of my system perpetually open.
It works. I don't love it, but I don't have the time/money to find/buy what I feel would be more ideal. So this is what I've done.
I don't want to give up half the PCIe lanes, and I can't find a way to avoid that with anything that's not a riser cable.
So here I am.
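For a rough sense of what dropping from x16 to x8 costs in raw bandwidth, here's a quick back-of-the-envelope calculation. It assumes PCIe 4.0 (the thread doesn't say which generation) and uses the nominal ~1.97 GB/s usable per lane after 128b/130b encoding:

```python
# Back-of-the-envelope PCIe link bandwidth (assumption: PCIe 4.0,
# 16 GT/s per lane with 128b/130b encoding; protocol overhead ignored).
PER_LANE_GBPS = 16e9 * (128 / 130) / 8 / 1e9  # ~1.97 GB/s usable per lane

for lanes in (16, 8):
    print(f"x{lanes}: ~{lanes * PER_LANE_GBPS:.1f} GB/s")

# x16: ~31.5 GB/s
# x8:  ~15.8 GB/s
```

Whether that halving matters in practice depends on the workload; for gaming the real-world hit is usually smaller than the raw numbers suggest, but it's the tradeoff the dual-OCuLink idea was meant to avoid.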
Oh, I had a gaming computer with two PSUs like this once.