[–] TheOctonaut@mander.xyz 17 points 1 year ago (1 children)

Is Midjourney available for use locally now? Or have you misunderstood Midjourney taking 30 seconds to generate from their server as happening locally?

[–] Gutek8134@lemmy.world 15 points 1 year ago (2 children)

I know a guy who uses Stable Diffusion XL locally with a few LoRAs; last I heard, one 2K image took about 2 minutes on an RTX 30- or 40-series card

IDK how expensive Midjourney is in comparison, but running it locally doesn't sound impossible
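
For anyone curious, this is roughly what that kind of local setup looks like with Hugging Face's diffusers library. A minimal sketch, not his exact setup: the SDXL model ID is the public base checkpoint, the LoRA path and prompt are placeholders.

```python
import torch
from diffusers import StableDiffusionXLPipeline

# Load the SDXL base checkpoint in half precision to fit consumer VRAM
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
).to("cuda")

# Attach a LoRA (placeholder path; any SDXL-compatible LoRA file works)
pipe.load_lora_weights("./loras/my_style_lora.safetensors")

# Generate one image at SDXL's native 1024x1024 resolution
image = pipe(
    prompt="a watercolor lighthouse at dawn",
    num_inference_steps=30,
    guidance_scale=7.0,
).images[0]
image.save("out.png")
```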

[–] TheOctonaut@mander.xyz 13 points 1 year ago (1 children)

The impossible part is that Midjourney isn't publicly available. It's not an open model, so there's nothing to run locally.

[–] Gutek8134@lemmy.world 1 points 1 year ago

Oh, ok. I'm not into image generation, so I didn't know.

[–] herrvogel@lemmy.world 2 points 1 year ago

That sounds like a bit too much. Generating an SDXL image and then scaling it up is the common procedure, but that shouldn't take 2 minutes on a 40-series card. For reference, I can generate 3 batches of 5 images (without the upscaling step) in under 2 minutes on my 4070 Ti, and that's without using faster SDXL variants like Lightning or Turbo.
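
To make the "generate then scale up" procedure concrete: with diffusers it's typically a base pass followed by a light img2img pass at the target resolution. A minimal sketch, not my exact workflow; the prompt, strength, and sizes are only illustrative.

```python
import torch
from diffusers import StableDiffusionXLPipeline, StableDiffusionXLImg2ImgPipeline

# Base SDXL pass at its native 1024x1024 resolution
base = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
image = base(prompt="a watercolor lighthouse at dawn", num_inference_steps=25).images[0]

# "Scale it up": resize to ~2K, then denoise lightly with img2img to restore detail
image = image.resize((2048, 2048))
img2img = StableDiffusionXLImg2ImgPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
hires = img2img(
    prompt="a watercolor lighthouse at dawn",
    image=image,
    strength=0.3,              # low strength: keep the composition, refine detail
    num_inference_steps=20,
).images[0]
hires.save("hires_2k.png")
```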