this post was submitted on 03 Dec 2025
54 points (100.0% liked)

[–] riskable@programming.dev 2 points 2 months ago (1 children)

Why'd you give up on local image generation? With FLUX-based models and tools like ComfyUI, it's actually better than what you get with cloud-based services. You have a lot more control, and the wide availability of LoRAs makes it much more fun/useful, IMHO.
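
For what it's worth, here's roughly what local FLUX generation with a LoRA looks like if you script it with Hugging Face diffusers instead of (or alongside) ComfyUI. This is only a sketch; the model ID is FLUX.1-schnell as an example, and the LoRA repo/path is a placeholder you'd swap for whatever you actually use:

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell",  # distilled FLUX variant, runs in very few steps
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()  # offloads layers to system RAM so it fits on smaller GPUs

# Optional: load a style LoRA (the repo/path below is a placeholder)
# pipe.load_lora_weights("your-user/your-style-lora")

image = pipe(
    prompt="a flat two-tone icon of a router, minimal, consistent front view",
    num_inference_steps=4,   # schnell-style distilled models need very few steps
    guidance_scale=0.0,      # distilled models are typically run without CFG
    width=768,
    height=768,
).images[0]
image.save("out.png")
```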

Having said that, if you don't have a modern GPU with at least 8GB of VRAM, it's not going to be a great experience. 16GB is preferable.

My great wish is for affordable, fast GPUs with at least 32GB of VRAM. That would be enough to play a modern AAA game while running other AI workloads at the same time (e.g. as a secondary aspect of the game).

I have two really fantastic game ideas that can't really exist without the average gamer having access to that level of hardware. Not for fancy graphics; for the AI possibilities 😁

[–] Alphane_Moon@lemmy.world 2 points 2 months ago (1 children)

I have a 3080 with 10 GB VRAM.

I wasn't getting very good results with Automatic1111; it felt like I was spending more time fiddling with config than getting the images I was looking for (nothing special, mostly generic home-hardware-type objects, but done in a specific and consistent style and view).

32 GB VRAM won't be common for at least another 5-7 years IMO, perhaps more.

[–] riskable@programming.dev 1 points 2 months ago (1 children)

What model? I highly recommend trying AnimePro Flux in ComfyUI. It generates really great images in just six steps, which is about 8 seconds for a 768x768 image on my 4060 Ti 16GB. It'd be even faster on your 3080.
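
If you'd rather script it than click through the graph, ComfyUI also exposes a local HTTP API. A rough sketch, assuming a default install listening on port 8188 and a workflow exported from the UI in API format (the node id below is a placeholder that depends on your graph):

```python
import json
import urllib.request

# Load a workflow exported from ComfyUI via "Save (API Format)"
with open("workflow_api.json", "r", encoding="utf-8") as f:
    workflow = json.load(f)

# Example tweak before queueing: change the sampler's step count.
# The node id "3" is a placeholder and depends on your exported graph.
# workflow["3"]["inputs"]["steps"] = 6

payload = json.dumps({"prompt": workflow}).encode("utf-8")
req = urllib.request.Request(
    "http://127.0.0.1:8188/prompt",  # ComfyUI's default local endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))  # response includes a prompt_id you can poll for the result
```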

[–] Alphane_Moon@lemmy.world 1 points 2 months ago

I don't really remember which Stable Diffusion model I used; I just followed a guide.

I am just looking to generate simple two-tone flat icons and very basic 3D photo-style images of products, to save time making presentations. I would usually just cook something up based on free icon sets or Google Images, which actually doesn't take long.

Gemini works pretty well for this (and supports multimodal prompts), but I would prefer to use local models.