Glitterkoe

joined 2 years ago
[–] Glitterkoe@lemmy.world 27 points 1 week ago

yesyesyesno

[–] Glitterkoe@lemmy.world 1 points 1 month ago

There are plenty more books/trilogies set in the same world she built!

[–] Glitterkoe@lemmy.world 8 points 1 month ago (3 children)

Robin Hobb's Assassin's Quest (Farseer trilogy 3/3). Devouring those on a holiday like I used to blaze through books as a kid!

[–] Glitterkoe@lemmy.world 4 points 1 month ago

Has to be the city organ, PWOOOOOAAAAAAAAAH

[–] Glitterkoe@lemmy.world 1 points 3 months ago

Thanks for the recommendation! I'll check those out. Just as a personal nitpick, I'll look for one of their models with USB-C rather than micro-USB to be a bit more future-proof.

 

Hiya! Any recommendations for a nice flash for the Sony A7 series?

I currently own an A7iii, but I'm debating an upgrade to the A7iv or A7v whenever it's released. I mostly shoot with my Sony 40mm G, Tamron 35-150mm, and Tamron 17-28mm.

Links to good buying guides would be great, too! (not AI-generated slop recapping Amazon's top 10)

The budget is fairly flexible as long as the price/quality ratio holds up.

[–] Glitterkoe@lemmy.world 6 points 4 months ago

And then you have a trained model that requires vast amounts of energy per request, right? It doesn't stop at training.

You need obscene amounts of GPU power to run the 'better' models within reasonable response times.

In comparison, I could game on my modest rig just fine, but I can't run a 22B model locally in any useful capacity while programming.

Sure, you could argue gaming is a waste of energy too, but that doesn't mean it should cost the energy of boiling a shitload of eggs just to ask an AI how long a single one should boil. Or each time I start typing a line of code, for that matter.

[–] Glitterkoe@lemmy.world 2 points 5 months ago (1 children)

Well, from what I understand, admins have a couple of config keys: PF_OPTIMIZE_IMAGES to toggle the entire optimization pipeline (or accept supported formats as-is), and IMAGE_QUALITY, an integer percentage that tweaks the lossy compression for formats that support it.

The resize to 1080 is even hardcoded in the optimization pipeline. I think I saw a toggle for it on the PHP side, but only the toggling of storage optimization as a whole seems to be exposed to admins. The 1080 is currently not exposed as a parameter to set, sadly.

As a creator, I was interested in retaining the maximum possible quality. Since PNG is widely supported, only uses lossless compression by design, and stays well under 15MB for files with common image aspect ratios, it was the winner in that regard. My uncropped 24MP images come out at around 3MB.

Other formats tend to be way smaller in file size because lossy compression is so effective, and most images I checked on Pixelfed are resized and optimized JPEGs well under 1MB (around 600-800KB). That is probably the file format and size you'll encounter most.

My own file size comparisons were of RAW exports from Darktable at different file formats, qualities, and resolutions. The PHP image pipeline used by Pixelfed will probably yield comparable results for the same image.

If I were to advocate new settings, it would be cranking up the resolution to more modern standards (like fitting a 4K monitor) and converting to WebP at around 85% quality (or sticking with 80%).

It's tricky, though, as that may introduce a double-lossy pipeline when converting from other lossy formats. That's why I looked into resolution settings first. If you upload an image that is too large, the server currently decodes your (possibly lossy) image, resizes it, and re-encodes it at the configured lossy quality where applicable (see the sketch at the end of this comment).

Thus, first order of business: at least publish ideal image sizes.

Second, better quality control. That might involve per-format settings or a unified output file format.
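
To make that double-lossy worry a bit more concrete, here's a rough Python/Pillow sketch of what I understand the server-side step to do. The real pipeline is PHP with Intervention Image, so take this purely as an illustration; only the 1080 px short edge and the quality setting come from the config discussion above, the rest is my own approximation.

```python
from PIL import Image  # pip install pillow

MAX_SHORT_EDGE = 1080   # hardcoded resize target mentioned above
IMAGE_QUALITY = 80      # admin-configurable quality percentage (default 80)

def server_side_optimize(src: str, dst: str) -> None:
    # 1. decode the upload (a JPEG/WebP upload has already lost data once)
    img = Image.open(src)
    w, h = img.size
    # 2. downscale so the short edge fits 1080 px
    if min(w, h) > MAX_SHORT_EDGE:
        scale = MAX_SHORT_EDGE / min(w, h)
        img = img.resize((round(w * scale), round(h * scale)), Image.LANCZOS)
    # 3. re-encode: lossy again for JPEG/WebP (second-generation loss),
    #    effectively lossless for PNG, where the quality setting doesn't apply
    img.save(dst, quality=IMAGE_QUALITY)
```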

 

TL;DR: PNG resized to 1080 pixels on the short edge.

Just recently started dabbling with Pixelfed, but I couldn't find the best upload settings anywhere in the documentation, apart from the 15MB upload limit. The server documentation mentions a quality percentage to use and boolean switches for whether to resize and/or optimize images at that quality.

Turns out images are resized by default to fit 1080 pixels on the short edge and re-encoded using https://image.intervention.io/v2/api/encode at the server's chosen quality percentage (which instances usually don't advertise, but which defaults to 80). Luckily, PNG is accepted nearly everywhere and its compression is lossless.

Those PNGs are way bigger than actual full-resolution WebPs at quality above 80% for my 24MP RAW exports. Most resized and optimized images tend to be smaller than 1MB, though, whereas the allowed PNGs come in just shy of 3MB.

A full-res 24MP test image at 85% WebP quality dives well under that 3MB. Restricting to something more modern, say 4K monitor resolution at 85% WebP, would land well within the current optimized file size range of about 1MB, going by my experiments with Darktable exports.
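
For reference, this is roughly the kind of comparison I ran, here sketched with Python/Pillow instead of Darktable; the file name is a placeholder and the resulting sizes will obviously differ per image.

```python
import io
from PIL import Image  # pip install pillow

img = Image.open("darktable_export_24mp.png")  # placeholder full-res export

def encoded_size_mb(image: Image.Image, **save_kwargs) -> float:
    # encode in memory and report the resulting file size in MB
    buf = io.BytesIO()
    image.save(buf, **save_kwargs)
    return buf.tell() / 1e6

print("full-res PNG (lossless):", encoded_size_mb(img, format="PNG"))
print("full-res WebP q=85     :", encoded_size_mb(img, format="WEBP", quality=85))

# and the same after fitting the 1080 px short edge, like Pixelfed does today
scale = 1080 / min(img.size)
small = img.resize((round(img.width * scale), round(img.height * scale)), Image.LANCZOS)
print("1080 px PNG (lossless) :", encoded_size_mb(small, format="PNG"))
```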

[–] Glitterkoe@lemmy.world 4 points 6 months ago

Cool! I'm glad more people are picking up Darktable! Ever since I switched the 'image processing workflow' to 'scene-referred (sigmoid)' my editing productivity skyrocketed. It's way more intuitive than the filmic RGB module IMHO. How are you finding Darktable?

[–] Glitterkoe@lemmy.world 2 points 6 months ago

Good shower thought

[–] Glitterkoe@lemmy.world 14 points 7 months ago (1 children)

Pretty sight for sure, but the editing is overdone IMHO

[–] Glitterkoe@lemmy.world 6 points 7 months ago (1 children)

I guess if the need for more badges arises, you could always change the design or offer an option then.

Semantically it makes sense to put it after the community (i.e. crossposted from somewhere else) or after the user who did it. I'd rather have this information in some shape or form in that location than have it shaped like a badge.

[–] Glitterkoe@lemmy.world 1 points 8 months ago* (last edited 8 months ago)

The one that lives within, or is in, the living. It's alive in all of us (the community) and the other way around: our contributions live within the app and OSM. Also, it's supposed to be the fork that lives on. I think this would be a subtle nudge without saddling the project with a scarred name like phoenix/revival forever.

 

From what I understood, you didn't want to open-source Summit because you don't want to allocate your resources to managing issues and reviewing pull requests, amongst other reasons (correct me if I'm wrong!). I don't know if you can disable Issues/PRs on GitHub, but I think it would give a lot of (potential) users peace of mind if the source code could be reviewed. As far as licensing goes, you could go quite stringent with the AGPL, if that is a factor, to prevent closed-source clones.

Anyways, I find it sad to see that Summit often gets bashed in Lemmy application discussions for being "yet another proprietary app, no thanks".

That said, if setting up publishing actions or other packaging shenanigans is a hurdle, I'm sure there's people who would love to help.

 

Short version of a past post: I'm considering licensing my startup's software under the LGPL, which mostly concerns our "applied science" libraries. Does anybody have perspectives worth sharing on using, adopting, or depending on LGPL libraries, from a personal or company point of view? How often is the LGPL still "blacklisted" the way the GPL sometimes is?

Amongst other things, the libraries include tooling for a domain-specific language (parser, compiler, language server). The reasoning is that we'd like to lower the barrier to integrating the methods and libraries compared to the GPL, but we don't want proprietary (language) flavors popping up somewhere instead of open-source contributions. It might also somewhat deter larger parties from taking the project over into something entirely proprietary.

Side note: our low-level elemental libraries are mostly MIT/Apache because these things aren't our core business and are mostly filling gaps where standard implementations are missing.

 

Has anyone here got any experience with the WTS drums? I'm in the market for a new set of shells, and although I'm usually a TAMA guy, these shells look like a blast to play around with. The Sweetwater demo and the Sounds Like A Drum coverage on YouTube look and sound promising.

 

Why is any informational discussion of anything that isn't strictly OSI-style open source immediately removed by moderators in this community? How can we expect people to educate themselves through censorship instead of public discussion? I'm looking for an open-source licensing strategy for my own startup, and discussing the do's, don'ts, and even the perception of different licenses and strategies seems highly relevant to this community to me. I could understand it if posts were actively promoting something 'bad', and I wouldn't mind clear tags or disclaimers spelling out what is or isn't strictly OSI, but it feels a bit too rigorous right now 😕

12
submitted 11 months ago* (last edited 11 months ago) by Glitterkoe@lemmy.world to c/feddit_nl@feddit.nl
 

Let me start by saying that I'm really glad a reasonably populated Dutch Lemmy instance already exists. But with instances like this one, or tchncs.de for example, I always get the creeping feeling that it's risky to have a single person as administrator. Should the instance then just be abandoned, in the hope that a new one springs up, if the admin drops out of the picture or whatever?

Is this phase of instances simply part of being an 'early adopter'? I would love to set up or sponsor a foundation or something similar if it committed itself to a set of Dutch federated services with proper statutes that protect their users. I would gladly pay a (voluntary) subscription of a few € per year for that.

Or does this already exist?
