Glitchvid

joined 2 months ago
[–] [email protected] 3 points 1 week ago* (last edited 1 week ago) (1 children)

I think it's "the algorithm", people basically just want to be force-fed "content" – look how successful TikTok is, largely because it has an algorithm that very quickly narrows down user habits and provides endless distraction.

Mastodon and other fediverse alternatives, by comparison, have very simple feeds and ways to surface content; they simply don't "hook" people the same way, and that's what they're competing against.

On one hand, we should probably be doing away with "the algorithm" entirely (for reasons not enumerated here, for brevity); on the other hand, maybe the fediverse should build something to accommodate this demand, because otherwise the non-fedi sites will.

[–] [email protected] 1 point 1 week ago

I've got a Xonar Essence STX II still faithfully plugging away in a PCIe slot; it'll be a sad day when I get a new system and it's no longer compatible.

[–] [email protected] 2 points 2 weeks ago

Sony Pictures Core, Kaleidescape, probably a few other niche ones.

[–] [email protected] 6 points 2 weeks ago (3 children)

Streaming services pretty much top out at 80 Mbps, but are more typically around 15–20 Mbps even for 4K content, so even if they straight-up quadrupled the bitrate for 8K content you'd only be hitting UHD BD rates.
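
Back-of-the-envelope, the arithmetic looks something like this (the ~20 Mbps streaming figure and ~50–100 Mbps UHD BD video range are ballpark assumptions, not spec-sheet numbers):

```python
# Rough bitrate comparison: quadrupling a typical 4K stream vs. UHD Blu-ray video rates.
typical_4k_stream_mbps = 20            # ballpark bitrate for a 4K stream
quadrupled_for_8k_mbps = 4 * typical_4k_stream_mbps
uhd_bd_video_mbps = (50, 100)          # ballpark UHD BD video bitrate range

print(f"4x a {typical_4k_stream_mbps} Mbps stream = {quadrupled_for_8k_mbps} Mbps")
print(f"UHD BD video typically runs ~{uhd_bd_video_mbps[0]}-{uhd_bd_video_mbps[1]} Mbps")
# 80 Mbps lands at the low end of the disc's range, despite pushing 4x the pixels.
```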

I don't disagree that BD won't exist for an 8K market, but that's because physical media is being killed off.

This isn't even getting into the actual mastered resolution of much of this content: you're lucky if it's even 4K, since most stuff is still mastered in 2K.

[–] [email protected] 17 points 2 weeks ago* (last edited 2 weeks ago) (9 children)

There isn't a particularly good delivery mechanism for 8K: Blu-ray tops out at UHD/4K, and streaming is so bitrate-starved that 8K doesn't even matter.

[–] [email protected] 1 points 2 weeks ago* (last edited 2 weeks ago)

Most of the VCS ops in Hg are actually written in C.

GitHub is mostly written in Ruby, so that's not really a performance win.

Like I said, we're stuck with Git's UX, but we were never stuck with Hg's performance.

[–] [email protected] 1 point 2 weeks ago (2 children)

I don't think it's hyperbole to say a significant percentage of Git activity happens on GitHub (and other "foundries") – which are themselves a far cry from efficient.

My ultimate takeaway on the topic is that we're stuck with Git's very counterintuitive porcelain, and only satisfactory plumbing, regardless of performance/efficiency; but if Mercurial had won out, we'd still have its better interface (and IMO workflow), and any performance problems could've been addressed by a rewrite in C (or the Rust one that is so very slowly happening).

[–] [email protected] 3 points 1 month ago

Further hampered by the Steam "discussions", which are an essentially unmoderated cesspit.

[–] [email protected] 6 points 1 month ago

Further, the US National Electrical Code requires that a continuous load (3+ hours) be limited to 80% of the circuit's ampacity, so we're looking at closer to 1400 W.

The solution, if you really need that much wattage in the US, is to use a 240 V circuit: dual-pole 20 A breakers are commonly available, yellow Romex (12 AWG) is fine for 240 V @ 20 A, and you just gotta get a NEMA 6-20 outlet.
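
A minimal sketch of that arithmetic, assuming a standard 15 A/120 V branch circuit as the baseline and the 20 A/240 V circuit described above (the breaker and voltage figures are assumptions for illustration):

```python
# Max continuous load (3+ hours) under the NEC 80% rule.
def continuous_watts(breaker_amps: float, volts: float, derate: float = 0.8) -> float:
    return breaker_amps * volts * derate

print(continuous_watts(15, 120))   # ~1440 W on a typical US 15 A / 120 V circuit
print(continuous_watts(20, 240))   # ~3840 W on a 20 A / 240 V circuit (NEMA 6-20)
```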

[–] [email protected] 8 points 1 month ago

I've had failure rates that high with new BD discs, even.

The US BD pressing plant shut down a while ago, and the new ones are very hit or miss; I've gotten several that were heavily scratched or otherwise unreadable – brand new, in a sealed case, from the only NA factory.

[–] [email protected] 3 points 1 month ago (1 children)

If they were a small or free service I wouldn't have much issue, but they do charge, so I don't think it's too much to ask that they at least attempt to scrape the wider web.

Building their own database seems the prudent thing long-term, and I don't doubt they could shore up coverage beyond Bing. They don't have to replace the other indexes wholesale, just supplement them.

[–] [email protected] 8 points 1 month ago (3 children)

They have smallweb and news indexing, but other than that, AFAICT they rely completely on other providers. Which is a shame; Google allows submitting sites for indexing and notifies you if it can't index them.

A scraper of their own wouldn't need to cover everything, since they have access to other indexes, but they really should be developing that capability instead of relying on Bing and other providers to deliver good results, or any results at all.
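
Purely as an illustration of "supplement, don't replace" – the library choices and structure here are my own assumptions, nothing to do with Kagi's internals – a toy gap-filling crawler might look like:

```python
# Toy supplemental crawler: fetch seed pages, record title + outbound links,
# and stop at a fixed page budget. A real crawler also needs robots.txt
# handling, politeness delays, deduplication, etc.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(seeds, budget=50):
    seen, queue, index = set(), list(seeds), {}
    while queue and len(index) < budget:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10)
            resp.raise_for_status()
        except requests.RequestException:
            continue  # unreachable pages are simply skipped
        soup = BeautifulSoup(resp.text, "html.parser")
        title = soup.title.string.strip() if soup.title and soup.title.string else url
        links = [urljoin(url, a["href"]) for a in soup.find_all("a", href=True)]
        index[url] = {"title": title, "links": links}
        queue.extend(links)
    return index

# e.g. crawl(["https://example.com"]) -> {url: {"title": ..., "links": [...]}}
```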
