[–] p0358@lemmy.blahaj.zone 7 points 2 hours ago

Well, it's not that there's a particular "problem" in the sense of a bug. It's that if the device can be pushed further, and higher polling gets us lower effective input latency and slightly smoother input, then why wouldn't we do it? It's the same reason gamers get higher-refresh-rate screens (and sometimes try to push even those further), or other such gear.
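To put rough numbers on that (a back-of-the-envelope sketch, assuming input events land at random points within a polling period, so the average added delay is half the period):

```c
#include <stdio.h>

/* Average added input latency ~= half the polling period,
 * assuming events arrive at random points within a period. */
int main(void)
{
    const int rates_hz[] = { 125, 250, 500, 1000 };

    for (int i = 0; i < 4; i++) {
        double period_ms = 1000.0 / rates_hz[i];
        printf("%4d Hz: %.1f ms period, ~%.1f ms average added latency\n",
               rates_hz[i], period_ms, period_ms / 2.0);
    }
    return 0;
}
```

So going from 125 Hz to 1000 Hz cuts the average added delay from ~4 ms to ~0.5 ms, on top of whatever the sensor, OS and display contribute.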

As for the implementation, my module is partially based on a patchset for an actual kernel module, but it's unclear to me whether anyone tried to upstream it, or why that failed if they did. It clearly didn't make it in, and there's no sign of that changing any time soon. Maybe the kernel devs would consider it "unorthodox" to alter the descriptor parameters against what the manufacturer intended.

But some devices do allow themselves to be polled faster and will simply sample their inputs more often, if their firmware and sensors are capable of it. In fact, many "gaming" mice come with proprietary software that uses a proprietary protocol (often with a Linux equivalent, like Solaar for Logitech) to change on-device settings; the mouse then reconnects reporting a different bInterval (requested polling rate) to the host, based on what was set. And yet manufacturers will by default ship some "safe" setting like 125 or 250 Hz at most, just to avoid potential issues on some hosts and thus RMA costs, with opt-in options of 500 and 1000. Some manufacturers don't bother making such an option, or an app at all, and that's where this module comes in. For controllers especially, such an option is much rarer even when an app exists, even though a high proportion of non-Microsoft controllers do allow this kind of overclocking (the Microsoft ones, locked at 125 Hz, are pathetic; you can feel the latency difference between that and my 250 Hz controller overclocked to 500 Hz side-by-side).
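For reference, bInterval doesn't encode the polling rate directly; how it maps to a polling period depends on the bus speed. A small sketch of the USB spec rules (the values below are just for illustration):

```c
#include <stdio.h>

/* Polling period implied by an interrupt endpoint's bInterval:
 * low/full speed: bInterval is a frame count in milliseconds;
 * high speed:     period = 2^(bInterval - 1) * 125 us microframes. */
static double poll_period_us(unsigned binterval, int high_speed)
{
    if (high_speed)
        return 125.0 * (1u << (binterval - 1));
    return 1000.0 * binterval;
}

int main(void)
{
    /* A full-speed mouse: bInterval 8 -> 125 Hz ... 1 -> 1000 Hz */
    for (unsigned b = 8; b >= 1; b /= 2)
        printf("full speed, bInterval %u: %6.0f us period (%4.0f Hz)\n",
               b, poll_period_us(b, 0), 1e6 / poll_period_us(b, 0));
    return 0;
}
```

So "overclocking" here just means presenting a smaller bInterval to the host than the device originally reported.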

But the TL;DR is that it's just a gamer optimization, and one that isn't easily achievable with the upstream kernel currently. Some kernel drivers do offer a degree of overclocking (usbhid's mousepoll/jspoll module parameters come to mind), but e.g. one of them has a bug where it doesn't work with USB 3 devices, so yeah...

 

This DKMS module allows you to overclock some USB devices by overriding their endpoints' bInterval values in the device descriptors, provided the device physically allows being polled at a higher frequency and will actually deliver data more often.
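In case you're curious what the general technique looks like, here's a minimal sketch of the idea only, not the module's actual code (the function name and the notifier-based wiring are mine): walk the active configuration's interrupt IN endpoints and rewrite their bInterval before the class driver starts submitting interrupt URBs.

```c
#include <linux/usb.h>

/*
 * Sketch: force a new bInterval on every interrupt IN endpoint of a
 * device's active configuration. In a real module this would have to
 * run before the HID driver binds and submits its interrupt URBs
 * (e.g. from a USB notifier), followed by a rebind/re-enumeration.
 */
static void override_bintervals(struct usb_device *udev, u8 new_binterval)
{
    struct usb_host_config *cfg = udev->actconfig;
    int i, e;

    if (!cfg)
        return;

    for (i = 0; i < cfg->desc.bNumInterfaces; i++) {
        struct usb_host_interface *alt = cfg->interface[i]->cur_altsetting;

        for (e = 0; e < alt->desc.bNumEndpoints; e++) {
            struct usb_endpoint_descriptor *desc = &alt->endpoint[e].desc;

            if (usb_endpoint_xfer_int(desc) && usb_endpoint_dir_in(desc))
                desc->bInterval = new_binterval;
        }
    }
}
```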

Back on Windows this (with the same method) was rather trivial using the "hidusbf" program. Ever since moving to Linux, I was pretty annoyed that I didn't have a similarly simple way of doing the same thing. So I guess I basically had no choice but to make one.

And the module allows doing that for, in theory, any USB device, without patching and re-compiling the kernel. Installation instructions are in the README (there are .deb, .rpm and AUR packages):

https://github.com/p0358/usb_oc-dkms

So let me know what you think, whether you've managed to overclock any gamepads or other devices, or whether you want to try.

[–] p0358@lemmy.blahaj.zone 1 points 14 hours ago

And yet they nailed the latency down to something surprisingly low; it was much better than Parsec, which I was using over LAN at the time, and that was with the NVIDIA datacenter at 25 ms instead of the 5 ms it is today (people in the city where it's located get a sweet 1 ms).

Of course there's a lot to dislike about the service and the trend overall, such as the recently inflated, outrageous pricing, but from a technical standpoint I was surprised how well it worked, and I'm rather sensitive to latency. You're probably right that there's more latency between the mouse and the monitor already, but that also means the network doesn't necessarily add that much on top...

[–] p0358@lemmy.blahaj.zone 1 points 15 hours ago

I will say I have one funny regression with my HDMI monitor, where it sometimes goes blank for a bit when an app goes full-screen on another monitor, or right after wake-up. I laugh at this, because it's still a superior experience, and the kernel version that introduced it fixed another quirk. The problem isn't with Linux here; it's that this monitor has broken-ass firmware. It resets itself after waking from sleep or changing inputs; I had problems with this under Windows too, and other monitors don't do this. So I'm not going to point fingers in the wrong direction, plus the current state of things doesn't bother me.

The same cannot be said about Windows, where another one of my monitors would randomly reset itself from time to time, which would cause the screen to remove itself from the system and give the whole system a 1-2 minute aneurysm (hope you weren't gaming during that, especially a multiplayer game...). Meanwhile, if that happens to this monitor on Linux, simply nothing happens and I don't even notice it.

Sooo maybe it's dumb luck that things work better, or just as well, on Linux. But it's real. I didn't buy anything specifically for Linux, other than always sticking to AMD and avoiding NVIDIA, because I've long despised the latter. My whole system works great, the laptop I randomly purchased (AMD-based) works great, my parents' laptop works great, my grandma's computer works great, my work machine works great (well, certainly much better than on Windows, though it's not a powerful machine), my friend-with-NVIDIA's computer works great (surprisingly), my other friend's computer works great (after figuring out how to install Arch; he's also suffering from broken monitor firmware, btw), and his girlfriend's computer also works great.

Maybe it's actually dumb misfortune for those who have problems, or terribly obscure hardware on their part. Maybe I live in some great lucky bubble where things mostly just work around me. Hard to tell which group is the majority.

I do have a fingerprint reader on my laptop that doesn't work; that's unfortunate, but I keep forgetting it's even a thing, since I never had one on any other machine anyway. That same poor laptop got a bunch of 1-star reviews on the store's website for "poor work culture", just because Windows 11 would ramp its fans up to 100% for no reason during setup or at idle; this never happens on Linux unless I actually, intentionally hammer it with something. It's crazy.

Okay, one thing I'll have to admit, one actual thing that doesn't work well, oh the irony: my Steam Deck is the only device that has huge problems with my Wi-Fi router. Just that one device out of like 20 others, and just with that router. Drat. I'll have to see if the next major OpenWrt version improves it.

Aaaaanyway, can you tell me more about the DP+HDMI problem? I'm actually somewhat curious. And what GPU do you have? I'm wondering if it's related to anything I've ever seen, or something else entirely.

[–] p0358@lemmy.blahaj.zone 12 points 17 hours ago (2 children)

"Children" aren't some foreign species that lives in a vacuum, who only pupate into actual human beings the moment they hit 18. This line of thinking absolves them off all responsibility and parenting, often used by horrible parents to excuse their behavior because "they're just kids" (aka I completely failed to raise my kid and don't know what to do about it). And it's high school, not kindergarten. You're the weirdo for infantilising them.

[–] p0358@lemmy.blahaj.zone 1 points 18 hours ago* (last edited 18 hours ago)

I am very sad it's using JSON instead of TOML

EDIT: and even sadder if it's just Steam Proton without regular WINE prefixes or umu's protons

[–] p0358@lemmy.blahaj.zone 6 points 18 hours ago (5 children)

I would sooner commit sudoku than ever do anything Kubernetes, and yet shit basically just works for me. Nothing is perfect, but it's 5x better than Windows, so I'm never going back. It seems the server and desktop Linux experiences don't really transfer to each other.

I'm not denying your experience, to be clear. But for some people it really does all work well. Multi-monitor handling on KDE is so superior for me that I don't know how I ever dealt with whatever Windows was doing.

[–] p0358@lemmy.blahaj.zone 4 points 1 day ago

"It's a great opportunity to get rid of your student loan debt! The thing that doesn't exist in most other countries! I won't pressure my elected officials to get rif of it. I will enroll in military so that O can feel smarter than others who didn't and have to pay it back!" 

I've seen that line of thinking...

[–] p0358@lemmy.blahaj.zone 4 points 2 days ago

...does anyone not read the FAQ section as the "fuckyou section"?

[–] p0358@lemmy.blahaj.zone 6 points 2 days ago

To whom? Because searching "gimp" in any search engine only shows the image editor...

[–] p0358@lemmy.blahaj.zone 8 points 2 days ago

So what's off there? What challenges did you have? When one searches "gimp" in any search engine, they only get results for the image editor. One has to really go out of their way, digging into a dictionary, to find the supposed other meanings, and even then a dictionary doesn't mention that it's a slur or anything, unless it's Urban Dictionary. Which of the meanings is supposed to be the bad one being brought up?

And in any case, what would you realistically expect the GIMP project to do? The software has been known worldwide for the past 30 years, and the name is not a problem in the slightest in any non-anglophone country. Throwing away their name and branding could be project suicide; rebrands are risky and, I'd say, often don't go well.

[–] p0358@lemmy.blahaj.zone 4 points 2 days ago (1 children)

Thank you.

And I consider these calls for defederation a good example of the problems the fediverse has at its core. Defederation should be a last resort, reserved for instances that are fully dedicated to or promoting illegal content, or that are simply unmoderated or spammy.

Suggesting a whole instance should be defederated because they dared to ban people for the obnoxious hate speech you've cited definitely does not make feddit look like the bad ones here, whatsoever. That is my opinion here.

[–] p0358@lemmy.blahaj.zone 19 points 2 days ago (3 children)

Maybe in America? I can tell you that in most of the world nobody would even think to give a fuck about the name; it doesn't mean anything there. The word "gimp" isn't even widespread enough.

This sounds like some weird copium: surely the app would have taken off and replaced Photoshop long ago if they'd just changed that damn name! There was one fork that thought so; it had a different name and died shortly after its creation. Because in reality, nobody cares about the name.
