RadDevon

joined 2 years ago
[–] RadDevon@lemmy.zip 3 points 14 hours ago

Ah, true! Unfortunately for the anonymous LLM whose reputation is at stake, this was something like a platformer.

[–] RadDevon@lemmy.zip 3 points 1 day ago (2 children)

I found a review summary that said the background music in the game made it difficult to see enemies and that I should turn down the BGM track to fix it. 😆

[–] RadDevon@lemmy.zip 9 points 1 week ago

Another frequent transit user here. When people complain that I'm early for something, I like to tell them that, since I ride transit, my choices are to be early or late, but I can't choose to be on time. 😅

[–] RadDevon@lemmy.zip 6 points 2 months ago (1 children)

I’m running this. There are a few things I don’t like about it. The biggest issue for me is that it touts itself as a privacy-centric file converter, but then it makes requests to Google Fonts and Cloudflare. I’ve blocked loading of those scripts in my browser, but I don’t understand why you would add those things to a service that’s supposed to be focused on privacy.

The other issue I had is that the default video conversion server URL is baked into the Docker image. Whereas normally I would configure something like this by simply passing an env var through to the container, here I have to build my own image, which makes updating the container more of a hassle.

Seems to be fine as a file converter though.

[–] RadDevon@lemmy.zip 8 points 2 months ago* (last edited 2 months ago) (1 children)

Skate Story was a real trip. I love the surreality of the thing. It may be my favorite skateboarding game apart from Tony Hawk.

[–] RadDevon@lemmy.zip 3 points 2 months ago

I’m on a much weirder setup than you’re proposing — Bazzite Linux with a Pico 4 connected wirelessly via ALVR — and it mostly just works. I had to jump through a few hoops to get everything working to start, mostly related to tweaking wireless and audio configuration, but these are things I doubt you’ll encounter at all with an Index. I haven’t tried a game yet that doesn’t work. I mostly just care about Beat Saber and a couple of others, but they’re all working well. I’ve even bought a few new games since switching to Linux, and I can’t recall any I’ve tried that don’t work, out of maybe a dozen or so total I’ve tried. I suspect you'll have a much smoother experience with the Index.

[–] RadDevon@lemmy.zip 2 points 4 months ago

I’m using ALVR with a Pico 4 headset on Linux, and it’s pretty solid. I’m not playing a ton of VR games, but so far I haven’t found anything that doesn’t work. It has some quirks, but it works pretty well. I’m very happy with it.

[–] RadDevon@lemmy.zip 1 points 10 months ago

Just remember any backup is better than nothing.

This is comforting.

There are several reasons to back up data only and not the full system. First, you may be unable to find a computer enough like the one that broke, so the old system backup won't even run. Second, even if you can find an identical enough system, do you want to? Maybe it's time to upgrade anyway: there are pros and cons of ARM (Raspberry Pi) vs. x86 servers (there are other, more obscure options you might want, but those are the main ones), and since you're forced to rebuild, you may want to switch. Third, odds are some of the services need to be upgraded, so you may as well use this forced downtime to apply the upgrades. Last, you may change how many servers you have: should you split services across different computers, or consolidate the services from the system that died onto another server you already have?

Some good things to consider here. Whether or not I'll want to upgrade will depend on how far off this theoretical failure is. If storage fails, I might just replace that and restore the backup. If it's something more significant than that and we're 2-3 years down the line, I'll probably look at an upgrade. If it's less than that, I might just replace with the same hardware to keep things simple.

I guess one other upside of the full system backup is that I could restore just the data out of it if I decide to upgrade when some hardware fails, but I don't have the reverse flexibility (to do a full system restore) if I opt for a data-only backup.

[–] RadDevon@lemmy.zip 2 points 10 months ago (1 children)

If you don’t have the budget for on-premises backup, you almost certainly can’t afford to restore the cloud backup if anything goes wrong.

I believe egress is free on Backblaze B2.

Just make sure to test the restore procedure once in a while.

Good call on this. Curious if you have a procedure for actually doing this. I could just wipe out my system and rebuild it from the backup, but then I'm in trouble if it fails. What does a proper test of a backup actually look like?
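To frame the question a bit: assuming a Restic repository (which I'm considering), my rough guess at a restore test would be something like the below, with the repo, credentials, and paths all placeholders, restoring into a scratch directory instead of over the live system:

```shell
# Sketch of a backup restore test, assuming a Restic repo.
# RESTIC_REPOSITORY and RESTIC_PASSWORD are placeholders; export them for your setup.

# 1. Verify repository integrity, re-reading a sample of the actual pack data
restic check --read-data-subset=10%

# 2. Restore the latest snapshot into a scratch directory, not over the live system
restic restore latest --target /tmp/restore-test

# 3. Spot-check restored files against the live copies (paths are examples)
diff -r /tmp/restore-test/home/me/photos /home/me/photos

# 4. Clean up
rm -rf /tmp/restore-test
```

But I don't know if that's enough, or whether people go further (e.g. actually booting services against the restored data).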

[–] RadDevon@lemmy.zip 1 points 10 months ago (1 children)

Check out Borgbase. It's very cheap, and it's an actual backup solution, so it offers features you won't get from Google Drive or whatever you were considering using: e.g. deduplication, recovering data from different points in time, and having the data be encrypted so there's no way for them to access it.

I looked at Borgbase, but I think it will be a bit pricier than Restic + Backblaze B2. It looks like Borgbase is $80/year for 1TB, which would be $72/year on B2, and less if I don't use the full 1TB.

The vast majority of your system is the same as it would be if you install fresh, so you’re wasting backup space in storing data you can easily recover in other ways.

I get this, but it would be faster to restore, right? And the storage I'm going to use for these files is relatively little compared to the overall volume of data I'm backing up. For example, I'm backing up 100GB of personal photos and home movies. Backing up the system, even though it's strictly not necessary, will add something like 5% to that, I think, and I'd lean toward paying another few cents every month for a faster restore.

Thanks for your thoughts on the database backups. It's a helpful perspective!

[–] RadDevon@lemmy.zip 1 points 10 months ago

Much simpler than my solution. I'll look into this. Thank you!

[–] RadDevon@lemmy.zip 1 points 10 months ago (1 children)

Is your script something you can share? I'd love to see your approach. I can definitely live with a few minutes of down time in the early morning.

 

I'm in the process of setting up backups for my home server, and I feel like I'm swimming upstream. It makes me think I'm just taking the wrong approach.

I'm on a shoestring budget at the moment, so I won't really be able to implement a 3-2-1 strategy just yet. I figure the most bang for my buck right now is to set up off-site backups to a cloud provider. I first decided to do a full-system backup in the hopes I could just restore it and immediately be up and running again. I've seen a lot of comments saying this is the wrong approach, although I haven't seen anyone outline exactly why.

I then decided I would cherry-pick my backup locations instead. Then I started reading about backing up databases, and it seems you can't just back up the data directory (or file, in the case of SQLite) and call it good. You need to dump them first and back up the dumps.

So now I'm configuring a docker-db-backup container for each of them: finding database containers and SQLite databases and setting up a backup job for each one. Then I hope to drop all of those dumps into a single location and back that up to the cloud. This means that, if I need to rebuild, I'll have to restore the containers' volumes, restore the backups, bring up new containers, and then restore each container's dump into its new database. It's pretty far from my initial hope of being able to restore all the files and start using the newly restored system.
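For concreteness, the per-database dump step I'm describing would look roughly like this; the container names, credentials, and paths are made up for illustration:

```shell
# Sketch of the dump-then-backup flow; names and paths are placeholders.

BACKUP_DIR=/srv/backups/dumps
mkdir -p "$BACKUP_DIR"

# Postgres container: dump through docker exec rather than copying the data dir,
# so the dump is consistent even while the database is running
docker exec my-postgres pg_dump -U myuser mydb > "$BACKUP_DIR/mydb.sql"

# SQLite file: use the .backup command for a consistent copy
sqlite3 /srv/app/data/app.db ".backup '$BACKUP_DIR/app.db'"

# Then ship the dump directory (plus other cherry-picked paths) off-site
restic backup "$BACKUP_DIR" /srv/photos
```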

Am I going down the wrong path here, or is this just the best way to do it?

 

I'm running a Docker-based homelab that I manage primarily via Portainer, and I'm struggling with how to handle container updates. At first, I had all containers pulling latest, but I thought maybe this was a bad idea as I could end up updating a container without intending to. So, I circled back and pinned every container image in my docker-compose files.

Then I started looking into how to handle updates. I've heard of Watchtower, but I noticed the Linuxserver.io images all recommend not running Watchtower and instead using Diun. In looking into it, I learned it will notify you of updates based on the tag you're tracking for the container, meaning it will never do anything for my containers pinned to a specific version. This made me think maybe I've taken the wrong approach.

What is the best practice here? I want to generally keep things up to date, but I don't want to accidentally break things. My biggest fear about tracking latest is that I make some other change in a docker-compose file and update the stack, which pulls latest for all the containers in that stack and breaks some of them with unintended updates. Is this a valid concern, and if so, how can I overcome it?
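One pattern I've seen suggested (I'm not sure it's best practice) is pinning to a major or major.minor tag, so re-pulling only picks up patch releases, and letting Diun watch that tag for anything bigger. A hypothetical compose fragment, with the image and version made up for illustration:

```yaml
services:
  nextcloud:
    # Pin to a major.minor tag: re-pulling the stack picks up patch releases only
    image: nextcloud:29.0
    labels:
      # Diun Docker-provider labels: watch this container and notify on new tags
      - "diun.enable=true"
      - "diun.watch_repo=true"
```

But I don't know how people decide when a tag bump is safe to apply.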

 

I am running Bazzite 40 on a system with an RTX 4080. Up until yesterday, I was connecting computer -> Samsung HW-Q900C soundbar -> Samsung Q90C TV. I learned that the soundbar doesn't have HDMI 2.1 ports, which is why I hadn't been able to get 120Hz, so I changed my setup to computer -> TV, with the soundbar now connected to the TV via eARC.

Now, I do have 120Hz, but I lost a bunch of other options in my display settings, including HDR. The only options I can set there now are resolution, orientation, refresh rate, and scale. I suspect this is an issue with the TV communicating its capabilities in a way the OS doesn't understand, but I'm not sure how to fix or work around it. Can anyone suggest a fix? Is there a setting I can change on the TV or maybe an app I can run on the computer to manually set the TV's capabilities?

Update: Just discovered kscreen-doctor. Here's the output:

Output: 445 HDMI-0
	enabled
	connected
	priority 1
	HDMI
	Modes:  446:3840x2160@60!  447:4096x2160@120  448:4096x2160@100  449:4096x2160@60  450:4096x2160@50  451:4096x2160@30  452:4096x2160@24  453:4096x2160@24  454:3840x2160@144  455:3840x2160@120*  456:3840x2160@100  457:3840x2160@60  458:3840x2160@50  459:3840x2160@30  460:3840x2160@25  461:3840x2160@24  462:3840x1600@144  463:3840x1600@120  464:3840x1600@60  465:3840x1080@144  466:3840x1080@120  467:3840x1080@60  468:2560x1440@120  469:2560x1080@144  470:2560x1080@120  471:2560x1080@60  472:1920x1080@144  473:1920x1080@120  474:1920x1080@100  475:1920x1080@60  476:1920x1080@60  477:1920x1080@50  478:1920x1080@30  479:1920x1080@25  480:1920x1080@24  481:1680x1050@60  482:1600x900@60  483:1440x900@60  484:1280x1024@75  485:1280x1024@60  486:1280x800@60  487:1280x720@60  488:1280x720@60  489:1280x720@50  490:1152x864@75  491:1024x768@75  492:1024x768@70  493:1024x768@60  494:800x600@75  495:800x600@72  496:800x600@60  497:720x576@50  498:720x480@60  499:640x480@75  500:640x480@73  501:640x480@60 
	Geometry: 0,0 3840x2160
	Scale: 1
	Rotation: 1
	Overscan: 0
	Vrr: incapable
	RgbRange: unknown
	HDR: incapable
	Wide Color Gamut: incapable
	ICC profile: incapable
	Color profile source: incapable

SOLUTION: Turns out this was my goof. I was trying to set up auto-login on my user account. In doing so, I set it to automatically log in to Plasma (X11) instead of Plasma (Wayland). Odd that the default option in that dropdown is not the one you’re currently using, but 🤷‍♂️.

What I’m now trying to figure out is why I can’t set auto-login for Plasma (Wayland). The Apply button is disabled. 🤔

Thanks to everyone who shared ideas.

 

I have a little one-line keyboard customization in my ~/.profile that is intended to make my caps lock key function as escape when pressed or ctrl when held.

# Map Caps Lock to Esc/Ctrl
setxkbmap -option 'caps:ctrl_modifier' && xcape -e 'Control_L=Escape;Caps_Lock=Escape'

It works… but only if I manually run source ~/.profile. The weird thing is that it kinda works without the manual intervention, but Caps Lock also still toggles caps. That doesn't happen after manually sourcing.

I thought this file was automatically sourced at login. If that's the case, shouldn't the customization work automatically without the file having to be manually sourced? Is there some service that needs to be running before this command fires that is not yet running when the file is automatically sourced? Struggling to understand what's happening here… 🤔
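My current guess (unverified) is that ~/.profile is sourced by the login shell before the X session is fully up, so X-dependent commands like these would need to live somewhere that's sourced after the display exists, e.g. ~/.xprofile, which many display managers source at X session start:

```shell
# Hypothetical ~/.xprofile: sourced by many display managers once X is up,
# so DISPLAY is set when these commands run.

# Map Caps Lock to Esc (tap) / Ctrl (hold)
setxkbmap -option 'caps:ctrl_modifier' && xcape -e 'Control_L=Escape;Caps_Lock=Escape'
```

Does that sound right, or is something else going on?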

 

How are people coping with games that just won't run on Linux (aside from leaving them behind)? Do you dual boot Windows? Virtualize? What's your strategy for this?

This will be extremely rare for me since I don't play a lot of competitive stuff, but I'd love to find a solution. I have a large library, and it's bound to happen from time to time.
