semi

joined 3 years ago
[–] semi@lemmy.ml 2 points 2 months ago* (last edited 2 months ago)

This is exciting. My only request here is: whenever it works, please release a standalone wasm file somewhere (anywhere). So many projects either require building the wasm yourself, or instead of releasing a .wasm, they release a JS wrapper that auto-loads the wasm/wasm-imports. It's a pain to try to extract the wasm out of those projects.

What I am doing is creating an omnikee-lib crate within the project that will get compiled to WASM, rather than plain keepass, because I need additional adapter methods to interface with the web part of the application. I don't have the bandwidth to turn keepass into a general WASM package that could be installed from npm at the moment. As I am dogfooding the crate, I might get to a point where I know what a good JS interface for it would be, though, and the omnikee-lib crate could become the official WASM interface for keepass.
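To illustrate the idea of an adapter layer, here is a rough sketch of what such wrappers could look like. All names here (DatabaseHandle, open, entry_titles) are hypothetical, not the actual omnikee-lib API; the point is that WASM-facing methods return owned, JS-friendly values instead of borrowed keepass types.

```rust
// Hypothetical adapter layer: thin wrappers that a WASM build could expose,
// converting keepass-internal types into plain owned values for JS.

pub struct DatabaseHandle {
    // Stand-in for a decrypted keepass database.
    entries: Vec<(String, String)>, // (title, username) pairs
}

impl DatabaseHandle {
    // In the real crate this would call into keepass with the key material.
    pub fn open(data: &[u8], _password: &str) -> Result<Self, String> {
        if data.is_empty() {
            return Err("empty database file".to_string());
        }
        Ok(Self {
            entries: vec![("example entry".to_string(), "user".to_string())],
        })
    }

    // JS-friendly accessor: owned Strings instead of borrowed iterators.
    pub fn entry_titles(&self) -> Vec<String> {
        self.entries.iter().map(|(title, _)| title.clone()).collect()
    }
}

fn main() {
    let db = DatabaseHandle::open(&[1, 2, 3], "password").unwrap();
    println!("{:?}", db.entry_titles());
}
```

In a real WASM build, methods like these would additionally carry wasm-bindgen annotations so they can be called from the Vue.js side.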

[–] semi@lemmy.ml 2 points 2 months ago* (last edited 2 months ago)

sweet! I sent you the invite.

Currently, SSH key management is not supported, but it would probably be possible to implement the SSH agent protocol in the Rust part of the application. I see that russh has an SSH agent server implementation. Let me know if you are interested in contributing such a feature - I am currently working on exposing all the custom entry fields in the UI, so the project ~~would almost be ready.~~ edit: would be ready to add that feature now
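For a sense of what implementing the agent side involves, here is a toy sketch of the SSH agent wire protocol framing (per draft-miller-ssh-agent): every message is a 4-byte big-endian length, then a type byte plus payload. This handler only answers a "list identities" request with an empty key list; a real implementation (e.g. on top of russh's agent server) would enumerate keys stored in the KeePass database.

```rust
// Message type numbers from the SSH agent protocol draft.
const SSH_AGENTC_REQUEST_IDENTITIES: u8 = 11;
const SSH_AGENT_IDENTITIES_ANSWER: u8 = 12;

// `msg` is an unframed message body (type byte first); returns a framed reply.
fn handle_message(msg: &[u8]) -> Option<Vec<u8>> {
    match msg.first()? {
        &SSH_AGENTC_REQUEST_IDENTITIES => {
            // Reply body: answer type byte + uint32 key count (zero keys here).
            let body = [SSH_AGENT_IDENTITIES_ANSWER, 0, 0, 0, 0];
            // Frame it: 4-byte big-endian length prefix, then the body.
            let mut framed = (body.len() as u32).to_be_bytes().to_vec();
            framed.extend_from_slice(&body);
            Some(framed)
        }
        _ => None, // all other message types are out of scope for this sketch
    }
}

fn main() {
    println!("{:?}", handle_message(&[SSH_AGENTC_REQUEST_IDENTITIES]));
}
```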

[–] semi@lemmy.ml 2 points 2 months ago

thanks for your interest! I have sent you a response with an invite link.

 

cross-posted from: https://lemmy.ml/post/29344090

I'm the original author of the Rust keepass crate and wanted to prototype whether it would be possible to build a cross-platform password manager using that crate, Tauri, and Vue.js. It turns out, it is!

I have also come up with a way to compile the keepass crate to WebAssembly, so that I can additionally deploy the app to a web browser without any installation needed. See the architecture page in the docs for how that is done.

The app is now working on 4 of the 5 platforms that Tauri supports, with only iOS missing since I own neither an iPhone nor an Apple Developer account.

The feature set is still pretty barebones, but the hard parts of decrypting databases, listing entries, etc. are all working, so I wanted to share the proof-of-concept to gather feedback and gauge interest in building this out further.

If you are an Android user and would like to help me release OmniKee on Google Play, please PM me an e-mail address associated with your Google account and I can add you to the closed test. I will need 12 testers signed up for the test for 14 days to get the permissions to fully release.

 


[–] semi@lemmy.ml 7 points 2 months ago (1 children)

Since it doesn't come installed by default on a fresh system, my guess would be that you won't break anything fundamental, but this is pure speculation.

[–] semi@lemmy.ml 2 points 3 months ago

The mirrorlist is a configuration file listing the servers that package updates get pulled from.

When a package update contains a configuration file that you have modified locally, pacman will not overwrite your old file; instead, the new version is installed with a .pacnew extension so that you can merge the two files (like you did). It will keep complaining at you until you remove the .pacnew file, which is safe to do once you have merged successfully.
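The merge-then-remove workflow can be sketched like this (a self-contained demo in a temp directory; the file names are made up, the real ones live under /etc):

```shell
# Simulate a config file and its .pacnew counterpart in a temp dir.
tmp=$(mktemp -d)
printf 'Server = old-mirror\n' > "$tmp/mirrorlist"
printf 'Server = new-mirror\n' > "$tmp/mirrorlist.pacnew"

# Inspect what changed before merging (diff exits non-zero when files differ).
diff -u "$tmp/mirrorlist" "$tmp/mirrorlist.pacnew" || true

# After merging the parts you want, drop the .pacnew so the warnings stop.
# Here we simply take the new file wholesale:
mv "$tmp/mirrorlist.pacnew" "$tmp/mirrorlist"

# Nothing left to merge:
find "$tmp" -name '*.pacnew'
```

On a real system, pacdiff (from the pacman-contrib package) automates finding and merging .pacnew files.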

The graphical issues are probably due to something else that happened during the update.

[–] semi@lemmy.ml 1 points 4 months ago* (last edited 4 months ago)

Thanks for the comment. I have had exposure to similar claims, but wasn't seeing anyone using AMD GPUs for AI unless they were somehow incentivized by AMD, which made me suspicious.

In principle, more competition in the AI hardware market would be amazing, and Nvidia GPUs do feel overpriced, but I personally don't want to deal with the struggles of early adoption.

[–] semi@lemmy.ml 2 points 4 months ago* (last edited 4 months ago) (2 children)

For inference (running previously-trained models that need lots of RAM), the desktop could be useful, but I would be surprised if training anything bigger than toy examples on this hardware would make sense because I expect compute performance to be limited.
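The intuition that RAM matters more than compute for inference comes from a common back-of-envelope heuristic: token generation is usually memory-bandwidth-bound, since every weight is read once per generated token. A quick sketch (the bandwidth and model numbers below are illustrative assumptions, not measurements of any specific hardware):

```rust
// Upper bound on generation speed for a memory-bound LLM:
// tokens/s ≈ memory bandwidth / bytes read per token (≈ model size).
fn max_tokens_per_s(bandwidth_gb_s: f64, params_b: f64, bytes_per_param: f64) -> f64 {
    let model_bytes = params_b * 1e9 * bytes_per_param;
    bandwidth_gb_s * 1e9 / model_bytes
}

fn main() {
    // A 70B-parameter model at 8 bits/weight, on unified-memory-desktop-class
    // bandwidth (~256 GB/s) vs discrete-GPU-class bandwidth (~1000 GB/s).
    println!("{:.1} tok/s", max_tokens_per_s(256.0, 70.0, 1.0));
    println!("{:.1} tok/s", max_tokens_per_s(1000.0, 70.0, 1.0));
}
```

The heuristic ignores compute, KV-cache traffic, and batching, which is exactly why a high-RAM desktop can be useful for fitting large models while still being slow for training.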

Does anyone here have practical recent experience with ROCm and how it compares with the far-more-dominant CUDA? I would imagine that compatibility is much better now that most models are using PyTorch and that is supported, but what is the performance compared to a dedicated Nvidia GPU?

[–] semi@lemmy.ml 1 points 6 months ago

Here is an exported result list from Kagi that should be accessible without an account.

[–] semi@lemmy.ml 9 points 6 months ago* (last edited 6 months ago)

Lucky for you, the post contains an animated JPEG showing the change over time. Lemmy clients that don't support playback will only show a static image.

[–] semi@lemmy.ml 3 points 8 months ago* (last edited 8 months ago) (1 children)

This will work in general. One point of improvement: right now, if the request fails, the panic will cause your whole program to crash. You could change your function to return a Result<Html, SomeErrorType> instead, and handle errors more gracefully in the place where your function is called (e.g. ignoring pages that returned an error and continuing with the rest).

Look into anyhow for an easy-to-use error-handling crate, allowing you to return an anyhow::Result<Html>.
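The shape of that refactor can be sketched with std-only types (fetch_page and the String-for-Html simplification are stand-ins for your actual scraper code, not your real function):

```rust
use std::error::Error;

// Returns a Result instead of panicking; with anyhow, the signature would
// simply become `fn fetch_page(url: &str) -> anyhow::Result<String>`.
fn fetch_page(url: &str) -> Result<String, Box<dyn Error>> {
    if url.starts_with("https://") {
        Ok(format!("<html>contents of {url}</html>"))
    } else {
        Err(format!("unsupported URL: {url}").into())
    }
}

fn main() {
    let urls = ["https://example.com", "ftp://example.com"];
    for url in urls {
        // The caller decides what a failure means: here we skip and continue
        // with the remaining pages instead of crashing the whole program.
        match fetch_page(url) {
            Ok(html) => println!("fetched {} bytes", html.len()),
            Err(e) => eprintln!("skipping {url}: {e}"),
        }
    }
}
```

Inside fetch_page, fallible calls can then use the `?` operator to propagate errors upward instead of unwrap/expect.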

[–] semi@lemmy.ml 40 points 10 months ago

12.5 / 8 = 1.5625, so the Euro price went up by 56.25%.
