this post was submitted on 14 Mar 2025
32 points (100.0% liked)

Linux


I know Firefox has the very useful "Copy Clean Link" option in the context menu, but I would like a similar feature for copying links from any other software, like Spotify for example. So I am looking for some software that hooks into the clipboard pipeline and cleans any URL that gets added. I tried googling for something like it, but was completely unsuccessful. Does anyone have a clue how I might go about achieving this?

Thanks in advance :)

Edit: I found out about Klipper's actions, which provide the option to run a command when a string matching a regex is added to the clipboard buffer. I am not sure how to properly use this, though, so any help is appreciated!
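For anyone experimenting with the same idea, here is a rough sketch of the kind of cleaner such an action could call. It assumes wl-clipboard for Wayland, and the tracking-parameter names (utm_*, fbclid, gclid, si) are illustrative examples, not an exhaustive list:

```shell
# clean_url: strip common tracking parameters from the URL in $1.
# Sketch only -- the parameter list (utm_*, fbclid, gclid, si) is
# illustrative, not exhaustive.
clean_url() {
    cleaned=$(printf '%s' "$1" | sed -E 's/[?&](utm_[a-z]+|fbclid|gclid|si)=[^&]*//g')
    case $cleaned in
        *\?*) ;;   # a '?' survived, nothing to fix
        *\&*) cleaned=$(printf '%s' "$cleaned" | sed 's/&/?/') ;;  # first kept param needs '?'
    esac
    printf '%s\n' "$cleaned"
}

# Klipper (or a hotkey) would run something like: wl-paste | cleaner | wl-copy
clean_url 'https://open.spotify.com/track/abc123?si=xyz'   # → https://open.spotify.com/track/abc123
```

The case statement handles the awkward edge where the removed parameter was the first one, leaving a stray `&` that has to be promoted back to `?`.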

top 26 comments
[–] [email protected] 4 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

That'd be cool. Whenever I'm sharing a YT link, I'm always a bit suspicious of what info the youtu.be URL is hiding, so I paste it into a browser to get a clean URL.

Maybe this is silly, but it'd be cool to do that automatically.

[–] [email protected] 4 points 3 weeks ago* (last edited 3 weeks ago)

Well, for YouTube it's quite easy; there are only 4 useful parameters that I can think of: the video id v, the playlist id list and index if it's a playlist, and the time t if you're sending a specific time in the video. Everything else can be removed. Here's what uBlock Origin removes with the AdGuard URL Tracking filter list:

! Youtube
$removeparam=embeds_referring_euri,domain=youtubekids.com|youtube-nocookie.com|youtube.com
$removeparam=embeds_referring_origin,domain=youtubekids.com|youtube-nocookie.com|youtube.com
$removeparam=source_ve_path,domain=youtubekids.com|youtube-nocookie.com|youtube.com
||youtube.com^$removeparam=pp
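That allowlist idea (keep only v, list, index and t, drop everything else) can also be sketched in plain shell. This is a rough sketch, and it assumes a plain watch URL with no #fragment part:

```shell
# yt_clean: keep only the useful YouTube parameters (v, list, index, t)
# and drop everything else. Rough sketch; assumes no '#fragment' part.
yt_clean() {
    base=${1%%\?*}                # everything before the query string
    query=${1#*\?}
    [ "$query" = "$1" ] && { printf '%s\n' "$1"; return; }   # no '?' at all
    kept=''
    old_ifs=$IFS; IFS='&'
    for pair in $query; do        # split the query string on '&'
        case ${pair%%=*} in
            v|list|index|t) kept="$kept&$pair" ;;
        esac
    done
    IFS=$old_ifs
    if [ -n "$kept" ]; then
        printf '%s?%s\n' "$base" "${kept#&}"
    else
        printf '%s\n' "$base"
    fi
}

yt_clean 'https://www.youtube.com/watch?v=jNQXAC9IVRw&pp=tracker&t=42'
# → https://www.youtube.com/watch?v=jNQXAC9IVRw&t=42
```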
[–] [email protected] 4 points 3 weeks ago

The problem is: how do you distinguish URL parameters that are essential from URL parameters that are used to track?

[–] [email protected] 3 points 3 weeks ago (1 children)

I wish that option was the default in Firefox.

[–] [email protected] 3 points 3 weeks ago (2 children)

It rarely ever does anything in my experience.

Anyway, I built a URL-cleaning script in AutoHotkey, but that's Windows-only; I, too, am on the hunt for a Linux equivalent. Maybe this could be done in SikuliX or Espanso, via a Python script, but I suck at Python so far.

[–] [email protected] 4 points 3 weeks ago (1 children)

It's never worked for me either. The ClearURLs addon has a function to copy a clean URL and that works great though. It's open source, so maybe someone could turn its cleaning function into a program that could be used for the clipboard.

[–] [email protected] 1 points 3 weeks ago

Hmm. Thanks, though I'm seeking something universal/offline, so if I get URLs from other platforms I won't have to turn to the browser to purge them of junk. Or maybe this could be converted into a standalone program...

[–] [email protected] 2 points 3 weeks ago

Alternatively you can configure a hotkey in the GNOME Settings, or the equivalent for other DEs, to execute a bash script or anything.
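For GNOME specifically, the custom shortcut can also be registered from the command line via gsettings. A hedged sketch; the script path ~/bin/clean-clipboard.sh and the binding <Super>u are placeholders I made up:

```shell
# Register a custom GNOME keybinding that runs a clipboard-cleaning script.
# The script path and binding below are placeholders; adjust to taste.
KEY_PATH=/org/gnome/settings-daemon/plugins/media-keys/custom-keybindings/custom0/
gsettings set org.gnome.settings-daemon.plugins.media-keys \
    custom-keybindings "['$KEY_PATH']"
gsettings set org.gnome.settings-daemon.plugins.media-keys.custom-keybinding:$KEY_PATH \
    name 'Clean clipboard URL'
gsettings set org.gnome.settings-daemon.plugins.media-keys.custom-keybinding:$KEY_PATH \
    command "$HOME/bin/clean-clipboard.sh"
gsettings set org.gnome.settings-daemon.plugins.media-keys.custom-keybinding:$KEY_PATH \
    binding '<Super>u'
```

This is just the scripted equivalent of adding a custom shortcut in the Settings UI, so it only applies inside a running GNOME session.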

[–] [email protected] 2 points 3 weeks ago

Looks like https://old.reddit.com/r/kde/comments/d3m0fz/how_to_open_links_in_mpv_with_klipper/ is a good starting point, i.e.:

  • Open menu in system tray.
  • Right click on Clipboard => Configure Clipboard.
  • Go to Actions Configuration => Add Action.

then... try it! :D I'm just discovering this too, but it seems like the right way.

That said I'd be cautious and limit the use case to only what you have, e.g. Spotify links, at least at first because I imagine one can get into hairy edge cases quickly.

Keep us posted!

[–] [email protected] 1 points 3 weeks ago (3 children)

You never define "clean".

To strip excess URL parameters (i.e. everything from the first "&" onward, which is almost certainly junk) when the clipboard buffer contains a URL (Wayland only):

# Join clipboard lines with spaces, then pull out anything URL-shaped
# (grep -E replaces the deprecated egrep):
if url=$(wl-paste --no-newline | awk '$1=$1' ORS=' ' | grep -Eo 'https?://[^ ]+'); then
  wl-copy "${url%%&*}"
fi
[–] [email protected] 2 points 3 weeks ago (1 children)

Fair enough, I haven't given that too much thought myself until now. After playing around with Firefox's URL cleaning, I realized there are some parameters I want to keep. So, by clean I mean removing all unnecessary parameters in the URL.

For example, https://youtu.be/jNQXAC9IVRw would stay exactly as it is, but https://www.youtube.com/watch?v=jNQXAC9IVRw keeps its parameter, because it is necessary.

I guess replicating the logic for deciding which parameters to keep is not trivial, so the easiest solution is probably just to paste links into Firefox manually and copy them cleanly from there. Thanks for providing some code, though!

[–] [email protected] 3 points 3 weeks ago (1 children)

There is no general logic for deciding which parameters are useful and which are used for tracking. But there are databases.

Here is the one for the ClearURLs extension and here is the one for the AdGuard URL Tracking filter list (which I recommend everyone enable in uBlock Origin).

[–] [email protected] 1 points 3 weeks ago (1 children)

Oh, nice! That's definitely valuable info. Personally, I do think it's too much work to implement that properly, though.

[–] [email protected] 1 points 3 weeks ago

There are some examples of projects that use the ClearURLs database in its readme, but most have not been updated in a long time.

[–] [email protected] 1 points 3 weeks ago (1 children)

Query parameters are junk? They have tons of legitimate uses, they’re one of the better places to keep state.

[–] [email protected] 1 points 3 weeks ago (1 children)

As a web dev... URL parameters are definitely not the place to keep state... We're not in the '00s anymore. They do have legit uses, but we have JS localStorage nowadays.

[–] [email protected] 2 points 3 weeks ago (1 children)

They have pretty different use cases. localStorage is for when you want persistence across page loads, not necessarily specific to any particular page but specific to a browser. An example would be storing a user-selected light or dark mode.

Query parameters are specific to a page/URL and you get a lot of things for free when you use them:

  • back/forward navigation
  • bookmarking
  • copy-paste to share
  • page level caching
  • access on both server and client

Query parameters are good for things like searches, filters, sorting, etc.
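As a toy sketch of that "copy-paste to share" property: search/filter state can be serialized into a query string with nothing but percent-encoding. The example.com endpoint is made up, and the encoder below handles ASCII input only:

```shell
# urlencode: percent-encode a string, keeping RFC 3986 unreserved characters.
# ASCII-only sketch; multibyte input would need byte-wise handling.
urlencode() {
    s=$1 out=''
    while [ -n "$s" ]; do
        c=${s%"${s#?}"}            # peel off the first character
        s=${s#?}
        case $c in
            [A-Za-z0-9._~-]) out="$out$c" ;;
            *) out=$(printf '%s%%%02X' "$out" "'$c") ;;  # "'x" yields x's char code
        esac
    done
    printf '%s\n' "$out"
}

# The search state lives entirely in the URL, so it survives copy-paste:
printf 'https://example.com/search?q=%s&sort=%s\n' \
    "$(urlencode 'linux url cleaner')" "$(urlencode 'new')"
# → https://example.com/search?q=linux%20url%20cleaner&sort=new
```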

[–] [email protected] 1 points 3 weeks ago (1 children)

I disagree. I definitely prefer REST APIs that use the URL path for searches, filters, and sorting. You get most if not all of the benefits of query parameters, and done correctly it is just as readable as query params.

[–] [email protected] 1 points 3 weeks ago (1 children)

But what if you have multiple optional parameters?

[–] [email protected] 1 points 3 weeks ago (1 children)

Have multiple routes point to the same endpoint, adding the parameters dynamically server-side.

[–] [email protected] 2 points 3 weeks ago (1 children)

That sounds harder than just using query parameters. What are the benefits?

[–] [email protected] 2 points 3 weeks ago

Having more beautiful and structured URLs. I suppose for those cases it's more of a preference, and with the tooling I use (.NET) it's not too difficult to achieve.

I guess my gripe with your original statement was that I was thinking mostly of state like user login etc. I have to concede it's not totally garbage for the cases you mentioned.

[–] [email protected] 1 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

Edit: Oh, OP basically already said the same thing.

I think it really depends on the website and even where you are on the website. For example, if you're on YT, the watch?v=<b64_id> is probably not something you want to throw away. If you're on a news site like imaginarynews.com/.../the-article-title/?tracking-garbage=<...> then you probably do. It's just a matter of having "sane" defaults that work as most people would expect.

[–] [email protected] 2 points 3 weeks ago (1 children)

Sure, but my script only gets rid of the second and later parameters, i.e. ones with & not ?. Personally I don't think I've ever seen a single site where an & param is critical. These days there are few where the ? matters either, but yes, YT is a holdout.

[–] [email protected] 1 points 3 weeks ago (1 children)

There are plenty of sites that use more than one parameter. It's true that a lot of sites now use the history API instead of URL parameters, but you can still find plenty, and you have no guarantee about the parameter order. Any site with a search page that has a few options will probably use URL parameters instead of the history API. It's easier to parse and will end up shorter most of the time.

[–] [email protected] 1 points 3 weeks ago

Search results, sure. Personally I have rarely if ever wanted to save or share such URLs. But sure.