this post was submitted on 28 Nov 2025
575 points (94.3% liked)

Selfhosted

53360 readers
288 users here now

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

Rules:

  1. Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.

  2. No spam posting.

  3. Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.

  4. Don't duplicate the full text of your blog or github here. Just post the link for folks to click.

  5. Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).

  6. No trolling.

  7. No low-effort posts. This is subjective and will largely be determined by the community member reports.

Resources:

Any issues on the community? Report it using the report flag.

Questions? DM the mods!

founded 2 years ago

I got into the self-hosting scene this year when I wanted to start up my own website running on an old recycled ThinkPad. I spent a lot of time learning about ufw, reverse proxies, header security hardening, and fail2ban.

Despite all that, I still had a problem with bots knocking on my ports and spamming my logs. I tried some hackery to get fail2ban to read Caddy logs, but that didn't work for me. I nearly gave up and went with Cloudflare like half the internet does, but my stubbornness about open-source self-hosting and the recent Cloudflare outages this year encouraged me to try alternatives.

Coinciding with that, I'd been seeing this thing more and more in the places I frequent, like Codeberg. This is Anubis, a proxy-style firewall that forces the browser client to do a proof-of-work security check, plus some other clever things to stop bots from knocking. I got interested and started thinking about beefing up security.

I'm here to tell you to try it if you have a public-facing site and want to break away from Cloudflare. It was VERY easy to install and configure with a Caddyfile on a Debian distro with systemctl. Within an hour it had filtered multiple bots, and so far the knocks seem to have slowed down.

https://anubis.techaro.lol/

My botspam woes have been seriously mitigated, if not completely eradicated. I'm very happy with tonight's little security upgrade project, which took no more than an hour of my time to install and read through the documentation. The current chain is: Caddy reverse proxy -> Anubis -> services.
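To make that chain concrete, here's a minimal sketch of the Caddy side (hostname and port are placeholders, not my real setup): Caddy terminates TLS and hands every request to Anubis, and only traffic that passes the challenge reaches the backend service.

# Caddyfile: the public site only talks to Anubis
example.com {
    reverse_proxy localhost:8923   # Anubis listens here; its TARGET points at the real service
}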

A good place to start for the install is here:

https://anubis.techaro.lol/docs/admin/native-install/
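Roughly, the native install comes down to an environment file per Anubis instance plus the packaged systemd template unit. Here's a sketch of what mine looks like, written from memory of those docs, so treat the variable names, port, and unit name as approximate and double-check them against the page above:

# /etc/anubis/website.env  (the instance name "website" is a placeholder)
BIND=:8923                      # where Caddy's reverse_proxy points
BIND_NETWORK=tcp
TARGET=http://localhost:3000    # the actual service sitting behind Anubis
POLICY_FNAME=/etc/anubis/website.botPolicies.yaml

# then: sudo systemctl enable --now anubis@website.service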

top 50 comments
[–] non_burglar@lemmy.world 184 points 6 days ago (3 children)

Anubis is an elegant solution to the AI bot scraper issue; I just wish the solution to everything wasn't spending compute everywhere. In a world where we need to rethink our energy consumption and generation, even on clients, this is a stupid use of computing power.

[–] Dojan@pawb.social 106 points 6 days ago* (last edited 6 days ago) (11 children)

It also doesn't function without JavaScript. If you're security- or privacy-conscious, chances are not zero that you have JS disabled, in which case this presents a roadblock.

On the flip side of things, if you are a creator and you’d prefer to not make use of JS (there’s dozens of us) then forcing people to go through a JS “security check” feels kind of shit. The alternative is to just take the hammering, and that feels just as bad.

No hate on Anubis. Quite the opposite, really. It just sucks that we need it.

[–] SmokeyDope@piefed.social 54 points 6 days ago* (last edited 6 days ago) (1 children)

There's a challenge option that doesn't require JavaScript. The responsibility lies on site owners to configure it properly IMO, though you can make the argument that it's not the default, I guess.

https://anubis.techaro.lol/docs/admin/configuration/challenges/metarefresh

From the docs on the Meta Refresh method:

Meta Refresh (No JavaScript)

The metarefresh challenge sends a browser a much simpler challenge that makes it refresh the page after a set period of time. This enables clients to pass challenges without executing JavaScript.

To use it in your Anubis configuration:

# Generic catchall rule
- name: generic-browser
  user_agent_regex: >-
    Mozilla|Opera
  action: CHALLENGE
  challenge:
    difficulty: 1 # Number of seconds to wait before refreshing the page
    algorithm: metarefresh # Specify a non-JS challenge method

This is not enabled by default while this method is tested and its false positive rate is ascertained. Many modern scrapers use headless Google Chrome, so this will have a much higher false positive rate.

[–] cecilkorik@piefed.ca 11 points 6 days ago

if you are a creator and you’d prefer to not make use of JS (there’s dozens of us) then forcing people to go through a JS “security check” feels kind of shit. The alternative is to just take the hammering, and that feels just as bad.

I'm with you here. I come from an older time on the Internet. I'm not much of a creator, but I do have websites, and unlike many self-hosters I think, in the spirit of the internet, they should be open to the public as a matter of principle, not cowering away for my own private use behind some encrypted VPN. I want it to be shared. Sometimes that means taking a hammering. It's fine. It's nothing that's going to end the world if it goes down or goes away, and I try not to make a habit of being so irritating that anyone would have much legitimate reason to target me.

I don't like any of these sort of protections that put the burden onto legitimate users. I get that's the reality we live in, but I reject that reality, and substitute my own. I understand that some people need to be able to block that sort of traffic to be able to limit and justify the very real costs of providing services for free on the Internet and Anubis does its job for that. But I'm not one of those people. It has yet to cost me a cent above what I have already decided to pay, and until it does, I have the freedom to adhere to my principles on this.

To paraphrase another great movie: why should any legitimate user be inconvenienced when the bots are the ones who suck? I refuse to punish the wrong party.

[–] cadekat@pawb.social 14 points 6 days ago (6 children)

Scarcity is what powers this type of challenge: you have to prove you spent a certain amount of electricity in exchange for access to the site, and because electricity isn't free, this imposes a dollar cost on bots.

You could skip the detour through hashes/electricity and do something with a proof-of-stake cryptocurrency, and just pay for access. The site owner actually gets compensated instead of burning dead dinosaurs.

Obviously there are practical roadblocks to this today that a JavaScript proof-of-work challenge doesn't face, but longer term...

[–] daniskarma@lemmy.dbzer0.com 35 points 5 days ago* (last edited 5 days ago) (1 children)

I don't think you have a use case for Anubis.

Anubis is mainly aimed at bad AI scrapers, plus some DDoS mitigation if you have a heavy service.

You are getting hit exactly the same; Anubis doesn't put up a block list or anything, it just puts itself in front of the service. The load on your server and the risk you take are very similar with or without Anubis. Most bots are not AI scrapers, they are just probing, so the hit on your server is the same.

What you want is to properly set up fail2ban or, even better, CrowdSec. That would actually block and ban bots that try to probe your server.

If you are just self-hosting with Anubis, the only thing you are doing is diverting the log noise into Anubis's logs and making your own devices do a PoW every once in a while when you want to use your services.

To be honest, I don't know what you are self-hosting, but unless it's something that's going to get DDoSed or AI-scraped, there's not much point to Anubis.

Also, Anubis is not a substitute for fail2ban or CrowdSec. You need something to detect and ban brute-force attacks. Otherwise an attacker would only need to solve the Anubis challenge once, get the token for the week, and then they are free to attack your services as they like.

[–] sudo@programming.dev 42 points 6 days ago (8 children)

I've repeatedly stated this before: proof-of-work bot management is only proof-of-JavaScript bot management. It is nothing for a headless browser to bypass. Proof of JavaScript does work and will stop the vast majority of bot traffic; that's how Anubis actually works. You don't need to punish actual users by abusing their CPU. POW is a far higher cost on your actual users than the bots.

Last I checked, Anubis has a JavaScript-less strategy called "Meta Refresh". It first serves you a blank HTML page with a <meta> tag instructing the browser to refresh and load the real page. I highly advise using the Meta Refresh strategy. It should be the default.

I'm glad someone is finally making an open source and self hostable bot management solution. And I don't give a shit about the cat-girls, nor should you. But Techaro admitted they had little idea what they were doing when they started and went for the "nuclear option". Fuck Proof of Work. It was a Dead On Arrival idea decades ago. Techaro should strip it from Anubis.

I haven't caught up with what's new with Anubis, but if they want to get stricter bot-management, they should check for actual graphics acceleration.

[–] SmokeyDope@piefed.social 32 points 6 days ago* (last edited 6 days ago) (1 children)

Something that hasn't been mentioned much in discussions about Anubis is that it has a graded tier system for how sketchy a client is, changing the kind of challenge based on a weighted priority system.

The default bot policies it comes with pass squeaky-clean regular clients straight through, only slightly weighted clients/IPs get the metarefresh challenge, and it's only at the moderate-suspicion level that the JavaScript proof of work kicks in. The weight triggers for these levels, the challenge action, and the duration of a client's validity are all configurable.
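For reference, the shipped default policy expresses those tiers roughly like this. This is paraphrased from memory of the docs, so treat the exact field names and weight cutoffs as approximate and check the botPolicies.yaml that ships with your version:

thresholds:
  # squeaky-clean clients: no challenge at all
  - name: minimal-suspicion
    expression: weight <= 0
    action: ALLOW
  # slightly weighted clients: the no-JS metarefresh challenge
  - name: mild-suspicion
    expression:
      all:
        - weight > 0
        - weight < 10
    action: CHALLENGE
    challenge:
      algorithm: metarefresh
      difficulty: 1
  # moderate suspicion and up: the JavaScript proof-of-work challenge
  - name: moderate-suspicion
    expression:
      all:
        - weight >= 10
        - weight < 20
    action: CHALLENGE
    challenge:
      algorithm: fast
      difficulty: 2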

It seems to me that the sites that heavy-hand the proof of work for every client, with validity that only lasts five minutes, are the ones giving Anubis a bad rap. The default bot policy settings Anubis comes with don't trigger PoW on the regular Firefox Android clients I've tried, including hardened IronFox; meanwhile other sites show the finger wag on every connection no matter what.

It's understandable why some choose strict policies, but they give the impression that this is the only way it can be done, which is overkill. I'm glad there are config options to mitigate the impact on the normal user experience.

[–] rtxn@lemmy.world 14 points 5 days ago* (last edited 5 days ago) (11 children)

POW is a far higher cost on your actual users than the bots.

That sentence tells me that you either don't understand or consciously ignore the purpose of Anubis. It's not to punish the scrapers, or to block access to the website's content. It is to reduce the load on the web server when it is flooded by scraper requests. Bots running headless Chrome can easily solve the challenge, but every second a client is working on the challenge is a second that the web server doesn't have to waste CPU cycles on serving clankers.

POW is an inconvenience to users. The flood of scrapers is an existential threat to independent websites. And there is a simple fact that you conveniently ignored: it fucking works.

[–] quick_snail@feddit.nl 24 points 5 days ago (2 children)

Kinda sucks how it makes websites inaccessible to folks who have to disable JavaScript for security.

[–] poVoq@slrpnk.net 25 points 5 days ago (8 children)

It kinda sucks how AI scrapers make websites inaccessible to everyone 🙄

[–] WhyJiffie@sh.itjust.works 12 points 5 days ago (3 children)

there's a fork that has non-js checks. I don't remember the name but maybe that's what should be made more known

[–] smh@slrpnk.net 19 points 5 days ago

The creator is active on a professional slack I'm on and they're lovely and receptive to user feedback. Their tool is very popular in the online archives/cultural heritage scene (we combine small budgets and juicy, juicy data).

My site has enabled js-free screening when the site load is low, under the theory that if the site load is too high then no one's getting in anyway.

[–] url@feddit.fr 21 points 5 days ago (1 children)

Honestly, I'm not a big fan of Anubis. It fucks users with slow devices.

https://lock.cmpxchg8b.com/anubis.html

[–] url@feddit.fr 13 points 5 days ago

Did I forget to mention it doesn't work without JS, which I keep disabled?

[–] 0_o7@lemmy.dbzer0.com 28 points 6 days ago (2 children)

I don't mind Anubis but the challenge page shouldn't really load an image. It's wasting extra bandwidth for nothing.

Just parse the challenge and move on.

[–] Allero@lemmy.today 21 points 6 days ago (2 children)

AFAIK, you can set it up to not have any image, or to use a different one.

[–] kilgore_trout@feddit.it 17 points 6 days ago* (last edited 6 days ago) (1 children)

It's a palette of 10 colours. I would guess it uses an indexed colorspace, reducing the size to a minimum.
edit: 28 KB on disk

[–] CameronDev@programming.dev 9 points 6 days ago (2 children)

An HTTP GET request is a few hundred bytes. The response is 28 KB. That's 280x amplification. If a large botnet wanted to deny service to an Anubis-protected site, requesting that image could be enough.

Ideally, Anubis should serve as little data as possible until the POW is completed. Caching the POW algorithm (and the image) to a CDN would also mitigate the issue.

[–] kilgore_trout@feddit.it 1 points 1 day ago (1 children)

I might agree; still, one could argue that brand recognisability contributes to the service as well.

[–] CameronDev@programming.dev 1 points 1 day ago

Definitely, which is why I suggested hosting the image + JS on a CDN. It keeps brand awareness and lets the CDN take the brunt of any malicious activity. With a bit of code golfing, the data served by Anubis directly prior to the PoW could be a few hundred bytes, without impacting its functionality.

[–] teolan@lemmy.world 9 points 6 days ago (1 children)

The whole point of Anubis is to not have to go through a CDN to withstand scraping botnets.

[–] drkt_@lemmy.dbzer0.com 11 points 5 days ago

Stop playing whack-a-mole with these fucking people and build TARPITS!

Make it HURT to crawl your site illegitimately.

[–] Deathray5@lemmynsfw.com 6 points 4 days ago (1 children)

Unrelated but one day I won't get gender envy from random cartoon woman

[–] henfredemars@infosec.pub 26 points 6 days ago (1 children)

I appreciate a simple piece of software that does exactly what it’s supposed to do.

[–] TerHu@lemmy.dbzer0.com 13 points 5 days ago (1 children)

Yes, please be mindful when using Cloudflare. With them you're possibly inviting in a much, much bigger problem.

https://www.devever.net/~hl/cloudflare

[–] quick_snail@feddit.nl 8 points 5 days ago* (last edited 5 days ago)

Great article, but I disagree about WAFs.

Try to secure a nonprofit's web infrastructure as the one IT guy with no budget for devs or security.

It would be nice if we could update servers constantly and patch unmaintained code, but sometimes you just need to front it with something that plugs those holes until you have the capacity to do updates.

But 100%, the WAF should be run locally, not as a MITM from an evil US corp in bed with DHS.

[–] A_norny_mousse@feddit.org 16 points 6 days ago* (last edited 5 days ago) (7 children)

At the time of commenting, this post is 8h old. I read all the top comments, many of them critical of Anubis.

I run a small website and don't have problems with bots. Of course I know what a DDoS is - maybe that's the only use case where something like Anubis would help, instead of the strictly server-side solution I deploy?

I use CrowdSec (it seems to work with caddy btw). It took a little setting up, but it does the job.
(I think it's quite similar to fail2ban in what it does, plus community-updated blocklists)

Am I missing something here? Why wouldn't that be enough? Why do I need to heckle my visitors?

Despite all that, I still had a problem with bots knocking on my ports and spamming my logs.

By the time Anubis gets to work, the knocking already happened so I don't really understand this argument.

If the system is set up to reject a certain type of request, these are microsecond transactions that do no harm (DDoS excepted).

[–] poVoq@slrpnk.net 12 points 5 days ago* (last edited 5 days ago) (1 children)

AI scraping is a massive issue for specific types of websites, such as git forges, wikis and, to a lesser extent, Lemmy etc., that rely on complex database operations that cannot be easily cached. Unless you massively overprovision your infrastructure, these web applications grind to a halt as scrapers constantly max out the available CPU power.

The vast majority of the critical commenters here seem to talk from a point of total ignorance about this, or assume operators of such web applications have time for the hypervigilance needed to constantly monitor and manually block AI scrapers (which do their best to circumvent more basic blocks). The realistic options for such operators right now are: Anubis (or similar), Cloudflare, or shutting down their servers. Of these, Anubis is clearly the least bad option.

[–] Pastime0293@discuss.tchncs.de 7 points 5 days ago

I also used CrowdSec for almost a year, but as AI scrapers became more aggressive, CrowdSec alone wasn’t enough. The scrapers used distributed IP ranges and spoofed user agents, making them hard to detect and costing my Forgejo instance a lot in expensive routes. I tried custom CrowdSec rules but hit its limits.

Then I discovered Anubis. It’s been an excellent complement to CrowdSec — I now run both. In my experience they work very well together, so the question isn’t “A or B?” but rather “How can I combine them, if needed?”

[–] quick_snail@feddit.nl 7 points 5 days ago (2 children)

With varnish and wazuh, I've never had a need for Anubis.

My first recommendation for anyone struggling with bots is to fix their cache.

[–] Arghblarg@lemmy.ca 13 points 6 days ago* (last edited 6 days ago) (4 children)

I have a script that watches Apache or Caddy logs for poison-link hits and a set of bot user agents, adds the offending IPs to an ipset blacklist, and blocks them with iptables. I should polish it up for others to try. My list of unique IPs is well over 10k in just a few days.
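The blocking half of that approach is just a couple of standard commands; the log-watching script is where the real work happens. A rough sketch (set name and timeout are arbitrary):

# create the blacklist set once; entries expire after a day
ipset create botban hash:ip timeout 86400

# drop anything in the set as early as possible
iptables -I INPUT -m set --match-set botban src -j DROP

# the log watcher then just runs this for each offending IP:
ipset add botban 203.0.113.42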

git repos seem to be real bait for these damn AI scrapers.

[–] mrbn@lemmy.ca 12 points 6 days ago (2 children)

When I visit sites on my cellphone, Anubis often doesn't let me through.

[–] cmnybo@discuss.tchncs.de 11 points 6 days ago (1 children)

I've never had any issues on my phone using Fennec or Firefox. I don't have many addons installed apart from uBlock Origin. I wouldn't be surprised if some privacy addons cause issues with Anubis though.
