this post was submitted on 14 May 2025
315 points (99.7% liked)

Programming


An update from GitHub: https://github.com/orgs/community/discussions/159123#discussioncomment-13148279

The rates are here: https://docs.github.com/en/rest/using-the-rest-api/rate-limits-for-the-rest-api?apiVersion=2022-11-28

  • 60 req/hour for unauthenticated users
  • 5000 req/hour for authenticated - personal
  • 15000 req/hour for authenticated - enterprise org
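
For anyone wondering which bucket they actually fall into, the REST API exposes a /rate_limit endpoint that reports the current quota without counting against it. A minimal sketch in Python, assuming an optional GITHUB_TOKEN environment variable:

```python
# Minimal sketch: report the current GitHub REST API quota.
# GITHUB_TOKEN is optional; without it the request falls into the
# 60 req/hour unauthenticated bucket.
import json
import os
import urllib.request

req = urllib.request.Request("https://api.github.com/rate_limit")
token = os.environ.get("GITHUB_TOKEN")
if token:
    req.add_header("Authorization", f"Bearer {token}")

with urllib.request.urlopen(req) as resp:
    core = json.load(resp)["resources"]["core"]

print(f"limit={core['limit']} remaining={core['remaining']} reset_epoch={core['reset']}")
```
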
[–] traches@sh.itjust.works 138 points 1 month ago (3 children)

Probably getting hammered by ai scrapers

[–] adarza@lemmy.ca 98 points 1 month ago (1 children)
[–] rickyrigatoni@lemm.ee 12 points 1 month ago (1 children)

Yeah but they're allowed to do it because they have brazillions of dollars.

[–] db0@lemmy.dbzer0.com 13 points 1 month ago

The funny thing is that rate limits won't help them with genai scrapers

[–] potatopotato@sh.itjust.works 11 points 1 month ago

Everything seems to be. There was a period where you could kinda have a sane experience browsing over a VPN or otherwise from a cloud service IP range, but especially over the past 6 months or so things have gotten exponentially worse by the week. Everything is moving behind Cloudflare or other systems.

[–] hackeryarn@lemmy.world 81 points 1 month ago (2 children)

If Microsoft knows how to do one thing well, it’s killing a successful product.

[–] henfredemars@infosec.pub 25 points 1 month ago (4 children)

I came here looking for this comment. They bought the service to destroy it. It's kind of their thing.

[–] Semi_Hemi_Demigod@lemmy.world 12 points 1 month ago (1 children)
[–] adarza@lemmy.ca 15 points 1 month ago (2 children)

we could have had bob or clippy instead of 'cortana' or 'copilot'

[–] Gork@lemm.ee 14 points 1 month ago

Microsoft really should have just leaned into it and named it Clippy again.

[–] midori_matcha@lemmy.world 65 points 1 month ago

Github is owned by Microsoft, so don't worry, it's going to get worse

[–] tal@lemmy.today 49 points 1 month ago (3 children)

60 req/hour for unauthenticated users

That's low enough that it may cause problems for a lot of infrastructure. Like, I'm pretty sure that the MELPA emacs package repository builds out of git, and a lot of that is on github.

[–] Xanza@lemm.ee 31 points 1 month ago* (last edited 1 month ago) (2 children)

That’s low enough that it may cause problems for a lot of infrastructure.

Likely the point. If you need more, get an API key.
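
In practice "get an API key" means sending a personal access token with each request and backing off when the rate-limit headers say the window is exhausted. A rough sketch of that pattern, assuming a GITHUB_TOKEN environment variable (the example URL is arbitrary):

```python
# Rough sketch: authenticated GitHub API call that waits out the rate-limit
# window once, based on the X-RateLimit-Reset header. GITHUB_TOKEN is assumed.
import os
import time
import urllib.error
import urllib.request

def github_get(url: str) -> bytes:
    req = urllib.request.Request(url)
    token = os.environ.get("GITHUB_TOKEN")
    if token:
        req.add_header("Authorization", f"Bearer {token}")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.read()
    except urllib.error.HTTPError as err:
        reset = err.headers.get("X-RateLimit-Reset")
        if err.code in (403, 429) and reset:
            # Sleep until the advertised reset time, then retry once.
            time.sleep(max(0, int(reset) - time.time()) + 1)
            with urllib.request.urlopen(req) as resp:
                return resp.read()
        raise

print(len(github_get("https://api.github.com/repos/octocat/Hello-World")))
```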

[–] NotSteve_@lemmy.ca 13 points 1 month ago (1 children)

Do you think any infrastructure is pulling that often while unauthenticated? It seems like an easy fix either way (in my admittedly non devops opinion)

[–] onlinepersona@programming.dev 44 points 1 month ago (5 children)

I see the "just create an account" and "just login" crowd have joined the discussion. Some people will defend a monopolist no matter what. If github introduced ID checks à la Google or required a Microsoft account to log in, they'd just shrug and go "create a Microsoft account then, stop bitching". They don't realise they are being boiled and don't care. Consoomer behaviour.

Anti Commercial-AI license

[–] Lv_InSaNe_vL@lemmy.world 40 points 1 month ago* (last edited 1 month ago) (4 children)

I honestly don't really see the problem here. This seems to mostly be targeting scrapers.

For unauthenticated users you are limited to public data only and 60 requests per hour, or 30k if you're using Git LFS. And for authenticated users it's 60k/hr.

What could you possibly be doing besides scraping that would hit those limits?

[–] chaospatterns@lemmy.world 26 points 1 month ago* (last edited 1 month ago)

You might be behind a shared IP with NAT or CG-NAT that shares that limit with others, or might be fetching files from raw.githubusercontent.com as part of an update system that doesn't have access to browser credentials, or Git cloning over https:// to avoid having to unlock your SSH key every time, or cloning a Git repo with submodules that separately issue requests. An hour is a long time. Imagine if you let uBlock Origin update filter lists, then you git clone something with a few submodules, and so does your coworker, and now you're blocked for an entire hour.
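
If an update system can carry a token, one workaround for the raw.githubusercontent.com case is to fetch file contents through the authenticated REST API instead. A hedged sketch; the owner, repo, path and GITHUB_TOKEN values are all placeholders:

```python
# Hedged sketch: fetch a raw file via the contents API with a token, instead of
# an anonymous raw.githubusercontent.com request. All names here are examples.
import os
import urllib.request

OWNER, REPO, PATH = "octocat", "Hello-World", "README"  # placeholder repo/file

req = urllib.request.Request(
    f"https://api.github.com/repos/{OWNER}/{REPO}/contents/{PATH}",
    headers={
        "Accept": "application/vnd.github.raw+json",  # ask for raw bytes
        "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
    },
)
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode())
```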

[–] MangoPenguin@lemmy.blahaj.zone 14 points 1 month ago

60 requests per hour per IP could easily be hit from say, uBlock origin updating filter lists in a household with 5-10 devices.

[–] sturlabragason@lemmy.world 38 points 1 month ago* (last edited 1 month ago) (4 children)

No no, no no no no, no no no no, no no there's no limit

https://forgejo.org/

[–] Xanza@lemm.ee 18 points 1 month ago (1 children)

Until there is.

I think people are grossly underestimating the sheer size and significance of the issue at hand. Forgejo will very likely eventually get to the same point Github is at right now, and will have to employ some of the same safeguards.

[–] FlexibleToast@lemmy.world 25 points 1 month ago (1 children)

Except Forgejo is open source and you can run your own instance of it. I do, and it's great.

[–] Xanza@lemm.ee 7 points 1 month ago (6 children)

That's a very accurate statement which has absolutely nothing to do with what I've said. The fact of the matter is that those who seek out a GitHub alternative generally do so because they dislike Microsoft or closed-source platforms. Which is great, but the platforms with hosted instances see an overwhelmingly significant portion of users who visit because they choose not to selfhost. It's a lifecycle.

  1. Create cool software for free
  2. Cool software gets popular
  3. Release new features and improve free software
  4. Lots of users use your cool software
  5. Running software becomes expensive, monetize
  6. Software becomes even more popular, single stream monetization no longer possible
  7. Monetize more
  8. Get more popular
  9. Monetize more

By step 30 you're selling everyone's data and pushing resource restrictions because it's expensive to run a popular service that's generally free. That doesn't change simply because people can selfhost if they want.

[–] theunknownmuncher@lemmy.world 29 points 1 month ago* (last edited 1 month ago) (1 children)

LOL!!!! RIP GitHub

EDIT: trying to compile any projects from source that use git submodules will be interesting. eg ROCm has more than 60 submodules to pull in 💀
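
To get a feel for how many extra fetches a recursive clone needs, you can count how many submodule URLs point at github.com. A small sketch, assuming it runs from the root of a checkout that has a .gitmodules file:

```python
# Small sketch: count submodules whose URLs point at github.com, i.e. extra
# fetches a fresh `git clone --recurse-submodules` would make against GitHub.
import re
from pathlib import Path

text = Path(".gitmodules").read_text()
urls = re.findall(r"url\s*=\s*(\S+)", text)
on_github = [u for u in urls if "github.com" in u]

print(f"{len(on_github)} of {len(urls)} submodules are fetched from github.com")
```
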

[–] sxan@midwest.social 24 points 1 month ago (2 children)

The Go module system pulls dependencies from their sources. This should be interesting.

Even if you host your project on a different provider, many libraries are on github. All those unauthenticated Arch users trying to install Go-based software that pulls dependencies from github.

How does the Rust module system work? How does pip?

[–] adarza@lemmy.ca 14 points 1 month ago (1 children)

already not looking forward to the next updates on a few systems.

[–] mesamunefire@piefed.social 8 points 1 month ago (2 children)

Yeah, this could very well kill some package managers without some real heavy lifting.

[–] UnityDevice@lemmy.zip 7 points 1 month ago* (last edited 1 month ago) (3 children)

Compiling any larger Go application would hit this limit almost immediately. For example, podman is written in Go and has around 70 dependencies, or about 200 when including transitive dependencies. Not all of the dependencies are hosted on GitHub, but the vast majority are. That means that with a limit of 60 requests per hour it would take you 3 hours to build podman on a new machine.
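
A rough way to check that estimate for any Go project is to count how many modules in go.sum resolve to github.com and divide by the unauthenticated budget. A sketch, assuming it runs from a checkout with a go.sum file and (pessimistically) one request per module:

```python
# Rough sketch: estimate cold-fetch time if every github.com-hosted module in
# go.sum cost one unauthenticated request. Ignores the Go module proxy and caching.
from pathlib import Path

lines = [ln for ln in Path("go.sum").read_text().splitlines() if ln.strip()]
modules = {ln.split()[0] for ln in lines}
on_github = {m for m in modules if m.startswith("github.com/")}

print(f"{len(on_github)} of {len(modules)} modules come from github.com")
print(f"~{len(on_github) / 60:.1f} hours at 60 requests/hour")
```
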

[–] timewarp@lemmy.world 25 points 1 month ago (2 children)

Crazy how many people think this is okay, yet left Reddit because of their API shenanigans. GitHub is already halfway to requiring signing in to view anything, like Twitter (X).

[–] plz1@lemmy.world 15 points 1 month ago (1 children)

They already make you sign in to use search, at least for code.

[–] Sunshine@lemmy.ca 23 points 1 month ago (1 children)
[–] XM34@feddit.org 9 points 1 month ago (1 children)

Codeberg has used way stricter rate limiting since pretty much forever. Nice thought, but Codeberg will not solve this problem, like at all.

[–] ozoned@piefed.social 20 points 1 month ago

Wow so surprising, never saw this coming, this is my surprised face. :-l

[–] atzanteol@sh.itjust.works 18 points 1 month ago

The enshittification begins (continues?)...

[–] daniskarma@lemmy.dbzer0.com 17 points 1 month ago (19 children)

Open source repositories should rely on p2p. Torrenting repos is the way I think.

Not only for this. At any point m$ could take down your repo if they or their investors don't like it.

I wonder if something like that already exists, and whether it could work with git?

[–] thenextguy@lemmy.world 15 points 1 month ago (2 children)

Git is p2p and distributed from day 1. Github is just a convenient website. If Microsoft takes down your repo, just upload to another system. Nothing but convenience will be lost.
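
The "just upload to another system" part really is a couple of commands from any existing clone. A minimal sketch; the backup remote URL is a made-up placeholder:

```python
# Minimal sketch: mirror an existing local clone to a second remote so the
# history survives even if the GitHub copy disappears. URL is a placeholder.
import subprocess

BACKUP_URL = "https://codeberg.org/example/project.git"  # hypothetical remote

# check=False because the remote may already have been added on a previous run.
subprocess.run(["git", "remote", "add", "backup", BACKUP_URL], check=False)
subprocess.run(["git", "push", "--mirror", "backup"], check=True)
```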

[–] witten@lemmy.world 9 points 1 month ago (2 children)

Not entirely true. You lose tickets and PRs in that scenario.

[–] samc@feddit.uk 9 points 1 month ago (3 children)

The project's official repo should probably exist in a single location so that there is an authoritative version. At that point p2p is only necessary if traffic for the source code is getting too expensive for the project.

Personally I think the SourceHut model is closest to the ideal setup for OSS projects. Though I use Codeberg for my personal stuff because I'm cheap and lazy.

[–] DoucheBagMcSwag@lemmy.dbzer0.com 12 points 1 month ago

Is this going to fuck over Obtainium?

[–] irelephant@programming.dev 12 points 1 month ago

It's always blocked me from searching in Firefox when I'm logged out, for some reason.

[–] kevin____@lemm.ee 11 points 1 month ago (1 children)

Good thing git is “federated” by default.

[–] ArsonButCute@lemmy.dbzer0.com 11 points 1 month ago (5 children)

THIS is why I clone all my commonly used repos to my personal Gitea instance.
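
Gitea can do this as a pull mirror that keeps itself synced, set up either in the web UI or via its API. A heavily hedged sketch, assuming a personal instance URL, a GITEA_TOKEN environment variable, and that your Gitea version exposes the /api/v1/repos/migrate endpoint:

```python
# Heavily hedged sketch: ask a personal Gitea instance to create a pull mirror
# of a GitHub repo. Instance URL, owner, token, and source repo are placeholders.
import json
import os
import urllib.request

GITEA = "https://gitea.example.com"  # hypothetical instance

payload = {
    "clone_addr": "https://github.com/octocat/Hello-World.git",  # repo to mirror
    "repo_name": "Hello-World",
    "repo_owner": "me",      # hypothetical Gitea user or org
    "mirror": True,          # keep syncing instead of a one-off import
}
req = urllib.request.Request(
    f"{GITEA}/api/v1/repos/migrate",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"token {os.environ['GITEA_TOKEN']}",
    },
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)
```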

[–] varnia@lemm.ee 11 points 1 month ago (1 children)

Good thing I moved all my repos from git[lab|hub] to Codeberg recently.

[–] brachiosaurus@mander.xyz 8 points 1 month ago (1 children)

I have a question: why do the Lemmy devs keep using Microsoft's GitHub?
