this post was submitted on 19 Nov 2025
36 points (95.0% liked)

Programming

I have a vendor that sucks donkey balls. Their systems break often. An endpoint we rely on will start returning [] and take months to fix. They'll change a data label in their backend and not notice that it flows into all of their filters and stuff.

I have some alerts when my consumers break, but I think I'd like something more direct. What's the best way to monitor an external API?

I'm imagining some very basic ML that can pop up and tell me that something has changed, like there are more hosts or categories or whatever than usual, that a structure has gone blank or is missing, that some field has gone to 0 or null across the structure. Heck, that a field name has changed.
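Something like this rough sketch is what I'm picturing (the endpoint and field names here are made up):

```python
import json
import requests

SNAPSHOT = "last_shape.json"
URL = "https://vendor.example.com/api/hosts"  # made-up endpoint

def shape(records):
    """Summarize the response: row count, key set, and per-key null/empty rate."""
    keys = sorted({k for r in records for k in r})
    null_rate = {
        k: sum(1 for r in records if r.get(k) in (None, "", 0)) / max(len(records), 1)
        for k in keys
    }
    return {"count": len(records), "keys": keys, "null_rate": null_rate}

def check():
    current = shape(requests.get(URL, timeout=30).json())
    try:
        with open(SNAPSHOT) as f:
            previous = json.load(f)
    except FileNotFoundError:
        previous = None

    if previous:
        if current["keys"] != previous["keys"]:
            print("field names changed:", set(current["keys"]) ^ set(previous["keys"]))
        if current["count"] == 0 and previous["count"] > 0:
            print("endpoint went empty ([])")
        for key, rate in current["null_rate"].items():
            if rate > 0.9 and previous["null_rate"].get(key, 0) < 0.5:
                print(f"field {key} went null/zero across the structure")

    with open(SNAPSHOT, "w") as f:
        json.dump(current, f)

if __name__ == "__main__":
    check()
```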

Is the best way to basically write tests for everything I can think of, and add more as things break, or is there a better tool? I see API monitoring tools but they are for calculating availability for your own APIs, not for enforcing someone else's!

[–] clay_pidgin@sh.itjust.works 3 points 1 day ago (2 children)

They generate a swagger file for me on request, usually with a lag time of weeks, but only for one of the APIs. The others are basically documented in emails. This is a B2B type of thing; they are not publicly available APIs.

[–] Nomad@infosec.pub 3 points 1 day ago (1 children)

Ask them to generate a schema file that you can download from the API, or at least an endpoint that returns a hash of the current API schema file. That's cheap versioning that tells you if something changes.
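Checking it on your side is tiny. A minimal sketch, assuming a hypothetical schema URL:

```python
import hashlib
import requests

SCHEMA_URL = "https://vendor.example.com/api/openapi.json"  # hypothetical schema endpoint

def schema_fingerprint():
    """Download the schema and hash it; any change in the hash means the API changed."""
    body = requests.get(SCHEMA_URL, timeout=30).content
    return hashlib.sha256(body).hexdigest()

# Store the hash somewhere (file, DB, monitoring system) and alert when it differs.
print(schema_fingerprint())
```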

You can always use the swagger schema to verify the API. Ask yourself some basic questions about what should always be true and put that into validation scripts. If they use a framework, HEAD requests usually tell you some things.
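For example, a sketch with a hypothetical endpoint; which headers are worth baselining depends on their framework:

```python
import requests

URL = "https://vendor.example.com/api/hosts"  # hypothetical endpoint

# HEAD is cheap: no body, but you still get the status code, content type, and
# often framework/version headers you can baseline and diff over time.
# (Some APIs reject HEAD with 405; fall back to GET in that case.)
resp = requests.head(URL, timeout=30)
print(resp.status_code)
print({k: v for k, v in resp.headers.items()
       if k.lower() in ("server", "content-type", "etag", "last-modified", "x-api-version")})
```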

The last really bad vendor had an OpenAPI page that listed the endpoints, but the API wouldn't adhere to the details given there. I discovered that their website used the API all the time, and by surfing it I was able to figure out which parameters were required, etc.

The last idea is statistics. Grab any count data you can get, like from pagination data, and create a baseline of available data over time. That gives you an expected count, and you can detect significant divergences.
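A sketch, assuming a hypothetical paginated endpoint that exposes a total count; tune the threshold to your data:

```python
import json
import statistics
import requests

URL = "https://vendor.example.com/api/hosts?page=1"  # hypothetical paginated endpoint
HISTORY = "counts.json"

def current_total():
    # Many paginated APIs return a total count in the envelope; adjust to whatever yours exposes.
    return requests.get(URL, timeout=30).json().get("total", 0)

def check(threshold=3.0):
    try:
        with open(HISTORY) as f:
            history = json.load(f)
    except FileNotFoundError:
        history = []

    total = current_total()
    if len(history) >= 10:
        mean = statistics.mean(history)
        stdev = statistics.stdev(history) or 1.0
        if abs(total - mean) > threshold * stdev:
            print(f"count {total} diverges from baseline {mean:.0f} +/- {stdev:.0f}")

    history.append(total)
    with open(HISTORY, "w") as f:
        json.dump(history[-100:], f)

check()
```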

I tend to show up at the vendor's IT guys in person and bribe them into helping me behind their bosses' backs. Chocolate, coffee, and some banter can do wonders.

[–] clay_pidgin@sh.itjust.works 2 points 23 hours ago* (last edited 18 hours ago) (1 children)

I'm 3,500 miles from the vendor's devs, sadly.

Asking them to put the swagger file itself behind the API is a good idea. Their dev backlog is 3-24 months.

I used the same trick to determine the required headers and parameters - I checked their website which uses the same API.

The source of their delays is that different devs or teams "own" different endpoints and make their changes without documenting them. It's annoying: stuff like the same data being in the field "hostId" on one endpoint but "deviceId" on another.

[–] Nomad@infosec.pub 2 points 6 hours ago (1 children)

Just build a few Selenium tests to ensure the API requests the website performs don't change without you noticing :)
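A sketch using Chrome's performance log through Selenium (the site URL and baseline here are made up; selenium-wire or a proxy would work too):

```python
import json
from selenium import webdriver

SITE = "https://vendor.example.com/app"                # made-up site URL
KNOWN_CALLS = {"/api/hosts", "/api/categories"}        # hypothetical baseline of expected calls

options = webdriver.ChromeOptions()
options.add_argument("--headless=new")
# Enable the performance log so network events are captured.
options.set_capability("goog:loggingPrefs", {"performance": "ALL"})

driver = webdriver.Chrome(options=options)
try:
    driver.get(SITE)
    seen = set()
    for entry in driver.get_log("performance"):
        msg = json.loads(entry["message"])["message"]
        if msg["method"] == "Network.requestWillBeSent":
            url = msg["params"]["request"]["url"]
            if "/api/" in url:
                # Keep just the path so query strings don't create noise.
                seen.add(url.split("?")[0].split("vendor.example.com")[-1])
    print("new/changed calls:", seen - KNOWN_CALLS)
finally:
    driver.quit()
```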

[–] clay_pidgin@sh.itjust.works 1 points 4 hours ago (1 children)

That's not a bad idea. So far, their frontend team usually doesn't hear about the changes either!

[–] Nomad@infosec.pub 2 points 1 hour ago (1 children)

Wow, that's bad practice. Sell your monitoring to them to help improve their quality.

[–] clay_pidgin@sh.itjust.works 1 points 41 minutes ago

I honestly think we provide a significant impetus for improvement on their side. They have lots of other customers, but most aren't as involved and embedded in the data as we are.

[–] yaroto98@lemmy.world 2 points 1 day ago (1 children)

Are any of their APIs a GET that returns lists? I create a lot of automated API tests. You might be able to GET a list of users (or whatever), then pick 10 random user_ids and query another API, say user_addresses, passing in each id one at a time and verifying a proper result. You don't have to verify the data itself, just that the values you care about are not empty and the keys exist.

You can dynamically test a lot this way, and if a key gets changed from 'street' to 'street_address', your failing tests should let you know.
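A rough sketch of what I mean with pytest and requests; the endpoints and keys are placeholders:

```python
import random
import pytest
import requests

BASE = "https://vendor.example.com/api"  # placeholder base URL

def _sample_user_ids(n=10):
    """Pull the list endpoint and pick a random handful of ids to chase."""
    users = requests.get(f"{BASE}/users", timeout=30).json()
    return random.sample([u["user_id"] for u in users], min(n, len(users)))

@pytest.mark.parametrize("user_id", _sample_user_ids())
def test_user_addresses_shape(user_id):
    addr = requests.get(f"{BASE}/user_addresses", params={"user_id": user_id}, timeout=30).json()
    # Don't assert on the data itself, just that the keys you rely on exist and aren't empty.
    for key in ("street", "city", "postal_code"):
        assert key in addr, f"key {key} missing for user {user_id}"
        assert addr[key] not in (None, ""), f"key {key} empty for user {user_id}"
```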

[–] clay_pidgin@sh.itjust.works 2 points 1 day ago (1 children)

Unfortunately, on the main API of theirs that I use, there's an endpoint with a list of objects and their IDs, and those IDs are used everywhere else, but the rest of the endpoints aren't connected to each other. I can't walk, e.g., school > students > student > grades or something.

[–] yaroto98@lemmy.world 2 points 1 day ago (1 children)

I made my career out of automated testing with a focus on APIs. I'm not aware of any easy tool to do what you want. The easiest way I've found to quickly whip up basic API tests is Python/pytest with requests. You can parameterize lots of inputs, run tests in parallel, easily add new endpoints as you go, benchmark the APIs for response times, etc. It'll take a lot of work in the beginning, then save you a lot of work in the end.
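As a starting point, something shaped like this (the endpoint paths, expected keys, and time budget are placeholders):

```python
import pytest
import requests

BASE = "https://vendor.example.com/api"  # placeholder base URL

# Parametrize across endpoints and the keys you expect each list item to carry.
ENDPOINTS = [
    ("/hosts", {"hostId", "name"}),
    ("/categories", {"id", "label"}),
]

@pytest.mark.parametrize("path,required_keys", ENDPOINTS)
def test_endpoint_basics(path, required_keys):
    resp = requests.get(BASE + path, timeout=30)
    assert resp.status_code == 200
    assert resp.elapsed.total_seconds() < 5, "response time regressed"
    data = resp.json()  # assumes a list of objects; adjust for enveloped responses
    assert data, f"{path} returned an empty payload"
    assert required_keys <= data[0].keys(), f"{path} is missing expected fields"
```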

Now, AI can make the process go faster. If you give it a sample input and output, it can do 95% of a pytest in 10 seconds. But beware that last 5%.

[–] jjjalljs@ttrpg.network 2 points 1 day ago

Yeah I would use python and pytest, probably.

You need to decide what you expect to be a passing case. Known keys are all there? All values in acceptable range? Do you have anything where you know exactly what the response should be?
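If you do have a known-good response, the simplest check is a snapshot comparison. A sketch with placeholder paths:

```python
import json
import requests

URL = "https://vendor.example.com/api/hosts"  # placeholder endpoint
GOLDEN = "golden_hosts.json"                  # a known-good response you saved earlier

def test_matches_golden():
    with open(GOLDEN) as f:
        expected = json.load(f)
    actual = requests.get(URL, timeout=30).json()
    assert actual == expected
```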

How many endpoints are there?