this post was submitted on 12 Jan 2026

JavaScript

[–] Kissaki@programming.dev 5 points 4 days ago (1 children)

After the first half, the content starts repeating, which made me wonder how much of it was LLM-generated. By the end I felt I had read two points in particular three times each.

Either way, the first third to half was interesting and valuable.

[–] victorz@lemmy.world 4 points 4 days ago (1 children)

I ran some benchmarks on iterator helpers a few months ago, and unless I was using them incorrectly, regular arrays were much more performant, maybe by an order of magnitude, than the supposedly lazy iterator helpers.

Maybe it's a poor implementation in Firefox? I don't know.
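
Something like this is the kind of comparison I mean (a rough sketch only; timings vary wildly by engine, and I'm using a plain generator to stand in for the iterator-helper `map` so it runs even where `Iterator.prototype.map` isn't available yet):

```javascript
// Eager array methods vs. a lazy generator pipeline.
// The generator mimics what a lazy iterator-helper map does.
function* mapIter(iter, fn) {
  for (const x of iter) yield fn(x); // pull one value at a time
}

const data = Array.from({ length: 100_000 }, (_, i) => i);

let t0 = performance.now();
const eagerSum = data.map(x => x + 1).reduce((a, b) => a + b, 0);
const tEager = performance.now() - t0;

t0 = performance.now();
let lazySum = 0;
for (const x of mapIter(data, x => x + 1)) lazySum += x;
const tLazy = performance.now() - t0;

console.log(eagerSum === lazySum); // true: same result either way
console.log({ tEager, tLazy });    // per-element overhead often makes the lazy path slower here
```

The results always match; it's only the cost per element that differs.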

[–] towerful@programming.dev 2 points 4 days ago* (last edited 4 days ago) (1 children)

I think it's the take(10) that makes it performant.
I'd guess helpers like some() also short-circuit, so you avoid pushing the entire array through a chain of operators when you only act on a subset of the final processed result.
If that makes sense....
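
A quick sketch of that short-circuiting, with plain generators standing in for the helper methods (the `map`/`take` names here are my own stand-ins, not the native `Iterator.prototype` ones):

```javascript
// Lazy map: does work only when a value is actually pulled.
function* map(iter, fn) {
  for (const x of iter) yield fn(x);
}

// Lazy take: stops pulling from upstream after n values.
function* take(iter, n) {
  if (n <= 0) return;
  for (const x of iter) {
    yield x;
    if (--n === 0) return;
  }
}

let calls = 0;
const source = Array.from({ length: 1_000_000 }, (_, i) => i);
const firstTen = [...take(map(source, x => (calls++, x * 2)), 10)];

console.log(calls);    // 10 — only ten of the million elements were ever mapped
console.log(firstTen); // [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

An eager `source.map(...).slice(0, 10)` would have run the callback a million times instead.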

But if you're just filtering then mapping an array, laziness won't save you any visits: every element of the source goes through the filter, and every surviving element goes through the map, either way. There's no opportunity to short-circuit, so an iterator won't help there.
Edit: and I bet there's overhead in keeping iterators alive and context-switching as each new value is pulled, instead of batch-processing one array and then moving on to the next batch operation.
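
Roughly what I mean, sketched with generators standing in for the helpers: both versions run the callbacks the same number of times, and the lazy one only skips the intermediate array allocation.

```javascript
// Generator stand-ins for lazy filter/map.
function* filterIter(iter, pred) {
  for (const x of iter) if (pred(x)) yield x;
}
function* mapIter(iter, fn) {
  for (const x of iter) yield fn(x);
}

const data = [1, 2, 3, 4, 5, 6];
let eagerVisits = 0, lazyVisits = 0;

// Eager: builds an intermediate filtered array, then maps it.
const eager = data
  .filter(x => (eagerVisits++, x % 2 === 0))
  .map(x => (eagerVisits++, x * 10));

// Lazy: one pass per value, no intermediate array — but every element still flows through.
const lazy = [...mapIter(
  filterIter(data, x => (lazyVisits++, x % 2 === 0)),
  x => (lazyVisits++, x * 10),
)];

console.log(eager, lazy);             // [20, 40, 60] [20, 40, 60]
console.log(eagerVisits, lazyVisits); // 9 9 — same number of callback runs
```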

[–] victorz@lemmy.world 3 points 4 days ago

I imagine so; it was just surprising to see them underperform so heavily.