this post was submitted on 30 Mar 2026
8 points (100.0% liked)


I need to scan very large JSONL files efficiently and am considering a parallel grep-style approach over line-delimited text.

Would love to hear how you would design it.

[–] entwine@programming.dev 5 points 18 hours ago

You don't actually need to "split" anything; you just read from different offsets per thread. mmap might be the most efficient way to do this (or at least the easiest).
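
Something like this is what I have in mind. A rough sketch, assuming POSIX mmap and C++; the substring match is just a stand-in for whatever per-line work OP actually needs, and the file path and needle are placeholder argv inputs:

```cpp
// Minimal sketch: N threads scan disjoint byte ranges of an mmap'd
// JSONL file, snapping to line boundaries so each line has one owner.
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

#include <algorithm>
#include <atomic>
#include <cstdio>
#include <cstring>
#include <string_view>
#include <thread>
#include <vector>

int main(int argc, char** argv) {
    if (argc < 3) { std::fprintf(stderr, "usage: %s FILE NEEDLE\n", argv[0]); return 1; }
    const char* needle = argv[2];

    int fd = open(argv[1], O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }
    struct stat st;
    if (fstat(fd, &st) != 0) { perror("fstat"); return 1; }
    const size_t size = static_cast<size_t>(st.st_size);
    if (size == 0) { std::puts("0 matching lines"); return 0; }

    const char* base = static_cast<const char*>(
        mmap(nullptr, size, PROT_READ, MAP_PRIVATE, fd, 0));
    if (base == MAP_FAILED) { perror("mmap"); return 1; }

    const unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::atomic<size_t> total{0};
    std::vector<std::thread> workers;

    for (unsigned t = 0; t < n; ++t) {
        workers.emplace_back([&, t] {
            size_t begin = size * t / n;
            const size_t end = size * (t + 1) / n;
            // If we landed mid-line, skip forward past the next '\n':
            // the partial line belongs to the previous thread.
            if (begin != 0 && base[begin - 1] != '\n') {
                const void* nl = std::memchr(base + begin, '\n', size - begin);
                if (nl == nullptr) return;  // no line starts in this chunk
                begin = static_cast<const char*>(nl) - base + 1;
            }
            // A thread owns every line whose first byte is before `end`,
            // even when the line itself runs past `end`.
            size_t count = 0;
            for (size_t pos = begin; pos < end; ) {
                const void* nl = std::memchr(base + pos, '\n', size - pos);
                const size_t stop = nl ? static_cast<const char*>(nl) - base : size;
                std::string_view line(base + pos, stop - pos);
                if (line.find(needle) != std::string_view::npos) ++count;
                pos = stop + 1;
            }
            total.fetch_add(count, std::memory_order_relaxed);
        });
    }
    for (auto& w : workers) w.join();

    std::printf("%zu matching lines\n", total.load());
    munmap(const_cast<char*>(base), size);
    close(fd);
    return 0;
}
```

The only subtle part is the boundary convention: a thread owns every line whose first byte falls inside its range, even if the line extends past the end, so nothing gets scanned twice or dropped at chunk edges.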

Whether or not that's going to run into hardware bottlenecks is a separate issue from designing the parallel algorithm. Idk what OP is trying to accomplish, but if the hardware is known (e.g. this is an internal tool meant to run in a data center), they'll need to read up on their storage hardware and virtualization architecture to squeeze out the most I/O performance.
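
One cheap knob worth knowing about either way is madvise(2): each worker can tell the kernel its chunk will be read sequentially, which lets readahead be more aggressive. A rough fragment meant to slot into the worker lambda in my sketch above, before the scan loop (Linux/POSIX assumption, and it's purely a hint):

```cpp
// Before the scan loop in each worker: hint sequential access for this
// chunk. madvise needs a page-aligned address, so round `begin` down.
// Errors are safe to ignore; this only tweaks readahead heuristics.
const size_t page = static_cast<size_t>(sysconf(_SC_PAGESIZE));
const size_t aligned = (begin / page) * page;
madvise(const_cast<char*>(base) + aligned, end - aligned, MADV_SEQUENTIAL);
```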

But if parsing is actually the bottleneck, there's a lot you can do to optimize it in software; simdjson would be a good place to start.
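
For JSONL specifically, simdjson's document-stream interface is built for exactly this shape of input. A hedged sketch using the DOM API; the "level"/"error" schema is made up for illustration, the real fields are whatever OP's records contain:

```cpp
// Sketch: stream one JSON document per line with simdjson and count
// records whose (hypothetical) "level" field equals "error".
#include <simdjson.h>

#include <cstdio>
#include <string_view>

int main(int argc, char** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: %s FILE\n", argv[0]); return 1; }

    simdjson::dom::parser parser;
    simdjson::dom::document_stream docs;
    // load_many reads the file and iterates one document per line
    // without reallocating parser state between documents.
    auto error = parser.load_many(argv[1]).get(docs);
    if (error) { std::fprintf(stderr, "%s\n", simdjson::error_message(error)); return 1; }

    size_t hits = 0;
    for (auto result : docs) {
        simdjson::dom::element doc;
        if (result.get(doc)) continue;  // skip lines that fail to parse
        std::string_view level;
        if (doc["level"].get(level) == simdjson::SUCCESS && level == "error") ++hits;
    }
    std::printf("%zu error records\n", hits);
    return 0;
}
```

Iirc the document-stream code can also overlap stage-1 indexing with your loop on a second thread if simdjson is built with threads enabled, which pairs nicely with the chunked design above.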