Say you have a stream of incoming data. Perhaps it's a database table that's monotonically increasing.
You want to do some processing on it that will take a long time because of the size of the data. Say it's one million records.
You take a snapshot and process the million records. Say that takes four hours. In the meantime, ten thousand new records have come in. So you take a snapshot and process those. Say that takes two minutes. In the meantime, a hundred new records have come in. So you take a snapshot of those...and so on.
The analogy with the paradoxes of Zeno of Elea is obvious and so Nicholas Tollervey and I have decided "Zeno processing" might be a useful term for this approach.
At some point the processing is quick enough that either no new data comes in or you can take the stream down for long enough to finish off the processing.
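The shrinking-snapshot loop might be sketched like this. Everything here is my own illustration, not an existing API: `zeno_process` is the loop itself, and `FakeTable` simulates a monotonically growing table with an assumed arrival rate of one new record per hundred processed.

```python
def zeno_process(current_size, process, cutoff=0):
    """Repeatedly snapshot the table and process only the records
    that arrived since the previous snapshot, until the backlog
    is small enough (<= cutoff) to finish with the stream paused."""
    done = 0
    while True:
        snapshot = current_size()    # take a snapshot of the table
        backlog = snapshot - done    # records not yet processed
        if backlog <= cutoff:
            return done, backlog     # small enough to finish offline
        process(done, snapshot)      # long-running work on the delta
        done = snapshot              # everything up to here is handled


class FakeTable:
    """Stand-in data source: while a batch of n records is being
    processed, n // 100 new records arrive (the 1% arrival rate
    is an assumption for illustration)."""
    def __init__(self, initial):
        self.size = initial

    def current_size(self):
        return self.size

    def process(self, start, end):
        self.size += (end - start) // 100


table = FakeTable(1_000_000)
done, backlog = zeno_process(table.current_size, table.process)
```

Starting from a million records, the batches shrink 1,000,000, then 10,000, then 100, then 1, before the backlog reaches zero, mirroring the four-hours-then-two-minutes sequence above.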
I'm sure there's an existing name for this technique, but I like "Zeno processing".