Working on something way past my bedtime again...
I am running a finditem and iterating through the results with #findindex, but during a couple of tests I noticed there is a delay before anything happens when the #findcnt of my search is too high. That's expected: #findindex has to process what was found one by one. It dawned on me, however, that there is no need to iterate through EVERY result; by limiting the loop to the first 50 or so I could eliminate the initial pause before the script does something. (Well, not eliminate it, just break it up so it doesn't procrastinate so long at the start!)
If finditem finds 1000 results, how do I best limit a #findindex loop to processing only the first 50? I do use continue as a filter, but the sheer volume sometimes leads to prolonged inaction. I also wouldn't need to tell it to check the next 50 and so on until all 1000 are processed, because once it's done acting on the first 50 it could simply be run again. Call it chunking if you will; 50 is an arbitrary number I'd fine-tune based on results.
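For reference, this is roughly the kind of capped loop I have in mind (untested sketch; %chunkSize and %itemType are placeholders, and I'm assuming the usual for-loop over #findindex works when the upper bound is less than #findcnt):

```
finditem %itemType C_ , #backpackID
set %chunkSize 50
; don't loop past the end if fewer than %chunkSize were found
if #findcnt < %chunkSize
  set %chunkSize #findcnt
for #findindex 1 %chunkSize
{
  ; my filter would go here, using continue to skip unwanted hits
  ; ... act on #findid ...
}
```

The idea being that each run only ever touches the first %chunkSize hits, and rerunning the script picks up the rest.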
Is there a command for limiting how many of the finditem results get processed by #findindex when #findcnt is exceedingly high, say 1000+?