I’m unsure if this particular algorithm has a name; my search terms haven’t turned up what I’m looking for.
Say I have an array like so, where I have an increasing sequence of integers, some of which are missing:
[ 10 11 12 15 16 19 ]
What I would like is to get the start and end indices of each run of consecutive integers (the whole array is monotonically increasing; it’s the gaps that split it into runs).
For the case above, the output would be (assuming 0-based indexing):
det.start = 0, det.end = 2
det.start = 3, det.end = 4
det.start = 5, det.end = 5
In MATLAB, for example, I’d resort to using diff and splitting the sequence wherever the difference exceeds 1. I’m unsure whether this algorithm has a name, or whether a parallel CUDA implementation is already available.