I’m not sure whether this particular algorithm has a name; my search terms haven’t turned up what I’m looking for.

Say I have an array like the one below: an increasing sequence of integers, some of which are missing:

[ 10 11 12 15 16 19 ]

What I would like is the start and end indices of each run of consecutive integers, i.e. each maximal subsequence where successive elements differ by exactly 1.

For the case above, the output would be (assuming 0-based indexing):

det[0].start = 0, det[0].end = 2

det[1].start = 3, det[1].end = 4

det[2].start = 5, det[2].end = 5

In MATLAB, for example, I’d use diff and split the sequence wherever the difference exceeds 1. I’m unsure whether this technique has a name, or whether a CUDA parallel implementation is already available.
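For what it’s worth, here is a minimal sequential sketch of the diff-based idea described above (plain Python; the function name and structure are my own, not from any library). It flags positions where the difference between neighbors is not 1 and cuts runs there:

```python
def consecutive_runs(a):
    """Return (start, end) index pairs for each maximal run of
    consecutive integers in the sorted sequence a."""
    runs = []
    if not a:
        return runs
    start = 0
    for i in range(1, len(a)):
        # A gap (difference != 1) ends the current run and starts a new one.
        if a[i] - a[i - 1] != 1:
            runs.append((start, i - 1))
            start = i
    runs.append((start, len(a) - 1))  # close the final run
    return runs

print(consecutive_runs([10, 11, 12, 15, 16, 19]))
# → [(0, 2), (3, 4), (5, 5)]
```

In a parallel setting the same structure should map to: an adjacent-difference pass that marks run heads (where `a[i] - a[i-1] != 1`), then a prefix sum over those flags to assign a segment ID to each element, then a compaction/reduction per segment ID to recover the start and end indices. That head-flag-plus-scan pattern is the usual way segmentation is expressed with primitives like those in Thrust, though I can’t point to a ready-made kernel for exactly this.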