I’m not sure whether this particular algorithm has a name; my search terms haven’t turned up what I’m looking for.
Say I have an array like this, containing an increasing sequence of integers with some values missing:
[ 10 11 12 15 16 19 ]
What I would like is to get the start and end indices of each run of consecutive integers, i.e. each stretch where the values increase by exactly 1.
For the case above, the output would be (assuming 0-based indexing):
(0, 2), (3, 4), (5, 5)
In MATLAB, for example, I’d resort to using diff and splitting the sequence that way. I’m not sure whether this operation has a standard name, or whether a CUDA parallel implementation is already available.
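To make it concrete, this is the behaviour I’m trying to reproduce on the GPU. A rough, untested single-pass CPU sketch (the function name is just for illustration):

```cpp
#include <cstdio>
#include <utility>
#include <vector>

// Emit (start, end) index pairs wherever the step between neighbours
// is not +1 -- the same idea as diff() followed by splitting.
std::vector<std::pair<int, int>> find_runs(const std::vector<int>& a)
{
    std::vector<std::pair<int, int>> runs;
    if (a.empty()) return runs;

    int start = 0;                               // index where the current run began
    for (std::size_t i = 1; i < a.size(); ++i) {
        if (a[i] - a[i - 1] != 1) {              // "diff" is not 1: a run just ended
            runs.push_back({start, static_cast<int>(i) - 1});
            start = static_cast<int>(i);
        }
    }
    runs.push_back({start, static_cast<int>(a.size()) - 1});  // close the last run
    return runs;
}

int main()
{
    std::vector<int> a = {10, 11, 12, 15, 16, 19};
    for (const auto& r : find_runs(a))
        std::printf("(%d, %d)\n", r.first, r.second);  // prints (0, 2) (3, 4) (5, 5)
    return 0;
}
```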
As for what to call it: this seems to be a hybrid of delta encoding followed by run-length encoding, and as such a fairly obvious concept, but searching for “run-length delta encoding” returns very few matches. There may be something analogous in graph theory, such as clustering, but I’m not very knowledgeable in that area.
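To make that two-step view concrete, here is a small untested sketch (names purely illustrative): take the element-wise delta of the example array, then run-length encode the delta. The groups of consecutive 1s in the delta correspond exactly to the increasing runs.

```cpp
#include <cstdio>
#include <vector>

int main()
{
    std::vector<int> a = {10, 11, 12, 15, 16, 19};

    // Delta encoding: first element kept as-is, then pairwise differences.
    std::vector<int> delta(a.size());
    delta[0] = a[0];
    for (std::size_t i = 1; i < a.size(); ++i)
        delta[i] = a[i] - a[i - 1];              // delta = [10 1 1 3 1 3]

    // Run-length encode the delta: (value, count) pairs.
    for (std::size_t i = 0; i < delta.size(); ) {
        std::size_t j = i;
        while (j < delta.size() && delta[j] == delta[i]) ++j;
        std::printf("value %d x %zu\n", delta[i], j - i);
        i = j;
    }
    // Output: value 10 x 1, value 1 x 2, value 3 x 1, value 1 x 1, value 3 x 1
    return 0;
}
```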
[Later:] Searching for “delta run-length encoding”, on the other hand, gives a reasonable number of hits, including papers on sparse matrix methods.
Thanks for the input. Adjacent difference is the name I was after. It puts the data into a form close to what I need, and I just saw from the slides here that it’s trivial to do:
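For anyone who finds this later: below is a rough, untested sketch of the adjacent-difference idea using Thrust (thrust::adjacent_difference plus a copy_if over the resulting deltas). It is only one way to express it; the structure and names are illustrative rather than taken from the slides.

```cpp
#include <thrust/adjacent_difference.h>
#include <thrust/copy.h>
#include <thrust/device_vector.h>
#include <thrust/functional.h>
#include <thrust/host_vector.h>
#include <thrust/iterator/counting_iterator.h>
#include <cstdio>

int main()
{
    // Example input from the question.
    int h[] = {10, 11, 12, 15, 16, 19};
    thrust::device_vector<int> a(h, h + 6);
    const int n = static_cast<int>(a.size());

    // d[0] = a[0], d[i] = a[i] - a[i-1] for i > 0.
    thrust::device_vector<int> d(n);
    thrust::adjacent_difference(a.begin(), a.end(), d.begin());
    d[0] = 0;  // treat index 0 as a run start regardless of its value

    // A run starts wherever the incoming step is not +1.
    using namespace thrust::placeholders;
    thrust::device_vector<int> starts(n);
    auto starts_end = thrust::copy_if(thrust::counting_iterator<int>(0),
                                      thrust::counting_iterator<int>(n),
                                      d.begin(),        // stencil: the deltas
                                      starts.begin(),
                                      _1 != 1);
    starts.resize(starts_end - starts.begin());

    // Each run ends one element before the next run start; the last run
    // ends at index n - 1.
    thrust::host_vector<int> s = starts;
    for (std::size_t r = 0; r < s.size(); ++r) {
        int end = (r + 1 < s.size()) ? s[r + 1] - 1 : n - 1;
        std::printf("(%d, %d)\n", (int)s[r], end);  // prints (0, 2) (3, 4) (5, 5)
    }
    return 0;
}
```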