VPI functions distorting images, non-VPI OK

hello!

I have a script which performs a perspective warp and then sends the result to an RTMP target with GStreamer, via the OpenCV VideoWriter.

This script works fine when I load the image from the hard drive and pass it straight to the VideoWriter, but when I put it through the VPI functions the output becomes distorted - or there is some kind of incompatibility.

My GStreamer pipeline for the OpenCV VideoWriter is as follows:


def gstreamer_pipeline_out_2_WORKS():
    return (
        "appsrc ! "
        "queue ! "
        "videoconvert ! "
        "video/x-raw,format=RGBA ! "
        "nvvidconv ! "
        "nvv4l2h264enc ! "
        "video/x-h264,stream-format=(string)byte-stream,alignment=(string)au ! "
        "h264parse ! "
        "flvmux name=mux ! "
        "rtmpsink location=rtmp://global-live.mux.com:5222/app/[secret key]"
    )
out_stream = cv2.VideoWriter(
        gstreamer_pipeline_out_2_WORKS(),
        cv2.CAP_GSTREAMER, 0, 30.0,
        (1500,1500))
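
For reference, the writer can also be exercised on its own with a synthetic frame matching the size and type given above; a minimal sketch (the frame contents are arbitrary):

# solid-grey 1500x1500 BGR frame, uint8, matching the (1500, 1500) size
# passed to cv2.VideoWriter above
synthetic = np.full((1500, 1500, 3), 128, dtype=np.uint8)
out_stream.write(synthetic)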

The pipeline works fine for non-VPI images, but when the same image is passed through the VPI functions, what I see at the RTMP endpoint is distorted (it is definitely there, and it moves with the time-varying perspective homography), which looks like a bit-depth problem.

My VPI loop is as follows:

    cnt = 0
    while True:
        cnt +=1
        hom = np.array([
                [1, math.sin(cnt/10), 0],
                [0, 1, 0],
                [0, 0, 1]])
        

        with vpi.Backend.CUDA:
            with streamLeft:
                frame1 = vpi.asimage(test_img).convert(vpi.Format.NV12_ER)
            with streamRight:
                frame2 = vpi.asimage(test_img).convert(vpi.Format.NV12_ER)

        with vpi.Backend.CUDA:
            with streamLeft:
                frame1 = frame1.perspwarp(hom)
            with streamRight:
                frame2 = frame2.perspwarp(hom)

        with vpi.Backend.CUDA:
            with streamLeft:
                frame1 = frame1.convert(vpi.Format.RGB8)
            with streamRight:
                frame2 = frame2.convert(vpi.Format.RGB8)

        streamLeft.sync()
        streamRight.sync()

        with frame1.rlock_cpu() as data:
            out_stream.write(data)
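
One way to test the bit-depth theory is to inspect the buffer just before it is written; a quick check of dtype and memory layout, assuming the view returned by rlock_cpu() exposes a NumPy-compatible buffer:

with frame1.rlock_cpu() as data:
    arr = np.asarray(data)  # assumes the locked view supports the array interface
    # the writer expects uint8 with shape (1500, 1500, 3); unexpected strides
    # or a non-contiguous layout would also explain a garbled-looking stream
    print(arr.dtype, arr.shape, arr.strides, arr.flags['C_CONTIGUOUS'])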

Does anyone have any idea what could be distorting the image? When I run without VPI, I get the following at the RTMP stream monitor, as it should be:

[image: undistorted output as seen at the RTMP stream monitor]

Thanks for any assistance!

Hi,

Could you share the complete Python code so we can check it in our environment as well?
Thanks.

Hello, that would be great, thanks. I have supplied you with an RTMP endpoint and a public link, but I believe the GStreamer section is OK.

If you stream to the RTMP endpoint, you can use the link below to watch the live stream (it takes a few seconds to start). Pasting it into the address bar should download an m3u8 file that allows you to view it, or you can paste it into, for instance, VLC's live stream field:
https://stream.mux.com/vL9SJU61FSv8sSQR01F6ajKI702WeK2pXRuLVtw25zquo.m3u8

Ultimately the image will be replaced with camera input, but the cameras are not available at the moment.

import sys
try:
    import cv2
except Exception as e:
    print("error importing cv2 - attempting again with path")
    sys.path.append('/usr/local/lib/python3.8/site-packages')
    import cv2
    print("successfully imported cv2")
import time
import math
import vpi
import random
import numpy as np


def gstreamer_pipeline_out_2():
    return (
        "appsrc ! "
        "queue ! "
        "videoconvert ! "
        "video/x-raw,format=RGBA ! "
        "nvvidconv ! "
        "nvv4l2h264enc ! "
        "video/x-h264,stream-format=(string)byte-stream,alignment=(string)au ! "
        "h264parse ! "
        "flvmux name=mux ! "
        "rtmpsink location=rtmp://global-live.mux.com:5222/app/51bc0427-ad29-2909-4979-11ee335d2b53"
    )


def stream_to_mux2():

    test_img = cv2.imread("still-life-composition-tips-01-e1646414421456.jpg")
    test_img = cv2.resize(test_img, (1500, 1500))
    test_img_no_op = test_img.copy()
    streamLeft = vpi.Stream()
    streamRight = vpi.Stream()

    out_stream = cv2.VideoWriter(
        gstreamer_pipeline_out_2(),
        cv2.CAP_GSTREAMER, 0, 30.0,
        (1500,1500))
    cnt = 0
    while True:
        cnt +=1

        # time-based moving transform
        hom = np.array([
                [1, math.sin(cnt/10), 0],
                [0, 1, 0],
                [0, 0, 1]])
        

        with vpi.Backend.CUDA:
            with streamLeft:
                frame1 = vpi.asimage(test_img).convert(vpi.Format.NV12_ER)
            with streamRight:
                frame2 = vpi.asimage(test_img).convert(vpi.Format.NV12_ER)


        with vpi.Backend.CUDA:
            with streamLeft:
                frame1 = frame1.perspwarp(hom)
            with streamRight:
                frame2 = frame2.perspwarp(hom)


        with vpi.Backend.CUDA:
            with streamLeft:
                frame1 = frame1.convert(vpi.Format.RGB8)
            with streamRight:
                frame2 = frame2.convert(vpi.Format.RGB8)

        streamLeft.sync()
        streamRight.sync()


        # don't blow up buffers?
        time.sleep(0.030)

        no_vpi_test = False

        if no_vpi_test is True:
            #check image is updating by flashing a layer
            test_img_no_op[:,:,0] = random.randint(0,250)
            out_stream.write(test_img_no_op)
        
        else:
            with frame1.rlock_cpu() as data:
                print(data.shape)
                print(type(data[1,1,1]))
                out_stream.write(data)
            with frame2.rlock_cpu() as data:
                # sanity check to make sure it's doing something
                print(data[0:5, 0:5, 0])

if __name__ == '__main__':
    stream_to_mux2()

Just in case anyone else has the same problem: I had to change out_stream.write(data) to out_stream.write(data.copy()), and now an undistorted image is sent to our video endpoint. This obviously isn't ideal, so any input is very welcome.
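
One possible explanation (an assumption on my part, not confirmed anywhere in this thread): the view returned by rlock_cpu() may not be C-contiguous, for example if its row pitch is larger than width * 3, and cv2.VideoWriter.write() may then read the buffer with the wrong stride; .copy() would hide that because it always produces a contiguous array. If that is the cause, np.ascontiguousarray() only copies when the layout actually requires it:

with frame1.rlock_cpu() as data:
    # ascontiguousarray is a no-op for an already-contiguous buffer,
    # so it avoids the unconditional per-frame copy
    out_stream.write(np.ascontiguousarray(data))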

Hi,

Could you try adding vpi.clear_cache() to see if it helps?

For example:

...
streamLeft.sync()
streamRight.sync()
vpi.clear_cache()
...
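
In the loop from the script above, that would sit right after the two sync() calls, roughly:

streamLeft.sync()
streamRight.sync()
vpi.clear_cache()  # clear VPI's internal cache each iteration, as suggested above

with frame1.rlock_cpu() as data:
    out_stream.write(data)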

Thanks.

I couldn't check whether it worked because my output wasn't working at the time, but it slowed the pipeline down too much to keep up with the required FPS. Thanks anyway, I will read up on what it does.
