Piping Headless Nano 2GB detectnet stream via USB to Android Tablet

Hi all.

How would I stream my real-time detectnet output to my Android tablet via a USB cable?

I was hoping to do it using a USB-to-USB OTG cable.

What's the best way to do this? Does GStreamer allow you to pipe the detectnet stream over USB? If so, is there an example of this command for both the Jetson and the host Android tablet?

If there is a better way, I'd love to hear it.

I'd prefer to use the Nano 2GB as a USB accelerator, handle all the webcam (C920) interaction and AI inferencing on the board, and have my application running on the tablet.

Hi,
Generally we use RTSP streaming for sending results from a Jetson device to another device. There is no existing implementation over the USB protocol, so other users would need to share their experience there.

Hi @pow, you would need some virtual ethernet network connection established over the USB port.

There isn't support in GStreamer for doing video streaming directly over a low-level USB connection.

Typically when you plug the Jetson's MicroUSB port into a PC, it creates a virtual ethernet connection (on the 192.168.55.1 static IP). Over that network, you can then use RTP video streaming like normal. However, I'm not sure of the specifics of Android OTG and whether it automatically works that way or not.

Hi guys, thanks for that info.

I will first do it with a PC as the host, and try to get it working on the Android tablet after.

@dusty_nv, does that mean that by connecting my host to the Jetson Nano 2GB via MicroUSB, I am able to use RTP to stream without using any WiFi? This virtual ethernet connection essentially mimics it over the MicroUSB cable?

If you can provide a sample RTP command to run, that would be great. Just to be clear, the Nano will not have a WiFi module connected.

Thanks guys, I just can't find any documentation on a webcam-connected Nano hooked up to a host PC via a cabled connection. Your help is much appreciated.

The YouTube Jetson AI Fundamentals videos are great. Looking forward to watching more of these on the Nano 2GB.

Yes, when you connect your Nano to a PC over MicroUSB, it should establish a virtual ethernet connection. You should also see a virtual folder pop up on your PC that contains additional instructions (i.e. for connecting to your Jetson via SSH).

The Jetson's static IP over this MicroUSB networking connection is 192.168.55.1. So you should try to ping 192.168.55.1 from your PC and see if the Jetson responds. Then find out what the IP address of your PC is on that interface (i.e. with the ipconfig / ifconfig command).
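If the ipconfig / ifconfig output is hard to pick through, another way to find which local IP your PC uses to reach the Jetson is to open a UDP socket toward 192.168.55.1 and inspect its local address. A minimal Python sketch (the 192.168.55.1 target is the Jetson's default MicroUSB address mentioned above; a UDP connect only does a route lookup, no packet is sent):

```python
import socket

def local_ip_toward(target="192.168.55.1"):
    """Return the local IP the OS would use to reach `target`.

    A UDP connect() performs only a routing lookup; nothing is transmitted.
    """
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect((target, 9))  # port number is irrelevant for the route lookup
        return s.getsockname()[0]
    except OSError:
        return None  # no route to the target (e.g. USB link not up)
    finally:
        s.close()

print(local_ip_toward())
```

If this prints None, the virtual ethernet link is probably not up at all, which is worth checking before debugging any streaming command.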

Then when you want to stream RTP from your Jetson to your PC, just use that IP of your PC in the video command launched from your Jetson.
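As a concrete sketch (assuming the jetson-inference detectnet.py example, and assuming your PC got the 192.168.55.100 address that the Jetson's USB network typically hands out; substitute whatever IP and port you actually use):

```
# On the Jetson: run detection on the webcam, send H.264 over RTP to the PC
python3 detectnet.py /dev/video0 --output-codec=h264 rtp://192.168.55.100:1234

# On the Linux PC: receive and display the RTP stream
gst-launch-1.0 -v udpsrc port=1234 \
  caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! \
  rtph264depay ! decodebin ! videoconvert ! autovideosink
```

These commands need the Jetson, camera, and a display attached, so treat them as a starting point rather than a verified recipe.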


Tried doing it with my Linux PC as the host over WiFi; I'm not seeing the stream even though the pipeline is running on the host PC.

Tried the SDP/VLC method on my Windows machine, because gst-launch was not opening even when I ran it as root.

Finally I tried running both sending and receiving on the same Nano, but no luck.
Here you can see the IP and the commands I ran...

update:

Even tried running it in Docker with this command:

docker run -it -p 1234:1234 --net=host restreamio/gstreamer:1.18.1.0-prod-dbg

Still no stream... the Jetson shows detectnet working fine.

Can you ping your Linux PC from your Jetson and vice versa? Can you post the terminal log of running the detectnet program on your Jetson?

It seems your Windows GStreamer error on PC was related to not being able to open an xvimagesink window, and not necessarily RTP. Windows-based GStreamer might require a different window sink element, as xvimagesink is using X11.

What is the GStreamer pipeline you are running on your Linux PC? Is there any terminal output? Is it possible that a firewall is blocking port 1234?
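To rule out a local firewall or socket problem on the PC independently of GStreamer, one quick check is whether a UDP datagram can be delivered to the port at all. A minimal Python sketch (the helper name is hypothetical, and port 1234 is the one from your Docker command):

```python
import socket

def udp_port_receives(port=1234, timeout=2.0):
    """Bind a UDP receiver on `port`, send a test datagram to it over
    loopback, and report whether it arrives."""
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        rx.bind(("0.0.0.0", port))
        rx.settimeout(timeout)
        tx.sendto(b"ping", ("127.0.0.1", port))
        data, _ = rx.recvfrom(64)
        return data == b"ping"
    except OSError:
        # bind failure (port in use) or receive timeout
        return False
    finally:
        rx.close()
        tx.close()

print(udp_port_receives())
```

Note this only exercises the loopback path, which many firewalls do not filter; packets arriving from the Jetson on the real interface could still be dropped, so it is worth checking ufw/iptables rules as well.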

Pinging my Linux host and pinging the Jetson Nano both succeed (see attached pic).

So I ran the command again with port 9999 on the Jetson:
sudo python3 detectnet.py /dev/video0 --output-codec=h264 rtp://10.0.0.128:9999

Here is the log from the Jetson (sorry if there is a better way to show it, I am not aware of one):

test@test-desktop:~/jetson-inference/python/examples$ sudo python3 detectnet.py /dev/video0 --output-codec=h264 rtp://10.0.0.128:9999
[sudo] password for test:
jetson.inference -- detectNet loading network using argv command line params

detectNet -- loading detection network model from:
-- model networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff
-- input_blob 'Input'
-- output_blob 'NMS'
-- output_count 'NMS_1'
-- class_labels networks/SSD-Mobilenet-v2/ssd_coco_labels.txt
-- threshold 0.500000
-- batch_size 1

[TRT] TensorRT version 7.1.3
[TRT] loading NVIDIA plugins...
[TRT] Registered plugin creator - ::GridAnchor_TRT version 1
[TRT] Registered plugin creator - ::NMS_TRT version 1
[TRT] Registered plugin creator - ::Reorg_TRT version 1
[TRT] Registered plugin creator - ::Region_TRT version 1
[TRT] Registered plugin creator - ::Clip_TRT version 1
[TRT] Registered plugin creator - ::LReLU_TRT version 1
[TRT] Registered plugin creator - ::PriorBox_TRT version 1
[TRT] Registered plugin creator - ::Normalize_TRT version 1
[TRT] Registered plugin creator - ::RPROI_TRT version 1
[TRT] Registered plugin creator - ::BatchedNMS_TRT version 1
[TRT] Could not register plugin creator - ::FlattenConcat_TRT version 1
[TRT] Registered plugin creator - ::CropAndResize version 1
[TRT] Registered plugin creator - ::DetectionLayer_TRT version 1
[TRT] Registered plugin creator - ::Proposal version 1
[TRT] Registered plugin creator - ::ProposalLayer_TRT version 1
[TRT] Registered plugin creator - ::PyramidROIAlign_TRT version 1
[TRT] Registered plugin creator - ::ResizeNearest_TRT version 1
[TRT] Registered plugin creator - ::Split version 1
[TRT] Registered plugin creator - ::SpecialSlice_TRT version 1
[TRT] Registered plugin creator - ::InstanceNormalization_TRT version 1
[TRT] detected model format - UFF (extension '.uff')
[TRT] desired precision specified for GPU: FASTEST
[TRT] requested fasted precision for device GPU without providing valid calibrator, disabling INT8
[TRT] native precisions detected for GPU: FP32, FP16
[TRT] selecting fastest native precision for GPU: FP16
[TRT] attempting to open engine cache file /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff.1.1.7103.GPU.FP16.engine
[TRT] loading network plan from engine cache... /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff.1.1.7103.GPU.FP16.engine
[TRT] device GPU, loaded /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff
[TRT] Using an engine plan file across different models of devices is not recommended and is likely to affect performance or even cause errors.
[TRT] Deserialize required 6539711 microseconds.
[TRT]
[TRT] CUDA engine context initialized on device GPU:
[TRT] -- layers 119
[TRT] -- maxBatchSize 1
[TRT] -- workspace 0
[TRT] -- deviceMemory 59078656
[TRT] -- bindings 3
[TRT] binding 0
-- index 0
-- name 'Input'
-- type FP32
-- in/out INPUT
-- # dims 3
-- dim #0 3 (SPATIAL)
-- dim #1 300 (SPATIAL)
-- dim #2 300 (SPATIAL)
[TRT] binding 1
-- index 1
-- name 'NMS'
-- type FP32
-- in/out OUTPUT
-- # dims 3
-- dim #0 1 (SPATIAL)
-- dim #1 100 (SPATIAL)
-- dim #2 7 (SPATIAL)
[TRT] binding 2
-- index 2
-- name 'NMS_1'
-- type FP32
-- in/out OUTPUT
-- # dims 3
-- dim #0 1 (SPATIAL)
-- dim #1 1 (SPATIAL)
-- dim #2 1 (SPATIAL)
[TRT]
[TRT] binding to input 0 Input binding index: 0
[TRT] binding to input 0 Input dims (b=1 c=3 h=300 w=300) size=1080000
[TRT] binding to output 0 NMS binding index: 1
[TRT] binding to output 0 NMS dims (b=1 c=1 h=100 w=7) size=2800
[TRT] binding to output 1 NMS_1 binding index: 2
[TRT] binding to output 1 NMS_1 dims (b=1 c=1 h=1 w=1) size=4
[TRT]
[TRT] device GPU, /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff initialized.
[TRT] W = 7 H = 100 C = 1
[TRT] detectNet -- maximum bounding boxes: 100
[TRT] detectNet -- loaded 91 class info entries
[TRT] detectNet -- number of object classes: 91
[gstreamer] initialized gstreamer, version 1.14.5.0
[gstreamer] gstCamera -- attempting to create device v4l2:///dev/video0
[gstreamer] gstCamera -- found v4l2 device: Microsoft® LifeCam Show™
[gstreamer] v4l2-proplist, device.path=(string)/dev/video0, udev-probed=(boolean)false, device.api=(string)v4l2, v4l2.device.driver=(string)uvcvideo, v4l2.device.card=(string)"Microsoft\302\256\ LifeCam\ Show™", v4l2.device.bus_info=(string)usb-70090000.xusb-3.2, v4l2.device.version=(uint)264588, v4l2.device.capabilities=(uint)2216689665, v4l2.device.device_caps=(uint)69206017;
[gstreamer] gstCamera -- found 15 caps for v4l2 device /dev/video0
[gstreamer] [0] video/x-raw, format=(string)YUY2, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1;
[gstreamer] [1] video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1;
[gstreamer] [2] video/x-raw, format=(string)YUY2, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1;
[gstreamer] [3] video/x-raw, format=(string)YUY2, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1;
[gstreamer] [4] video/x-raw, format=(string)YUY2, width=(int)176, height=(int)144, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1;
[gstreamer] [5] video/x-raw, format=(string)YUY2, width=(int)160, height=(int)120, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1;
[gstreamer] [6] image/jpeg, width=(int)1600, height=(int)1200, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/2;
[gstreamer] [7] image/jpeg, width=(int)1280, height=(int)960, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/2;
[gstreamer] [8] image/jpeg, width=(int)1024, height=(int)768, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/2;
[gstreamer] [9] image/jpeg, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1;
[gstreamer] [10] image/jpeg, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1;
[gstreamer] [11] image/jpeg, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1;
[gstreamer] [12] image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1;
[gstreamer] [13] image/jpeg, width=(int)176, height=(int)144, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1;
[gstreamer] [14] image/jpeg, width=(int)160, height=(int)120, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1;
[gstreamer] gstCamera -- selected device profile: codec=mjpeg format=unknown width=1280 height=960
[gstreamer] gstCamera pipeline string:
[gstreamer] v4l2src device=/dev/video0 ! image/jpeg, width=(int)1280, height=(int)960 ! jpegdec ! video/x-raw ! appsink name=mysink
[gstreamer] gstCamera successfully created device v4l2:///dev/video0
[video] created gstCamera from v4l2:///dev/video0

gstCamera video options:

-- URI: v4l2:///dev/video0
- protocol: v4l2
- location: /dev/video0
-- deviceType: v4l2
-- ioType: input
-- codec: mjpeg
-- width: 1280
-- height: 960
-- frameRate: 7.500000
-- bitRate: 0
-- numBuffers: 4
-- zeroCopy: true
-- flipMethod: none
-- loop: 0

[OpenGL] glDisplay -- X screen 0 resolution: 1920x1080
[OpenGL] glDisplay -- X window resolution: 1920x1080
[OpenGL] glDisplay -- display device initialized (1920x1080)
[video] created glDisplay from display://0

glDisplay video options:

-- URI: display://0
- protocol: display
- location: 0
-- deviceType: display
-- ioType: output
-- codec: raw
-- width: 1920
-- height: 1080
-- frameRate: 0.000000
-- bitRate: 0
-- numBuffers: 4
-- zeroCopy: true
-- flipMethod: none
-- loop: 0

[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter1
[gstreamer] gstreamer changed state from NULL to READY ==> jpegdec0
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter0
[gstreamer] gstreamer changed state from NULL to READY ==> v4l2src0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter1
[gstreamer] gstreamer changed state from READY to PAUSED ==> jpegdec0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> v4l2src0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer message new-clock ==> pipeline0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> jpegdec0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> v4l2src0
[gstreamer] gstreamer message stream-start ==> pipeline0
[gstreamer] gstCamera -- onPreroll
[gstreamer] gstCamera -- map buffer size was less than max size (1843200 vs 1843207)
[gstreamer] gstCamera recieve caps: video/x-raw, format=(string)I420, width=(int)1280, height=(int)960, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)1:4:0:0, framerate=(fraction)15/2
[gstreamer] gstCamera -- recieved first frame, codec=mjpeg format=i420 width=1280 height=960 size=1843207
RingBuffer -- allocated 4 buffers (1843207 bytes each, 7372828 bytes total)
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysink
[gstreamer] gstreamer message async-done ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysink
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pipeline0
RingBuffer -- allocated 4 buffers (3686400 bytes each, 14745600 bytes total)
detected 0 objects in image
[OpenGL] glDisplay -- set the window size to 1280x960
[OpenGL] creating 1280x960 texture (GL_RGB8 format, 3686400 bytes)
[cuda] registered openGL texture for interop access (1280x960, GL_RGB8, 3686400 bytes)

[TRT] ------------------------------------------------
[TRT] Timing Report /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff
[TRT] ------------------------------------------------
[TRT] Pre-Process CPU 0.08323ms CUDA 2.98047ms
[TRT] Network CPU 189.00584ms CUDA 153.44464ms
[TRT] Post-Process CPU 0.02672ms CUDA 0.02714ms
[TRT] Total CPU 189.11580ms CUDA 156.45224ms
[TRT] ------------------------------------------------

[TRT] note -- when processing a single image, run 'sudo jetson_clocks' before
to disable DVFS for more accurate profiling/timing measurements

detected 0 objects in image

[TRT] ------------------------------------------------
[TRT] Timing Report /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff
[TRT] ------------------------------------------------
[TRT] Pre-Process CPU 0.08135ms CUDA 1.13667ms
[TRT] Network CPU 91.56368ms CUDA 77.26234ms
[TRT] Post-Process CPU 0.05448ms CUDA 0.05688ms
[TRT] Total CPU 91.69952ms CUDA 78.45589ms
[TRT] ------------------------------------------------

detected 0 objects in image

[TRT] ------------------------------------------------
[TRT] Timing Report /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff
[TRT] ------------------------------------------------
[TRT] Pre-Process CPU 0.08755ms CUDA 1.01719ms
[TRT] Network CPU 77.71540ms CUDA 64.65385ms
[TRT] Post-Process CPU 0.07563ms CUDA 0.07818ms
[TRT] Total CPU 77.87858ms CUDA 65.74922ms
[TRT] ------------------------------------------------

detected 0 objects in image

[TRT] ------------------------------------------------
[TRT] Timing Report /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff
[TRT] ------------------------------------------------
[TRT] Pre-Process CPU 0.93647ms CUDA 1.14859ms
[TRT] Network CPU 80.06955ms CUDA 68.97802ms
[TRT] Post-Process CPU 0.03870ms CUDA 0.00068ms
[TRT] Total CPU 81.04472ms CUDA 70.12730ms
[TRT] ------------------------------------------------

detected 0 objects in image

[TRT] ------------------------------------------------
[TRT] Timing Report /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff
[TRT] ------------------------------------------------
[TRT] Pre-Process CPU 0.20964ms CUDA 1.17062ms
[TRT] Network CPU 78.71015ms CUDA 65.03578ms
[TRT] Post-Process CPU 0.06313ms CUDA 0.06229ms
[TRT] Total CPU 78.98292ms CUDA 66.26870ms
[TRT] ------------------------------------------------

detected 0 objects in image

[TRT] ------------------------------------------------
[TRT] Timing Report /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff
[TRT] ------------------------------------------------
[TRT] Pre-Process CPU 0.09208ms CUDA 1.03151ms
[TRT] Network CPU 79.32522ms CUDA 66.61557ms
[TRT] Post-Process CPU 0.07099ms CUDA 0.16448ms
[TRT] Total CPU 79.48830ms CUDA 67.81156ms
[TRT] ------------------------------------------------

detected 2 objects in image
<detectNet.Detection object>
-- ClassID: 65
-- Confidence: 0.586404
-- Left: 38.3999
-- Top: 47.0204
-- Right: 1189.14
-- Bottom: 937.993
-- Width: 1150.74
-- Height: 890.973
-- Area: 1.02528e+06
-- Center: (613.77, 492.507)
<detectNet.Detection object>
-- ClassID: 15
-- Confidence: 0.506469
-- Left: 8.1221
-- Top: 386.618
-- Right: 701.907
-- Bottom: 854.552
-- Width: 693.785
-- Height: 467.934
-- Area: 324646
-- Center: (355.015, 620.585)

[TRT] ------------------------------------------------
[TRT] Timing Report /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff
[TRT] ------------------------------------------------
[TRT] Pre-Process CPU 0.10448ms CUDA 1.15188ms
[TRT] Network CPU 81.03545ms CUDA 66.73776ms
[TRT] Post-Process CPU 0.04646ms CUDA 0.12849ms
[TRT] Visualize CPU 29.29482ms CUDA 29.45260ms
[TRT] Total CPU 110.48121ms CUDA 97.47073ms
[TRT] ------------------------------------------------

detected 0 objects in image

[TRT] ------------------------------------------------
[TRT] Timing Report /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff
[TRT] ------------------------------------------------
[TRT] Pre-Process CPU 0.57115ms CUDA 1.04240ms
[TRT] Network CPU 76.20772ms CUDA 61.06995ms
[TRT] Post-Process CPU 0.06719ms CUDA 0.06594ms
[TRT] Visualize CPU 29.29482ms CUDA 29.45260ms
[TRT] Total CPU 106.14088ms CUDA 91.63088ms
[TRT] ------------------------------------------------

detected 0 objects in image

[TRT] ------------------------------------------------
[TRT] Timing Report /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff
[TRT] ------------------------------------------------
[TRT] Pre-Process CPU 0.08693ms CUDA 1.03182ms
[TRT] Network CPU 72.21333ms CUDA 59.39000ms
[TRT] Post-Process CPU 0.07682ms CUDA 0.03594ms
[TRT] Visualize CPU 29.29482ms CUDA 29.45260ms
[TRT] Total CPU 101.67191ms CUDA 89.91036ms
[TRT] ------------------------------------------------

detected 0 objects in image

[TRT] ------------------------------------------------
[TRT] Timing Report /usr/local/bin/networks/SSD-Mobilenet-v2/ssd_mobilenet_v2_coco.uff
[TRT] ------------------------------------------------
[TRT] Pre-Process CPU 0.09406ms CUDA 1.04844ms
[TRT] Network CPU 71.93036ms CUDA 59.04422ms
[TRT] Post-Process CPU 0.06505ms CUDA 0.06307ms
[TRT] Visualize CPU 29.29482ms CUDA 29.45260ms
[TRT] Total CPU 101.38430ms CUDA 89.60834ms
[TRT] ------------------------------------------------

#####################

This is the output from the Linux PC:
linuxnikola@linuxnikola-P67X-UD3-B3:~$ gst-launch-1.0 -v udpsrc port=9999 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96
Setting pipeline to PLAYING ...
New clock: GstSystemClock
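Since the pipeline goes live but never negotiates caps past the udpsrc, the first thing to establish is whether any packets from the Jetson are reaching the PC at all. A small listener sketch for the PC (port 9999 as in the command above; the helper name is hypothetical):

```python
import socket

def wait_for_udp(port=9999, timeout=5.0):
    """Listen on `port`; return the size of the first UDP datagram
    received, or None if nothing arrives within `timeout` seconds."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.bind(("0.0.0.0", port))
        s.settimeout(timeout)
        data, addr = s.recvfrom(65535)
        print(f"got {len(data)} bytes from {addr[0]}")
        return len(data)
    except socket.timeout:
        print("no packets arrived -- check the destination IP on the Jetson side and any firewall on the PC")
        return None
    finally:
        s.close()

wait_for_udp()
```

If this prints packet sizes while detectnet is running, the network path is fine and the problem is in the GStreamer receive pipeline; if nothing arrives, the problem is upstream (wrong IP, routing, or a firewall). Run it instead of gst-launch, since both cannot bind the same port.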

Any suggestions @dusty_nv? I tried troubleshooting by sending the Jetson feed to itself, and got the same results when run from two terminals on the Jetson.