Multi-encoding and multi-decoding issues

I am sending two video streams using GStreamer and gst-rtsp-server.
Each video stream is 1024x768 at 30 fps.
I am using two TX2 boards: one TX2 acts as the server and the other as the client.

Below is the relevant part of my code at the server:

gst_rtsp_media_factory_set_launch (factory, "( "
    "appsrc name=mysrc1 ! videoconvert ! omxvp8enc ! rtpvp8pay name=pay0 pt=96 "
    "appsrc name=mysrc2 ! videoconvert ! omxvp8enc ! rtpvp8pay name=pay1 pt=97 "
    ")");

and I use the pipeline below at the client:

gst-launch-1.0 rtspsrc location=rtsp://169.000.00.00:8554/test ! queue ! rtpvp8depay ! omxvp8dec ! nvvidconv ! nvoverlaysink overlay-x=400 overlay-y=600 overlay-w=640 overlay-h=480 overlay=1 &
gst-launch-1.0 rtspsrc location=rtsp://169.000.00.00:8554/test ! queue ! rtpvp8depay ! omxvp8dec ! nvvidconv ! nvoverlaysink overlay-w=640 overlay-h=480 overlay=2

When I execute the code and the pipeline, I get the following messages at the server:

NvMMLiteOpen : Block : BlockType = 7
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 7
NvMMLiteOpen : Block : BlockType = 7
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 7

and the following messages at the client:

[1] 9414
Setting pipeline to PAUSED …
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Pipeline is live and does not need PREROLL …
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://169.000.00.00:8554/test
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://169.000.00.00:8554/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (request) SETUP stream 0
Progress: (request) SETUP stream 1
Progress: (request) SETUP stream 1
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
Progress: (request) Sent PLAY request

(gst-launch-1.0:9415): GStreamer-CRITICAL **: 16:20:45.573: gst_caps_is_empty: assertion ‘GST_IS_CAPS (caps)’ failed

(gst-launch-1.0:9415): GStreamer-CRITICAL **: 16:20:45.574: gst_caps_truncate: assertion ‘GST_IS_CAPS (caps)’ failed

(gst-launch-1.0:9415): GStreamer-CRITICAL **: 16:20:45.574: gst_caps_fixate: assertion ‘GST_IS_CAPS (caps)’ failed

(gst-launch-1.0:9415): GStreamer-CRITICAL **: 16:20:45.574: gst_caps_get_structure: assertion ‘GST_IS_CAPS (caps)’ failed

(gst-launch-1.0:9415): GStreamer-CRITICAL **: 16:20:45.574: gst_structure_get_string: assertion ‘structure != NULL’ failed

(gst-launch-1.0:9415): GStreamer-CRITICAL **: 16:20:45.574: gst_mini_object_unref: assertion ‘mini_object != NULL’ failed
NvMMLiteOpen : Block : BlockType = 278
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 278
Allocating new output: 1024x768 (x 11), ThumbnailMode = 0
OPENMAX: HandleNewStreamFormat: 3605: Send OMX_EventPortSettingsChanged: nFrameWidth = 1024, nFrameHeight = 768

(gst-launch-1.0:9414): GStreamer-CRITICAL **: 16:20:45.608: gst_caps_is_empty: assertion ‘GST_IS_CAPS (caps)’ failed

(gst-launch-1.0:9414): GStreamer-CRITICAL **: 16:20:45.608: gst_caps_truncate: assertion ‘GST_IS_CAPS (caps)’ failed

(gst-launch-1.0:9414): GStreamer-CRITICAL **: 16:20:45.608: gst_caps_fixate: assertion ‘GST_IS_CAPS (caps)’ failed

(gst-launch-1.0:9414): GStreamer-CRITICAL **: 16:20:45.608: gst_caps_get_structure: assertion ‘GST_IS_CAPS (caps)’ failed

(gst-launch-1.0:9414): GStreamer-CRITICAL **: 16:20:45.608: gst_structure_get_string: assertion ‘structure != NULL’ failed

(gst-launch-1.0:9414): GStreamer-CRITICAL **: 16:20:45.608: gst_mini_object_unref: assertion ‘mini_object != NULL’ failed
NvMMLiteOpen : Block : BlockType = 278
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 278
Allocating new output: 1024x768 (x 11), ThumbnailMode = 0
OPENMAX: HandleNewStreamFormat: 3605: Send OMX_EventPortSettingsChanged: nFrameWidth = 1024, nFrameHeight = 768

Here is what I want to know.

Do both the server and the client use multiple resources to encode or decode the video streams?

In more detail, I am wondering whether both video streams are encoded at the same time using multiple resources of the TX2, and likewise whether they are decoded at the same time at the client.

If not, how can I achieve that with GStreamer? How can I control multiple resources?

Also, I guess the message below indicates how many resources are used for decoding.
Is that correct?

Allocating new output: 1024x768 (x 11)

Hi,
We have only tried the single-encoding case when running test-appsrc and test-launch:
https://devtalk.nvidia.com/default/topic/1049924/jetson-agx-xavier/using-rtsp-encode-with-h264-and-stream-images-at-device-memory-on-xavier-/post/5331159/#5331159
https://devtalk.nvidia.com/default/topic/1043770/jetson-tx2/problems-minimizing-latency-and-maximizing-quality-for-rtsp-and-mpeg-ts-/post/5295828/#5295828

Could you go to the GStreamer forum to get suggestions on running two encoding threads in test-appsrc and test-launch? You may ask about using the software encoder/decoder x264enc and avdec_h264 first. Once you have a working pipeline, you can replace them with the hardware encoder/decoder and try again.
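To make the suggested software-encoder test concrete, here is a sketch of what the substituted pipeline strings might look like. x264enc, rtph264pay, rtph264depay, and avdec_h264 are standard GStreamer plugins, but the tune=zerolatency property and the exact client sink choice are assumptions for testing only:

```python
# Software-encoder variant of the server launch description (sketch):
# omxvp8enc/rtpvp8pay swapped for x264enc/rtph264pay per stream.
launch = (
    "( "
    "appsrc name=mysrc1 ! videoconvert ! x264enc tune=zerolatency ! "
    "rtph264pay name=pay0 pt=96 "
    "appsrc name=mysrc2 ! videoconvert ! x264enc tune=zerolatency ! "
    "rtph264pay name=pay1 pt=97 "
    ")"
)

# Matching client pipeline using the software decoder avdec_h264
# (autovideosink stands in for nvoverlaysink during this test):
client = (
    "gst-launch-1.0 rtspsrc location=rtsp://169.000.00.00:8554/test ! "
    "queue ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink"
)

print(launch)
print(client)
```

Once both streams play with these software elements, the hardware elements can be swapped back in one at a time to isolate where the multi-stream setup fails.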