EDSR model needs to be implemented. How to do the same

Please provide complete information as applicable to your setup.

**• Hardware Platform (Jetson / GPU)** T4
**• DeepStream Version** 5.0
**• JetPack Version (valid for Jetson only)**
**• TensorRT Version** 7.0
**• NVIDIA GPU Driver Version (valid for GPU only)** 440
EDSR model needs to be implemented. How to do the same? As it is a super-resolution model, there are no DeepStream implementation examples for it; only classification, detection and segmentation examples exist. Kindly do help.

same to what?

Hi @mchi
Implementation in DeepStream was what I meant. As I have noticed, the DeepStream SDK supports only 3 modes: detection, classification and segmentation. So can an upscaling model be implemented, and if so, what are the important steps?

Hi @GalibaSashi,
Do you already have an EDSR model that can run with TensorRT?
If not, what EDSR model do you have?

Thanks!

I do not have an EDSR model that can be run in TensorRT. Mine is a TensorFlow model. I have the frozen .pb file with me.

For a quick try, you can use “nvinferserver” to run your TensorFlow model.

Refer to - https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/tree/master/apps/deepstream-ssd-parser sample.
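The sample serves the model through Triton Inference Server (the backend behind nvinferserver), so the frozen graph has to sit in a Triton model repository. A minimal sketch, assuming the model is named `edsr` and its graph input/output tensors are called `input` and `output` (these names and the dims are placeholder assumptions; replace them with the real tensor names and shapes from your frozen .pb):

```
# Repository layout (directory names are up to you):
#   triton_model_repo/
#     edsr/
#       config.pbtxt
#       1/
#         model.graphdef      <- your frozen .pb, renamed

# triton_model_repo/edsr/config.pbtxt
name: "edsr"
platform: "tensorflow_graphdef"
max_batch_size: 1
input [
  {
    name: "input"             # assumed; check your graph's input tensor name
    data_type: TYPE_FP32
    dims: [ -1, -1, 3 ]       # H x W x C; adjust to your model's layout
  }
]
output [
  {
    name: "output"            # assumed; check your graph's output tensor name
    data_type: TYPE_FP32
    dims: [ -1, -1, 3 ]
  }
]
```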

I do not understand.
This is SSD, right? The EDSR model is different, right?

Hi,
We don’t have an exact EDSR implementation. What I provided is a reference sample.
You can replace the SSD model with your model and run this app with a few modifications.

Can you specify the modifications?

We have provided the sample and doc, which are easy to read and modify.
Please try it on your side; if you run into any specific question, we can support you.

I checked. This is not for T4, right? How do I make it work for T4?

This sample works for x86_64 + dGPU, why do you say it’s not for T4?

Oh sorry, my mistake… where should I change the model path?

Can you help

In the model config file, dstest_ssd_nopostprocess.txt.
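For reference, a hedged sketch of the relevant part of that file: with nvinferserver the model is selected by name inside a Triton model repository rather than by a direct file path, so the fields to edit would look roughly like this (the `edsr` model name and the repository root are placeholder assumptions; other fields follow the sample's config):

```
infer_config {
  unique_id: 5
  gpu_ids: [0]
  max_batch_size: 1
  backend {
    trt_is {
      model_name: "edsr"                 # was the SSD model name in the sample
      version: -1
      model_repo {
        root: "../../triton_model_repo"  # directory that contains edsr/
        log_level: 2
      }
    }
  }
}
```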