How to obtain the real-world location from a bounding box (both object and camera moving) in real time with DeepStream?

Hi everyone,

I have a CSI camera and am running object tracking with a bounding box around the object; both the camera and the object can be moving in different directions. I have the camera's real-world location from GPS, along with its velocity, updated roughly every second. I would like to obtain the real-world location of the detected object from its bounding box, and ideally the velocity of the moving target as well. Has anyone solved a similar problem?
I know that with camera calibration I can obtain real-world coordinates for static objects from still images. Now I want to do the same with a real-time object detector, and I am not sure whether that approach remains valid.
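To make what I mean concrete, here is a minimal sketch of the geometry I am thinking of: intersect the viewing ray through the bounding box's bottom-center pixel with a flat ground plane, using the calibration intrinsics and a camera pose refreshed from the GPS fix each frame. All names and frame conventions here are my own illustration under a flat-ground assumption, not DeepStream API:

```python
# Rough sketch (not DeepStream code): back-project the bottom-center of a
# bounding box onto a flat ground plane (z = 0), assuming calibrated
# intrinsics K and a known camera pose (world-to-camera rotation R,
# camera position C taken from the GPS fix).
import numpy as np

def bbox_ground_point(bbox, K, R, C):
    """Intersect the viewing ray through the bbox bottom-center with z = 0.

    bbox: (left, top, width, height) in pixels
    K:    3x3 camera intrinsics
    R:    3x3 world-to-camera rotation
    C:    camera position in a local world frame (e.g. anchored at a GPS fix)
    """
    left, top, w, h = bbox
    pixel = np.array([left + w / 2.0, top + h, 1.0])  # bottom-center, homogeneous
    ray = R.T @ np.linalg.inv(K) @ pixel              # viewing ray in world frame
    s = -C[2] / ray[2]                                # scale factor to reach z = 0
    return C + s * ray

def target_velocity(p_prev, p_now, dt):
    """Target velocity from two successive ground points; the camera's own
    motion is already accounted for because C is refreshed from GPS."""
    return (p_now - p_prev) / dt

# Example: camera 2 m above the ground, pitched 30 degrees down,
# looking along world +x (world frame: x forward, y left, z up).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
pitch = np.deg2rad(30)
R = np.array([[0.0,           -1.0,  0.0],
              [-np.sin(pitch), 0.0, -np.cos(pitch)],
              [ np.cos(pitch), 0.0, -np.sin(pitch)]])
C = np.array([0.0, 0.0, 2.0])
p = bbox_ground_point((300.0, 200.0, 40.0, 40.0), K, R, C)
# Ground point lands ~3.46 m ahead of the camera (2 / tan(30 deg)).
```

If this geometry holds frame by frame, updating C from GPS every frame should already cancel the camera's own motion; I am unsure how robust the flat-ground assumption would be in my scenario.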
Thank you

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the contents of the configuration files, the command line used, and other details needed to reproduce.)
• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and the function description.)

Sorry, I am just looking for the concept right now; I have not implemented it yet. As I explained above, I have a Jetson TX2 and a moving camera, with the object either moving toward the camera or moving along with it. I cannot provide more information at the moment because I have not run the full model.