Has anyone tried to integrate jetson-inference (which uses TensorRT) with openFrameworks (OF) on a Jetson TX2?
Does it help increase the FPS?
I came across this issue on GitHub: https://github.com/dusty-nv/jetson-inference/issues/20
Is this feasible to implement on a Jetson TX2?
Can anyone share more information about this?