Has anyone tried to connect the Kinect v2 to the Jetson Nano? I read in a previous topic that the TX1 and TX2 initially had some firmware-related problems with it.
I’m also wondering whether the Nano can handle the camera’s stream rate while also doing some computation or image analysis, like recognizing an object or tracking its position in space at a sufficiently high frame rate.
I’m using the Jetson TX2 with the following configuration:
L4T 28.2.1 [ JetPack 3.3 or 3.2.1 ]
Ubuntu 16.04.6 LTS
Kernel Version: 4.4.38-tegra
I’d like to connect it to a Kinect v2, but I haven’t had any success. I’ve tried everything I read on the forums here, on GitHub, and on JetsonHacks, and even tried installing it as a Kinect v1, but nothing worked. In the end I read that it’s not possible to connect a stock TX2 board to this device. Is that true?
By the way, which camera is better for developing SLAM applications? The ZED?
Hey… I got the Kinect v1 working with the ROS OpenNI driver on the Jetson Nano… Got the navigation stack and gmapping working too… The problem is that when I start ros_control alongside gmapping, the laser scan data starts to jump… After some Googling I saw that this can happen due to a lack of CPU power… Is there anything I can do about that?
That’s great news that you got the Kinect v1 working on a Nano! I’m hoping to get it going myself. Can you point me to the tutorial you followed? I’ve got a three-foot-tall robot with a shiny new Kinect on the front, but I just can’t get it working on my rock64pro. I just got a Nano to try, but I’d read they were only compatible with the Kinect v2.
Regarding your question, powering the Nano via the DC barrel connector will provide better performance.
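If CPU load turns out to be what makes the scans jump, throttling high-rate topics is another thing to try; ROS ships `rosrun topic_tools throttle` for exactly this. Here is the core drop-messages-above-a-rate idea in plain Python (the `Throttle` class, the rates, and the timestamps are my own illustration, not any ROS API):

```python
class Throttle:
    """Drop messages so that at most max_hz of them pass per second."""

    def __init__(self, max_hz):
        self.min_interval_ms = 1000 / max_hz  # minimum gap between passed messages
        self.last = None                      # timestamp of the last passed message

    def accept(self, stamp_ms):
        """Return True if a message stamped at stamp_ms (milliseconds) may pass."""
        if self.last is None or stamp_ms - self.last >= self.min_interval_ms:
            self.last = stamp_ms
            return True
        return False

# Simulate one second of a 10 Hz sensor throttled down to 5 Hz:
throttle = Throttle(max_hz=5)
stamps = list(range(0, 1000, 100))            # 10 messages, 100 ms apart
passed = [t for t in stamps if throttle.accept(t)]
print(len(passed))                            # → 5 (every other message dropped)
```

The real `topic_tools` node does the same thing on a live topic, so the downstream consumers simply see a slower stream instead of a CPU-starved, jittery one.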
If the problem turns out to be that the Nano just can’t handle the point cloud (as is the case with the Pi 3, I think), there is a ROS package called pointcloud_to_laserscan that helps.
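For reference, a rough sketch of what that package does under the hood, in plain Python with no ROS (the function name, slice heights, and bin sizes here are my own illustrative choices, not the package’s actual parameters):

```python
import math

def cloud_to_scan(points, angle_min=-math.pi, angle_max=math.pi,
                  angle_increment=math.radians(1.0),
                  z_min=-0.1, z_max=0.1, range_max=10.0):
    """Flatten a 3-D point cloud into 2-D laser-scan ranges.

    points: iterable of (x, y, z) in the sensor frame.
    Returns one range per angular bin (math.inf = no return in that bin).
    """
    n_bins = int((angle_max - angle_min) / angle_increment)
    ranges = [math.inf] * n_bins
    for x, y, z in points:
        if not (z_min <= z <= z_max):        # keep only a horizontal slice
            continue
        r = math.hypot(x, y)
        if r == 0.0 or r > range_max:
            continue
        i = int((math.atan2(y, x) - angle_min) / angle_increment)
        if 0 <= i < n_bins:
            ranges[i] = min(ranges[i], r)    # nearest obstacle per bin wins
    return ranges

# A wall 2 m straight ahead, sampled at a few heights; the z=0.5 point
# falls outside the slice and is ignored:
cloud = [(2.0, 0.0, z) for z in (-0.05, 0.0, 0.05, 0.5)]
scan = cloud_to_scan(cloud)
print(min(r for r in scan if math.isfinite(r)))   # → 2.0
```

The 2-D scan is far cheaper for gmapping and the navigation stack to consume than the full cloud, which is why this helps on low-power boards.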
If you are working with ROS, you’ll have no issue with that. After I installed it, the Kinect was listed.
Even if you are not working with ROS, the OpenNI driver may be able to help you.
As for my other problem, it turned out that two nodes were publishing the transform for the same tf link. After I stopped one of them, it worked fine.
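For anyone hitting the same symptom: `rosrun tf tf_monitor` can show which nodes are broadcasting each transform. The check itself is simple enough to sketch in plain Python (the node and frame names below are hypothetical, just to show the shape of the conflict):

```python
from collections import defaultdict

def find_tf_conflicts(broadcasts):
    """broadcasts: iterable of (node_name, parent_frame, child_frame).

    Returns {(parent, child): [nodes]} for every transform that more than
    one node publishes -- the situation that makes tf data jump around.
    """
    publishers = defaultdict(set)
    for node, parent, child in broadcasts:
        publishers[(parent, child)].add(node)
    return {link: sorted(nodes)
            for link, nodes in publishers.items() if len(nodes) > 1}

# Hypothetical example: two nodes both publish odom -> base_link:
seen = [
    ("/slam_gmapping", "map", "odom"),
    ("/robot_odom",    "odom", "base_link"),
    ("/ros_control",   "odom", "base_link"),
]
print(find_tf_conflicts(seen))
# → {('odom', 'base_link'): ['/robot_odom', '/ros_control']}
```

Once you know which pair of nodes fight over a link, disabling the broadcaster in one of them (as above) resolves it.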
The Jetson Nano handles point clouds from the Kinect v1 well, and I was able to visualize them in rviz… I was also able to run rtabmap_ros to create a 3D map of my room with it. (It was my first time with rtabmap and it’s really COOOOOOL !!!) ROS navigation also worked fine, and I got some nice results with AMCL and ros_control on a differential-drive robot.
I’m running in 10 W mode, and even with my 5 V 2.1 A power bank everything works fine. Just make sure you add a cooling fan (I added a cheap 12 V one). One time the Nano turned off, I think because I forgot to turn on the cooling fan and ran it for a while; it had heated up a lot. Other than that, everything works fine and I’m really happy with the Jetson Nano… I always wanted to get my hands on a Jetson, but they were out of my budget until now. Nvidia did a great job with this… Thank you Nvidia for making this available at this price… :)
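If you want to keep an eye on the temperatures before it gets to a thermal shutdown, the Nano exposes them through the standard Linux thermal sysfs. A small sketch (zone names and counts vary by board, so this just prints whatever is there):

```python
import glob

def read_temperatures():
    """Read all thermal zones via the standard Linux sysfs interface.

    Returns a list of (zone_name, degrees_celsius); empty if none exposed.
    Values in /sys are reported in millidegrees Celsius.
    """
    temps = []
    for zone in sorted(glob.glob("/sys/class/thermal/thermal_zone*")):
        try:
            with open(zone + "/type") as f:
                name = f.read().strip()
            with open(zone + "/temp") as f:
                millideg = int(f.read().strip())
        except (OSError, ValueError):
            continue                      # zone unreadable; skip it
        temps.append((name, millideg / 1000.0))
    return temps

for name, celsius in read_temperatures():
    print(f"{name}: {celsius:.1f} C")
```

Watching these while the fan is off versus on makes it easy to see whether you are close to the throttling/shutdown range.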
I had one issue: sometimes when I visualized data like a point cloud in rviz, the Nano stopped responding. If that’s a problem for you, just run rviz on another machine connected to the same ROS master over Wi-Fi, with all the other processes running on the Jetson Nano… The rtabmap 3D map generation visualized OK in rviz, though, with a bit of lag.
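For that remote-rviz setup, the usual ROS network configuration looks something like this (the IP addresses are placeholders for the Nano and the desktop; substitute your own):

```shell
# On the desktop that will run rviz, point ROS at the master
# running on the nano (placeholder IPs -- substitute your own):
export ROS_MASTER_URI=http://192.168.1.10:11311   # the nano's IP
export ROS_IP=192.168.1.20                        # this desktop's own IP
# then: rosrun rviz rviz
```

The nano side should likewise export `ROS_IP` set to its own address, so the nodes running there advertise a hostname the desktop can actually reach.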
If you need any help with any of these, please ask me…
I got it! I was stuck trying to use libfreenect2, but when I tried the full installation (libfreenect2 + kinect2_bridge + rtabmap), it worked.
By the way, it’s good to know that it’s possible to connect other devices too!