How do I apply the 2D navigation simulation example to a smart car with 3 omni-directional wheels?
[Warning] [omni.graph.core.plugin] /World/wheeltec_robot/ActionGraph/articulation_controller: OmniGraph Warning: shape mismatch: value array of shape (1,2) could not be broadcast to indexing result of shape (1,3)
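For context, the warning is a plain shape mismatch: a differential-drive controller emits two wheel velocities, while a three-wheel omni base exposes three joints. The failed assignment can be reproduced in plain NumPy (the shapes are taken from the message; the variable names are just illustrative):

```python
import numpy as np

# Stand-in for the articulation controller's joint buffer: 3 wheel joints.
joint_velocities = np.zeros((1, 3))
# A differential controller emits only 2 values (left/right wheel speed).
two_wheel_command = np.array([[0.5, 0.5]])

try:
    # Advanced-index assignment of a (1,2) array into a (1,3) selection
    # raises the same "shape mismatch" ValueError quoted in the log.
    rows = np.array([0])[:, None]
    cols = np.array([0, 1, 2])
    joint_velocities[rows, cols] = two_wheel_command
except ValueError as err:
    print(err)
```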
Hi @fmy553 - Can you share some more details to assist better with your issue?
it sounds like you’re using the differential controller. what you need is the holonomic controller. I can’t help you with that though because I have no idea what you’re doing or how :/
Hi @mgussert I have now changed the controller to the holonomic controller. I want to use my three-wheeled omnidirectional car for ROS 2D navigation, but I don’t know how to convert the values of cmd_vel into velocity commands for the holonomic controller.
Here is my Action Graph:
That looks like some mighty fine spaghetti… mighty fine!
I can’t help you without a more detailed discussion of what you are doing and how. In your graph it looks like you have the holonomic controller connected to your articulation controller, but I don’t see an execution line from the playback tick to the holonomic controller. Also, why do you want to “convert” the cmd_vel at all? The flow of these kinds of graphs should look like the following:
commands → X controller → Articulation controller
The commands are high level, and what kind of data they are composed of depends entirely on what “X” is. In the case of your holonomic controller, the command is “go with this linear and angular velocity”. Those values are fed to the X controller, which converts the command to specific joint actions. Those joint actions then get fed to the articulation controller, where they are applied to the actual robot.
hope this helps!
Thanks for your answer, @mgussert. I want to implement the ROS navigation function, so I need to control the robot’s motion through the linear and angular velocities on the cmd_vel topic. The individual wheel speeds at the controller end are produced by the holonomic controller, but I don’t know what transformation is needed to go from the linear and angular velocities in cmd_vel to the holonomic controller’s inputs to achieve my goal.
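For reference, the conversion in question is the inverse kinematics of a three-wheel omni base: each wheel speed is a linear combination of the body velocities. A minimal sketch follows; the wheel angles, base radius, and wheel radius below are placeholder values, so substitute your robot's actual geometry (in Isaac Sim the holonomic controller node performs this computation once configured with the same parameters):

```python
import math

def cmd_vel_to_wheel_speeds(vx, vy, wz,
                            wheel_angles=(math.pi / 2, 7 * math.pi / 6, 11 * math.pi / 6),
                            base_radius=0.15, wheel_radius=0.05):
    """Inverse kinematics for a 3-wheel omni base (placeholder geometry).

    vx, vy -- linear velocity in the robot frame [m/s] (cmd_vel linear.x/.y)
    wz     -- angular velocity about z [rad/s]     (cmd_vel angular.z)
    Returns one wheel angular speed [rad/s] per wheel.
    """
    return [(-math.sin(a) * vx + math.cos(a) * vy + base_radius * wz) / wheel_radius
            for a in wheel_angles]

# Pure rotation: with symmetric geometry, all three wheels spin equally.
print(cmd_vel_to_wheel_speeds(0.0, 0.0, 1.0))
```

Feeding in cmd_vel’s (linear.x, linear.y, angular.z) yields one target speed per wheel joint, which is exactly what then goes to the articulation controller.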
If you want, @fmy553, you can check GitHub - eliabntt/GRADE-RR: Generating Animated Dynamic Environments for Robotics Research and eventually write to me (I just updated the v2022 branch). My solution for the three-wheeled omnidirectional robot (as well as for the drone) was to create 3 joints (x, y, yaw) and use my controller (GitHub - eliabntt/custom_6dof_joint_controller: Control a 6 joint robot (free floating) with six separate PID with ROS in simulation) to send commands to them. In practice, instead of converting cmd_vel inside Isaac, I do the conversion outside and then send the joint_command topic. That has been effective in my experience; I used a similar trick in Gazebo. Then, to get odometry, you can simply use the dynamic control interface and publish the data.
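The approach described above — converting cmd_vel outside Isaac and publishing joint commands — can be sketched without any ROS plumbing as one PID per virtual joint. Everything here (gains, joint names, time step) is illustrative only; the real implementation lives in the custom_6dof_joint_controller repo linked above:

```python
class PID:
    """Textbook PID on a velocity error; gains below are placeholders."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, target, measured, dt):
        err = target - measured
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# One controller per virtual joint (x, y, yaw), as in the 3-joint setup.
pids = {name: PID(2.0, 0.1, 0.0) for name in ("x", "y", "yaw")}

def joint_commands(cmd_vel, measured_vel, dt=0.01):
    """Map target vs. measured joint velocities to one command per joint.

    In the real setup the result would be published on a joint_command
    topic that the Isaac action graph listens to.
    """
    return {name: pids[name].step(cmd_vel[name], measured_vel[name], dt)
            for name in pids}
```

The benefit of this design is that all the tuning happens in ordinary ROS code outside the simulator, and Isaac only ever sees per-joint commands.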
Thanks for answering my question, @eliabntt94, but I still don’t know how to do it. Can you share your Action Graph with me?
The action graph is created using this and the functions it calls (specifically add_joint_states).
This instead is used to set the initial location (note that the joint paths will have to be modified for your use case). I don’t have any picture right now.
Then in the main script you run your clock, your joint graph, etc., and you will have a topic listening for the “joint_commands”, which will be published by the joint controller node :)
The main script that I already ported is this. It’s a bit convoluted and lengthy: the robot-loading part runs from line 304 to 341, this is used to automatically launch the ROS nodes, and from here onward the clock etc. are published.