I have kids and robots at home, so I like pitting them against each other for sport. Safely, of course. This contest is to see who can colour in a picture that neither one has seen before.
For this project, we are using a Kuka KR10 R1100 with a KRC4 controller, named Russell. All of the processing is done in OpenCV and Python on a Jetson AGX, and a simple 1080p webcam is used for vision.
When the competition starts, Maddy can start colouring right away; Russell first needs to move in to “see” the picture with the camera. OpenCV then applies thresholding and connected-component detection to find the areas that need to be coloured in. Each area is contoured, then shrunk slightly, and contoured again, repeating until there is no area left to fill; the robot follows those contours. This creates a constant offset between each marker pass and fills in the whole area with a very simple algorithm. The contours are then passed to a simple robot-script-writing package that generates Kuka KRL code, which is loaded over the network to the arm. The arm traces the contours with the tips of the markers, one colour per area.
The end-of-arm tooling holds 7 markers so the robot doesn’t need to switch tooling mid-program. Not ideal in manufacturing, but simpler, since this is a 24-hour challenge.
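The contour-to-KRL step mentioned above could look roughly like the sketch below. The program name, the approach/draw heights, and the use of `C_DIS` approximation are placeholder assumptions on my part; real generated code would also handle tool/base frames and marker selection.

```python
def contours_to_krl(contours, z_draw=0.0, z_safe=20.0, name="COLOUR_AREA"):
    """Emit a minimal KRL program that traces each contour with LIN moves.
    Coordinates are assumed to already be in the robot's base frame (mm);
    the name and heights here are illustrative, not the project's output."""
    lines = [f"DEF {name}()"]
    for contour in contours:
        x, y = contour[0]
        # Move above the start point before dropping the marker onto the page.
        lines.append(f"  LIN {{X {x:.1f}, Y {y:.1f}, Z {z_safe:.1f}}}  ; approach")
        for x, y in contour:
            # C_DIS blends through the point instead of stopping exactly on it.
            lines.append(f"  LIN {{X {x:.1f}, Y {y:.1f}, Z {z_draw:.1f}}} C_DIS")
        lines.append(f"  LIN {{X {x:.1f}, Y {y:.1f}, Z {z_safe:.1f}}}  ; retract")
    lines.append("END")
    return "\n".join(lines)

print(contours_to_krl([[(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]]))
```

Blending the points with `C_DIS` matters for a job like this: stopping dead at every contour vertex would leave ink blobs where the marker dwells.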