Industrial defect detection

Hi, I’m new to the deep learning space and wanted help creating a workflow for real-time defect detection in an industrial setting (focused on PCB defects). Ideally, I would train the model using GPUs on AWS and deploy it on an edge device like the Jetson. I have two ideas for an approach:

  1. Treat defect detection as an object detection problem: train a TLT object detection model on a defect dataset in AWS and export the model for deployment on the Jetson with DeepStream.

  2. I found a paper treating defect detection as an image segmentation problem. To summarize, the approach is to build a U-Net model in TensorFlow and run inference with TF-TRT (rough model sketch below).

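To be concrete about approach 2, the kind of model I have in mind is a small U-Net-style network along these lines (a rough sketch of my own, not the paper’s exact architecture; the input size and channel counts are placeholders):

```python
# Rough U-Net-style segmentation model in Keras (placeholder sizes, not a specific paper's architecture).
import tensorflow as tf
from tensorflow.keras import layers, Model

def conv_block(x, filters):
    # Two 3x3 convolutions with ReLU, as in a typical U-Net stage.
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    return x

def build_unet(input_shape=(512, 512, 3)):
    inputs = layers.Input(shape=input_shape)

    # Encoder: downsample while keeping skip connections.
    s1 = conv_block(inputs, 32)
    p1 = layers.MaxPooling2D(2)(s1)
    s2 = conv_block(p1, 64)
    p2 = layers.MaxPooling2D(2)(s2)

    # Bottleneck.
    b = conv_block(p2, 128)

    # Decoder: upsample and concatenate the matching skip connection.
    u2 = layers.Conv2DTranspose(64, 2, strides=2, padding="same")(b)
    d2 = conv_block(layers.Concatenate()([u2, s2]), 64)
    u1 = layers.Conv2DTranspose(32, 2, strides=2, padding="same")(d2)
    d1 = conv_block(layers.Concatenate()([u1, s1]), 32)

    # One-channel sigmoid output: per-pixel defect probability.
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(d1)
    return Model(inputs, outputs, name="unet_defect_seg")

model = build_unet()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```

The sigmoid head gives a per-pixel defect probability map, which is what I would threshold for a pass/fail decision.
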
Overall, my questions are:

  1. Is one of the approaches more viable than the other?
  2. How would I adapt the second approach for real-time defect detection (rough loop sketch below)? Could I do something similar to the first approach to deploy the model on a Jetson?
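
For question 2, the per-frame loop I am picturing on the device is roughly the following (untested sketch; the camera index, input size, model path, and the 0.5 threshold are all placeholder assumptions):

```python
# Untested sketch of a per-frame segmentation inference loop.
import cv2
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("unet_defect_seg")  # placeholder path to a trained model
cap = cv2.VideoCapture(0)                               # camera / video source on the Jetson

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Preprocess to the network input size and scale to [0, 1]
    # (BGR/RGB handling depends on how the model was trained).
    inp = cv2.resize(frame, (512, 512)).astype(np.float32) / 255.0
    inp = np.expand_dims(inp, axis=0)

    # Predict a per-pixel defect probability map and threshold it.
    prob = model.predict(inp, verbose=0)[0, :, :, 0]
    mask = (prob > 0.5).astype(np.uint8)

    if mask.any():
        print("Possible defect: %d flagged pixels" % int(mask.sum()))

    # Overlay the mask on the original frame for a quick visual check.
    overlay = frame.copy()
    mask_full = cv2.resize(mask * 255, (frame.shape[1], frame.shape[0]))
    overlay[mask_full > 0] = (0, 0, 255)
    cv2.imshow("defects", overlay)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```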

I am new to this process, so examples/implementation details are more than welcome!

Best,
Marco

I think you can try both approaches.
For the 2nd approach, there is a MaskRCNN network in the current TLT version. You can try it.

Thanks for your advice. For the U-Net/TensorFlow approach, could I follow a guideline similar to this post for deployment on the Jetson?
https://developer.nvidia.com/blog/deploying-models-from-tensorflow-model-zoo-using-deepstream-and-triton-inference-server/

I would ultimately want to end up with a deployable app for real-time detection.
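
If the model ends up behind Triton as in that post, my rough mental model of the client side is something like this (untested sketch; the model name, tensor names, and shapes are placeholders that would have to match the deployed model’s configuration):

```python
# Untested sketch of a Triton gRPC client call; "unet", "input_1", and "sigmoid"
# are placeholder names that must match the deployed model's config.
import numpy as np
import tritonclient.grpc as grpcclient

client = grpcclient.InferenceServerClient(url="localhost:8001")

# One preprocessed frame, NHWC float32 (zeros here as a stand-in).
image = np.zeros((1, 512, 512, 3), dtype=np.float32)

inputs = [grpcclient.InferInput("input_1", list(image.shape), "FP32")]
inputs[0].set_data_from_numpy(image)
outputs = [grpcclient.InferRequestedOutput("sigmoid")]

result = client.infer(model_name="unet", inputs=inputs, outputs=outputs)
mask_prob = result.as_numpy("sigmoid")   # per-pixel defect probabilities
print(mask_prob.shape)
```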

For TLT MaskRCNN, there is a blog: https://developer.nvidia.com/blog/training-instance-segmentation-models-using-maskrcnn-on-the-transfer-learning-toolkit/
For deployment of the MaskRCNN model, refer to the "Integrating TAO Models into DeepStream" section of the TAO Toolkit 3.22.05 documentation.

Thank you, that is very helpful. If I were to also try an approach like the U-Net, where the model originates in TensorFlow, is it still straightforward to deploy it on the Jetson? The TLT-to-DeepStream pipeline seems well established, but I am less sure where to find guidance for models coming from TensorFlow.

I’m likely going to try all of these approaches and compare them.
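
For the TensorFlow/U-Net route, my current understanding of the TF-TRT conversion step is roughly the following (untested sketch; the converter API differs between TensorFlow releases, and the SavedModel paths are placeholders):

```python
# Untested sketch of TF-TRT conversion for Jetson deployment; this follows the
# older conversion_params style, and "unet_saved_model" / "unet_trt_saved_model"
# are placeholder paths.
from tensorflow.python.compiler.tensorrt import trt_convert as trt

params = trt.DEFAULT_TRT_CONVERSION_PARAMS._replace(
    precision_mode=trt.TrtPrecisionMode.FP16)     # FP16 is usually a good fit for Jetson

converter = trt.TrtGraphConverterV2(
    input_saved_model_dir="unet_saved_model",     # exported with model.save(..., save_format="tf")
    conversion_params=params)

converter.convert()                               # replaces supported subgraphs with TensorRT engines
converter.save("unet_trt_saved_model")            # reload on the Jetson with tf.saved_model.load(...)
```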

Please check the TLT user guide. You can also launch the Jupyter notebooks inside the TLT docker container for reference.