Can anyone demonstrate how to load a tflite model

I am trying to use the models supplied with the Google Coral USB Accelerator product (the iNat insect model, to be precise; no training required for my use case). A sample image-console style program would be ideal.
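In case a concrete starting point helps: below is a minimal, hedged sketch of a console-style classifier using the TFLite Python interpreter. The file names ('model.tflite', 'labels.txt', 'bee.jpg') are placeholders, not files shipped with the product, and it assumes the tflite_runtime package is installed (tf.lite.Interpreter behaves the same way).

```python
# Hedged sketch of a console-style classifier using the TFLite Python
# interpreter. 'model.tflite', 'labels.txt' and 'bee.jpg' are placeholder
# file names, not anything shipped with the Coral product.

def top_k(scores, k=3):
    """Return (index, score) pairs for the k highest scores."""
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return [(i, scores[i]) for i in order[:k]]

if __name__ == '__main__':
    import numpy as np
    from PIL import Image
    from tflite_runtime.interpreter import Interpreter  # or tf.lite.Interpreter

    interpreter = Interpreter(model_path='model.tflite')
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]

    # Resize the image to the model's expected input shape (NHWC).
    _, height, width, _ = inp['shape']
    img = Image.open('bee.jpg').convert('RGB').resize((width, height))
    data = np.expand_dims(np.asarray(img, dtype=inp['dtype']), 0)

    interpreter.set_tensor(inp['index'], data)
    interpreter.invoke()
    scores = interpreter.get_tensor(out['index'])[0]

    labels = [line.strip() for line in open('labels.txt')]
    for i, score in top_k(list(scores)):
        print(labels[i], score)
```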

Hi bogflap,

I'm not sure which models are supplied with the Google Coral USB Accelerator product, but you may start with:
https://github.com/dusty-nv/jetson-inference/
https://devtalk.nvidia.com/default/topic/1050377/jetson-nano/deep-learning-inference-benchmarking-instructions/

Thanks

Hi kayccc

The pretrained models I have been looking at are at

The one I am particularly interested in is the MobileNet V2 (iNat insects) one.
This has been pretrained on the insect dataset from the iNaturalist web site.

This is an invaluable source of classified images of thousands of animals. The insect section is of particular interest to me, especially honey bee images, and being able to use the coral pretrained models would save me a huge amount of work.

OK, so Google may be seen as a competitor of yours, but finding pretrained models for anything beyond ImageNet, COCO, or faces is not easy. So any pretrained model from any source that runs on any hardware is good as far as I am concerned.

Hi,

Would you mind checking whether this sample meets your requirements?
https://github.com/AastaNV/TRT_object_detection

Thanks.

Does COCO include iNaturalist insects and is it in tflite format?

So anyway, I flashed the latest Nano image and got Python to install TensorFlow and all its dependencies.
Then I tried to run convert_to_uff.py, but that failed, finishing with:

return _TF_TO_NP[self._type_enum]
KeyError: 20
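For what it's worth, in TensorFlow's DataType enum (tensorflow/core/framework/types.proto) the value 20 is DT_RESOURCE, which usually means the graph still contains un-frozen variables. A hedged sketch for locating the offending nodes; the attribute layout below mirrors my understanding of the NodeDef protos, so treat it as an assumption:

```python
# Hedged sketch: scan a GraphDef for nodes carrying DT_RESOURCE (enum 20)
# attributes, since those are what convert_to_uff.py cannot map to numpy.

DT_RESOURCE = 20  # value from tensorflow/core/framework/types.proto

def resource_nodes(nodes):
    """nodes: iterable of NodeDef-like objects with .name and an .attr
    mapping whose values expose the dtype enum as .type."""
    offenders = []
    for node in nodes:
        for attr in node.attr.values():
            if getattr(attr, 'type', None) == DT_RESOURCE:
                offenders.append(node.name)
                break
    return offenders
```

On a real model this would be called as resource_nodes(graph_def.node) after parsing the frozen .pb; properly freezing the graph (variables converted to constants) is the usual fix.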

Other methods seem to require that I know the names of the input and output nodes.
I ran up TensorBoard pointing at a log_dir directory by using something like:

import tensorflow as tf
from tensorflow.python.platform import gfile

with tf.Session() as sess:
    model_filename = 'PATH_TO_PB.pb'
    with gfile.FastGFile(model_filename, 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
        g_in = tf.import_graph_def(graph_def)
    LOGDIR = 'YOUR_LOG_LOCATION'
    train_writer = tf.summary.FileWriter(LOGDIR)
    train_writer.add_graph(sess.graph)

to populate the log dir

I looked at the resulting graph and gave up!

There really must be an easier way of doing all of this. There are all these pretrained models out there, but unless I want COCO classes I am out of luck.
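For the node-name problem specifically, the GraphDef can be scanned directly instead of going through TensorBoard. A hedged sketch; the heuristic it uses (Placeholder ops are inputs, nodes that nothing consumes are outputs) is a common convention, not a guarantee:

```python
# Hedged sketch: guess input/output node names from a frozen GraphDef,
# instead of inspecting the graph visually in TensorBoard.

def find_io_nodes(nodes):
    """nodes: iterable of NodeDef-like objects with .name, .op and .input."""
    # Anything that appears as another node's input is 'consumed';
    # strip control-dependency carets and ':0'-style tensor suffixes.
    consumed = {i.lstrip('^').split(':')[0] for n in nodes for i in n.input}
    inputs = [n.name for n in nodes if n.op == 'Placeholder']
    outputs = [n.name for n in nodes if n.name not in consumed]
    return inputs, outputs

if __name__ == '__main__':
    import tensorflow as tf  # TF 1.x style API assumed
    graph_def = tf.GraphDef()
    with tf.gfile.GFile('PATH_TO_PB.pb', 'rb') as f:  # placeholder path
        graph_def.ParseFromString(f.read())
    ins, outs = find_io_nodes(graph_def.node)
    print('inputs:', ins)
    print('outputs:', outs)
```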

By the way I am trying to use the following models

http://download.tensorflow.org/models/object_detection/faster_rcnn_resnet101_fgvc_2018_07_19.tar.gz
and
http://download.tensorflow.org/models/object_detection/faster_rcnn_resnet50_fgvc_2018_07_19.tar.gz

I guess I will never know

Hi,

The simplest way is to use TF-TRT, but the performance is not as good as pure TensorRT due to the overhead from TensorFlow.

Here is a tutorial for your reference:

We have verified several object detection models, including faster_rcnn_resnet50_coco.
Although the sample is not for TensorFlow Lite, you should still be able to use a similar approach to create a TF-TRT model.
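As a rough illustration of that route, here is a hedged conversion sketch using the TF 1.x contrib API (tensorflow.contrib.tensorrt). The file names and the output tensor names ('detection_boxes:0' and friends) are assumptions based on the standard TF object-detection API outputs and will differ per model:

```python
# Hedged sketch: convert a frozen TF graph to a TF-TRT graph with the
# TF 1.x contrib API. All file names and node names are placeholders.

def node_names(tensor_names):
    """create_inference_graph() expects node names, so strip any
    ':0'-style tensor suffixes."""
    return [t.split(':')[0] for t in tensor_names]

if __name__ == '__main__':
    import tensorflow as tf
    import tensorflow.contrib.tensorrt as trt  # TF 1.x contrib API

    graph_def = tf.GraphDef()
    with tf.gfile.GFile('frozen_inference_graph.pb', 'rb') as f:
        graph_def.ParseFromString(f.read())

    trt_graph = trt.create_inference_graph(
        input_graph_def=graph_def,
        outputs=node_names(['detection_boxes:0', 'detection_scores:0',
                            'detection_classes:0', 'num_detections:0']),
        max_batch_size=1,
        max_workspace_size_bytes=1 << 25,
        precision_mode='FP16')

    with tf.gfile.GFile('tftrt_graph.pb', 'wb') as f:
        f.write(trt_graph.SerializeToString())
```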

Thanks and sorry for any inconvenience.

It should work with Google’s instructions, and Google has sample code here.

https://coral.googlesource.com/
“edgetpu” is probably what you are after.

Let me know if you run into problems. I have a Coral (god help me, they will cancel it now for sure) but haven’t tested it with my nano yet. I have ideas about using it as a secondary inference engine. They have canned models for exactly what I want to do.
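For reference, classification with the edgetpu library looks roughly like the hedged sketch below. The model and label file names are placeholders for whatever Coral actually ships for the iNat insect model, and the label-file format ('<id> <name>' per line) is an assumption:

```python
# Hedged sketch: classify one image with the legacy 'edgetpu' Python
# library. File names are placeholders, not real Coral file names.

def parse_labels(lines):
    """Parse '<id> <name>' label lines into an {id: name} dict."""
    labels = {}
    for line in lines:
        parts = line.strip().split(maxsplit=1)
        if len(parts) == 2:
            labels[int(parts[0])] = parts[1]
    return labels

if __name__ == '__main__':
    from PIL import Image
    from edgetpu.classification.engine import ClassificationEngine

    engine = ClassificationEngine('inat_insect_edgetpu.tflite')
    with open('inat_insect_labels.txt') as f:
        labels = parse_labels(f)
    for label_id, score in engine.classify_with_image(
            Image.open('bee.jpg'), top_k=3):
        print(labels.get(label_id, label_id), score)
```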

Here are tflite build instructions for arm64:
https://www.tensorflow.org/lite/guide/build_arm64