Session run gets stuck running inference on Jetson Nano

Hi
I am trying to follow the tutorial at this link:
https://medium.com/swlh/how-to-run-tensorflow-object-detection-model-on-jetson-nano-8f8c6d4352e8

I have installed TensorFlow 1.13.1 as per the instructions given at
https://docs.nvidia.com/deeplearning/sdk/tensorrt-install-guide/index.html

I started with a brand new image on the SD card.

I completed step 1 in the tutorial. I have a tensor_rt.pb frozen graph file, downloaded from the Colab notebook, that I am using.

Issue 1: It takes forever to run the get_frozen_graph() function. My graph file is only 23MB, but it does eventually execute.

Issue 2:
In step 2, all lines run OK (I split the code into different cells to find which one causes the issue) until the line that actually does the inference on the dogs.jpg image:

scores, boxes, classes, num_detections = tf_sess.run(
    [tf_scores, tf_boxes, tf_classes, tf_num_detections],
    feed_dict={tf_input: image[None, ...]})

The Jupyter notebook gets stuck with no error messages, and everything hangs: the mouse stops responding on the Jetson and keystrokes are unresponsive. I can only unplug and restart the Nano.

Any ideas?

Omkar

Probably insufficient power… try lowering the power consumption (and speed, sorry) of your Nano using nvpmodel -m 1 and try again (it will take longer, sorry). get_frozen_graph takes forever, but you can cache the frozen graph; I have posted an example somewhere. If you pass the first stage I can point you in the right direction.
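A minimal sketch of the caching idea mentioned above (the helper and file paths here are my own illustration, not the example from the forum or the tutorial): do the expensive build once, serialize the result to disk, and reload it on later runs. For a real TF frozen graph you would more naturally write the serialized GraphDef bytes to a .pb file, but the pattern is the same.

```python
import os
import pickle


def cached(cache_path, build_fn):
    """Return build_fn()'s result, loading it from cache_path if present.

    The first run pays the full cost and writes the cache; later runs
    just deserialize the saved bytes, which is much faster.
    """
    if os.path.exists(cache_path):
        with open(cache_path, "rb") as f:
            return pickle.load(f)
    result = build_fn()
    with open(cache_path, "wb") as f:
        pickle.dump(result, f)
    return result


# Hypothetical usage, wrapping the tutorial's slow step:
# frozen_graph = cached("/tmp/frozen_graph.cache", get_frozen_graph)
```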

OK, checking on that. I got frustrated and started rebuilding the SD card. So after I install TensorFlow as per the instructions (if I understood you correctly), I will run sudo nvpmodel -m 1, and this will set it to run in 5W mode instead of 10W mode?

Although, I am using the barrel jack connector to power the Jetson, and it's rated at 4 amps. So I don't understand why reducing power would help?

Oh, if you are running on a 4A power supply, my answer is probably wrong. Is the Nano shutting down or only hanging? If you have a console connected, watch the messages and also examine the syslog for any problems.
Sorry I can't help more… Oh wait! Do you have a swap file?

Not sure. I did not specifically set one up. All I did was flash the Jetson Nano Developer Kit image onto the SD card and install TensorFlow. So I need to set it up?.. Googling… Googling…

First paragraph!!! … lol… This should be part of the official instructions.

Yes, now after adding an 8GB swap file according to the instructions in that link, it is at least moving along. Thanks, Moshe. The swap file is the answer. You need it for inference even with TensorRT.

Glad I could help - it can be very frustrating… BTW, not that it matters to anything other than my ego, but you have accepted your own response as the answer :)