AWS IoT Core

I am interested in pushing inference data to AWS IoT Core and wanted to know when something like that will be released and integrated smoothly with DeepStream 5.0.

I am using AWS IoT Core with my DeepStream apps. Just code it - AWS provides very simple APIs in Python, C, and C++. Whenever I get a detection I send a message to AWS IoT. You can check their sample programs for the code to use.

I use a Python "controller" program to manage starting, stopping, and monitoring the memory usage of my DeepStream app. I send the message to IoT from Python, and it's very simple and (so far) very stable. Check here: https://github.com/aws/aws-iot-device-sdk-python-v2

The doco is thin at the moment, since they're rewriting it for the move from v1 to v2, but the sample programs show everything you need.

Thank you very much for your reply. When you get a detection from DeepStream 5.0 (such as the timestamp and bounding box coordinates), how are you connecting it to the IoT Python SDK?

Currently, I can edit the config file to write the detections in KITTI format to an output directory, but I am not sure how to connect that directly to the Python IoT SDK. Are you using the DeepStream Python SDK to do the stopping and controlling?

I am currently editing the text config files to run my DeepStream application, which works with the models I have trained with TLT 2.0.

I apologize for the influx of questions, but I have been trying to clear up some ambiguity and you have put me on the right path.

There are many ways. You can modify the deepstream-app code and send messages to IoT from the probe on the tracker's source pad; in that probe you have access to all the data, like bounding boxes.
In my case, since I spawn my DeepStream app from a Python controller program, I send the IoT messages from Python. You can get the bounding box info written out as files in KITTI format - a config setting makes that happen. Alternatively, just edit the code to save whatever data you like.
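For reference, the KITTI files are plain text with one object per line, and the bounding box sits in fields 4-7 (left, top, right, bottom). A rough sketch of turning one line into an IoT-ready payload (the message keys are just my choice, not anything DeepStream mandates):

```python
import json

def kitti_line_to_message(line, timestamp):
    """Parse one line of a KITTI label file into a dict ready for IoT.

    KITTI layout: class truncated occluded alpha left top right bottom ...
    """
    fields = line.split()
    return {
        "timestamp": timestamp,
        "class": fields[0],
        "bbox": {
            "left": float(fields[4]),
            "top": float(fields[5]),
            "right": float(fields[6]),
            "bottom": float(fields[7]),
        },
    }

# Example line in the shape deepstream writes:
msg = kitti_line_to_message(
    "car 0.0 0 0.0 100.0 150.0 300.0 400.0 0 0 0 0 0 0 0",
    "2020-06-01 12:00:00",
)
payload = json.dumps(msg)  # ready to publish to IoT
```

The resulting JSON string is exactly what you would hand to the publish call in the SDK samples.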

I am also interested in the Python solution. Does this "controller" program integrate with the DeepStream Python bindings?

Is there any example / documentation you can point me to where I can “spawn my deepstream app from a python controller program”?

Would this be an example of “spawning my deepstream app from a python controller program”:

cap = cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)

where gst_str can be:

gst_str = 'nvarguscamerasrc ! video/x-raw(memory:NVMM), width=%d, height=%d, format=(string)NV12, framerate=(fraction)%d/1 ! nvvidconv ! video/x-raw, width=(int)%d, height=(int)%d, format=(string)BGRx ! videoconvert ! appsink' % (3280, 3280, 21, 224, 224)

Nah - if you google the python subprocess module you’ll see what I mean.
I have a python program that sits in a while True loop waiting for commands (messages from aws iot).
When a start command comes in I use the python subprocess module to start the deepstream c program as a separate process.

Every few minutes I check the available memory on the system. If it gets below 10% I send a SIGINT to the DeepStream program to kill it, then restart it. This is because DeepStream has had some minor memory leaks that build up over time.

Then in the DeepStream C program I edited the code so that in the tracker probe, when a detection is made, it writes a file with the info in it.
The Python program watches that directory; when a file appears, it reads the details and sends a message to IoT.

I’m on a phone right now so can’t add any code. If you ping me back I can help with some example stuff tomorrow…


Thank you Jason, using the python subprocess module makes sense. I would love to see any examples of how you are using the python controller.

I am definitely noticing longer delays in proportion to the length of the video through the detection pipeline, which must be the memory leaks you mentioned.

You shouldn't notice any delays in processing speed until the free memory gets really low (< 5% in my experience). Also, if you run headless (no Ubuntu GUI) you get back heaps of memory - only 0.8 GB used on my Jetson Nano.

In Python, to start another app you can do this:

import subprocess as sp

process_obj = sp.Popen(["deepstream-app", "-c", "config.txt"])

You can spin off a separate thread to monitor this instance and restart it if memory is low or if it crashes.
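The monitor loop itself can be very small. A sketch (restart count capped so the example terminates - the real controller loops forever and would also trigger a restart on the memory check):

```python
import subprocess
import time

def keep_alive(cmd, poll_interval=0.1, max_restarts=3):
    """Relaunch cmd whenever the current instance exits.

    Capped at max_restarts so this sketch terminates; a real
    controller would loop forever and also restart the process
    when free memory runs low.
    """
    proc = subprocess.Popen(cmd)
    restarts = 0
    while restarts < max_restarts:
        time.sleep(poll_interval)
        if proc.poll() is not None:      # process exited (or crashed)
            proc = subprocess.Popen(cmd)
            restarts += 1
    proc.wait()
    return restarts
```

In real use you'd call something like keep_alive(["deepstream-app", "-c", "config.txt"]) in a threading.Thread, so the main loop stays free to handle IoT messages.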

Using the psutil library:

import psutil

memory_usage = dict(psutil.virtual_memory()._asdict())
print("Performing routine memory check. Using {:0.1f}%".format(memory_usage["percent"]))

If you wish to restart the process, send it a SIGINT:

import signal

process_obj.send_signal(signal.SIGINT)

This is the same as pressing Ctrl-C on the running app. deepstream-app will shut down nicely if you do this.
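One extra precaution worth adding: wait with a timeout and fall back to a hard kill in case the app ever hangs during shutdown (the 10 s value is arbitrary; "sleep 60" stands in for deepstream-app so the sketch runs anywhere):

```python
import signal
import subprocess
import time

proc = subprocess.Popen(["sleep", "60"])   # stand-in for deepstream-app
time.sleep(0.2)                            # let it get going

proc.send_signal(signal.SIGINT)            # same as pressing Ctrl-C
try:
    proc.wait(timeout=10)                  # give it time to flush and exit
except subprocess.TimeoutExpired:
    proc.kill()                            # SIGKILL if it refuses to die
    proc.wait()
```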

I’ve had apps running for months on end using this technique.

Sending a message with the AWS IoT Python SDK v2 is easy too. Here's an example of how I send a log message to enable remote monitoring:

import datetime
import json

from awscrt import mqtt

# config and mqtt_connection are set up elsewhere in the program
# (see the connection samples in the SDK repo)

def send_log_message(msg):
    currentDT = datetime.datetime.now()
    iot_message = {
        "adapterID": config["DEVICEID"],
        "timestamp": currentDT.strftime("%Y-%m-%d %H:%M:%S"),
        "message": msg
    }
    mqtt_connection.publish(
        topic="logs",
        payload=json.dumps(iot_message),
        qos=mqtt.QoS.AT_LEAST_ONCE
    )

For most comms, however, I use the thing shadow. The thing shadow sits on top of the MQTT messages and is basically just a plain JSON document that the service replicates between the cloud and your device.
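The shadow mechanics are easy to reason about offline: the document has a "desired" section (what you want the device to do) and a "reported" section (what it says it's doing), and the service pushes the difference to the device. A sketch of the shape (the keys inside are examples from my own setup, not anything AWS mandates):

```python
import json

# A device shadow is just JSON with "desired" and "reported" sections.
shadow = {
    "state": {
        "desired":  {"pipeline": "running", "log_level": "info"},
        "reported": {"pipeline": "stopped", "log_level": "info"},
    }
}

def shadow_delta(doc):
    """Keys whose desired value differs from the reported one --
    the same kind of delta AWS pushes to the device over MQTT."""
    desired = doc["state"]["desired"]
    reported = doc["state"]["reported"]
    return {k: v for k, v in desired.items() if reported.get(k) != v}

delta = shadow_delta(shadow)   # {"pipeline": "running"}
```

Your controller reacts to the delta (e.g. starts the pipeline), then updates "reported" to match, which clears the delta.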

You can check out the example programs here: https://github.com/aws/aws-iot-device-sdk-python-v2.


Thank you Jason for the help.