Hi, I trained an object detection model in DIGITS and loaded it with detectnet-console.py from jetson-inference, and everything works well. Then I used the APIs from detectnet-console.py to set up an inference server:
```python
import jetson.inference
import jetson.utils

class DeskV1:
    def __init__(self, model, proto, label):
        self.model = model
        self.proto = proto
        self.label = label

    def BuildNet(self):
        self.net = jetson.inference.detectNet(
            self.model,
            [f'--model={self.model}',
             f'--prototxt={self.proto}',
             f'--labels={self.label}'],
            0.5)

    def Inference(self, image):
        img, width, height = jetson.utils.loadImageRGBA(image)
        detections = self.net.Detect(img, width, height, overlay="box,labels")
        centers = [detection.Center for detection in detections]
        size = (width, height)
        del img
        return size, centers
```
My intent is to create a DeskV1 instance once and call Inference every time the server receives a new image. But every time Inference completes, memory usage increases by about 0.2 GB, so after a while the server crashes.
How can I free the memory after Inference is invoked? detectnet-camera.py sets up the network once and runs inference every time the camera captures an image, yet it never leaks memory. Why?
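My current suspicion (unverified) is that each call to jetson.utils.loadImageRGBA allocates a fresh CUDA-mapped buffer, and that `del img` only drops the Python reference without returning that buffer, whereas detectnet-camera.py keeps writing captures into the same buffer. Below is a pure-Python analogue of the two patterns, with no Jetson dependency; FakeAllocator, leaky_server, and reusing_server are all hypothetical names invented just for this sketch:

```python
class FakeAllocator:
    """Counts live 'device' buffers to mimic CUDA memory growth."""
    def __init__(self):
        self.live = 0

    def load_image(self):
        # A new buffer on every call, like loadImageRGBA is suspected to do;
        # this fake allocator never frees, mirroring the observed behaviour.
        self.live += 1
        return object()

def leaky_server(alloc, n_frames):
    # Allocate-per-request pattern: one new buffer per image.
    for _ in range(n_frames):
        img = alloc.load_image()
        del img  # drops the Python reference only; 'device' memory stays live

def reusing_server(alloc, n_frames):
    # Reuse pattern (what detectnet-camera.py appears to do with its
    # capture buffer): allocate once, overwrite the same buffer per frame.
    buf = alloc.load_image()
    for _ in range(n_frames):
        pass  # inference would write into buf each iteration
    return buf

alloc = FakeAllocator()
leaky_server(alloc, 100)
print(alloc.live)   # 100 buffers still live after 100 requests

alloc2 = FakeAllocator()
reusing_server(alloc2, 100)
print(alloc2.live)  # 1 buffer total, regardless of frame count
```

If that suspicion is right, a workaround might be to load into a single pre-allocated buffer instead of calling loadImageRGBA per request, but I don't know the intended API for that.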