I am wondering whether it is possible to run inference on several datasets with one specs file. It's a bit annoying to spin up the Docker container each time (about 3 minutes of startup) for just a couple of minutes of inference. Putting all the data in the same folder is not an option, as it really mixes things up for me, especially when naming the outputs.
inference_config {
  images_dir: '/workspace/tao-experiments/data/training/Lightly/testing/image_2'
  batch_size: 1
  bbox_caption_on: True
  detection_image_output_dir: '/workspace/inf/inference_results_imgs'
  labels_dump_dir: '/workspace/inf/inference_dump_labels'
  rpn_pre_nms_top_N: 6000
  rpn_nms_max_boxes: 300
  rpn_nms_overlap_threshold: 0.9
  object_confidence_thres: 0.0001
  bbox_visualize_threshold: 0.6
  classifier_nms_max_boxes: 100
  classifier_nms_overlap_threshold: 0.0001
}
I am using FasterRCNN.
For instance, is it possible to have two inference_config entries? Or several images_dir, detection_image_output_dir, and labels_dump_dir fields in one entry?
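In case there is no spec-level option for this, here is the workaround I am considering: a small script run inside a single container session that swaps the three per-dataset paths in the spec via plain string replacement and launches inference once per dataset. The spec path, dataset names, output layout, and the inference command itself are all assumptions for my setup, not something from the docs; the command is just a placeholder for whatever my TAO version actually uses.

# Sketch: reuse one spec file for several datasets inside one container session.
# BASE_SPEC, DATASETS, output paths, and the command are placeholders for my setup.
import subprocess
from pathlib import Path

BASE_SPEC = Path("/workspace/specs/frcnn_infer.txt")  # spec containing the inference_config above
DATASETS = {
    "testing": "/workspace/tao-experiments/data/training/Lightly/testing/image_2",
    # "validation": "/workspace/tao-experiments/data/training/Lightly/validation/image_2",
}

for name, images_dir in DATASETS.items():
    spec = BASE_SPEC.read_text()
    # Swap the three per-dataset fields so each run reads its own images
    # and writes to its own output folders.
    spec = spec.replace(
        "/workspace/tao-experiments/data/training/Lightly/testing/image_2", images_dir)
    spec = spec.replace("/workspace/inf/inference_results_imgs",
                        f"/workspace/inf/{name}/inference_results_imgs")
    spec = spec.replace("/workspace/inf/inference_dump_labels",
                        f"/workspace/inf/{name}/inference_dump_labels")
    per_dataset_spec = BASE_SPEC.with_name(f"frcnn_infer_{name}.txt")
    per_dataset_spec.write_text(spec)
    # Placeholder: substitute the exact FasterRCNN inference command for your TAO version.
    subprocess.run(["faster_rcnn", "inference", "-e", str(per_dataset_spec)], check=True)

Would something like this be the intended way to do it, or is there a cleaner option directly in the spec?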
Best