AINVR issues with recorded vst videos

Please provide complete information as applicable to your setup.

• Hardware Platform: Jetson Orin NX 16GB
• DeepStream Version: 7.0
• JetPack Version: 6.0 (L4T 36.3)

In VST I’m trying to use the recorded videos. The problem is that the FPS changes constantly and the video is recorded that way. I need the recorded videos for the following process:

  • Fetch the metadata saved by deepstream and emdx for a certain timerange
  • Match the timestamp of each object in the metadata with the video that corresponds to it (generally for a narrow timerange it’s only one video)
  • Retrieve the video and open it with CV2, fetch the frame that is in the timestamp of the metadata and draw the bboxes on it.
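The matching step above (metadata timestamp → recording that contains it) can be sketched like this. This is only an illustration; `SEGMENTS`, the filenames, and `find_segment` are hypothetical, assuming you can get each clip's start time and duration from the VST recordings API:

```python
from datetime import datetime, timedelta

# Hypothetical segment index: (filename, recording start, duration).
# In practice these values would come from the VST recordings listing.
SEGMENTS = [
    ("clip_0001.mp4", datetime(2024, 6, 1, 10, 0, 0), timedelta(minutes=1)),
    ("clip_0002.mp4", datetime(2024, 6, 1, 10, 1, 0), timedelta(minutes=1)),
]

def find_segment(ts):
    """Return (filename, offset_into_clip) for the clip whose recording
    window contains timestamp ts, or None if no clip covers it."""
    for name, start, length in SEGMENTS:
        if start <= ts < start + length:
            return name, ts - start
    return None
```

The returned offset is then the timestamp to seek to inside that clip.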

My problem is that when the FPS changes, I can’t get the right frame to draw the bboxes on, so the bboxes end up offset. This is how I get the frame:

import cv2

def extract_frame_opencv(video_file, timestamp, output_image):
    """
    Extract a frame from the video at a specific timestamp using OpenCV.

    Parameters:
    - video_file (str): The path to the video file.
    - timestamp (timedelta): The timestamp at which to extract the frame.
    - output_image (str): The output path where the frame will be saved.

    Returns:
    - bool: True if the frame was successfully extracted and saved, False otherwise.
    """
    # Open the video file
    cap = cv2.VideoCapture(video_file)
    if not cap.isOpened():
        print(f"Error: Could not open video file {video_file}")
        return False

    # Get the frames per second of the video to calculate the frame number.
    # Note: for a variable-frame-rate recording this is only an average,
    # so the computed frame index can be off.
    fps = cap.get(cv2.CAP_PROP_FPS)
    if fps <= 0:
        print("Error: Could not determine FPS.")
        cap.release()
        return False
    frame_number = int(fps * timestamp.total_seconds())  # Calculate frame index at the timestamp
    print(f"Timestamp: {timestamp}, FPS: {fps}, Frame number: {frame_number}")

    cap.set(cv2.CAP_PROP_POS_FRAMES, frame_number)

    ret, frame = cap.read()
    cap.release()
    if ret:
        cv2.imwrite(output_image, frame)
        print(f"Frame at {timestamp} saved as {output_image}")
        return True
    print(f"Error: Could not read frame at {timestamp}.")
    return False

Is there a way to set a fixed FPS, or any parameter in VST that can help in my case?
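One possible workaround, if the recorder itself can't be pinned to a fixed rate: re-encode the downloaded clip to constant frame rate before indexing, so that `frame_number = fps * t` holds exactly. A sketch that builds the ffmpeg command line (assumes ffmpeg with libx264 is installed; filenames are placeholders):

```python
def build_cfr_command(src, dst, fps=30):
    """Build an ffmpeg argv that re-encodes src to a constant-frame-rate copy.
    The fps filter duplicates/drops frames so every frame lasts exactly 1/fps s."""
    return [
        "ffmpeg", "-i", src,
        "-vf", f"fps={fps}",   # force constant frame rate
        "-c:v", "libx264",
        "-preset", "veryfast",
        dst,
    ]

# e.g. subprocess.run(build_cfr_command("clip.mp4", "clip_cfr.mp4"), check=True)
```

The trade-off is an extra transcode per clip, but afterwards the simple frame-index seek in the function above becomes reliable.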

Is it possible to use cv2.CAP_PROP_POS_MSEC to read video frame based on timestamp?

I actually tried that, and I’m getting the same result. I think the issue is that the timestamp at which the frame was recorded differs from the timestamp in the frame’s metadata, which makes it impossible to match the two. I tried my code with a low-resolution video (640x420) and the gap was smaller; I set an offset of 200 ms:

timestamp_ms = int(timestamp.total_seconds() * 1000) - int(TIME_OFFSET)
cap.set(cv2.CAP_PROP_POS_MSEC, timestamp_ms)

I got the right frame, but when I try with a higher resolution the gap gets bigger.
Note that all of my services run on the same network, so there shouldn’t be much delay, right?
Also note that I’m getting the video from NVStreamer to VST to DeepStream, so I don’t know what’s causing this delay.
Here’s my VST config:

vst_config.json: >
    {
        "network":
        {
                "http_port":"81",
                "server_domain_name":"",
                "stunurl_list": ["stun.l.google.com:19302","stun1.l.google.com:19302"],
                "static_turnurl_list": [],
                "use_coturn_auth_secret": false,
                "coturn_turnurl_list_with_secret": [],
                "use_twilio_stun_turn": false,
                "twilio_account_sid": "",
                "twilio_auth_token": "",
                "use_reverse_proxy": false,
                "reverse_proxy_server_address": "REVERSE_PROXY_SERVER_ADDRESS:100",
                "ntp_servers": [],
                "use_sensor_ntp_time": false,
                "max_webrtc_out_connections": 8,
                "max_webrtc_in_connections": 8,
                "webservice_access_control_list":"",
                "rtsp_server_port": -1,
                "rtsp_server_instances_count": 1,
                "rtsp_preferred_network_iface":"eth0",
                "rtsp_in_base_udp_port_num": -1,
                "rtsp_out_base_udp_port_num": -1,
                "rtsp_streaming_over_tcp": false,
                "rtsp_server_reclamation_client_timeout_sec": 10,
                "rx_socket_buffer_size":2000000,
                "tx_socket_buffer_size":2000000,
                "stream_monitor_interval_secs": 2,
                "rtp_udp_port_range" : "31000-31100",
                "udp_latency_ms": 200,
                "udp_drop_on_latency": false,
                "webrtc_latency_ms": 1000,
                "enable_frame_drop": false,
                "webrtc_peer_conn_timeout_sec": 10,
                "webrtc_max_birate": 10000,
                "webrtc_min_birate": 2000,
                "webrtc_start_birate": 4000,
                "enable_grpc": true,
                "grpc_server_port": "50051",
                "webrtc_in_audio_sender_max_bitrate": 128000,
                "webrtc_in_video_degradation_preference": "resolution",
                "webrtc_in_video_sender_max_framerate": 30,
                "remote_vst_address": "",
                "webrtc_port_range": {"min":31100, "max":31200},
                "enable_websocket_pingpong": false,
                "websocket_keep_alive_ms": 5000
        },
        "onvif":
        {
                "device_discovery_timeout_secs":50,
                "onvif_request_timeout_secs":20,
                "device_discovery_freq_secs":15,
                "device_discovery_interfaces": ["eth1"],
                "max_devices_supported": 16,
                "default_bitrate_kbps": 8000,
                "default_framerate": 30,
                "default_resolution": "1920x1080",
                "default_gov_length": 60
        },
        "data":
        {
                "storage_config_file": "./configs/vst_storage.json",
                "storage_threshold_percentage": 95,
                "storage_monitoring_frequency_secs": 2,
                "nv_streamer_directory_path": "/home/vst/vst_release/streamer_videos/",
                "nv_streamer_loop_playback":true,
                "nv_streamer_seekable":false,
                "nv_streamer_sync_playback":false,
                "nv_streamer_max_upload_file_size_MB": 10000,
                "nv_streamer_media_container_supported": ["mp4","mkv"],
                "nv_streamer_metadata_container_supported": ["json"],
                "nv_streamer_rtsp_server_output_buffer_size_kb": 1000,
                "supported_video_codecs": ["h264", "h265"],
                "supported_audio_codecs": ["pcmu","pcma","mpeg4-generic"],
                "enable_aging_policy": false,
                "max_video_download_size_MB":1000,
                "always_recording": true,
                "event_recording": false,
                "event_record_length_secs": 10,
                "record_buffer_length_secs": 2,
                "use_software_path": false,
                "use_webrtc_inbuilt_encoder": "",
                "webrtc_in_fixed_resolution": "1280x720",
                "webrtc_in_max_framerate": 30,
                "webrtc_in_video_bitrate_thresold_percentage": 50,
                "webrtc_in_passthrough": false,
                "webrtc_sender_quality": "pass_through",
                "enable_rtsp_server_sei_metadata": false,
                "enable_proxy_server_sei_metadata": false,
                "gpu_indices" : [],
                "webrtc_out_enable_insert_sps_pps" : true,
                "webrtc_out_set_iframe_interval" : 30,
                "webrtc_out_set_idr_interval" : 256,
                "webrtc_out_min_drc_interval" : 5,
                "webrtc_out_encode_fallback_option" : "software",
                "device_name" : "VST",
                "device_location" : "Pune",
                "enable_dec_low_latency_mode": false,
                "enable_avsync_udp_input": true,
                "use_standalone_udp_input": false,
                "enable_silent_audio_in_udp_input": false,
                "enable_udp_input_dump": false,
                "webrtc_out_default_resolution": "1920x1080",
                "use_webrtc_hw_dec": true,
                "recorder_enable_frame_drop": false,
                "recorder_max_frame_queue_size_bytes": 16000000,
                "webrtc_out_enc_quality_tuning": "ultra_low_latency",
                "webrtc_out_enc_preset": "ultra_fast",
                "enable_drc": true,
                "enable_ipc_path": false,
                "ipc_socket_path": "/tmp/",
                "ipc_src_buffer_timestamp_copy": true,
                "ipc_src_connection_attempts": 5,
                "ipc_src_connection_interval_us": 1000000,
                "ipc_sink_buffer_timestamp_copy": true,
                "ipc_sink_buffer_copy": true,
                "use_external_peerconnection": false
        },
        "notifications":
        {
                "enable_notification": true,
                "use_message_broker" : "redis",
                "message_broker_topic": "vst.event",
                "message_broker_payload_key": "sensor.id",
                "message_broker_metadata_topic": "test",
                "redis_server_env_var": "redis:6379",
                "kafka_server_address": "10.0.0.1:9092"
        },
        "debug":
        {
                "enable_perf_logging":true,
                "enable_qos_monitoring":true,
                "qos_logfile_path":"/opt/vst_release/webroot/log/",
                "qos_data_capture_interval_sec":1,
                "qos_data_publish_interval_sec":5,
                "enable_gst_debug_probes":true,
                "enable_prometheus":false,
                "prometheus_port": "8080",
                "enable_highlighting_logs":true,
                "enable_debug_apis": true,
                "dump_webrtc_input_stats": false,
                "enable_frameid_in_webrtc_stream": false,
                "enable_network_bandwidth_notification" : false,
                "enable_latency_logging": true,
                "enable_loopback_multicast": false
        },
        "overlay":
        {
                "video_metadata_server": "",
                "video_metadata_query_batch_size_num_frames": 300,
                "use_video_metadata_protobuf": false,
                "enable_gem_drawing": true,
                "analytic_server_address": "https://llm.namla.ai/emdx",
                "overlay_text_font_type": "DejaVuSansMono.ttf"
        },
        "security":
        {
                "use_https": false,
                "use_rtsp_authentication": false,
                "use_http_digest_authentication": false,
                "use_multi_user": false,
                "enable_user_cleanup": false,
                "session_max_age_sec": 2592000,
                "multi_user_extra_options": ["Secure", "SameSite=none"],
                "nv_org_id": "",
                "nv_ngc_key": ""
        }
    }
  vst_storage.json: |
    {
        "data_path": "./vst_data/",
        "video_path": "./vst_video/",
        "total_video_storage_size_MB": 10000
    }

Please have a try with JPS 2.0: Release Notes (version 2.0) — Jetson Platform Services documentation. Could you share a log or screenshot showing the timestamp match/mismatch between the recorded video and the metadata?