The enqueue() method has been deprecated when used with engines built from a network created with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag

Description

A clear and concise description of the bug or issue.

Environment

TensorRT Version: 8.4.1.5
GPU Type: 1660 Ti
Nvidia Driver Version: 515.65.01
CUDA Version: 11.7
CUDNN Version: 8.5.0
Operating System + Version: Linux 5.15.0-46-generic #49~20.04.1-Ubuntu
Python Version (if applicable):
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):

Relevant Files

Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)

Steps To Reproduce

Please include:

  • Exact steps/commands to build your repro
  • Exact steps/commands to run your repro
  • Full traceback of errors encountered

I converted a model with the /usr/src/tensorrt/bin/trtexec command successfully. But when I run the project, every inference prints these messages:


[W] [TRT] The enqueue() method has been deprecated when used with engines built from a network created with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag. Please use enqueueV2() instead.

How can I disable these messages? My project is written in C++.

Hi,

You can adjust the logger severity level to filter out warning messages. Please refer to the following doc.
https://docs.nvidia.com/deeplearning/tensorrt/api/c_api/classnvinfer1_1_1_i_logger.html

Thank you.


Changing class Logger : public nvinfer1::ILogger as follows solved it for me:

// Requires <fstream>, <iostream>, nlohmann/json.hpp (for json), and the
// TensorRT sample logging header that provides LogStreamConsumer.
class Logger : public nvinfer1::ILogger
{
public:
    nvinfer1::ILogger::Severity profile_severity_level;

    Logger(Severity severity = Severity::kWARNING)
        : mReportableSeverity(severity)
    {
        // Read the severity threshold (0 = kINTERNAL_ERROR ... 4 = kVERBOSE)
        // from the configuration file.
        json config_json;
        std::ifstream config("../../Configuration.json");
        config >> config_json;

        profile_severity_level =
            static_cast<Severity>(config_json["ProfileSeverityLevel"].get<int>());
    }

    // Severity is ordered kINTERNAL_ERROR(0) < kERROR(1) < kWARNING(2)
    // < kINFO(3) < kVERBOSE(4), so the per-level switch collapses to one
    // comparison: log a message only if it is at least as severe as the
    // configured threshold.
    void log(Severity severity, const char* msg) noexcept override
    {
        if (static_cast<int>(severity) <= static_cast<int>(profile_severity_level))
        {
            std::cout << msg << std::endl;
            LogStreamConsumer(mReportableSeverity, severity)
                << "[TRT] " << msg << std::endl;
        }
    }

private:
    Severity mReportableSeverity;
};
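For reference, the Configuration.json file the constructor reads needs an integer under the "ProfileSeverityLevel" key. A minimal example (the value 1 corresponds to kERROR, which suppresses the deprecation warnings; the rest of the file's contents are up to your project):

```json
{
    "ProfileSeverityLevel": 1
}
```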

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.