I want to host a small LLM using TensorRT-LLM on an NVIDIA A10 GPU.
tensorrt-llm has tensorrt as a dependency.
The tensorrt wheel on PyPI lists its license as "Other/Proprietary License", but the GitHub repository (NVIDIA/TensorRT) says Apache 2.0, with the caveat that the repo contains only the open-source components of TensorRT.
Can I pip install tensorrt-llm along with tensorrt and use them for commercial purposes, or do I need a license?
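
For reference, here is a minimal sketch of how the license metadata embedded in the installed wheels can be inspected (assuming the distribution names are `tensorrt` and `tensorrt_llm`; adjust if your index publishes them under different names):

```python
# Print the License field and any License classifiers recorded in the
# installed wheels' metadata, using only the standard library.
from importlib.metadata import metadata, PackageNotFoundError

for pkg in ("tensorrt", "tensorrt_llm"):
    try:
        meta = metadata(pkg)
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
        continue
    print(f"{pkg}: License = {meta.get('License')}")
    for classifier in meta.get_all("Classifier") or []:
        # e.g. "License :: Other/Proprietary License"
        if classifier.startswith("License"):
            print(f"  {classifier}")
```

This only reports what the wheel metadata declares; the authoritative terms are whatever license text ships with the packages (and NVIDIA's SDK license for the proprietary parts), so the legal question above still stands.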