Support for math kernel libraries (MKL)

Hi,

I would like to know what support is provided for math kernel libraries such as MUMPS, LAPACK, etc. on the DRIVE Pegasus, and how it stacks up against the MKL provided for the x86 architecture. Most optimizers, such as IPOPT, tend to use the aforementioned libraries. I couldn't find any information about this in the documentation. Any help is appreciated.

Hardware Platform: DRIVE AGX Pegasus™ Developer Kit

Dear @prajwal.ainapur,
If you are looking for support for math functions on the GPU, we provide CUDA libraries (https://docs.nvidia.com/cuda/index.html#cuda-api-references) such as cuBLAS, cuSPARSE, etc. Could you please check them?

Hi @SivaRamaKrishnaNV,

I went through the documents you shared, but what I am looking for is at the CPU level. Our use case focuses on convex optimization techniques, which in turn use the above-mentioned libraries in the back end. Please see the link below, which is the documentation for IPOPT.

https://coin-or.github.io/Ipopt/INSTALL.html#EXTERNALCODE
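For context, the dependency in question is on the CPU-side LAPACK/BLAS interface: IPOPT's linear-solver back ends (e.g. MUMPS) factor a linear system at every interior-point iteration through LAPACK routines. As a minimal illustration of that kind of call, here is a sketch using NumPy, whose `linalg` module wraps the same LAPACK interface; this assumes NumPy is available on the target, and on aarch64 it would typically be linked against a generic BLAS/LAPACK such as OpenBLAS rather than Intel MKL.

```python
# Sketch only: stands in for the LAPACK solves an optimizer like IPOPT
# performs each iteration. NumPy's np.linalg.solve dispatches to the
# LAPACK *gesv routines of whatever BLAS/LAPACK NumPy was built against.
import numpy as np

rng = np.random.default_rng(0)

# Build a symmetric positive-definite matrix, the typical shape of a
# (regularized) KKT system in an interior-point method.
M = rng.standard_normal((4, 4))
A = M @ M.T + 4 * np.eye(4)
b = rng.standard_normal(4)

# Solve A x = b via LAPACK under the hood.
x = np.linalg.solve(A, b)

# Verify the solution by checking the residual norm.
residual = np.linalg.norm(A @ x - b)
print(residual < 1e-10)
```

The performance question in the original post then reduces to which BLAS/LAPACK implementation such calls resolve to on the Pegasus CPU, since MKL itself is x86-only.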

Dear @prajwal.ainapur,
We do not ship any CPU-side math libraries on board. You may check the relevant library forums for support on aarch64.