cuDNN installation location

Hello,
I have a problem building JAX related to the cuDNN installation location.
The deb and rpm packages and the containers provided on the official website install cuDNN under /usr (see the Arch Linux packaging merge request "Update to version 9.13.1 and changing the installation location (!1)" on GitLab). With that location, the Clang and GCC toolchains break because Bazel symlinks the cuDNN include directory (/usr/include) into the Bazel workspace. This issue (https://github.com/google-ml-infra/rules_ml_toolchain/issues/148) describes the problem.
However, according to the XLA maintainers in this comment (https://github.com/openxla/xla/issues/16866#issuecomment-2825417201), cuDNN should be shipped in the same location as CUDA (see also this comment: https://github.com/openxla/xla/issues/16866#issuecomment-3361582864).
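For context, the workaround I am currently considering is staging the deb/rpm-installed cuDNN files into a standalone prefix, so the build never has to symlink all of /usr/include. This is only a sketch: the directory layout is illustrative (here I fake the system layout under $PWD so the snippet is self-contained), and my understanding is that XLA's hermetic CUDA rules can consume such a prefix via the LOCAL_CUDNN_PATH environment variable — please correct me if that assumption is wrong.

```shell
# Fake the deb/rpm layout under $PWD for demonstration purposes only;
# on a real system these headers live in /usr/include.
SYS_INCLUDE="$PWD/fake-usr/include"
mkdir -p "$SYS_INCLUDE"
touch "$SYS_INCLUDE/cudnn.h" "$SYS_INCLUDE/cudnn_version.h"

# Stage cuDNN into a standalone, CUDA-style prefix (hypothetical location).
CUDNN_ROOT="$PWD/cudnn-root"
mkdir -p "$CUDNN_ROOT/include" "$CUDNN_ROOT/lib"
cp "$SYS_INCLUDE"/cudnn*.h "$CUDNN_ROOT/include/"

# Assumption: XLA's hermetic CUDA rules can be pointed at this prefix.
export LOCAL_CUDNN_PATH="$CUDNN_ROOT"
```

On a real machine the copy step would also cover the libcudnn shared libraries (e.g. from /usr/lib/x86_64-linux-gnu on Debian-based systems), not just the headers.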

What do you think? Am I missing something from the cuDNN installation guide on the official website (https://docs.nvidia.com/deeplearning/cudnn/installation/latest/linux.html)?