Why can't NVENC encode HEVC Main10 on a GTX 1050 Ti?

The NVIDIA VIDEO CODEC SDK | NVIDIA Developer page indicates that the 1050 Ti supports HEVC Main10, but my test failed.
When I run “.\AppEncCuda.exe -h”, it shows:

Encoder Capability
#  GPU                  H264 H264_444 H264_ME H264_WxH  HEVC HEVC_Main10 HEVC_Lossless HEVC_SAO HEVC_444 HEVC_ME HEVC_WxH
0  GeForce GTX 1050 Ti    +      +       +    4096x4096   +       +            +           +        +       +    8192x8192

Did you debug through the code to see where it’s failing? 1050 Ti should support HEVC main 10 encoding.
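You can also verify the capability directly through the NVENC C API rather than relying on the sample's -h output. Below is a minimal sketch, assuming the Video Codec SDK headers and the CUDA driver API are available; error handling is omitted, and a real program should check every CUresult and NVENCSTATUS:

#include <cuda.h>
#include <stdio.h>
#include "nvEncodeAPI.h"

int main(void)
{
    // Create a CUDA context on GPU 0 to back the encode session.
    cuInit(0);
    CUdevice dev = 0;
    cuDeviceGet(&dev, 0);
    CUcontext ctx = NULL;
    cuCtxCreate(&ctx, 0, dev);

    // Load the NVENC entry points.
    NV_ENCODE_API_FUNCTION_LIST nvenc = { NV_ENCODE_API_FUNCTION_LIST_VER };
    NvEncodeAPICreateInstance(&nvenc);

    // Open an encode session on the CUDA context.
    NV_ENC_OPEN_ENCODE_SESSION_EX_PARAMS sessionParams = { NV_ENC_OPEN_ENCODE_SESSION_EX_PARAMS_VER };
    sessionParams.device = ctx;
    sessionParams.deviceType = NV_ENC_DEVICE_TYPE_CUDA;
    sessionParams.apiVersion = NVENCAPI_VERSION;
    void *encoder = NULL;
    nvenc.nvEncOpenEncodeSessionEx(&sessionParams, &encoder);

    // Ask whether HEVC 10-bit encoding is supported on this GPU.
    NV_ENC_CAPS_PARAM capsParam = { NV_ENC_CAPS_PARAM_VER };
    capsParam.capsToQuery = NV_ENC_CAPS_SUPPORT_10BIT_ENCODE;
    int supported = 0;
    nvenc.nvEncGetEncodeCaps(encoder, NV_ENC_CODEC_HEVC_GUID, &capsParam, &supported);
    printf("HEVC 10-bit encode supported: %d\n", supported);

    nvenc.nvEncDestroyEncoder(encoder);
    cuCtxDestroy(ctx);
    return 0;
}

If this prints 1, the hardware capability is there and the problem is in how the input frames or encode parameters are set up.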

Sorry to bother you. I have carefully checked the code, and my machine does support 10-bit encoding. But the output I get when encoding with NVENC is all green (both 0).

My settings:
1. AppEncCuda.exe -i xxx -o xxx -s 2048x788 -if p010 -gpu 0 -codec hevc -preset default -profile main10 -rc vbr_hq -fps 25 -gop 150 -qmin 1 -qmax 10

2. 16-bit image input in which the first 6 bits of each sample are set to zero.

Is there a problem with my settings? Thank you very much for your reply.

Are you setting 6 MSBs or 6 LSBs to 0? P010 expects 6 LSBs to be 0.
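In other words, P010 stores each 10-bit sample in the upper 10 bits of a 16-bit word, so the valid data has to be shifted up and the 6 low bits left at zero. A minimal sketch of that packing (pack_p010_sample is a hypothetical helper for illustration, not part of the SDK):

#include <stdint.h>

// Pack one 10-bit sample (0..1023) into a P010 16-bit word:
// the 10 significant bits go into the MSBs, the 6 LSBs stay zero.
static inline uint16_t pack_p010_sample(uint16_t tenBit)
{
    return (uint16_t)((tenBit & 0x3FF) << 6);
}

// Example: converting a whole plane of 10-bit samples.
// for (size_t i = 0; i < width * height; ++i)
//     dst[i] = pack_p010_sample(src10[i]);

If the data instead sits in the low 10 bits with the MSBs zeroed, the encoder effectively sees near-black input, which typically shows up as a green frame.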

This is really the best news I've had in days! Thank you very much for your advice! I had been zeroing the wrong bits; after modifying the input so that the 6 LSBs are zero, the encode now works! One more question: what are the main differences between the 10-bit and the 8-bit encoding parameters on the GPU? (So far I have only ever set 8-bit parameters.) If it's convenient, I would appreciate any suggestions or links.
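For what it's worth, my current understanding (corrections welcome, I have not verified this against the SDK documentation) is that the 10-bit path mainly differs in three places: the input buffer format, the HEVC profile GUID, and the HEVC bit-depth field. Roughly, in terms of the NVENC API:

#include "nvEncodeAPI.h"

// Sketch only: the fields I believe change when moving from 8-bit to 10-bit HEVC.
void configure_hevc_main10(NV_ENC_INITIALIZE_PARAMS *initParams, NV_ENC_CONFIG *encodeConfig)
{
    // 1. Input frames are P010 instead of NV12
    //    (used when creating/registering the input buffers).
    NV_ENC_BUFFER_FORMAT bufferFormat = NV_ENC_BUFFER_FORMAT_YUV420_10BIT;  // P010
    (void)bufferFormat;

    // 2. HEVC Main10 profile instead of Main.
    encodeConfig->profileGUID = NV_ENC_HEVC_PROFILE_MAIN10_GUID;

    // 3. Pixel bit depth of 10, stored as "minus 8".
    encodeConfig->encodeCodecConfig.hevcConfig.pixelBitDepthMinus8 = 2;

    initParams->encodeConfig = encodeConfig;
}

The rest of the rate-control and GOP settings should carry over from the 8-bit case.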