D3D12 - Can't use a 16-bit index buffer for indirect drawing on NVIDIA GPUs.

I'm using JJoosten's DX sample
(see Samples/Desktop/D3D12ExecuteIndirect).

It's a simple modification of Microsoft's sample
that uses ExecuteIndirect() with a 32-bit index buffer.

I tried to modify it to use a 16-bit index buffer, but nothing is drawn.
The R16 IB works properly with DrawIndexedInstanced(), so I'm sure the index data itself is correct.
Also, both WARP and Intel HD Graphics draw properly via ExecuteIndirect() when using the R16 IB.
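For reference, the change I made relative to the sample is roughly the following (a sketch, not a full listing: resource creation and upload are elided, and the variable names here are mine, not the sample's):

```cpp
// Index buffer view: the only intended change is the format and stride,
// switching from 32-bit to 16-bit indices.
// (m_indexBuffer is assumed to already contain uint16_t index data.)
D3D12_INDEX_BUFFER_VIEW ibView = {};
ibView.BufferLocation = m_indexBuffer->GetGPUVirtualAddress();
ibView.SizeInBytes    = indexCount * sizeof(uint16_t);   // was sizeof(uint32_t)
ibView.Format         = DXGI_FORMAT_R16_UINT;            // was DXGI_FORMAT_R32_UINT
commandList->IASetIndexBuffer(&ibView);

// The command signature and the indirect argument buffer are unchanged:
// the arguments are still D3D12_DRAW_INDEXED_ARGUMENTS, interpreted
// against whatever index buffer is currently bound.
D3D12_DRAW_INDEXED_ARGUMENTS args = {};
args.IndexCountPerInstance = indexCount;
args.InstanceCount         = 1;
args.StartIndexLocation    = 0;
args.BaseVertexLocation    = 0;
args.StartInstanceLocation = 0;
```

As far as I can tell, nothing in the command signature refers to the index format, so I'd expect the bound IB view's DXGI_FORMAT_R16_UINT to be honored by the indirect draw.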

Tested GPUs: GTX 1070 Ti, GTX 1080.
Driver: 388.13.

How can I solve this problem?
With a 32-bit index buffer, our models use twice the index memory.
I hope ExecuteIndirect() can work with 16-bit index buffers as well.

Squall Liu