NVEnc output corrupt when enableSubFrameWrite = 1

My encoder configuration:

m_stEncodeConfig.encodeCodecConfig.hevcConfig.sliceMode = 3u; // 3 = fixed number of slices per picture
m_stEncodeConfig.encodeCodecConfig.hevcConfig.sliceModeData = (uint32_t)m_stEncodeStreamInfo.nMaxSliceNum; // 4
m_stCreateEncodeParams.reportSliceOffsets = 1;
m_stCreateEncodeParams.enableSubFrameWrite = 1;
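
For context, these fields live in the usual NV_ENC_CONFIG / NV_ENC_INITIALIZE_PARAMS pair. A minimal sketch of how mine are wired up, assuming the standard NVENC init flow (the rest of my setup is omitted):

// Sketch: standard struct wiring around the four fields above.
m_stEncodeConfig.version = NV_ENC_CONFIG_VER;
m_stCreateEncodeParams.version = NV_ENC_INITIALIZE_PARAMS_VER;
m_stCreateEncodeParams.encodeConfig = &m_stEncodeConfig;
NVENCAPI_CALL_CHECK(m_pEncodeAPI->nvEncInitializeEncoder(m_hEncoder, &m_stCreateEncodeParams));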

Output-processing code:

        NV_ENC_LOCK_BITSTREAM lockBitstreamData;
        memset(&lockBitstreamData, 0, sizeof(lockBitstreamData));
        lockBitstreamData.version = NV_ENC_LOCK_BITSTREAM_VER;
        lockBitstreamData.outputBitstream = pEncodeBuffer->stOutputBfr.hBitstreamBuffer;
        lockBitstreamData.doNotWait = 1u; // non-blocking: poll instead of waiting for the full frame
        std::vector<uint32_t> arrSliceOffset(m_stEncodeConfig.encodeCodecConfig.hevcConfig.sliceModeData);
        lockBitstreamData.sliceOffsets = arrSliceOffset.data();
        while (true)
        {
            NVENCSTATUS status = m_pEncodeAPI->nvEncLockBitstream(m_hEncoder, &lockBitstreamData);
            auto tick = int(std::chrono::steady_clock::now().time_since_epoch().count() / 1000000); // timestamp for logging
            if (status != NVENCSTATUS::NV_ENC_SUCCESS)
                break;
            printf("%d bitstreamSizeInBytes = %u, numSlices = %u\n", tick, lockBitstreamData.bitstreamSizeInBytes, lockBitstreamData.numSlices);
            if (lockBitstreamData.hwEncodeStatus == 2) // 2 = hardware reports the frame encode complete
            {
                static std::ofstream of("slice.h265", std::ios::trunc | std::ios::binary);
                of.write((const char*)lockBitstreamData.bitstreamBufferPtr, lockBitstreamData.bitstreamSizeInBytes);
                of.flush();
                NVENCAPI_CALL_CHECK(m_pEncodeAPI->nvEncUnlockBitstream(m_hEncoder, lockBitstreamData.outputBitstream)); // unlock before leaving the loop
                break;
            }
            NVENCAPI_CALL_CHECK(m_pEncodeAPI->nvEncUnlockBitstream(m_hEncoder, lockBitstreamData.outputBitstream));
        }

Playing the bitstream:

ffplay -i slice.h265
Results:
  1. ffplay reports "Packet corrupt".
  2. arrSliceOffset[0] is always 255.
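
Observation 2 comes from dumping the offsets right after a successful lock, roughly like this sketch:

// Sketch: print every slice offset reported for the locked frame.
for (uint32_t i = 0; i < lockBitstreamData.numSlices; ++i)
    printf("slice %u offset = %u\n", i, arrSliceOffset[i]);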

I watched the output buffer memory in the VS debugger and compared it against the enableSubFrameWrite = 0 case: bitstreamSizeInBytes is smaller than the amount of valid data actually in the buffer. Is this a bug, or am I missing some detail? Can anyone tell me how to use enableSubFrameWrite correctly?
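
My current guess at the intended pattern is to consume the bitstream incrementally while the frame is still encoding, writing only the bytes that became available since the previous lock. Is something like this sketch (my assumption, using only the fields shown above) the correct usage?

// Sketch (assumption): with enableSubFrameWrite = 1, poll and append only
// the delta between successive locks instead of one write at the end.
uint32_t written = 0;
while (true)
{
    NVENCSTATUS status = m_pEncodeAPI->nvEncLockBitstream(m_hEncoder, &lockBitstreamData);
    if (status != NVENCSTATUS::NV_ENC_SUCCESS)
        break;
    if (lockBitstreamData.bitstreamSizeInBytes > written)
    {
        of.write((const char*)lockBitstreamData.bitstreamBufferPtr + written,
                 lockBitstreamData.bitstreamSizeInBytes - written);
        written = lockBitstreamData.bitstreamSizeInBytes;
    }
    bool frameDone = (lockBitstreamData.hwEncodeStatus == 2); // 2 = frame encode complete
    NVENCAPI_CALL_CHECK(m_pEncodeAPI->nvEncUnlockBitstream(m_hEncoder, lockBitstreamData.outputBitstream));
    if (frameDone)
        break;
}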