INT8: problems reading from the calibration cache on the UFF path

It looks like writeCalibrationCache() does not write scale factors for all layers, and because of that a re-run with the existing calibration cache file segfaults.
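
For reference, the cache is produced and consumed through the calibrator's read/write hooks. Here is a minimal sketch of that plumbing in Python (assuming IInt8EntropyCalibrator2; the batch feeding is elided and CacheOnlyCalibrator is a made-up name):

import os
import tensorrt as trt

class CacheOnlyCalibrator(trt.IInt8EntropyCalibrator2):
    # Sketch only: batch feeding is omitted, just the cache plumbing.
    def __init__(self, cache_path):
        trt.IInt8EntropyCalibrator2.__init__(self)
        self.cache_path = cache_path

    def get_batch_size(self):
        return 1

    def get_batch(self, names):
        # Real code returns a list of device pointers for the next batch;
        # returning None tells TensorRT calibration data is exhausted.
        return None

    def read_calibration_cache(self):
        # Called on a re-run: every tensor scale the builder later looks up
        # must already be present in the bytes returned here.
        if os.path.exists(self.cache_path):
            with open(self.cache_path, "rb") as f:
                return f.read()
        return None

    def write_calibration_cache(self, cache):
        # Called after the first calibration pass; TensorRT hands over the
        # serialized scales it computed. The cache shown below suggests some
        # layers never make it into this buffer.
        with open(self.cache_path, "wb") as f:
            f.write(cache)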
Here is the result of a first run with strictTypeConstraints set to true. I get some warnings like:

Warning: no implementation of conv2d_46/convolution + activation_48/Relu obeys the requested constraints, using a higher precision type
Warning: no implementation of (Unnamed Layer* 265) [Shuffle] + policy/Reshape obeys the requested constraints, using a higher precision type
Warning: no implementation of value/MatMul obeys the requested constraints, using a higher precision type
Warning: no implementation of value/BiasAdd obeys the requested constraints, using a higher precision type
Warning: no implementation of value/Softmax_HL_1129584897 obeys the requested constraints, using a higher precision type
Warning: no implementation of value/Softmax obeys the requested constraints, using a higher precision type
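
For context, the engine build that produces these warnings looks roughly like this (a sketch of the TensorRT 5-era UFF path; the input shapes and workspace size are placeholders, not values from this post, while the tensor names main_input, aux_input, value/Softmax, and policy/Reshape come from the logs and cache below):

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_int8_engine(uff_path, calibrator):
    with trt.Builder(TRT_LOGGER) as builder, \
         builder.create_network() as network, \
         trt.UffParser() as parser:
        # Placeholder shapes; only the tensor names are from the post.
        parser.register_input("main_input", (12, 8, 8))
        parser.register_input("aux_input", (32,))
        parser.register_output("value/Softmax")
        parser.register_output("policy/Reshape")
        parser.parse(uff_path, network)

        builder.max_batch_size = 1
        builder.max_workspace_size = 1 << 30
        builder.int8_mode = True
        builder.int8_calibrator = calibrator
        builder.strict_type_constraints = True
        return builder.build_cuda_engine(network)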

These warnings do not cause a problem on the first run, which produced the cache file shown at the bottom.
Note that scale factors for many layers, such as the convolutions, are missing from the cache file.
The network is a ResNet in UFF format. I don't know whether the missing conv entries are the problem, though, because the re-run fails on the value/Softmax_HL layer:

terminate called after throwing an instance of 'std::runtime_error'
  what():  Could not find tensor value/Softmax_HL_1307127257 in tensorScales. 
Aborted
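
One way to confirm which tensors actually have scales is to decode the cache directly (a sketch, assuming each line after the leading "1" is tensor_name: hex-encoded float32 bits, which matches the file below; "calibration.cache" is a placeholder path):

import struct

def load_scales(path):
    scales = {}
    with open(path) as f:
        next(f)  # skip the leading "1" version line
        for line in f:
            if not line.strip():
                continue
            name, hexbits = line.strip().rsplit(": ", 1)
            # The value is the big-endian hex of the float32 scale,
            # e.g. 3c010a14 decodes to roughly 1/127.
            scales[name] = struct.unpack(">f", bytes.fromhex(hexbits))[0]
    return scales

scales = load_scales("calibration.cache")
print(sorted(scales))  # no conv2d_*/convolution tensors show up here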

I also get similar errors if I delete one line from the cache file produced by sampleINT8, which uses a Caffe network.

Note that the numeric suffix in value/Softmax_HL_xxx differs between the first run and the re-run.
The relevant Keras code for this layer is:

from keras.layers import Flatten, Dense, Concatenate

# vx, y, and NVALUE are defined earlier in the model (elided here).
x = Flatten()(vx)
x = Dense(256, activation='tanh')(x)
y = Dense(32, activation='tanh')(y)
x = Concatenate()([x, y])
x = Dense(32, activation='tanh')(x)
value = Dense(NVALUE, activation='softmax', name='value')(x)
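
For completeness, the frozen Keras graph would then be converted with the uff converter along these lines (a sketch; the file paths are made up, and the output-node list is a guess based on the tensor names in the logs above):

import uff

# Hypothetical paths; value/Softmax and policy/Reshape are taken from
# the warnings and cache entries in this post.
uff.from_tensorflow_frozen_model(
    "frozen_model.pb",
    output_nodes=["value/Softmax", "policy/Reshape"],
    output_filename="model.uff",
)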

The calibration cache file:

1
value/BiasAdd: 3d35fc36
activation_36/Relu: 3d6400c9
activation_37/Relu: 3de99f8d
activation_28/Relu: 3d13720a
value/Softmax_HL_1129584897: 3c0190d6
activation_25/Relu: 3dc3c5a7
activation_27/Relu: 3db7c8e6
activation_29/Relu: 3d5c8aa1
value/Softmax: 3c0190d6
aux_input: 3dc18f1e
activation_30/Relu: 3d266675
activation_44/Relu: 3d55f3e1
dense_9/Tanh: 3c010a14
activation_47/Relu: 3dca193f
activation_48/Relu: 3e100849
dense_8/MatMul: 3d89357d
flatten_3/Reshape: 3c014fa1
activation_34/Relu: 3d46a6ae
activation_33/Relu: 3d69da60
activation_38/Relu: 3de56242
activation_32/Relu: 3d4519b3
dense_9/MatMul: 3e116a77
activation_43/Relu: 3d488873
activation_35/Relu: 3db2e4a2
activation_31/Relu: 3d8a8ae8
activation_40/Relu: 3dbf79af
activation_41/Relu: 3d813275
dense_8/Tanh: 3c010a14
activation_42/Relu: 3dbcb9cf
main_input: 3c010a14
activation_24/Relu: 3d011362
activation_39/Relu: 3db668b1
activation_23/Relu: 3dcd90ff
policy/Reshape: 3e100849
value/MatMul: 3d42f828
activation_26/Relu: 3d08977a
activation_45/Relu: 3db0ad91
dense_7/MatMul: 3d30ff1a
dense_7/Tanh: 3c010a14

Thank you for letting us know. Engineering is triaging.