Dropout + TLT

The dropout_rate parameter is not working in "model_config".
How can I use this parameter?

Example:

model_config {
  arch: "resnet"
  pretrained_model_file: <path_to_model_file>
  freeze_blocks: 0
  freeze_blocks: 1
  all_projections: True
  num_layers: 18
  use_pooling: False
  use_batch_norm: True
  dropout_rate: 0.6   # also tried dropout_rate: 0.1
  training_precision: {
    backend_floatx: FLOAT32
  }
  objective_set: {
    cov {}
    bbox {
      scale: 35.0
      offset: 0.5
    }
  }
}

It doesn't work.

The dropout_rate only supports values between 0.0 and 0.1.
See https://docs.nvidia.com/metropolis/TLT/tlt-getting-started-guide/index.html#model

Thank you for the answer.
I tried with 0.1 and 0.0 too, and it does not work. One question: is 0.1 the same as 10% or 90%?
When I run the model, the message says there is no "dropout_rate" param.

The dropout_rate is the fraction of the input units to drop; 0.0 means dropout is not used.
Is there any error log for "it doesn't work"?

So 0.1 means that 10% of the neurons will be discarded (will have 0 values), right?
Yes, I will post the log here soon.

Correct.
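
To make this concrete, here is a minimal sketch of the setting in a detectnet_v2 spec (the value is illustrative only):

model_config {
  # ... other fields as in the example above ...
  dropout_rate: 0.1   # roughly 10% of the input units are dropped during training; 0.0 disables dropout
}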

Morganh, here are the images. I am using tlt-streamanalytics v2.0.

It doesn't work!

Hi danielmrocha,
For the classification network, please set "dropout" instead.
The dropout_rate parameter is only available in the detectnet_v2 network.
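
For example, a minimal sketch of a classification model_config with dropout (the dropout field name comes from the advice above; the surrounding field names are copied from the detectnet_v2 example and may be named differently in the classification spec, so check the classification section of the docs linked earlier):

model_config {
  arch: "resnet"
  num_layers: 18        # the classification spec may use a different name for this field
  use_batch_norm: True
  dropout: 0.1          # classification uses "dropout" rather than "dropout_rate"
}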

It is finally running.
Thank you very much, Morganh.
Regards