Does the latest GTX 1660 model support CUDA?

That error is coming from TensorFlow.

I suggest you ask questions about cuDNN on the cuDNN forum.

https://devtalk.nvidia.com/default/board/305/cudnn/

Also, you should search for TensorFlow issue reports. Many users report being able to fix this by enabling allow_growth:

import tensorflow as tf

# Allocate GPU memory on demand instead of reserving it all up front
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
sess = tf.Session(config=config)
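Note that the ConfigProto/Session snippet above targets TensorFlow 1.x. If you are on TensorFlow 2.x (where that API was removed), the equivalent setting, assuming TF 2.1 or later, is roughly:

```python
import tensorflow as tf

# TensorFlow 2.x replacement for the ConfigProto allow_growth option:
# enable on-demand memory allocation for each visible GPU.
# Must be called before any GPU has been initialized.
gpus = tf.config.list_physical_devices('GPU')
for gpu in gpus:
    tf.config.experimental.set_memory_growth(gpu, True)
print(f"memory growth enabled on {len(gpus)} GPU(s)")
```

On a machine with no visible GPU the loop simply does nothing, so the script is safe to run anywhere.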

Do some searching. This is not a TensorFlow support forum. I won’t be able to respond to further questions about TensorFlow problems on this forum.

If you want to verify that cuDNN is working correctly with your GTX 1660, run the cuDNN sample codes provided by NVIDIA. If they work correctly, then cuDNN is working correctly on your GTX 1660, and you will need to investigate problems reported by TensorFlow as TensorFlow issues.
