Do I have to reinstall CUDA after changing the graphics card?

I changed my graphics card to an NVIDIA GeForce RTX 2080 Ti, and since then training my network fails with a runtime error. The relevant log output is below. My question is: do I have to reinstall CUDA, or change other settings in the CUDA toolkit, after changing the graphics card? Please help me resolve this.
I0514 12:53:03.102141 1740 net.cpp:84] Creating Layer data
I0514 12:53:03.102141 1740 net.cpp:380] data -> data
I0514 12:53:03.102141 1740 net.cpp:380] data -> label
I0514 12:53:03.102141 1740 hdf5_data_layer.cpp:80] Loading list of HDF5 filenames from: examples/SRCNNG/SRCNN/testg.txt
I0514 12:53:03.103140 1740 hdf5_data_layer.cpp:94] Number of HDF5 files: 1
I0514 12:53:03.126129 1740 net.cpp:122] Setting up data
I0514 12:53:03.126129 1740 net.cpp:129] Top shape: 2 1 33 33 (2178)
I0514 12:53:03.126129 1740 net.cpp:129] Top shape: 2 1 21 21 (882)
I0514 12:53:03.126129 1740 net.cpp:137] Memory required for data: 12240
I0514 12:53:03.126129 1740 layer_factory.cpp:58] Creating layer conv1
I0514 12:53:03.126129 1740 net.cpp:84] Creating Layer conv1
I0514 12:53:03.126129 1740 net.cpp:406] conv1 <- data
I0514 12:53:03.126129 1740 net.cpp:380] conv1 -> conv1
I0514 12:53:03.129128 1740 net.cpp:122] Setting up conv1
I0514 12:53:03.129128 1740 net.cpp:129] Top shape: 2 64 25 25 (80000)
I0514 12:53:03.129128 1740 net.cpp:137] Memory required for data: 332240
I0514 12:53:03.129128 1740 layer_factory.cpp:58] Creating layer relu1
I0514 12:53:03.129128 1740 net.cpp:84] Creating Layer relu1
I0514 12:53:03.129128 1740 net.cpp:406] relu1 <- conv1
I0514 12:53:03.129128 1740 net.cpp:367] relu1 -> conv1 (in-place)
I0514 12:53:03.130127 1740 net.cpp:122] Setting up relu1
I0514 12:53:03.130127 1740 net.cpp:129] Top shape: 2 64 25 25 (80000)
I0514 12:53:03.130127 1740 net.cpp:137] Memory required for data: 652240
I0514 12:53:03.130127 1740 layer_factory.cpp:58] Creating layer conv2
I0514 12:53:03.130127 1740 net.cpp:84] Creating Layer conv2
I0514 12:53:03.130127 1740 net.cpp:406] conv2 <- conv1
I0514 12:53:03.130127 1740 net.cpp:380] conv2 -> conv2
I0514 12:53:03.132124 1740 net.cpp:122] Setting up conv2
I0514 12:53:03.132124 1740 net.cpp:129] Top shape: 2 32 25 25 (40000)
I0514 12:53:03.132124 1740 net.cpp:137] Memory required for data: 812240
I0514 12:53:03.132124 1740 layer_factory.cpp:58] Creating layer relu2
I0514 12:53:03.132124 1740 net.cpp:84] Creating Layer relu2
I0514 12:53:03.132124 1740 net.cpp:406] relu2 <- conv2
I0514 12:53:03.132124 1740 net.cpp:367] relu2 -> conv2 (in-place)
I0514 12:53:03.132124 1740 net.cpp:122] Setting up relu2
I0514 12:53:03.132124 1740 net.cpp:129] Top shape: 2 32 25 25 (40000)
I0514 12:53:03.132124 1740 net.cpp:137] Memory required for data: 972240
I0514 12:53:03.132124 1740 layer_factory.cpp:58] Creating layer conv3
I0514 12:53:03.132124 1740 net.cpp:84] Creating Layer conv3
I0514 12:53:03.132124 1740 net.cpp:406] conv3 <- conv2
I0514 12:53:03.132124 1740 net.cpp:380] conv3 -> conv3
I0514 12:53:03.134121 1740 net.cpp:122] Setting up conv3
I0514 12:53:03.134121 1740 net.cpp:129] Top shape: 2 1 21 21 (882)
I0514 12:53:03.134121 1740 net.cpp:137] Memory required for data: 975768
I0514 12:53:03.135125 1740 layer_factory.cpp:58] Creating layer loss
I0514 12:53:03.135125 1740 net.cpp:84] Creating Layer loss
I0514 12:53:03.135125 1740 net.cpp:406] loss <- conv3
I0514 12:53:03.135125 1740 net.cpp:406] loss <- label
I0514 12:53:03.135125 1740 net.cpp:380] loss -> loss
I0514 12:53:03.135125 1740 net.cpp:122] Setting up loss
I0514 12:53:03.135125 1740 net.cpp:129] Top shape: (1)
I0514 12:53:03.135125 1740 net.cpp:132] with loss weight 1
I0514 12:53:03.135125 1740 net.cpp:137] Memory required for data: 975772
I0514 12:53:03.135125 1740 net.cpp:198] loss needs backward computation.
I0514 12:53:03.135125 1740 net.cpp:198] conv3 needs backward computation.
I0514 12:53:03.135125 1740 net.cpp:198] relu2 needs backward computation.
I0514 12:53:03.135125 1740 net.cpp:198] conv2 needs backward computation.
I0514 12:53:03.135125 1740 net.cpp:198] relu1 needs backward computation.
I0514 12:53:03.135125 1740 net.cpp:198] conv1 needs backward computation.
I0514 12:53:03.135125 1740 net.cpp:200] data does not need backward computation.
I0514 12:53:03.135125 1740 net.cpp:242] This network produces output loss
I0514 12:53:03.135125 1740 net.cpp:255] Network initialization done.
I0514 12:53:03.135125 1740 solver.cpp:56] Solver scaffolding done.
I0514 12:53:03.137121 1740 caffe.cpp:243] Resuming from examples\SRCNNG\SRCNN_iter_4462500.solverstate
I0514 12:53:03.137121 1740 sgd_solver.cpp:318] SGDSolver: restoring history
I0514 12:53:03.137121 1740 caffe.cpp:249] Starting Optimization
I0514 12:53:03.137121 1740 solver.cpp:272] Solving SRCNN
I0514 12:53:03.137121 1740 solver.cpp:273] Learning Rate Policy: fixed
I0514 12:53:03.138123 1740 solver.cpp:330] Iteration 4462500, Testing net (#0)
I0514 12:53:03.786809 1740 solver.cpp:397] Test net output #0: loss = 0 (* 1 = 0 loss)
I0514 12:53:03.817797 1740 solver.cpp:218] Iteration 4462500 (6.56652e+06 iter/s, 0.679584s/100 iters), loss = 0
I0514 12:53:03.817797 1740 solver.cpp:237] Train net output #0: loss = 0 (* 1 = 0 loss)
I0514 12:53:03.817797 1740 sgd_solver.cpp:105] Iteration 4462500, lr = 0.0001
F0514 12:53:03.817797 1740 sgd_solver.cu:19] Check failed: error == cudaSuccess (8 vs. 0) invalid device function
*** Check failure stack trace: ***
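From what I understand, "invalid device function" usually means the binary contains no kernel code compiled for the GPU's architecture. The RTX 2080 Ti is a Turing card (compute capability 7.5), so if my Caffe build predates the card swap, it probably lacks sm_75 code. A sketch of the `CUDA_ARCH` line I believe would need updating before rebuilding Caffe (assuming the standard BVLC Caffe `Makefile.config` layout; the `compute_75` targets require CUDA 10 or newer):

```makefile
# Makefile.config (hypothetical excerpt) — add Turing (sm_75) to the
# list of architectures Caffe compiles device code for.
CUDA_ARCH := -gencode arch=compute_50,code=sm_50 \
             -gencode arch=compute_60,code=sm_60 \
             -gencode arch=compute_61,code=sm_61 \
             -gencode arch=compute_70,code=sm_70 \
             -gencode arch=compute_75,code=sm_75 \
             -gencode arch=compute_75,code=compute_75
```

The last line embeds PTX for compute capability 7.5 so the driver can JIT-compile for the new card; after editing, a full `make clean && make all` rebuild would be needed. Is this the right fix, or is a CUDA reinstall also required?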