Why doesn't Theano update the parameter?

Hi,
I’m trying to solve the following problem: a Theano function's output is the value returned by a class method after it runs a loop in which a parameter is updated:

import theano
import theano.tensor as T
import numpy as np
import copy
theano.config.exception_verbosity = 'high'

class Test(object):
    def __init__(self):
        self.rate=0.01
        W_val=40.00
        self.W=theano.shared(value=W_val, borrow=True)
    def start(self, x, y):
        # in each iteration: build the cost z, take its gradient w.r.t. W,
        # and step W down by rate*gradient (the update I want applied)
        for i in range(5):
            z=T.mean(x*self.W/y)
            gz=T.grad(z, self.W)
            self.W-=self.rate*gz
        return z

x_set=np.array([1.,2.,1.,2.,1.,2.,1.,2.,1.,2.])
y_set=np.array([1,2,1,2,1,2,1,2,1,2])
x_set = theano.shared(x_set, borrow=True)
y_set = theano.shared(y_set, borrow=True)
y_set=T.cast(y_set, 'int32')
batch_size=2

x = T.dvector('x')
y = T.ivector('y')
index = T.lscalar()

test = Test()
cost=test.start(x,y)

train = theano.function(
    inputs=[index],
    outputs=cost,
    givens={
        x: x_set[index * batch_size: (index + 1) * batch_size],
        y: y_set[index * batch_size: (index + 1) * batch_size]
    }
)

for i in range(5):
    result=train(i)
    print(result)

This is the output of the print:

39.96000000089407
39.96000000089407
39.96000000089407
39.96000000089407
39.96000000089407

Now, the gradient of mean(x*W/y) with respect to W is equal to 1 (because x and y always hold the same values, so mean(x/y) = 1). So after the first call I should get 39.95, then 39.90, and so on… Why do I always get the same result?
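
To show the numbers I was expecting, here is a plain-Python sketch of the arithmetic I assumed Theano would carry out on each call to train (purely illustrative, not Theano code):

# plain Python, only to illustrate the updates I expected
W = 40.0
rate = 0.01
for call in range(5):        # one pass per call to train(i)
    for step in range(5):    # the five iterations inside start()
        gz = 1.0             # d/dW of mean(x*W/y) = mean(x/y) = 1 here
        W -= rate * gz
    print(W)                 # expected: 39.95, 39.90, 39.85, 39.80, 39.75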

Thanks