i’m looking for a cuda and/or ptx instruction (or set of instructions) i could put in device code to get the gpu temperature (to a few decimal places) and/or some timer or clock (such as for performance metrics) to use as a source of entropy for a random number generator. anyone know how to do this? i’m not talking about some utility or external software; i’m looking for device-resident source code for a CUDA program.
i only need a few bits of entropy, total, for the whole device, but i need them within a few clock cycles. that’s why i’m hoping there’s a special register i can read or something similar.
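for reference, here is a minimal sketch of the counters device code can actually read in a handful of cycles; i’m not aware of any device-readable temperature register. clock()/clock64() count SM core-clock cycles, and on newer parts (compute capability 3.0 and up, so well after the guide referenced below) the PTX special register %globaltimer is a nanosecond-resolution timer. none of these is a vetted entropy source - this only shows the reads themselves.

// sketch: the timing registers device code can read directly (not vetted entropy sources).
// assumes a GPU of compute capability 3.0+ for %globaltimer; clock()/clock64() are older.
#include <cstdio>
#include <cuda_runtime.h>

__device__ unsigned long long read_globaltimer()
{
    unsigned long long t;
    // %globaltimer: PTX special register, nanosecond resolution, not driven by the SM core clock
    asm volatile("mov.u64 %0, %%globaltimer;" : "=l"(t));
    return t;
}

__global__ void sample_timers(unsigned long long *out)
{
    out[0] = (unsigned long long)clock64();  // per-SM core-clock cycle counter
    out[1] = read_globaltimer();             // global nanosecond timer
}

int main()
{
    unsigned long long h[2], *d;
    cudaMalloc(&d, sizeof(h));
    sample_timers<<<1, 1>>>(d);
    cudaMemcpy(h, d, sizeof(h), cudaMemcpyDeviceToHost);
    printf("clock64 = %llu, globaltimer = %llu\n", h[0], h[1]);
    cudaFree(d);
    return 0;
}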
thanks. i looked that up. unfortunately, according to http://developer.download.nvidia.com/compu…g_Guide_1.0.pdf , section 4.3.3, that’s essentially deterministic. since the control flow in my program is static (independent of the data), that clock function amounts to “how many instructions lie between this line and the last time this line was reached.” so whatever i compute from it is effectively a += b with a constant b, and i could just as well write a += b directly. the point being, it’s not sufficient for my purposes, unless you’re talking about a different function. i was looking for some kind of clock that’s not in sync with the core clock, preferably one whose frequency has a large least common multiple with the core clock frequency, so that the numbers i get aren’t periodic in any way. (and it would be even better if it didn’t keep time very well - if the “ticks” were not reliably of the same duration.)
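if the goal is a counter that isn’t locked to the core clock: on parts that have %globaltimer (compute capability 3.0+), one hedged option is to read it alongside clock64() and keep only the low bits of the combination. the two counters sit in different clock domains (the SM clock moves with boost and thermal state, the global timer does not), so the result isn’t a compile-time constant the way a clock() delta across a static code path is. whether that drift carries real entropy is unverified - treat it as a perturbation, not an audited source.

// sketch (unverified as an entropy source): low bits of the offset between two
// counters in different clock domains. assumes %globaltimer exists (sm_30+).
__device__ unsigned int jitter_bits()
{
    unsigned long long gt;
    asm volatile("mov.u64 %0, %%globaltimer;" : "=l"(gt));   // ns-domain counter
    unsigned long long cc = clock64();                        // core-clock-domain counter
    // the two domains drift relative to each other (boost clocks, throttling),
    // so the low bits of their combination vary from run to run
    return (unsigned int)((gt ^ cc) & 0xF);   // keep only a few bits
}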
i only need a few bits at a time. it’s to eliminate periodicity from a fast/efficient pseudo-random number generator on the gpu, so preferably i’d have it every time a random number is requested. and i’m looking for something that’s truly random, not pseudo-random - like the difference in time between two clocks that aren’t synchronized, or the temperature of the card read out to relatively high precision.
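for what it’s worth, here is a sketch of just the mixing step: a plain xorshift32 generator whose state is perturbed by a few externally supplied entropy bits on each call, so the sequence can’t settle into its fixed cycle. the entropy argument is assumed to come from somewhere else (a host upload, timer jitter as above, whatever weak source is available) - only the folding-in is shown.

// sketch: xorshift32 with a few entropy bits folded into the state each call,
// so the generator cannot simply traverse its fixed period. 'entropy' is assumed
// to be supplied from elsewhere; this only shows the mixing step.
__device__ unsigned int rng_next(unsigned int *state, unsigned int entropy)
{
    unsigned int x = *state ^ (entropy & 0xF);  // perturb the state with a few fresh bits
    if (x == 0) x = 0x9E3779B9u;                // xorshift must never sit at zero
    x ^= x << 13;
    x ^= x >> 17;
    x ^= x << 5;
    *state = x;
    return x;
}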
I don’t think current GPUs have anything sufficiently non-deterministic for this purpose. You’ll have to upload a block of entropy for each kernel call.
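a sketch of that approach, assuming the host’s std::random_device provides usable entropy on the platform in question: fill a buffer on the CPU, copy it over before the launch, and let each thread fold its own word into its generator state.

// sketch: host-supplied entropy, one word per thread, uploaded before each launch.
// assumes std::random_device is a real entropy source on the host platform.
#include <random>
#include <vector>
#include <cuda_runtime.h>

__global__ void use_entropy(const unsigned int *entropy, unsigned int *rng_state)
{
    int tid = blockIdx.x * blockDim.x + threadIdx.x;
    // fold the fresh word into whatever per-thread generator state already exists
    rng_state[tid] ^= entropy[tid];
}

int main()
{
    const int n = 256;
    std::random_device rd;                       // host-side entropy source
    std::vector<unsigned int> h_entropy(n);
    for (auto &w : h_entropy) w = rd();

    unsigned int *d_entropy, *d_state;
    cudaMalloc(&d_entropy, n * sizeof(unsigned int));
    cudaMalloc(&d_state, n * sizeof(unsigned int));
    cudaMemset(d_state, 0x5A, n * sizeof(unsigned int));  // placeholder initial PRNG state

    cudaMemcpy(d_entropy, h_entropy.data(), n * sizeof(unsigned int),
               cudaMemcpyHostToDevice);
    use_entropy<<<n / 64, 64>>>(d_entropy, d_state);
    cudaDeviceSynchronize();

    cudaFree(d_entropy);
    cudaFree(d_state);
    return 0;
}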