Support sent me to this site, as they said they have no mechanism for suggesting NVIDIA software settings/configurations for specific use cases, so hopefully someone here can provide some insight.
I’m a scientist with Lockheed Martin, and I was wondering whether there are any use-case-based recommended settings for maximizing frame rates in streaming/scanning workloads. Our current hardware is a Quadro K4200 driving 4K NEC monitors; the workstation itself has 32 cores and 64 GB of memory.

Essentially, we have an image-review program that scans through sections of large digital photos for quality control, looking for bad camera pixels. We are seeing a lot of jitter/jumpiness as the program tries to scroll smoothly across these very large images. Currently I can get the system to run fairly smoothly with the imagery co-located on the machine’s local hard drive, but not when accessing images over a verified 100 MB/s network, which is the workflow we need to use. We may just be asking too much of the hardware, but I wanted to see if you had any specific ideas or settings to try for this particular use case, for example something to increase the cache read-ahead.
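To illustrate the kind of read-ahead I have in mind, here is a minimal sketch of application-level prefetching: a background thread keeps a bounded buffer of upcoming tiles filled while the viewer consumes them in order. The `load_tile` function and the tile numbering are hypothetical stand-ins for whatever our program actually does when it reads a section of an image off the network share; this is just to show the idea, not our actual code.

```python
import queue
import threading

def prefetching_reader(load_tile, indices, lookahead=4):
    """Yield tiles in order while a background thread reads ahead.

    load_tile(i) stands in for the slow network read of tile i
    (hypothetical helper); `lookahead` bounds how many tiles may be
    buffered ahead of the consumer, so the slow reads overlap with
    display instead of stalling it.
    """
    buf = queue.Queue(maxsize=lookahead)

    def worker():
        for i in indices:
            buf.put(load_tile(i))   # blocks once the buffer is full
        buf.put(None)               # sentinel: no more tiles

    threading.Thread(target=worker, daemon=True).start()
    while True:
        tile = buf.get()
        if tile is None:
            break
        yield tile
```

If the scroll is predictable (left-to-right across the image), something like this keeps the next few network reads in flight while the current tile is on screen, which is what I was hoping a driver or OS read-ahead setting might accomplish without application changes.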