DDC/CI over I2C fails, GTX660Ti

I’m trying to communicate with monitors using DDC/CI over I2C on /dev/i2c-n address 0x37.

I can read monitor EDID data at address 0x50. I can also read address 0x37 to detect that the address is live. However, writes to address 0x37 fail on my GTX660Ti, so DDC/CI fails.
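For context, the EDID read follows the standard pattern. Here's a minimal sketch (error handling trimmed; fh is a descriptor for an open /dev/i2c-n, and the function name is just for illustration):

// Minimal sketch of the EDID read that succeeds: select slave address
// 0x50, write a 1-byte offset, then read the 128-byte base EDID block.
int read_edid_sketch(int fh, unsigned char * edid) {
   ioctl(fh, I2C_SLAVE, 0x50);          // EDID lives at address 0x50
   unsigned char offset = 0x00;         // start of the base block
   write(fh, &offset, 1);
   int rc = read(fh, edid, 128);        // 128-byte base EDID block
   return (rc == 128) ? 0 : -1;
}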

I’ve reduced the failure situation to the following example, which shows writes of varying sizes. On the chance that the content being written might affect the result, the code attempts to write two different sequences of bytes: a valid DDC Get Feature request, and a sequence of 0x00 bytes. All test cases succeed on the open source nouveau and radeon drivers. They also all succeed on a system with a GT210 using driver 331.38.

For a GTX660Ti with either the 331.38 or 337.12 drivers, writes of 1 or 2 bytes succeed; writes of 3 or more bytes fail with errno=5 (EIO).

Here’s the code:

#include <assert.h>
#include <errno.h>
#include <fcntl.h>
#include <linux/i2c-dev.h>     // I2C_SLAVE
#include <stdio.h>
#include <stdlib.h>
#include <sys/ioctl.h>
#include <unistd.h>

// errno_name() and ddc_checksum() are helper functions defined
// elsewhere in my program.

void demo_nvidia_bug_sample_code(int busno) {
   printf("\n(%s) Starting   \n", __func__ );
   char * writefunc = "write";
   int rc;
   char devname[16];
   snprintf(devname, sizeof(devname), "/dev/i2c-%d", busno);

   int fh = open(devname, O_NONBLOCK|O_RDWR);
   if (fh < 0) {
      printf("(%s) open(%s) failed, errno=%s\n", __func__, devname, errno_name(errno));
      exit(1);
   }
   if (ioctl(fh, I2C_SLAVE, 0x37) < 0) {
      printf("(%s) ioctl(I2C_SLAVE) failed, errno=%s\n", __func__, errno_name(errno));
      exit(1);
   }

   // try a read; it succeeds
   unsigned char * readbuf = calloc(256, sizeof(unsigned char));
   rc = read(fh, readbuf+1, 1);
   if (rc < 0) {
      printf("(%s) read() returned %d, errno=%s. Terminating execution  \n", __func__, rc, errno_name(errno) );
      exit(1);
   }
   printf("(%s) read succeeded.  Address 0x37 active on %s\n", __func__, devname);

   unsigned char zeroBytes[5] = {0};  // five 0x00 bytes

   unsigned char ddc_cmd_bytes[] = {
      0x6e,              // address 0x37, shifted left 1 bit
      0x51,              // source address
      0x02 | 0x80,       // number of DDC data bytes, with high bit set
      0x01,              // DDC Get Feature command
      0x10,              // luminosity feature code
      0x00,              // checksum, to be set
   };
   ddc_cmd_bytes[5] = ddc_checksum(ddc_cmd_bytes, 5, false);    // calculate DDC checksum on the first 5 bytes
   assert(ddc_cmd_bytes[5] == 0xac);

   printf("\n(%s) Try writing fragments of DDC request string...\n", __func__ );
   int bytect;
   for (bytect=sizeof(ddc_cmd_bytes)-1; bytect > 0; bytect--) {
      usleep(5000);
      errno = 0;
      // skip byte 0; the slave address is supplied by the driver
      rc = write(fh, ddc_cmd_bytes+1, bytect);
      if (rc == bytect)
         printf("(%s) bytect=%d, %s() returned rc=%d as expected\n", __func__, bytect, writefunc, rc);
      else if (rc < 0)
         printf("(%s) bytect=%d, Error. %s(), returned %d, errno=%s\n", __func__, bytect, writefunc, rc, errno_name(errno));
      else
         printf("(%s) bytect=%d, Truly weird. rc=%d\n", __func__, bytect, rc);
   }

   printf("\n(%s) Try writing null bytes...\n", __func__ );
   for (bytect=sizeof(zeroBytes); bytect > 0; bytect--) {
      usleep(5000);
      errno = 0;
      rc = write(fh, zeroBytes, bytect);
      if (rc == bytect)
         printf("(%s) bytect=%d, %s() returned rc=%d as expected\n", __func__, bytect, writefunc, rc);
      else if (rc < 0)
         printf("(%s) bytect=%d, Error. %s(), returned %d, errno=%s\n", __func__, bytect, writefunc, rc, errno_name(errno));
      else
         printf("(%s) bytect=%d, Truly weird. rc=%d\n", __func__, bytect, rc);
   }
   free(readbuf);
   close(fh);
}
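For reference, the DDC/CI checksum is the XOR of all message bytes preceding the checksum byte, including the destination address byte. A minimal sketch of the logic (my real ddc_checksum() takes a third parameter this test doesn't exercise):

// Minimal sketch of the DDC/CI checksum: XOR of all bytes preceding
// the checksum byte.  For the request above:
// 0x6e ^ 0x51 ^ 0x82 ^ 0x01 ^ 0x10 == 0xac
unsigned char ddc_checksum_sketch(const unsigned char * bytes, int len) {
   unsigned char checksum = 0x00;
   for (int ndx = 0; ndx < len; ndx++)
      checksum ^= bytes[ndx];
   return checksum;
}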

And here’s the output when executed with a GTX660Ti and driver 331.38:

(demo_nvidia_bug_sample_code) Starting   
(demo_nvidia_bug_sample_code) read succeeded.  Address 0x37 active on /dev/i2c-0

(demo_nvidia_bug_sample_code) Try writing fragments of DDC request string...
(demo_nvidia_bug_sample_code) bytect=5, Error. write(), returned -1, errno=5 - EIO      ( I/O error )
(demo_nvidia_bug_sample_code) bytect=4, Error. write(), returned -1, errno=5 - EIO      ( I/O error )
(demo_nvidia_bug_sample_code) bytect=3, Error. write(), returned -1, errno=5 - EIO      ( I/O error )
(demo_nvidia_bug_sample_code) bytect=2, write() returned rc=2 as expected
(demo_nvidia_bug_sample_code) bytect=1, write() returned rc=1 as expected

(demo_nvidia_bug_sample_code) Try writing null bytes...
(demo_nvidia_bug_sample_code) bytect=5, Error. write(), returned -1, errno=5 - EIO      ( I/O error )
(demo_nvidia_bug_sample_code) bytect=4, Error. write(), returned -1, errno=5 - EIO      ( I/O error )
(demo_nvidia_bug_sample_code) bytect=3, Error. write(), returned -1, errno=5 - EIO      ( I/O error )
(demo_nvidia_bug_sample_code) bytect=2, write() returned rc=2 as expected
(demo_nvidia_bug_sample_code) bytect=1, write() returned rc=1 as expected

Environments in which the test case shows errors:

GTX660Ti, OpenSUSE 13.1, nvidia driver 337.12
GTX660Ti, Ubuntu 14.04, nvidia driver 331.38

Environments in which the test case shows no errors (and DDC/CI works):

GTX660Ti, Fedora 20, nouveau open source driver
AMD card, OpenSUSE 13.1, radeon open source driver
GT210, Mint 17, nouveau driver
GT210, Mint 17, nvidia driver 331.38

If I can get hold of additional Nvidia cards, I will test them on the Mint 17 testbed system and post the results.

Is there a workaround for this situation or an alternative method to execute DDC/CI on the GTX660Ti, or is this simply a failure of the proprietary nvidia driver on newer cards?
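One alternative I plan to try (untested, sketched here only as a hypothesis) is submitting the message through the I2C_RDWR ioctl instead of write(), in case the proprietary driver handles that entry point differently:

#include <linux/i2c.h>          // struct i2c_msg
#include <linux/i2c-dev.h>      // I2C_RDWR, struct i2c_rdwr_ioctl_data
#include <sys/ioctl.h>

// Hypothetical alternative write path: send the payload as a single
// I2C message via the I2C_RDWR ioctl rather than write().  Untested
// on the GTX660Ti; it may well fail the same way.
int i2c_ioctl_write(int fh, unsigned char * bytes, int bytect) {
   struct i2c_msg msg = {
      .addr  = 0x37,     // 7-bit slave address
      .flags = 0,        // 0 = write
      .len   = bytect,
      .buf   = bytes,
   };
   struct i2c_rdwr_ioctl_data msgset = {
      .msgs  = &msg,
      .nmsgs = 1,
   };
   return ioctl(fh, I2C_RDWR, &msgset);   // negative on failure, sets errno
}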

This bug has been known for ages; however, NVIDIA doesn’t have it on its priority list.

Probably the same bug, maybe not. ddccontrol has not been maintained for some time, which is why I wrote my own program. ddccontrol fails to detect 2 of the 3 monitors on my GTX660Ti, even in the nouveau environment where my program works, and even then it complains that the monitor isn’t in its database. It does memory mapping and PCI I/O, while my code simply reads and writes /dev/i2c-n. Also, I thought it useful to narrow the general “DDC/CI doesn’t work” bug down to the more specific “writes of more than 2 bytes to /dev/i2c-n fail”, and to point to an Nvidia card on which the problem doesn’t occur.

But I take your larger point. It’s been discouraging to see no action on the ddccontrol bug.