I also use ColorEyes Display, but on a Mac.
To explain the whys and wherefores, first some background:
DDC = Display Data Channel
LUT = look-up table; maps an 8-bit level (0-255) to a brightness level (0-1023 or more)
In the CRT past, the LUT on the video card would take your 8-bit RGB color data and transform those 256 values into brightness levels. That analog signal then got sent to the CRT, so you calibrated the CRT by adjusting the video card's RGB LUTs.
There are a few problems with this: first, you need a video card with voltage-stable output, and a CRT monitor that produces consistent brightness levels from the analog video signal. For example, simply swapping out a video card on a PC would mean recalibrating, because the output voltages, even between two cards of the same model, could be different. This becomes a mess when you need to administer hundreds of color-calibrated workstations (say, at a movie special-effects business).
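To make the LUT idea concrete, here is a minimal Python sketch of the kind of per-channel table that calibration software loads into the video card. The gamma values and the 10-bit output range are just example assumptions, not what any particular card or calibration package actually uses:

def build_lut(target_gamma=2.2, native_gamma=2.5, out_max=1023):
    """Map each 8-bit input level (0-255) to a corrected output level (0-1023)."""
    lut = []
    for level in range(256):
        v = level / 255.0                                 # normalize input
        corrected = v ** (target_gamma / native_gamma)    # simple gamma correction
        lut.append(round(corrected * out_max))            # scale to a 10-bit output
    return lut

red_lut = build_lut()   # in practice each of R, G, B gets its own curve
print(red_lut[0], red_lut[128], red_lut[255])   # 0, a mid value, 1023

The operating system then loads a curve like this into the card (on Windows, for example, via SetDeviceGammaRamp), and the card's DAC turns the corrected values into the analog signal.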
For high-quality LCDs, this is different. The DVI cable sends the 8-bit RGB data digitally to the LCD, and the transformation of those 256 values into brightness levels is done in the LCD (via RGB LUTs inside the LCD). No analog problems. This is why DVI is the future, and why it is recommended.
However, if you cannot set the LUTs in the LCD, then DVI is not a good option for calibration (IMHO); using a VGA cable is better, because then you can at least set the LUTs in the video card. The sketch below illustrates the difference.
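Here is a small, hedged illustration of why this matters: if the correction has to happen in the video card's LUT and the DVI link only carries 8 bits, some of the 256 levels collapse together (banding); if the same correction is done in a higher-bit LUT inside the LCD, all 256 input levels survive. The gamma numbers are just an example assumption:

def correct(level, gamma=2.2/2.5, out_levels=256):
    """Apply a gamma correction and quantize to the given number of output levels."""
    return round((level / 255.0) ** gamma * (out_levels - 1))

eight_bit = {correct(l, out_levels=256)  for l in range(256)}   # correcting before an 8-bit DVI link
ten_bit   = {correct(l, out_levels=1024) for l in range(256)}   # correcting in a 10-bit LUT inside the LCD

print(len(eight_bit))   # fewer than 256 distinct output levels -> visible banding
print(len(ten_bit))     # all 256 input levels stay distinguishable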
About DDC:
DDC1: DDC started out as a simple way for monitors to tell the graphics card what video modes they supported. This is the DDC1 standard. It was a one-way transfer from monitor to graphics card; there was no way for the graphics card to talk back to the monitor to set LUTs or anything else. The monitor just kept sending its display settings all the time.
Then the display and graphics card industries got wise and added two-way communication so the graphics card could talk to the monitor (through the I2C protocol).
DDC1/2B: The graphics card can ask the monitor for its display settings. This does not support setting of display LUTs.
DDC1/2AB: The graphics card has full control over the monitor. This is what is needed to set the LUTs in a high-quality LCD display. It is also referred to as DDC/CI (see the sketch below for what a DDC/CI request looks like).
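To give a feel for what DDC/CI actually looks like on the wire, here is a hedged Python sketch that only builds the bytes of a "Get VCP Feature" request (the standard mechanism for reading things like brightness). The constants are my reading of the DDC/CI and MCCS specs, not taken from any particular driver, so treat them as assumptions and check the VESA documents before relying on them:

DDC_CI_ADDR = 0x37       # 7-bit I2C address of the DDC/CI interface (assumed)
HOST_SOURCE = 0x51       # "source address" byte the host puts in each message (assumed)
VCP_BRIGHTNESS = 0x10    # MCCS VCP code for luminance/brightness (assumed)

def get_vcp_request(vcp_code):
    """Build the payload bytes for a DDC/CI Get VCP Feature request."""
    payload = [HOST_SOURCE, 0x80 | 2, 0x01, vcp_code]   # 0x01 = Get VCP Feature opcode, length 2
    checksum = DDC_CI_ADDR << 1                          # checksum also covers the write address (0x6E)
    for b in payload:
        checksum ^= b
    return bytes(payload + [checksum])

print(get_vcp_request(VCP_BRIGHTNESS).hex())   # -> '51820110ac'

The graphics card driver would write these bytes to the monitor over the same I2C wires that carry the EDID; the monitor replies with the current value on a read.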
So you can probably guess where I am going with this,
You will need at least DDC1/2AB (also referred to as DDC/CI) support in the LCD display, the graphics card, the graphics card driver, the operating system, and the calibration software for this to work properly.
However, there is another problem.
While the protocol used to set the common settings (i.e. brightness and contrast) is standard, as far as I can tell there is no standard for setting the LUTs in high-quality LCDs. So adjusting the display LUT seems to rely on a proprietary extension or a proprietary protocol; some displays even use a separate USB connection for this purpose.
So now your calibration software has to be aware of all the proprietary methods for adjusting each display's built-in lookup tables. And of course the operating system and graphics cards (and their drivers) also have to provide support for this.
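As a purely illustrative sketch of what this means for the software side (every class and method name below is hypothetical, invented just to show the shape of the problem):

class EizoUsbBackend:        # hypothetical: some displays expose their LUTs over a separate USB link
    def load_lut(self, r, g, b): ...

class NecDdcCiBackend:       # hypothetical: others use proprietary DDC/CI extensions
    def load_lut(self, r, g, b): ...

BACKENDS = {
    "EIZO": EizoUsbBackend,
    "NEC":  NecDdcCiBackend,
    # ...one entry per supported vendor/model family, maintained forever...
}

def load_display_lut(vendor, r_lut, g_lut, b_lut):
    """Dispatch to whatever vendor-specific method this display needs."""
    backend = BACKENDS.get(vendor)
    if backend is None:
        raise RuntimeError(f"No proprietary LUT-loading support for {vendor}")
    backend().load_lut(r_lut, g_lut, b_lut)

Only the common VCP controls (brightness, contrast, and so on) can go through one shared code path; everything to do with the internal LUTs ends up in vendor-specific branches like these.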
In essence, the lack of a common standard for setting LCD display LUTs has become a nightmare for software developers, and thus also a nightmare for us users.
With any luck, VESA will establish a common standard for setting LCD displays' LUTs, and then this feature will show up even in inexpensive LCD displays. Then Linux will get color calibration support, Adobe will port Photoshop to Linux, ... ok, I'll stop dreaming.
I hope this has made sense to those interested in the topic (I'm definitely not a good technical writer)...
Please feel free to add corrections if I got anything wrong.