One may wonder why I'm doing this. Simple. I have dual NEC PA241 monitors, each with DisplayPort and DVI-I inputs. My video card is an MSI Radeon R9 390 with 8GB GDDR5, but it only has one DisplayPort output.
Call me dumb, but even though I've been building my own computers since 1982, I can't keep up with all the specs all the time. (Hey, I'm a photographer. Enough changes there daily.) I never realized that the DVI-I output on my video card couldn't carry 30-bit color, which the monitors can handle. Unfortunately, the card has only the one DisplayPort plus an HDMI port, and the HDMI output can't drive the monitor's DisplayPort input.
I still have an older Asus Radeon R9 290X 4GB GDDR5 card, and I've thought of putting it in the system to drive the second monitor, on which I frequently run Lightroom. The primary monitor usually runs Photoshop. That way, both monitors would get 30-bit output.
I run Windows 10 x64. Has anyone done this? Were there issues? I looked at current prices for video cards, thinking I might just buy a new one, and they're outrageous, all thanks to the bloody Bitcoin people. (I have no idea how that really works, except that it's put a crunch on retail prices and caused scarcity for AMD cards.) Thanks for the advice.