Luminous Landscape Forum
Equipment & Techniques => Computers & Peripherals => Topic started by: Steve House on May 23, 2013, 06:00:02 pm
-
Trying to configure a new high-end PC and going nuts with the video card selection. Pretty well decided on an Intel i7-3930K CPU on either an Asus P9X79 WS or Sabertooth X79 motherboard with 32 GB RAM, but the graphics card is still up in the air. I'll be using a wide-gamut monitor, probably the Dell U2713H, that supports 10-bit colour. The issue is that I want optimum image performance with Lightroom (mostly) and Photoshop (occasionally), but I also want to move into high-quality video editing in Premiere Pro. Including Premiere in the software mix pretty well limits the choice to nVidia cards. The GTX 680 (~$600) and GTX Titan (~$1k) apparently perform very well with Premiere and After Effects but don't support 10-bit colour. Getting to 10-bit requires moving to the Quadro line, and for a similar number of CUDA cores you'd be looking at the Quadro K5000 (~$1800) for something roughly equivalent to the GTX 680 in terms of processor count. That's really pushing the budget - doable, but I have other places where the $$ could be put to use - and I'm wondering if those 2 bits are worth it for general photography. Indeed, does LR even do 10-bit, or is it coming in LR5?
-
You don't say what OS. If it's Win 7, be careful. 10-bit display bit depth and Aero don't play together, or didn't two years ago.
http://blog.kasson.com/?p=674
Jim
-
Most likely Windows 8
-
I'm wondering if those 2 bits are worth it for general photography.
I'd say it's probably not worth the investment unless you do a lot of grayscale imagery... and I currently have a 10-bit monitor setup.
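One cheap way to see whether those extra bits would actually buy you anything is a synthetic grayscale ramp: on an 8-bit pipeline a shallow full-range gradient quantizes into visible bands, while a true 10-bit path shows far finer steps. A quick sketch, assuming NumPy and Pillow are installed (the filename is arbitrary):

```python
# Generate a 16-bit grayscale ramp to eyeball 8- vs 10-bit banding.
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 1024, 256

# One shallow ramp: 1024 steps spread across the full 16-bit range.
# Viewed on an 8-bit pipeline this collapses to ~256 visible bands;
# a true 10-bit path should show ~1024 finer steps.
ramp = np.linspace(0, 65535, WIDTH, dtype=np.uint16)
img_data = np.tile(ramp, (HEIGHT, 1))

# Pillow maps little-endian uint16 arrays to its 16-bit mode "I;16".
Image.fromarray(img_data).save("ramp16.png")
```

Open the result in Photoshop (which can use the 10-bit path when the GPU and driver support it) and look for banding; if the bands don't get finer on the Quadro, the 10-bit chain isn't actually active.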
-
Exactly. You have no need for 10 bit, you are going way too far over the edge...
I use Premiere and Vegas. I still use a 570 because the 6xx series doesn't work with Vegas - Sony hasn't licensed the code yet - and I use Vegas a lot more than Premiere (hate it). CUDA support is spotty no matter the app, so don't stress about it; get a good sweet-spot card. You will NOT notice the few SECONDS of difference between a $200 and a $1000 card for what you're after. Gaming - YES. CUDA rendering - not a chance. Personally I would get a 660 Ti.
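On the spotty-CUDA point: in the CS5/CS6-era Premiere, the Mercury Playback Engine just checks a plain-text whitelist next to the executable, and the well-known workaround for a GeForce card Adobe hasn't listed is to append its name. A sketch of the idea, run here on a stand-in copy so nothing real is touched (the real file lives somewhere like C:\Program Files\Adobe\Adobe Premiere Pro CS6; the card names below are examples - check how your driver actually reports the card, and back the file up first):

```shell
# Premiere's CUDA whitelist is a plain text file, typically
# "cuda_supported_cards.txt" next to the Premiere executable.
LIST=cuda_supported_cards.txt   # stand-in copy for demonstration
printf 'GeForce GTX 285\nGeForce GTX 570\n' > "$LIST"

# Append your card exactly as the driver reports it (example name):
echo "GeForce GTX 660 Ti" >> "$LIST"
cat "$LIST"
```

No guarantee of stability on an unlisted card, of course - Adobe left those cards off the list for a reason, so treat it as an experiment.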
-
I wouldn't use the pseudo 10-bit mode on the Ultrasharp if I were you. Unless you have a true 10-bit monitor, it is pointless. I believe 10-bit is pointless anyway, but that's a different story.
A good GPU is necessary. The best GPU isn't. You can make do with a GTX 680 (4GB or 2GB) and a Quadro K600. The Quadro will drive two monitors and supports S3D.
If you decide on 10-bit, why not get the Quadro K600 first, and see if it meets your needs? Until a couple of years ago most people made do without CUDA.
-
With the caveat that I don't do video editing, I got an NVIDIA GeForce GTX 660 2GB GDDR5 for my recent build. LR doesn't use a GPU at this point in time, so it's pretty irrelevant there. The few PS operations I tried with it seemed quick and easy. As for video, a free game came with the GPU and I loaded it to see what the performance looked like - it was pretty astounding (I'm not a game player myself, but free is free). I don't know how that would translate to video editing software, but there was no aliasing and the video response was quite quick and smooth. I believe I paid $190 for the card, making it the second most expensive part of the build after the CPU.
-
For what it's worth, I have the 3930K with Sabertooth x79, and a 4GB GTX 680. It's been 4 months, no problems so far. Haven't done a ton of video editing, but this card, or even a 570 would be great for Premiere.
The GTX 780 has been released, but only in a 3GB version. The Titan is a beast, but I heard there's a new GPU coming out next year, so no need for me to upgrade in the meantime.
-
For what it is worth:
http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review
Cheers,
Bernard