
Author Topic: Video Card Insanity  (Read 4670 times)

Steve House

  • Full Member
  • ***
  • Offline
  • Posts: 249
Video Card Insanity
« on: May 23, 2013, 06:00:02 PM »

Trying to configure a new high-end PC and going nuts with the video card selection.  I've pretty well decided on an Intel i7-3930K CPU on either an Asus P9X79 WS or Sabertooth X79 motherboard with 32 GB RAM, but the graphics card is still up in the air.  I'll be using a wide-gamut monitor, probably the Dell U2713H, that supports 10-bit colour.  The issue is I want optimum image performance with Lightroom (mostly) and Photoshop (occasionally), but I also want to move into high-quality video editing in Premiere Pro.  Including Premiere in the software mix pretty well limits the choice to nVidia cards.  The GTX 680 (~$600) or GTX Titan (~$1k) apparently perform very well with Premiere and After Effects but don't support 10-bit colour.  Getting to 10-bit requires moving to the Quadro line, and for a similar number of CUDA cores one would be looking at the Quadro K5000 (~$1800) for something roughly equivalent to the GTX 680 in terms of processor count.  That's really pushing the budget - doable, but I have other places the $$ could be put to use - and I'm wondering if those 2 bits are worth it for general photography.  Indeed, does LR even do 10-bit, or is it coming in LR5?
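For context on what those 2 bits actually buy, here is a quick back-of-the-envelope sketch (not from the thread, just standard bit-depth arithmetic):

```python
# Tonal levels per channel at each display bit depth
levels_8 = 2 ** 8    # 256 steps per channel
levels_10 = 2 ** 10  # 1024 steps per channel

print(levels_8, levels_10)       # 256 1024
print(levels_10 // levels_8)     # 10-bit has 4x the steps per channel
print(levels_8 ** 3)             # ~16.7 million displayable colours at 8-bit
print(levels_10 ** 3)            # ~1.07 billion at 10-bit
```

So the jump is from 256 to 1024 tonal steps per channel; whether that is visible in practice is what the replies below debate.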
Logged

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 1388
    • The Last Word
Re: Video Card Insanity
« Reply #1 on: May 23, 2013, 06:23:57 PM »

You don't say what OS. If it's Win 7, be careful. 10-bit display bit depth and Aero don't play together, or didn't two years ago.

http://blog.kasson.com/?p=674

Jim

Steve House

  • Full Member
  • ***
  • Offline
  • Posts: 249
Re: Video Card Insanity
« Reply #2 on: May 23, 2013, 07:18:33 PM »

Quote from: Jim Kasson on May 23, 2013, 06:23:57 PM
You don't say what OS. If it's Win 7, be careful. 10-bit display bit depth and Aero don't play together, or didn't two years ago.

http://blog.kasson.com/?p=674

Most likely Windows 8

Sheldon N

  • Sr. Member
  • ****
  • Offline
  • Posts: 828
Re: Video Card Insanity
« Reply #3 on: May 23, 2013, 08:00:04 PM »

Quote from: Steve House on May 23, 2013, 06:00:02 PM
I'm wondering if those 2 bits are worth it for general photography.

I'd say probably not worth the investment, unless you do a lot of grayscale imagery... and I currently have a 10-bit monitor setup.
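To put a rough number on the grayscale point: on a neutral gradient all three channels step together, so the image only has as many distinct tones as a single channel. A quick sketch (the 2560 px width is an assumed monitor width, not a figure from the thread):

```python
# Hypothetical full black-to-white gradient across a 2560-px-wide display.
# "Band width" is how many adjacent pixels share one tonal value; wider
# bands are more likely to show as visible banding in smooth grayscale.
width_px = 2560

for bits in (8, 10):
    levels = 2 ** bits
    band_px = width_px / levels
    print(f"{bits}-bit: {levels} levels, ~{band_px:.1f} px per band")
```

At 8-bit each tone spans about 10 px of that gradient versus about 2.5 px at 10-bit, which is why smooth neutral gradients are the case where the extra bits are most likely to matter.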

kaelaria

  • Sr. Member
  • ****
  • Offline
  • Posts: 2222
    • http://www.bgpictures.com
Re: Video Card Insanity
« Reply #4 on: May 24, 2013, 03:36:57 AM »

Exactly.  You have no need for 10 bit, you are going way too far over the edge...

I use Premiere and Vegas.  I still use a 570 because the 6xx series doesn't work with Vegas (Sony hasn't licensed the code yet), and I use Vegas a lot more than Premiere (hate it).  CUDA support is spotty no matter the app, though, so don't stress about it; get a good sweet-spot card.  You will NOT notice the few SECONDS of difference between a $200 and a $1000 card for what you are after.  Gaming - YES.  CUDA rendering - not a chance.  Personally I would get a 660 Ti.

Sareesh Sudhakaran

  • Sr. Member
  • ****
  • Offline
  • Posts: 546
    • The Indie Farm
Re: Video Card Insanity
« Reply #5 on: May 24, 2013, 05:07:37 AM »

I wouldn't use the pseudo 10-bit mode on the Ultrasharp if I were you. Unless you have a true 10-bit monitor, it is pointless. I believe 10-bit is pointless anyway, but that's a different story.

A good GPU is necessary. The best GPU isn't. You can make do with a GTX 680 (4GB or 2GB) and a Quadro K600. The Quadro will drive two monitors and supports S3D.

If you decide on 10-bit, why not get the Quadro K600 first, and see if it meets your needs? Until a couple of years ago most people made do without CUDA.
Get the Free Comprehensive Guide to Rigging ANY Camera - one guide to rig them all - DSLRs to the Arri Alexa.

Alan Goldhammer

  • Sr. Member
  • ****
  • Offline
  • Posts: 1967
    • A Goldhammer Photography
Re: Video Card Insanity
« Reply #6 on: May 25, 2013, 08:24:22 AM »

With the caveat that I don't do video editing, I got an NVIDIA GeForce GTX 660 2GB GDDR5 for my recent build.  LR doesn't use the GPU at this point in time, so the card is pretty irrelevant for that.  The few PS operations I tried with it seemed quick and easy.  In terms of video, a free game came with the GPU, and I loaded it to see what the performance looked like; it was pretty astounding (I'm not a game player myself, but free is free).  I don't know how that would apply to video editing software, but there was no aliasing and the video response was quick and smooth.  I believe I paid $190 for the card, and it was the second most expensive part of the build next to the CPU.

RobertJ

  • Sr. Member
  • ****
  • Offline
  • Posts: 656
Re: Video Card Insanity
« Reply #7 on: May 26, 2013, 04:33:46 PM »

For what it's worth, I have the 3930K with the Sabertooth X79 and a 4GB GTX 680.  It's been 4 months with no problems so far.  I haven't done a ton of video editing, but this card, or even a 570, would be great for Premiere.

The GTX 780 has been released, but only in a 3GB version.  The Titan is a beast, but I hear there's a new GPU coming out next year, so there's no need for me to upgrade in the meantime.

BernardLanguillier

  • Sr. Member
  • ****
  • Offline
  • Posts: 9466
    • http://www.flickr.com/photos/bernardlanguillier/sets/
Re: Video Card Insanity
« Reply #8 on: May 27, 2013, 10:28:01 AM »

A few images online here!