
Author Topic: Lightroom video card suggestion  (Read 611 times)

soboyle

  • Sr. Member
  • ****
  • Offline
  • Posts: 318
    • shaunoboylephoto.com
Lightroom video card suggestion
« on: June 20, 2018, 03:21:22 PM »

Is it me, or has Lightroom Classic CC run reeeaaaalllly slow since the last two upgrades? I haven't tested the 7.4 release from today yet, but I'm looking at replacing my video card; it's the oldest component in my system. Any suggestions on a card that will give the best performance running Lightroom without being overkill, since I don't do gaming or use other graphics-intensive programs? Adobe recommends the Nvidia GTX 700 series, but would something like a GTX 1060 or 1070 be better (more future-proof)? It will be driving an older NEC SpectraView LCD2690 monitor (1920x1200), though that will likely be replaced with something higher resolution soon.
« Last Edit: June 20, 2018, 03:45:35 PM by soboyle »
Logged

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 13920
    • http://www.digitaldog.net/
Re: Lightroom video card suggestion
« Reply #1 on: June 20, 2018, 03:29:21 PM »

Try the new version first; it's supposed to be faster.
With that SpectraView, you may want a card that supports 10 bits per color. That panel is high bit, and so is the application.
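
If you want a quick way to see whether the full 10-bit path (card, driver, cable, panel) is actually active, a smooth high-bit gradient is the usual test: an 8-bit pipeline shows visible banding in it, a true 10-bit one doesn't. Here's a minimal sketch that writes such a ramp as a 16-bit TIFF; the size and filename are arbitrary, it assumes numpy and Pillow are installed, and the result should be viewed in a high-bit-aware app such as Photoshop with 30-bit display enabled.

Code:
# Rough sketch: write a smooth 16-bit grayscale ramp for banding tests.
# Assumes numpy and Pillow; width, height and filename are arbitrary choices.
import numpy as np
from PIL import Image

width, height = 1920, 400
ramp = np.linspace(0, 65535, width, dtype=np.uint16)   # one black-to-white row
img = np.tile(ramp, (height, 1))                        # repeat it down the image

# Recent Pillow versions map a uint16 array to a 16-bit ("I;16") image.
Image.fromarray(img).save("gradient_ramp_16bit.tif")
print("Wrote gradient_ramp_16bit.tif - check it for banding on your display.")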
Logged
Andrew Rodney
Author “Color Management for Photographers”

soboyle

  • Sr. Member
  • ****
  • Offline
  • Posts: 318
    • shaunoboylephoto.com
Re: Lightroom video card suggestion
« Reply #2 on: June 20, 2018, 03:42:57 PM »

Thanks Andrew. Any suggestion on cards that support 10 bit?
Logged

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 13920
    • http://www.digitaldog.net/
Re: Lightroom video card suggestion
« Reply #3 on: June 20, 2018, 03:45:37 PM »

Quote from: soboyle
Thanks Andrew. Any suggestion on cards that support 10 bit?
Sorry, no; no direct experience with any cards that can be replaced. I'm running a 15" MacBook Pro; I'm told it's an Intel HD Graphics 530/Radeon Pro 460. I know it's 10-bit.
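
(For anyone who would rather check than take someone's word for it, here is a minimal macOS-only sketch using the stock system_profiler tool; the field names it filters for are the ones that tool prints on current macOS releases.)

Code:
# Rough sketch (macOS only): print the GPU model, resolution and framebuffer
# depth that the OS reports. Uses the built-in system_profiler command.
import subprocess

out = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    if any(key in line for key in ("Chipset Model", "Resolution", "Framebuffer Depth")):
        print(line.strip())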
Logged
Andrew Rodney
Author “Color Management for Photographers”

soboyle

  • Sr. Member
  • ****
  • Offline
  • Posts: 318
    • shaunoboylephoto.com
Re: Lightroom video card suggestion
« Reply #4 on: June 20, 2018, 03:48:15 PM »

Can I ask what external monitor you are plugging into your MacBook Pro?
Logged

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 13920
    • http://www.digitaldog.net/
Re: Lightroom video card suggestion
« Reply #5 on: June 20, 2018, 03:49:25 PM »

Quote from: soboyle
Can I ask what external monitor you are plugging into your MacBook Pro?
NEC SpectraView PA272W.
Logged
Andrew Rodney
Author “Color Management for Photographers”

Rory

  • Sr. Member
  • ****
  • Offline
  • Posts: 503
    • Recent images
Re: Lightroom video card suggestion
« Reply #6 on: June 20, 2018, 04:23:32 PM »

Quote from: soboyle
Any suggestion on cards that support 10 bit?

I do not think Lightroom supports 10-bit.
Logged

Wayne Fox

  • Sr. Member
  • ****
  • Offline
  • Posts: 4059
    • waynefox.com
Re: Lightroom video card suggestion
« Reply #7 on: June 20, 2018, 05:42:38 PM »

Might want to look elsewhere for possible slowdown problems.  If anything, LR seems snappier in the last couple of releases.  It certainly doesn't seem slower to me at all.

You didn't post the specs on what hardware/OS you are using.  Things like HD/SSD performance, RAM, and CPU are, to me, more important than the video card.  A lot of users have their video card options turned off in LR.


Logged

Frodo

  • Jr. Member
  • **
  • Offline
  • Posts: 96
Re: Lightroom video card suggestion
« Reply #8 on: June 24, 2018, 03:04:20 AM »

Quote from: Rory
I do not think Lightroom supports 10-bit.
Not on Windows, at least, AFAIK.
Logged

Hoggy

  • Full Member
  • ***
  • Offline
  • Posts: 201
  • Never take life, or anything in it, too seriously.
Re: Lightroom video card suggestion
« Reply #9 on: June 26, 2018, 01:24:21 AM »

I don't think LR does 10-bit - yet.  But I'm hoping it will soon.
However, PS does support 10-bit.

Since you said you may be getting a higher-resolution monitor, presumably 4K or so, the only ones I really know about that support 10-bit are the NVIDIA Quadro series cards.

I have a BenQ SW320 monitor ready and waiting for when my new super massively WAY-over-the-top decked-out Monster Dream Machine ships soon. (I'm giddy as a little girl..  ;D 8) (for those that understand that reference))  That monitor is 4K, 10-bit, HDR10, so it needs a beefy video card.  I chose the 8GB NVIDIA Quadro P4000.  It's able to drive, IIRC, four 8K monitors at 60Hz or four 4K monitors at 120Hz.  So it should be able to drive that monitor just fine - along with a 1080p TV via HDMI (via a 35-foot DisplayPort-to-HDMI converter cable).

From what I understand from my recent multiple crash-courses in current tech, the Quadros are geared towards 'professionals' and the GTX series towards gamers.  I'm not a pro, but since I don't game either, I decided to go with the Quadro for its 10-bit capability.
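
As a rough back-of-the-envelope check (my own arithmetic, not from the card's spec sheet; the ~10% blanking overhead and the DisplayPort payload rates below are approximations), the bandwidth numbers show why a 4K, 10-bit, high-refresh panel needs a serious output:

Code:
# Rough sketch: approximate uncompressed RGB bandwidth for a few display modes.
# DisplayPort payload after 8b/10b coding: DP 1.2 (HBR2) ~17.3 Gbit/s,
# DP 1.4 (HBR3) ~25.9 Gbit/s. The 10% blanking overhead is an approximation.

def required_gbps(width, height, refresh_hz, bits_per_channel, overhead=1.10):
    bits_per_pixel = 3 * bits_per_channel            # R, G, B
    return width * height * refresh_hz * bits_per_pixel * overhead / 1e9

modes = {
    "4K 60 Hz, 8-bit":   (3840, 2160, 60, 8),
    "4K 60 Hz, 10-bit":  (3840, 2160, 60, 10),
    "4K 120 Hz, 10-bit": (3840, 2160, 120, 10),
}
for label, mode in modes.items():
    print(f"{label:>18}: ~{required_gbps(*mode):.1f} Gbit/s")

# Roughly 13, 16 and 33 Gbit/s: 4K/60 at 10-bit just fits a DP 1.2 link,
# while 4K/120 at 10-bit exceeds even an uncompressed DP 1.4 link and
# needs DSC or a newer standard.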
« Last Edit: June 26, 2018, 01:54:19 AM by Hoggy »
Logged
Cams: Pentax K-3, K-30 & Canon G7X, S100
Firm supporter of DNG, throwing away originals.
It's the hash, man..  That good hash!