
Author Topic: Lightroom video card suggestion  (Read 2550 times)

soboyle

  • Sr. Member
  • ****
  • Offline
  • Posts: 360
    • shaunoboylephoto.com
Lightroom video card suggestion
« on: June 20, 2018, 03:21:22 pm »

Is it me or has Lightroom Classic CC run reeeaaaalllly slow since the last 2 upgrades? I haven't tested the 7.4 release from today yet, but I'm looking at replacing my video card, it's the oldest component in my system. Any suggestions on a card which will give the best performance running Lightroom but not be overkill since I don't do gaming or use other graphics intense programs. Adobe recommends Nvidia GTX 700 series but would something like a GTX 1060 or 1070 be better (more future proof)? It will be driving an older NEC Spectraview LCD2690 monitor (1920x1200) but that will likely be replaced with something with more resolution soon.
« Last Edit: June 20, 2018, 03:45:35 pm by soboyle »
Logged

digitaldog

  • Sr. Member
  • ****
  • Online
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: Lightroom video card suggestion
« Reply #1 on: June 20, 2018, 03:29:21 pm »

Try the new version first, it's supposed to be faster.
With that SpectraView, you may want a card that supports 10 bits per color. That panel is high-bit, and so is the application.
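To put rough numbers on what 10-bit buys you: 8 bits per channel gives 256 levels, 10 bits gives 1024, so each tonal step is about a quarter the size and smooth gradients are far less prone to banding. A quick sketch of the arithmetic (plain Python, nothing beyond the standard library):

```python
# Tonal levels per channel at 8 vs. 10 bits, and the size of each step
# as a percentage of the full luminance range.
for bits in (8, 10):
    levels = 2 ** bits
    step = 100.0 / (levels - 1)
    print(f"{bits}-bit: {levels} levels per channel, ~{step:.3f}% per step")
```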
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

soboyle

  • Sr. Member
  • ****
  • Offline
  • Posts: 360
    • shaunoboylephoto.com
Re: Lightroom video card suggestion
« Reply #2 on: June 20, 2018, 03:42:57 pm »

Thanks Andrew. Any suggestion on cards that support 10 bit?

digitaldog

  • Sr. Member
  • ****
  • Online
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: Lightroom video card suggestion
« Reply #3 on: June 20, 2018, 03:45:37 pm »

Thanks Andrew. Any suggestion on cards that support 10 bit?
Sorry no, no direct experience with any cards that can be replaced. I'm running a 15" MacBook Pro, I'm told it is' an Intel HD Graphics 530/Radeon Pro 460. I know it's 10-bit.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

soboyle

  • Sr. Member
  • ****
  • Offline
  • Posts: 360
    • shaunoboylephoto.com
Re: Lightroom video card suggestion
« Reply #4 on: June 20, 2018, 03:48:15 pm »

Can I ask what external monitor you are plugging into your MacBook Pro?

digitaldog

  • Sr. Member
  • ****
  • Online
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: Lightroom video card suggestion
« Reply #5 on: June 20, 2018, 03:49:25 pm »

Can I ask what external monitor you are plugging into your MacBook Pro?
NEC SpectraView PA272W.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

Rory

  • Sr. Member
  • ****
  • Offline
  • Posts: 528
    • Recent images
Re: Lightroom video card suggestion
« Reply #6 on: June 20, 2018, 04:23:32 pm »

Any suggestion on cards that support 10 bit?

I do not think Lightroom supports 10-bit.
Logged

Wayne Fox

  • Sr. Member
  • ****
  • Offline
  • Posts: 4237
    • waynefox.com
Re: Lightroom video card suggestion
« Reply #7 on: June 20, 2018, 05:42:38 pm »

Might want to look elsewhere for possible slowdown problems.  If anything, LR seems snappier the last couple of releases.  It certainly doesn’t seem slower to me at all.

You didn’t post the specs on what hardware/OS you are using.  Things like HD/SSD performance, RAM, and CPU are, to me, more important than the video card.  A lot of users have their video card options turned off in LR.


Logged

Frodo

  • Full Member
  • ***
  • Offline
  • Posts: 152
Re: Lightroom video card suggestion
« Reply #8 on: June 24, 2018, 03:04:20 am »

I do not think Lightroom supports 10-bit.
Not on Windows, at least, AFAIK.
Logged

Hoggy

  • Full Member
  • ***
  • Offline
  • Posts: 207
  • Never take life, or anything in it, too seriously.
Re: Lightroom video card suggestion
« Reply #9 on: June 26, 2018, 01:24:21 am »

I don't think LR does 10-bit - yet.  But I'm hoping it will soon.
However PS does support 10-bit.
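If you want to sanity-check whether a given chain (card, driver, cable, panel) is actually delivering 10-bit, the usual test is a smooth gray ramp: on an 8-bit path a full-width ramp shows visible banding, while on a true 10-bit path it should look continuous. Here's a minimal sketch that writes a 16-bit test ramp you can open in Photoshop; it assumes Python with the numpy and tifffile packages installed:

```python
import numpy as np
import tifffile  # third-party package: pip install tifffile

# Build a horizontal gray ramp stored at 16 bits per channel, so the file
# holds more precision than either an 8-bit or a 10-bit display path.
width, height = 4096, 512
ramp = np.linspace(0, 65535, width, dtype=np.uint16)
gray = np.tile(ramp, (height, 1))
rgb = np.stack([gray] * 3, axis=-1)  # replicate the ramp into R, G, B

tifffile.imwrite("gray_ramp_16bit.tif", rgb)
```

View it at 100% in an application with 30-bit output enabled: if the banding you see on an 8-bit path disappears, the 10-bit pipeline is working.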

Since you said you may be getting a higher resolution monitor, presumably 4k or so..  The only ones I really know about that support 10-bit are the NVIDIA Quadro series cards.

I have a BenQ SW320 monitor ready and waiting for when my new super massively WAY-over-the-top decked-out Monster Dream Machine ships soon. (I'm giddy as a little girl..  ;D 8) (for those that understand that reference))  That monitor is 4K, 10-bit, HDR10, so it needs a beefy video card.  I chose the 8GB NVIDIA Quadro P4000.  It's able to drive, IIRC, four 8K monitors at 60Hz or four 4K monitors at 120Hz.  So it should be able to drive that monitor just fine - along with a 1080p TV via HDMI (via a 35-foot DisplayPort to HDMI converter cable).
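As a rough sanity check on why a 4K, 10-bit panel wants a serious card and cable, the uncompressed pixel bandwidth is easy to estimate. This ignores blanking overhead, so the real link requirement is somewhat higher, but it shows the scale (plain Python, just arithmetic):

```python
# Rough uncompressed video bandwidth in Gbit/s, ignoring blanking intervals.
def bandwidth_gbps(width, height, hz, bits_per_channel, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

print(bandwidth_gbps(3840, 2160, 60, 10))   # ~14.9  (4K, 10-bit, 60 Hz)
print(bandwidth_gbps(3840, 2160, 120, 10))  # ~29.9  (4K, 10-bit, 120 Hz)
print(bandwidth_gbps(1920, 1080, 60, 8))    # ~3.0   (1080p TV over HDMI)
```

For comparison, DisplayPort 1.4 carries roughly 25-26 Gbit/s of usable payload, which is why high refresh rates and multi-monitor setups start to depend on the card's outputs and compression support.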

From what I understand from my recent multiple crash-courses in current tech, the Quadros are geared towards 'professionals', and the GTX series towards gamers.  I'm not a pro, but since I don't game either, I decided to go with the Quadro for its 10-bit capability.
« Last Edit: June 26, 2018, 01:54:19 am by Hoggy »
Logged
Cams: Pentax K-3, K-30 & Canon G7X, S100
Firm supporter of DNG, throwing away originals.
It's the hash, man..  That good hash!

nemophoto

  • Sr. Member
  • ****
  • Offline
  • Posts: 1021
    • Nemo Niemann Photography
Re: Lightroom video card suggestion
« Reply #10 on: July 22, 2018, 01:28:47 pm »

From my experience, Lightroom runs slowly — regardless of the update or video card. I have a GTX 1080 and turned off the GPU acceleration because it, honestly, sucked. Performance was worse. I came across an Adobe post that basically discourages making lots of adjustments via brushes etc. On 5Ds files, LR crawls after about a dozen local adjustments. I sometimes have to quit and restart to recover the speed. I’ll notice that after a series of local adjusts, the HDD is writing continually. If I stop until I see the HDD light calm down, then I have some speed back for a bit. My scratch disk is an SSD, though I haven’t seen a huge difference with either PS or LR with it.


Logged

saiguy

  • Full Member
  • ***
  • Offline
  • Posts: 236
Re: Lightroom video card suggestion
« Reply #11 on: July 22, 2018, 04:06:32 pm »

LR does not use a scratch disk. If many local adjustments are needed, it's better to do that in PS. LR has to hold all that in memory, which is why you can benefit from a restart. Adobe says a best practice is to use as few pins as possible; in other words, make as many edits on each pin as possible. It is also possible that after the restart it loads all that back into memory anyway?

Adobe also suggests doing the local adjustment early on. But it will still slow down everything afterwards. If I want to use Dehaze, which is very processor intensive, I apply it, make any compensating edits as needed, then I turn it off and reapply it at the end of the session.

Check Adobe's LR performance tips; they explain all this best-practice stuff, including graphics card issues.
Logged

nemophoto

  • Sr. Member
  • ****
  • Offline
  • Posts: 1021
    • Nemo Niemann Photography
Re: Lightroom video card suggestion
« Reply #12 on: July 23, 2018, 10:21:52 pm »

LR, from what I was told, DOES use a sort of scratch disk, though not in the same way as PS. It uses it to cache aspects of the program and file you are working on. But no, not a scratch disk in the same way. I also read the article about doing all your local edits and THEN doing your global adjustments. Stupid approach. But the reasoning was that for any local adjustments you make, LR then has to apply all the various global adjustments as well.

I think the way LR was written, there were never supposed to be local adjustments. It was supposed to convert your RAW files, based on global edits, and then anything else was supposed to be done in PS. Short of a complete re-write, LR has been painted, somewhat, into a corner. I can't figure out how Capture One can do a lot of similar things (though granted, for me, not as easily and straightforwardly) and not grind to a halt. My 5Ds files are the benchmark for me. Capture One flies through them, by comparison. But I still like many of the results from LR better, and so I stick with it.


I also find turning on GPU support is almost detrimental to my performance. I have NEVER, on any of my three computers, found it useful.
Logged

jrsforums

  • Sr. Member
  • ****
  • Offline
  • Posts: 1288
Re: Lightroom video card suggestion
« Reply #13 on: July 23, 2018, 11:29:42 pm »

LR, from what I was told, DOES use a sort of scratch disk, though not in the same way as PS. It uses it to cache aspects of the program and file you are working on. But no, not a scratch disk in the same way. I also read the article about doing all your local edits and THEN doing your global adjustments. Stupid approach. But the reasoning was that for any local adjustments you make, LR then has to apply all the various global adjustments as well.

I think the way LR was written, there were never supposed to be local adjustments. It was supposed to convert your RAW files, based on global edits, and then anything else was supposed to be done in PS. Short of a complete re-write, LR has been painted, somewhat, into a corner. I can't figure out how Capture One can do a lot of similar things (though granted, for me, not as easily and straightforwardly) and not grind to a halt. My 5Ds files are the benchmark for me. Capture One flies through them, by comparison. But I still like many of the results from LR better, and so I stick with it.


I also find turning on GPU support is almost detrimental to my performance. I have NEVER, on any of my three computers, found it useful.

I found that enabling the GPU (NVIDIA 1080) caused performance problems (stuttering sliders, etc.).  I unchecked “use smart previews instead of originals for image editing” and the problem went away.  I have no idea why.

All other performance did not suffer and probably improved, though I did not measure. 

The environment is Windows 10 with two hi-res displays attached; both Windows and the displays make good use of the GPU when it's engaged.
Logged
John

Hoggy

  • Full Member
  • ***
  • Offline
  • Posts: 207
  • Never take life, or anything in it, too seriously.
Re: Lightroom video card suggestion
« Reply #14 on: July 24, 2018, 02:01:48 am »

I also find turning on GPU support is almost detrimental to my performance. I have NEVER, on any of my three computers, found it useful.

Like jrsforums, I also noticed a rather hefty speed increase by turning on the GPU..  On this circa-2012 Win10 AMD laptop - which has integrated graphics.  I'm not sure if it was due to the shared-memory aspects and/or APU integration, and/or due to the fact that I'm using an aftermarket LCD panel that is Adobe-gamut 1080p vs. the 720p(?) sRGB-gamut OEM panel.

Of course, ever since 2015.10 or so, GPU use has been "disabled due to errors"..  So this hobby has been put on hold until my new supercomputer arrives soon, hopefully working this time around.  :-/
Logged
Cams: Pentax K-3, K-30 & Canon G7X, S100
Firm supporter of DNG, throwing away originals.
It's the hash, man..  That good hash!