Pages: [1] 2

Author Topic: Video Card for Mac Pro  (Read 12925 times)

Josh-H

  • Sr. Member
  • ****
  • Offline
  • Posts: 2079
    • Wild Nature Photo Travel
Video Card for Mac Pro
« on: September 25, 2008, 12:15:17 am »

Is there any need or reason to upgrade the standard video card that comes in the latest Mac Pros for working on raw files and TIFFs? TIFFs up to and over 1 GB, and raws from 21 and 31 MP cameras.

Reason for asking is that the Mac Pro I have ordered is being delayed by the video card, which I had upgraded from the stock ATI Radeon HD 2600 XT with 256MB to the NVIDIA GeForce 8800 GT 512MB.

If there is no need for the upgraded card I'll swap back to the stock unit and save time and $.

I won't be editing video on this machine *gulp* famous last words I expect...

Ta.
Logged
Wild Nature Photo Travel

kaelaria

  • Sr. Member
  • ****
  • Offline
  • Posts: 2223
    • http://www.bgpictures.com
Video Card for Mac Pro
« Reply #1 on: September 25, 2008, 09:59:58 am »

No, it has nothing to do with it.
Logged

Schewe

  • Sr. Member
  • ****
  • Offline
  • Posts: 6229
    • http://www.schewephoto.com
Video Card for Mac Pro
« Reply #2 on: September 25, 2008, 11:55:35 am »

Quote
Is there any need or reason to upgrade the standard video card that comes in the latest Mac Pros for working on raw files and TIFFs? TIFFs up to and over 1 GB, and raws from 21 and 31 MP cameras.


For Photoshop CS4, the GPU (and having a really big, fast video card) will make quite a bit of difference. Not so much in actual processing time, which is bound by CPU, RAM, and disk I/O, but for the interface: zooming and other display of pixels. CS4 will take very good advantage of a high-end video card.
Logged

Josh-H

  • Sr. Member
  • ****
  • Offline
  • Posts: 2079
    • Wild Nature Photo Travel
Video Card for Mac Pro
« Reply #3 on: September 25, 2008, 07:11:35 pm »

Thanks - I'll suck it up and wait for delivery with the better video card.
Logged
Wild Nature Photo Travel

Jack Flesher

  • Sr. Member
  • ****
  • Offline
  • Posts: 2592
    • www.getdpi.com
Video Card for Mac Pro
« Reply #4 on: September 25, 2008, 07:36:29 pm »

First off, the "basic" card is no slouch, will drive 2 dual-DVI monitors flawlessly and renders 2-D apps instantly.  So basically, you are not going to gain much of anything with the more expensive card for your stated purposes.  If OTOH you plan on high-end 3-D gaming with two or three monitors, by all means go with the optional card.  FWIW you can always add the bigger card later and run both cards and four monitors at a total added cost of about $100...
Logged
Jack

Schewe

  • Sr. Member
  • ****
  • Offline
  • Posts: 6229
    • http://www.schewephoto.com
Video Card for Mac Pro
« Reply #5 on: September 25, 2008, 10:21:54 pm »

Again, for CS4 (not talking about the current Photoshop CS3, but CS4, which WILL be coming out soon), a beefy video card with AT LEAST 512MB (more is better) WILL improve certain functions and performance, because CS4 is very seriously trying to leverage the video card's GPU. A bigger and faster video card WILL benefit CS4, but not any version of Photoshop before CS4.

So, if you plan on getting CS4 soonish, and you are spec'ing a machine now-ish, it would definitely make sense to go for a higher-end video card now.
« Last Edit: September 25, 2008, 11:10:50 pm by Schewe »
Logged

kaelaria

  • Sr. Member
  • ****
  • Offline
  • Posts: 2223
    • http://www.bgpictures.com
Video Card for Mac Pro
« Reply #6 on: September 25, 2008, 10:54:30 pm »

Good to know!
Logged

bjanes

  • Sr. Member
  • ****
  • Offline
  • Posts: 3387
Video Card for Mac Pro
« Reply #7 on: September 26, 2008, 09:33:22 am »

Quote
Again, for CS4 (not talking about the current Photoshop CS3, but CS4, which WILL be coming out soon), a beefy video card with AT LEAST 512MB (more is better) WILL improve certain functions and performance, because CS4 is very seriously trying to leverage the video card's GPU. A bigger and faster video card WILL benefit CS4, but not any version of Photoshop before CS4.

So, if you plan on getting CS4 soonish, and you are spec'ing a machine now-ish, it would definitely make sense to go for a higher-end video card now.

[a href=\"http://blogs.adobe.com/jnack/2008/09/photoshop_gpu_a.html]John Nack PSCS4 GPU Advice[/url] gives some suggestions for GPUs that will take advantage of the new features of PSCS4. Basically, he recommends a card that is Shader Model 3.0 and OpenGL 2.0 compatible with 128-512MB RAM.

The Newegg site makes some useful comments on GPUs and breaks them down into the following categories, to which I have added my own comments to start discussion:

GAMING: ATI Radeon X1000 series or above, NVIDIA GeForce 6 series or above

These guys are crazy and use overclocked and probably unstable machines. A high frame rate and good video shading are needed, but accurate color rendering and a high quality still image may be secondary considerations.

VIDEO and MULTIMEDIA: ATI Radeon X1000 series or above, NVIDIA GeForce 6 series or above

Here we need a high frame rate and hardware encoding and decoding of compressed video formats along with I/O ports for video. This is not necessary for PSCS4.

PROFESSIONAL 2D: ATI FireMV series or NVIDIA Quadro NVS series

These cards were fine for PSCS3 but not ideal for PSCS4.

PROFESSIONAL 3D: ATI FireGL series or NVIDIA Quadro FX series

The professional designation sounds good, but we are not doing CAD in Photoshop. Are these cards advantageous for PS and worth the additional cost?

Finally, there is a new generation of displays with a large color gamut and 30-bit video lookup tables, such as the new HP 30-bit color display.

To take full advantage of such a display, one needs 30 bit support in the operating system, software application, and video card. I have read that PS does not currently support 30 bit color, nor do current widely available video cards and operating systems used for PS.
Logged

jerryrock

  • Sr. Member
  • ****
  • Offline
  • Posts: 608
    • The Grove Street Photographer
Video Card for Mac Pro
« Reply #8 on: September 27, 2008, 07:40:07 pm »

Quote
Finally, there is a new generation of displays with a large color gamut and 30-bit video lookup tables, such as the new HP 30-bit color display.

To take full advantage of such a display, one needs 30 bit support in the operating system, software application, and video card. I have read that PS does not currently support 30 bit color, nor do current widely available video cards and operating systems used for PS.
[a href=\"index.php?act=findpost&pid=224594\"][{POST_SNAPBACK}][/a]

Most modern video cards output 10-bit color, which is all that is required for this monitor. The 30-bit color of the monitor you mentioned is deceiving because it is an RGB LED monitor, which takes the 10-bit output from the video card and adjusts it for each color channel with the internal LUT.

"Q . What is a “10-bit panel?” Why is it important for image quality?

A . The HP DreamColor LP2480zx display’s internal electronics can adjust the brightness of red,
green or blue to any of 1024 levels (210 hence, “10 bits/color” or “30 bits/pixel). This
allows the internal calibration tables (for color space and tone response) to work their magic
without introducing banding artifacts or loss of detail. Furthermore, with the use of the
DisplayPort 1.1 or HDMI 1.3 inputs provided on the HP DreamColor LP2480zx display, the
computer is able to send 10-bit/color images to the monitor without losing detail.

Q . Do I need a 30-bit graphics card (also called 10-bit) to use the HP DreamColor LP2480zx
display?

A . No. The HP DreamColor LP2480zx display has 36-bit precision in the HP DreamColor
Engine. A full 30-bit pixel is sent from the DreamColor Engine to be displayed on the HP 30-
bit LCD panel with no dithering or frame rate control. However, even with an 8-bit per color
channel, 24-bits per pixel graphics card, the user benefits from the HP 30-bit LCD panel
because the HP DreamColor Engine still selects colors from the full 1.07 billion color palette.
The benefit is more accurate gamut control and tone response and the virtual elimination of
visual artifacts such as banding or contouring."
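
As a quick arithmetic check of the figures quoted above (a plain sketch, nothing HP-specific), the levels per channel and palette sizes fall straight out of the bit depths:

Code:
# Levels per channel and total palette size for common per-channel bit depths.
for bits in (6, 8, 10):
    levels = 2 ** bits            # distinct values per channel
    palette = levels ** 3         # R x G x B combinations
    print(f"{bits} bits/channel: {levels} levels, {palette:,} colors")

# 8 bits/channel  ->  256 levels,    16,777,216 colors (~16.7 million)
# 10 bits/channel -> 1024 levels, 1,073,741,824 colors (~1.07 billion)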
Logged
Gerald J Skrocki

bjanes

  • Sr. Member
  • ****
  • Offline
  • Posts: 3387
Video Card for Mac Pro
« Reply #9 on: September 28, 2008, 08:28:16 am »

Quote
Most modern video cards output 10-bit color, which is all that is required for this monitor. The 30-bit color of the monitor you mentioned is deceiving because it is an RGB LED monitor, which takes the 10-bit output from the video card and adjusts it for each color channel with the internal LUT.



Just because you have a 10-bit video card does not mean that your system can display 10-bit images properly, as Andrew Rodney points out in this post: http://www.prophotohome.com/forum/open-photography-discussion/74081-eizo-monitor-matrox-10-bit.html#post449307. That post was a year ago, and perhaps things have changed since then. If the DigitalDog is monitoring this thread, perhaps he can update us.

Bill
Logged

bjanes

  • Sr. Member
  • ****
  • Offline
  • Posts: 3387
Video Card for Mac Pro
« Reply #10 on: September 28, 2008, 10:27:07 am »

Quote
Just because you have a 10-bit video card does not mean that your system can display 10-bit images properly, as Andrew Rodney points out in this post. That post was a year ago, and perhaps things have changed since then. If the DigitalDog is monitoring this thread, perhaps he can update us.

Bill


This Karl Lang quote from 2005 (http://luminous-landscape.com/forum/index.php?showtopic=9613&view=findpost&p=54301) contains additional information on the matter. It is three years old, technology may have advanced since then, and I am eagerly awaiting updates.

Bill
Logged

jerryrock

  • Sr. Member
  • ****
  • Offline
  • Posts: 608
    • The Grove Street Photographer
Video Card for Mac Pro
« Reply #11 on: September 28, 2008, 10:38:52 am »

Quote
Just because you have a 10-bit video card does not mean that your system can display 10-bit images properly, as Andrew Rodney points out in this post. That post was a year ago, and perhaps things have changed since then. If the DigitalDog is monitoring this thread, perhaps he can update us.

Bill
[a href=\"index.php?act=findpost&pid=225165\"][{POST_SNAPBACK}][/a]

Please note that Photoshop CS4 lists a 16-bit video card as a minimum requirement.


Jerry
Logged
Gerald J Skrocki

bjanes

  • Sr. Member
  • ****
  • Offline
  • Posts: 3387
Video Card for Mac Pro
« Reply #12 on: September 28, 2008, 10:54:12 am »

Quote
Please note that Photoshop CS4 lists a 16-bit video card as a minimum requirement.
Jerry

Jerry,

Is that 16 bits per channel or 16 bits across all three channels? I was not aware that video cards with 16 bits per channel (48 bits total) existed. See this link on 16-bit video for details: http://www.scantips.com/basics11.html

Who makes 16 bit video cards?

Bill
« Last Edit: September 28, 2008, 10:54:59 am by bjanes »
Logged

jerryrock

  • Sr. Member
  • ****
  • Offline
  • Posts: 608
    • The Grove Street Photographer
Video Card for Mac Pro
« Reply #13 on: September 28, 2008, 11:18:59 am »

Quote
Jerry,

Is that 16 bits per channel or 16 bits across all three channels? I was not aware that video cards with 16 bits per channel (48 bits total) existed. See this 16-bit video link for details.

Who makes 16 bit video cards?

Bill

That's why I mentioned in my previous post that the description of the "30-bit" monitor is confusing.
I am assuming that the CS4 video card requirement describes the combined bit depth, e.g. 32-bit = (8 + 8 + 8 + alpha channel) = 16.7 million colors.
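
To make that "combined bit depth" reading concrete, here is a small sketch of the usual convention (an illustration only, not anything taken from Adobe's system requirements): a "32-bit" display mode packs four 8-bit channels into one 32-bit word, so only 24 of those bits are color.

Code:
# "32-bit" display depth: 8 bits each for R, G and B plus an 8-bit alpha
# channel packed into a single 32-bit word, i.e. 24 bits of actual color.
def pack_rgba8(r, g, b, a=255):
    """Pack 8-bit R, G, B, A values into one 32-bit integer."""
    return (a << 24) | (r << 16) | (g << 8) | b

def unpack_rgba8(pixel):
    """Recover the four 8-bit channels from a packed 32-bit pixel."""
    return ((pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF,
            pixel & 0xFF, (pixel >> 24) & 0xFF)

pixel = pack_rgba8(200, 120, 30)
assert unpack_rgba8(pixel) == (200, 120, 30, 255)
print(f"0x{pixel:08X}")  # prints 0xFFC8781E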

Any of the ATI HD series video cards will fully support the 30 bit HP monitor.

[a href=\"http://ati.amd.com/products/Radeonhd4800/specs.html]http://ati.amd.com/products/Radeonhd4800/specs.html[/url]

Jerry
« Last Edit: September 28, 2008, 02:58:20 pm by jerryrock »
Logged
Gerald J Skrocki

GregW

  • Sr. Member
  • ****
  • Offline
  • Posts: 306
Video Card for Mac Pro
« Reply #14 on: September 28, 2008, 02:07:49 pm »

Apple's Aperture will also make use of a 'beefier' video card; it relies on Apple's Core Image technology for a lot of processing.
« Last Edit: September 28, 2008, 02:10:58 pm by GregW »
Logged

madmanchan

  • Sr. Member
  • ****
  • Offline
  • Posts: 2115
    • Web
Video Card for Mac Pro
« Reply #15 on: September 28, 2008, 03:40:01 pm »

Internally, most modern video cards (e.g., those from ATI and NVIDIA) support 32 bits per component, for up to 128 bits (R, G, B, and alpha).

This is separate from the output bit depth (i.e., what comes out the back end of the card when scanned out by a display).
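
A toy sketch of that distinction (the names and numbers are mine, not anything from an actual driver): a component held as a float internally is only quantized to the link's output depth when it is scanned out.

Code:
# Internal precision vs. output depth: carry a component as a float and
# quantize it only at "scan out" to whatever the display connection carries.
def scan_out(value, output_bits):
    """Quantize a 0.0-1.0 float component to an integer output level."""
    levels = (1 << output_bits) - 1
    clamped = max(0.0, min(1.0, value))
    return round(clamped * levels)

v = 1.0 / 3.0                  # internal floating-point value
print(scan_out(v, 8))          # 85  on an 8-bit output
print(scan_out(v, 10))         # 341 on a 10-bit output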
Logged
Eric Chan

bjanes

  • Sr. Member
  • ****
  • Offline
  • Posts: 3387
Video Card for Mac Pro
« Reply #16 on: September 28, 2008, 03:52:41 pm »

Quote
Internally, most modern video cards (e.g., those from ATI and NVIDIA) support 32 bits per component, for up to 128 bits (R, G, B, and alpha).

This is separate from the output bit depth (i.e., what comes out the back end of the card when scanned out by a display).
[a href=\"index.php?act=findpost&pid=225263\"][{POST_SNAPBACK}][/a]

Eric,

That is good to know, but is the operating system, Photoshop, or the DVI output limiting the bit depth of the system?

Bill
Logged

jerryrock

  • Sr. Member
  • ****
  • Offline
  • Posts: 608
    • The Grove Street Photographer
Video Card for Mac Pro
« Reply #17 on: September 28, 2008, 04:52:13 pm »

Quote
Eric,

That is good to know, but is the operating system, Photoshop, or the DVI output limiting the bit depth of the system?

Bill

The video card output determines the bit depth that goes to the display.
 
For example, my Mac Pro (2007) came with the ATI X1900XT video card. This card will take whatever bit depth is fed to it and convert it to 10 bits per color (30-bit output) if the display supports it. If not, it can also downsample (dither) the output to accommodate an 8-bit or 6-bit display.

[a href=\"http://ati.amd.com/products/RadeonX1900/specs.html]http://ati.amd.com/products/RadeonX1900/specs.html[/url]

"flexible display support

Dual integrated dual-link DVI transmitters
DVI 1.0 compliant / HDMI interoperable and HDCP ready*
Dual integrated 10 bit per channel 400 MHz DACs
16 bit per channel floating point HDR and 10 bit per channel DVI output
Programmable piecewise linear gamma correction, color correction, and color space conversion (10 bits per color)
Complete, independent color controls and video overlays for each display
High quality pre- and post-scaling engines, with underscan support for all outputs
Content-adaptive de-flicker filtering for interlaced displays
Xilleon™ TV encoder for high quality analog output
YPrPb component output for direct drive of HDTV displays
Spatial/temporal dithering enables 10-bit color quality on 8-bit and 6-bit displays
Fast, glitch-free mode switching
VGA mode support on all outputs
Drive two displays simultaneously with independent resolutions and refresh rates"
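
The "spatial/temporal dithering" item in that list is easy to illustrate with a toy example (a deliberately simplified sketch, not ATI's actual algorithm): a 10-bit level that falls between two 8-bit levels is shown as a mix of the two nearest levels, so the average over many frames approximates the 10-bit value.

Code:
import random

# Toy temporal dithering: display a 10-bit value on an 8-bit output by
# alternating between the two nearest 8-bit levels; the time-average
# approximates the original 10-bit value.
def dithered_average(value10, frames=10000):
    exact8 = value10 / 4.0                        # ideal fractional 8-bit level
    shown = [min(255, int(exact8 + random.random())) for _ in range(frames)]
    return sum(shown) / frames

print(dithered_average(514))   # ideal 8-bit level is 128.5; average is ~128.5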

Photoshop CS4 also features native 16-bit printing on Mac OS X, which would suggest that both the OS and Photoshop can handle the bit depth.
Logged
Gerald J Skrocki

bjanes

  • Sr. Member
  • ****
  • Offline
  • Posts: 3387
Video Card for Mac Pro
« Reply #18 on: September 28, 2008, 06:16:30 pm »

Quote
The video card output determines the bit depth that goes to the display.
 
Photoshop CS4 also features native 16-bit printing on Mac OS X, which would suggest that both the OS and Photoshop can handle the bit depth.

Perhaps so; I don't know, but that contradicts Andrew Rodney's post from a year ago. Perhaps things have changed: http://www.prophotohome.com/forum/open-photography-discussion/74081-eizo-monitor-matrox-10-bit.html#post449307

At that time he stated that neither the operating system nor the application supported 10 bits per channel.

Bill
Logged

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 20646
  • Andrew Rodney
    • http://www.digitaldog.net/
Video Card for Mac Pro
« Reply #19 on: September 29, 2008, 09:30:31 am »

Quote
Perhaps so; I don't know, but that contradicts Andrew Rodney's post from a year ago. Perhaps things have changed: Click here

At that time he stated that neither the operating system nor the application supported 10 bits per channel.

Bill
[a href=\"index.php?act=findpost&pid=225318\"][{POST_SNAPBACK}][/a]

Nothing has changed thus far.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".
Pages: [1] 2