
Author Topic: I guess Apple will never go to 10 bit  (Read 1420 times)

Lundberg02

  • Sr. Member
  • ****
  • Offline
  • Posts: 379
I guess Apple will never go to 10 bit
« on: April 23, 2015, 02:36:00 am »


I've read lots of pros and cons about 10-bit/channel video, and that Windows has it but it's problematic. Apple is offering support for various 4K and 5K monitors, but says nothing about 10 bit. Moviemakers are pulling way ahead with some of the new cameras and displays, color spaces, and other things. Some computer monitors offer 10 and 12 bit (I think), both real and fake.
I don't understand why Apple doesn't offer a position on this.
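
As an illustration of the "real and fake" 10-bit mentioned above: "fake" 10-bit usually means an 8-bit panel using frame-rate control (FRC), i.e. rapidly alternating between the two nearest 8-bit codes so the time-average lands on the 10-bit value. A minimal Python sketch of the idea (the codes and frame counts are illustrative choices, not tied to any particular monitor):

Code:
# Illustrative sketch: how an "8-bit + FRC" panel can fake a 10-bit value
# by alternating between the two nearest 8-bit codes over successive frames.

def frc_frames(code10, n_frames=4):
    """Approximate a 10-bit code on an 8-bit panel by temporal dithering."""
    lo8 = code10 // 4                  # nearest 8-bit code at or below the target
    frac = (code10 % 4) / 4            # leftover fraction of one 8-bit step
    hi_count = round(frac * n_frames)  # how many frames show the higher code
    return [lo8 + 1] * hi_count + [lo8] * (n_frames - hi_count)

frames = frc_frames(514)               # 10-bit code 514 sits between 8-bit 128 and 129
average = sum(frames) / len(frames)
print(frames, "-> average", average, "=", average * 4, "in 10-bit units")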
Logged

Czornyj

  • Sr. Member
  • ****
  • Offline
  • Posts: 1950
    • zarzadzaniebarwa.pl
Re: I guess Apple will never go to 10 bit
« Reply #1 on: April 23, 2015, 03:03:52 am »

Maybe because it doesn't really matter?
Logged
Marcin Kałuża | zarzadzaniebarwa.pl

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051
Re: I guess Apple will never go to 10 bit
« Reply #2 on: April 23, 2015, 05:21:07 am »

While it may or may not matter today, Rec. 2020 (included in various 4K/UHD standards) might make it matter. It might also render in-display calibration less relevant, as heavy nonlinear processing could take place in the comfort of computer software.
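
To put a rough number on when the extra bits matter, here is a small Python sketch (my own illustration, not taken from any standard): it estimates the fractional luminance jump between adjacent code values for 8-bit vs 10-bit gamma-encoded video. The display gamma of 2.4 and the ~1% visibility threshold are assumptions; a wider gamut like Rec. 2020 stretches the same number of steps over a larger color volume, which is why the banding question comes up.

Code:
# Illustrative sketch: relative luminance step between adjacent code values
# for 8-bit vs 10-bit encoding with a simple power-law (gamma) curve.
# GAMMA and THRESHOLD are rough assumptions, not taken from any spec.

GAMMA = 2.4          # assumed display gamma
THRESHOLD = 0.01     # rough ~1% visibility threshold for a luminance step

def relative_step(code, bits, gamma=GAMMA):
    """Fractional luminance jump when going from `code` to `code + 1`."""
    levels = 2 ** bits - 1
    lo = (code / levels) ** gamma
    hi = ((code + 1) / levels) ** gamma
    return (hi - lo) / lo

for bits in (8, 10):
    code = int(0.25 * (2 ** bits - 1))   # a dark-to-mid tone where banding shows first
    step = relative_step(code, bits)
    verdict = "may band" if step > THRESHOLD else "below threshold"
    print(f"{bits}-bit, code {code}: step {step * 100:.2f}% -> {verdict}")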

For audio, 24-bit playback offers no audible benefit over 16 bits, yet manufacturers choose to implement it. Perhaps because it comes at zero cost, perhaps because they want to be on the safe side, perhaps because marketing demands it.

-h
Logged

joofa

  • Sr. Member
  • ****
  • Offline
  • Posts: 544
Re: I guess Apple will never go to 10 bit
« Reply #3 on: April 23, 2015, 02:30:57 pm »

Quote from: hjulenissen on April 23, 2015, 05:21:07 am
For audio, 24-bit playback offers no audible benefit over 16 bits, yet manufacturers choose to implement it. Perhaps because it comes at zero cost, perhaps because they want to be on the safe side, perhaps because marketing demands it.

Well, a 24-bit analog-to-digital conversion using subtractive dither, where the signal has 16 bits and the dither 8 bits, has advantages over a pure 16-bit signal. I don't know if manufacturers implement such systems.
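
For readers unfamiliar with the term, here is a minimal Python sketch of the subtractive-dither idea. The parameters (a one-LSB uniform dither, a shared pseudo-random generator, a test tone below one 16-bit step) are illustrative choices of mine, not the 16-bit-signal / 8-bit-dither scheme described above; the point is just that a known dither added before quantization and subtracted afterwards lets information below one step survive, where plain rounding destroys it.

Code:
# Illustrative sketch of subtractive dither: a dither sequence known to both
# sides is added before 16-bit quantization and subtracted again on playback.
import numpy as np

rng = np.random.default_rng(1)        # stands in for a dither generator shared by ADC and DAC
STEP = 1.0 / (2 ** 15)                # one 16-bit quantization step for a signal in [-1, 1)

def quantize(x, step=STEP):
    return np.round(x / step) * step

n = 48000
t = np.arange(n) / 48000.0
signal = 0.25 * STEP * np.sin(2 * np.pi * 1000 * t)   # 1 kHz tone below one LSB

dither = (rng.random(n) - 0.5) * STEP                  # uniform dither, one step wide

plain = quantize(signal)                               # plain rounding: everything lands on zero
subtractive = quantize(signal + dither) - dither       # dither removed after quantization

for name, out in (("plain 16-bit", plain), ("subtractive ", subtractive)):
    err = out - signal
    tone = np.dot(out, signal) / np.dot(signal, signal)  # ~1.0 if the tone survives
    print(f"{name}: error RMS {np.sqrt(np.mean(err ** 2)):.2e}, tone level {tone:.2f}")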
Logged
Joofa
http://www.djjoofa.com
Download Photoshop and After Effects plugins

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051
Re: I guess Apple will never go to 10 bit
« Reply #4 on: April 24, 2015, 04:32:29 am »

Quote from: joofa on April 23, 2015, 02:30:57 pm
Well, a 24-bit analog-to-digital conversion using subtractive dither, where the signal has 16 bits and the dither 8 bits, has advantages over a pure 16-bit signal. I don't know if manufacturers implement such systems.
Note that I did say "playback". For recording, there is no limit to how much precision might be beneficial once you allow creative mangling of the sound files, much like the case in photography.

I don't think that anyone is able to distinguish (even naively) truncated 16-bit audio from properly dithered 16-bit audio in a properly blinded and "relevant" test.

In layman's terms, I think that the very best 24+ bit audio ADCs/DACs out there can be compared to an ideal 20-bit (or so) converter. In theory, having 20 rather than 16 bits of accuracy is a good thing, but once you put it into a room with a 60 dB or 80 dB difference between peak level and noise floor, the difference gets buried very deep in the noise.
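
A quick back-of-the-envelope version of that argument in Python, using the standard ideal-quantizer SNR formula of 6.02·N + 1.76 dB (the 60-80 dB room window is the figure quoted above; everything else is just arithmetic):

Code:
# Ideal-quantizer dynamic range for a few bit depths, versus a listening-room window.

def ideal_snr_db(bits):
    """Theoretical SNR of an ideal N-bit quantizer for a full-scale sine wave."""
    return 6.02 * bits + 1.76

for bits in (16, 20, 24):
    print(f"{bits}-bit ideal converter: ~{ideal_snr_db(bits):.0f} dB dynamic range")

room_window_db = 80   # generous peak-to-noise-floor window for a quiet room (from the post above)
print(f"Listening-room window: ~{room_window_db} dB")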

Of some relevance to this question, see e.g.:
"Audibility of a CD-Standard A/D/A Loop Inserted into High-Resolution Audio Playback*", Meyer & Moran, J. Audio Eng. Soc., Vol. 55, No. 9, 2007 September
http://drewdaniels.com/audible.pdf

-h
« Last Edit: April 24, 2015, 04:34:00 am by hjulenissen »
Logged