
Author Topic: Does the emperor have clothes?  (Read 14545 times)

Tim Lookingbill

  • Sr. Member
  • ****
  • Offline
  • Posts: 2436
Re: Does the emperor have clothes?
« Reply #20 on: March 29, 2016, 03:30:34 pm »

Now that this thread has touched upon LUTs, Da Vinci Resolve and accurate color vs interpreted what do you all think of this observation on the differences of color rendering between a Sony F55 and ARRI Alexa...

http://www.dvinfo.net/article/post/making-the-sony-f55-look-filmic-with-resolve-9.html

I found it very informative in more ways than one especially on the subject of the "filmic" look the ARRI seems to have mastered brilliantly. It also confirms to me the color engine behavior of Adobe Camera Raw editing I see over other Raw converters in regards to hue/saturation changes when applying contrast either through curves or sliders.

The half and half at mid gray point of subtractive behavior in the highlights above mid gray and additive behavior below is what I observed quite often doing color separations by hand for commercial presses where dot gain in colored shadows showed additive behavior and less dot gain showed subtractive behavior (less saturation) in the upper mids to highlights.

The author of the linked article included a part 2 to his observation with this piece...

http://www.dvinfo.net/article/production/camgear/what-alexa-and-watercolors-have-in-common.html#prettyPhoto
Logged

John Brawley

  • Jr. Member
  • **
  • Offline
  • Posts: 77
Re: Does the emperor have clothes?
« Reply #21 on: March 29, 2016, 10:12:25 pm »


A target space is defined and hittable, Gretag cards are defined, 709 is defined.. I'm sure people in camera development labs have much better tools than a simple card.. one reason that they should do the work of developing the transform to a target, not the end user.


The problem is that REC 709 is pretty limited these days and REC 2020 isn't really defined.

And you guys all probably work in sRGB, which doesn't have a place in video.  You're all looking at computer screens but working to a broadcast standard designed for television screens (and gamuts), and for starters those spaces are all radically different.

I'll speak about BM because I know how they work as I talk to the engineers.  They made what they call a 709-ish LUT.  They say ISH, because believe me, you'd hate what it would look like as a straight REC709 transform, mostly because it's an OLD standard and based on TELEVISION and not computer screens.

So in keeping the spirit of 709 but making it "nicer" they made their own version.

The targets are many.  The input variables are many.  There's honestly no "right" way, because the sensor captures WAY more than the target, and then the variables of lighting, optics, WB etc mean that the best they can make is a rough fit.  And that's what it is.  You're expecting something mathematically precise, and I think I'm saying that's the same thing, but you're assuming it's a lot more precise than it actually is.

You've used the colour checker / card match function in Resolve I assume ?  While I like a chip chart as much as anyone, you do know that there just aren't enough data points on there to actually really and truly do what you are thinking is the right thing to do......some only have 6 patches...!
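(For the curious: a toy illustration of why a handful of patches only loosely pins down a camera-to-target transform. The patch values and the plain least-squares 3x3 matrix below are my own made-up sketch, not how Resolve's card match actually works.)

[code]
import numpy as np

# Hypothetical linear camera RGB readings for 6 chart patches (one row per patch)
camera = np.array([[0.18, 0.17, 0.16],
                   [0.70, 0.68, 0.65],
                   [0.05, 0.06, 0.07],
                   [0.40, 0.20, 0.15],
                   [0.15, 0.35, 0.20],
                   [0.12, 0.15, 0.45]])

# The "known" reference values for the same patches in the target space
target = np.array([[0.18, 0.18, 0.18],
                   [0.90, 0.90, 0.90],
                   [0.03, 0.03, 0.03],
                   [0.45, 0.20, 0.12],
                   [0.15, 0.40, 0.18],
                   [0.10, 0.12, 0.50]])

# Least-squares 3x3 matrix mapping camera RGB -> target RGB
M, residuals, rank, _ = np.linalg.lstsq(camera, target, rcond=None)

# The matrix fits these 6 patches well, but it says nothing about the millions of
# other spectra the sensor will meet in the real world; hence "a rough fit".
print(camera @ M)
[/code]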

I just recently learned that Olympus uses an even older standard, REC 601, for their INTERNAL recording gamut in video, but then uses REC 709 for their HDMI output!  Crazy.  They both come from the same source yet they look different on the same image.  See where this is going?  They use their own internal LUT to get to each defined space, but the results look different....



and these are often created from theoretical models of sensor response, not photometric testing. 

One would hope any maker of anything would test in the real world before launching to public sale.


You're not a manufacturer....:-)

They ALL do it because there is a lot of variation in sensor response, sensor to sensor in each camera.  Most manufacturers have a QA process that means they try to calibrate each sensor and then add secret "corrections" to the file that are specific to each sensor.

When you're trying to do calculations based on an idealised outcome, how do you choose which of your sensors is the GOLD standard when no 2 are the same ?



REC 709 for example has a kind of limited amount of DR and colour gamut, even though that's the standard we all use for mastering to TV.

As does sRGB.. but in raw stills we know that we have the data shown on screen, and more available via the exposure/temp/tint sliders and other tools, so the small space is not a problem in developing a transform to that space.

Yes, but if you shoot on different cameras in the sRGB profile they don't end up making the exact same images, do they?  Their spectral response is different at a sensor level, and they can try to re-map that but it will never be the same.


Arri have worked well with LUTs and this has really helped their strong foothold IMO


No.  ProRes is why they work well.  And look at the current trend in advertising.  They all look like they're not even grading anymore, they just put the LOG files to air....:-)

Logged

Morgan_Moore

  • Sr. Member
  • ****
  • Offline
  • Posts: 2356
    • sammorganmoore.com
Re: Does the emperor have clothes?
« Reply #22 on: March 30, 2016, 03:11:03 am »

OK you make some good points about basically the whole thing being a bit 'wooly' (I learned something :) )

BTW I'm not a complete numbers man; I'm happy to go on feel.

Check this by Charles Papert..



Here, if you pretend you were a client who sold blue jumpers (English word? top clothing made of blue wool), you can see that the rendition of the Arri and the Panny Varicam is completely different. As a client you would freak out. Is the jumper electric blue (Panny) or subtle blue (Arri)???

My understanding is the 'neutral' LUT/grade on the Panny was made by a colourist, probably a pro. Now, when I loaded the Panny LUT from the website I got a much closer match to the Arri, to the numbers from the test chart, and to the 'feel test'.

Without the Panny LUT I would not feel confident in using the Panny on a commercial shoot.

On not being a numbers man, I can 'smell' an image that 'feels right'; whatever my monitor, movie trailers tend to look good and iPhone videos bad, so a slight monitor shift is not the issue. In that case the Arri feels right, the jumper, her hair, whatever.

When I load a 5D still or a Nikon still into Capture One it 'feels right'.

When I load an FS7 S-Log clip into Resolve and add an S-curve and some sat it feels very wrong; when I load the factory LUT it starts to feel 80% right. 80% is OK but not really good enough; the Arri seems to feel 100% right just with the addition of the LUT.

As for 'sRGB has no place in video'.. well I guess we are hitting 50-80% of all media being watched on computers and 'devices', with an especially high % for non-drama or show-style television.

If this makes you unhappy.. then ACES is the solution: you can move to the small space at the last step :)

Back to my overall point - I feel a good factory LUT is a really important part of the camera/package.


Or it certainly is with Sony..

Here is the Sony S-Log3 FS7: untreated, treated with a saturation boost and S-curve, and treated with a factory LUT. To me, getting a good image without the LUT would be a significant challenge..

https://www.dropbox.com/s/kdomto5r9lv84jj/tracksuit.jpg?dl=0

S



« Last Edit: March 30, 2016, 03:20:21 am by Morgan_Moore »
Logged
Sam Morgan Moore Bristol UK

Tim Lookingbill

  • Sr. Member
  • ****
  • Offline
  • Posts: 2436
Re: Does the emperor have clothes?
« Reply #23 on: March 30, 2016, 02:48:49 pm »

I'll speak about BM because I know how they work as I talk to the engineers.  They made what they call a 709-ish LUT.  They say ISH, because believe me, you'd hate what it would look like as a straight REC709 transform, mostly because it's an OLD standard and based on TELEVISION and not computer screens.

So in keeping the spirit of 709 but making it "nicer" they made their own version.

The targets are many.  The input variables are many.  There's honestly no "right" way, because the sensor captures WAY more than the target, and then the variables of lighting, optics, WB etc mean that the best they can make is a rough fit.  And that's what it is.  You're expecting something mathematically precise, and I think I'm saying that's the same thing, but you're assuming it's a lot more precise than it actually is.

How does this not bring into question the ability of the BM engineers to map color from a supposedly characterized/measured digital sensor's spectral response to a mathematically described REC 709 preview target? Or else how do you explain Morgan's ARRI Amira's spot-on color rendering versus the Panny?

I'm assuming that Amira shot was not edited to look correct, yet I am viewing it on a display whose color is measured as sRGB-ish in a color-managed browser. According to your BM explanation it should be way off.

My ten years or so of observation-based analysis of digital color technology tells me that a device's color rendering of what the human eye sees is either WAY off from standards (due to less precise measuring and thus mapping to the preview) or close enough, as in the ARRI rendering, which doesn't make this an accident due to sensor differences.

You haven't given any evidence or explanation that it's not BM's fault. I have a $50 Epson printer from Walmart that can nail sRGB color pretty consistently across a wide range of still images. We're talking liquid inkjet ink spraying onto a substrate and they seem to get it right, but engineers at a major camera manufacturer have issues with mapping measured spectral sensor responses to a known standard? You'll have to forgive me but I'm having a hard time understanding BM's explanation.
Logged

Morgan_Moore

  • Sr. Member
  • ****
  • Offline
  • Posts: 2356
    • sammorganmoore.com
Re: Does the emperor have clothes?
« Reply #24 on: March 31, 2016, 04:23:42 am »

Tim - BMC are not unique in 'interpreting' 709 (as the Sony S-Log3 LC709 LUT seen in your article above shows!).

With a proper contrast ratio, the 709 gamma supports about 7 stops of dynamic range.

Now consider designing a transform (LUT) from a camera that can record 11 stops to that small space.

The correct transform would be to simply clip off stops 1 and 2 and clip off stops 10 and 11 (let us call that a STD, standard, LUT).

Now what if you added a 'foot' and a 'knee' to your transform maths.. rather than just clipping stops 10 and 11 you could slowly roll off the highlights and make full use of the data (but deviate from the exact response laid out in the tech specs).

Now what if you did not like 'the video look': not only could you roll the highlights a little but desaturate them too. You could call this a Filmic 709 LUT (F709).

Put real world shots in front of an audience with both STD and F709 LUTs applied and the audience will pick F709 every time, or at least the non X Factor audience will.

So most of these cinema cameras (including Arri) have LUTs that are not STD but F709 of some nature.
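(To make the STD vs F709 distinction concrete, here is a toy 1D sketch in Python. The stop ranges and knee maths are illustrative only; a real F709-style LUT is a full 3D table and would also handle the colour/desaturation side.)

[code]
import numpy as np

# Work in stops relative to mid grey: the camera records roughly -6..+5,
# while the 709-ish display range below is treated as -3..+4 (illustrative numbers).
DISPLAY_LO, DISPLAY_HI = -3.0, 4.0
SPAN = DISPLAY_HI - DISPLAY_LO

def std_lut(stops):
    """'STD' transform: stops outside the display range are simply clipped off."""
    return (np.clip(stops, DISPLAY_LO, DISPLAY_HI) - DISPLAY_LO) / SPAN   # 0..1 out

def f709_lut(stops, knee_at=2.0):
    """'F709' transform: identical through the mids, but rolls the top stops off
    instead of clipping (a real filmic LUT would also desaturate up there)."""
    base = (np.clip(stops, DISPLAY_LO, knee_at) - DISPLAY_LO) / SPAN
    over = np.maximum(np.asarray(stops, dtype=float) - knee_at, 0.0)
    headroom = (DISPLAY_HI - knee_at) / SPAN
    return base + headroom * (over / (1.0 + over))   # asymptotic roll-off, never clips

# A highlight 5 stops over mid grey: clipped to pure white by STD, retained by F709
print(std_lut(5.0), f709_lut(5.0))
[/code]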

As it happens, Sony issue two LUTs for S-Log3, '709' and 'LC709'; they do both and make clear in their paperwork the function of each - the LC in the latter is Low Contrast. I'm not saying they do it well (far from it) but at least they do it with some clarity.

Do I approve of F709 LUTs? I'm not sure. Simply because they hold highlights better, you can usually get a 'pretty' result more quickly than taking STD and 'burning' your highlights back with a complex 'secondary' process. In stills we 'burn' skies every day, but doing that on a motion sequence is a lot more painful.

I guess, being a 'feel' man, I'm pretty happy with most LUTs that are accurate in the middle five or six stops contrast-wise, and have good colour rendition from the bottom to maybe 10-20% below clip.

If you learned colour work doing stills and use Photoshop or C1/LR, a 'knee' in your LUT can be a bit problematic, because the process of really working the digital motion file (if you have time) will involve both a 'burn' of the highlights and also a twist of the gamma curve at the top end to bring the contrast ratio back.

I've battled transferring my raw+Photoshop 'skills' to motion grading for the last few years, and this dealing with the knee, either built into the camera or applied as a LUT, is one of the challenges.
 
I would therefore be most happy with manufacturers that issued both a 'true' 709 LUT and an F709 LUT.

Not actually hitting perfect 709 is, therefore, not a sign of incompetence of the camera maker.. just them doing what the market wants.


« Last Edit: March 31, 2016, 04:38:49 am by Morgan_Moore »
Logged
Sam Morgan Moore Bristol UK

Tim Lookingbill

  • Sr. Member
  • ****
  • Offline
  • Posts: 2436
Re: Does the emperor have clothes?
« Reply #25 on: March 31, 2016, 03:50:42 pm »

Then we've talked at cross purposes: you refer to tonemapping contrast and definition back into a low-contrast LogC-type preview without considering the saturation/hue errors, which were what I was referring to regarding Rec 709 standards. I became very familiar with the desaturated greenish-yellow and sickly-looking gold CC chart patches in your posted Panny example while profiling my scanner and digital camera back when Adobe first offered camera profiling software.

And similarly to you, I've cut my teeth tonemapping a lot of dynamic range back into at least 1000 of my Raw stills in CS5 and LR4, so I do understand your point about the difficulties of "knee" tweaking definition back into highlights with motion video.

I recall asking years back, watching early GoPro sample footage on YouTube, why, when the camera panned across a daylight-lit landscape and approached the sun, the entire scene darkened on the fly so quickly and gracefully compared to my still camera sensor's less graceful rendering as it approached full saturation according to the blown highlights in the histogram: a 1/3 stop increase got a big highlight spike while the rest of the image didn't brighten much. I always thought it was just a way-too-extreme in-camera contrast curve causing this.

Then I found out most video that looks too good to be true online is edited in post to look that way. I was just unfamiliar with the software that enabled the graceful "knee" control in giving the impression of "tasteful" highlight clipping.

Color was never an issue with the GoPro and most online video footage. Your Panny example is quite puzzling in this regard.
Logged

John Brawley

  • Jr. Member
  • **
  • Offline
  • Posts: 77
Re: Does the emperor have clothes?
« Reply #26 on: April 01, 2016, 03:11:44 am »

My lay understanding of a lot of display standards like REC 709 is that it's merely a specification for a gamma curve and a colour GAMUT, that is....a BUCKET, that you can fill with whatever colour you like.  It doesn't have to be the right color, just as long as you don't overfill the bucket. 

sRGB apparently is based on REC 709, it being the COMPUTER version, whereas 709 is the VIDEO version.  Most video processing is also done NOT as RGB, but as YUV, or component video, and that is a SIGNIFICANT difference.  In YUV video the brightness information is encoded discretely in the Y (luma) channel, with only colour-difference signals being encoded in the U and V parts.  With RGB the brightness is encoded in each channel itself.

Maybe you guys already know, but in VIDEO anything below 16 and above 235 is considered an "illegal" value and is truncated or rounded.  You also often get transcoding errors when you convert between RGB and YUV.  Typically, you might go RGB from the sensor into Resolve, where you then output to ProRes (for example), which is a YUV codec (though there is one RGB-specified flavour).  So when you watch your YUV video on a computer screen, you're not really "correctly" viewing it for the display technology it's intended for.  You should output it to an actual television, where it will then display correctly.  This is super simple to test.  Get a high dynamic range image with a lot of colour and look at it on your desktop, laptop, device, then try it on your iPad and THEN send it to your TV via Apple TV or Chromecast, or compare Vimeo on Apple TV to a computer screen.
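(For what it's worth, a bare-bones sketch of the R'G'B' -> Y'CbCr step with BT.709 luma weights and 8-bit "legal" video range; real pipelines also deal with chroma siting, subsampling, rounding and so on. The function below is my own illustration, not any particular codec's code.)

[code]
import numpy as np

def rgb_to_ycbcr709_video(rgb):
    """rgb: gamma-encoded R'G'B' in 0.0-1.0. Returns 8-bit Y'CbCr at video levels."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Y' is a weighted sum of all three channels (green dominates, but it is luma,
    # not simply "the green channel")
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556            # colour-difference signals
    cr = (r - y) / 1.5748
    # "legal" video range: Y' occupies 16-235, chroma 16-240 centred on 128
    return np.stack([16 + 219 * y, 128 + 224 * cb, 128 + 224 * cr], axis=-1).round()

# Full-range white as a computer screen understands it lands at Y'=235, not 255:
print(rgb_to_ycbcr709_video(np.array([1.0, 1.0, 1.0])))   # [235. 128. 128.]
[/code]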

The actual DISPLAY technology / processing is different.

So when you're colour correcting, a more "professional" setup is to ACTUALLY grade on a TV, not a computer screen, because if that's how people will watch it (as broadcast) then why would you use a different display technology ?

Same for a cinema finish.  A DI will typically use a projector because again, the way a projector works as a display technology is different.

There's also a difference from print, I believe, because print is REFLECTIVE: it has to reflect the light hitting it in the right colour, whereas a display technology is direct; it has to GENERATE that colour and brightness variation OVER time.  You also get differences in display technology, contrast range, viewing angles, refresh rate etc.

Also, if I read the REC 709 "standard" correctly, it only specifies up to 10 bits.  Many video cameras now capture more DR than 10 bits can comfortably hold, so how do you squeeze that DR into the file?

Video is also dynamic.  It changes over time, and needs to do so seamlessly.  Your GoPro example honestly sounds like the auto exposure tracking the exposure over time.  GoPro only recently added manual exposure override !

ACES is meant to be a unifier of all these issues, but I doubt it will make them go away.

jb





« Last Edit: April 01, 2016, 03:16:40 am by John Brawley »
Logged

Tim Lookingbill

  • Sr. Member
  • ****
  • Offline
  • Posts: 2436
Re: Does the emperor have clothes?
« Reply #27 on: April 01, 2016, 08:46:54 am »

The actual DISPLAY technology / processing is different.

So when you're colour correcting, a more "professional" setup is to ACTUALLY grade on a TV, not a computer screen, because if that's how people will watch it (as broadcast) then why would you use a different display technology ?

jb

You've actually seen this done in a professional setup?

I would think color grading would be done on a high-end calibrated system. I just can't see how color video edited on a TV would repurpose effectively to other viewing devices. And if they had to re-edit for each device, I would think that would be a very inefficient and expensive workflow.

Rec 709 is a standard for a reason in a non-color-managed environment, which appears to be the case in motion pictures. You edit once and convert/map the results to a standard, and judging from viewing the same content at my local movie theater, broadcast on cable TV on my HDTV/Blu-ray, and in the YouTube versions viewed on my computer screen, that standard appears to be working pretty well.

I'm not seeing wacky color distortions as in the Panny example, which is a source-device issue, whereas viewing devices are an output issue. Both show different color inconsistencies that are separate from each other.

And I'm quite aware of the minor color distortions created from YUV (with analog) and YCbCr (with digital) color sampling ratios shown/explained here...

http://www.red.com/learn/red-101/video-chroma-subsampling



« Last Edit: April 01, 2016, 08:53:29 am by Tim Lookingbill »
Logged

Morgan_Moore

  • Sr. Member
  • ****
  • Offline
  • Posts: 2356
    • sammorganmoore.com
Re: Does the emperor have clothes?
« Reply #28 on: April 01, 2016, 10:29:25 am »

Tim there are two ways to deliver to differing devices..

1) Grade on each specific device separately / take a punt.. (poor plan but often done)

2) Use ACES: grade on a super-wide-gamut, high-end reference monitor, and then the graded content is downgraded to each deliverable space on export using an ODT (Output Device Transform); a rough sketch follows the link below.

http://www.oscars.org/science-technology/sci-tech-projects/aces
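(A minimal sketch of step 2, assuming OpenColorIO v2 with an ACES config installed and reachable via the OCIO environment variable; the colour-space names below come from the standard ACES configs and may differ in yours.)

[code]
import PyOpenColorIO as OCIO

# Load whichever ACES config the OCIO environment variable points at
# (assumption: a stock ACES 1.x OCIO config with the names used below).
config = OCIO.Config.CreateFromEnv()

# Grade/work in a wide scene-referred space; the Output Transform squeezes
# the result into the deliverable space only at the very last step.
processor = config.getProcessor("ACES - ACEScct", "Output - Rec.709")
cpu = processor.getDefaultCPUProcessor()

mid_grey = [0.18, 0.18, 0.18]
print(cpu.applyRGB(mid_grey))   # display-referred Rec.709 code values
[/code]

Swap the second argument for the sRGB or P3 output space named in your config to render the same grade for each deliverable.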

S

« Last Edit: April 01, 2016, 12:22:54 pm by Morgan_Moore »
Logged
Sam Morgan Moore Bristol UK

Morgan_Moore

  • Sr. Member
  • ****
  • Offline
  • Posts: 2356
    • sammorganmoore.com
Re: Does the emperor have clothes?
« Reply #29 on: April 01, 2016, 12:26:06 pm »

My lay understanding of a lot of display standards like REC 709

Did you see the Panny/Arri video? Regardless of device, the Panny just looked off, but it had not been treated with a factory LUT, unlike the Arri.

BTW most people on this board are not likely to be delivering for broadcast television but for web commercials and advertorials, so sRGB is pretty relevant here.

S
Logged
Sam Morgan Moore Bristol UK

John Brawley

  • Jr. Member
  • **
  • Offline
  • Posts: 77
Re: Does the emperor have clothes?
« Reply #30 on: April 01, 2016, 03:00:08 pm »

You've actually seen this done in a professional setup?

I would think color grading would be done on a high-end calibrated system. I just can't see how color video edited on a TV would repurpose effectively to other viewing devices. And if they had to re-edit for each device, I would think that would be a very inefficient and expensive workflow.

If you watch the making of video on the bottom left, you'll notice towards the end of the video there's an interview with a colourist.

Have a look at what's on the bench behind him.  It's a 10 year old Sony CRT HD television.

https://www.blackmagicdesign.com/products/blackmagicursamini

CRT.

This is at Company 3, one of the best and most renowned grading facilities in the world, where we were grading 4.6K RAW footage on a TV that can't even display 4K or UHD material.  These guys grade Oscar-winning films and the very highest-end commercials.

It's not just an "old" monitor, it's an old technology.

But Company 3 are holding onto those monitors because nothing touches them for colour accuracy and fidelity compared to new display technology.  Some of the newer Dolby monitors are getting close, but they still have problems with viewing angles (not great in a client-oriented, colour-critical environment).

CO3 bought all the remaining tubes from Sony when these monitors went EOL a few years ago, and they've nearly run out of replacement CRTs, so they'll have to move to something else soon, but they don't want to, because they haven't found anything they like.

I honestly prefer the newer Dolby monitors and some of the newer professional OLEDs are ok too, but colourists who look at pictures every single day and just grade full time....they all look at Televisions, and they all prefer the old technology, especially the BVM CRTs.

If I walked in to do a grade at a new facility and the colourist only had a computer screen to work from, I'd know that they were "amateurs" and not a professional facility.



Rec 709 is a standard for a reason in a non-color managed environment that appears to be the case in motion pictures.


No.  I mean yes, but often there are many masters delivered; P3 tends to be the CINEMA / projection master colour space for then creating a DCP from.

REC 709 is how the files that end up on iTunes and DVDs will be, but they are often graded in a colour space that's GREATER than that and then squashed into that container.

And for that matter, iTunes is a great example.

iTunes means you can buy content to watch on multiple devices.  Buy something you graded and know well and watch it on a laptop or a TV through Apple TV, and they all will look pretty different.


You edit once and convert/map the results to a standard, and judging from viewing the same content at my local movie theater, broadcast on cable TV on my HDTV/Blu-ray, and in the YouTube versions viewed on my computer screen, that standard appears to be working pretty well.



I don't think it does; it's terrible, actually.  My images never look as good as they do when I grade them.  Once they leave mastering it's all downhill and the wild west.




I'm not seeing wacky color distortions as in the Panny example, which is a source-device issue, whereas viewing devices are an output issue. Both show different color inconsistencies that are separate from each other.

The gamma is totally different.  It's easy to see on the same content.

And I'm quite aware of the minor color distortions created from YUV (with analog) and YCbCr (with digital) color sampling ratios shown/explained here...

http://www.red.com/learn/red-101/video-chroma-subsampling


I think you're confusing chroma subsampling with YUV vs RGB sampling.  They're different issues.  Once you're in YUV, the issue you raise is a form of compression, or a way of trying to deal with processing, WITHIN YUV.  That's still a step below the fact that you're IN YUV / component space.
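(To make the distinction concrete, a toy version of the subsampling step, 4:2:0 style, which only makes sense once you are already holding separate Y'CbCr planes; the array sizes and plain block average are illustrative.)

[code]
import numpy as np

def subsample_420(cb, cr):
    """Toy 4:2:0: keep every Y' sample, average chroma over each 2x2 block.
    This is a compression choice made *inside* Y'CbCr, separate from the
    RGB -> Y'CbCr conversion itself."""
    h, w = cb.shape
    cb2 = cb.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    cr2 = cr.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return cb2, cr2   # a quarter of the chroma samples; luma resolution untouched

cb, cr = np.random.rand(4, 4), np.random.rand(4, 4)
print(subsample_420(cb, cr)[0].shape)   # (2, 2)
[/code]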

Most people don't know, and most don't care, and think that's just how it is.

JB
Logged

Tim Lookingbill

  • Sr. Member
  • ****
  • Offline
  • Posts: 2436
Re: Does the emperor have clothes?
« Reply #31 on: April 01, 2016, 04:16:26 pm »

REC 709 is how the files that end up on iTunes and DVDs will be, but they are often graded in a colour space that's GREATER than that and then squashed into that container.

At the 3:50 mark, the finished REC 709 video of the car and rocky cliffs looks quite correct compared to the orangish, saturated colourist's previews of that scene on his computer and Sony CRT display, which all strangely match each other but don't match the actual footage. Is the color mismatch because it's not color graded, or is it the effect of the BM cam's color gamut capturing that scene of the colorist interview being greater than the display's, which basically looks like assigning an AdobeRGB profile to an sRGB-encoded image in Photoshop?

I take it there is no way, in all the professionally captured and graded motion picture footage I've viewed, that I will ever see a 000RGB black and a 255RGB white or anything close to them. On my calibrated sRGB-ish LG 1920x1080 computer screen connected to a Mac Mini, Apple DigitalColor Meter reads a maximum black Lab luminance of 20 in the folds of the man's black pants in shadow, and the sun highlight maxes out at 236RGB. So much for 15 stops of detail. I take it all video is going to have milky shadows and dull highlights until that legal broadcast standard takes a dive.

Lopping off the ends of the dynamic range (I'm assuming to meet some legal broadcast standard) is far worse than color gamut squashing. There wasn't any color in that video that wouldn't fit nicely in Rec 709 or sRGB. But I now understand why the colorist needs the Sony TV: his edit is actually reducing the dynamic range and color gamut to fit those viewing devices and the broadcast standard. I take it, then, that the digital intermediate as an archived master retains ALL the DR and gamut to take advantage of future advanced viewing device technology.

Separating color gamut from luminance dynamics is confusing enough when attempting to understand why that Panny screwed up color. And it isn't on account of a gamma curve mismatch, IMO. It needs a camera profile that applies an HSL-like (as in the Adobe Camera Raw panel) color correction table to the video stream.
Logged

Tim Lookingbill

  • Sr. Member
  • ****
  • Offline
  • Posts: 2436
Re: Does the emperor have clothes?
« Reply #32 on: April 01, 2016, 04:26:28 pm »

Tim there are two ways to deliver to differing devices..

1) Grade on each specific device separately / take a punt.. (poor plan but often done)

2) Use ACES: grade on a super-wide-gamut, high-end reference monitor, and then the graded content is downgraded to each deliverable space on export using an ODT (Output Device Transform)

http://www.oscars.org/science-technology/sci-tech-projects/aces

S

Morgan, good info. Thanks!

I was wondering if the motion picture industry was going to adopt some kind of color management system. I hope it can integrate with all imaging software vendors. I noticed the video didn't list Adobe among the camera and software vendors like Sony and Canon.
Logged

Morgan_Moore

  • Sr. Member
  • ****
  • Offline
  • Posts: 2356
    • sammorganmoore.com
Re: Does the emperor have clothes?
« Reply #33 on: April 01, 2016, 04:29:38 pm »

Lopping off the ends of the dynamic range (I'm assuming to meet some legal broadcast standard) is far worse than color gamut squashing.

When outputting a graded file from Resolve you can choose 'video levels' or 'data levels'.

Video levels put black at 16 and white at 235.

Data levels put out 0-255.

Obviously you should make the right deliverable and watch it on the right device.

'The Bridge' was of note to me.. on the web we had milky blacks, and then I saw it on the Mrs' Sony CRT.. lovely pure black.. simply, the outputted file was 'video levels' and should have been broadcast, not put onto the web for viewing on sRGB monitors. The web needs a 'data levels' file.
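(A toy version of the two choices for 8-bit values; real conversions also handle chroma's 16-240 range, clamping and dithering, but the scaling is just this.)

[code]
import numpy as np

def to_video_levels(full):    # "data levels" 0-255  ->  "video levels" 16-235
    return np.round(16 + (full / 255.0) * (235 - 16)).astype(np.uint8)

def to_data_levels(video):    # "video levels" 16-235  ->  "data levels" 0-255
    return np.round((video.astype(float) - 16) / (235 - 16) * 255).clip(0, 255).astype(np.uint8)

black = np.array([0], dtype=np.uint8)
# A video-levels file shown unscaled on an sRGB screen: black sits at 16 -> milky
print(to_video_levels(black))                              # [16]
# The fix for web delivery: expand back to full range so black really is 0
print(to_data_levels(np.array([16], dtype=np.uint8)))      # [0]
[/code]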

S
« Last Edit: April 01, 2016, 04:38:53 pm by Morgan_Moore »
Logged
Sam Morgan Moore Bristol UK

John Brawley

  • Jr. Member
  • **
  • Offline
  • Posts: 77
Re: Does the emperor have clothes?
« Reply #34 on: April 01, 2016, 09:38:58 pm »

Tim....

The scene you refer to is a faked scene. We didn't even grade in that suite and the "B roll" was shot weeks after the actual grade was done.

The footage on screen isn't representative of anything. The point was that in professional and even rudimentary post facilities, TELEVISIONS are used to grade motion images, NOT computer screens.

Morgan, the problem with outputting data-range files is that you might watch them on a computer but you might also watch them on a TV. Same problems. (Like Apple TV.)

JB.

Logged

Tim Lookingbill

  • Sr. Member
  • ****
  • Offline
  • Posts: 2436
Re: Does the emperor have clothes?
« Reply #35 on: April 01, 2016, 11:31:36 pm »

When outputting a graded file from Resolve you can choose 'video levels' or 'data levels'.

Video levels put black at 16 and white at 235.

Data levels put out 0-255.

Obviously you should make the right deliverable and watch it on the right device.

'The Bridge' was of note to me.. on the web we had milky blacks, and then I saw it on the Mrs' Sony CRT.. lovely pure black.. simply, the outputted file was 'video levels' and should have been broadcast, not put onto the web for viewing on sRGB monitors. The web needs a 'data levels' file.

S

That answers a lot of other questions I've had viewing movies on a Sony 4K projector at my local theater. "The Witch" was really hard to watch and downright irritating with far worse milky blacks. Scenes taken in overcast daylight had highlights of white garments reading mid gray. It was like watching a dark movie through sunglasses.

I started reading the wiki on DCI/DCP to see what might have caused this, when other movies had well-defined shadows and dense black points (though still lighter than the black matte surround and overscan area, which is also the case with my Samsung HDTV; still no complaints about picture quality).

I read that some movies are downloaded directly to theaters' 4K projector servers through broadband, which might have sent a video-levels signal instead of data levels. Nice to know it can be chosen in grading software such as Resolve. There's got to be some type of flip switch embedded for TV vs web/projectors in the final release form, because color management can only do so much.

Thanks for the insightful info Morgan and John.
Logged

Tim Lookingbill

  • Sr. Member
  • ****
  • Offline
  • Posts: 2436
Re: Does the emperor have clothes?
« Reply #36 on: April 02, 2016, 10:02:57 am »

John, you're right about the B-roll black point differences.

I watched the actual promo video in Firefox, which is color managed, and checked the black point levels of single frames with Apple DigitalColor Meter. I got readings down to 2 in the green channel, so that video is output at "data levels" for web viewing on computer screens, and I'm seeing a very robust dynamic range. There's quite a bit of shadow detail in low-lit trees, and I could make a slight distinction between the man's belt and black trousers when he's leaning on the glass balcony railing looking at the sun and surf.

I was thinking my own display's black levels were too light, so I moved the video browser page over to my black bezel surround and compared 000RGB to the man's trousers, and I see my black density is pretty much the same as what I've seen on Sony 4K movies compared to their overscan, which is a bit darker, and the screen surround, which is pitch black. Black display density can never be as dark as the surround. I took a shot of my display to show these black density relationship variances, which I'm assuming are the new normal in display dynamics.

I've seen 4K TVs at Best Buy that have even denser blacks, but the demo video just looks odd.
Logged

John Brawley

  • Jr. Member
  • **
  • Offline
  • Posts: 77
Re: Does the emperor have clothes?
« Reply #37 on: April 02, 2016, 03:00:47 pm »

Tim....

The footage was designed and graded to be watched on a television.

It doesn't matter if your web browser is colour managed.

You can actually download the UHD file.  But you can also download the source DNG files. Why don't you try this.

Watch the UHD file on a television.

Grade the DNGs you have on your computer screen to match what you see on your television.

Render them out.

Watch on television.

Tell me how that turns out for you.

JB.
Logged

Tim Lookingbill

  • Sr. Member
  • ****
  • Offline
  • Posts: 2436
Re: Does the emperor have clothes?
« Reply #38 on: April 02, 2016, 10:30:51 pm »

Tim....

The footage was designed and graded to be watched on a television.

It doesn't matter if your web browser is colour managed.

You can actually download the UHD file.  But you can also download the source DNG files. Why don't you try this.

Watch the UHD file on a television.

Grade the DNGs you have on your computer screen to match what you see on your television.

Render them out.

Watch on television.

Tell me how that turns out for you.

JB.

John, I don't understand why you'd want me to try that.

The video looks great on my computer screen. Or are you saying, going by the shot I posted of my display's black point, that the video is too dark or too light? Or do you just want me to see the levels change, just to compare?

Besides, my Samsung HDTV is eyeball-calibrated to render tonality similar to what I see on my computer screen, including white balance, though a bit on the blue side. I have all black level adjustments turned off. Also, I play through my HDMI connection, and the HDMI black point menu setting is now unselectable and defaults to normal. When I hooked it up to my Mac it was selectable, but it crushed the black point and produced posterized artifacts in the shadows, as if there weren't enough RGB levels available to roll smoothly out of black.

When you say the video was designed to be watched on a television, is that in reference to the levels (video vs data)?
Logged