Luminous Landscape Forum

Equipment & Techniques => Motion & Video => Topic started by: fredjeang2 on March 21, 2016, 10:25:28 am

Title: Does the emperor have clothes?
Post by: fredjeang2 on March 21, 2016, 10:25:28 am
Weird, extremely weird... it makes you think.

Same LUT on the same footage. The "oficial" BMPCC one, applied ROTB.
2 different color engines.

Left: Lightworks' interpretation of the LUT.
Right: Resolve's interpretation of the LUT.
Footage: ProRes 422, flat.

The right one clips the highlights more and retains less information, and the skin tone is not correct (I know exactly what the woman's skin really looks like because she's my best friend; left is correct, right ain't: orangish). Also check the walls of the background buildings... this image tells it all.

If Resolve's color engine is more problematic and less accurate than LW's: Houston, we have a problem!

Mmmm... seems to me that the emperor may not have such high-fashion clothes after all.
Title: Re: Does the emperor have clothes?
Post by: Christopher Sanderson on March 21, 2016, 11:13:03 am
But can you adjust the Resolve image to give the same result as the LW?

If you can, it simply means that the LUT is starting from a different base point (engine) and you therefore need to use an 'adjusted' LUT for Resolve no?
Title: Re: Does the emperor have clothes?
Post by: fredjeang2 on March 21, 2016, 11:20:31 am
But can you adjust the Resolve image to give the same result as the LW?

If you can, it simply means that the LUT is starting from a different base point (engine) and you therefore need to use an 'adjusted' LUT for Resolve no?
Yes...but that's one more step! (I'm lazy)
And then, the information lost in the highlights ain't that easy to recover. It's tricky button work...
Title: Re: Does the emperor have clothes?
Post by: John Brawley on March 21, 2016, 11:37:47 am
What is the intent of the LUT ?

Who made the LUT ?

What precision is the LUT ?

There's a zillion reasons why this could be happening. LUTs are kind of like end-stage "filters", and if you are applying a "filter" that's expecting a different set of input characteristics then you'll get unexpected results.

Is this a LUT you have generated ?


JB
Title: Re: Does the emperor have clothes?
Post by: fredjeang2 on March 21, 2016, 12:01:42 pm
What is the intent of the LUT ?

Who made the LUT ?

What precision is the LUT ?

There's a zillion reasons why this could be happening.  Lut's are kind of like end stage "filters" and if you are applying a "filter" that's expecting a different set of input characteristics then you'll get unexpected results.

Is this a LUT you have generated ?


JB
No, I didn't create the LUT. It is the BM LUT that comes by default for flat, equivalent to the Arri LogC-to-Rec709. There is just one (that I know of) where BM is concerned; it's always the same one. The thing is that this LUT gives me a relatively accurate base in LW, while Resolve's interpretation is not that accurate as a starting point and needs more work.
So the LUT itself seems to be OK. It is the engine's implementation (apparently) that I'm questioning. Maybe there are more obscure reasons for this.

That I know (I don't know all), BM does not have an oficial generator such as Arri does, and is not Cineon either like Red is (redlogfilm). So there is nothing "oficial".
Title: Re: Does the emperor have clothes?
Post by: fredjeang2 on March 21, 2016, 12:17:59 pm
For godness sake: once again a super basic english mistake. Official is with 2 f. I'm getting crazy with my english. Seems like the more I write, the worse.

I need a sexy english teacher in high-heels and stockings. ;D
Title: Re: Does the emperor have clothes?
Post by: Morgan_Moore on March 21, 2016, 06:53:02 pm
have you played with resolve settings.. video vs data levels also there is a checkbox 'equalise for mac' or something (which should be off AFAIK)

S
Title: Re: Does the emperor have clothes?
Post by: fredjeang2 on March 21, 2016, 08:29:31 pm
have you played with resolve settings.. video vs data levels also there is a checkbox 'equalise for mac' or something (which should be off AFAIK)

S
Thanks Morgan. I'll check this "equalize for Mac", see if it comes from that. I guess it has something to do with gamma; that could explain the clipping. I wasn't aware of this option.
Title: Re: Does the emperor have clothes?
Post by: John Brawley on March 22, 2016, 10:27:13 am
So there are several BMD LUTs really, before we even get into how they can be interpreted. Each camera in RAW, for example, has its own specific transforms in the RAW tab.  There are also differences between RAW and ProRes, plus yes, you can change between VIDEO range and DATA range for each clip as well.

I'm guessing you're in ProRes using the 709 LUT, which isn't a very good-looking transform even inside Resolve. Have you tried Hook's free LUTs?

LUTs are a kind of cheat to get you somewhere fast, and are usually specific to a certain set of circumstances.  They rarely interoperate well because of the way applications' processing pipelines work.  So you often see different LUTs that do the same job for different applications.

I presume you want a LUT that enables you to put a nice look onto your rushes from your log footage in LW ?  You might be better off creating your own (easy to do in Resolve) and that way you can season to taste.

And actually, it's a good way to TEST what's happening. Make a LUT and then try it out in LW... compare the two images.

JB
Title: Re: Does the emperor have clothes?
Post by: Rob C on March 22, 2016, 11:15:46 am
For godness sake: once again a super basic english mistake. Official is with 2 f. I'm getting crazy with my english. Seems like the more I write, the worse.

I need a sexy english teacher in high-heels and stockings. ;D


If you find her, send her over here when you've finished: have cameras waiting.

Rob C
Title: Re: Does the emperor have clothes?
Post by: fredjeang2 on March 23, 2016, 12:11:08 am

If you find her, send her over here when you've finished: have cameras waiting.

Rob C
But it's not going to be "B initials"

 ;)

PS: Too bad I didn't keep in touch with those 3 crazy models I met a few years ago. They were from your island. One was really hot and not shy (the way we like). it was fun how she started to strip in front of me while in the dressing room (she left on purpose the door half-open) doing naughty things with her tongue...
Title: Re: Does the emperor have clothes?
Post by: Morgan_Moore on March 28, 2016, 02:25:58 pm

LUT's are like a kind of cheat to get you somewhere fast,
JB

I am not in this camp at all.

All cameras are intimately known by their manufacturers; the LUT is the route from log (or whatever) to a 709 curve with full colour saturation.

I would not buy a (log) camera that did not have good LUT support from the manufacturer.

Examining what LUTs do is complex, and often way more than an S-curve and sat boost.

Life is too short for building one's own LUT.

(those of us from stills are used to Capture One or whatever supporting our camera fully via metadata.. the image just arrives looking good to be messed up by us if we want)

Back in the day, when choosing between the F3 and the BMC4K for my film, the LUT-less F3 was left for dust by the BMC, with full LUT support in Resolve.

S
Title: Re: Does the emperor have clothes?
Post by: Tim Lookingbill on March 28, 2016, 02:38:07 pm
I am not in this camp at all.

All cameras are intimately known by their manufacturers, the LUT is the route from Log (or whatever) to 709 curve full colour saturation.

I would not buy a (log) camera that did not have good LUT support from the manufacturer.

Examining what LUTs do is complex and often way more than an Scurve and Sat boost.

Life is too short for building ones own LUT.

(those of us from stills are used to Capture One or whatever supporting our camera fully via metadata.. the image just arrives looking good to be messed up by us if we want)

Back in the day when choosing between the F3 and the BMC4k for my film the Lutless F3 was left for dust by the BMC with full LUT support in Resolve.

S

I've been interested in this subject for quite a while, in an attempt to understand the function of the LUT used in digital cinema cameras. As a comparison to raw stills, I'm assuming it's similar to the default settings in ACR/LR, which basically function as a LUT. Is this so?

I'm guessing the difference with digital cine-cams over still-cams is that their raw rendering doesn't start out as a dark linear preview but as the flat/dull, middle-of-the-histogram LogC file. Or is the LogC flat rendering itself another form of LUT, applied to get that flat look?

Or do all digital sensor cameras, both still and motion, have to start out with dark linear data in their raw format?
Title: Re: Does the emperor have clothes?
Post by: John Brawley on March 28, 2016, 05:52:49 pm
I am not in this camp at all.

All cameras are intimately known by their manufacturers, the LUT is the route from Log (or whatever) to 709 curve full colour saturation.

I would not buy a (log) camera that did not have good LUT support from the manufacturer.

Examining what LUTs do is complex and often way more than an Scurve and Sat boost.

Life is too short for building ones own LUT.

(those of us from stills are used to Capture One or whatever supporting our camera fully via metadata.. the image just arrives looking good to be messed up by us if we want)

Back in the day when choosing between the F3 and the BMC4k for my film the Lutless F3 was left for dust by the BMC with full LUT support in Resolve.

S


A LUT is simply a mathematical transform, it's a table of numbers. 

It's the post production equivalent of the AUTO switch.  It makes assumptions about the input image AND the output space.  Many create these as shortcuts to get to a certain look very quickly in post where many shots can quickly overwhelm a production.  Of course, generally all the shooting is done with the ASSUMPTION that the LUT will be applied, and it's often EXPOSED and MONITORED on set using the same LUT applied to on-set monitoring.
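To make the "table of numbers" point concrete, here is a minimal sketch (not any real colour engine's code) of applying a 1D LUT with linear interpolation. The interpolation scheme is one of the places two colour engines can legitimately disagree on the same LUT file:

```python
import numpy as np

def apply_1d_lut(image, lut):
    """Apply a 1D LUT (a per-channel curve) to values in [0, 1].

    The LUT is literally a table of output numbers; inputs that fall
    between table entries must be interpolated, and the interpolation
    scheme is one place two colour engines can legitimately differ.
    """
    positions = np.linspace(0.0, 1.0, len(lut))
    return np.interp(image, positions, lut)

# A toy 5-entry contrast curve (real 1D LUTs use 1024+ entries,
# and 3D LUTs typically use 17-65 points per axis).
toy_lut = np.array([0.0, 0.15, 0.5, 0.85, 1.0])
pixels = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
print(apply_1d_lut(pixels, toy_lut))  # mid grey stays at 0.5; the ends steepen
```

Anything between the table's sample points is the engine's guess, which is one plausible source of the Lightworks-versus-Resolve mismatch that started this thread.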

You can get LUTs that work really well, but to really and truly get the most out of any image, you will generally want to at least build your own set of LUTs for a show.

For example, on many of the TV drama shows I've done, we'll discuss a look in pre, shoot some tests, and sit with a colourist to create a set of LUTs for the show.

I usually start with a DAYLIGHT WB and a TUNGSTEN WB LUT.  I often use lots of different cameras, so I'll then make sure I have a version of the same LUT for each of those WBs for EACH camera.

What this means is that in editorial the rushes from the day will generally look consistent for the edit, and are 80% of the final look of the show no matter which camera they were shot on.

Once the episode is locked off we return to the grade and do our final colouring.  For 50% of the shots the colourist will use the same show LUT that was applied and adjust post LUT to make some tweaks.  Sometimes they switch the LUT off and start again.  It just depends on what's happening.

I'm not ANTI LUT per se, but it's really important to understand what they are doing and what they're used for: they are useful for some situations like bulk processing, but for best results they're usually not, simply because each shot has different WB and exposure settings even within the same scene.

JB


Title: Re: Does the emperor have clothes?
Post by: Morgan_Moore on March 28, 2016, 06:31:30 pm

A LUT is simply a mathematical transform, ...
JB

OK, before we wander into drama 'looks', let me explain what I think to be the 'basic function' of a camera: to reproduce colour, tone and saturation with reasonable accuracy, speed and simplicity.

For example, a mid grey should have a value of 125-125-125. The skin patch on a Gretag card should have a value of 194-150-130.

(assuming you have shot under high quality daylight)
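As a sketch of that "hit the targets" idea, here is how one might sanity-check a graded frame. The 8-bit target values are the ones quoted above (Morgan's numbers, not an official chart spec), and the sampled values are hypothetical:

```python
# Sanity-checking a graded frame against patch targets.
# The 8-bit target values below are the ones quoted in the post,
# not an official chart specification.
TARGETS = {
    "mid_grey": (125, 125, 125),
    "skin":     (194, 150, 130),
}

def patch_error(measured, target):
    """Worst per-channel deviation between a sampled patch and its target."""
    return max(abs(m - t) for m, t in zip(measured, target))

sampled_grey = (128, 126, 123)  # hypothetical values sampled from a graded frame
print(patch_error(sampled_grey, TARGETS["mid_grey"]))  # 3
```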

Achieving this from log can be a huge mission if you grade an image 'manually'.

Ergo, getting a log camera without a LUT to achieve its 'basic function' can become a 'huge mission': a fail in 'simplicity', which is a fail at being a camera.

Stills cameras/raw software have had this 'basic function' sorted for years. It wasn't sorted at all in the early years, as many digi-back owners on this board (including me) will testify. It is, however, sorted now.

I believe that a log camera maker that does not issue a LUT moving from however they record the image to colour and tone with reasonable accuracy is only selling half a camera: a camera lacking simple, basic functionality.

The Sony F3 was a prime example of a camera that was just not fit for purpose in any way when used in unsupported (LUT-wise from Sony) Slog1. You could spend a week tweaking the curves and not produce colour and tone with reasonable accuracy. Yes, you could make something cool (groovy), something The Bridge, but you could not film a simple violet, vermillion or puce jumper for a clothes company... not in a million years.

Fred's problem of the LUT-ed image looking different between software packages is also, IMO, completely unacceptable, assuming his setup is correct and the NLE and the camera maker support each other.

Once 'basic function' is achieved, of course, one can, and should, be able to then add 'artistic looks' in camera in a non-destructive manner, but hold the artistic choice in metadata and pass it to your post team to use or abandon. Again, raw stills cameras have done this for years with looks like 'Sepia' or 'Vibrant Party'... (which of course no 'pro' ever touches... I don't know if you can load your own too... probably)

While you are shooting admirable look-based drama, I'm trying to match a drinks label to a Pantone colour that we got off the client... and without a fully functional log->709 LUT that is a huge mission. Try some :) (https://www.danmurphys.com.au/product/DM_795216/thatchers-gold-english-cider-500ml.jsp;jsessionid=8F2615618DE600C496A2C14BB2CCBF21.ncdlmorasp1306?bmUID=leA4yKO) ... the picture was shot in my studio; when you get the product you will find the label is indeed yellow.

I need that mathematical transform.

S
Title: Re: Does the emperor have clothes?
Post by: Morgan_Moore on March 28, 2016, 06:43:26 pm
Been interested in this subject for quite awhile in a attempt to understand the function of the LUT used in digital cinema cameras. As a comparison to Raw stills I

LUTs are (in my view) the toddler version of what happens under the hood in your raw software: the transform between the captured data and the image that is viewable with proper tone and contrast (or an artistic interpretation).

They are particularly toddler in that they don't stick to the metadata, so they get completely lost when your footage hits your NLE. This is being improved upon fast though.

Log is captured differently from raw. It is captured by cramming 12 stops of DR into the visible signal range, but that should not really matter... as your workflow ideally would not show you this low-contrast image any more than you see whatever a raw capture looks like.
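That "cramming 12 stops into the visible signal range" can be sketched numerically. This is a toy curve for illustration only, not any vendor's actual formula:

```python
import math

def toy_log_encode(linear, stops=12):
    """Toy log curve: spread `stops` stops of scene-linear signal evenly
    across the 0-1 code range. Real curves (LogC, S-Log3, BMD Film) add
    a linear toe near black and vendor-tuned constants; this shows only
    the shape of the idea."""
    floor = 2.0 ** -stops              # darkest value the curve represents
    x = max(linear, floor)
    return (math.log2(x) + stops) / stops

# Each extra stop of light becomes one equal step up the code range,
# which is why ungraded log footage looks flat and grey:
for ev in (-6, -3, 0):                 # exposure in stops below clip
    print(ev, round(toy_log_encode(2.0 ** ev), 3))
```

Undoing this curve (plus adding contrast and saturation) is essentially what a log-to-709 LUT has baked into its table.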

Viewing log can be useful, however, as you can see the clip points of the file you are recording (unlike raw stills, where I just 'know' I've got a bit more headroom).

So yep, IMO a LUT equates pretty well to the default-look business your raw software performs unless you tell it otherwise.

S

Title: Re: Does the emperor have clothes?
Post by: Christopher Sanderson on March 28, 2016, 10:48:26 pm
As I see it, a LUT is the motion equivalent of a profile. A look-up table that 'translates' one set of numbers into another to compensate for transitional changes in colour & tone due to process.

Take a 'standard' X: a set of RGB values representing a camera sensor's representation of 'reality'.

For the purposes of practicality (bandwidth), the camera's processor may have squeezed X into a diminished space Y using, amongst other things, a log curve, giving a very different recorded look of low saturation & contrast.

The LUT attempts to translate that recorded Y back to the X that the camera originally 'saw' before in-camera processing.

Lots of steps and additional factors missing here  :o
Title: Re: Does the emperor have clothes?
Post by: John Brawley on March 28, 2016, 11:53:05 pm


I need that mathematical transform.

S


You're confusing a few concepts I think.

Camera manufacturers make LUTs that THEY think are somewhat accurate representations of a given target colour space, say REC709, and these are often created from theoretical models of sensor response, not photometric testing.  REC 709, for example, has a kind of limited amount of DR and colour gamut, even though that's the standard we all use for mastering to TV.

But it's still in the end, subjective, and it's TOTALLY dependent on the variables of exposure and white point when you're shooting, not to mention what colours are being reflected and what colour is in your lighting source.

You're relying on the constant of a LUT in the camera showing you the same as what you see on set, but a sensor isn't capturing what you see with a LUT; that's just a "profile", or version, of what the camera captures.

By the way, the true power of REDCODE is that it DOES take the LUT along as a little side file, and remembers the changes you make to it and re-interprets them as you change variables like ISO.

A LUT is really useful, but it's a shortcut.

A seasoned, knowledgeable colourist in Resolve will always get the most from a file by not using a LUT, or, if they do use one, the LUT does a lot of the heavy lifting they'd otherwise do on that particular file / exposure / setup type.

And creating a LUT is actually really easy.

JB

Title: Re: Does the emperor have clothes?
Post by: Morgan_Moore on March 29, 2016, 04:40:18 am
You're confusing a few concepts I think.

Maybe. Help me unravel them

Camera manufacturer's make LUT's that THEY think are somewhat accurate representations of a given target colour space, say REC709,

A target space is defined and hittable; Gretag cards are defined, 709 is defined. I'm sure people in camera development labs have much better tools than a simple card, which is one reason that they, not the end user, should do the work of developing the transform to a target.

An example of a space being defined can be found here. https://en.wikipedia.org/wiki/SRGB

Even 'ISO' is defined... funnily enough, by the International Organization for Standardization, colloquially known as ISO.

and these are often created from theoretical models of sensor response, not photometric testing. 

One would hope any maker of anything would test in the real world before launching to public sale.


REC 709 for example has a kind of limited amount of DR and colour gamut, even though that's the standard we all use for mastering to TV.

As does sRGB... but in raw stills we know that we have the data shown on screen, and more available via the exposure/temp/tint sliders and other tools; the small space is not a problem in developing a transform to that space.

Of course manufacturers should not only be across 709; they should be issuing for sRGB, Rec. 2020 and good ACES IDTs for starters.

But it's still in the end, subjective, and it's TOTALLY dependent on the variables of exposure and white point when you're shooting, not to mention what colours are being reflected and what colour is in your lighting source.

My quote above mentioned the controlled target and the controlled light source. It did not, however, mention exposure point. Of course this is a valid variable in the pie. Sony, for one, publish targets such as 'grey at 38%' (or whatever).

You're relying on the constant of a LUT in the camera showing you the same as what you see on set, but a sensor isn't capturing what you see with a LUT, that's just a "profile" or version of what the camera captures.

Indeed it is just a version... a very useful one... the basic thing a camera should do: provide an image that hits some known targets. I'm fully aware that in post we can use the extra data in the image to work it to fit our creative desires.

By the way the true power of REDCODE is that it DOES take the LUT along as a little side file, and remembers that you make changes to it and re-interprets it as you change variables like ISO.

Of course Red has always handled metadata well, as do raw stills (since about 2005, but not before); this is, IMO, crucial to simplifying post, especially when work is handed off.

A LUT is really useful but they are a shortcut.

Indeed. I'm in this business to work fast (as is the definition of being in business?) and the first thing I need is a shortcut to make my camera hit some published targets.

A bit of history for you. Sinar basically made a fantastic digiback with the Dalsa chip, but the post flow was terrible, as the file needed deep tinkering to pull into a basic shape. Phase One took the same chip and developed Capture One, which made the post workflow solid. Sinar, a very old company, is now completely dominated by newcomer Phase One. Phase One delivered shortcuts that paid dividends to the business practices of the end user.

Arri have worked well with LUTs, and this has really helped their strong foothold IMO.

A seasoned knowledgeable colourist in Resolve will always get the most from a file by not using a LUT, or if they do, the LUT does a lot of the heavy lifting the already do on that particular file / exposure / setup type.

I'm not sure that they will hit published targets cost-effectively compared to using a (proper) factory LUT; indeed, the heavy lifting is exactly this, and the colourist's job should be adding the artistic sparkle on top, correcting on-set errors and matching scenes for continuity, not 'mending' the camera.

The team at Resolve have tried to make a 'match to card' facility. It still does not work reliably... to expect a lone colourist to do what the team at DaVinci cannot is somewhat optimistic.

And creating a LUT is actually really easy.

The physical (or digital) act of 'save as LUT' is simple indeed. The work of taking a log file and making it hit published targets is far from easy.
Title: Re: Does the emperor have clothes?
Post by: Morgan_Moore on March 29, 2016, 05:15:43 am
BTW John, I'm not trying to internet-'bait' you; hopefully you know I love your work, I just disagree with you on this bit of techie stuff.

I have really struggled with my Sony F3 (sold it!), and while I'm far better off with the FS7, it is still a significant journey to deliver consistent, bright, accurate, repeatable commercial colour, especially if I don't have input into the post process.

So my journey with post impacts heavily on my need (or not) to spend money on changing camera systems.

S
Title: Re: Does the emperor have clothes?
Post by: Tim Lookingbill on March 29, 2016, 03:30:34 pm
Now that this thread has touched upon LUTs, DaVinci Resolve and accurate color versus interpreted color, what do you all think of this observation on the differences in color rendering between a Sony F55 and an ARRI Alexa...

http://www.dvinfo.net/article/post/making-the-sony-f55-look-filmic-with-resolve-9.html

I found it very informative in more ways than one, especially on the subject of the "filmic" look the ARRI seems to have mastered brilliantly. It also confirms the color-engine behavior I see in Adobe Camera Raw over other raw converters with regard to hue/saturation changes when applying contrast, either through curves or sliders.

The half-and-half behavior around the mid-gray point (subtractive in the highlights above mid gray, additive below) is what I observed quite often doing color separations by hand for commercial presses, where dot gain in colored shadows showed additive behavior, and less dot gain showed subtractive behavior (less saturation) in the upper mids to highlights.

The author of the linked article included a part 2 to his observation with this piece...

http://www.dvinfo.net/article/production/camgear/what-alexa-and-watercolors-have-in-common.html#prettyPhoto
Title: Re: Does the emperor have clothes?
Post by: John Brawley on March 29, 2016, 10:12:25 pm

A target space is defined and hittable, gretag cards are defined, 709 is defined.. im sure people in camera development labs have much better tools than a simple card.. one reason that they should do the work of developing the transform to a target, not the end user.


The problem is that REC 709 is pretty limited these days and REC 2020 isn't really defined.

And you guys all probably work in sRGB, which doesn't have a place in video.  You're all looking at computer screens but working to a broadcast standard designed for television screens (and gamuts), and for starters those spaces are all radically different.

I'll speak about BM because I know how they work, as I talk to the engineers.  They made what they call a 709-ish LUT.  They say ISH because, believe me, you'd hate what it would look like as a straight REC709 transform, mostly because it's an OLD standard based on TELEVISION and not computer screens.

So in keeping the spirit of 709 but making it "nicer" they made their own version.

The targets are many.  The input variables are many.  There's honestly no "right" way, because the sensor captures WAY more than the target, and then the variables of lighting, optics, WB etc mean that the best they can make is a rough fit.  And that's what it is.  You're expecting something mathematically precise, and I think I'm saying that's the same thing, but you're assuming it's a lot more precise than it actually is.

You've used the colour checker / card match function in Resolve, I assume?  While I like a chip chart as much as anyone, you do know that there just aren't enough data points on there to actually, really and truly do what you are thinking is the right thing to do... some only have 6 patches!
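The "not enough data points" objection can be illustrated with a toy fit. A sketch under stated assumptions: a 3x3 colour matrix fitted by least squares from made-up patch values (not Resolve's actual card-match algorithm), where six noiseless patches are enough to solve the equations but far too few to characterise a sensor across a whole gamut:

```python
import numpy as np

# Least-squares fit of a 3x3 colour matrix from chart patches.
# Six patches give 18 equations for 9 unknowns -- solvable, but far
# too few samples to characterise a sensor across a whole gamut.
# All values below are made up for illustration.
camera_rgb = np.array([   # hypothetical patch readings from the camera
    [0.20, 0.10, 0.10],
    [0.50, 0.40, 0.30],
    [0.10, 0.10, 0.30],
    [0.80, 0.70, 0.60],
    [0.30, 0.50, 0.20],
    [0.60, 0.60, 0.60],
])
true_matrix = np.array([  # pretend "ground truth" transform
    [1.20, -0.10, 0.00],
    [-0.10, 1.10, -0.05],
    [0.00, -0.20, 1.30],
])
target_rgb = camera_rgb @ true_matrix   # the chart's known patch values

fitted, *_ = np.linalg.lstsq(camera_rgb, target_rgb, rcond=None)
print(np.allclose(fitted, true_matrix))  # True: noiseless data, so the fit is exact
```

With real, noisy patch readings the fit is only as good as the handful of colours sampled, which is JB's point about small charts.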

I just recently learned that Olympus uses an even older standard, REC 601, for their INTERNAL recording gamut in video, but then uses REC 709 for their HDMI output!  Crazy.  They both come from the same source yet they look different on the same image.  See where this is going?  They use their own internal LUT to get to each defined space, but the results look different...



and these are often created from theoretical models of sensor response, not photometric testing. 

One would hope any maker of anything would test in the real world before launching to public sale.


You're not a manufacturer....:-)

They ALL do it, because there is a lot of variation in sensor response, sensor to sensor, in each camera.  Most manufacturers have a QA process where they try to calibrate each sensor and then add secret "corrections" to the file that are specific to each sensor.

When you're trying to do calculations based on an idealised outcome, how do you choose which of your sensors is the GOLD standard when no two are the same?



REC 709 for example has a kind of limited amount of DR and colour gamut, even though that's the standard we all use for mastering to TV.

As does SRGb.. but in raw stills we know that we have the data shown on screen and more available via the exposure/temp/tint sliders and other tools, the small space is not a problem in developing a transform to that space.

Yes, but if you shoot on different cameras in the sRGB profile they don't end up making the exact same images, do they?  Their spectral response is different at the sensor level, and they can try to re-map that, but it will never be the same.


Arri have worked well with LUTs and this has really helped their strong foothold IMO


No.  ProRes is why they work well.  And look at the current trend in advertising: they all look like they're not even grading anymore, they just put the log files to air... :-)

Title: Re: Does the emperor have clothes?
Post by: Morgan_Moore on March 30, 2016, 03:11:03 am
OK, you make some good points about the whole thing basically being a bit 'woolly' (I learned something :) )

BTW I'm not a complete numbers man; I'm happy to go on feel.

Check this by Charles Papert..

https://vimeo.com/159843345

Here, if you pretend you were a client who sold blue jumpers (English word? top clothing made of blue wool), you can see that the rendition of the Arri and the Panny Varicam is completely different. As a client you would freak out. Is the jumper electric blue (Panny) or subtle blue (Arri)???

My understanding is the 'neutral' LUT/grade on the Panny was made by a colourist, probably a pro. Now, when I loaded the Panny LUT from the website, I got a much closer match to the Arri, to the numbers from the test chart, and to the 'feel test'.

Without the Panny LUT I would not feel confident in using the Panny on a commercial shoot.

On not being a numbers man: I can 'smell' an image that 'feels right'. Whatever my monitor, movie trailers tend to look good and iPhone videos bad, so a slight monitor shift is not the issue. In that case the Arri feels right: the jumper, her hair, whatever.

When I load a 5d still or a nikon still into Capture One it 'feels right'.

When I load an FS7 Slog clip into Resolve and add an S-curve and some sat, it feels very wrong; when I load the factory LUT, it starts to feel 80% right. 80% is OK but not really good enough; the Arri seems to feel 100% right just with the addition of the LUT.

As for 'sRGB has no place in video'... well, I guess we are hitting 50-80% of all media being watched on computers and 'devices', with an especially high percentage for non-drama or non-show-style television.

If this makes you unhappy... then ACES is the solution: you can move to the small space at the last step :)

Back to my overall point: I feel a good factory LUT is a really important part of the camera/package.


Or it certainly is with Sony..

Here is the Sony Slog3 FS7: untreated, treated with a saturation boost and S-curve, and treated with a factory LUT. To me, getting a good image without the LUT would be a significant challenge...

https://www.dropbox.com/s/kdomto5r9lv84jj/tracksuit.jpg?dl=0

S
Title: Re: Does the emperor have clothes?
Post by: Tim Lookingbill on March 30, 2016, 02:48:49 pm
I'll speak about BM because I know how they work as I talk to the engineers.  They made what they call a 709-ish LUT.  They say ISH, because believe me, you'd hate what it would look like as a straight REC709 transform, mostly because it's an OLD standard and based on TELEVISION and not computer screens.

So in keeping the spirit of 709 but making it "nicer" they made their own version.

The targets are many.  The input variables are many.  There's honestly no "right" way because the sensor captures WAY more than the target and then the variables of lighting optics, wb etc mean that the best they can make is a rough fit.  And that's what it is.  You're expecting something matchematically precise. and I think I'm saying thats the same thing, but you're assuming it's a lot more precise than it actually is.

How does this not bring into question the ability of the BM engineers to map color from a supposedly characterized/measured digital sensor's spectral response to the mathematically described color target of REC 709? Or else, how do you explain Morgan's ARRI Amira's spot-on color rendering versus the Panny's?

I'm assuming that Amira shot was not edited to look correct, because I am viewing it on a display whose color is measured as sRGB-ish in a color-managed browser. According to your BM explanation it should be way off.

My ten or so years of observation-based analysis of digital color technology is that a device's color rendering of what the human eye sees is either WAY off from standards (due to less precise measuring, and thus mapping, to the preview) or close enough, as in the ARRI rendering, which doesn't make this an accident of sensor differences.

You haven't given any evidence or explanation that it's not BM's fault. I have a $50 Epson printer from Walmart that can nail sRGB color pretty consistently across a wide range of still images. We're talking liquid inkjet ink spraying onto a substrate and they seem to get it right, but engineers at a major camera manufacturer have issues with mapping measured spectral sensor responses to a known standard? You'll have to forgive me, but I'm having a hard time understanding BM's explanation.
Title: Re: Does the emperor have clothes?
Post by: Morgan_Moore on March 31, 2016, 04:23:42 am
Tim - BMC are not unique in 'interpreting' 709 (as the Sony Slog3-LC709 LUT seen in your article above shows!).

With proper contrast ratios, the 709 gamma supports about 7 stops of dynamic range.

Now consider designing a transform (LUT) from a camera that can record 11 stops into that small space.

The correct transform would be to simply clip off stops 1 and 2 and clip off stops 10 and 11 (let us call that a STD, or standard, LUT).

Now what if you added a 'foot' and 'knee' to your transform maths: rather than just clipping stops 10 and 11, you could slowly roll off the highlights and make full use of the data (but deviate from the exact response laid out in the tech specs).

Now what if you did not like 'the video look'? Not only could you roll the highlights a little, but desaturate them too. You could call this a Filmic 709 LUT (F709).
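The STD-vs-knee distinction above can be sketched numerically. The roll-off function and the 0.8 knee start below are illustrative only, not any manufacturer's actual curve:

```python
def std_clip(x):
    """'STD' behaviour: hard clip everything above 1.0."""
    return min(x, 1.0)


def knee(x, start=0.8):
    """Illustrative soft knee: linear below `start`, then an
    asymptotic roll-off that approaches 1.0 but never clips."""
    if x <= start:
        return x
    span = 1.0 - start
    return start + span * (1.0 - 1.0 / (1.0 + (x - start) / span))


# A value one stop over the clip point: STD throws it away entirely,
# while the knee keeps some separation below pure white.
print(std_clip(2.0), knee(2.0))
```

The key property is that with a knee, two highlight values that the STD transform would both render as 100% white stay distinguishable, which is why audiences read the F709 image as "nicer".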

Put real-world shots in front of an audience with both STD and F709 LUTs applied and the audience will pick F709 every time, or at least the non-X-Factor audience will.

So most of these cinema cameras (including Arri) have LUTs that are not STD but F709 of some nature.

As it happens, Sony issue two LUTs for Slog3, '709' and 'LC709', and make clear in their paperwork the function of each; the latter, LC, is Low Contrast. I'm not saying they do it well (far from it), but at least they do it with some clarity.

Do I approve of F709 LUTs? I'm not sure. Because they hold highlights better, you can usually get a 'pretty' result more quickly than taking STD and 'burning' your highlights back with a complex 'secondary' process. In stills we 'burn' skies every day, but doing that on a motion sequence is a lot more painful.

I guess, being a 'feel' man, I'm pretty happy with most LUTs that are accurate contrast-wise in the middle five or six stops, and have good colour rendition from the bottom to maybe 10-20% below clip.

If you learned colour work doing stills and use Photoshop or C1/LR, a 'knee' in your LUT can be a bit problematic, because the process of really working the digital motion file (if you have time) will involve both a 'burn' of the highlights and also a twist of the gamma curve at the top end to bring the contrast ratio back.

I've battled to transfer my raw+Photoshop 'skills' to motion grading for the last few years; dealing with this knee, either built into the camera or applied as a LUT, is one of the challenges.
 
I would therefore be most happy with manufacturers that issued both a 'true' 709 LUT and an F709 LUT.

Not actually hitting perfect 709 is, therefore, not a sign of incompetence on the camera maker's part; it's just them doing what the market wants.


Title: Re: Does the emperor have clothes?
Post by: Tim Lookingbill on March 31, 2016, 03:50:42 pm
Then we've talked at cross purposes: you refer to tonemapping contrast and definition back into a low-contrast LogC-type preview, without considering the saturation/hue errors that were what I was referring to regarding Rec 709 standards. I became very familiar with the desaturated greenish-yellow and sickly-looking gold CC chart patches in your posted Panny example while profiling my scanner and digital camera back when Adobe first offered camera profiling software.

And similarly to you, I've also cut my teeth tonemapping a lot of dynamic range back into at least 1,000 of my Raw stills in CS5 and LR4, so I do understand your point about the difficulty of "knee"-tweaking definition back into highlights with motion video.

I recall asking years back, watching early GoPro sample footage on YouTube, why, when the camera panned across a daylight-lit landscape toward the sun, the entire scene darkened on the fly so quickly and gracefully compared to my still camera sensor's less graceful rendering approaching full saturation (going by the blown highlights in the histogram): a 1/3-stop increase got a big highlight spike while the rest of the image didn't brighten much. I always thought it was just a way-too-extreme in-camera contrast curve causing this.

Then I found out most video that looks too good to be true online is edited in post to look that way. I was just unfamiliar with the software that enabled the graceful "knee" control in giving the impression of "tasteful" highlight clipping.

Color was never an issue with the GoPro and most online video footage. Your Panny example is quite puzzling in this regard.
Title: Re: Does the emperor have clothes?
Post by: John Brawley on April 01, 2016, 03:11:44 am
My lay understanding of a lot of display standards like REC 709 is that it's merely a specification for a gamma curve and a colour GAMUT, that is....a BUCKET, that you can fill with whatever colour you like.  It doesn't have to be the right color, just as long as you don't overfill the bucket. 

sRGB apparently is based on REC 709, it being the COMPUTER version, whereas 709 is the VIDEO version.  Most video processing is also done NOT as RGB but as YUV, or component video, and that is a SIGNIFICANT difference.  In YUV video, brightness information is encoded discretely in the Y (luma) channel, with only colour-difference signals being encoded in the U and V parts.  With RGB the brightness is encoded in each channel itself.

Maybe you guys already know, but in VIDEO anything below 16 and above 235 is considered an "illegal" value and is truncated or rounded.  You also often get transcoding errors when you convert between RGB and YUV.  Typically, you might be RGB from the sensor into Resolve, where you then output to ProRes (for example), which is a YUV codec (though there is one RGB-specified flavour).  So when you watch your YUV video on a computer screen, you're not really "correctly" viewing it for the display technology it's intended for.  You should output it to an actual television, where it will then display correctly.  This is super simple to test: get a high-dynamic-range image with a lot of colour and look at it on your desktop, laptop, or device, then try your iPad, and THEN send it to your TV via Apple TV or Chromecast, or compare Vimeo on Apple TV to a computer screen.
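For the curious, the R'G'B' to Y'CbCr step looks roughly like this. This is a simplified sketch using the BT.709 luma coefficients and the 8-bit "legal" range; it ignores chroma subsampling, dithering, and out-of-range clamping:

```python
def rgb_to_ycbcr709(r, g, b):
    """Gamma-encoded R'G'B' in 0..1 -> 8-bit 'legal range' Y'CbCr
    per BT.709.  Luma lands in 16..235, chroma in 16..240
    (128 meaning 'no colour difference')."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # BT.709 luma weights
    cb = (b - y) / 1.8556                      # 1.8556 = 2 * (1 - 0.0722)
    cr = (r - y) / 1.5748                      # 1.5748 = 2 * (1 - 0.2126)
    return (round(16 + 219 * y),
            round(128 + 224 * cb),
            round(128 + 224 * cr))


# Pure white lands at 235, not 255 -- anything above 235 is "illegal":
print(rgb_to_ycbcr709(1.0, 1.0, 1.0))   # (235, 128, 128)
```

Each round trip through this quantized representation rounds the values again, which is one source of the transcoding errors mentioned above.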

The actual DISPLAY technology / processing is different.

So when you're colour correcting, a more "professional" setup is to ACTUALLY grade on a TV, not a computer screen, because if that's how people will watch it (as broadcast) then why would you use a different display technology?

Same for a cinema finish.  A DI will typically use a projector because again, the way a projector works as a display technology is different.

There's also a difference from print, I believe, because print is REFLECTIVE: it has to reflect the light hitting it in the right colour, whereas a display technology is direct; it has to GENERATE that colour and brightness variation OVER time.  You also get differences in display technology: contrast range, viewing angles, refresh rate etc.

Also, if I read the REC 709 "standard" correctly, it only specifies for 10 bits.  Many video cameras now shoot in excess of 10 bits of DR, so how do you squeeze that DR into the file ?

Video is also dynamic.  It changes over time, and needs to do so seamlessly.  Your GoPro example honestly sounds like auto exposure tracking the exposure over time.  GoPro only recently added manual exposure override!

ACES is meant to be a unifier of all these issues, but I doubt it will make them go away.

jb





Title: Re: Does the emperor have clothes?
Post by: Tim Lookingbill on April 01, 2016, 08:46:54 am
The actual DISPLAY technology / processing is different.

So when you're colour correcting, a more "professional" setup is to ACTUALLY grade on a TV, not a computer screen, because if that's how people will watch it (as broadcast) then why would you use a different display technology?

jb

You've actually seen this done in a professional setup?

I would think color grading would be done on a high-end calibrated system. I just can't see how editing color video on a TV would repurpose effectively to other viewing devices. And if they had to re-edit for each device, I would think that would be a very inefficient and expensive workflow.

Rec 709 is a standard for a reason in a non-color-managed environment, which appears to be the case in motion pictures. You edit once and convert/map the results to a standard that, judging from viewing the same content at my local movie theater, broadcast on cable TV on my HDTV/Blu-Ray, and in the YouTube versions viewed on my computer screen, appears to be working pretty well.

I'm not seeing wacky color distortions as in the Panny example, which is a source-device issue, whereas viewing devices are an output issue. Both show different color change inconsistencies that are separate from each other.

And I'm quite aware of the minor color distortions created from YUV (with analog) and YCbCr (with digital) color sampling ratios shown/explained here...

http://www.red.com/learn/red-101/video-chroma-subsampling



Title: Re: Does the emperor have clothes?
Post by: Morgan_Moore on April 01, 2016, 10:29:25 am
Tim there are two ways to deliver to differing devices..

1) Grade on each specific device / take a punt (a poor plan, but often done)

2) Use ACES: grade on a super-wide-gamut, high-end reference monitor, and the graded content is then downgraded to the deliverable's space on export using an ODT (Output Device Transform).

http://www.oscars.org/science-technology/sci-tech-projects/aces

S

Title: Re: Does the emperor have clothes?
Post by: Morgan_Moore on April 01, 2016, 12:26:06 pm
My lay understanding of a lot of display standards like REC 709

Did you see the Panny/Arri video? Regardless of device, the Panny just looked off, but it had not been treated with a factory LUT, unlike the Arri.

BTW most people on this board are not likely to be delivering for broadcast television but for web commercials and advertorials, so sRGB is pretty relevant here.

S
Title: Re: Does the emperor have clothes?
Post by: John Brawley on April 01, 2016, 03:00:08 pm
You've actually seen this done in a professional setup?

I would think color grading would be done on a high-end calibrated system. I just can't see how editing color video on a TV would repurpose effectively to other viewing devices. And if they had to re-edit for each device, I would think that would be a very inefficient and expensive workflow.

If you watch the making of video on the bottom left, you'll notice towards the end of the video there's an interview with a colourist.

Have a look at what's on the bench behind him.  It's a 10 year old Sony CRT HD television.

https://www.blackmagicdesign.com/products/blackmagicursamini

CRT.

This is at Company 3, one of the best and most renowned grading facilities in the world, where we were grading 4.6K RAW footage on a TV that can't even display 4K or UHD material.  These guys grade Oscar-winning films and the very highest-end commercials.

It's not just an "old" monitor, it's an old technology.

But Company 3 are holding onto those monitors because nothing touches them for their colour accuracy and fidelity compared to new display technology.  Some of the newer Dolby monitors are getting close, but they still have problems with viewing angles. (not great in a client oriented colour critical environment)

CO3 bought all the remaining tubes from Sony when these monitors went EOL a few years ago, and they've nearly run out of replacement CRTs, so they'll have to move to something else soon, but they don't want to, because they haven't found anything they like.

I honestly prefer the newer Dolby monitors and some of the newer professional OLEDs are ok too, but colourists who look at pictures every single day and just grade full time....they all look at Televisions, and they all prefer the old technology, especially the BVM CRTs.

If I walked in to do a grade at a new facility and the colourist only had a computer screen to work from, I'd know that they were "amateurs" and not a professional facility.



Rec 709 is a standard for a reason in a non-color-managed environment, which appears to be the case in motion pictures.


No.  I mean yes, but often there are many masters delivered; P3 tends to be the CINEMA / projection master colour space for then creating a DCP from.

REC 709 is how the files that end up on iTunes and DVDs will be, but they are often graded in a colour space that's GREATER than that and then squashed into that container.

And for that matter, iTunes is a great example.

iTunes means you can buy content to watch on multiple devices.  Buy something you graded and know well, and watch it on a laptop or a TV through Apple TV, and they will all look pretty different.


You edit once and convert/map the results to a standard that, judging from viewing the same content at my local movie theater, broadcast on cable TV on my HDTV/Blu-Ray, and in the YouTube versions viewed on my computer screen, appears to be working pretty well.



I don't think it does, it's terrible actually.  My images never look as good as they do when I grade them.  Once they leave mastering it's all downhill and the wild west.




I'm not seeing wacky color distortions as in the Panny example which is a source device issue where as viewing devices are an output issue. Both show different color change inconsistencies that are separate from each other.

The gamma is totally different.  It's easy to see on the same content.

And I'm quite aware of the minor color distortions created from YUV (with analog) and YCbCr (with digital) color sampling ratios shown/explained here...

http://www.red.com/learn/red-101/video-chroma-subsampling


I think you're confusing chroma subsampling with YUV-vs-RGB sampling.  They're different issues.  Once you're in YUV, the issue you raise is a form of compression, or a way of trying to deal with processing WITHIN YUV.  That's still a step below the fact that you're IN YUV / component space.
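To make the distinction concrete: converting to Y'CbCr changes the representation, while subsampling then throws away chroma resolution as a separate, later step. A toy 4:2:0 sketch (illustrative only, using simple 2x2 averaging):

```python
def subsample_420(chroma):
    """Toy 4:2:0: average each 2x2 block of a chroma plane into one
    sample.  The luma plane is untouched -- only colour resolution
    is reduced, which is why this is a separate issue from being
    in Y'CbCr in the first place."""
    h, w = len(chroma), len(chroma[0])
    return [[(chroma[y][x] + chroma[y][x + 1] +
              chroma[y + 1][x] + chroma[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]


cb = [[100, 104], [108, 112]]     # a 2x2 patch of one chroma plane
print(subsample_420(cb))          # [[106.0]] -- four samples become one
```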

Most people don't know, and most don't care, and think that's just how it is.

JB
Title: Re: Does the emperor have clothes?
Post by: Tim Lookingbill on April 01, 2016, 04:16:26 pm
REC 709 is how the files that end up on iTunes and DVDs will be, but they are often graded in a colour space that's GREATER than that and then squashed into that container.

At the 3:50 mark, the finished REC 709 video of the car and rocky cliffs looks quite correct compared to the orangish, saturated previews of that scene on the colorist's computer and Sony CRT display, which all strangely match each other but don't match the actual footage. Is the color mismatch because it's not color graded, or is it the effect of the BM cam's color gamut, in capturing that scene of the colorist interview, being greater than the display's? It basically looks like assigning an AdobeRGB profile to an sRGB-encoded image in Photoshop.

I take it that nowhere in all the professionally captured and graded motion picture footage I've viewed will I ever see a 0RGB black or a 255RGB white, or anything close to it. On my calibrated sRGB-ish LG 1920x1080 computer screen connected to a Mac Mini, Apple DigitalColor Meter reads a max black Lab luminance of 20 in the folds of the man's black pants in shadows, and the sun highlight maxes out at 236RGB. So much for 15 stops of detail. I take it all video is going to have milky shadows and dull highlights until that legal broadcast standard takes a dive.

Lopping off the ends of the dynamic range (I'm assuming to meet some legal broadcast standard) is far worse than color gamut squashing. There wasn't any color in that video that wouldn't fit nicely in Rec 709 or sRGB. But I now understand why the colorist needs the Sony TV: his edit is actually reducing the dynamic range and color gamut to fit those viewing devices and the broadcast standard. I take it, then, that the digital intermediate as an archived master retains ALL the DR and gamut to take advantage of future advanced viewing device technology.

Separating color gamut from luminance dynamics is confusing enough when attempting to understand why that Panny screwed up color. And it isn't on account of a gamma curve mismatch, IMO. It needs a camera profile that applies an HSL-style color correction table (as in the Adobe Camera Raw panel) to the video stream.
Title: Re: Does the emperor have clothes?
Post by: Tim Lookingbill on April 01, 2016, 04:26:28 pm
Tim there are two ways to deliver to differing devices..

1) Grade on each specific device / take a punt (a poor plan, but often done)

2) Use ACES: grade on a super-wide-gamut, high-end reference monitor, and the graded content is then downgraded to the deliverable's space on export using an ODT (Output Device Transform).

http://www.oscars.org/science-technology/sci-tech-projects/aces

S

Morgan, good info. Thanks!

I was wondering if the motion picture industry was going to adopt some kind of color management system. Hope it can integrate with all imaging software vendors. I noticed the video didn't list Adobe among the list of camera and software vendors like Sony and Canon. 
Title: Re: Does the emperor have clothes?
Post by: Morgan_Moore on April 01, 2016, 04:29:38 pm
Lopping off the ends of the dynamic range (I'm assuming to meet some legal broadcast standard) is far worse than color gamut squashing.

When outputting a graded file from resolve you can choose 'video levels' or 'data levels'

Video Levels put black at 16 and white at 235

Data Levels puts out 0-255

Obviously you should make the right deliverable and watch it on the right device.

'The Bridge' was of note to me: on the web we had milky blacks, then I saw it on the Mrs's Sony CRT and got lovely pure black. Simply put, the outputted file was 'video levels' and should have been broadcast, not put onto the web for viewing on sRGB monitors. The web needs a 'data levels' file.
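The two ranges are related by a simple linear scaling, and a file interpreted under the wrong assumption produces exactly this milky-black (or crushed-black) effect. A sketch:

```python
def data_to_video(v):
    """Scale a full-range 0..255 code value into 'legal' video
    range, where black is 16 and white is 235."""
    return round(16 + v * 219 / 255)


def video_to_data(v):
    """Expand a video-range 16..235 code value back out to the
    full 0..255 range computer displays expect."""
    return round((v - 16) * 255 / 219)


# A video-levels black (16) shown unconverted on an sRGB monitor
# reads as dark grey, not black -- the 'milky blacks' effect:
print(data_to_video(0), video_to_data(16))   # 16 0
```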

S
Title: Re: Does the emperor have clothes?
Post by: John Brawley on April 01, 2016, 09:38:58 pm
Tim....

The scene you refer to is a faked scene. We didn't even grade in that suite, and the "B roll" was shot weeks after the actual grade was done.

The footage on screen isn't representative of anything. The point was that in professional and even rudimentary post facilities, TELEVISIONs are used to grade motion images NOT computer screens.

Morgan the problem with outputting data range files is that you might watch it on a computer but you might also watch it on a TV. Same problems. (Like Apple TV)

JB.

Title: Re: Does the emperor have clothes?
Post by: Tim Lookingbill on April 01, 2016, 11:31:36 pm
When outputting a graded file from resolve you can choose 'video levels' or 'data levels'

Video Levels put black at 16 and white at 235

Data Levels puts out 0-255

Obviously you should make the right deliverable and watch it on the right device.

'The Bridge' was of note to me: on the web we had milky blacks, then I saw it on the Mrs's Sony CRT and got lovely pure black. Simply put, the outputted file was 'video levels' and should have been broadcast, not put onto the web for viewing on sRGB monitors. The web needs a 'data levels' file.

S

That answers a lot of other questions I've had viewing movies on a Sony 4K projector at my local theater. "The Witch" was really hard to watch and downright irritating with far worse milky blacks. Scenes taken in overcast daylight had highlights of white garments reading mid gray. It was like watching a dark movie through sunglasses.

I started reading the wiki on DCI/DCP to see what might have caused this, since other movies had well-defined shadows and dense black points, though still lighter than the black matte surround and overscan area, which is the case with my Samsung HDTV; still no complaints about picture quality.

I read that some movies are downloaded directly to theaters' 4K projector servers through broadband, which might have sent a video-levels signal instead of data levels. Nice to know it can be chosen in grading software such as Resolve. There's got to be some type of flip switch embedded for TV vs Web/Projectors in the final release form, because color management can only do so much.

Thanks for the insightful info Morgan and John.
Title: Re: Does the emperor have clothes?
Post by: Tim Lookingbill on April 02, 2016, 10:02:57 am
John, you're right about the B-roll black point differences.

I watched the actual promo video in Firefox, which is color managed, and checked the black point levels of single frames with Apple DigitalColor Meter. I got readings down to 2 in the green channel, so that video is output at "data levels" for web viewing on computer screens, and I'm seeing a very robust dynamic range. There's quite a bit of shadow detail in low-lit trees, and I could make a slight distinction between the man's belt and black trousers when he's leaning on the glass balcony railing looking at the sun and surf.

I was thinking my own display's black levels were too light, so I moved the video browser page over to my black bezel surround and compared 0RGB to the man's trousers, and I see my black density is pretty much the same as what I've seen on Sony 4K movies, compared to their overscan, which is a bit darker, and the screen surround, which is pitch black. Display black density can never be as dark as the surround. I took a shot of my display to show these black density relationships, which I'm assuming are the new normal in display dynamics.

I've seen 4K TV's at Best Buy that have even more dense blacks but the demo video just looks odd.
Title: Re: Does the emperor have clothes?
Post by: John Brawley on April 02, 2016, 03:00:47 pm
Tim....

The footage was designed and graded to be watched on a television.

It doesn't matter if your web browser is colour managed.

You can actually download the UHD file.  But you can also download the source DNG files. Why don't you try this.

Watch the UHD file on a television.

Grade the DNGs you have on your computer screen to match what you see on your television.

Render them out.

Watch on television.

Tell me how that turns out for you.

JB.
Title: Re: Does the emperor have clothes?
Post by: Tim Lookingbill on April 02, 2016, 10:30:51 pm
Tim....

The footage was designed and graded to be watched on a television.

It doesn't matter if your web browser is colour managed.

You can actually download the UHD file.  But you can also download the source DNG files. Why don't you try this.

Watch the UHD file on a television.

Grade the DNGs you have on your computer screen to match what you see on your television.

Render them out.

Watch on television.

Tell me how that turns out for you.

JB.

John, I don't understand why you'd want me to try that.

The video looks great on my computer screen. Or are you saying going by what I posted of the shot of my display black point the video is too dark or too light? Or do you just want me to see the levels change just to compare?

Besides, my Samsung HDTV is eyeball-calibrated to render tonality similar to what I see on my computer screen, including white balance, though a bit on the blue side. I have all black level adjustments turned off. Also, I play through my HDMI connection, and the HDMI black point menu setting is now unselectable and defaults to normal. When I hooked it up to my Mac it was selectable, but it crushed the black point and produced posterized artifacts in the shadows, as if there weren't enough RGB levels available to roll smoothly out of black.

When you say the video was designed to be watched on a television is that in reference to the levels (video vs data)?