Luminous Landscape Forum

Equipment & Techniques => Medium Format / Film / Digital Backs – and Large Sensor Photography => Topic started by: evgeny on October 05, 2012, 08:06:33 pm

Title: Dynamic range test
Post by: evgeny on October 05, 2012, 08:06:33 pm
Hi, does this unprocessed photo illustrate the dynamic range of the Hasselblad H4D-40?

Thanks
Evgeny
Title: Re: Dynamic range test
Post by: ErikKaffehr on October 05, 2012, 11:15:01 pm
Hi,

In no way.


DR is hard to measure. The best way to do it is to use a technique developed by Arri: http://www.arri.com/camera/tutorials/dynamic_range_test_charts.html

The second best is to use a "Stouffer wedge", expose correctly to the right, and try to recover as many dark steps as possible. The problem is that you need to reduce lens flare and light leaks.
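The wedge method can be sketched numerically. A minimal sketch with hypothetical numbers: made-up patch readings from a 21-step wedge (each step 0.15 density, roughly half a stop) and a deliberately crude noise-floor criterion for a "recovered" step. None of this is from a real measurement; it only illustrates the arithmetic.

```python
import math

# Each Stouffer 21-step wedge patch is 0.15 density darker than the
# previous one; 0.15 density = 0.15 / log10(2) ~ 0.498 stops.
STEP_STOPS = 0.15 / math.log10(2)

def usable_steps(patch_means, noise_floor):
    # A step counts as "recovered" while its mean signal is still
    # above the noise floor (a deliberately crude criterion).
    return sum(1 for m in patch_means if m > noise_floor)

def dynamic_range_stops(patch_means, noise_floor):
    return usable_steps(patch_means, noise_floor) * STEP_STOPS

# Simulated readings: an ideal wedge starting just below clipping on
# a 12-bit (0..4095) scale, with an assumed noise floor of 6 counts.
patches = [4000 / 2 ** (i * STEP_STOPS) for i in range(21)]
print(round(dynamic_range_stops(patches, noise_floor=6.0), 1))  # -> 9.5
```

In a real test the flare and light leaks Erik mentions would lift the dark patches and inflate the count, which is exactly why they have to be controlled.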

Best regards
Erik
Title: Re: Dynamic range test
Post by: EricWHiss on October 06, 2012, 01:25:50 am
If you had a couple of cameras, took the same picture, set the input curves to linear, and used neutral input profiles, you could use a picture like this to compare the DR of the cameras. There are probably people with enough experience to make a broad judgement on an image such as yours.

Definitely, in bright midday sun, if you can catch everything without blowing the highlights or blocking the shadows, it seems you have enough DR!
Title: Re: Dynamic range test
Post by: MrSmith on October 06, 2012, 02:09:54 am
It shows you used a fill in flash.
Title: Re: Dynamic range test
Post by: EricWHiss on October 06, 2012, 03:28:48 am
It shows you used a fill in flash.

Good catch! Looking at the red eyes now, how could I have missed that! And I think I missed the joke too, correct, Evgeny?
Title: Re: Dynamic range test
Post by: Dick Roadnight on October 06, 2012, 04:02:03 am
There is a difference between a practical "real world" illustration and a scientific test.

...and, if fill-flash was not used, and there was a white building in direct sunlight in the picture, it would have been a good illustration... except that sunlight is very inconsistent... depending on the cloud-fill.

For real-world assessment of a camera, post-processing would be used, and Hasselblad Phocus is very useful.

This was a Hasselblad H4D-50 (before I upgraded to the 60) with cloud fill and PP with Phocus... I thought that I might need HDR, so I took several pictures, but Phocus managed.
Title: Re: Dynamic range test
Post by: evgeny on October 06, 2012, 04:07:10 am
I kept it simple.
Used a built-in -0.7 flash for fill.
I don't see red eyes at 100% magnification, but my eyes are not perfect. ;)

I probably need to grab a 35mm camera to make a better comparison.  ;D

Thanks
Evgeny
Title: Re: Dynamic range test
Post by: michele on October 06, 2012, 05:52:25 am
I think a better test can be trying to open the shadows... Here is an image taken with a 5D2... exposed for the highlights, then processed twice, once for the sky, then for the shadows (+ around 2 stops in Capture One), and then blended together in Photoshop...
Title: Re: Dynamic range test
Post by: FredBGG on October 06, 2012, 04:44:25 pm
Hi, does this unprocessed photo illustrate the dynamic range of the Hasselblad H4D-40?

Thanks
Evgeny

Mom in a white silk embroidered shirt next to the lovely kid would have been a better example.....
Title: Re: Dynamic range test
Post by: Wayne Fox on October 06, 2012, 06:49:18 pm
Hi, does this unprocessed photo illustrate the dynamic range of the Hasselblad H4D-40?

How did you ensure it was "unprocessed"?  By looking at the image, I'm pretty sure some "default" processing was applied by whatever program created the file.  Just because you made no additional changes yourself, it doesn't mean it was "unprocessed".

It's not that easy to create an image without any processing from the raw file, and the resulting image doesn't look like much.
Title: Re: Dynamic range test
Post by: BernardLanguillier on October 06, 2012, 08:04:44 pm
This was a Hasselblad H4D-50 (before I upgraded to the 60) with cloud fill and PP with Phocus... I thought that I might need HDR, so I took several pictures, but Phocus managed.

If I may ask, why did you have concerns about the DR of this scene?

Cheers,
Bernard
Title: Re: Dynamic range test
Post by: ErikKaffehr on October 07, 2012, 01:51:41 am
Hi,

If a subject has a wide dynamic range and it is processed into an image, it will either look flat or harsh, because a screen or print cannot reproduce a wide dynamic range. So you need to apply some kind of tone mapping to the image to liven it up.

Some programs may do it on their own.

The image below is not extremely wide in tonal range; it is just very boring:
(http://echophoto.dnsalias.net/ekr/images/Psychadelic/InitialImport-7524.jpg)

The second image has been reprocessed with tone mapping and a technique for darkening the sky:
(http://echophoto.dnsalias.net/ekr/images/Psychadelic/FinalAdjustmentInLR-7524.jpg)

The third image has been processed in LR4, with the sky darkened in Photoshop using a duplicate layer, multiply and a luminosity mask based on the blue channel.
(http://echophoto.dnsalias.net/ekr/images/Psychadelic/FinalAdjustmentInLR-7524_small.jpg)


Now these adjustments may go over the top; see it more as an illustration than a good example.
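The compression involved can be sketched with a simple global tone-mapping curve. This uses Reinhard's well-known L/(1+L) operator as a stand-in; it is not what Lightroom or any particular program actually applies, and the scene values are hypothetical:

```python
# A minimal sketch of global tone mapping (Reinhard's simple
# operator, L/(1+L)): scene-linear values spanning many stops are
# compressed into display range [0, 1) while midtones keep contrast.
def reinhard(l):
    return l / (1.0 + l)

# Hypothetical scene values from deep shadow (-6 stops) to strong
# highlight (+4 stops) around middle grey at 0.18:
stops = range(-6, 5)
scene = [0.18 * 2 ** s for s in stops]
mapped = [reinhard(l) for l in scene]
for s, l, m in zip(stops, scene, mapped):
    print(f"{s:+d} stops: linear {l:8.4f} -> display {m:.4f}")
```

The highlights roll off smoothly toward 1.0 instead of clipping, which is the basic trick every tone mapper plays in some form.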

Best regards
Erik




How did you ensure it was "unprocessed"?  By looking at the image, I'm pretty sure some "default" processing was applied by whatever program created the file.  Just because you made no additional changes yourself, it doesn't mean it was "unprocessed".

It's not that easy to create an image without any processing from the raw file, and the resulting image doesn't look like much.
Title: Re: Dynamic range test
Post by: Dick Roadnight on October 07, 2012, 08:40:45 am
If I may ask, why did you have concerns about the DR of this scene?

Cheers,
Bernard

Hi, Bernard...

Of course you can ask.

There is a combination of deep shadows and dark foliage... and white painted buildings (and white van) in direct sunlight... I think that without the cloud fill it would have been difficult. (fill flash tends not to work too well at distances between 100 and 500 meters).

¿Has anyone got an app to wirelessly control their cloud fill reflectors from an iPad?
Title: Re: Dynamic range test
Post by: Guillermo Luijk on October 07, 2012, 08:54:58 am
It's not that easy to create an image without any processing from the raw file, and the resulting image doesn't look like much.

I think it is very easy: just use a RAW developer that only applies white balance and demosaicing, e.g. DCRAW, RawTherapee, or ACR with all settings at 0.

Title: Re: Dynamic range test
Post by: ErikKaffehr on October 07, 2012, 10:05:33 am
Hi,

I don't think that scene was very challenging. Of course, you need to expose for the highlights, and that may leave some noise in the shadows. With LR, PV2012 seems to achieve some magic in highlight restoration.

Best regards
Erik


Hi, Bernard...

Of course you can ask.

There is a combination of deep shadows and dark foliage... and white painted buildings (and white van) in direct sunlight... I think that without the cloud fill it would have been difficult. (fill flash tends not to work too well at distances between 100 and 500 meters).

¿Has anyone got an app to wirelessly control their cloud fill reflectors from an iPad?
Title: Re: Dynamic range test
Post by: deejjjaaaa on October 07, 2012, 10:16:37 am
I think it is very easy: just use a RAW developer that only applies white balance and demosaicing, e.g. DCRAW, RawTherapee, or ACR with all settings at 0.

ACR applies color transformations (a camera profile)... so you might want to cook a special profile then, right?
Title: Re: Dynamic range test
Post by: Guillermo Luijk on October 07, 2012, 10:18:40 am
ACR applies color transformations (a camera profile)... so you might want to cook a special profile then, right?

DCRAW applies them as well, in fact using Adobe's matrices. But this doesn't affect DR calculations since they are linear.
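The point about linearity can be made concrete with made-up numbers. A sketch, assuming a hypothetical 12-bit saturation level and noise floor: a linear operation (a white-balance gain or a color-matrix coefficient) multiplies the saturation level and the noise floor by the same factor, so DR in stops, log2(saturation / noise), is unchanged:

```python
import math

# DR in stops is the log ratio of the brightest recordable signal
# to the noise floor.
def dr_stops(saturation, noise):
    return math.log2(saturation / noise)

sat, noise = 4095.0, 2.0          # hypothetical 12-bit sensor
gain = 1.9                        # e.g. a red-channel WB multiplier
before = dr_stops(sat, noise)
after = dr_stops(sat * gain, noise * gain)
print(round(before, 2), round(after, 2))  # identical
```

The gain cancels in the ratio, which is why a purely matrix-based development stage cannot add or remove engineering DR.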
Title: Re: Dynamic range test
Post by: Doug Peterson on October 07, 2012, 10:26:38 am
I think it is very easy: just use a RAW developer that only applies white balance and demosaicing, e.g. DCRAW, RawTherapee, or ACR with all settings at 0.

There are no true "zeros" in a raw processor/viewer. Every "zero" or "default slider location" is an active choice made by an image-processing engineer. There are some established standards (e.g. what RGB value should "median gray" have) and some loosely followed conventions (e.g. how strong of an s-curve to apply to the linear output of the sensor). But both the goals and methods of getting there are entirely up to the developers of the processor/viewer and are proprietary, often patented, and vary in surprisingly large ways from each other.

What's worse is I feel this misconception hurts photographers. In the days of film, photographers would select a film and processor based on the subject, intention of the shoot, client needs, and intended style; i.e. they'd pick it for its look (color, tonality, range, grain, etc.) and technical attributes. Now, photographers often shoot digital raw files starting with the assumption that the "default" settings (whether the in-camera LCD/JPG or in their preferred raw processor) are a sort of "truth".

One negative impact of this mentality is that photographers assume different raw processors are just different interfaces to show you the one "true" image. Whereas in fact even the "default" look between raw processors can vary a lot. Everything about the image can vary between processors, from the amount of absolute detail, to the amount and structure of the grain, to the way tonal transitions are rendered, to the way highlights and shadows roll off (i.e. the placement/severity/shape of the toe and shoulder), to the color of the image. The fact is those "default" settings and "0" positions on the slider are largely arbitrary.

Moreover, the math in any given processor varies from one camera to the next - both the user-visible slider positions and the fundamental behind-the-scenes math. E.g. Capture One does not apply the same underlying math to an IQ180 raw as it does to a Canon G10 raw - it would be stupid for it to do so; the sensors (and the characteristics of the data they put out) are radically different and demand to be treated differently.

I think cooking is a good analogy. Give two chefs the same ingredients, tell them to cook an Ohio-style beef casserole, and you'll get two different meals. Even if you give them a specific point of reference, say for instance your grandma's casserole (e.g. Fuji Astia with a 1/3rd stop pull), you won't get the same meal.

From unscientific surveys at raw-processing workshops I've taught I'd guess that roughly half of serious photographers have never bothered to open a raw file in more than one processor to compare the results. Compare this to the percentage of photographers who only ever shot one emulsion of film with one processor and I hope you'll agree that we've lost something; I attribute much of this loss to this misconception that "defaults = true image".  
Title: Re: Dynamic range test
Post by: Doug Peterson on October 07, 2012, 10:48:26 am
DCRAW applies them as well, in fact using Adobe's matrices. But this doesn't affect DR calculations since they are linear.

Dynamic range, as defined photographically rather than in engineering terms ("how much shadow and highlight detail can I render in a pleasing way"), is very heavily dependent on the raw processor. There has been a lot of development in this regard in the last few years; you need only take a 1Ds II file and process it in Capture One 3 and Capture One 6 to see the difference.

In my experience the difference in how much range can be rendered in a photographically useful way can vary as much as two stops between raw processors.

But then again you could just use DxO's charts to tell you what the DR of a particular camera is - assuming that what you use cameras for is to generate charts and not pictures, they will be perfectly accurate.
Title: Re: Dynamic range test
Post by: Guillermo Luijk on October 07, 2012, 10:54:16 am
There are no true "zeros" in a raw processor/viewer. Every "zero" or "default slider location" is an active choice made by an image-processing engineer.

Not really. DCRAW is a good reference for a truly neutral RAW development (even the white balance and colour profiling can be avoided). It does not have zeros; it simply works with zero processing in a linear workflow. By setting everything to 0 in ACR (including setting the curve to linear), the result is identical to DCRAW's. However, if there is any doubt about what's going on under the hood, the solution is as simple as using DCRAW, which is a universal and neutral RAW developer.


Dynamic range, as defined photographically rather than in engineering terms ("how much shadow and highlight detail can I render in a pleasing way"), is very heavily dependent on the raw processor.

IMO DR is a camera sensor feature. What a RAW processor can or cannot do is somewhat arbitrary and depends on how the software was implemented. My camera sensor's DR won't be worse just because a particular RAW processor can't extract the most from my RAW files. DR is a hardware feature; how we manage and output it to the final image is a software process.

Title: Re: Dynamic range test
Post by: Doug Peterson on October 07, 2012, 11:08:56 am
IMO DR is a camera sensor feature. What a RAW processor can or cannot do is somewhat arbitrary and depends on how the software was implemented. My camera sensor's DR won't be worse just because a particular RAW processor can't extract the most from my RAW files. DR is a hardware feature; how we manage and output it to the final image is a software process.

If an image detail is rendered in the woods and no one is around to view it - does it count as DR?

There is no inherent photographic truth in a raw file. Unless you plan on displaying the original Red-only, Blue-only, Green-only pixels, 1:1, with a monitor tuned to specifically match the sensor's original pixels, then everything you do to view a raw file is an interpretation. That interpretation may be geared towards neutral (in the same way Astia was geared towards neutral), but it is an interpretation nonetheless.

Moreover, the improvement brought by better raw processors does not benefit all cameras equally. Some cameras are on relatively equal footing when you look only at very basic raw processing of their files, but become radically different in quality when each raw file is handled as well as possible (with sophisticated math catered to get the most out of each).

You can't judge any chain by an individual link.  It's neither the sensor alone, nor the software alone which defines photographically useful DR: it is only the entire image-chain (filter>lens>sensor>software and a dozen other factors) that produces a viewable image in which you can judge useful DR.

If two cars have the same exact engine/tires/chassis etc but have different algorithms controlling their automatic transmission they will finish the 0-60 in different times. Is the one that finishes earlier not faster? You don't drive an engine; you drive a car. You don't produce images with sensors, you produce them with camera systems.

Camera systems should always be judged and compared as systems. Because that's what you produce images with.
Title: Re: Dynamic range test
Post by: Guillermo Luijk on October 07, 2012, 11:10:39 am
If an image detail is rendered in the woods and no one is around to view it - does it count as DR?

Of course, things don't need to be viewed to exist (e.g. air).
Title: Re: Dynamic range test
Post by: Doug Peterson on October 07, 2012, 11:27:12 am
Of course, things don't need to be viewed to exist (e.g. air).

Ah! Then we can agree to disagree.

Because in photography I really do not care about sensors, microlens design, signal-to-noise ratios, gamut coverage, spectral responses, diffraction, or any of the other highly technical things I post frequently about. Knowing, testing, and discussing these things simply helps me provide better advice to customers about what tools will help them best achieve their photographic goals.

So from that point of view I think we are both right. You want to know about, compare, and debate the engineering characteristics of various sensors and I want to know about, compare, and debate how far into the shadow or highlight I (or a customer) can place a subject and still get beautiful result.

An avid reader might thoroughly study grammar and punctuation but judge books on whether the ideas were effectively conveyed. The grammar, punctuation and other technical-writing elements do matter - they can enhance or muddle the ideas. In fact just like in photography the experience of reading a book can be enhanced by "poor" grammar or punctuation (e.g. a rambling stream of consciousness may be better received by skipping periods and semicolons or capitalization); likewise a gritty emotionally wrenching image might be better shown with crushed featureless shadows and heavy grain. It helps if you understand grammar and punctuation so you know where and how you might leverage them appropriately. But I do not find much value in debating the Harvard comma as an academic question of grammatical semantics; I only care to debate it in the context of the readability and enjoyability of the writing it is used in.

I do believe I've gone off on a tangent. Oh well; off to take some pictures!
Title: Re: Dynamic range test
Post by: Dick Roadnight on October 07, 2012, 01:00:42 pm
Hi,

I don't think that scene was very challenging. Of course, you need to expose for the highlights, and that may leave some noise in the shadows. With LR, PV2012 seems to achieve some magic in highlight restoration.

Best regards
Erik
I do not want to start a war, but is LR a patch on Phocus? ...and has anyone done any real-world testing, including using the raw converter to cope with real-world contrasts in landscapes?

¿You can use Lightroom on Hasselblad raw files, so it would be possible to do a hardware-independent test?
Title: Re: Dynamic range test
Post by: FredBGG on October 07, 2012, 01:30:09 pm
You don't produce images with sensors, you produce them with camera systems.

You produce images in front of the camera.
This is done through composition, directing the subject, or going to the right place at the right time.
Once the image is there to be photographed, THE LENS creates the image.
The sensor or film records that image. The recording is probably the most critical point of the process, because it is what saves the image in time.
The camera (the box) itself is a facilitator, making it easier or more functional to make this recording correctly.
In fluid situations the camera can dramatically change the recording possibilities.

The raw processor is simply the developing stage of the process; however, it cannot produce dynamic range from thin air.
It can only work with the recorded data.

On the other hand, there are methods of chemically developing film that alter dynamic range, for example the water bath process for developing black and white film. However, this process still has to work with the "recording" made during exposure.

While the lens is what creates the optical image... the projection of what is in front of the camera... the recording device is critical to a good result.

Many try to reduce the importance of dynamic range to how much you can pull out of the shadows, but it is more than that.
The more range you have, the more recording capability you have. The more that is recorded, the more flexibility you have later.

Another very significant improvement in sensor design is live view. Proper live view that is high quality and real time allows the photographer to see things directly that would otherwise not be seen and would have to be predicted. It is also very effective, if not essential, for critical focus.
Another example of how useful it can be is shooting black and white. On many cameras you can set the camera to record black and white JPEGs but still record color raw files.
At the same time the camera can show you PROCESSED black and white in live view.
This is of particular importance if you are intending to do some heavy color filtration in the black and white conversion process.
Sometimes certain skin tones can react strangely to strong color filtration, and it can be even worse with makeup.

While I still shoot a lot of film, I will often use a digital camera as an advanced light meter, a "scene analyzer" of sorts, and a Polaroid replacement.

We have come a long way from the simple ground glass screen, but at the same time there is a certain magic in looking at the image on a ground glass screen... even if it is backwards and upside down ;D
Title: Re: Dynamic range test
Post by: ErikKaffehr on October 07, 2012, 03:30:16 pm
Hi,

PV 2012 is much better at restoring highlights than previous versions. If you send me the raw image I will gladly check it out in Lightroom.

Yes, of course I have tested Lightroom on real-life landscapes, but with my own images. Lightroom is what I use, so I have tested it with something like 50,000 images, about half of them landscapes.

Yes, LR works on Hasselblad files. Alex Koskolov made some raw images available for testing and I have looked at those in this topic: http://www.luminous-landscape.com/forum/index.php?topic=69391.0

Best regards
Erik


I do not want to start a war, but is LR a patch on Phocus? ...and has anyone done any real-world testing, including using the raw converter to cope with real-world contrasts in landscapes?

¿You can use Lightroom on Hasselblad raw files, so it would be possible to do a hardware-independent test?
Title: Re: Dynamic range test
Post by: MrSmith on October 07, 2012, 03:42:25 pm
If an image detail is rendered in the woods and no one is around to view it - does it count as DR?



Either you have got a handle on the subtle, slightly obtuse British sense of humour, or I LOL'd at something that wasn't intentionally funny.
Title: Re: Dynamic range test
Post by: deejjjaaaa on October 07, 2012, 04:36:22 pm
DCRAW applies them as well, in fact using Adobe's matrices. But this doesn't affect DR calculations since they are linear.


but ACR/LR might apply not just a matrix - you actually need to check which profile is selected for a particular camera model, and you shouldn't assume it will be just a matrix.
Title: Re: Dynamic range test
Post by: deejjjaaaa on October 07, 2012, 04:42:14 pm
There are no true "zeros" in a raw processor/viewer. Every "zero" or "default slider location" is an active choice made by an image-processing engineer.

and in some raw converters "zero" is a true zero, precisely by an active choice made by the developer... and if it is open source you can actually see that... and there are converters that allow you, by choice, not to apply any WB, not to apply any curves, not to apply any color transforms to the camera's RGB values, and not even to demosaic...
Title: Re: Dynamic range test
Post by: Wayne Fox on October 08, 2012, 01:59:01 pm
and in some raw converters "zero" is a true zero, precisely by an active choice made by the developer... and if it is open source you can actually see that... and there are converters that allow you, by choice, not to apply any WB, not to apply any curves, not to apply any color transforms to the camera's RGB values, and not even to demosaic...
Just curious, which converters?  I'm trying to figure out a way to get a straight image without anything other than de-mosaicing and haven't had much success.  They all seem to apply at least some gamma adjustment.
Title: Re: Dynamic range test
Post by: Guillermo Luijk on October 08, 2012, 03:09:01 pm
Just curious, which converters?  I'm trying to figure out a way to get a straight image without anything other than de-mosaicing and haven't had much success. They all seem to apply at least some gamma adjustment.

Commercial developers usually do, because integer TIFF files are not well suited for post-processing in a linear state (shadows quickly posterize due to the lack of levels).

But there is actually no reason not to develop a RAW file linearly, as long as you view the resulting image in a program that takes that linearity into account. For instance, I built linear versions of sRGB, Adobe RGB and ProPhoto RGB (to do that you just need to set gamma to 1.0 in PS's native profiles) to properly render linear images (obtained with DCRAW) in PS. Their histogram is strongly shifted to the left, but they look the same as the gamma-developed version.

I am no expert, but I guess floating-point HDR formats are all linear as well, in this case to make tone-mapping calculations easier and faster.
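The posterization point can be made concrete by counting code values. A sketch counting how many integer codes land in each stop below clipping for a 16-bit file, stored linearly versus with a plain 1/2.2 gamma encoding (the 2.2 exponent is a stand-in; real profile curves such as sRGB's differ slightly):

```python
MAX16 = 2 ** 16 - 1  # 65535

def linear_codes_in_stop(n):
    # Integer code values between n and n+1 stops below clipping,
    # when the file stores linear data.
    top = MAX16 / 2 ** n
    return int(top) - int(top / 2)

def gamma_codes_in_stop(n, g=2.2):
    # Same stop, but with values encoded as linear ** (1/g).
    hi = (2.0 ** -n) ** (1 / g) * MAX16
    lo = (2.0 ** -(n + 1)) ** (1 / g) * MAX16
    return int(hi) - int(lo)

for n in (0, 4, 8, 10):
    print(f"stop -{n:2d}: linear {linear_codes_in_stop(n):6d} codes, "
          f"gamma {gamma_codes_in_stop(n):5d} codes")
```

Linearly, half of all 16-bit codes sit in the top stop and only a few dozen remain ten stops down, while the gamma encoding keeps hundreds of codes in those deep shadows; this is why linear integer files fall apart first in the shadows when edited.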
Title: Re: Dynamic range test
Post by: deejjjaaaa on October 08, 2012, 05:43:02 pm
Just curious, which converters?  I'm trying to figure out a way to get a straight image without anything other than de-mosaicing and haven't had much success.  They all seem to apply at least some gamma adjustment.

RPP ( http://www.raw-photo-processor.com/RPP/Overview.html ) ...

something like /Applications/Raw Photo Processor 64.app/Contents/MacOS/Raw Photo Processor 64 -NoCFA YES will run w/o demosaicking (useful for cameras w/ a removed CFA filter, for example, or true mono... but I run it sometimes on regular CFA raws if I need a particular look).

in addition you can

1) convert w/o applying any color transforms, in the camera's RGB (= w/o any camera "profile")
2) select gamma = 1
3) directly enter WB multipliers (and hence directly say "use UniWB during conversion", if you want to) - it does not operate w/ the K/tint paradigm at all

for example (here still w/ demosaicking applied) = no color transforms, gamma 1, no WB, output to a 16-bit camera-RGB TIFF

(http://imageshack.us/a/img835/6379/rppm.jpg)

if you want, you can also dump 32-bit floating point w/ gamma 1
Title: Re: Dynamic range test
Post by: Wayne Fox on October 08, 2012, 09:04:04 pm
Thanks.
Title: Re: Dynamic range test
Post by: deejjjaaaa on October 10, 2012, 09:33:30 am
Thanks.

I forgot to account for spaces in the command line example, sorry  = "/Applications/Raw\ Photo\ Processor\ 64.app/Contents/MacOS/Raw\ Photo\ Processor\ 64 -NoCFA YES"
Title: Re: Dynamic range test
Post by: Wayne Fox on October 11, 2012, 03:09:31 pm
I forgot to account for spaces in the command line example, sorry  = "/Applications/Raw\ Photo\ Processor\ 64.app/Contents/MacOS/Raw\ Photo\ Processor\ 64 -NoCFA YES"
Thanks.  Works great.  I was hoping the file would actually show each pixel as seen through its corresponding filter (so basically a bunch of RGGB dots). Showing each pixel as basically its density should work well enough to demonstrate to the class what a raw file really is and why all raw files need "processing".  The result is a really bad, dark B&W with a wide contrast range but very few midtones, showing why a linear capture device needs correction to look normal to us.  I still have students who believe the conversion on the back of the camera is what the camera saw, and anything other than that is manipulation.
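The "dark linear rendering" effect can be shown with a single number. A sketch, assuming an 18% middle-grey reflectance and a plain 2.2 gamma encoding (a simplification of real transfer curves like sRGB's):

```python
# Why a straight linear rendering looks dark: scene middle grey
# (~18% reflectance) sits at only 18% of full scale in linear data,
# but a standard ~2.2 gamma encoding lifts it to roughly half scale,
# which is what looks "normal" on a display.
def encode_gamma(linear, gamma=2.2):
    return linear ** (1.0 / gamma)

middle_grey = 0.18
print(f"linear: {middle_grey:.2f} -> encoded: {encode_gamma(middle_grey):.2f}")
# prints: linear: 0.18 -> encoded: 0.46
```

That one lift from 0.18 to roughly 0.46 is most of the difference between the "really bad, dark B&W" straight conversion and an image that looks normal.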
Title: Re: Dynamic range test
Post by: Doug Peterson on October 11, 2012, 03:51:00 pm
The result is a really bad, dark B&W with a wide contrast range but very few midtones, showing why a linear capture device needs correction to look normal to us.  I still have students who believe the conversion on the back of the camera is what the camera saw, and anything other than that is manipulation.

Don't feel bad; we have plenty of professional photographers as clients who believe the same thing.

Hence my earlier rant (http://www.luminous-landscape.com/forum/index.php?topic=71218.msg564814#msg564814).

Understanding that raw processing is an essential link in the creation of a digital image is vital to gaining the best control over that creation.
Title: Re: Dynamic range test
Post by: deejjjaaaa on October 11, 2012, 04:53:35 pm
Thanks.  Works great.  I was hoping the file would actually show each pixel as seen through its corresponding filter (so basically a bunch of RGGB dots).

if you want that (colorized dots), then you need RawDigger - http://www.rawdigger.com - which is not exactly a raw converter, but good enough for illustration purposes... it will show you the undemosaicked RGGB dots, colorized... RPP's mode was intended specifically for b&w cameras (like Bayer cameras with the CFA removed, or monochrome MF/Leica) where you naturally do not need any demosaicking and the image is supposed to be b/w/greyscale


(http://imageshack.us/a/img854/1084/107061220121011165530ra.jpg)
Title: Re: Dynamic range test
Post by: deejjjaaaa on October 11, 2012, 05:01:51 pm
Don't feel bad; we have plenty of professional photographers as clients who believe the same thing.
but it is still an image... in that sense a raw file is no different from any JPG or PNG or BMP... all of them are just bits and bytes, and you need software and certain coded math to display the content on your monitor... I'd rather say that you actually need more math to display something from a JPG than from raw data.
Title: Re: Dynamic range test
Post by: Wayne Fox on October 12, 2012, 05:22:45 pm
but it is still an image... in that sense a raw file is no different from any JPG or PNG or BMP... all of them are just bits and bytes, and you need software and certain coded math to display the content on your monitor... I'd rather say that you actually need more math to display something from a JPG than from raw data.

Rawdigger is perfect for what I'm trying to do!  Thank you very much.

The point I'm trying to make to the students in the class is that the sensor just records levels of brightness, so a raw file in a straight conversion would just show a bunch of grey dots.  Since we know what color of filter was over each sensel, we can add that information and get a mosaic, but that's still not very useful.  Add the fact that it's linear, and basically without some type of post-processing a digital capture is useless.  So every image captured goes through a complex routine that makes it possible for us as humans to visualize, and the process is very interpretive; only the person taking the photograph knows what they saw and what they were trying to capture.  The point being: whether you let the engineers who designed the firmware make those decisions, or whether you want to take over that process.  Often the comeback is "this isn't natural like it is with film", which launches into a second discussion as to how all of these properties are built into the film, but they are still there.

My challenge is getting students to understand the significance and importance of shooting raw when they want maximum quality.  The issue is exacerbated because the college offers this course at two different times (I'm substituting for a good friend who teaches the night class), and the day class is taught by the school's "computer" guy... someone who teaches Word, Excel, Windows, etc.  He's stated that photography is not an art, and some of the questions he has come up with for the tests show he isn't at all qualified (we had a really good laugh at some of his questions and his answers, but sadly it's not a funny situation because his students really aren't getting what they are paying for).  Because he's on staff and Photoshop is a computer program, the school seems to think he knows what he's talking about.  He tells the students that JPEGs are more than enough and not to waste the storage space on raw files.

This is a new course of study at a local business college, and my friend, who is a very respected and knowledgeable photographer, just met with the administrator, and I think they understand better now what is going on.  They've asked him, and not the computer guy, to plan the curriculum for the Intermediate Photography courses they want to introduce next semester.  Unfortunately, even though my friend is eminently qualified to teach the material, they can't hire him as full-time staff because he doesn't have a master's degree.