
Author Topic: Sensors, pixels and histograms  (Read 6571 times)

David Sutton

Sensors, pixels and histograms
« on: January 19, 2010, 09:26:05 pm »

Hi. I've got a couple of really basic questions about digital images that I wonder if anybody can help me with?
Let me see if I've got this right:
A pixel (from picture element) is the smallest element in an image that can be controlled with photo editing software.
A photo's histogram shows its pixel brightness values, from darkest to lightest. So it's a pixel-based histogram.
A raw file's histogram would show the number of photons counted, from some to heaps. It would not be pixel based, as there are no pixels yet because the file hasn't been demosaiced. In the same way as there is no “picture” until a film is developed.
If that's okay so far and I'm not confused, questions:
In a camera's sensor (Bayer array), what is the relationship between the number of photosensor elements and the number of pixels in the demosaiced image? Is it one to one?
If not, I'd prefer not to use the word “pixel” to describe a photon counter. What's an accurate word? Photosite? Photon receptor?
When I open a raw file in Lightroom, it's been demosaiced but not rendered. What am I seeing on screen? Is it a jpeg produced by the software in a similar way to the image on the camera lcd after shooting? So am I seeing a pixel based histogram or something else?
Thanks in advance, David

ErikKaffehr

Sensors, pixels and histograms
« Reply #1 on: January 20, 2010, 01:23:26 am »

Hi,

See comments below.

A very major issue you don't discuss is white balancing, which decides the balance of the different channels. This is the major issue with in camera histograms. Changing the color balance can shift the RGB histograms significantly.
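To make that concrete, here is a minimal sketch (Python with numpy, which I'm assuming is available) of white balance applied as simple per-channel gains on linear data. The multipliers are made up for illustration, not taken from any real camera; the point is that scaling R and B relative to G shifts those two histograms while leaving green alone, which is why the in-camera RGB histograms move so much with colour balance.

Code:
import numpy as np

# Illustrative linear RGB data in [0, 1]; shape (height, width, 3).
rng = np.random.default_rng(0)
img = rng.random((100, 100, 3))

# Hypothetical white-balance multipliers, green used as the reference.
wb_gains = np.array([2.0, 1.0, 1.5])   # R, G, B

balanced = np.clip(img * wb_gains, 0.0, 1.0)

# Per-channel histograms before and after: only the scaled channels move.
for c, name in enumerate("RGB"):
    before, _ = np.histogram(img[..., c], bins=64, range=(0, 1))
    after, _ = np.histogram(balanced[..., c], bins=64, range=(0, 1))
    print(name, "peak bin before:", before.argmax(), "after:", after.argmax())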

The demosaic process does not affect color, IMHO, but is important regarding sharpness and aliasing.

Best regards
Erik

Quote from: David Sutton
Hi. I've got a couple of really basic questions about digital images that I wonder if anybody can help me with?
Let me see if I've got this right:
A pixel (from picture element) is the smallest element in an image that can be controlled with photo editing software.
Yes
A photo's histogram shows its pixel brightness values, from darkest to lightest. So it's a pixel-based histogram.
Yes
A raw file's histogram would show the number of photons counted, from some to heaps. It would not be pixel based, as there are no pixels yet because the file hasn't been demosaiced. In the same way as there is no “picture” until a film is developed.
Don't agree, though you are sort of right. Demosaicing wouldn't affect the histogram significantly. There are three channels, RGB, and those are for real even before demosaicing. And it's not photons we measure but numbers, although the numbers relate to photons.
If that's okay so far and I'm not confused, questions:
In a camera's sensor (Bayer array), what is the relationship between the number of photosensor elements and the number of pixels in the demosaiced image? Is it one to one?
Well, 50% of the pixels are green, 25% blue and 25% red. Demosaicing guesses the missing color information for each pixel; 2/3 of the information in each pixel is interpolated.
If not, I'd prefer not to use the word “pixel” to describe a photon counter. What's an accurate word? Photosite? Photon receptor?
Photosite sounds fine to me.
When I open a raw file in Lightroom, it's been demosaiced but not rendered. What am I seeing on screen? Is it a jpeg produced by the software in a similar way to the image on the camera lcd after shooting? So am I seeing a pixel based histogram or something else?
It's a preview that has been rendered. I don't know about the histogram.
Thanks in advance, David
Erik Kaffehr
 

David Sutton

Sensors, pixels and histograms
« Reply #2 on: January 20, 2010, 03:08:49 am »

Hi Erik. Thanks for the reply. Is the following any better?

A photosite measures the light falling on it and this is stored in the raw file as a number. A raw histogram would show the amount of information collected, from little to a lot. It would not be pixel-based, as the pixels seen in a demosaiced and rendered image haven't been created yet. But it would be close to a histogram of such an image prior to white balancing.

So if I have this right, each photosite corresponds to a pixel seen in the image on the screen on a 1:1 basis, minus maybe a few at the edge. The colour in each pixel is "guessed" from the information stored in the raw file taken from the corresponding photosite and surrounding ones.
 
I'm trying to keep my thinking clear. If I start to use the word "pixel" when I mean "photosite", then the above sentences become nonsense. Assuming they aren't to begin with.


Regards, David

NikoJorj

Sensors, pixels and histograms
« Reply #3 on: January 20, 2010, 06:56:41 am »

Quote from: David Sutton
A pixel (from picture element) is the smallest element in an image that can be controlled with photo editing software.
A photo's histogram shows its pixel brightness values, from darkest to lightest. So it's a pixel-based histogram.
Yes, a histogram counts the number of pixels (Y axis) at each given brightness value (X axis).
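In code that description is almost literal: count how many pixels land on each brightness value. A minimal sketch (Python, numpy assumed; the synthetic input stands in for one 8-bit channel or a luminance image):

Code:
import numpy as np

# Stand-in for an 8-bit luminance image: any (H, W) array of values 0..255.
rng = np.random.default_rng(1)
luma = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)

# X axis: the 256 possible brightness values.
# Y value: how many pixels have each brightness.
counts = np.bincount(luma.ravel(), minlength=256)

print("darkest value present:", luma.min(), "brightest:", luma.max())
print("pixels at pure black:", counts[0], "pixels at pure white:", counts[255])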

Quote
A raw file's histogram would show the number of photons counted, from some to heaps. It would not be pixel based, as there are no pixels yet because the file hasn't been demosaiced.
It's still very feasible to have a raw-based histogram, because these photosites have both a color and a brightness (the photon count), from which one can build three histograms, one for each color.

Quote
In a camera's sensor (Bayer array), what is the relationship between the number of photosensor elements and the number of pixels in the demosaiced image? Is it one to one?
Yes, by default, there is one pixel of the output image per photosite.
There are other ways to reconstruct a picture though, eg some cameras may have a reduced-resolution mode for high sensitivities where one pixel of the rendered image corresponds to a square of 4 photosites.
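To illustrate that 1:1 relationship, here is a minimal bilinear demosaic sketch (Python, numpy and scipy assumed, an RGGB layout assumed; real converters use far more sophisticated algorithms). Each photosite becomes exactly one output pixel, and the two colour values that photosite never measured are interpolated from its neighbours.

Code:
import numpy as np
from scipy.ndimage import convolve

def bilinear_demosaic(mosaic):
    """mosaic: (H, W) linear Bayer data, RGGB layout assumed."""
    h, w = mosaic.shape
    r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)

    # Keep the measured samples, zero the rest, then fill the gaps by averaging.
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    out = np.empty((h, w, 3))
    out[..., 0] = convolve(mosaic * r_mask, k_rb, mode="mirror")
    out[..., 1] = convolve(mosaic * g_mask, k_g,  mode="mirror")
    out[..., 2] = convolve(mosaic * b_mask, k_rb, mode="mirror")
    return out   # one RGB pixel per photosite

demo = bilinear_demosaic(np.random.default_rng(2).random((8, 8)))
print(demo.shape)   # (8, 8, 3): same grid as the sensel array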

Quote
But it would be close to a histogram of such an image prior to white balancing.
Sort of... Keep in mind that a raw image without white balance does look strange, very green (search this forum or elsewhere for UniWB, which is in a sense similar to no white balance).
It's akin to looking at a color negative, if you see what I mean : very useful to see what margin of adjustment you've got while printing, but not so much to judge the image itself.

Quote
When I open a raw file in Lightroom, it's been demosaiced but not rendered.
What you see is a rendered image, with the parameters you specified (ie default parameters after the import). If you can see it (without wondering what that mess is), then it has been rendered. A raw image is not very human sight friendly.
And the histogram you see in Lightroom is based on the rendered image, so that you can see the effect of the develop parameters on it.
Nicolas from Grenoble
A small gallery

bjanes

Sensors, pixels and histograms
« Reply #4 on: January 20, 2010, 07:47:47 am »

Quote from: David Sutton
Hi. I've got a couple of really basic questions about digital images that I wonder if anybody can help me with?
Let me see if I've got this right:
A pixel (from picture element) is the smallest element in an image that can be controlled with photo editing software.
A photo's histogram shows its pixel brightness values, from darkest to lightest. So it's a pixel-based histogram.
A raw file's histogram would show the number of photons counted, from some to heaps. It would not be pixel based, as there are no pixels yet because the file hasn't been demosaiced. In the same way as there is no “picture” until a film is developed.
If that's okay so far and I'm not confused, questions:
In a camera's sensor (Bayer array), what is the relationship between the number of photosensor elements and the number of pixels in the demosaiced image? Is it one to one?
If not, I'd prefer not to use the word “pixel” to describe a photon counter. What's an accurate word? Photosite? Photon receptor?
When I open a raw file in Lightroom, it's been demosaiced but not rendered. What am I seeing on screen? Is it a jpeg produced by the software in a similar way to the image on the camera lcd after shooting? So am I seeing a pixel based histogram or something else?
Thanks in advance, David
Because of the complications you note with a Bayer array sensor, some authors use the term SENSEL to describe the individual elements in a Bayer array. A 12 MB Bayer sensor contains 6M green sensels, 3M blue sensels and 3M red sensels. The demosaiced image would have 12MB pixels and file size would be 36MB since there are 3 color channels.

David Sutton

Sensors, pixels and histograms
« Reply #5 on: January 20, 2010, 02:54:56 pm »

Thank you for your replies. They're most helpful. Perhaps I need to explain in a bit more detail. When I teach music I can draw on some 200 years of tradition. Teachers on my instrument have put a lot of work over those years into what works didactically. When I say “this is a string and here are some ways to get a nice sound from it” I know what I'm doing. But with digital photography we seem to have a lot of people floundering around, and much of the information is only “sort of” right.
For example, I don't usually use the term “raw image” as I doubt there is such a thing. If I can't see it and no one else can, I'd prefer to use the term “raw file”. The data is real, but the “pixels” aren't. My understanding is that most of us will not be able to find software to see the undemosaiced raw file. And my guess is that what I'm seeing on screen has also been rendered. Meaning, I suppose, converted to a jpeg or whatever and into a common colour space. In Lightroom the raw file will be in a form of ProPhoto RGB, but what am I seeing on screen? Is it a jpeg generated by the software to represent the file? I realised that I'm not even sure what I'm seeing when I look at an image on screen. It's like going to a concert of experimental music and not being told what I'm going to hear.
Most people with a digital camera are using its software and hardware to convert the raw file into a jpeg, and are unaware that there is that intermediate step to produce their photos. When they join a camera society, it's an uphill battle to get them to work with their raw files, and the lack of an accurate but simple way of describing what's happening doesn't help.
So I'm looking for descriptions of the fundamental processes in digital photography that are in simple words, and where I don't have to come back later and say “well, that was only half right”.
I may not necessarily want to teach this stuff, but I would like to accurately describe what I'm doing.
My beginning step has been to treat the raw data as information, and not as anything concrete. And then to call it an “image” or “photo” once it can be shared, meaning put into a common file format. I'm going to have to think some more about this.
As far as histograms go, what am I looking for in the lcd histogram on the camera? I want to see how far to the right it goes to know something about the signal-to-noise ratio, and about highlight clipping. And I want to see how far to the left it goes to know something about my shadow clipping. I have a UniWB saved as a custom setting, and I can show someone what it looks like, but seldom use it, as for most of what I do an approximation is good enough. Blue fungi are the exception. They are complete little b*st*rds as far as clipping in the blue channel goes.
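For what it's worth, those two checks can also be made directly on the numbers once the file is on the computer. A minimal sketch (Python, numpy assumed; the data is synthetic, standing in for one 8-bit channel such as that troublesome blue):

Code:
import numpy as np

def clipping_report(channel, low=0, high=255):
    """channel: (H, W) array of 8-bit values for one colour channel."""
    n = channel.size
    shadows = np.count_nonzero(channel <= low) / n
    highlights = np.count_nonzero(channel >= high) / n
    return shadows, highlights

# Synthetic blue channel that blows out easily (values are made up).
blue = np.clip(np.random.default_rng(3).normal(200, 60, (480, 640)), 0, 255)
lo, hi = clipping_report(blue.astype(np.uint8))
print(f"shadow clipping: {lo:.1%}, highlight clipping: {hi:.1%}")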
Cheers, David

Luis Argerich

Sensors, pixels and histograms
« Reply #6 on: January 20, 2010, 03:30:18 pm »

David, I agree a lot with what you say: "RAW file" is certainly better than "RAW image" for most cases.
I think you can learn a lot by downloading and playing with Rawnalyze.

http://www.cryptobola.com/PhotoBola/Rawnalyze.htm

Cheers,
Luigi

David Sutton

Sensors, pixels and histograms
« Reply #7 on: January 20, 2010, 03:43:44 pm »

Quote from: bjanes
Because of the complications you note with a Bayer array sensor, some authors use the term SENSEL to describe the individual elements in a Bayer array. A 12 MB Bayer sensor contains 6M green sensels, 3M blue sensels and 3M red sensels. The demosaiced image would have 12MB pixels and file size would be 36MB since there are 3 color channels.
I didn't know that. It seems to me that sensel is an unnecessary neologism when we already have photosite and photo receptor and so on. What do you think?

David Sutton

Sensors, pixels and histograms
« Reply #8 on: January 20, 2010, 03:45:25 pm »

Quote from: Luis Argerich
David, I agree a lot with what you say: "RAW file" is certainly better than "RAW image" for most cases.
I think you can learn a lot by downloading and playing with Rawnalyze.

http://www.cryptobola.com/PhotoBola/Rawnalyze.htm

Cheers,
Luigi
Thanks for reminding me about Rawnalyze. I have it on my computer but have never played with it.

Bart_van_der_Wolf

Sensors, pixels and histograms
« Reply #9 on: January 20, 2010, 04:29:09 pm »

Quote from: bjanes
Because of the complications you note with a Bayer array sensor, some authors use the term SENSEL to describe the individual elements in a Bayer array. A 12 MB Bayer sensor contains 6M green sensels, 3M blue sensels and 3M red sensels. The demosaiced image would have 12MB pixels and file size would be 36MB since there are 3 color channels.

That's right. I also prefer to use the term Sensel (from SENSor ELement) for the smallest spatially discrete units (representing the capture of either a single color pass band (e.g. Bayer CFA filtered or achromatic), or a stacked construction representing multiple color pass bands (e.g. Foveon)). The 'spatially discrete' part is important because it determines the sampling density, and thus the resolution (in terms of samples per unit distance). The photo-sensitive area of the complete imager chip is best described as sensor array, to differentiate from the individual sensor elements. Using the term sensel also avoids confusion with sloppy use of sensor (array). Sensel also hints at the photovoltaic sensitivity of the electronic circuit.

To avoid confusion with output pixels (the smallest spatial units that make up an image), I try to avoid the use of that term for the sensor elements, which usually require further processing of their data content before they can be reproduced as a color. Sensels are input, pixels are output (as in pixels per inch (PPI)). Sensels can also be electronically combined (AKA binning) before they are output as pixels, so there can be a difference in the number of sensels versus pixels.
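A toy illustration of that sensel-versus-pixel difference (Python, numpy assumed): binning 2x2 blocks of a monochrome sensel array quarters the pixel count relative to the sensel count. On a CFA sensor real binning combines same-colour sensels, but the counting argument is the same.

Code:
import numpy as np

def bin_2x2(sensels):
    """Average each 2x2 block of sensels into one output pixel."""
    h, w = sensels.shape
    trimmed = sensels[:h - h % 2, :w - w % 2]
    return trimmed.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

sensels = np.random.default_rng(4).random((3000, 4000))   # 12M sensels
pixels = bin_2x2(sensels)
print(sensels.size, "sensels ->", pixels.size, "pixels")   # 12000000 -> 3000000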

Cheers,
Bart
== If you do what you did, you'll get what you got. ==

Daniel Browning

Sensors, pixels and histograms
« Reply #10 on: January 20, 2010, 04:57:10 pm »

Quote from: bjanes
The demosaiced image would have 12MB pixels and file size would be 36MB since there are 3 color channels.

Minor clarification: I think you meant "12 megapixels" instead of "12 MB pixels". Also, it may be useful to point out that the file size would only be three times the number of megapixels when using 8 bits per channel, uncompressed. Other bit depths (and/or compression) change the relationship.
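To put numbers on that, a quick sketch of the arithmetic (Python; the 12-megapixel figure is the one from the example above):

Code:
megapixels = 12_000_000      # pixels in the demosaiced image
channels = 3                 # R, G, B

for bits_per_channel in (8, 16):
    size_mb = megapixels * channels * (bits_per_channel / 8) / 1_000_000
    print(f"{bits_per_channel}-bit uncompressed: {size_mb:.0f} MB")
# 8-bit uncompressed: 36 MB
# 16-bit uncompressed: 72 MB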
--Daniel

Mark D Segal

Sensors, pixels and histograms
« Reply #11 on: January 20, 2010, 05:10:16 pm »

David,

Asking questions on a web forum is fine, but to augment your understanding, there is of course a ton of material on the internet and in hard copy covering every aspect of the questions you are asking; to start with, the many excellent - and free - tutorials and "understanding" articles on this website, the Reichmann-Schewe videos (for purchase) on Camera Raw and Lightroom, another website called Cambridgeincolour, Jeff Schewe's book on Camera Raw, and on and on. The best way to approach this is to avoid hair-splitting over terminology except where it really matters, and focus on researching the fundamental concepts. That will help you to improve the guidance you give to your camera club colleagues.

Mark
Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....."

David Sutton

Sensors, pixels and histograms
« Reply #12 on: January 21, 2010, 03:26:55 am »

Quote from: BartvanderWolf
Sensels are input, pixels are output
Cool

David Sutton

Sensors, pixels and histograms
« Reply #13 on: January 21, 2010, 03:29:19 am »

Quote from: Mark D Segal
David,

Asking questions on a web forum is fine, but to augment your understanding, there is of course a ton of material on the internet and in hard copy covering every aspect of the questions you are asking; to start with, the many excellent - and free - tutorials and "understanding" articles on this website, the Reichmann-Schewe videos (for purchase) on Camera Raw and Lightroom, another website called Cambridgeincolour, Jeff Schewe's book on Camera Raw, and on and on. The best way to approach this is to avoid hair-splitting over terminology except where it really matters, and focus on researching the fundamental concepts. That will help you to improve the guidance you give to your camera club colleagues.

Mark

Hello Mark. Sometimes I'm a bit slow off the starting line but one thing I did get as soon as I picked up a digital camera was that we had entered a whole new world. Most photographers I meet haven't grasped this at a deep level. I see the camera more and more as a tool for working with information. Looking at my best prints I can see I have visualised the image and created it in Photoshop from information gathered in the field in the form of ones and zeros. Even Photoshop is a bit stuck in the past. (Ha Ha I just typed pasty). For example, dodge and burn. That's what I used to do forty years ago in a darkroom. What's it doing here? Thinking in terms of dodging and burning can limit our creativity and the possibilities of digital technology.
Most of the time I think our use of the camera is like using a Ferrari to do the shopping.
The resources you mention are good, and I have worked through them all with the exception of Jeff Schewe's new book. It's on the “to do” list. If you look at the two questions I asked, I don't believe they are addressed in any of this material, though I could have easily missed it. For me this discussion is not hair-splitting but lies at the heart of developing a photographic vocabulary for the 21st century, and avoiding getting mired in old thinking. I am not trying to tell others how to think, nor am I interested in evangelising. But if my thinking is not clear, how can I sharpen my skills?
David

Mark D Segal

Sensors, pixels and histograms
« Reply #14 on: January 21, 2010, 09:54:13 am »

The world is new and the world is old. Some things change fundamentally, others marginally and others not at all. Dodging and burning remains as valid in digital image-making as it is in the chemical darkroom. In fact no technique is invalid as long as it delivers the results you are looking for and doesn't destroy the planet. And Photoshop is not at all stuck in the past. It is going into version CS5, and the people involved in developing wonderful new image editing tools certainly don't see themselves as mired in the past. They are working on the frontier of mathematics and programming techniques to bring us new and more efficient ways of doing the things we wish to do with our images.

I do believe the questions you are asking are adequately addressed in existing references for practical purposes, but I wasn't intending by that to throw any cold water on the discussion. I just see it wandering into semantics that are not central to a basic understanding of the fundamentals which matter to getting optimum results from a digital imaging workflow. But of course it should just carry on as people wish - it's a free world - at least here.

I wish you well in your research.
Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....."

bjanes

Sensors, pixels and histograms
« Reply #15 on: January 21, 2010, 10:53:49 am »

Quote from: Daniel Browning
Minor clarification: I think you meant "12 megapixels" instead of "12 MB pixels". Also, it may be useful to point out that the file size would only be three times the number of megapixels when using 8 bits per channel, uncompressed. Other bit depths (and/or compression) change the relationship.
Correct. Thanks!

Guillermo Luijk

Sensors, pixels and histograms
« Reply #16 on: January 21, 2010, 07:30:38 pm »

David, it seems you are interested in the inner workings of a digital camera. I strongly recommend you learn to use DCRAW, a command-line RAW developer by David Coffin that will allow you to do and learn things no other RAW developer will.

With DCRAW you will be able to:
- Extract the embedded JPEG files found in the RAW file
- Extract and visualize the RAW data in the form of a Bayer pattern
- Get rid of all those clandestine transformations applied in commercial RAW developers (exposure and ISO correction, noise reduction, sharpening, hot pixels elimination,...)
- Plot true RAW histograms
- Subtract dark frames in the RAW domain
- Learn and control one by one all the steps involved in RAW development:
  * black and saturation points of RAW files
  * white balance in terms of its genuine linear implementation (forget about Temp/Tint models)
  * demosaicing algorithms
  * highlight strategies for neutral clipped areas
  * colour profile conversions

Even the source code is available in case you want to learn some of the steps in depth from an implementation point of view.
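As one concrete starting point for a couple of the items above, here is a sketch (Python, with numpy and PIL assumed, and dcraw assumed to be on the PATH) that dumps the undemosaiced data and builds one histogram per CFA colour. The file name is hypothetical, the -D/-4/-T flags are the ones I believe dcraw uses for a 16-bit linear document-mode TIFF, and the RGGB layout is an assumption - check dcraw's own usage text and your camera's CFA before trusting the output.

Code:
import subprocess
import numpy as np
from PIL import Image

raw_path = "IMG_0001.CR2"    # hypothetical raw file name

# -D: document mode (no demosaic, no white balance); -4: 16-bit linear; -T: TIFF.
subprocess.run(["dcraw", "-D", "-4", "-T", raw_path], check=True)
mosaic = np.array(Image.open(raw_path.rsplit(".", 1)[0] + ".tiff"))

# Split the mosaic by CFA position (RGGB assumed; the layout varies by camera).
planes = {"R": mosaic[0::2, 0::2].ravel(),
          "G": np.concatenate([mosaic[0::2, 1::2].ravel(), mosaic[1::2, 0::2].ravel()]),
          "B": mosaic[1::2, 1::2].ravel()}

for name, data in planes.items():
    counts, edges = np.histogram(data, bins=256)
    print(name, "max raw value:", int(data.max()),
          "most populated level:", int(edges[counts.argmax()]))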

Regards

bjanes

Sensors, pixels and histograms
« Reply #17 on: January 21, 2010, 10:20:00 pm »

Quote from: Guillermo Luijk
David, it seems you are interested in the inner workings of a digital camera. I strongly recommend you learn to use DCRAW, a command-line RAW developer by David Coffin that will allow you to do and learn things no other RAW developer will.
DCRaw is excellent, but it does have a command line interface which is inconvenient to many. Iris is a freeware program with many of the same features. It has not been updated for over a year and I hope Christian has not abandoned the project.

tived

Sensors, pixels and histograms
« Reply #18 on: January 21, 2010, 11:43:59 pm »

Quote from: David Sutton
Hi. I've got a couple of really basic questions about digital images that I wonder if anybody can help me with?
Let me see if I've got this right:
A pixel (from picture element) is the smallest element in an image that can be controlled with photo editing software.
A photo's histogram shows its pixel brightness values, from darkest to lightest. So it's a pixel-based histogram.
A raw file's histogram would show the number of photons counted, from some to heaps. It would not be pixel based, as there are no pixels yet because the file hasn't been demosaiced. In the same way as there is no “picture” until a film is developed.
If that's okay so far and I'm not confused, questions:
In a camera's sensor (Bayer array), what is the relationship between the number of photosensor elements and the number of pixels in the demosaiced image? Is it one to one?
If not, I'd prefer not to use the word “pixel” to describe a photon counter. What's an accurate word? Photosite? Photon receptor?
When I open a raw file in Lightroom, it's been demosaiced but not rendered. What am I seeing on screen? Is it a jpeg produced by the software in a similar way to the image on the camera lcd after shooting? So am I seeing a pixel based histogram or something else?
Thanks in advance, David

Hi David,

given the many responses you have had here, how has your view on the topic at hand changed, if at all? Are you able to define the elements on a sensor that capture the light, and their relation to how they are presented in an image program/raw converter?

When you look at a histogram - pixel or photo-sensor - all it's telling you is how the pixels are distributed across 0-255 (8-bit) or the equivalent in 16-bit. An RGB histogram gives you information about how each color is distributed and, if clipping has occurred, which colors are clipped - which in turn can help you address the issue or live with it. So if you look at a histogram and see something else, then please tell, because I would love to know.

Please enlighten me, I am just as curious as you

thanks

Henrik

David Sutton

Sensors, pixels and histograms
« Reply #19 on: January 22, 2010, 03:55:26 am »

Quote from: bjanes
DCRaw is excellent, but it does have a command line interface which is inconvenient to many. Iris is a freeware program with many of the same features. It has not been updated for over a year and I hope Christian has not abandoned the project.
Thanks for this link, and thanks too Guillermo for mentioning DCRaw. I'm travelling at present with a friend who uses UFRaw, which is a front end for DCRaw, but I don't think it has all of DCRaw's functionality. Although I have been avoiding the command line, I see no reason I can't learn the syntax from DCRaw's manual pages.
Regards