GR User Forum

The spot for all Ricoh GR camera users

Register for free, meet other Ricoh GR users, share your images, help others, have fun!

Tell your friends about us!

How is ISO controlled in digital cams?


New Member
I understand it is used to speed up the picture-taking process. But I can't seem to find reference material on how it's actually done in modern digital cameras: what the basic science is and which techniques are used.

Here's what I suspect is happening:
1) light hits the CMOS sensor
2) all pixels' sensory information is recorded
3) at ISO 100, most (say 90%) of the pixels are read from the buffer
4) at ISO 800, proportionally fewer are read
5) the picture is made from the data that was read

Basically, the decision to use x number of pixels out of the total, as a function of the ISO setting, is what speeds up picture taking. The resulting noise is then due to the picture-making process, and the assembly logic implemented in the camera's software is where the differences between camera manufacturers come from.

So I'm curious whether this is correct, and if not, could somebody throw up a link to the real explanation? I can't seem to ask Google the right question.

However, whatever the actual process is, this gain in speed by throwing away a select set of pixels seems like a rather artificial choice. Since the data is all there anyway, isn't it just a question of hardware + software to process all the pixels all the time? I understand the traditional attitude towards ISO, but if the technology is there, why not process every picture at full quality?
Unless ISO control in digital cams really means altering how much light information is recorded in the first place, i.e. some pixels on the CMOS are actually turned off. That's still an artificial choice, no? It's done to affect processing speed, but sooner or later the hw+sw combo should take care of it.

It's an amplification of voltage. The speed-up is no more exciting than the camera using a shorter shutter speed.

To get the most out of the sensor you'd want black to output zero voltage and white to output the sensor's maximum voltage. That requires certain combinations of aperture and shutter speed. If you halve that shutter speed, white will produce only half of the sensor's maximum voltage. But if you amplify the voltage coming out of the sensor by a factor of two, you get the correct exposure back. The price you pay is that the amplification also applies to whatever noise the sensor produces. That's why high-ISO pictures are noisier.

(There are probably technicalities to this; the response may not be perfectly linear, for example. I only know the principle.)
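If it helps, here's a toy numeric sketch of the idea. The full-well and noise numbers are made up, purely to illustrate that the ISO gain amplifies the signal and the sensor noise by the same factor:

```python
import random
import statistics

def expose(scene_fraction, full_well=10000.0, read_noise=50.0):
    """Simulate one pixel: a signal proportional to the light collected,
    plus sensor noise that is independent of the light level."""
    signal = scene_fraction * full_well          # voltage from collected light
    noise = random.gauss(0.0, read_noise)        # sensor noise
    return signal + noise

def apply_iso_gain(voltage, gain):
    """ISO is just a gain applied to the voltage coming off the sensor."""
    return voltage * gain

random.seed(1)
# A white patch at the "correct" shutter speed vs. half the shutter speed
# compensated with 2x gain (think ISO 100 vs. ISO 200):
full = [apply_iso_gain(expose(1.0), gain=1.0) for _ in range(5000)]
half = [apply_iso_gain(expose(0.5), gain=2.0) for _ in range(5000)]

# Both land at the same average brightness...
print(round(statistics.mean(full)), round(statistics.mean(half)))
# ...but the 2x gain also doubled the spread of the noise:
print(round(statistics.pstdev(half) / statistics.pstdev(full), 1))
```

Same exposure either way, twice the noise with the gain applied. That's the whole trade.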
Tommy, what you're saying makes sense! Still, it only explains the signal-processing part of it, right? Why would there be a weaker, lower-level signal to amplify unless some sort of throttling was applied BEFORE the signal was generated? That would only happen if the sensor elements were 'prompted' to regulate their voltage, basically re-establishing the baseline. That, I think, is what the result of a metering event would be, just prior to taking the picture. This is basically the equivalent of loading a different-ISO digital film, when only aperture and/or exposure is controlled by the user. When all three are set manually, the camera tries its best and you get what you get.

Thanks again Tommy! I just needed to wrap my mind around the basics for some reason. It won't make me a better photog, but perhaps I'll be more appreciative of the engineering effort needed to get all of this working right.
Thanks to Tommy and KaRoy for sharing the technical stuff and enlightening us on the ISO-controlling parts of digital cameras. Useful info for a non-techie like me.

Just making a guess: one day Ricoh will produce a lower-noise cam with ISO up to 3200, with a change in the design of the camera body. Hope their engineers already have this on their drawing board. Watch for better things to come by the end of 2009, or even sooner.

I found a brief explanation at, but it doesn't explain in detail.

I think the sensor cannot by itself adjust its output. The accumulated energy of the photons hitting each pixel element on the sensor _is_ the output level, like raindrops falling into a bucket, and the only way to change the sensor output is to allow more or fewer photons to hit the sensor, i.e., change aperture and shutter speed. So you can't really change "film" in a digital camera.

This is also why small high-megapixel sensors have more noise: the output levels from the sensor are lower (fewer photons in each little bucket), so the camera needs to amplify the sensor output more, which has the same effect as raising the ISO on a lower-megapixel camera. A modern 2 MP sensor would probably have about the same image quality at ISO 800 as a current 8 MP sensor has at ISO 200.
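To put rough numbers on the bucket analogy (the photon counts here are invented; the only real point is the square-root scaling of shot noise):

```python
import math

def pixel_snr(photon_count):
    """Photon arrival is Poisson-distributed, so the shot noise is
    sqrt(count) and the signal-to-noise ratio is also sqrt(count)."""
    return math.sqrt(photon_count)

# Same sensor area, same exposure: 2 MP pixels are 4x the area of
# 8 MP pixels, so each bucket catches roughly 4x as many photons.
photons_2mp = 40000
photons_8mp = photons_2mp // 4

print(pixel_snr(photons_2mp))  # 200.0
print(pixel_snr(photons_8mp))  # 100.0
```

Half the SNR means the smaller pixel needs about 4x the light (two stops) to catch up, which lines up with the ISO 800 vs. ISO 200 comparison above.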

I believe this multiplication factor applied at different ISO levels is the act of 'digital film changing'. As you know, there's a set of articles around your link on DPReview, originated by, that explain some of the basics of digital photography.
The articles that completed the picture for me, and that reveal the process behind your original comment about amplification, are the ones about Sensitivity and Noise.

BTW, the e-book has been on my back-burner, but now I'm pretty sure I'm gonna pull the trigger on it. Just so that I stop bothering other people with my ignorance... :mrgreen:

Which is what I was after in the first place. So, thanks for leading me to the source Tommy!