
Does a sensor count the number of photons that hit it?


I'm interested in the grayscale image case. In a sensor there is an array of cavities which collect photons. (Source: Cambridge in Colour Digital Camera Sensors)



Does each cavity count the number of signals (or peaks) generated by each photon? Or is there one signal which is the sum of all photons (in which case the size of the signal would presumably depend on photon energy)?



And also I'm guessing each cavity corresponds to a pixel?



Additional references would be appreciated.


































  • Is it possible for photons to be counted by the limited computing capacity of a camera? physics.stackexchange.com/questions/4799/how-to-count-photons AND en.wikipedia.org/wiki/Photomultiplier

    – Alaska Man
    Jul 6 at 20:26


















Tags: dslr, sensor, physics














edited Jul 5 at 22:49 by xiota










asked Jul 4 at 13:24 by physicsnoob1000























5 Answers


















10 votes














Your link discusses how a CCD (charge-coupled device) image sensor works. Note that CCDs have applications besides image sensors, but the vast majority of CCDs are used as image sensors, and that is the only application I will discuss here.



CCDs



In typical CCDs used for color image sensing, each CCD cell has a color filter over it. The most commonly used pattern groups 4 cells together with one red filter, one blue filter, and two green filters. These filters only let through photons in their corresponding frequency band. A greyscale CCD simply doesn't have these filters.



A CCD (when used as an image sensor) at its core is a photon counting device. A photon that is incident upon the active region of a CCD excites an electron through the photoelectric effect which is then stored within that cell of the CCD. This process continues as long as photons hit the cell causing electrons to accumulate within each cell.



Your camera lens projects an image of the scene you are photographing onto the CCD, just as a film camera projects it onto film. Each pixel corresponds to one cell within the CCD. In the case of a color image, each pixel is the product of one or more filtered cells, depending on the algorithm and cell location. The simplest algorithm groups each set of 4 filtered cells into a single pixel. However, it is common for interpolation schemes to increase the number of full-color pixels to equal the number of CCD cells.



Photon Energy Dependence



The signal does depend on photon energy, but only as a threshold. In order for a photon to generate an electron through the photoelectric effect it must have a certain amount of energy. This amount of energy is the "bandgap" energy of the semiconductor. The bandgap energy of silicon is about 1.1 eV, meaning photons with a wavelength of about 1100 nm and lower will be detected. As you continue to increase photon energy the signal remains constant at one electron per photon. Once your photons have twice the bandgap energy, or more, an incident photon can generate two electrons, but it is fairly rare.
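As a quick sanity check of the ~1100 nm figure above, the cutoff wavelength follows from λ = hc/E with E the 1.1 eV bandgap (the constants and rounding here are mine, not from the answer):

```python
# Longest detectable wavelength for a given semiconductor bandgap.
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electron-volt

def cutoff_wavelength_nm(bandgap_eV):
    """Photons with wavelengths longer than this can't excite an electron."""
    return h * c / (bandgap_eV * eV) * 1e9

print(cutoff_wavelength_nm(1.1))  # roughly 1127 nm, consistent with ~1100 nm
```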



Once you have decided you are done taking your image, the shutter is closed and it is time to read out the image captured in the CCD. To read out the image, the charge within each cell is shifted over one column within its row. The first column is then read out, either by measuring the current needed to discharge the cell or by measuring the cell's voltage with a known capacitance; either way tells you how many electrons were stored in that cell. After the first column is read out, the cells are all shifted again, and this repeats until all cells have been read.
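The column-by-column shift readout can be sketched as a toy simulation; the array size and charge values are arbitrary, and all noise is ignored:

```python
import numpy as np

# Toy model of CCD readout: charges shift column-by-column toward a
# readout register; each step reads the first column, then shifts the rest.
rng = np.random.default_rng(0)
ccd = rng.integers(0, 100, size=(4, 5))   # electrons stored per cell
original = ccd.copy()

read_out = np.zeros_like(ccd)
for col in range(ccd.shape[1]):
    read_out[:, col] = ccd[:, 0]          # read out the first column
    ccd = np.roll(ccd, -1, axis=1)        # shift remaining charge over
    ccd[:, -1] = 0                        # the shifted-out column is now empty

assert (read_out == original).all()       # the image is recovered unchanged
```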



Non-Idealities



There are a number of factors that prevent typical CCDs from giving you an exact photon count. There is a significant amount of thermal noise that can only be reduced by lowering the temperature well below what is practical for a handheld camera. There can be leakage within the CCD cells, which can cause electrons to escape the cell or move into nearby cells, preventing an accurate count. There will also be photons that reflect off the cell and therefore aren't counted.



However, none of this changes the fact that a CCD counts photons. It just means it isn't a very precise photon counter. More on this below.




Does a CCD Count Photons?



I believe it does, but it comes down to the definition of "count". Let's consider an analogy.



Alice, Bob, and Chris each own an apple orchard. They want to know how many apples have fallen off the trees in their orchards. To do this they use a Tennis Ball Coupled Device (TBCD). It might look like an ordinary basket, but trust me, it's a TBCD. Alice, Bob, and Chris walk through their orchards, putting a tennis ball in the TBCD for each apple they see on the ground. By the time they have finished, each has a number of tennis balls in the TBCD equal to the number of apples that fell off the trees.



To figure out how many apples fell off the trees, Alice, Bob, and Chris each use a different method. Alice counts out the number of tennis balls in her TBCD; when she is done, she knows exactly the number of apples she saw. Bob is not as patient as Alice and uses an advanced computer vision system to automatically count the tennis balls in his TBCD; when he is done, he knows approximately the number of apples he saw, but there is a small error because the CV system isn't perfect. Chris can't afford such a system, nor is he as patient as Alice, so he weighs his TBCD and, using the weight of a single tennis ball, determines approximately how many tennis balls there are.



Now here is the question. Who of these people used a system that counted the number of apples that fell in their orchards? Each at one point had a number of tennis balls equal to the number of apples. Does the readout method impact whether or not the TBCD counts apples that fell onto the ground?



The TBCD is (unsurprisingly) directly comparable to a cell in a CCD. It stores a number of electrons equal to the number of photons it captured. This most certainly qualifies as a photon count. Then, depending on your readout circuit, you might get a more or less precise reading of this value. Is it a count? If my image sensor counts the number of photons, but doesn't tell anyone, did it still count the number of photons? As I said earlier, I think this comes down to your definition of count, but I believe a CCD qualifies as a photon counting device.
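To make the Alice/Chris distinction concrete, here is a minimal numerical sketch: the stored charge is an exact count, but an analog readout (like Chris weighing the basket) recovers it only approximately. The 2% noise figure is made up for illustration:

```python
import random

random.seed(42)
true_count = 250                      # electrons actually stored in one cell

# "Alice": an exact digital count of the stored electrons.
alice = true_count

# "Chris": infer the count from an analog measurement with ~2% noise,
# like reading the cell charge through a noisy amplifier.
measured_charge = true_count * random.gauss(1.0, 0.02)
chris = round(measured_charge)

print(alice, chris)  # same underlying count, different readout precision
```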































  • So, are you saying 'Yes' or 'No'?

    – Mike Brockington
    Jul 5 at 15:10






  • @MikeBrockington a CCD counts photons in the same way that a kitchen scale counts baryons: it measures a value approximately proportional to the number of protons and neutrons in whatever you place on it. This isn't the usual definition of 'count'.

    – Pete Kirkham
    Jul 5 at 15:22











  • @Pete You should post that as an answer - I read the one above, and as you can tell, couldn't decide whether Matt was saying yes or no.

    – Mike Brockington
    Jul 5 at 15:24






  • Detectors running on the same basic physics as CCDs can be (and routinely are, in particle physics contexts) instrumented in such a way that they can say "hey, three photons hit this element" and mean neither two nor four (five is right out). But when the number of hits grows larger, the resolution begins to encompass nearby integers. That is, if you get a result of 103 from the same element, it might mean anywhere from 100 to 106. So high-confidence integer-valued counts are not a yes/no phenomenon, but depend on the backing electronics and the magnitude of the signal.

    – dmckee
    Jul 5 at 17:06






  • One major quibble with this answer: what you call sub-pixels are actually full pixels. Say you have a sensor advertised as having 20 megapixels. This sensor will have 20 million individual pixel sites. 5 million will have red filters, 5 million will be blue, and 10 million will be green, arranged in what's known as a Bayer pattern. When the raw image is converted to JPEG (or another image format), the color of each pixel will be interpolated using the data at that image site and its neighbors.

    – Jim Garrison
    Jul 6 at 6:10



















7 votes














No, you won't obtain the photon count directly. A camera sensor also has noise, not just the shot noise inherent in photon counting but also noise from its electrical circuits.



Also, a DSLR has a color filter array on top of the pixels, even if you only take grayscale images. It probabilistically filters away some photons: if a photon is of the correct color, its chances of passing the filter are much higher than for an incorrectly colored photon.



There are sensors that count incoming photons, but a DSLR sensor is not among them. A DSLR sensor just gives a single "intensity" value per pixel that is full of noise. It is roughly proportional to the sum of all incoming photons, but because of the noise, you can't tell the number of photons exactly as an integer.
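A hypothetical simulation of this point: even with made-up numbers for quantum efficiency and read noise, the reported signal is only roughly proportional to the true photon count, and the exact integer cannot be recovered from it:

```python
import random

random.seed(1)
photons = 1000                          # true photon count at one pixel
qe = 0.5                                # made-up quantum efficiency

# Each photon is independently converted to an electron with probability qe.
electrons = sum(random.random() < qe for _ in range(photons))
read_noise = random.gauss(0, 5)         # made-up read noise, in electrons
signal = electrons + read_noise         # the "intensity" the camera reports

# signal is close to qe * photons, but the true integer count is lost.
print(signal)
```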



If you are looking for a photon counting detector, this may not be the best place to ask. There is no Scientific Instrument Stack Exchange, but Physics Stack Exchange may come close.



Typically, photon counting detectors are cooled with liquid nitrogen to really low temperatures to minimize electrical noise coming from thermal effects. Needless to say, a DSLR isn't designed to be cooled to such low temperatures.



At shorter wavelengths, such as X-rays, you can actually count photons using a room-temperature sensor, so no cooling is required. However, visible light has a far longer wavelength than X-rays. I'd say it would make a great question on Physics Stack Exchange whether a visible-light photon counting sensor can be made without requiring cooling.
























  • I haven't seen a liquid-nitrogen-cooled DSLR, but there are a few companies making coolers targeted toward astrophotographers on the market.

    – Hueco
    Jul 4 at 18:16











  • You can count visible-light photons if the rate of arrival is low enough, for sure. Eyes can do this (I think dark-adapted human eyes are sensitive on the order of 10 photons, but cats and frogs may be able to detect single photons).

    – tfb
    Jul 4 at 21:15











  • "whether a visible light photon counting sensor can be made without requiring cooling" - I'm sure if we could, we would.

    – whatsisname
    Jul 4 at 22:28











  • Relevant to @tfb's comment: biology.stackexchange.com/questions/85026/…

    – Carsten S
    Jul 5 at 9:21


















2 votes














The sensor is an array of silicon photodiodes that simply convert light energy into electricity.
They are interchangeably called pixels, photosites, or photodiodes. "Pixel" actually refers to "picture element", and it is the least accurate of these terms, because each pixel uses information from adjoining photosites to determine its color/brightness value, even for greyscale output (other than possibly a monochrome camera like the Leica Monochrom). But the location of a photosite does directly correlate with the location of a pixel in the output image.




































    2 votes















    Does each cavity count the number of signals (or peaks) generated by each photon?




    No. There are no individual peaks to count.




    Or is there one signal which is the sum of all photons (in which case the size of the signal should depend on photon energy presumably)?




    Yes, more or less, an electrical current is generated that corresponds to the summation of the energy of the photons that hit the sensel during the time period of interest.




    And also I'm guessing each cavity corresponds to a pixel? ... in the grayscale image case.




    Yes, for grayscale. For color, additional color information is interpolated from surrounding sensels (demosaicing). But for most purposes, it's fine to think of it as having a 1-1 correspondence.
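The neighbor interpolation (demosaicing) mentioned here can be sketched minimally. The RGGB layout is the common Bayer pattern, but the values and the averaging rule are simplified for illustration:

```python
import numpy as np

# RGGB Bayer mosaic: each sensel records only one color channel. A minimal
# (hypothetical) demosaic step: estimate green at a non-green site by
# averaging its up/down/left/right green neighbors.
mosaic = np.array([
    [10, 20, 12, 22],   # R G R G
    [30, 40, 32, 42],   # G B G B
    [14, 24, 16, 26],   # R G R G
    [34, 44, 36, 46],   # G B G B
], dtype=float)

def green_at(y, x):
    """Average the in-bounds orthogonal neighbors of a non-green site."""
    neighbors = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
    vals = [mosaic[j, i] for j, i in neighbors
            if 0 <= j < mosaic.shape[0] and 0 <= i < mosaic.shape[1]]
    return sum(vals) / len(vals)

print(green_at(2, 2))  # green estimate at the red site (2, 2) -> 29.5
```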




































      0 votes














      Yes. A pixel roughly counts the number of photons which fall onto it. However, that number of photons is subsequently processed before the digital grayscale number (0-255, for example) is reported. This digital number is reported in what are known as analog-to-digital units, ADUs. One must do some reverse math to convert the ADU number back into a number of photons.



      The following math should illuminate the situation for you. Assume a pixel of area A and an exposure time of T, and suppose we are illuminating with monochromatic light of angular frequency W. Physics tells us that the intensity of light can be calculated as




      I = hbar * W * N


      hbar is the reduced Planck constant and N is the photon flux: the number of photons passing through a unit area per unit time.



      We can see that if the exposure time is T then the number of photons passing through a particular pixel will be given by




      N_pixcount = N * A * T = I * A * T/(hbar * W)


      So given pixel area, light intensity, exposure time, and light frequency it is possible to calculate the average number of photons which pass through a pixel.



      How does a sensor register photons? For an ideal sensor, every time a photon falls on the sensor a photoelectron e- would be created. However, because sensors have finite quantum efficiency QE, only a sub-unity fraction of photons are converted into photoelectrons.




      N_electron = QE * N_pixcount


      Practical sensors may have quantum efficiencies anywhere from 30% up to 95% or so.



      After the photoelectrons are created, the electron count is 1) converted into a voltage (using a charge amplifier) and then 2) this voltage is converted into a digital signal (using an analog-to-digital converter, ADC). These two stages are described by a single sensor specification called the gain, G. The gain is specified in e-/ADU, that is, how many electrons are necessary to increase the grayscale level by 1. Putting this together we can see:




      grayscale_level = N_electron/G = N_pixcount * QE/G = I * (QE * A * T)/(hbar * W * G)


      This is a rough overview of how photons are converted into digital counts for a digital sensor. There are few more things that I haven't covered here. Namely 1) there is noise introduced in each of these stages that should be considered if you care about image quality and 2) some sensors will have additional post-processing amplification stages. For example, different pixels may be put through gain stages with different gains to help homogenize the sensor response.



      Regarding your question about the dependence on photon energy: the photon frequency enters in the conversion between intensity and photon number flux. The core of my answer is that sensors are photon-counting devices, regardless of photon energy. However, it is important to bear in mind that for all real sensors the quantum efficiency is wavelength dependent. This means that a blue photon may have a better chance of being detected than a red photon. Thus, to determine what a sensor will read, you need to figure out the photon flux for each wavelength present in the illumination and multiply through by the relevant factors to find the overall contribution to the pixel count from all illumination wavelengths.
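Plugging hypothetical but plausible numbers into the final formula above (the pixel area, intensity, exposure, QE, and gain are all made up; W is treated as an angular frequency so that the photon energy is hbar * W):

```python
# Worked example of grayscale_level = I * (QE * A * T) / (hbar * W * G).
hbar = 1.055e-34                  # reduced Planck constant, J*s
W = 3.7e15                        # angular frequency of ~510 nm green light, rad/s
I = 1e-3                          # intensity at the pixel, W/m^2
A = (4e-6) ** 2                   # 4 um square pixel, m^2
T = 1e-2                          # 10 ms exposure, s
QE = 0.6                          # quantum efficiency
G = 2.0                           # gain, electrons per ADU

N_pixcount = I * A * T / (hbar * W)       # photons collected
N_electron = QE * N_pixcount              # photoelectrons generated
grayscale_level = N_electron / G          # ADUs reported

print(round(N_pixcount), round(grayscale_level))
```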














































        10














        Your link discusses how a CCD (charge coupled device) image sensor works. Note, CCDs have applications besides images sensors, but the vast majority of CCDs are used as image sensors, and that is the only primary application I will be discussing.



        CCDs



        In typical CCDs used for color image sensing each CCD cell has a color filter over it. The most commonly used pattern groups 4 cells together with one red filter, one blue filter, and two green filters. These filters only allow photons of their corresponding colors, in a certain frequency band through. A greyscale CCD just doesn't have these filters.



        A CCD (when used as an image sensor) at its core is a photon counting device. A photon that is incident upon the active region of a CCD excites an electron through the photoelectric effect which is then stored within that cell of the CCD. This process continues as long as photons hit the cell causing electrons to accumulate within each cell.



        Your camera lens projects an image of the scene you are taking a picture of onto the CCD. This is the same as in a film camera, except with film instead of a CCD. Each pixel corresponds to one cell within the CCD. In the case of a color image, each pixel is the the product of one or more filtered cells, depending on the algorithm and cell location. The simplest algorithm groups each set of 4 filtered cells into a single pixel. However it is common for interpolation schemes to increase the number of full color pixels to equal the number of CCD cells.



        Photon Energy Dependence



        The signal does depend on photon energy, but only as a threshold. In order for a photon to generate an electron through the photoelectric effect it must have a certain amount of energy. This amount of energy is the "bandgap" energy of the semiconductor. The bandgap energy of silicon is about 1.1 eV, meaning photons with a wavelength of about 1100 nm and lower will be detected. As you continue to increase photon energy the signal remains constant at one electron per photon. Once your photons have twice the bandgap energy, or more, an incident photon can generate two electrons, but it is fairly rare.



        Once you have decided you are done taking your image the shutter is closed and it is time to read out what image was captured in the CCD. To read out the image the charge within each cell is shifted over one column within its row. The first column is then read out. This can be done by either measuring the current to discharge the cell, or measure the voltage of the cell while knowing the capacitance. Both of which can tell you how many electrons were stored in that cell. After the first column is read out, the cells are all shifted again, and this repeats until all cells have been read.



        Non-Idealities



        There are a number of factors that prevent typical CCDs from giving you an exact photon count. There is a significant amount of thermal noise that can only be reduced by lowering the temperature well below what is reasonable for a handheld camera to be capable of. There can be leakage within the CCD cells which can cause electrons to escape the cell, or move into nearby cells, which prevents an accurate count. There will also be photons that reflect off the cell, and therefore aren't counted.



        However, none of this changes the fact that a CCD counts photons. It just means it isn't a very precise photon counter. More on this below.




        Does a CCD Count Photons?



        I believe it does, but it comes down to the definition of "count". Lets consider an analogy.



        Alice, Bob, and Chris each own an apple orchard. They want to know how many apples have fallen off the trees in their orchards. To do this they use a Tennis Ball Coupled Device (TBCD). It might look like an ordinary basket, but trust me, its a TBCD. Alice, Bob, and Chris walk through their orchards putting a tennis ball in the TBCD for each apple they see on the ground. By the time they have finished, each has a number of tennis balls in the TBCD equal to the number of apples that fell off the trees.



        To figure out how many apples fell off the trees, Alice, Bob, and Chris each use a different method. Alice proceeds to count out the number of tennis balls in her TBCD. When she is done, she knows exactly the number of apples she saw. Bob is not as patient as Alice and uses an advanced computer vision system to automatically count the apples in his TBCD. When he is done, he knows approximately the number of apples he saw, but there is a small error because the CV system isn't perfect. Chris can't afford such a system, nor is he as patient as Alice, so he weighs his TBCD and using the weight of a tennis ball can determine approximately how many tennis balls there are.



        Now here is the question. Who of these people used a system that counted the number of apples that fell in their orchards? Each at one point had a number of tennis balls equal to the number of apples. Does the readout method impact whether or not the TBCD counts apples that fell onto the ground?



        The TBCD is (unsurprisingly) directly comparable to a cell in a CCD. It stores a number of electrons equal to the number of photons it captured. This most certainly qualifies as a photon count. Then, depending on your readout circuit, you might get a more or less precise reading of this value. Is it a count? If my image sensor counts the number of photons, but doesn't tell anyone, did it still count the number of photons? As I said earlier, I think this comes down to your definition of count, but I believe a CCD qualifies as a photon counting device.






        share|improve this answer

























        • So, are you saying 'Yes' or 'No' ?

          – Mike Brockington
          Jul 5 at 15:10






        • 2





          @MikeBrockington a CCD counts photons in the same was as a kitchen scale counts baryons, because it measures a value approximately proportional to the number of protons and neutrons in the stuff you place on in. This isn't the usual of definition of 'count'.

          – Pete Kirkham
          Jul 5 at 15:22











        • @Pete You should post that as an anwer - I read the one above, and as you can tell, couldn't decide whether Matt was saying yes or no.

          – Mike Brockington
          Jul 5 at 15:24






        • 5





          Detectors running on the same basic physics as CCDs can be (routinely are in particle physics contexts) instrumented in such a way that they can say "hey, three photons hit this element" and mean neither two nor four (five is right out). But when the number of hits grows larger the resolution begins to encompass nearby integers. That is, if you get a result of 103 from the same element it might mean anywhere from 100 to 106. So high-confidence integer-valued counts are not a yes/no phenomena, but depend on the backing electronics and the magnitude of the signal.

          – dmckee
          Jul 5 at 17:06






        • 1





          One major quibble with this answer: what you call sub pixels are actually full pixels. Say you have a sensor advertised as having 20 megapixels. This sensor will have 20 million individual pixel sites. 5 million will have red filters, 5 million will be blue and 10 million will be green, arranged in what’s known as a Bayer pattern. When the raw image is converted to JPEG (or other image format) the color of each pixel will be interpolated using the data at that image site and its neighbors.

          – Jim Garrison
          Jul 6 at 6:10
















        10














        Your link discusses how a CCD (charge coupled device) image sensor works. Note, CCDs have applications besides images sensors, but the vast majority of CCDs are used as image sensors, and that is the only primary application I will be discussing.



        CCDs



        In typical CCDs used for color image sensing, each CCD cell has a color filter over it. The most common pattern groups 4 cells together with one red filter, one blue filter, and two green filters. Each filter only lets through photons within the frequency band corresponding to its color. A greyscale CCD simply doesn't have these filters.



        A CCD (when used as an image sensor) is, at its core, a photon-counting device. A photon incident upon the active region of a CCD excites an electron through the photoelectric effect, and that electron is then stored within that cell of the CCD. This process continues for as long as photons hit the cell, causing electrons to accumulate within each cell.



        Your camera lens projects an image of the scene you are photographing onto the CCD, just as a film camera projects it onto film. Each pixel corresponds to one cell within the CCD. In the case of a color image, each pixel is the product of one or more filtered cells, depending on the algorithm and cell location. The simplest algorithm groups each set of 4 filtered cells into a single pixel. However, it is common for interpolation schemes to increase the number of full-color pixels to equal the number of CCD cells.
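        As a sketch of that cell-to-pixel bookkeeping, here is the filter-color census for a hypothetical RGGB Bayer mosaic (the tile layout is the standard Bayer arrangement; the grid dimensions below are illustrative, not from any real sensor):

```python
def bayer_counts(width, height):
    """Count photosites of each filter color in an RGGB Bayer mosaic."""
    tile = [["R", "G"],   # the repeating 2x2 tile: one red, two green,
            ["G", "B"]]   # one blue per group of four cells
    counts = {"R": 0, "G": 0, "B": 0}
    for y in range(height):
        for x in range(width):
            counts[tile[y % 2][x % 2]] += 1
    return counts

# Half the sites are green, a quarter red, a quarter blue --
# a "20 MP" sensor therefore has ~10M green, ~5M red, ~5M blue sites.
print(bayer_counts(8, 6))  # {'R': 12, 'G': 24, 'B': 12}
```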



        Photon Energy Dependence



        The signal does depend on photon energy, but only as a threshold. For a photon to generate an electron through the photoelectric effect, it must have at least the "bandgap" energy of the semiconductor. The bandgap energy of silicon is about 1.1 eV, meaning photons with a wavelength of about 1100 nm or shorter will be detected. As you increase photon energy beyond that, the signal stays constant at one electron per photon. Once a photon has at least twice the bandgap energy, it can generate two electrons, but that is fairly rare.
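        The "about 1100 nm" figure follows directly from the photon energy relation λ = hc/E. A quick arithmetic check (no sensor specifics assumed, just the physical constants):

```python
# Longest detectable wavelength from the semiconductor bandgap:
# lambda_max = h*c / E_gap, with h*c ~= 1239.84 eV*nm.
HC_EV_NM = 1239.84

def cutoff_wavelength_nm(bandgap_ev):
    """Photons with wavelengths longer than this can't excite an electron."""
    return HC_EV_NM / bandgap_ev

print(cutoff_wavelength_nm(1.1))  # ~1127 nm for silicon, i.e. "about 1100 nm"
```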



        Once you have decided you are done taking your image, the shutter is closed and it is time to read out the image captured in the CCD. To do so, the charge within each cell is shifted over one column within its row, and the first column is then read out. This can be done either by measuring the current needed to discharge the cell, or by measuring the cell's voltage given its known capacitance; either way you learn how many electrons were stored in that cell. After the first column is read out, the cells are all shifted again, and this repeats until every cell has been read.
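        A toy model of that bucket-brigade readout, ignoring all noise (pure Python; the charge values below are made up for illustration):

```python
def read_out(cells):
    """Shift charge out of a 2D grid one column at a time, the way a CCD
    feeds its readout register, and rebuild the image from the readings."""
    cells = [list(row) for row in cells]   # work on a copy of the charge grid
    image = [[] for _ in cells]
    while cells[0]:
        for r, row in enumerate(cells):
            # The register digitizes the first cell of each row, then
            # everything shifts one column toward the register.
            image[r].append(row.pop(0))
    return image

charges = [[3, 1, 4],
           [1, 5, 9]]
print(read_out(charges))  # reproduces the stored charge pattern
```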



        Non-Idealities



        There are a number of factors that prevent typical CCDs from giving you an exact photon count. There is a significant amount of thermal noise, which can only be reduced by cooling the sensor well below any temperature practical for a handheld camera. Leakage within the CCD cells can let electrons escape, or migrate into nearby cells, which spoils an accurate count. There will also be photons that reflect off the cell and are therefore never counted.
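        To see why the stored count doesn't survive readout exactly, here is a sketch of a noisy readout. The dark-current and read-noise figures are illustrative only, not taken from any real sensor:

```python
import random

random.seed(1)  # fixed seed so the demo is repeatable

def read_cell(true_electrons, dark_electrons=3, read_noise_sigma=5.0):
    """One noisy readout of a cell that truly stored `true_electrons`:
    thermally generated dark current adds a bias, and the readout
    electronics add roughly Gaussian noise."""
    measured = true_electrons + dark_electrons + random.gauss(0.0, read_noise_sigma)
    return max(0, round(measured))

# Five readouts of a cell holding exactly 100 electrons: the values
# scatter around ~103, so the exact integer count is unrecoverable.
print([read_cell(100) for _ in range(5)])
```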



        However, none of this changes the fact that a CCD counts photons. It just means it isn't a very precise photon counter. More on this below.




        Does a CCD Count Photons?



        I believe it does, but it comes down to the definition of "count". Let's consider an analogy.



        Alice, Bob, and Chris each own an apple orchard. They want to know how many apples have fallen off the trees in their orchards. To do this they use a Tennis Ball Coupled Device (TBCD). It might look like an ordinary basket, but trust me, it's a TBCD. Alice, Bob, and Chris walk through their orchards, putting a tennis ball in the TBCD for each apple they see on the ground. By the time they have finished, each has a number of tennis balls in the TBCD equal to the number of apples that fell off the trees.



        To figure out how many apples fell off the trees, Alice, Bob, and Chris each use a different method. Alice proceeds to count out the number of tennis balls in her TBCD. When she is done, she knows exactly the number of apples she saw. Bob is not as patient as Alice and uses an advanced computer vision system to automatically count the apples in his TBCD. When he is done, he knows approximately the number of apples he saw, but there is a small error because the CV system isn't perfect. Chris can't afford such a system, nor is he as patient as Alice, so he weighs his TBCD and using the weight of a tennis ball can determine approximately how many tennis balls there are.



        Now here is the question: which of these people used a system that counted the number of apples that fell in their orchard? Each at one point had a number of tennis balls equal to the number of apples. Does the readout method change whether or not the TBCD counts the apples that fell onto the ground?



        The TBCD is (unsurprisingly) directly comparable to a cell in a CCD. It stores a number of electrons equal to the number of photons it captured. This most certainly qualifies as a photon count. Then, depending on your readout circuit, you might get a more or less precise reading of this value. Is it a count? If my image sensor counts the number of photons, but doesn't tell anyone, did it still count the number of photons? As I said earlier, I think this comes down to your definition of count, but I believe a CCD qualifies as a photon counting device.

























































































        edited Jul 6 at 18:51

























        answered Jul 5 at 0:08









        Matt

        2244 bronze badges
















        • So, are you saying 'Yes' or 'No' ?

          – Mike Brockington
          Jul 5 at 15:10






        • 2





          @MikeBrockington a CCD counts photons in the same way as a kitchen scale counts baryons, because it measures a value approximately proportional to the number of protons and neutrons in the stuff you place on it. This isn't the usual definition of 'count'.

          – Pete Kirkham
          Jul 5 at 15:22











        • @Pete You should post that as an answer - I read the one above, and as you can tell, couldn't decide whether Matt was saying yes or no.

          – Mike Brockington
          Jul 5 at 15:24






        • 5





          Detectors running on the same basic physics as CCDs can be (and routinely are, in particle physics contexts) instrumented in such a way that they can say "hey, three photons hit this element" and mean neither two nor four (five is right out). But when the number of hits grows larger the resolution begins to encompass nearby integers. That is, if you get a result of 103 from the same element it might mean anywhere from 100 to 106. So high-confidence integer-valued counts are not a yes/no phenomenon, but depend on the backing electronics and the magnitude of the signal.

          – dmckee
          Jul 5 at 17:06






        • 1





          One major quibble with this answer: what you call sub pixels are actually full pixels. Say you have a sensor advertised as having 20 megapixels. This sensor will have 20 million individual pixel sites. 5 million will have red filters, 5 million will be blue and 10 million will be green, arranged in what’s known as a Bayer pattern. When the raw image is converted to JPEG (or other image format) the color of each pixel will be interpolated using the data at that image site and its neighbors.

          – Jim Garrison
          Jul 6 at 6:10

















































        7














        No, you won't obtain the photon count directly. A camera sensor also has noise, not just from photon-counting statistics but also from its electrical circuits.



        Additionally, a DSLR has a color filter array on top of the pixels, even if you only take greyscale images. It probabilistically filters away some photons: a photon of the filter's color has a much higher chance of passing through than a photon of another color.



        There are sensors that count incoming photons, but a DSLR sensor is not among them. A DSLR sensor just gives a single noisy "intensity" value per pixel. That value is roughly proportional to the number of incoming photons, but because of the noise you can't recover the exact integer number of photons.



        If you are looking for a photon counting detector, this may not be the best place to ask. There is no Scientific Instrument Stack Exchange, but Physics Stack Exchange may come close.



        Typically, photon-counting detectors are cooled with liquid nitrogen to very low temperatures to minimize electrical noise from thermal effects. Needless to say, a DSLR isn't designed to be cooled that far.



        At shorter wavelengths, such as X-rays, you can actually count photons with a room-temperature sensor, so no cooling is required. Visible light, however, has a far longer wavelength (and thus far lower photon energy) than X-rays. It would make a great question on Physics Stack Exchange whether a visible-light photon-counting sensor can be made without cooling.
























        • 1





          I haven't seen a liquid-nitrogen-cooled DSLR, but there are a few companies on the market making coolers targeted toward astrophotographers.

          – Hueco
          Jul 4 at 18:16











        • You can certainly count visible-light photons if the rate of arrival is low enough. Eyes can do this (I think dark-adapted human eyes are sensitive to on the order of 10 photons, but cats(?) and frogs(?) can detect single photons).

          – tfb
          Jul 4 at 21:15











        • "whether a visible light photon counting sensor can be made without requiring cooling" - I'm sure if we could, we would.

          – whatsisname
          Jul 4 at 22:28











        • Relevant to @tfb's comment: biology.stackexchange.com/questions/85026/…

          – Carsten S
          Jul 5 at 9:21

















        7












        7








        7







        No, you won't obtain the photon count directly. Also, a camera sensor has noise, not just from photon counting but also from electrical circuits.



        Also, a DSLR has a color filter on top of the pixels, even if you take only grayscale images. It will probabilistically filter away some photons. If the photon is of the correct color, chances of it passing the filter are much higher than with an incorrect colored photon.



        There are sensors that count incoming photons, but a DSLR sensor is not among them. A DSLR sensor just gives a single "intensity" value per pixel, and that value is noisy. It is roughly proportional to the sum of all incoming photons, but because of the noise, you can't recover the exact photon count as an integer.
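
        To make the "noisy sum" concrete, here is a small simulation of why the exact integer photon count can't be recovered from a single pixel value. All the numbers (mean illumination, quantum efficiency, read noise, gain) are made-up illustration values, not specs of any real sensor:

```python
import math
import random

random.seed(42)

def poisson(lam):
    # Knuth's algorithm: photon arrivals at a pixel are Poisson-distributed.
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

def pixel_reading(mean_photons, qe=0.5, read_noise=2.0, gain=1.0):
    photons = poisson(mean_photons)                                # shot noise
    electrons = sum(random.random() < qe for _ in range(photons))  # filter/QE losses
    signal = electrons + random.gauss(0.0, read_noise)             # circuit (read) noise
    return max(0, round(signal / gain))                            # digitized "intensity"

# Identical illumination, yet the readings differ from exposure to exposure:
readings = [pixel_reading(100) for _ in range(5)]
```

        Even under identical illumination the readings differ every exposure, so inverting a single reading back to an exact photon count is impossible; only a statistical estimate is available.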



        If you are looking for a photon counting detector, this may not be the best place to ask. There is no Scientific Instrument Stack Exchange, but Physics Stack Exchange may come close.



        Typically, photon counting detectors are cooled with liquid nitrogen to very low temperatures to minimize electrical noise from thermal effects. Needless to say, a DSLR isn't designed to be cooled to such low temperatures.



        At shorter wavelengths such as X-rays, you can actually count photons using a room-temperature sensor, so no cooling is required. However, visible light has a far longer wavelength (and therefore a much lower photon energy) than X-rays. It would make a great question on Physics Stack Exchange whether a visible-light photon counting sensor can be made without requiring cooling.







        answered Jul 4 at 13:46









        juhist
        1,920 reputation, 1 silver badge, 20 bronze badges







        • 1





          I haven’t seen a liquid-nitrogen-cooled DSLR, but there are a few companies on the market making coolers targeted toward astrophotographers.

          – Hueco
          Jul 4 at 18:16











        • You can count visible-light photons if the rate of arrival is low enough, for sure. Eyes can do this (I think dark-adapted human eyes are sensitive on the order of 10 photons, but cats (?) and frogs (?) can detect single photons).

          – tfb
          Jul 4 at 21:15











        • "whether a visible light photon counting sensor can be made without requiring cooling" - I'm sure if we could, we would.

          – whatsisname
          Jul 4 at 22:28











        • Relevant to @tfb's comment: biology.stackexchange.com/questions/85026/…

          – Carsten S
          Jul 5 at 9:21























        2














        The sensor is an array of silicon photodiodes that simply convert light energy into electricity.
        They are interchangeably called pixels, photosites, or photodiodes. "Pixel" actually means "picture element," and it is the least accurate of the three terms here, because each output pixel uses information from adjoining photosites to determine its color/brightness value, even for greyscale output (except perhaps on a monochrome camera such as the Leica Monochrom). The location of a photosite does, however, correlate directly with the location of a pixel in the output image.
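
        A minimal sketch of that neighbor-based estimation (hypothetical raw values, an RGGB Bayer layout assumed; real demosaicing algorithms are considerably more sophisticated):

```python
# Hypothetical 4x4 RGGB Bayer mosaic of raw photosite values.
mosaic = [
    [10, 20, 12, 22],   # R G R G
    [30, 40, 32, 42],   # G B G B
    [14, 24, 16, 26],   # R G R G
    [34, 44, 36, 46],   # G B G B
]

def rgb_at(y, x):
    """Estimate RGB at an interior photosite by averaging each channel's
    values over the 3x3 neighbourhood (a crude box average, for illustration)."""
    def channel(y, x):  # which color this photosite records in an RGGB layout
        return "RG"[x % 2] if y % 2 == 0 else "GB"[x % 2]
    vals = {"R": [], "G": [], "B": []}
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            vals[channel(y + dy, x + dx)].append(mosaic[y + dy][x + dx])
    return tuple(sum(v) / len(v) for v in (vals["R"], vals["G"], vals["B"]))
```

        For example, the photosite at row 1, column 1 records only blue, yet `rgb_at(1, 1)` yields a full RGB triple by borrowing red and green from its neighbors — which is why a 24-megapixel color image does not contain 24 million independently measured RGB values.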









            answered Jul 4 at 20:41









            Steven Kersting
            742 reputation, 1 silver badge, 8 bronze badges





















                2















                Does each cavity count the number of signals (or peaks) generated by each photon?




                No. There are no individual peaks to count.




                Or is there one signal which is the sum of all photons (in which case the size of the signal should depend on photon energy presumably)?




                Yes, more or less: an electrical current is generated that corresponds to the sum of the energies of the photons that hit the sensel during the time period of interest.




                And also I'm guessing each cavity corresponds to a pixel? ... in the grayscale image case.




                Yes, for grayscale. For color, additional color information is interpolated from surrounding sensels (demosaicing). But for most purposes, it's fine to think of it as a 1:1 correspondence.









                    answered Jul 5 at 22:38









                    xiota
                    15.3k reputation, 4 gold badges, 22 silver badges, 74 bronze badges





















                        0














                        Yes. A pixel roughly counts the number of photons that fall onto it. However, that count is processed before the digital grayscale number (0–255, for example) is reported. The digital number is expressed in what are called analog-to-digital units (ADUs). One must do some reverse math to convert the ADU number back into a number of photons.



                        The following math should illuminate the situation. Assume a pixel of area A and an exposure time T, illuminated with monochromatic light of angular frequency W. Physics tells us that the intensity of the light is




                        I = hbar * W * N


                        where hbar is the reduced Planck constant and N is the photon flux: the number of photons passing through a unit area per unit time.



                        We can see that if the exposure time is T, then the number of photons collected by a particular pixel is given by




                        N_pixcount = N * A * T = I * A * T/(hbar * W)


                        So given pixel area, light intensity, exposure time, and light frequency it is possible to calculate the average number of photons which pass through a pixel.



                        How does a sensor register photons? For an ideal sensor, every photon falling on the sensor would create a photoelectron (e-). However, because sensors have finite quantum efficiency (QE), only a sub-unity fraction of photons is converted into photoelectrons.




                        N_electron = QE * N_pixcount


                        Practical sensors may have quantum efficiencies anywhere from 30% up to 95% or so.



                        After the photoelectrons are created, the electron count is (1) converted into a voltage (using a charge amplifier), and then (2) this voltage is converted into a digital signal (using an analog-to-digital converter, ADC). These two stages are described by a sensor specification called the gain, G, which is given in e-/ADU: how many electrons are needed to increase the grayscale level by 1. Putting this together:




                        grayscale_level = N_electron/G = N_pixcount * QE/G = I * (QE * A * T)/(hbar * W * G)


                        This is a rough overview of how photons are converted into digital counts in a digital sensor. There are a few more things I haven't covered here: (1) noise is introduced at each of these stages, which should be considered if you care about image quality, and (2) some sensors have additional post-processing amplification stages; for example, different pixels may be put through gain stages with different gains to help homogenize the sensor response.



                        Regarding your question about the dependence on photon energy: the photon frequency enters in the conversion between intensity and photon flux. The core of my answer is that sensors are, roughly, photon counting devices; the energy of the individual photons doesn't matter to the count itself. However, bear in mind that for all real sensors the quantum efficiency is wavelength dependent, so a blue photon may have a better chance of being detected than a red photon. Thus, to determine what a sensor will read, you need to figure out the photon flux for each wavelength present in the illumination and multiply through by the relevant factors to get each wavelength's contribution to the pixel count.
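
                        As a sanity check, the final formula can be evaluated numerically. All the inputs below (intensity, pixel size, exposure, QE, gain) are made-up example values, not the specs of any real camera:

```python
import math

HBAR = 1.054_571_817e-34  # reduced Planck constant, J*s
C = 3e8                   # speed of light, m/s (approximate)

def grayscale_level(intensity, area, exposure, omega, qe, gain):
    """grayscale = I * (QE * A * T) / (hbar * W * G), per the derivation above."""
    n_pixcount = intensity * area * exposure / (HBAR * omega)  # photons hitting the pixel
    n_electron = qe * n_pixcount                               # photoelectrons created
    return n_electron / gain                                   # ADUs reported

# Hypothetical green light at ~530 nm: omega = 2*pi*c / wavelength
omega = 2 * math.pi * C / 530e-9   # rad/s

adu = grayscale_level(
    intensity=1e-3,      # W/m^2, a dim scene (assumed)
    area=(4e-6) ** 2,    # a 4 um x 4 um pixel (assumed)
    exposure=1 / 100,    # seconds
    omega=omega,
    qe=0.6,              # assumed quantum efficiency
    gain=2.0,            # assumed gain, e-/ADU
)
# ~430 photons arrive, ~256 photoelectrons are created, and the pixel reads ~128 ADU.
```

                        Note how the answer comes out as a non-integer number of ADUs; a real ADC would quantize it, which is one more reason the exact photon count cannot be read back out.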









                            edited Jul 5 at 16:26









                            juhist
                            1,920 reputation, 1 silver badge, 20 bronze badges










                            answered Jul 5 at 15:55









                            Jagerber48
                            109 reputation, 1 bronze badge



























                                Thanks for contributing an answer to Photography Stack Exchange!

