Physics of light: what is brightness?

Is it photon density per cubic unit of volume? Wave amplitude? Color purity? Something else? When it is said that something is ten times the brightness of the sun, what exactly is meant? Is there a formula to calculate brightness?

Just a WAG but I’m guessing it’s the first thing. Only not cubic but planar.

Maybe these links will help:

If I read them correctly, light intensity is measured according to how much a hypothetical human eye’s light receptors would be stimulated.

From here

x-ray vision, according to my link that definition has been discarded.

Discarded? Damn, when did that happen? Now I’ve got to get my lightbulbs re-labeled.

It’s scary how hard it is to talk to artsy types about light. :slight_smile:

As a physicist, I’d say that brightness is either flux (energy received per second per unit area) or intensity (the same as flux, but also per unit solid angle.) Photographers, theater lighting people, etc. use candelas; I’d use Watts/m[sup]2[/sup]/s or ergs/cm[sup]2[/sup]/s. They use lumens, I’d use Watts/m[sup]2[/sup]/s/steradian or ergs/cm[sup]2[/sup]/s/steradian.
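Since the contrast here is between photometric units (lumens, candelas) and radiometric ones (watts), here's a minimal sketch of how the two are related for monochromatic light. The V(λ) values are approximate entries from the CIE photopic curve, pulled in as an assumption, not from this thread:

```python
# Sketch: radiometric watts -> photometric lumens for monochromatic light.
# Photometric units weight power by the eye's photopic response V(lambda);
# the peak luminous efficacy is 683 lm/W at 555 nm. The V values below are
# approximate CIE 1931 photopic-curve entries (assumed for illustration).

V = {510: 0.503, 555: 1.000, 610: 0.503, 650: 0.107}  # dimensionless

def lumens(watts, wavelength_nm):
    """Luminous flux of a monochromatic source of the given power."""
    return 683.0 * V[wavelength_nm] * watts

print(lumens(1.0, 555))  # 683.0 lm: the eye's peak sensitivity
print(lumens(1.0, 650))  # ~73 lm: the same power of red light looks far dimmer
```

This is the whole difference between the two camps: a watt is a watt, but a lumen asks how much a standardized human eye cares about that watt.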

I’ve attempted to translate brightness between the world of physics and the world of artsy before, and found it maddeningly difficult to find formulae, definitions, and conversions on the artsy side. They have different needs, I guess.

If something is ten times brighter than the Sun, I’d say that you were receiving 10 times the flux. Now, that’s not to say that it would look 10 times brighter to the human eye, unless the spectra of the two sources were identical, i.e. they were emitting the same fraction of photons at each frequency. Also, if the source were more intense, that would mean it was concentrated into a smaller region of the “sky” than the Sun, i.e. that it has a smaller angular size.
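To put numbers on "the flux of the Sun," here's a quick inverse-square-law sketch; the constants are standard solar values, not from the thread:

```python
import math

# Sketch: flux received at Earth from the Sun via the inverse-square law.
# Constants are standard values (IAU nominal), assumed for illustration.
L_sun = 3.828e26   # solar luminosity, W
d = 1.496e11       # Earth-Sun distance, m
R_sun = 6.957e8    # solar radius, m

flux = L_sun / (4 * math.pi * d**2)   # W/m^2 received at Earth
print(flux)        # ~1361 W/m^2, the "solar constant"

# Intensity also cares about angular size: the Sun subtends about half a degree.
theta = 2 * math.degrees(math.atan(R_sun / d))
print(theta)       # ~0.53 degrees
```

So "ten times brighter than the Sun" in the flux sense means roughly 13.6 kW falling on every square meter.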

It’s probably best not to think of brightness as a wave amplitude, but rather as a number of photons. However, the density of photons per cubic meter isn’t really brightness because you don’t know what direction they’re going. If they’re not headed for your eye (or other detector) you won’t see them. That’s why flux or intensity in physics is always defined relative to some infinitesimal area element: you’re only interested in photons that are headed through that area (and you want to know at what angle they hit it, too.)

I hope that’s helpful, and not just gobbledegook.

I always get confused with this stuff, but I think you’re incorrect on a couple of things. Intensity is energy per solid angle per unit time, as I understand. So it’d be ergs/s/steradian (no /cm[sup]2[/sup]). Also, Watt is a unit of power and not energy, so Watt/sec doesn’t make much sense.

Getting back to the OP… “Brightness” isn’t a technical term with one specific meaning. The absolute brightness of a luminous object is the total energy output, measured in erg/s or Watt. Apparent brightness is how bright an object appears to a particular observer, e.g. how bright a star looks from earth. It’s basically the same as flux, i.e. energy received per unit area. Measured in erg/cm[sup]2[/sup]/s or Watt/m[sup]2[/sup]. There’s a further complication about whether you are adding up energy over all wavelengths (X-ray through radio), or just the visible light, or even a narrower bandpass.

Energy of a light wave is proportional to the square of the amplitude. Or you can think of it as the number of photons, multiplied by the energy of each photon.
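The photon-counting picture is easy to make concrete. A sketch for a hypothetical 1 W green (532 nm) laser, using E = hc/λ (constants are standard values, assumed here for illustration):

```python
# Sketch: energy = (number of photons) x (energy per photon).
h = 6.626e-34        # Planck's constant, J*s
c = 2.998e8          # speed of light, m/s
wavelength = 532e-9  # green light, m
power = 1.0          # W, a hypothetical 1 W laser

E_photon = h * c / wavelength         # ~3.7e-19 J per photon
photons_per_second = power / E_photon
print(photons_per_second)             # ~2.7e18 photons every second
```

With numbers like 10[sup]18[/sup] photons per second, it's no surprise the classical wave picture works so well for everyday light.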

There, did I confuse things enough?

I do mean in terms of visible light, not the whole spectrum. It does make sense, although I hadn’t thought of it before, that the direction of the light is important. As Podkayne noted, if a photon doesn’t reach your eye, it cannot reasonably be said to be bright. However, it would have an apparent brightness (or flux) to someone whose eye it did reach.

So, can one photon be brighter than another, in terms of absolute brightness? If so, would that be determined by the amplitude of each? What is the brightest possible visible light? How many photons can fit into a square centimeter at the same angle and from the same direction? (No angel jokes, please.) Is there a practical limit to apparent brightness due to the density of cones on your retina? Can one person see brighter light than another?

Sorry for a nitpick, but Watts are Joules per second so you don’t need the /s just Watts/m[sup]2[/sup]/steradian. Other than that you are correct, sorry again for being picky.

This is a really interesting question philosophically because the answer is that brightness can be related to EITHER wave amplitude (actually wave amplitude squared) OR the photon flux, depending on what theoretical model you’re working under.

In classical physics, light is an electromagnetic wave and the energy of the wave is related to its amplitude and its frequency.

In quantum physics, light is a bunch of photons, and each photon carries an amount of energy proportional to its frequency.

Well, since brightness depends on the number of photons AND the frequency of the photon, we normally wouldn’t say that one photon is brighter than another. Certainly, a higher-frequency photon (bluer light) carries more energy than a lower-frequency photon (redder light). So you could say that a blue photon is intrinsically brighter than a red photon. HOWEVER,

you asked about visible light, so I’m guessing you want to know how bright a photon would APPEAR to the human eye. (I’ve heard that we can actually detect single photons under ideal conditions, but I don’t know if that’s true or UL.) In that case, we need to take into account not just the physical properties of the light, like how much energy the photon carries, but also how the eye RESPONDS to that energy. I don’t know anything about the physiology of the eye, but I would guess that it responds best to the middle of the spectrum (yellow). So you may find that a single yellow photon would be “brighter” (to look at) than a single blue photon, even though the blue one has more energy.

I prefer to think of light as an electromagnetic wave. Here’s an animation of one:

I am not a physicist, but it would seem to me that “brightness” would simply be the amplitude of the wave (i.e. amplitude of the electric field and amplitude of the magnetic field).

“Brightness” is rather a difficult term to relate to intensity: is a 1kW source of yellow (~575nm) light “brighter” than a 1kW source of deep red (~710nm), even though both deliver the same power?
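One wrinkle worth making explicit: at equal power, the redder source actually delivers more photons per second, since each red photon carries less energy (E = hc/λ). A quick sketch with the wavelengths above:

```python
# Sketch: photon arrival rates for two equal-power sources.
# At the same power, the longer wavelength wins on photon count.
h, c = 6.626e-34, 2.998e8   # standard constants, assumed for illustration

def photon_rate(power_w, wavelength_m):
    """Photons per second emitted by a monochromatic source."""
    return power_w * wavelength_m / (h * c)

n_yellow = photon_rate(1000.0, 575e-9)  # 1 kW of ~575 nm yellow
n_red = photon_rate(1000.0, 710e-9)     # 1 kW of ~710 nm deep red
print(n_red / n_yellow)   # ~1.23: about 23% more red photons per second
```

And yet the yellow source looks much brighter, because the eye's response, not the photon count, decides perceived brightness.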

As an analogy, acoustics tends to plot intensity on a dimensionless logarithmic scale, i.e. it takes some arbitrary reference (in this case the threshold of human hearing) as 0 B (or 0 dB) and measures the power ratio to this reference to give a “loudness” level. This is usually accurate enough, but an added complication is that the human ear is more sensitive at some frequencies than others (rather like our eyes in the yellow/red example – our ears are ‘tuned’ to vowels just as our eyes are ‘tuned’ to green). A measure of perceived loudness is thus the Phon: I’m not sure if “brightness” has an equivalent to the Phon.
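The decibel idea described above is just ten times the log of a power ratio. A minimal sketch, using the standard nominal threshold-of-hearing intensity as the reference (an assumed value, not from the thread):

```python
import math

# Sketch: a dimensionless log scale for intensity, relative to a reference.
I_ref = 1e-12  # W/m^2, nominal threshold of human hearing (assumed)

def decibels(intensity):
    """Sound intensity level in dB relative to I_ref."""
    return 10 * math.log10(intensity / I_ref)

print(decibels(1e-12))  # 0 dB: the reference itself
print(decibels(1e-6))   # 60 dB: a million times the reference power
```

Astronomers do something very similar with stellar magnitudes, which are also a log scale pinned to an arbitrary reference star.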

Actually, reading that Wikipage on the Phon I find I’m living in the past – the adjusted dB scale is what is actually used by professional acousticians in calculating human exposure to sound (which I should have remembered since I used to work for one!). I imagine there is a similar scale for the human retina, to answer Lib’s later question regarding maximum exposure before irreparable damage.

Thanks to everybody for catching the Watts=Joules/second thing. You are quite correct, and sorry about the error.

However, the units I gave for intensity are right, scr4. I just double-checked in Modern Astronomy by Carroll & Ostlie in case I was having a brainfart.

Correct. The same source can have different brightnesses to different observers.

I don’t have a really deep understanding of the quantum niceties of this, but I do watch photons bounce off of stuff for a living, and this is how I think about it. A photon has an energy given by E=hf, where h is Planck’s constant and f is the photon’s frequency. At least from my point of view, the amplitude of a light wave is a red herring left over from classical theory. I don’t think of the amplitude of a wave, I think of a number of photons of a given frequency.

Two photons of the same frequency have the same “brightness.” A higher-energy photon would have a higher “brightness” because it delivers more energy. Perceptually, of course, this is not the case, as a UV photon has no brightness to us, despite the fact that it has a higher energy than, say, a green photon.

It would depend on how quickly your eye can refresh color pigments and also, potentially, how much physical damage your eye could take before you were unable to see. Again, in my way of thinking, if a photon of the proper frequency happens to strike a molecule of pigment in one of the cones of your eye, the molecule breaks, and one of the resulting chemicals triggers a nerve impulse, and you see the light. The more photons that are within the appropriate range of frequencies to break each pigment, the more you see, until your eye reaches the saturation level and new pigment is produced at the same rate that pigments are broken up by light. Then even if the light source were turned up and more photons were hitting your retina, you wouldn’t see a brighter light because your eye couldn’t register a brighter light.

Now, not every photon that strikes your eye happens to hit a pigment molecule. The others can be reflected or absorbed. Absorbed photons will heat your eye. Eventually, that’ll lead to damage.

I would think that if you had more cones, they could produce more pigments at a greater rate, and you could register a brighter light. I don’t know how much this varies from person to person.

On a very basic level, two photons can pass through each other without modifying each other, because they behave like waves. So my first instinct is to say that you could have as many photons as you like.

However, if we go beyond the basic level, putting a lot of energy in a small space can lead to strange effects.

If two gamma ray photons with high enough energies collide, for example, you produce a pair of particles. I don’t know if you could get the same effect if you just put a whole bunch of, say, optical photons together. This is what we call “high energy physics,” and I’m afraid that Podkayne is much more of a low-energy, near-infrared kindofa gal. I’m sure somebody else knows, though . . .

“Breaking pigment molecules”, Podkayne?? (I think what you’re looking for is the action potential which the incident radiation stimulates at the retina.)

Argh. I’m out of my field. I do seem to have it a bit mixed up.

I fished around for an online anatomy textbook that I can understand but which goes beyond the basics.

This must be what I was thinking of. It’s the change in the molecule’s shape that starts the series of events that ends up with a nerve signal, but the molecule can’t change itself back, and has to be disassembled. So it’s not broken apart by the photon, but by subsequent chemical reactions.

I can’t find if visual pigments in cones work the same way; most sites seem to describe the rods in great detail, and do a handwavy “cones are similar” thing.

Also, according to this source, “Ganglion cells are always active. Even in the dark they generate trains of action potentials and conduct them back to the brain along the optic nerve. Vision is based on the modulation of these nerve impulses. There is not the direct relationship between visual stimulus and an action potential that is found in the senses of hearing, taste, and smell. In fact, action potentials are not even generated in the rods and cones.” and “. . . the absorption of light does not trigger action potentials but modulates the membrane potential of the cones.” I don’t quite grasp what that means, but it seems to be different from what you said—or maybe it’s only a technical quibble.

It seems to me that brightness and color are purely perceptions created within our mind. Neither is a physical property external to the mind, in the same way that sound, brightness and color are not properties of a cell phone signal. I would also say that a smartphone does not output sound, color or brightness. Our mind just interprets the output information that way. From the cell phone to our mind, the pressure wave is converted to sound, the radiation wavelength to color, the amplitude to brightness, and the complexity of the light radiation is perceived as saturation.

The radiation from the Sun or a light source that we call “brightness and color” in the “unperceived” state has no visible/measurable attributes other than wavelength, amplitude and complexity. Solar radiation only contains “invisible/unperceived” information in the form of a heat wave. Our eyes and mind interpret that information (wavelength, amplitude and complexity) as color, brightness and saturation, respectively.

If true, in a roundabout way we are still “seeing” energy from the Sun/light source. Plants absorb the light energy (photosynthesis), animals/humans eat the plants, transferring the energy into the body; the body and mind convert the stored energy and perceived visual signals/information back into the perception of brightness and color.

I’m suggesting that brightness and color exist, but only as a perception of a living being. Possibly the result of neurons firing in our mind’s visual centers in response to the perceived radiation/information waves. This “visual perception” would be powered by the energy stored by our bodies, which ultimately came from the Sun to start with.

Boy, are you opening a can of worms. I say that as an Optics guy.

It turns out that Radiometry and Photometry are very precise and highly developed branches of optical metrology. There are definitions for a variety of weird and related parameters that people would broadly classify as “brightness”, but which mean different things.

radiometric flux per unit area of an object per unit solid angle per unit wavelength falling at a given angle onto a unit of area on a receiver. – that’s just one such quantity. It could be, instead, photometric flux (the light gets weighted by your eye’s relative response). It could be averaged over time. It could be integrated over all angles. Et cetera et cetera, ad nauseam.

And each of these definitions gets its own whacko units. Nowhere is the love of christening new units so evident as in radiometry – lux, lumens, nits, phots, stilbs, Watts per square centimeter, Watts per steradian, Watts per square centimeter per steradian.
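A few of those units are just CGS/SI cousins of one another. A sketch of the standard conversion factors (these follow from the definitions stilb = cd/cm² and phot = lm/cm², which are not spelled out in the thread):

```python
# Sketch: converting a couple of the zoo's CGS units to their SI cousins.
# 1 stilb = 1 cd/cm^2 = 10^4 cd/m^2 (a cd/m^2 is a "nit")
# 1 phot  = 1 lm/cm^2 = 10^4 lux   (a lux is a lm/m^2)

def stilb_to_nit(stilb):
    """Luminance: stilbs -> nits (cd/m^2)."""
    return stilb * 1e4

def phot_to_lux(phot):
    """Illuminance: phots -> lux (lm/m^2)."""
    return phot * 1e4

print(stilb_to_nit(1.0))  # 10000.0 nits
print(phot_to_lux(0.5))   # 5000.0 lux
```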

And all of the above is without getting into Color Theory, and its many and manifold complexities.
When speaking of Brightness, optics folk generally mean the Specific Brightness, for which, see something like this page:

The Brightness is related to the etendue:
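A minimal sketch of that relationship: radiance (the "specific brightness") is flux divided by etendue, where the etendue is area times solid angle (taking refractive index n = 1; the numbers below are made up for illustration). Since lossless optics conserve etendue, no passive lens can make an image brighter than its source:

```python
# Sketch: radiance = flux / etendue, with etendue = area x solid angle.
# All numbers are hypothetical, chosen only to illustrate the units.
flux = 1.0         # W emitted by the source
area = 1e-6        # m^2 of emitting surface
solid_angle = 0.1  # sr into which it radiates

etendue = area * solid_angle   # m^2 * sr, conserved through lossless optics
radiance = flux / etendue      # W / (m^2 * sr)
print(radiance)                # 1e7 W/(m^2 sr)
```

Focusing shrinks the area but grows the solid angle by the same factor, which is exactly why the radiance stays put.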

For more than you ever wanted to know about this, consult a good book on Radiometry and Photometry. For a brief taste, see here:

You’re welcome.

I don’t want to give the impression that this is a pointless, needless mess. It’s a pointed and necessary mess. When you’re describing what happens to light as it rattles through a system you’re engaging in a complicated case of photon bookkeeping, and all the weird definitions are useful in keeping things straight and in calculating what you want to know. Exercises in radiometry frequently feel like almost pure calculus, though.

The “can of worms” was opened 12 years ago, FYI, in case you didn’t notice…