The stars, of course, like our sun, are giving off light and heat. The light that reaches Earth from them does not amount to very much, and I assume the amount of heat that reaches the Earth from the stars is negligible. But is it measurable? Has it been measured?
Not including the sun I would say enough to raise it to about 3 degrees Kelvin, which is the background temperature of space.
Diddly squat. This is one of the classic thought experiments about the size and age of the universe (Olbers' paradox). If the universe were a steady-state, infinite collection of stars in galaxies, then in every direction we would see (and feel the heat of) a star that lay somewhere in that direction.
The background radiation (3K) is the residual radiation from the big bang, IIRC. The guys who measured it just knew they had this background radiation interfering with their microwave antenna, while some other prof had calculated that there would be residual radiation, but hadn’t found it.
You can get a good idea of the relative heat from the relative light. Assuming most stars have roughly the same distribution of heat and light as the sun (some, K through M, are cooler; some, F up to O, are hotter), then the relative heat effect matches the relative light effect… about a 26-magnitude difference, or a factor of 2.512^26 (roughly 2.5 x 10^10, not 2^26) compared to the sun, if I recall my first-year astronomy?
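Since the magnitude scale comes up repeatedly in this thread, here is a quick back-of-the-envelope helper in Python. The magnitude values plugged in below (26 for sun vs. total starlight, 14 for sun vs. full moon) are just the rough figures from the posts above, not precise measurements.

```python
# Back-of-the-envelope: convert an apparent-magnitude difference
# into a brightness (flux) ratio.  Each magnitude step is a factor
# of 100**(1/5) ~ 2.512, so 5 magnitudes is exactly 100x.

def flux_ratio(delta_mag):
    """Brightness ratio corresponding to a magnitude difference."""
    return 100 ** (delta_mag / 5)

# Sun vs. total starlight, ~26 magnitudes apart per the post above:
print(f"26 magnitudes -> {flux_ratio(26):.3g}x")  # ~2.5e10, not 2**26
# Sun vs. full Moon, ~14 magnitudes apart:
print(f"14 magnitudes -> {flux_ratio(14):.3g}x")  # ~4e5
```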
The Sun only gives off “heat” in the sense that it gives off infrared radiation, and when this radiation strikes matter it tends to cause it to heat up. In that sense, the study of the “heat” coming from other stars is the field of infrared astronomy, and it’s certainly possible to measure the infrared brightness of a star in a similar way to measuring its visible brightness.
There’s nothing special about infrared-- Heat can be carried by any radiation. Most of the heat we get from the Sun is in the visible light, not infrared. The only reason that we think of infrared specifically as “heat” is that most things on Earth are at temperatures that cause their peak emission to be in that band, but hotter objects will transmit heat in other frequency bands in exactly the same way.
That’s just the cosmic microwave background, and does not include contribution from stars. There are also other sources of heat, like cosmic rays. Interstellar gas clouds typically reach a temperature between 10 and 20 K, if I remember correctly. So if the Sun were to stop shining completely, the Earth would eventually cool down to that temperature.
So, that means the Sun provides enough additional heat to raise the Earth’s temperature by a factor of 20 (assuming the equilibrium without the Sun would be 15 K, and approximating the actual average temperature as 300 K). Radiated heat is proportional to the 4th power of temperature, so the Sun provides 20^4 = 160,000 times more heat than everything else combined (stars, microwave background, etc).
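That T^4 arithmetic can be sketched in two lines; the 15 K and 300 K figures are the rough assumptions from the post, not measured values.

```python
# Radiated power scales as T**4 (Stefan-Boltzmann law), so the ratio
# of heat inputs needed to sustain two equilibrium temperatures is
# the ratio of the temperatures raised to the 4th power.

T_with_sun = 300.0     # K, rough average Earth temperature (assumed)
T_without_sun = 15.0   # K, assumed equilibrium from starlight/CMB alone

ratio = (T_with_sun / T_without_sun) ** 4
print(ratio)  # 160000.0 -> the Sun supplies ~160,000x everything else
```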
Sorry, as you probably anticipated, that statement confuses people like me.
So, let me ask: do you mean that thermal radiation (i.e. radiation produced due to thermal motion of molecules, etc.) is heat or that thermal radiation is generated by heat but is not, itself, heat? Or perhaps neither?
Regarding the second possibility above, the Wiki article on thermal radiation states that thermal radiation “represents a conversion of thermal energy into electromagnetic energy”. Is that the same as saying what you did, that “heat can be carried by any radiation”?
Does the EM radiation from a given thermal source have a temperature (or do we just say that the EM radiation coming from a source corresponds to the temperature of the source but does not itself have a temperature)? If the EM radiation does have its own temperature, what does that mean? (i.e. what is meant by the temperature of EM radiation? That it corresponds to a black body’s radiation at that temperature?) Would the various wavelengths comprising the EM radiation from a given source each have a different temperature? If so, why?
I don’t expect you, of course, to answer all my many (and probably stupid) questions but I hope you can keep them in mind if you choose to elaborate a bit on what you said.
If an object has a temperature above absolute zero (it does), then it will emit electromagnetic radiation. The radiation will have some spread of frequencies, but it’ll be clustered around a frequency that’s proportional to the temperature of the source (this frequency is in the infrared range for animals and fire and a lot of other things).

If the source is a blackbody (it almost never actually is, but it’s a reasonable approximation for a lot of things) then the radiation is considered to be thermal, and is said to have a temperature (the same temperature as the source, unless there’s some effect redshifting or blueshifting it). An absorbing object exposed to a bath of this radiation (i.e., from all directions) will come to an equilibrium temperature that’s the same as the temperature of the radiation.

Even if the radiation isn’t thermal, one can still describe an effective temperature for it, that being the equilibrium temperature of something exposed to it.
Does that clear things up?
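The "clustered around a frequency proportional to temperature" statement is Wien's displacement law, and can be sketched numerically (for ideal blackbodies, which, as noted above, real objects only approximate):

```python
# Wien's displacement law: a blackbody's peak emission wavelength is
# inversely proportional to its temperature, lambda_peak = b / T.

WIEN_B = 2.898e-3  # m*K, Wien's displacement constant

def peak_wavelength_nm(T_kelvin):
    """Peak blackbody emission wavelength in nanometres."""
    return WIEN_B / T_kelvin * 1e9

print(peak_wavelength_nm(5778))  # Sun's surface: ~502 nm, visible light
print(peak_wavelength_nm(300))   # room-temperature object: ~9700 nm, infrared
```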
Indeed, it does! Thank you.
I assume when you say ‘cluster around a frequency that’s proportional to the temperature of the source’, you’re referring to Wien’s Displacement Law? Here’s the graph with the ‘y’ axis being the “amount of radiation” if I’m understanding it (and you). Last question (maybe!), what are the units for the “amount of radiation”?
I appreciate your help. Thanks!
Imagine a clear night in a field far from any city or manmade light sources. Imagine a night with no moon. The field will be lit up to a certain level. Now imagine a night with a full moon. The level the field is now lit up is in the same ballpark as the moonless night.
This tells us the light from the stars is “comparable” to the light from the moon.
The moon is 14 magnitudes fainter than the sun. 2.512 raised to the 14th power is approximately 400,000.
Therefore, a rough estimate is that ALL the stars combined are putting out 1/400,000 of the amount of light/heat the sun is.
I would prefer my heat to be carried by Bekenstein-Hawking Radiation, but I’m having a bit of a struggle with the EPA over that one.
I may be misinterpreting you, but a moonlit night isn’t even in the same ballpark as a moonless night, not even close.
Go somewhere truly dark as described, and a moonlit night will feel well lit: things cast shadows, you can read (if the text is big enough), etc. The same place without the moon is dark: you won’t be able to see anything but the stars, and the outlines of things blocking the stars.
Meh, the moon is a hella lot closer in brightness to the night sky than it is to the sun.
I’ve been in that situation many times. On a scale of 1 to 10 with 10 being world class light pollution free and clear skies, I’ve been at 7’s and 8’s hundreds of times.
A moonless night in a wide open field under those conditions is NOT dark. You can easily see what you’re doing. You can’t read fine print, but you can see.
Is the moon 10 times brighter? Probably. Is it 100 times brighter? Perhaps. Just how much brighter is it? If you actually KNEW what it was, we wouldn’t be doing this back-of-the-envelope calculation now, would we?
Now, let’s also consider that you’ve only got half the sky to see at any time, so that’s a factor of 2 right there. Then consider that the largest number of stars (and perhaps a large fraction of the total heat/light output) will be small, red-dwarfish stars that have a large fraction of their energy in the infrared part of the spectrum, so using “visible” light as a comparative measure is shortchanging those stars’ contribution. Not sure whether this factor is minor or major, though.
Anyhow, if you don’t like my back of the envelope calculation, consider it an upper bound and divide it by 5/10/100 or whatever number suits your fancy to get the real number.
If somebody wants to search for terms like total integrated sky magnitude Milky Way, they can probably find the sky’s total magnitude. Then find the sun’s magnitude, take the difference, and raise 2.512 to that power (the difference), and you’ll have a better answer.
Doing a quick and dirty internet look for integrated sky magnitude, take my 400,000 and divide it by about 250. Keep in mind the red-dwarf thing may (or may not) make a big difference.
Wien’s Law is if the emitter is a blackbody. Non-blackbodies will have different spectra, and it’s difficult to generalize about all of them, which is why I was so vague in my wording there.
http://en.wikipedia.org/wiki/Apparent_magnitude - so I had to go look this up.
–26.74 - Sun (398,359 times brighter than mean full moon)
–12.92 - Maximum brightness of full Moon (mean is –12.74)
–1.47 - Brightest star (except for the Sun) at visible wavelengths: Sirius
6.50 - Approximate limit of stars observed by a mean naked eye observer under very good conditions. There are about 9,500 stars visible to mag 6.5.
Note that the scale is such that 5 magnitudes is 100x brighter.
All stellar bodies have approximately the same distribution. The “redder” a body, the longer the peak wavelength of its electromagnetic spectrum emissions.
Assume all ~10,000 visible stars average out to magnitude 6 or so; 10,000 such stars is a factor of 10,000 in brightness, or 10 magnitudes, meaning the total starshine (no moon) would be about magnitude -4, less than 1/100 the full moon. In fact there are a lot more stars near the magnitude-6 limit than near magnitude 0, but we also have to take into account galaxies, the Milky Way, etc…
Still, if the stars together are (400,000 x 100) or more times less bright than the sun, their total heat contribution would also likely be in that ballpark, ignoring also whether certain wavelengths are more likely to be absorbed by interstellar dust.
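The "10,000 stars at magnitude 6" step combines magnitudes, which add via fluxes rather than directly; a sketch of that arithmetic, using the rough star count and the -12.74 full-moon figure quoted above:

```python
import math

# Combining N equal sources: the total flux is N times one source,
# so the combined magnitude is m_single - 2.5*log10(N)
# (brighter = more negative on the magnitude scale).

def combined_magnitude(m_single, n_sources):
    """Apparent magnitude of n identical sources of magnitude m_single."""
    return m_single - 2.5 * math.log10(n_sources)

m_total = combined_magnitude(6.0, 10_000)
print(m_total)  # -4.0: 10,000 mag-6 stars together shine at about mag -4

# Compare with the mean full Moon at about magnitude -12.74:
moon_over_stars = 100 ** ((m_total + 12.74) / 5)
print(f"full Moon is ~{moon_over_stars:.0f}x brighter than all visible stars")
```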
That law applies to calculating the solar radiation at a given distance from the sun. For instance, the Earth’s radiation level (nearly all of it in the visible light spectrum) is 1365 watts per square meter. All one has to do to figure out the level for, say, Venus is take the ratio of the squares of the distances, and you get about 2,600 watts per square meter. Same thing for Mercury: about 9,400 watts per square meter. So I created a spreadsheet using the known luminosity difference (the ratio) of the star to our sun, then assumed a spherical shell equal to the distance from Earth to the star in question. Sirius, Rigel, I did it for several; the wattage was usually something like 10^-4 watts per square meter. For radiation sources that far away, I don’t know if there are other factors at work, but the tiny numbers seemed reasonable. Some were a milliwatt or so. I don’t know why it “shouldn’t work”.
I think you need to check your work.
The distance from the Earth to the nearest star (other than the Sun) is about 4 LY, or around 2.4x10^13 miles. That’s around 250,000 times farther than the distance from the Earth to the Sun, so the Earth should receive about 1/(6x10^10) as much radiation from that star (assuming a Sun-like star). So, instead of 1 KW/m^2, we would receive on the order of tens of nanowatts/m^2.
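The inverse-square check above can be sketched directly; this assumes a Sun-like star at roughly 4 light-years (Alpha Centauri territory) and ignores everything except geometric dilution.

```python
# Inverse-square dilution of sunlight at interstellar distance.
# Assumes a Sun-like star at roughly 4 light-years.

SOLAR_CONSTANT = 1361.0   # W/m^2 received at 1 AU from the Sun
AU_M = 1.496e11           # metres in one astronomical unit
LY_M = 9.461e15           # metres in one light-year

distance_au = 4.0 * LY_M / AU_M          # ~4 ly expressed in AU
flux = SOLAR_CONSTANT / distance_au**2   # W/m^2 at Earth from that star

print(f"{distance_au:.3g} AU away -> {flux:.3g} W/m^2")
# ~2.5e5 AU, giving a few tens of nanowatts per square metre
```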
Another factor is that chemically and structurally stable objects can only get so hot before disintegrating in some way. At that limit (a few thousand Kelvin), most of the radiation is still in the infrared (hence why incandescent bulbs are so inefficient in the visible spectrum).
Hotter things like plasmas can put out larger fractions of high-frequency EM radiation, but we don’t tend to think of those as “objects”.
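The incandescent-bulb point can be sketched by integrating Planck's law over the visible band; this treats the filament as an ideal blackbody at an assumed 2900 K, which real filaments only approximate, so the exact percentage is illustrative.

```python
import math

# Fraction of an ideal blackbody's radiated power that falls in the
# visible band (~380-750 nm), by numerical integration of Planck's law.

H = 6.626e-34     # Planck constant, J*s
C = 2.998e8       # speed of light, m/s
KB = 1.381e-23    # Boltzmann constant, J/K
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def planck(lam, T):
    """Spectral radiant exitance per unit wavelength, W/m^3."""
    return (2 * math.pi * H * C**2 / lam**5) / math.expm1(H * C / (lam * KB * T))

def visible_fraction(T, lo=380e-9, hi=750e-9, steps=2000):
    """Midpoint-rule integral over the visible band, divided by sigma*T^4."""
    dlam = (hi - lo) / steps
    visible = sum(planck(lo + (i + 0.5) * dlam, T) for i in range(steps)) * dlam
    return visible / (SIGMA * T**4)

print(f"{visible_fraction(2900):.1%} of a 2900 K blackbody's output is visible")
```

The answer comes out around a tenth of the total output, with the rest mostly infrared, which is consistent with the point above about incandescent bulbs being inefficient light sources.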
That’s far, far too high. You wouldn’t even get that from the full Moon.