I think what you are looking for is illuminance (in lux), not apparent magnitude? Sunlight at noon on a clear day is about 120,000 lux, and at sunrise and sunset, about 400 lux. Of course, the sky appearing blue is a result of the atmosphere itself as well as the angle of the sun. Also, does your gas giant emit its own light, or is it reflecting the sun?
This doesn’t directly answer the OP’s question, which is about naked-eye appearance, but the sky is always blue, even at night. With a sufficiently long exposure, you can take pictures that look like an ordinary daytime scene.
Are they normal humans living on this moon? If they are aliens, you could assume that their eyes are adapted for however much light the gas giant provides. If they have giant lemur-eyes, what looks like deep twilight to us could be bright daylight to them, in which case the sky would look like a daylight sky.
Lux may be a more useful unit in this context, but it’s easy enough to convert.
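For anyone who wants to do the conversion themselves, here’s a quick Python sketch. The zero point I’m using (an object of apparent magnitude about −14.18 delivering 1 lux) is the commonly quoted value, but sources differ slightly on the exact constant, so treat the results as ballpark figures:

```python
import math

def magnitude_to_lux(m):
    """Convert apparent magnitude to illuminance in lux.

    Assumes the commonly quoted zero point: magnitude -14.18
    corresponds to about 1 lux. The exact constant varies a
    little between sources.
    """
    return 10 ** (-0.4 * (m + 14.18))

def lux_to_magnitude(lux):
    """Inverse conversion: illuminance in lux to apparent magnitude."""
    return -14.18 - 2.5 * math.log10(lux)

# Sanity checks against familiar values:
sun = magnitude_to_lux(-26.74)       # ~1e5 lux, right ballpark for noon sunlight
full_moon = magnitude_to_lux(-12.7)  # ~0.25 lux, about right for a full moon
```

Since magnitudes are logarithmic and lux are linear, each 2.5-magnitude step is exactly a factor of ten in illuminance, which makes the round trip easy to check.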
I had only considered the light the gas giant would be reflecting, not emitting. I don’t think it would be emitting light in the visible spectrum, though. (Though, as Sam notes below, the native life may have adapted to see the infra-red radiation it would emit.)
It’s another star, but it’s sufficiently Sun-like that I’ve been using its mass and luminosity to estimate orbits and appearances.
When I calculated how bright the gas giant would appear, I just treated its albedo as a single number, as if it reflected all frequencies equally. How big of a difference do you think that would make?
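For total brightness, treating the albedo as one gray number is probably fine; a wavelength-dependent albedo mostly changes the color of the light, not how much of it there is. Here’s a rough full-phase estimate in Python. The geometric-albedo treatment is a simplification (phase effects and limb darkening are ignored), and the Jupiter-from-Europa numbers at the end are just my own approximate sanity check:

```python
def reflected_illuminance(e_star_lux, geometric_albedo,
                          planet_radius_km, moon_orbit_km):
    """Rough illuminance (lux) on a moon from its full gas giant.

    Crude full-phase estimate: starlight arriving at the planet is
    reflected with the given geometric albedo, and the illuminance
    at the moon scales as (planet radius / orbital distance)^2.
    Ignores phase angle and wavelength-dependent albedo.
    """
    return e_star_lux * geometric_albedo * (planet_radius_km / moon_orbit_km) ** 2

# Sanity check: full Jupiter seen from Europa (all values approximate).
# Sunlight at Jupiter is noon sunlight diluted by 5.2^2:
e_at_jupiter = 130_000 / 5.2 ** 2
full_jupiter = reflected_illuminance(e_at_jupiter, 0.52, 71_492, 670_900)
# ~30 lux: dim-room lighting, but far brighter than any full moon on Earth
```

So a close-in moon of a bright gas giant plausibly gets tens of lux at "full planet," which is well into read-a-book territory.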
Absolutely. I haven’t worked out any details of the other moons, but I’m sure at least one would be visible in broad daylight, under the right conditions.
The question is: when the gas giant is full, how bright would another object need to be to remain visible? Bright moons would definitely be visible. What about dim moons? Other planets? Other stars?
I’m not sure that would make a difference for this. No matter how good your eyes are, you can’t see stars during the day. And you don’t need particularly good eyes to be able to see the moon during the day.
The gas giant, providing less illumination than the sun, would obscure other objects less. The question is, how much less?
FWIW, I just tried Stellarium, and it can simulate Earth’s atmosphere, accounting for distance (meaning the sky looks different from Earth than it does from Pluto’s distance). It also changes the visibility of stars depending on where the Sun is, but I can’t tell if it’s accurately simulating the effects of a nearby gas giant.
Here’s a screenshot from Europa looking at Jupiter, with the Sun coming in from the right in the late afternoon of its day. You can see (1) the effects of atmospheric refraction, from blue to yellowish, (2) the obscuring of stars by the Sun, and (3) the glow of Jupiter, though its brightness didn’t obscure anything else that I could notice.
Last post on this: Celestia might do atmospheres better. It simulates Titan’s atmosphere independently of Earth’s, and visualizes both sunlight and planetshine from Saturn. If you add your own planets and orbit them around the sun and give them appropriate atmospheres, I bet you can get a pretty good idea of how the sky would look.
No, because in daylight the scattered light in the sky is brighter than the stars and washes them out. That’s the only reason you can’t see them. If they’re bright enough, they can be seen even in daylight. Venus and Jupiter can both be seen during the day under the right conditions, as can the full moon.
If the alien’s eyes were adapted to brighten all light sources, the brightness of the sky would stay the same relative to the background stars (i.e., both would brighten). So the stars would be visible even if the sky is blue, I would think.
The alien’s eyes aren’t seeing more light because they see into the infra-red, but simply because their eyes are bigger and therefore have more light-gathering ability. They might also see into the infra-red, but the atmosphere also tends to filter out a lot of IR.
I would think being in the direct reflected light of a gas giant would be pretty bright, depending on how far away from the sun it is. After all, the Moon is a very tiny fraction of that size, and a full moon can light up the night bright enough to read by in the right conditions. Jupiter is incredibly bright in the night sky, and we’re seeing light that’s going all the way there and coming all the way back.
Also, even if the light was only 10% as bright as direct sunlight, we’d perceive it almost the same because our irises would simply open up more. Room light is a tiny fraction of full daylight, but once our eyes adapt to room light it feels plenty bright to us.
No, I don’t think so. How many stars you can see is not a function of resolution, because all stars are point sources to us anyway. We can’t see their discs. How many stars are visible is purely a function of light-gathering capability.
I can take an image with a tiny point-and-shoot camera with a crappy lens, but if I leave the shutter open for, say, 30 seconds I’ll see a LOT of stars in the image. You can take beautiful images of the milky way with a small camera this way. It’s all about how much light you can gather.
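That scaling is easy to put numbers on: light collected goes as aperture area times integration time. A quick sketch, where the ~0.1 s "integration time" for the eye and the 10 mm camera aperture are just illustrative assumptions:

```python
import math

def relative_light_gathered(aperture_mm, exposure_s):
    """Relative light collected: aperture area times integration time."""
    return math.pi * (aperture_mm / 2) ** 2 * exposure_s

# A dark-adapted human pupil (~7 mm) integrating over roughly 0.1 s,
# versus a modest 10 mm camera aperture held open for 30 seconds:
eye = relative_light_gathered(7, 0.1)
camera = relative_light_gathered(10, 30)
ratio = camera / eye  # ~600x more light for the camera
```

That factor of several hundred is why a cheap camera on a 30-second exposure pulls in stars no naked eye ever could, and it’s the same lever a big-eyed alien would be pulling, just less dramatically.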
Getting back to the blue sky and whether or not stars would be visible: I’ve taken plenty of long-exposure images where the result is what looks like a bright blue sky during daylight, yet with the sky peppered with stars.
If that exposure had been even longer, you’d see even more ‘daylight’ - and even more stars. If you were standing there looking at the scene with bare eyes, you’d see a black sky with stars, and perhaps some moonlight on the trees. Judging by the gradient to the skyglow, this was probably shot at dawn or dusk where just a hint of light might be seen on the horizon but with naked eyes the sky would still be black with maybe a hint of blue or red towards the bottom. But a nocturnal animal (or an alien with big eyes on a dim planet) would probably see something like the linked image.
Yes, which is precisely why resolution matters. The amount of light you get from one pixel (or other limiting resolution element) doesn’t change, since the star always fits into a pixel. But the amount of background light you’re competing with in that pixel does change, being more or less uniform across your field.
Now, unless you’re very clever about compensating for it, your resolution is going to be ultimately limited by the atmosphere you’re looking through, and that limit is coarse enough to prevent seeing most stars through daylight. But there are in fact clever ways of compensating for that.
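You can see the resolution effect with a toy calculation: the star’s light all lands in one resolution element, while the sky contributes in proportion to that element’s area. All the specific numbers below are hypothetical, chosen just to show the scaling:

```python
import math

def star_vs_sky_ratio(star_mag, sky_mag_per_arcsec2, seeing_arcsec):
    """Ratio of a star's flux to the sky background flux inside one
    seeing-limited resolution element.

    The star's light all falls in that element; the background grows
    with the element's area, so sharper seeing directly improves
    point-source contrast.
    """
    star_flux = 10 ** (-0.4 * star_mag)                  # arbitrary flux units
    sky_flux_per_arcsec2 = 10 ** (-0.4 * sky_mag_per_arcsec2)
    area = math.pi * (seeing_arcsec / 2) ** 2            # seeing-disk area
    return star_flux / (sky_flux_per_arcsec2 * area)

# Hypothetical case: a magnitude-2 star against a daytime-bright sky,
# with sloppy 2-arcsecond seeing versus sharp 0.5-arcsecond seeing:
poor = star_vs_sky_ratio(2.0, 4.0, 2.0)
good = star_vs_sky_ratio(2.0, 4.0, 0.5)
# good / poor == 16: shrinking the resolution element's diameter by 4x
# cuts the competing background light by 16x
```

The star flux cancels out of the comparison, which is the point: against a uniform sky, contrast for a point source is purely a question of how small a patch of sky you’re forced to compete with.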
Well, now you’re talking about astronomy and apparent brightness, which does change with resolution. And yes, the atmosphere is a limiting factor in that regard (although it may not be so much on an alien world - ‘seeing’ conditions change drastically even on Earth depending on the turbulence overhead, which in turn depends on things like where the jet stream is). And I doubt that an alien is going to have eyes that do anything like image stacking or adaptive compensation to eliminate atmospheric disturbances. So we’re really nitpicking now.
But here’s a factor much more important: How much moisture and particulate matter is in the atmosphere? Transparency is a better measure of how many stars you’ll see, because if the sky isn’t transparent you wind up with a lot of scattered light (“skyglow”), and that determines the magnitude of stars that you can see. I do optical astrophotography from within a city, and on transparent nights I can get amazing images. If the sky isn’t very transparent due to humidity or other conditions, I am very much limited in how long an exposure I can take before skyglow wipes out everything. And it doesn’t have to be city lights, either. A full moon and a humid sky can make it impossible to take long-duration images.