If I pointed a laser pointer at the moon... (Hypothetical)

If you spread an ordinary laser-pointer dot over a 1-foot diameter, then you’ll have a hard time seeing the reflection of that dot off of a surface. But if your eye is in that one-foot circle, you would certainly see it.
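For a sense of scale, here’s a rough back-of-the-envelope sketch (the 5 mW power and the 7 mm dark-adapted pupil are my assumptions for illustration, not figures from this thread):

```python
import math

# A 5 mW pointer spread evenly over a 1-foot circle, viewed with a
# dark-adapted ~7 mm pupil (both figures assumed for illustration).
power_w = 5e-3                      # laser power (W)
spot_diam_m = 0.3048                # 1 foot
pupil_diam_m = 7e-3                 # dark-adapted pupil

spot_area = math.pi * (spot_diam_m / 2) ** 2
pupil_area = math.pi * (pupil_diam_m / 2) ** 2

irradiance = power_w / spot_area            # W/m^2 inside the 1-foot circle
power_into_eye = irradiance * pupil_area    # W entering the pupil

print(f"Irradiance in the spot: {irradiance*1e3:.1f} mW/m^2")    # ~69 mW/m^2
print(f"Power entering the pupil: {power_into_eye*1e6:.1f} uW")  # ~2.6 uW
```

A couple of microwatts into a dark-adapted eye is many orders of magnitude above the detection threshold, so “certainly see it” is, if anything, an understatement at night.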

Not just see it, but be night-blinded by it.

Not necessarily. At least in this country, the severity of the crime is the same for both weak and powerful lasers. Please experiment by pointing lasers at trees in a forest, not at aircraft in the sky.

A good flashlight can put out 20W, and you could see even the oldest/cheapest from a lot farther than 1000 feet. Being able to ‘see’ a 5mW laser pointer at 1000’ is itself an interesting factoid, but I’d always wondered about the dispersion (and I’ve never owned a laser pointer that was 5mW: unless you’re in a big room, you don’t need that much for laser pointing).

You can see street lights and billboard lights (which point up and are a notorious source of light pollution) and flashlights and other light sources when you look down, and the whole sky glows with diffuse light, all of which makes me ‘night blind’ to a noticeable extent. If you’ve been more night-blinded by a laser pointer, at any altitude, I suggest that it wasn’t a laser pointer such as those used for classrooms.

You are making two fundamental errors.

  1. A good flashlight is not putting out 20W of light. Maybe it’s emitting 5W if it’s extremely efficient.
  2. A flashlight has a beam divergence of many degrees. A 5 mW laser pointer is around 100x more collimated.

Try it. Go to the end of your block, have someone shine a flashlight at you and then a laser pointer, and tell me which one appears brighter. The difference will be even more pronounced at a mile. (A rough radiant-intensity comparison is sketched below.)
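Here’s the sketch, with illustrative assumptions: ~2 W of optical output into a ~20° beam for the flashlight, and ~1 mrad full-angle divergence for the pointer.

```python
import math

# On-axis radiant intensity (W per steradian); all figures are
# illustrative assumptions, not measurements.
flashlight_power_w = 2.0                    # optical output of a bright flashlight
flashlight_half_angle = math.radians(10)    # ~20 degree full beam

laser_power_w = 5e-3                        # 5 mW pointer
laser_full_divergence = 1e-3                # ~1 mrad full-angle divergence

# Solid angle of a cone of half-angle a: 2*pi*(1 - cos a)
flash_sr = 2 * math.pi * (1 - math.cos(flashlight_half_angle))
laser_sr = 2 * math.pi * (1 - math.cos(laser_full_divergence / 2))

flash_intensity = flashlight_power_w / flash_sr   # ~21 W/sr
laser_intensity = laser_power_w / laser_sr        # ~6400 W/sr

print(f"Flashlight: {flash_intensity:.0f} W/sr")
print(f"Pointer:    {laser_intensity:.0f} W/sr")
print(f"Pointer is ~{laser_intensity / flash_intensity:.0f}x brighter on-axis")
```

Even though the flashlight emits hundreds of times more power, the pointer packs its few milliwatts into such a tiny solid angle that it wins on-axis by a factor of a few hundred under these assumptions.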

2% of 20W is 400mW. A typical classroom laser pointer was less than 1mW. I wondered about the divergence of a laser pointer: I think it’s safe to assume that the divergence of a 0.25mW laser pointer designed for use at 3-10 feet is even worse than that of a 5mW laser pointer designed to be ‘legal in the USA’.

According to engineer Samuel M. Goldwasser, who maintains an extensive Web site about lasers called Sam’s Laser FAQ, if you were to look directly into a laser-pointer beam from a mile away, it would appear as bright as a 100-watt bulb seen at a distance of less than 100 feet. Most people would find such a bright light very uncomfortable and would instinctively blink and/or turn away.

That is a very good source, which directly addresses the question.

At a distance of 1 mile (1,609 m), the beam from a typical helium-neon laser (which is a quite well collimated source) will have spread to a diameter of roughly 4 feet.

The calculations indicate that a 5mW laboratory laser at 1 mile is quite bright (as bright as a porch light several houses away). This is much brighter than I expected, and would be much worse at 1000’.
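Here’s a rough way to check the Goldwasser comparison, assuming ~5 W of visible output from a 100 W incandescent bulb and the ~4-foot spot quoted above (both numbers are assumptions on my part):

```python
import math

# 5 mW spread over a ~4 ft spot at 1 mile, vs. a 100 W bulb radiating
# ~5 W of visible light in all directions (assumed figure).
laser_power_w = 5e-3
spot_diam_m = 1.22                          # ~4 feet
bulb_visible_w = 5.0

laser_irradiance = laser_power_w / (math.pi * (spot_diam_m / 2) ** 2)   # W/m^2

# Distance at which the bulb delivers the same irradiance:
#   bulb_visible_w / (4*pi*r^2) = laser_irradiance
r = math.sqrt(bulb_visible_w / (4 * math.pi * laser_irradiance))

print(f"Laser irradiance at 1 mile: {laser_irradiance*1e3:.1f} mW/m^2")  # ~4.3
print(f"100 W bulb matches that at ~{r:.0f} m (~{r*3.28:.0f} ft)")       # ~10 m
```

That lands around 30 feet, which is in the same ballpark as the “less than 100 feet” figure (ignoring the eye’s different sensitivity to a monochromatic laser line versus the bulb’s broadband light).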

But if a typical 10mW helium-neon laboratory laser is spreading to 4 feet in a mile, I’m taking that as evidence that a $10 laser pointer such as I used to use is not going to spread to anything close to only 5 feet in a mile (as suggested above). Typical laser diodes are known for their wide divergence: whatever it is, it’s much worse than a typical helium-neon laser.
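For reference, far from the source the spot just grows linearly with the divergence angle. The divergence figures below are typical published values, assumed for illustration, not measurements of any particular pointer:

```python
# Far from the laser, spot diameter ~ full-angle divergence (rad) * distance.
# Divergence values are typical published figures, assumed for illustration.
mile_m = 1609.0
for label, div_mrad in [("HeNe lab laser",       0.8),
                        ("decent diode pointer",  1.5),
                        ("cheap diode pointer",   3.0)]:
    spot_m = div_mrad * 1e-3 * mile_m
    print(f"{label:21s}: ~{spot_m:.1f} m (~{spot_m*3.28:.0f} ft) at one mile")
```

Roughly 0.8 mrad gives the ~4 feet quoted for a HeNe; a couple of milliradians from a cheap collimated diode already puts you in the 8-16 foot range at a mile.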

Not sure how big the perp’s laser was but…

While it doesn’t deal with the specific scenario in the OP, this XKCD What If post has what looks like some relevant math.

https://what-if.xkcd.com/13/

We’re imagining the wrong sort of laser, if the goal is being able to see it from far away. Laser beams spread, at a minimum, in inverse proportion to their diameter, due to diffraction. A proper laser for seeing from the Moon would have a very wide beam to minimize this diffraction. In practice, lasers can be spread by the equivalent of holding them up to the eyepiece of a telescope. I believe I’ve read this approach was used in several long distance laser experiments.
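A rough diffraction-limit sketch (the 532 nm wavelength and the aperture sizes are assumptions) shows why the outgoing beam needs to be wide:

```python
# Diffraction floor: full-angle spread is roughly 2.44 * wavelength / aperture.
# Wavelength and aperture sizes are assumed for illustration.
wavelength_m = 532e-9               # green laser
moon_distance_m = 3.84e8            # Earth-Moon distance, metres

for aperture_m in [1e-3, 1e-2, 0.1, 1.0]:
    divergence = 2.44 * wavelength_m / aperture_m          # full angle, rad
    spot_m = aperture_m + divergence * moon_distance_m     # spot diameter at the Moon
    print(f"{aperture_m*100:6.1f} cm aperture -> ~{spot_m/1000:.1f} km spot on the Moon")
```

Going from a millimetre-wide pointer beam to a metre-wide expanded beam shrinks the spot on the Moon from hundreds of kilometres to well under one, which is presumably why the long-distance experiments mentioned above fire through telescope optics.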

Also, most laser diodes (which emit spreading light and need optics to collimate that light into a beam) have the interesting property that, if you trace the light rays back to the diode, the point at which the rays cross (the point from which they appear to emanate) is more or less distant according to the plane you choose in which to trace them. It’s a form of astigmatism. A really good telescope system or lens system for preparing a Moon beam would also have a compensating astigmatism built in.

There’s a discussion of beam expanders here at Edmund:

There are certainly different thresholds for “can see a light source”, “bright enough to be dangerous”, and “bright enough to cause direct damage”. I’m 100% confident that a cheap low-power laser pointer, at airplane range, is at least in the first category. Whether it’s in the second, I never did the calculations. And it’s almost certainly not in the third.
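To put hedged numbers on those three thresholds (the divergence, the distance, and the reference levels below are assumptions drawn from general laser-safety literature, not from this thread):

```python
import math

# Irradiance of a 5 mW pointer at roughly airplane range.
power_w = 5e-3
divergence_rad = 1.5e-3             # full-angle divergence, assumed
distance_m = 305.0                  # ~1000 ft

spot_diam_cm = divergence_rad * distance_m * 100
irradiance_uw_cm2 = power_w * 1e6 / (math.pi * (spot_diam_cm / 2) ** 2)

print(f"Spot diameter: ~{spot_diam_cm/100:.2f} m")      # ~0.46 m
print(f"Irradiance: ~{irradiance_uw_cm2:.1f} uW/cm^2")  # ~3 uW/cm^2
# Commonly cited reference points (approximate): seeing the dot takes tiny
# fractions of a uW/cm^2; glare and flash-blindness effects are reported from
# a few uW/cm^2 up to ~100 uW/cm^2; eye-damage limits for a 0.25 s visible
# exposure are around 2500 uW/cm^2.
```

So under these assumptions a cheap pointer at that range is comfortably visible, plausibly in the glare/distraction band, and roughly three orders of magnitude below the damage level, which matches the ordering above.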

IIRC, in clear conditions and without competing light sources, you can see a candle from 50 miles away. It doesn’t take much at all to be able to see a light source.