Basically, the focal spot's diameter grows linearly with the lens's distance, so its area grows as the square of that distance.
So, let’s say you have a magnifying glass placed a foot from the ground that perfectly focuses the sun’s image, and therefore its heat, onto a spot 5mm in diameter.*
Now, move that lens to 2 feet from the ground, and that focal spot, which is really just a sharply focused image of the sun, doubles to 10mm across. Its area is now four times larger (2x2=4), so the same light is spread four times thinner, and the intensity drops by the inverse square. So, if at a foot off the ground the spot produced heat of 500°F, then at two feet the 10mm spot would be 4x less intense, topping out around 125°F in this toy example. It’d also be a quarter as bright, too.
So, keep going: at three feet from the ground (3x3=9), the spot can’t get any smaller than 15mm across, and it’s 9x less hot and bright than at 1 foot (roughly 55.5°F in this toy example).
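The scaling above can be sketched in a few lines (the 5mm spot and "500" intensity at 1 foot are just the arbitrary example numbers from this post, not real measurements):

```python
# Toy model: how a lens's focused solar spot scales with distance.
# Baseline values are the arbitrary example numbers, not physics data.

BASE_DIAMETER_MM = 5.0   # spot diameter at 1 foot
BASE_INTENSITY = 500.0   # arbitrary intensity units at 1 foot

def spot_at(distance_feet):
    diameter = BASE_DIAMETER_MM * distance_feet   # diameter grows linearly
    area_factor = distance_feet ** 2              # area grows as the square
    intensity = BASE_INTENSITY / area_factor      # intensity falls as inverse square
    return diameter, intensity

for d in (1, 2, 3):
    dia, inten = spot_at(d)
    print(f"{d} ft: spot {dia:.0f} mm across, intensity {inten:.1f}")
```

Running it reproduces the 5mm/10mm/15mm progression and the 500 → 125 → ~55.5 intensity falloff.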
So you can see that at 250 miles up, this weapon would have to be massive and accurate, especially to overcome the insulating effect of the atmosphere.
And like billfish678 says, it’s because you’re really just projecting an image of the sun. The sun isn’t a point source; it has an angular diameter in the sky (about half a degree), so at a given distance a lens or mirror can only project a non-diffuse (non-blurry) image of the sun of a minimum size, and therefore a maximum intensity, set by that angular size.
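That minimum works out to roughly distance × the sun's angular size in radians (the sun's ~0.53° angular diameter is a real figure; the 1-foot and 250-mile distances are just the examples from this thread):

```python
import math

# Minimum focused-spot diameter set by the sun's angular size (~0.53 degrees).
SUN_ANGULAR_DIAMETER_RAD = math.radians(0.53)

def min_spot_diameter(distance_m):
    # Small-angle approximation: spot diameter ~= distance * angular size
    return distance_m * SUN_ANGULAR_DIAMETER_RAD

# Magnifying glass one foot (0.3048 m) above the ground: a few mm
print(f"{min_spot_diameter(0.3048) * 1000:.1f} mm")

# Orbital mirror 250 miles (~402,336 m) up: kilometers wide
print(f"{min_spot_diameter(402336) / 1000:.1f} km")
```

That kilometers-wide minimum spot is exactly why the orbital version would have to be so enormous to concentrate any meaningful heat.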
At least I think! Please correct my understanding if I’m wrong!
*these numbers are arbitrary just to keep it simple.
ETA: looking at billfish’s last post, the focal spot’s diameter increases linearly with distance; it’s the spot’s area (and the inverse-square drop in intensity) that goes as the square.