I think it’s more like “we know how much stuff we need to shove in a resistor package to make it 100 ohms, but the machine that makes them is only capable of hitting that value to within a 10% tolerance. So we’ll call them 100 ohm resistors, but with a 10% tolerance.” It would be impractical to test the actual resistance of each resistor coming off the manufacturing line; these things are turned out by the bazillion.
I would guess that an average sample of resistors would display a normal(ish) distribution of resistance centered around the value on the package.
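Just to put rough numbers on that picture, here’s a quick back-of-the-envelope sketch in Python. The Gaussian assumption, and the idea that the 10% band sits at roughly three sigma, are mine rather than anything from a datasheet:

    import random

    nominal = 100.0                    # ohms
    tolerance = 0.10                   # the 10% band
    sigma = nominal * tolerance / 3    # assumption: the band is roughly +/- 3 sigma

    batch = [random.gauss(nominal, sigma) for _ in range(100_000)]
    in_band = sum(abs(r - nominal) <= nominal * tolerance for r in batch)
    print(f"{in_band / len(batch):.2%} of the batch lands inside +/-10%")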
High-precision resistors are made by manufacturing resistors that deliberately start a little below the target value, then using a laser to trim material away, which nudges the resistance up until it is within 0.1% or so of the desired value. These are a lot more expensive than ordinary resistors because of how time-consuming they are to make.
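As a toy illustration of that measure-and-trim loop (the starting offset and the size of each cut are numbers I made up for the example, not real process figures):

    import random

    target = 100.0                                # ohms
    value = target * random.uniform(0.90, 0.97)   # the part starts deliberately low
    while abs(value - target) / target > 0.001:   # keep cutting until within 0.1%
        value += target * 0.0005                  # each laser cut nudges the resistance up a bit
    print(f"trimmed to {value:.3f} ohms, {(value - target) / target:+.3%} off nominal")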
10% tolerance in resistors is actually pretty lousy, as would be 5%.
You’ll find that in practice you will usually do somewhat better than 3%; it’s just that you don’t see a 3% or lower tolerance band.
Part of this dates from valves, where you might be dealing with high resistor values, up in the 1.5M plus range. In those days QC was so poor that 10% was a pretty good figure, so the two tolerance bands were set at 5% and 10%.
Nowadays we can do much better. In analogue circuits we tend to use so much negative feedback, across the many different types of amplifier, that it reduces the effect of loose tolerances; and in digital circuits all we are usually after is a way to pull a node up or discharge it within a specified time, so really tight tolerance is not that important.
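To put a rough number on the feedback point, here’s a quick check using the standard closed-loop gain formula A/(1 + Aβ), with gain figures I’ve picked arbitrarily: even a 4:1 swing in the open-loop gain barely moves the closed-loop gain.

    def closed_loop(a_open, beta):
        # standard closed-loop gain of a negative-feedback amplifier
        return a_open / (1 + a_open * beta)

    beta = 0.01                                  # feedback fraction; ideal gain ~ 1/beta = 100
    for a_open in (50_000, 100_000, 200_000):    # open-loop gain swinging over a 4:1 range
        print(f"A_open = {a_open:>7}  ->  A_closed = {closed_loop(a_open, beta):.3f}")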