The main thing is the power level. At the power levels wifi systems are allowed to use, you are talking ranges of maybe a couple hundred feet, while cell phone towers reach a couple of miles. That's a big difference: a single cell tower covers a sizable chunk of a city or town, while a wifi access point isn't going to reach much of anything outside the home or business where it's installed.
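To put rough numbers on that, here's a back-of-the-envelope sketch using the textbook free-space path loss formula. The power levels (100 mW for a wifi access point, 40 W for a tower), frequencies, and distances are illustrative assumptions, and free space is the best case; real walls, ground clutter, and humidity only make wifi's reach shorter.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB (idealized: no walls, no obstacles)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

def received_dbm(tx_dbm, distance_m, freq_hz):
    """Received signal strength in dBm, ignoring antenna gains."""
    return tx_dbm - fspl_db(distance_m, freq_hz)

# Assumed, illustrative numbers: a ~100 mW (20 dBm) wifi access point
# versus a ~40 W (46 dBm) cell tower sector.
print(received_dbm(20, 60, 2.4e9))    # wifi at ~200 ft:   about -56 dBm (fine)
print(received_dbm(20, 3200, 2.4e9))  # wifi at ~2 miles:  about -90 dBm (marginal)
print(received_dbm(46, 3200, 1.9e9))  # tower at ~2 miles: about -62 dBm (fine)
```

Around -90 dBm is roughly the floor of what a typical receiver can still decode, so even in this idealized model the wifi signal is already marginal at tower-like distances, before any real-world obstacles get involved.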
Wifi also ended up on frequencies that are fairly crappy as far as long-distance communication goes. The band from roughly 1 to 10 GHz is absorbed fairly well by water, especially around the 2.4 GHz range that wifi uses (wifi also uses 3.6 and 5 GHz, which aren't much better). This means you really couldn't pick much worse frequencies to transmit on, since water vapor in the air tends to cut down your signal, especially in areas of high humidity. Not coincidentally, 2.4 GHz is also the frequency your microwave oven uses.
The folks who make wifi would have loved to use a better frequency range, but the spectrum is pretty crowded, so they had to take what they could get. The fact that water vapor cuts down the signal actually works to their advantage, though, as it further decreases the likelihood that one wifi hotspot will interfere with another nearby.
Wifi protocols are also fairly tolerant of errors. If two wifi systems do interfere with each other, chances are everything will still work, but you may notice reduced data speeds, because garbled packets get detected and re-sent, and every retransmission eats into your usable airtime.
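To get a feel for why interference shows up as slowness rather than outright failure, here's a toy stop-and-wait retransmit loop. This is not wifi's actual MAC protocol (real wifi does link-layer ACKs and retries with far more sophistication); the 30% corruption rate and the payload are made-up numbers for illustration.

```python
import random
import zlib

def send_with_retries(payload: bytes, corruption_rate: float) -> int:
    """Toy stop-and-wait sender: resend until the checksum survives the trip.

    Returns how many transmissions the packet took.
    """
    checksum = zlib.crc32(payload)
    attempts = 0
    while True:
        attempts += 1
        received = bytearray(payload)
        if random.random() < corruption_rate:        # interference garbles a byte
            received[random.randrange(len(received))] ^= 0xFF
        if zlib.crc32(bytes(received)) == checksum:  # receiver's integrity check
            return attempts                          # "ACK": packet got through

# Hypothetical scenario: 1000 packets, 30% of them garbled by a neighbor.
random.seed(42)
sends = [send_with_retries(b"some wifi frame payload", 0.30) for _ in range(1000)]
print(f"avg transmissions per packet: {sum(sends) / len(sends):.2f}")
```

Every packet eventually arrives intact, but at roughly 1.4 transmissions per packet you're burning about 40% more airtime than a clean channel would need, which is exactly what "it still works, just slower" looks like.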
Between the low power levels, poor signal propagation through the atmosphere, and fault tolerance inherent in the protocols used, the FCC doesn’t see any need to license wifi systems to prevent interference with nearby systems.
Cell phone towers, by contrast, can easily interfere with their neighbors, so the FCC licenses them to make sure they don't all cause each other problems.