Net neutrality: now what?

And what are you? You say you’re an engineer, but it’s hard to believe anyone looking rationally at the progression of wireless could say it sucks and will suck forever, end of story. You seem to be talking mainly about WiFi, then you switch to satellite as if they work the same way, and you’re extrapolating from that to any and all future technologies to make the blanket claim that this is all there will ever be.

A couple of things. First, ‘wireless’ means a lot of different things. RF wireless certainly has some limitations (I believe our microwave systems can only do 10 Gb/s uplinks today, for instance, and the WiFi systems we have using the 802.11ac standard can only do around 1 Gb/s max, and usually run around 50–100 Mb/s on internet speed tests during normal operations, meaning maybe 30–40 people in a given building), but we are nowhere near maxing it out. Something like 802.11 wireless has limitations because it runs on FCC unlicensed bands, so anyone can transmit on the available spectrum. That’s why they use frequency-agile and channel-hopping technology, but sure, there are limitations there. But there are other technologies being looked into besides RF, such as LiFi.

It’s funny, but you seem all futurist at some points, then you’ll say something like this which seems to indicate you really don’t follow this stuff too closely. Did you know that in many countries considered to have very high-speed access they don’t even use traditional landline internet, but mainly get their internet through their cellular networks…which, last I checked, were wireless on the last mile? And did you know that you can use your cellular device even in a city with more than 1000 neighbors without dropping to 0.0001 Mb/s? Yeah, who knew?? The kids these days…

Attenuation problems, IIRC.

Yes. That’s the main reason. The bigger problem is that it would mean ISP techs would need to deal with high voltage: vastly more expensive and dangerous. Over time, wireless tech has gotten to the point that smart meters just use a cellular or mesh-network radio now; earlier smart meters did exist that would communicate at very slow speeds over the power grid itself.

My authority is the laws of physics. These laws hold for all forms of electromagnetic communication. I linked the article which includes a mathematical proof. The reason there are bandwidth caps on cellular internet is this limitation. And yes, they covered this in school, but I’m not an RF engineer.
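For anyone following along, the limit being invoked here is the Shannon–Hartley capacity, C = B·log2(1 + S/N). A minimal sketch, with illustrative numbers I’ve picked myself (a 20 MHz channel at 20 dB SNR; neither figure comes from the linked article):

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: maximum error-free bit rate for a channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative numbers: a 20 MHz channel at 20 dB SNR.
snr = 10 ** (20 / 10)                 # 20 dB -> linear ratio of 100
cap = shannon_capacity_bps(20e6, snr)
print(f"{cap / 1e6:.0f} Mb/s")        # ~133 Mb/s ceiling for that channel

# Shared among N users on the same channel, each gets at most cap / N.
```

The relevance to this thread: capacity grows linearly with bandwidth but only logarithmically with signal power, and whatever capacity a channel has is shared among everyone transmitting on it.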

It’s possible to spatially filter signals, which is why it’s even possible to seemingly do better. The Shannon limit is absolute, but the cell tower can beamform and steer its reception into a cone, so it only “hears” a few devices at once. This lets those devices use all the available spectrum, which is why you do sometimes see speeds of tens of megabits on 4G.

This is not possible to do if all the users are close together, among other problems. And not possible to do from a satellite because of the large distance involved - there is not enough angular difference between each user on the ground.

It is true that there is a physical limit on the amount of data that can be transmitted wirelessly, but luckily that limit is orders of magnitude above our current needs. As I said before, Verizon conducted tests this year and is planning on offering wireless broadband next year.

Why does everyone assume I’m uneducated about it? The government is happy to regulate, and if things go to hell, I don’t doubt that they’ll jump in and start regulating. I understand the risks entirely. But they’re just risks, and all of you pro-net-neutrality types are risk averse. You’re the types who call the police because there’s a man at the park, and who like the TSA because they’re preventing airplane hijackings, and want to disarm honest Americans because some jackass shoots up Las Vegas. If there’s not a crisis, you have to invent one.

Let’s deal with the sky falling when it starts to fall, okay?

One more thing: I just lived in China for five damned years, and the internet sucked, and you know what? Despite that, we got by okay. The lack of net neutrality is nothing compared to the Great Firewall of China. I’ve got experience. Really, this is no big deal. There’s no crisis. Wait and see. If necessary, regulate (or better yet, repeal regulation to promote competition).

Maybe it will require the unhindered abuse by monopoly ISPs to convince voters to actually vote in their own interests. Maybe the next federal administration will have the evidence they need to drop the hammer. Force the cable companies to sell their wires to local and state governments or something.

Wow, really?

Do you know why we have net neutrality now? Because ISPs were screwing people over and the FCC regulated them. Now we’re going back to not regulating them. Why, exactly? What reason is there to do that?

People take risks if they see a possible gain on the other end. What realistic possible gain can you suggest that ending net neutrality might have to offer us? Because, to me, it sounds like “come on, take a risk, drive off that cliff, you might not die”.

That’s false. Targeting an individual person is tricky but a house is relatively easy.

The basic limit here is the Rayleigh criterion, or theta = 1.22*wavelength/diameter.

Suppose houses are 50 meters apart and the satellite is 500 km up. Theta is 0.0001 radians. And say we’re operating at the top of the Ka band, or 40 GHz. That’s a wavelength of 0.0075 m. That makes for a diameter of 91 meters, which is kinda big for a LEO satellite but not completely impossible (this is just the antenna, of course, which can be lightweight and collapsible). You could cut it down to an easy 10 m and still target <100 households.
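A quick sketch to check that arithmetic, using the same Rayleigh criterion and the numbers above (50 m spacing, 500 km altitude, 40 GHz):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def rayleigh_dish_diameter(spacing_m: float, altitude_m: float,
                           freq_hz: float) -> float:
    """Antenna diameter needed to resolve targets `spacing_m` apart
    from `altitude_m` away, via theta = 1.22 * wavelength / diameter."""
    theta = spacing_m / altitude_m     # required beam angle, radians
    wavelength = C / freq_hz
    return 1.22 * wavelength / theta

d = rayleigh_dish_diameter(50, 500e3, 40e9)
print(f"{d:.0f} m")  # ~91 m, matching the figure above
```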

Not to mention that even just a single channel can have an obscene amount of bandwidth. The entire Ka band covers 14 GHz; with a spectral efficiency of (say) 4 bits/Hz, that’s 56 Gb/s shared among the users.
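And the per-household share follows directly (band width and spectral efficiency as assumed above; the 100-households-per-beam figure is from the earlier estimate):

```python
band_hz = 14e9       # full Ka band width assumed above
bits_per_hz = 4      # assumed spectral efficiency
households = 100     # per-beam estimate from the earlier calculation

total = band_hz * bits_per_hz  # total capacity per beam, bits/s
per_home = total / households
print(f"{total / 1e9:.0f} Gb/s total, {per_home / 1e6:.0f} Mb/s per household")
# -> 56 Gb/s total, 560 Mb/s per household
```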

SpaceX’s constellation won’t be this ambitious but they are absolutely going to use beamforming from LEO to increase bandwidth.

I’m not sure phased-array beamforming actually gives you a pinpoint cone optimized to theoretical limits, though. I think to get that you’d need a very long and narrow antenna.

The current satellites are 52 times higher than that, and that cone would thus be 52^2 times larger, right?

And SpaceX won’t have the licenses to the entire Ka band.

Do you have a cite on this by any chance?

I’ve been hearing tangential references for years to SpaceX launching somewhere on the order of 10,000 to 40,000 satellites (if it were any company but SpaceX, I would laugh) to deploy a satellite ISP network offering full-globe coverage, but I’ve yet to see any solid details on this. I’d be very interested in taking a look.

Long and narrow doesn’t work; that just gives you a dipole radiation pattern, which looks like a torus. Not good here.

Broad and flat, with lots of individual elements, is what you want. And the bigger the antenna, the lower the divergence angle (within some limits). Ultimately it comes down to diffraction with the antenna as the aperture.

More like 70x, and so 5,000x the area. They still do beamforming from geostationary orbit, but as you might imagine they target pretty large areas (like bigger than a city).
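For reference, with geostationary altitude at roughly 35,786 km, the ratio works out as:

```python
geo_km, leo_km = 35_786, 500

ratio = geo_km / leo_km  # how much higher geostationary orbit is
area = ratio ** 2        # beam spot area scales with distance squared
print(f"{ratio:.0f}x altitude, {area:.0f}x beam area")
# i.e. roughly the ~70x / ~5,000x quoted above
```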

Sure, I was just giving an example of what’s possible. At those high frequencies, there’s a huge amount of bandwidth around.

Hmm; I can’t find a cite for a phased array antenna on the satellites themselves, but Musk does say this:
The base station would have a phased array antenna with a switching time that’s of the microsecond to low millisecond level. So it would only take a few milliseconds to switch from one satellite to the next. So, as opposed to having a dish that has a slew rate.

I’ll look around a bit more for info on the satellites, although they’re keeping things pretty close to their chest at the moment so some of it is informed speculation.

The initial constellation will be more like 4000 satellites, which is still a hell of a lot. Later they plan on having a second constellation with 7500 satellites. I hadn’t heard any numbers in the 40k range but maybe that’s some very long range plan.

Well, Comcast has gone back on its promise to honor NN no matter what.

Obviously we can expect things to go downhill from here.

I’m not an RF engineer either, nor do I pretend to be. But we extensively use wireless systems in our network, including microwave backbones, wireless point-to-point and point-to-multipoint bridges, WiFi and even private 3G networks, and your concept of the limitations, based on your ‘physics’ and an equation, is misleading. I don’t know if it’s intentionally misleading because you really want to make the case that all technologies except landlines from traditional ISPs are impossible (thus NN is the end of the world!) or you just don’t get it. It’s hard to say with you, as in the various threads I’ve seen you in, sometimes you seem to understand quite a lot and other times…not.

On a 4G system you often get at least that, depending more on the backhaul than on the number of users, since cell towers are, by design, separated and dense enough to counter the issue you’re circling around. That’s why you can get good cellular data in a city of more than 1000 people. You can do similar things with WiFi, though it doesn’t scale as well.

But another option might be a point-to-multipoint wireless system that doesn’t need last-mile copper or fiber. There are several ways to do this, and I know of at least a few projects looking into it. Even if you don’t have an FCC licensed band you can do a lot with such a system. In one of our small rural towns on our microwave backbone we distribute the last mile via such a system, and even though it’s not licensed spectrum we are reliably getting a gigabit of throughput. And the system was pretty cheap as these things go. If I had the need and someone would fund it I could easily bring the entire village into the network and give each home over 50 Mb/s to our network, backhauled by redundant 10 Gb/s links. No local ISP required.
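The backhaul arithmetic there is straightforward. A sketch with the numbers from the paragraph above; the contention ratio is my own assumption, since residential links are rarely provisioned at full committed rate:

```python
backhaul_bps = 10e9   # one 10 Gb/s link (the post mentions redundant pairs)
per_home_bps = 50e6   # committed rate per home
contention = 4        # assumed oversubscription ratio (not from the post)

homes = backhaul_bps / per_home_bps * contention
print(int(homes))     # homes supportable before backhaul is the bottleneck
# -> 800
```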

Like I said, there are different kinds of wireless, and you seem to be lumping them all together and making blanket assertions that they suck today (which isn’t true) and will suck forever (which is really ridiculous, since wireless communications are probably going to explode in usage over the next decade). And there are limitations, but we haven’t even gotten close to them yet, and with today’s technology you could build a competitive system that doesn’t use traditional broadband or telco last-mile copper or fiber, assuming you thought you could get the customers of traditional broadband or DSL ISPs to make the switch…which you could, if they were pissing folks off like people here are speculating.

There are other means of providing wireless than satellite, even if this were a universal truth that could never be overcome. And there are companies pursuing those even today before the traditional ISPs have gone off the deep end as predicted by many in this thread.

It wasn’t widespread, and the ISPs started to behave themselves under regulatory pressure. I can recall only two incidents that bothered limited numbers of people for a short amount of time: Madison River’s VOIP blocking, and Comcast’s use of TCP resets on BitTorrent traffic (something that happens to a lot of traffic in China, by the way, and for which there are workarounds).

Well, yeah. That’s how it works. You can’t punish someone for what they might do. Wait and see what they do in fact.

If you set up a situation where:

(1) a company is economically rewarded for abuse, and

(2) there are no viable competitors,

then the company can do what it likes without facing any meaningful loss of customers.

Technically the company is under a fiduciary duty to its shareholders to push this as far as possible. This is subject to interpretation, but generally, if Comcast now has the power to squeeze money from users and large websites and doesn’t use it, they’re leaving billions in potential revenue on the table.

Comcast can now start offering users priority bandwidth, sold by the gig, guaranteed to go through first at every Comcast internal router. They can order their service techs to throttle their internal routers, or have them deliberately drop non-priority packets frequently, to create an incentive to purchase this.

And the same thing with websites. They can just have a store page where major websites can buy priority traffic terabytes.

A wire that connects you to the CO or head end will always be superior to wireless. It will have far more data capacity than over-the-air technologies can even theoretically approach. Any technology that improves data rates in wireless will also find its uses in coax or fiber. You can put several times the total wireless spectrum that is shared by everyone into a single coax cable. Fiber just adds another order of magnitude or few to the capacity.
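Rough numbers to make that concrete. These are my own ballpark figures, not from the post: unlicensed 2.4 + 5 GHz WiFi spectrum totals around 0.6 GHz, DOCSIS 3.1 coax uses spectrum up to about 1.2 GHz per cable, and the optical C-band alone spans about 4 THz per fiber:

```python
wifi_shared_hz = 0.6e9     # rough total unlicensed WiFi spectrum, shared by all
coax_per_cable_hz = 1.2e9  # approx DOCSIS 3.1 spectrum, per cable, per subscriber group
fiber_cband_hz = 4e12      # approx optical C-band, per fiber strand

print(f"{coax_per_cable_hz / wifi_shared_hz:.0f}x coax over shared WiFi")
print(f"{fiber_cband_hz / coax_per_cable_hz:.0f}x fiber over coax")
```

The key asymmetry: wired spectrum multiplies with every cable laid, while over-the-air spectrum is a single shared pool.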

Wireless may get to the point of being “good enough” (though I doubt that, as we always want more), but wired will always be better. Wireless will be fine for your phone, but wait until all the people on your block have 4K TVs and want to stream movies on them. Wireless will not be able to handle that, not even theoretically, much less practically.

It wasn’t widespread specifically because it fell under regulatory pressure.

The FCC’s new moves remove that regulatory pressure.


It is not punishing them for what they might do, it is just telling them that they may not do that thing. You are not punished for speeding by being told that the speed limit is 55.

It was widespread enough that net neutrality was introduced the first time.

Look, I understand your argument, and to a point it makes sense. “Wait until there is a problem, and then do something about the problem.”

The point where it stops making sense is “And then pretend there was never a problem, and repeal the fix on the grounds that the problem has not occurred and never will.”