Time Warner is about to fuck you the fuck up

OK, just to make sure we’re all using the same metric, test your speed here. I have Comcast, and I’m getting 12,374 Kbps download speed at the Chicago server, and slightly faster (?) at the Seattle server. What numbers are you getting?

Jesus, my head hurts. I think there’s only one of those I understand.

One news article pointed out that watching 3 HD movies in a month would put you over the top and cause additional charges.
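For what it's worth, the arithmetic roughly checks out, under an assumption about file size (the 14 GB per movie is my guess, not a figure from the article):

```python
# Back-of-envelope: how many HD movies fit under a 40 GB monthly cap?
# The per-movie size is an assumption; real downloads vary widely.
CAP_GB = 40
HD_MOVIE_GB = 14  # assumed size of one HD movie download

movies = CAP_GB / HD_MOVIE_GB
print(f"{movies:.1f}")  # about 2.9, so a third movie puts you over
```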

Which is the point I was trying to make earlier…more and more movies will be in HD, and more movies will be watched thru Internet connections instead of cable TV, so the average user will start to feel the effects of the clampdown before long. When that happens, the cable company can’t claim it’s just the power users abusing the system, and it becomes a surcharge that most everyone pays; i.e., a price increase.

Which brings me to another point…cable companies are always encouraging use of TV channels; they don’t throttle that use. If they are running an all-IP system, watching a movie on TV channel 9 is essentially the same data traffic as watching it thru an Internet download. But now they’re going to penalize you for doing it one way but not the other? That’s just a price increase pure and simple.

You and me both!

I thought this was going to be a fun pitting, but I, like you, don’t understand anything these guys are spouting.

Are the geeks having a pissing match?

HL2 is something like 1.3 gig, not 225mb.

Isn’t it?

But what if you were? Or at least what if you were truly capped on bandwidth? Now why would they care if you use that bandwidth 24/7? The answer, as has been mentioned several times already, is that this is not about fairly proportioning fees to usage, but is simply a way to increase prices without looking like they’re doing it.

Big number.

Streaming coverage of baseball games provided for a fee by Major League Baseball, the governing body of baseball in the US.

:smiley:

The apartment next to 800J.

Forty gigabytes… or 40,000,000,000 bytes. I think.
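Depends on whose gigabyte you mean. Quick illustration of the two conventions:

```python
# "40 gigabytes" depends on who's counting: disk and bandwidth marketing
# uses powers of 10, while operating systems often use powers of 2.
decimal_gb = 40 * 10**9   # 40,000,000,000 bytes (SI gigabytes)
binary_gib = 40 * 2**30   # 42,949,672,960 bytes (gibibytes)
print(decimal_gb, binary_gib)
```

Either way, the cap almost certainly uses the smaller, marketing-friendly decimal number.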

Time Warner Cable, the largest provider of premium TV and broadband internet service in the US.

:confused:

I may have been wrong about this. Apparently Firefox reports download speed in KB/s, not kbps like I’m used to.

It’s not just Firefox. I think every application I’ve ever had reports transfer speeds in KB/s.

edit: Yeah, even UNIX ftp gives me transfer speeds in KB or KBytes/sec.
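The two conventions differ by exactly a factor of 8, which is why the numbers never seem to line up:

```python
# A byte is 8 bits: speed tests report kilobits per second (kbps),
# while browsers and ftp clients report kilobytes per second (KB/s).
def kbps_to_KBps(kbps):
    return kbps / 8

print(kbps_to_KBps(12374))  # the 12,374 kbps result above is ~1,547 KB/s
```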

So the basis of your complaint is that if people start using more bandwidth, they will have to pay more?

IP and cable television have basically nothing in common.

If you were getting guaranteed bandwidth from your ISP, you’d be paying a heck of a lot more and then no, they wouldn’t care how much bandwidth you used.

I’m getting between 2,000 and 2,800 kbps at work. Oddly, the Atlanta server (closest one to me by far) is slower than the others.

I’ll have to try this at home later.

Actually, I do have one factual question here. I just checked my throughput speeds, and I’ve got 13,374 Kbps download and 361 Kbps upload. Is this normal? First of all, I didn’t think my download speeds were supposed to be that high (I thought I had something like 6 or 7 Mbps), but all the speed checks online agree with that number.

Second, why the hell is upload so slow? It’s always been this way. I upload about one to two gigabytes worth of data a week, and it takes me a good 18-24 hours to complete. Is it normal for uploads to be that much slower?
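A quick sanity check on those numbers (assuming the link runs at the full 361 kbps with zero protocol overhead, which is a best case):

```python
# How long should ~2 GB take at the reported 361 kbps upstream?
bytes_to_send = 2 * 10**9   # ~2 GB
rate_bps = 361_000          # 361 kbps, in bits per second

seconds = bytes_to_send * 8 / rate_bps
print(f"{seconds / 3600:.1f} hours")  # roughly 12.3 hours, best case
```

So the raw line rate alone accounts for about 12 hours; overhead, retries, and contention could plausibly stretch that toward the 18-24 hours you're seeing.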

Cable upload bandwidth sucks. It’s a fundamental limitation of the technology. That’s one of the reasons cable ISPs have such problems with P2P – having their customers uploading huge files is not something their networks can handle very well.

Yes, I just noticed. I was uploading 35 MB to my server and did the download speed test – wow. Somehow, just uploading something at a rate of 361 Kbps ramped down my download speeds to, get this, just over 200 Kbps. WTF? Where the hell did all that bandwidth go?

Would something like a flat fee + a fee based on the amount of bandwidth used per month work? Or would that create too many headaches?

I’m getting 1001 kbps down, 289 kbps up on a wireless G connection to a DSL router. Just for comparison’s sake. That was from a Seattle server–I’m in Portland.

I’d like to see some concrete proof that ISPs are barely scraping by, because I’m not feeling their pain. How about getting some of their money back from the providers of content, since they’re the ones causing the issue? If a site is making money from the traffic it inspires, why shouldn’t it share with the ISPs? Truckers pay higher road taxes than passenger cars for just this reason – they overuse the infrastructure. Why should the end user be the only source of revenue?

They do – content providers need to be provisioned with a lot more bandwidth than a typical end user, for which their ISPs charge accordingly.

From what I know, they are still raking in profits. Once the infrastructure is set up (which it is), the cost of keeping the bandwidth available isn’t high. Whether you personally use your connection at its maximum 24/7, or use it once a day to check email, the cost difference to TWC isn’t much.

Someone can correct me if I’m wrong, though.

Yes. My ISP charges a fixed rate no matter how much I use (up to the guaranteed speed). I don’t pay more if I use more, so why should they?

They have everything in common. In all-IP based systems, the same pipe carries all signals as IP packets: telephone, cable TV shows, Internet access. That’s the way my hookup works.

My company doesn’t seem to care, and yes, I am getting what I was promised and paying what I expected. At least so far. It’s honest and fair to everyone.

My complaint about the new billing concept is this: I am promised a level of service for a certain price because the ISP doesn’t expect everyone to need 100% of the service 100% of the time. But if I actually use it as promised, the ISP says, “We didn’t really mean YOU and we didn’t expect you to take us up on our offer – you are abusing the system.”

You are using the fixed-bandwidth pipe to send data at the same time as the test? And you’re surprised that one affects the other? How else would it work?

No speed test is valid if other things are running at the same time.