Is there a 2 GB file size limit för HTTP transfer

I’ve got a “cloud solution” where I upload backups to an “online hard drive”, but I can’t upload files larger than 2 GB; the job gets cancelled.
The provider says that there’s nothing they can do about it, because “the two gigabyte limitation is a part of the http protocol”.
Seems kinda weird to me, and Google can’t confirm that it’s the case; but on the other hand there seem to be a few two-gigabyte limitations in this and that type of transfer, so perhaps the guy is right in a way. But be that as it may:

What’s the straight dope on the HTTP protocol’s supposed 2 GB limit on file transfers?

They’re blowing smoke up your modem. There is no inherent file size limit in HTTP: the protocol puts no upper bound on the Content-Length header, and with chunked transfer encoding a sender never declares a total size at all. If HTTP had an actual 2 GB limit, streaming video would be difficult.
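To make that concrete, here’s a minimal sketch in plain Java (the URL is hypothetical) of a chunked POST; the client never announces a total size, so there’s nothing for a 2 GB ceiling to even attach to:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ChunkedUpload {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint, purely to show the mechanics.
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://example.com/upload").openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setChunkedStreamingMode(8192); // 8 KB chunks; no Content-Length header is sent
        try (OutputStream out = conn.getOutputStream()) {
            byte[] buf = new byte[8192];
            // Write as much as you like; the protocol never asks for a total.
            for (long sent = 0; sent < 3L * 1024 * 1024 * 1024; sent += buf.length) {
                out.write(buf); // 3 GB of zeroes, comfortably past the alleged limit
            }
        }
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}
```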

As you found, some file transfer protocols do have limits, but for the most part, any limits are imposed by the server administrators. Trivial FTP (TFTP), for example, was originally capped at around 32 MB by its 16-bit block counter; with the larger block sizes that can be negotiated nowadays the cap is about 4 GB, and implementations that let the counter wrap around have no limit at all.
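The arithmetic behind those TFTP caps, for anyone curious (block sizes per RFC 1350 and the RFC 2348 block-size option):

```java
public class TftpCaps {
    public static void main(String[] args) {
        long blocks = 65_535;              // a 16-bit block counter
        long original = blocks * 512;      // RFC 1350: fixed 512-byte blocks
        long negotiated = blocks * 65_464; // RFC 2348: largest negotiable block size
        System.out.printf("Original cap:   %,d bytes (~32 MB)%n", original);
        System.out.printf("Negotiated cap: %,d bytes (~4 GB)%n", negotiated);
    }
}
```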

My suspicion is their servers are dusty old things running Windows NT on the FAT16 file system, which, IIRC, did have a 2 GB file size limit. But that’s a limitation of an old file system, not of HTTP.

I’m assuming you are “uploading” via HTTP POST. According to the specification there is no limit, but in practice the size of a request that web server software will accept is bounded by configuration and memory. The major servers (Apache, IIS) let the admin configure a maximum request size, historically up to a max of 2 gig; Apache’s LimitRequestBody directive, for instance, takes values up to 2,147,483,647 bytes. It’s an old limit that could probably be raised on today’s 64-bit operating systems, but as far as I know nobody’s in a big rush to, probably because an HTTP POST is not the most reliable way to send large files.
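The same 2 GB boundary shows up in old 32-bit client APIs, by the way. A minimal sketch of a POST that sidesteps it (hypothetical URL; java.net’s long overload of setFixedLengthStreamingMode was added in Java 7 exactly because the original int overload can’t declare a length past 2,147,483,647 bytes):

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;

public class BigPost {
    public static void main(String[] args) throws Exception {
        Path backup = Path.of(args[0]); // e.g. a 10 GB backup file
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://example.com/upload").openConnection(); // hypothetical
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        // The long overload declares the true size even past 2 GB.
        conn.setFixedLengthStreamingMode(Files.size(backup));
        try (OutputStream out = conn.getOutputStream()) {
            Files.copy(backup, out); // streamed in small buffers, never held in RAM
        }
        System.out.println("Server said: " + conn.getResponseCode());
    }
}
```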

I was working on a piece of software that used a custom HTTP handler for document uploads, and I was told it was there because the default behavior with IIS and ASP.NET is that a file uploaded via HTTP POST is loaded fully into RAM. That can be a problem if hundreds or thousands of users are uploading multi-megabyte files at the same time.

I had to write a Tomcat filter not too long ago for exactly that reason. We frequently move around multi-gig files and at the time needed a way to accept an arbitrary number of them at once as POSTs. The filter streamed them straight to disk instead of holding them in RAM, something like the sketch below. Not too hard to do, and if I were the OP I’d be nervous about any data I have stored with someone who tells me you can’t transfer more than 2 gigs over HTTP.
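A minimal sketch of the idea, not my actual filter (assumes the javax.servlet API; the “upload.spool” attribute name is made up):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;

public class SpoolToDiskFilter implements Filter {
    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest http = (HttpServletRequest) req;
        if ("POST".equals(http.getMethod())) {
            // Spool the body straight to disk; heap usage stays flat
            // whether the client sends 2 MB or 20 GB.
            Path spool = Files.createTempFile("upload-", ".bin");
            Files.copy(http.getInputStream(), spool, StandardCopyOption.REPLACE_EXISTING);
            http.setAttribute("upload.spool", spool); // hand the file to the app
        }
        chain.doFilter(req, res);
    }

    @Override public void init(FilterConfig config) { }
    @Override public void destroy() { }
}
```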

Thinking about it, Wakinyan, are you using an older browser? What might be happening is that when your browser tells their server the content-length of the file you want to send, it tries to fit the file size into a signed 32-bit integer, which maxes out at 2,147,483,647 bytes, one byte short of two gigs. If your browser is up to date, then I think it’s most likely that they’re using an older server that tries to fit the content-length you’re sending into a signed 32-bit int and fails.
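You can see the failure mode in a couple of lines (Java here, but any language with 32-bit ints behaves the same):

```java
public class Overflow {
    public static void main(String[] args) {
        long fileSize = 2_147_483_648L;     // a file of exactly 2 GiB
        int contentLength = (int) fileSize; // forced into a signed 32-bit field
        System.out.println(contentLength);  // -2147483648: the "length" goes negative
    }
}
```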

Everyone else has answered the question, so I’ll just congratulate you on the use of the metal umlaut in the title. :wink:

Is there a 2 GB file size limit för HTTP transfer

Not long ago I used ftp (a Solaris version) to transfer some very, very large files (some over 10 GB). Keeping the link up long enough was an issue; file size was not. I gave up on one over 50 GB, since the transfer time was greater than the MTTF of the link and it wasn’t all that important.

What kind of link was it, do you remember? I can’t imagine a run-of-the-mill home or SOHO (Small Office/Home Office) broadband line dying in only the amount of time needed to ship 50 GB, unless something outside the connection’s purview forced it to die (suicidal raccoons, fiber-seeking backhoes, a gentle sprinkling of rain on my old ADSL connection, etc.).

YouTube still has the option to POST a video file of, I believe, up to 20 GB. (At least 7 GB, since I’ve uploaded that much in IE9, which doesn’t use the fancy Chrome uploader.)

They’re not being strictly truthful - as others have said there’s no inherent limit to HTTP file upload size - but it’s certainly true that supporting uploads larger than 2 gig requires some work.

Modern web servers can be configured to support larger uploads (nginx, for example), as can PHP, ASP.NET and the like. But there’s more to it than just reconfiguring those. Typically a web application sits behind a load balancer or reverse proxy, and you need to make sure those support large uploads too (in a typical hosting arrangement, that’s often controlled by the web host rather than an individual site). You also have to make sure time isn’t an issue: large uploads obviously take longer, and it’s common for load balancers and web servers to be configured to time out after 5 minutes. Some CGI interfaces have a separate timeout of their own. And finally, any web service accepting large uploads is probably storing them on a CDN, where there may be further timeout or size limits to configure, and the application needs to be written with some additional error handling in mind. The front-end side of it looks something like the sketch below.
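A minimal sketch of that first layer, assuming nginx proxying to an application server on port 8080 (a config fragment, not a complete nginx.conf; the directives are real, the values invented):

```nginx
http {
    client_max_body_size 10g;   # default is only 1m; anything bigger gets an HTTP 413
    client_body_timeout  300s;  # slow home uplinks need a generous body timeout

    server {
        listen 80;
        location /upload {
            proxy_pass http://127.0.0.1:8080;  # the application server behind the proxy
            proxy_read_timeout      300s;      # don't cut off a long-running transfer
            proxy_request_buffering off;       # stream the body through instead of spooling it
        }
    }
}
```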

In short, while there is no inherent limit, properly supporting large file uploads requires a bunch of things to all be configured and engineered correctly. It needs some specialized knowledge to get it all right, and I’m not really surprised you’d get a dismissive answer like that.

It might not be as easy as plugging a machine into the internet and putting a website on it, but it is a solvable problem. For that reason I’m a bit surprised they’re so quick to admit that they’re either incompetent or don’t care enough to design a service that can receive their customers’ files. I would take it as a sign that they shouldn’t be trusted with my data.

It was big company to big company, and the link was often used to transmit massive amounts of data about our designs. My goal was to transfer the massive amounts of archived data on their site back to us in case we needed it in the future; no one uses it any more. By dying I mean flaky enough for the transfer to stop. When I restarted it (and I had a script to do that, since there were thousands of smaller files as well as one gigantic one) it always worked for another day or two, so the problem could have been something besides the link itself. It died every day or two, which was not a problem except for this very large file; 10 GB files transferred fine.
I also suspect the link had been in place for 10 years at least.

No, as a matter of fact, the software I’m using for the transfer is a (presumably web-based) application I got from them.

Thanks, but since I’m Swedish I’m not sure I deserve the praise. “For” in English is “för” in Swedish, so that word just slipped through the translation filter, I guess.

As a matter of fact, I just told them that we will not renew the contract. This question was just the tip of an incompetent iceberg as many of you suspected.

Thanks everybody for your time!