While it seems like the number of web pages is infinite, it is obviously both finite and relatively small.
But what is the actual limit? And what is the factor that causes the limit?
That isn’t obvious at all. In fact, I can write about 10 lines of code that would put a countably infinite number of web pages on the Internet. I call it “friedo’s big list of natural numbers.”
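It really is only about a dozen lines. Something in this spirit, say (a sketch using Python’s standard-library HTTP server; the port and markup are placeholders for illustration, not the actual code):

```python
# "friedo's big list of natural numbers": the page for /n displays n,
# so there is always one more page than anyone has asked for yet.
from http.server import BaseHTTPRequestHandler, HTTPServer

class NaturalNumberPage(BaseHTTPRequestHandler):
    def do_GET(self):
        n = self.path.lstrip("/") or "1"   # e.g. /42 -> the page for 42
        body = f"<html><body><h1>{n}</h1></body></html>".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("", 8000), NaturalNumberPage).serve_forever()
```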
If the webpages are procedurally generated (as you say), then that isn’t statically finite.
However, thanks to the 2nd Law of Thermodynamics, it’s still dynamically finite, because you’d run out of the ability to do work before you run out of (infinite) natural numbers.
“Countably infinite”?
With IPv6 (the current IP standard): IPv6 - Wikipedia
So at least that many devices.
Yes, you could just have a bunch of web pages, and the first one would display “1”, the second one would display “2”, the third “3” and so on until you run out of integers, which will happen when the sun expands into a red giant and vaporizes your servers.
However, those websites have to be addressable. So the DNS system is an obvious limit.
One of the many types of infinity.
Countably infinite means you can enumerate them but there are an infinite number. Example is natural numbers. I can start counting and name natural numbers in order, but I can never count them all. Positive real numbers are not countable because I can never name even the first real number.
You can generate a series of web pages that each correspond to a natural number, and therefore are countably infinite by definition.
I can create a web site that no matter how many pages you ask for, there’s always another available. That’s the definition of infinite. “Statically finite” is not a phrase that makes any sense in the context of an inherently dynamic medium.
Thank you.
Depends on what you define as a “website”. A “website” as commonly understood (single domain), that isn’t cloud-based and publicly accessible, is generally tied to an ipv4 ip address, as are many publicly accessible devices. There are a finite number of these (4,294,967,296, to be exact).
However, with the advent of ipv6, which is continually being rolled out, but not with nice, telephone-like numbers, there are 3.4×10^38 possible IPs. This is enough that you could assign one to every atom on the surface of the earth, and still have enough for 100 more earths.
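For reference, those figures are just 2^32 and 2^128; the arithmetic, as a quick sketch:

```python
# Address-space sizes for the two IP versions.
ipv4 = 2 ** 32    # 4,294,967,296 addresses
ipv6 = 2 ** 128   # about 3.4 x 10^38 addresses

print(f"{ipv4:,}")             # 4,294,967,296
print(f"{ipv6:.2e}")           # 3.40e+38
print(f"{ipv6 // ipv4:.2e}")   # ~7.92e+28 IPv6 addresses per IPv4 address
```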
For practical purposes, you’d never run out of distinct “websites”. And if you did, they’d just come up with another scheme.
A “web page” is a different animal.
The actual limit would be the IP addressing scheme. DNS is a convenient abstraction but entirely unnecessary. Even that would not be a limit, if you define “web site” as a “top level index.html-type page”. A single host can have a number of web sites that are limited only by available storage. For example, you can consider this message board (boards.straightdope.com) to be one web site and www.straightdope.com to be another even though they are in the same second-level domain.
That only limits the number of web sites (or “devices” as you say). Each site can host as many pages as it wishes.
Most browsers limit the length of a URL. For IE it’s something like 2000 characters. That imposes a limit of N^2000, where N is the number of allowed characters in a URL. I think N is about 75, the 62 alphanumerics plus about a dozen others. But this is a pretty soft limit, since different browsers have different length restrictions and it could easily be increased in a browser update.
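If you want a feel for the size of that number, here’s a quick sketch (the 75-character alphabet and 2000-character cap are just the rough estimates above):

```python
# Rough count of distinct URLs under the limits quoted above.
N, L = 75, 2000   # assumed alphabet size and maximum URL length

dominant = N ** L                                      # the N^2000 figure
every_length = sum(N ** k for k in range(1, L + 1))    # lengths 1 through 2000

print(len(str(dominant)))        # 3751 -- a number with ~3750 digits
print(every_length / dominant)   # ~1.0135 -- the longest length dominates the total
```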
And you don’t need a browser to access web pages. Any HTTP client will do. Even telnet, if you feel like typing in your headers manually.
The HTTP protocol does not place any limit on URL length, although a de facto limit exists as to what the server will accept.
How is the DNS system a limitation? All of those pages could reside on a single server.
Even if the OP uses them interchangeably, I really think there should be a distinction between web pages and web sites for purposes of this discussion.
The URI RFC doesn’t put a maximum length limit on the overall URI, but strongly recommends compatibility with DNS in the “Authority” portion (what would be considered the “host” portion in an HTTP/HTTPS implementation), so a limit of about 255 characters there.*
*Even this is too generous; a DNS name is actually limited to 253 octets in length.
None of the relevant RFCs dictate an actual URI maximum length, but in the same real world where you can’t procedurally generate an infinite number of web pages because of the heat death of the universe, you’ll find implementation limits in the tools. Internet Explorer seems to puke if the URI is over 2083 characters, or if the path portion of the URI is over 2048 characters. Other browsers or servers have higher limits. Firefox seems to have an untested upper bound (known to be over 100,000 characters), but at some point the computer itself runs out of memory, so that’s finite also.
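If you want to check a given URL against those limits, something like this works (a sketch; the 2083/2048/253 figures are just the implementation numbers quoted above, not anything the RFCs require):

```python
from urllib.parse import urlsplit

# Limits quoted in the thread -- implementation details, not RFC requirements.
IE_URL_LIMIT = 2083
IE_PATH_LIMIT = 2048
DNS_NAME_LIMIT = 253

def check(url):
    parts = urlsplit(url)
    return {
        "fits_ie_url_limit": len(url) <= IE_URL_LIMIT,
        "fits_ie_path_limit": len(parts.path) <= IE_PATH_LIMIT,
        "fits_dns_name_limit": len(parts.hostname or "") <= DNS_NAME_LIMIT,
    }

print(check("http://example.com/" + "a" * 3000))
# {'fits_ie_url_limit': False, 'fits_ie_path_limit': False, 'fits_dns_name_limit': True}
```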
DNS limitations only matter as far as the hostname is concerned.
Which is fair. My only point with this nonsense is to make it clear that trying to count individual webpages is futile, because so much of it is dynamic.
To simplify, imagine I write a web site that simply echoes back a page that has everything it received as input. That adds a number of potential pages to the Internet that is limited only by the maximum size of the server’s input buffer.
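Something along these lines (a minimal sketch; a real echo page would want to escape its output, but it shows how every distinct request becomes a distinct page):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class EchoPage(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = self.rfile.read(length)   # whatever the client sent
        page = b"<html><body><pre>" + payload + b"</pre></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(page)

HTTPServer(("", 8000), EchoPage).serve_forever()
```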
Yeah, if we’re talking web pages, we could create one containing the current time stamp and keep generating. The limit would depend on the amount of raw materials available to make disk drives to store them.
Web sites are a far more interesting question.
The data collection system I built when I was working generated dozens of fairly complicated web pages several times a day, so I’m interested in this topic. They were internal, and so didn’t have a DNS address. I’m not sure if some of the things count as sites or pages for this discussion.
You can have thousands of separate sites on one IP address. I tried looking for a maximum number of host headers allowed on the Windows web server, IIS, and the general consensus seems to be that there is no upper limit.
The IP address directs you to a server, and then the server’s Web server software directs you to a folder which holds the physical copy of the web site. You tell the Web server that takes traffic for 123.45.67.89 that when a request for ILoveJesus.com comes in, send it to the C:\ILoveJesus\ folder, and when a request for ILoveSatan.com comes in, send it to the C:\ILoveSatan\ folder. That’s two very different sites on the same IP.
Even one set of files in one folder can have logic programmed in to serve multiple different Web sites using the same files in the same folder. The host headers get the server to the right files, and then the logic within them serves up the right site.
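As a concrete illustration, here’s a sketch that routes on the Host header in Python rather than in IIS configuration (the two domains are just the example names above):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# One listener, one set of code, several "sites" -- chosen by the Host header.
SITES = {
    "ilovejesus.com": b"<html><body>Welcome to ILoveJesus.com</body></html>",
    "ilovesatan.com": b"<html><body>Welcome to ILoveSatan.com</body></html>",
}

class HostRouted(BaseHTTPRequestHandler):
    def do_GET(self):
        host = (self.headers.get("Host") or "").split(":")[0].lower()
        page = SITES.get(host)
        self.send_response(200 if page else 404)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(page or b"<html><body>Unknown site</body></html>")

HTTPServer(("", 8000), HostRouted).serve_forever()
```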
Anyway, the total number of IPs available is nowhere near the number of possible different sites, all running from their own folders. And as everyone has already shown, the number of IPs is ridiculously large.