Why are so many websites broken down into pages?

I mean things like forums (not this forum, but e.g. digitalspy.co.uk), or pages of race results such as this one, which is what prompted the question just now:

http://www.greatwesternrunners.org.uk/index.php/results/2014/487-towpath-results-2014-race-5

For this, why not show all the times on one page? Why make you click Next to see more?

I could kind of understand it back in the day with slow internet connections, but in the broadband age what’s the advantage to the user or the website owner in not showing everything on just one page?

By dividing things up into pages, they get to show more ads.

Many websites outsource their components to the cloud, such as putting video or images in an Amazon S3 bucket, supposedly to speed up loading. In practice this slows things down.

Actually, last night I finally solved the problem of IceCat (Firefox) not displaying large images, something other browsers managed on my computer (the images are there but not visible except when zoomed in)*, and while searching for a solution I looked into HTTP pipelining. I found:

Firefox only has the ability to send 8 requests at once, so altering this value to anything higher than 8 is pointless. Will it hurt anything if you set it that high? No, but it just shows you that the people who told you to do this don’t know what they’re doing.

Even if it could send 30 HTTP requests at once, you wouldn’t want to set it that high. If everyone made 30 simultaneous HTTP requests to every web server they connected with, the internet would be slow, webmasters would hate you (and Mozilla), and your IP would probably be banned from many websites. There is a reason that this setting is turned off by default.

Egonitron
So, the more requests, the slower it will be. And so spreading these requests out to 10 different servers, each of a slightly different speed, probably isn’t optimal.

  • Seriously, just change image.high_quality_downscaling.enabled to FALSE in about:config. Firefox default is TRUE, which is a very bad thing, but then Mozilla make a lot of bad decisions.

If only a fraction of your visitors consume all of a story, you probably lower the load on your servers if you feed it to them in portions.

But it may be that a more important factor is serving ads and old habits of thinking in pages.

It’s all about the ads. The web server traffic increase for a few more pages of text, and a few more graphics, is completely negligible in the age of Netflix. Historically, the deliberate pagination didn’t really come around until popups were routinely blocked, banner and flash ads mostly went away, and AdBlock gained prominence. It was a response to the decreasing ad revenue.

Web designers are being taught to do this. I recently updated my web page that lists all my papers (and a few other things). It has a chronological listing with tabs for each decade and a separate subject listing with tabs to each subject. It is all on one page. My son-in-law is just taking a course in web design so I asked him to look at it. His first response was that I should use separate pages instead of tabs. I assume that is what he has been taught.

It cannot all be about ads. Some paywalled academic journals do this, although there are no ads, or very few, when you are inside the paywall. They may, for instance, spread the contents listing of even a single issue over two or three pages. This is very annoying because it means you cannot use your browser’s built-in search to find what you want. (There is normally a proprietary search engine built into the journal’s pages, but in my experience these are both awkward to use and unreliable, often failing to find articles that really are there.)

I have the impression that this is now less common than it used to be, but it used to be nearly ubiquitous on such sites, very annoying, and, so far as I could ever tell, quite pointless.

Mine is broken into pages because there are lots of different topics, and because content at the bottom is going to get lost. I put buttons at the top of the page linking to the stuff that sits on the home page, but there is way too much information.
It is also easier to get lost.

Bing image results show everything on one page, and when there is a lot of stuff it takes forever to stop refreshing, so you can get totally lost.

The answer in the case of the OP’s link, and of the original question about forums, is database load. When you paginate, you only have to fetch X records at a time, and the processing load is smaller.

It might seem stupid to take 150 results and break them into 3 pages. But what if you don’t know how many results there might be? It doesn’t seem stupid if there are going to be 5,000 records. You would definitely notice a speed difference then, and so would the database server. A developer has to more or less guess how big to make the pages so that there aren’t too many of them but none of them is too long. And sometimes you don’t know, so you guess or pick a default; the people with the data are not the same people displaying the data, so you just have no idea, and 50 sounds OK.

Nothing to do with ads or total requests or how fast your browser is. At least in the case of the OP.
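To make that concrete, here is a minimal sketch of the kind of query that typically sits behind a “Next” link, assuming a SQL backend; the table, column, and file names are invented for illustration:

    # Minimal sketch of pagination at the database level, assuming a SQL backend.
    # The "results" table and its columns are invented for illustration.
    import sqlite3

    PAGE_SIZE = 50  # the developer's guess at a sensible page length

    def fetch_page(conn: sqlite3.Connection, page: int) -> list:
        # Each request reads and returns only PAGE_SIZE rows,
        # no matter how many records the full result set contains.
        return conn.execute(
            "SELECT position, name, finish_time FROM results "
            "ORDER BY position LIMIT ? OFFSET ?",
            (PAGE_SIZE, (page - 1) * PAGE_SIZE),
        ).fetchall()

    conn = sqlite3.connect("race_results.db")
    page_two = fetch_page(conn, 2)  # roughly what runs when you click "Next"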

I prefer pagination on blogs etc.; endless scroll just seems sad. Pinterest manages it, although it could stand to turn a page after 200 images… But a single page scrolling from 2014 back to 2008 puts a strain on my browser’s cache, as well as, as you say, making it difficult to keep your place.

That’s only just better than those dumb entertainment sites that spread 200 words over 5-12 pages and risibly imagine I’m going to click through to the bitter end. Or even to Page 02.

Why is that a bad thing and what will doing that do?

CMC fnord!

It’s a known issue, and it makes people think Firefox is flaky. Because, as I said, on some platforms (more likely to be 64-bit ones), when you open a large image, say over 1500 x 2000, the image starts loading, and when it finishes it vanishes into a white page. The image is there, and if one zooms in it comes up in full colour, but one can only see a bit at a time, leading to scrolling and the use of imagination to visualize the whole. It even affected large images downsized to appear as small samples on web pages, leading to a lot of blank portions.

If saved, the image is normal and can be viewed in Gwenview or any other viewer.

So changing the setting to FALSE enables the whole image to be seen normally every time.
See:

Ghacks: Disable Firefox’s image smoothing algorithm
MozillaZine

But if it doesn’t affect your browser, there’s no point in bothering (although I would change it anyway, since changing it does no harm and it’s easy enough to revert in about:config).

Can you clarify this?

Using a CDN does speed things up, for several reasons. But it has nothing to do with the question in this thread.

Just a note that requesting 5,000 records 100 at a time results in a significantly higher load than requesting all 5,000 at once. Oversimplifying a bit, the server has to re-run the query on each and every page in most cases. Of course if you assume most readers only ever hit the first page or two then it makes sense.
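A rough back-of-the-envelope illustration of that re-running cost, assuming naive LIMIT/OFFSET paging where the server cannot skip straight to the offset:

    # Rows the database touches when 5,000 records are fetched 100 at a time,
    # assuming naive OFFSET pagination that re-scans everything before each page.
    total_rows = 5000
    page_size = 100
    pages = total_rows // page_size  # 50 pages

    paged = sum(page * page_size for page in range(1, pages + 1))
    print(paged)        # 127500 rows touched across 50 separate queries
    print(total_rows)   # 5000 rows touched by a single query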

The real answer to the OP is probably a combination of factors, including those mentioned above:

  • The longer each page is, the more time it takes to load, and the more readers you lose.
  • Breaking things up into pages means search engines can be more specific.
  • Most readers probably don’t scroll further than a certain distance down a page.
  • Most users probably don’t use Ctrl-F to search within a page.
  • Designers and information architects like separating things into pages.
  • Smaller pages mean lower bandwidth costs (most visitors only skim the first page).
  • Ad revenue often depends on impressions, and more pages means more impressions. It also means inflated stats, which may not tie directly to revenue, but everyone wants more page views.

This. Lots of Web developers are still hewing to the database load problem. The textual First / Previous / Next / Last sequence is a set of links to PHP pages (although not in the OP’s case) that fetch the next set of records. Also, don’t overlook the residual impact of slow internet connections on users: 50 records are a lot quicker to load and display than 500. I doubt most Web developers at the OP’s level care at all about ads. The site developer is just doing it “the way it’s always been done.”

For the example you listed above, it’s because the software used, according to the page, is “Report Produced using RaceMaster98 Software from Sport Systems”.

If you visit the page in question, you get charmingly quaint snippets like

  • RaceMaster98 is the most advanced, friendly and reliable Race Management Software on the market! To get your copy just complete and send the Order Form and cheque to us today!
  • Runs on all 32 bit Windows systems (Windows 95/98/Me/NT/2000/XP)
  • Both files are around 12Mb in size and will download in about 40 minutes using a 56K modem or about 4 minutes using 512Kb/sec broadband.

In short, it’s using software that hasn’t been substantially changed since “back in the day” and it shows.

True.

At the other extreme (usually involving non-commercial sites), there is a good correlation between eccentricity/looniness of the site and the tendency to have a single website page that scrooollllls on forever.

Using pagination to avoid database load is old hat. Modern websites tend to use dynamic loading (and unloading). You get enough info that the user has to scroll, and when they get close to the bottom, you start loading more.

That wouldn’t work with the OP’s website, but seeing as it’s a fixed list, you don’t even need to keep using a database. Look it up once, save the result as a static page, and serve that from then on; then it’s at most one database hit in total, instead of one per page view.
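A minimal sketch of that look-it-up-once approach, assuming a SQL backend; the database, table, column, and file names are all made up, and HTML escaping is skipped for brevity:

    # Sketch: run the query once, write the result out as static HTML,
    # and let the web server serve that file from then on.
    # Database, table, column, and file names are invented for illustration.
    import sqlite3

    def build_static_results_page(db_path, out_path):
        conn = sqlite3.connect(db_path)
        rows = conn.execute(
            "SELECT position, name, finish_time FROM results ORDER BY position"
        ).fetchall()
        conn.close()

        body = "\n".join(
            "<tr><td>{}</td><td>{}</td><td>{}</td></tr>".format(*row)
            for row in rows
        )
        with open(out_path, "w") as f:
            f.write("<table>\n" + body + "\n</table>")

    build_static_results_page("race_results.db", "towpath-results-race-5.html")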

I think the pagination is far more likely to be legacy. It looks like a printed page. There aren’t even pagination links at the bottom. And the top mentions it being a “report,” which implies static (often printed) output from a database.

It looks a lot like what you get if you printed a report in Microsoft Access.

This isn’t the reason for news stories, but since the OP also mentioned message board threads: You often don’t read an entire thread all at once. Typically, you’ll find the thread while discussion is still ongoing, read however much there is, and then come back to it later to see what else has been added. And when you’re reading that way, it’s a lot easier to pick up where you left off if it’s broken into pages.

Maybe HTTP should be modified to include a requested page size…

“Please only send me at most 40 screens’ worth at a time,” or something equivalent (in lines, points, inches, or whatever).
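For what it’s worth, the closest thing HTTP already offers is the byte-range request (and real paginated APIs usually just take a page-size query parameter instead). A small sketch, with a placeholder URL, and keeping in mind that bytes are of course not “screens”:

    # Ask the server for only the first 64 KB of a resource via an HTTP Range
    # header. Servers that support it reply "206 Partial Content"; servers that
    # don't just send the whole thing. The URL is a placeholder.
    import urllib.request

    req = urllib.request.Request(
        "http://www.example.com/big-results-page.html",
        headers={"Range": "bytes=0-65535"},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status)      # 206 if the range was honoured, 200 otherwise
        partial = resp.read()   # at most 65,536 bytes in the 206 case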

Not everyone has fast internet. The US isn’t the entire world.