Is there a website checker that does more than ping?

Sites like Down for Everyone or Just Me appear to do no more than ping the server and say “It’s just you” if they get a response. They don’t appear to check whether the server is actually serving up pages. A while back I tested DFEOJM by shutting down Apache on one of our servers (we being the school I used to work at). Down For Everyone cheerfully reported that it was up, even though the machine was doing nothing useful other than responding to pings.

Is there an alternate checker that will actually verify that pages are being served?

Try to archive the page: https://web.archive.org/ and https://archive.is/ will go out to the page, fetch all the resources, and show you what they got.

You’re starting to ask for a bit more than machine intelligence.

A site could serve up a single “Under Construction” page, or a “404 not found” page or “this address for sale” page, but a website checker would report that it responds with webpages. To see that it’s an actual working website, with information there, might take some human intelligence. And then to identify it as a cobweb page, that hasn’t changed in years, would take even more effort.

Under construction and “this site for sale” are both valid pages. But a “404 not found” will come with an error code (oddly enough, code 404) that will tell the browser that it’s not a valid page.

Getting an “Under Construction” page or a “404” page would still indicate that the web server is operating, and would qualify as the site being up. I’m just talking about when there is no web service running, but the machine still responds to pings.
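To make the distinction concrete, here’s a rough sketch in Python of the check I’m after (using the requests library; the URL is just a placeholder): any HTTP response at all, even a 404, proves a web server is listening, while a refused connection means the machine may still answer pings but isn’t serving anything.

```python
import requests

def check_http(url, timeout=5):
    """Return a rough status string: any HTTP response (even a 404)
    means a web server is listening; a connection error means the
    host may still ping but nothing is serving on the port."""
    try:
        r = requests.get(url, timeout=timeout)
        return f"up (HTTP {r.status_code})"
    except requests.exceptions.ConnectionError:
        return "no web service (connection refused or unreachable)"
    except requests.exceptions.Timeout:
        return "no response within timeout"

print(check_http("https://example.com/"))
```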

The archive sites Derleth mentioned seem like they’ll do the trick.

A silly side question: what if Down For Everyone etc. is down for everyone?

If this is for monitoring sites you own, we use Pingdom (https://www.pingdom.com) for that. It will even test the time to load the entire page (and will break down the parts to show what takes the most time when loading). It is a subscription service, but it’s cheap.
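You can approximate the timing part yourself; this is only a rough sketch using Python’s requests library, not Pingdom’s actual per-resource breakdown, and the URL is a placeholder:

```python
import time
import requests

url = "https://example.com/"  # placeholder; point this at your own site

start = time.monotonic()
r = requests.get(url, timeout=10)
total = time.monotonic() - start

# r.elapsed measures the time until the response headers arrived;
# the remainder is roughly the time spent downloading the body.
print(f"status: {r.status_code}")
print(f"time to first response: {r.elapsed.total_seconds():.3f}s")
print(f"total fetch time: {total:.3f}s")
```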

There are good reasons why a website does not allow its users to ask the webserver to access an arbitrary URL - it would be ripe for abuse in so very many ways.

There are services that allow a site owner to set up an external monitor - a regular HTTP/HTTPS request to a specific URL that demonstrates functionality. However, due to the security issues mentioned above, these are paid services that require the purchaser to be registered and to demonstrate that they have management capabilities on the target site.

UptimeRobot can retrieve URLs and ensure keywords are contained in the result. I use it to monitor a few sites.
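That style of check is simple enough to sketch yourself if you want something free and self-hosted. Here’s a rough Python version (the URL and keyword are placeholders, and this is my own approximation, not UptimeRobot’s actual implementation):

```python
import requests

def page_is_healthy(url, keyword, timeout=10):
    """Keyword-style check: the site only counts as up if it returns
    HTTP 200 AND the expected keyword appears in the page body."""
    try:
        r = requests.get(url, timeout=timeout)
    except requests.exceptions.RequestException:
        return False
    return r.status_code == 200 and keyword in r.text

# placeholders: check a page for a phrase that should always be there
print(page_is_healthy("https://example.com/", "Example Domain"))
```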