See, here’s the query in question: linkety-link
Here’s the URL for that page:
The URL for the second page, showing results 16 through 30 of the 244, is:
What I’d like to be able to do is list all 244 results on a single page for more printer-friendly results. Any suggestions on how I can achieve this?
Any chance to see the actual PHP code?
Not my website, so, no. I was just hoping there might be an easy way to write the query (am I using the right terminology?) in such a way that the db would return what I’m looking for.
Without knowing the actual PHP code, the only way to find out is to start messing with parameters and see what happens.
Be aware that the owner of said website may not appreciate that.
I presume you’re talking about the embedded query string in the URL (the name/value pairs after the ? in the address, typically indicating information sent to a web form with the GET method).
Doesn’t look like it to me – my guess is that this web page is set to display 15 items per page, no matter what (i.e., a hard-coded limit). None of the name/value parameters specifically controls the number of items per page.
“counter” is indicating what item you looked at last, so “counter+1” is actually the item that the current page starts with.
“currentPage” is obviously the current page of results you are on, and “numPage” is the total number of pages (which appears simply to control how many page links are placed at the bottom).
“sub” doesn’t have a clear meaning, but appears to be related to the actual search you did, controlling the category of results in some way.
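For what it’s worth, the arithmetic between those parameters does hang together. A minimal sketch, assuming the 15-per-page limit and the 244 results observed above (the variable names just mirror the query-string parameters):

```shell
# Sketch of how the query-string parameters appear to relate,
# assuming a fixed 15 items per page and 244 total results.
per_page=15
total=244
numPage=$(( (total + per_page - 1) / per_page ))  # ceiling division: 17 pages
currentPage=2
counter=$(( (currentPage - 1) * per_page ))       # item you "looked at last": 15
first_item=$(( counter + 1 ))                     # page 2 starts at item 16
echo "numPage=$numPage counter=$counter first=$first_item"
```

Which is consistent with the second-page URL showing results 16 through 30.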
The query string here and the actual query to the database are really two different things. I’m sure that a direct query to the database can do that, but the PHP script that’s controlling the query to the database doesn’t appear to allow it. And you can’t see the PHP code, because PHP is a server-side scripting language. So if you do “View Source” on a page that uses PHP underneath, you’ll only see what the browser gets, which is the result of the PHP script and the HTML it generates.
Well, here’s what I would do.
First, I’d collect all the pages, like this:
for i in 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17; do
  curl "http://www.calgary.servpro.ca/testlist.php?city=Calgary%20and%20Surrounding%20Area&letter=&compType=Apartments&counter=$(($i*15-15))&currentPage=$i&numPage=17&sub=30439" > $i.html
done
Then, I’d cat them all together, and print them out.
That will include all the extra headers. To roughly cut it off, do this:
for i in *; do
  sed -e '/Pages/,$d' -e '1,292d' "$i" > x/$i
done
Then cat all that together.
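One caveat with a plain `cat *`: the shell glob sorts names lexically, so 10.html lands before 2.html and the pages come out in the wrong order. A small demonstration with throwaway files (not the real downloads):

```shell
# The glob "*" sorts lexically, so 10.html sorts before 2.html.
# Iterating over the numbers explicitly keeps the pages in order.
tmp=$(mktemp -d) && cd "$tmp"
for i in 1 2 10; do echo "page $i" > "$i.html"; done
glob_order=$(echo *.html)                 # lexical order
for i in 1 2 10; do cat "$i.html"; done > all.html
first_line=$(head -n 1 all.html)
echo "glob: $glob_order"
echo "first: $first_line"
```

With 17 pages it’s easiest to reuse the same `for i in 1 2 3 ... 17` list when concatenating.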
Then, when your message board won’t let you post the result because it’s too long, copy the text (not the HTML, but the text output) and clean it up with:
cat * > out
sed -e 's/^[ ]*//' -e 's/[ ]*$//' -e '/^$/d' -e '/map/d' out
Which looks like this:
Alanna The 909 2 Avenue NW
All Investments Ltd
4015 1 Street SE
Amica At Terrace Gardens Admin Ofc
727 6 Avenue SW
Amica At Terrace Gardens On Sixth
727 6th Ave SW