I’ve just written a simple database-driven forum (like vBulletin) and implemented a word search.
I’m rather worried that once there’s a sizeable amount of data on the forums, searches will become processor-intensive and slow. I’m not sure how intelligent PostgreSQL is about caching query results, particularly when using LIMIT and OFFSET, so that you can come back and grab more later (i.e. when splitting the results over multiple pages) without the query having to be rerun.
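To make the pattern concrete, here’s a minimal sketch of the kind of paginated search I mean. It uses an in-memory SQLite database as a stand-in (the LIMIT/OFFSET syntax is the same in PostgreSQL), and the table and column names are made up for illustration:

```python
import sqlite3

# In-memory SQLite stand-in; the same SQL works in PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany(
    "INSERT INTO posts (body) VALUES (?)",
    [(f"post number {i}",) for i in range(100)],
)

PAGE_SIZE = 10

def search_page(term, page):
    # Each call reruns the full search from scratch; as far as I know,
    # the database caches table pages, not this query's result set.
    cur = conn.execute(
        "SELECT id, body FROM posts WHERE body LIKE ? "
        "ORDER BY id LIMIT ? OFFSET ?",
        (f"%{term}%", PAGE_SIZE, page * PAGE_SIZE),
    )
    return cur.fetchall()

page0 = search_page("post", 0)  # first page of hits
page1 = search_page("post", 1)  # second page: same WHERE scan again
```

The worry is that every page request repeats the expensive WHERE scan, only to throw away all but one page of the results.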
One idea I had was to run the query once, creating a new table from the result set, so that paging through the data is just a simple WHERE clause, then dropping the table after 15 minutes or so. But I’m not sure how necessary something like that is, whether dynamically creating and dropping lots of tables will cause fragmentation issues, or whether there are better solutions to this problem. Anyone happen to know?
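For what it’s worth, here’s a rough sketch of that materialize-once idea, again using in-memory SQLite as a stand-in (PostgreSQL’s CREATE TEMPORARY TABLE ... AS SELECT is analogous, and temporary tables there are dropped automatically at the end of the session rather than on a timer). All names here are illustrative:

```python
import sqlite3

# In-memory SQLite stand-in for PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany(
    "INSERT INTO posts (body) VALUES (?)",
    [(f"forum post {i}",) for i in range(50)],
)

# Run the expensive search once, materializing the hits into a
# temporary table (dropped automatically when the session ends).
conn.execute(
    "CREATE TEMP TABLE search_results AS "
    "SELECT id, body FROM posts WHERE body LIKE '%post%'"
)

def fetch_page(page, size=10):
    # Later pages only scan the small result table, not the full
    # posts table.
    cur = conn.execute(
        "SELECT id, body FROM search_results "
        "ORDER BY id LIMIT ? OFFSET ?",
        (size, page * size),
    )
    return cur.fetchall()

rows = fetch_page(2)  # third page, served from the materialized hits
```

One wrinkle I can see: temporary tables are per-session, so with pooled web connections you’d presumably need ordinary tables plus some cleanup job, which is where the 15-minute expiry came from.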