There’s an increasing trend toward dynamic web pages lately. That’s a good thing: for dynamic data.
When the page cannot know in advance when it should update, a truly dynamic page makes sense. A page showing a current stock price, for example, should be generated as it’s viewed; the price probably updates more often than the page is viewed, so generating on view saves work overall.
The contrary example is a list of personal files. Looking the list up in a database is complete overkill. For each hit, you’d have the overhead of parsing some SQL, switching context to the database server, running the query, switching context back, piping the data in, and formatting it. All this to look up what’s very likely the same thing that was looked up last time.
It’s time to come full circle, back to the era before databases: back when writing a CGI script was difficult, and so only done when necessary. We have better tools now. The tools are the same ones you’d use for a dynamic site.
PHP is just fine for the task.
Take an ultra-simple PHP script like so:
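The original listing didn’t survive here; a minimal sketch of the kind of script meant, assuming a bare-bones HTML listing built with scandir() (the markup and filtering are illustrative):

```php
<?php
// Illustrative sketch: print an HTML index of the current directory.
echo "<html><body><h1>Index</h1><ul>\n";
foreach (scandir(".") as $name) {
    if ($name[0] === ".") {
        continue; // skip dotfiles and the . / .. entries
    }
    $escaped = htmlspecialchars($name);
    echo "<li><a href=\"$escaped\">$escaped</a></li>\n";
}
echo "</ul></body></html>\n";
```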
and there’s your basic dynamic index page. Now this case is ultra-simple, but imagine it does some non-trivial amount of processing instead of just pulling a quick directory listing.
make to the rescue:

index.html: index.php .
	php -q $< > $@

Listing the directory (.) as a prerequisite makes the index rebuild whenever a file is added or removed, since that changes the directory’s modification time.
Put that in a file called Makefile, and all you have to do to get a lightning-fast static directory index is type make in that directory. If you think you’ll forget, you can put it in your crontab or in your logout script.
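If you go the crontab route, an entry like the following rebuilds the index every ten minutes (the path is an illustrative assumption; -s keeps make quiet so cron doesn’t mail you on every run):

```
*/10 * * * * cd /home/you/public_html && make -s
```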
Then, in your .htaccess file, put DirectoryIndex index.html to make sure PHP is not called for each hit. Better yet, do it in the webserver config file. To test changes to the script, go to index.php directly. Users won’t see changes until you run make, though, so don’t forget. What they get will be lightning fast and cacheable, too. Watch your webserver load drop as static data is served statically, and let dynamic scripts do dynamic things.