PHP to Static HTML On The Fly

Data-driven web sites are, in my not-so-humble opinion, da bomb. The ability to push information through a slim pipeline and have it display in countless ways is without a doubt amazing. Anything can be manipulated, altered, structured, added, removed and decided before it ever hits the screen. Systems like WordPress, Joomla and Drupal have had their day in the sun. On the horizon are even more progressive, minimalistic user-generated-content management systems that will have all of us programmers gushing like schoolboys. Older folks like myself, who worship the LAMP/MAMP/WAMP stack, need to sharpen our skills to keep marathon pace with the wondrous changes in the digital universe.

Recently, I faced a serious issue with a client's website: due to her server setup, the site had only a limited ability to get noticed by the Big 3 search engines. In fact, with the frequency of Panda/Penguin updates and the changes for "YingBook" {Yahoo, Bing, Facebook} social search, the site all but keeled over. The problem was that her pages were getting ignored, or the database would go offline for exceeding its limit of 128. Even after increasing the limit to 256, the site still moved at a snail's pace. Though not quite to the point of tears, she asked me why she could not just create static pages like the old days {HTML4}, using her current CMS platform.

My explanation was nothing short of a technical plethora, to which she was oblivious. Finally, I said to her, "We can set up a system to transfer all your data-based pages into static HTML that you can edit on the fly." Long story short, she, a marketing "gurette" of sorts, is now working with me to develop a strategy to offer the general population such a program, primarily for mobile/wireless devices like the iPad, Android and maybe even SUR.

The PHP-to-static-HTML approach works quickly and effectively: it takes the information and creates fully functional, SEO-friendly pages without the trapped-header or slug limitations often associated with CMS programming. It can even include breadcrumbs or anything else, really, that fits into a traditional web page. I first used this approach for rebuilding RSS feeds à la minute as pages were changed, updated, created or deleted from the system. It is a very efficient, lightweight measure that truly boosted visibility, provided an outlet from the data doldrums and is easy enough for the novice designer or non-techie to use.


  1. Collect Data from Database
  2. Get Static or Template Elements
  3. Parse PHP to HTML
  4. Reload Sitemap/RSS
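The four steps above can be sketched in miniature. This is a hedged illustration rather than the article's actual code: the {{TITLE}}/{{CONTENT}} placeholder convention, the render_page() helper and the template file name are my own assumptions.

```php
<?php
// Step 3 as a pure helper: fill template placeholders with the row
// data collected in step 1. Placeholder names are illustrative.
function render_page(array $row, string $template): string {
    return str_replace(
        ['{{TITLE}}', '{{CONTENT}}'],
        [htmlspecialchars($row['TITLE']), $row['CONTENT']],
        $template
    );
}

// Steps 1, 2 and 4 in outline (a live database handle is assumed):
// $row      = /* fetch the post record from the database */;
// $template = file_get_contents('template.html');   // static elements
// file_put_contents($row['TITLE'] . '.html', render_page($row, $template));
// then rebuild the sitemap/RSS feed last
```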

Database Checkpoint – Update or Create?

Simple and easy. Get the post title, etc. Do the error checking: empty, too short, too long. Apply the same checks to all the additional items like photo, video or audio formats (extensions), content word count and so on. Once complete, we take one of two paths. Does this title already exist? If yes: update the record by this author, back up the old page into a zip file, unlink the old page, create a new page and reload the RSS feed. If not: add the new record to the database, create a new page and reload the feed.

/* Check for Errors */
if ($TITLE == "") {
    die('Title cannot be empty');
}

/* Scrub the title: swap out unwanted characters, collapse double spaces */
$TITLE = str_replace($skunk, ' ', $TITLE);
$TITLE = str_replace('  ', ' ', $TITLE);

if (strlen($TITLE) < 12) {
    die('Please make the Pubtitle more than 12 letters');
}
if (strlen($TITLE) > 40) {
    die('Please make the Pubtitle less than 40 letters');
}

/* Check Title Database */
$title_chk = mysql_query("SELECT * FROM `PUBS` WHERE TITLE='" . mysql_real_escape_string($TITLE) . "'");

if (mysql_num_rows($title_chk) > 0) {
    /* Title exists: zip the old file before replacing it */
    $zip = new ZipArchive();
    if ($zip->open($TITLE . '-backup.zip', ZipArchive::CREATE) !== TRUE) {
        die("Could not open archive");
    }
    $zip->addFile('../pubs/' . $TITLE . '.html', $TITLE . '.html');
    $zip->close();
} else {
    /* New title: add the record */
    $result = @mysql_query($new);
    if (mysql_affected_rows() == 1) {
        /* record saved; carry on to page creation below */
    }
}


Static Page Creation

This can often be done using the file_get_contents() function, which grabs elements from an existing template and appends the new information to the static page. But I find this can be tedious work, especially since it requires breaking apart the template and rebuilding the elements line by line. So, to speed up the process, my approach is to start with a completely fresh page. I create new headers and meta tags, then transpose the entire content of the page. This allows massive flexibility as to which items will be displayed. And as a big fan of jQuery's .load('page.html') function, I keep the items necessary to perform certain functions off-page, which reduces digital clutter.

/* create master page per Title */

$handlehtml = fopen($HTML, 'w');

$loadhtml = '<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>' . $TITLE . ' | MY KEWL WEBSITE</title>
<meta name="description" content="' . $SCRIBE . '"/>
<meta name="keywords" content="' . $PHOTO1 . ', ' . $KEYWORDS . ', ETC"/>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8"/>
<meta name="author" content="AUTHOR"/>
<meta name="revisit-after" content="1 days"/>
<meta name="y_key" content=""/>
<meta name="msvalidate.01" content=""/>
<meta name="google-site-verification" content=""/>
<meta name="alexaVerifyID" content=""/>
<link rel="canonical" href="' . $TITLE . '.html"/>
<link href="css/abc.css" rel="stylesheet" type="text/css"/>
<script src="js/jquery-1.7.1.min.js"></script>
<script src="js/"></script>
<script src="js/abc.js" type="text/javascript"></script>
</head>';

Using this method, we simply re-create a typical HTML page, being certain to include all the header, meta and linked scripts, like CSS and JavaScript. Notice we use the write ('w') mode rather than append ('a'). This is important: 'w' truncates the file on open, so the new page fully replaces any old content. Again, if you have a template, you can use append to add the entire <body>CONTENT ELEMENTS</body>, being sure to close the </html> before closing or moving the file.
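The write-versus-append distinction is easy to demonstrate. In this throwaway sketch (a temp file, not the real page path), 'w' truncates the file so only the fresh markup survives, while a subsequent 'a' tacks content onto the end:

```php
<?php
// 'w' truncates on open; 'a' preserves existing content and appends.
$f = tempnam(sys_get_temp_dir(), 'demo');
file_put_contents($f, "old page\n");     // pre-existing content

$h = fopen($f, 'w');                     // truncates: "old page" is gone
fwrite($h, "<html>fresh</html>\n");
fclose($h);

$h = fopen($f, 'a');                     // appends after the fresh markup
fwrite($h, "<!-- appended -->\n");
fclose($h);
```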

Now, let's say, for example, you or the author want a backup of the OLD file before creating a new one. You would create a .zip, .gz or .tar file, drop the old file in and then unlink the old file before creating the new one. And, again, this is me: very elementary, methodical, almost OCD when it comes to programming. I am buggered by errors and mistakes, so I take a very mechanical, robotic approach to functions like these. {A most common error, believe it or not, is a missing semicolon or curly brace that would taunt me for the better part of a day.}
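That backup routine can be sketched as a small helper. The function name and the zip-file naming here are my own illustrative assumptions, not the article's exact code; the flow is the one just described: archive the old page, then unlink it before the new one is written.

```php
<?php
// Hedged sketch: zip the old static page, then remove it.
function backup_and_remove(string $page, string $zipPath): bool {
    if (!is_file($page)) {
        return false;                               // nothing to back up
    }
    $zip = new ZipArchive();
    if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
        return false;                               // could not open archive
    }
    $zip->addFile($page, basename($page));          // store under its own name
    $zip->close();                                  // flush the archive to disk
    return unlink($page);                           // drop the stale page
}
```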

Handle the Data, Write the Flat File and Close Up {it's like digital surgery}

fwrite($handlehtml, $loadhtml);
fclose($handlehtml);

From here, the sky's the limit. Add any and every element you prefer, be it traditional HTML4, XML, HTML5, etc. Write the $loadhtml string into the file via the $handlehtml handle. Do not forget to do this, or you will end up with an empty page. Once the file is complete, reload the RSS feed/sitemap file. This is vital for the Big 3, which will pick up the fresh page, as an updated version or a brand-new one, when the feed reloads (usually every 20 minutes). This enhances the chances of being indexed quickly, and when the spider comes crawling, it finds that clean link and new HTML page in perfect press.

  1. Use the same process to rebuild RSS Feed/Sitemap
  2. Delete the old RSS/XML
  3. Create A New RSS/XML

Handle and write the information from the database, ordered newest-first with a limit of 15, so as to force the updated or new page to be first on the list! Add whichever additional headers you need to redirect to the new static HTML page, trigger the old one for download, or revert back to the author's account.
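A minimal sketch of such a feed rebuild, assuming a PUBS table with TITLE and PUBDATE columns and a build_rss() helper of my own invention (the channel details are placeholders):

```php
<?php
// Build a fresh RSS 2.0 document from the newest rows, newest first,
// so the updated or new page lands at the top of the feed.
function build_rss(array $items, string $siteUrl): string {
    $xml  = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
    $xml .= "<rss version=\"2.0\"><channel>\n";
    $xml .= "<title>MY KEWL WEBSITE</title><link>$siteUrl</link>\n";
    $xml .= "<description>Latest static pages</description>\n";
    foreach ($items as $row) {
        $xml .= '<item><title>' . htmlspecialchars($row['TITLE']) . '</title>'
              . "<link>$siteUrl/" . rawurlencode($row['TITLE']) . '.html</link>'
              . '<pubDate>' . date(DATE_RSS, $row['PUBDATE']) . '</pubDate>'
              . "</item>\n";
    }
    return $xml . "</channel></rss>\n";
}

// Delete the old feed, then write the new one (the query is illustrative):
// $rows = /* SELECT TITLE, PUBDATE FROM PUBS ORDER BY PUBDATE DESC LIMIT 15 */;
// unlink('feed.xml');
// file_put_contents('feed.xml', build_rss($rows, 'https://example.com'));
```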

Did I mention PHP is da bomb? Huzzah!

Applied philosopher and programmer, specializing in PHP, jQuery and CSS. His approach to web development is a modern, minimalistic hybrid of the logical and creative. Full-time web-solutions developer as well as part-time writer. "I enjoy teaching, learning and exploring new and challenging aspects of digital fandango!" More articles by Charles James
  • Justin Dalton

    This is a neat idea, but why not just utilize a caching library to keep from regenerating the pages every time?

    • I agree, full page caching is basically generating HTML.

  • Ajit

    Liked the PHP to HTML5 tutorials, and
    the functions are very easy…

  • Hi, thanks for commenting.
    The idea is to create fresh new pages with updated information. Reloading from cache would not work, as the page is no longer data-driven at run time. It is a clean static page with fresh content. To get newly updated information, a cached page would require constant live-data connectivity; in this case, the function uses the data connection once, then makes the page. The updated information is stored in the database for future use but does not rely on a constant connection, which is one large issue with CMS platforms. This also complies well with the new Page Layout and Page Map/Microdata elements for better SERP, since authorship, pubdate and the new information are reloaded into the XML/RSS as well as the page itself.

  • bullzeyezm

    It's great that they can fly

  • It's a pity CMS frameworks don't do this…

    50 or more MySQL queries just to show a page, as many of them run, is just ridiculous.

    As is waiting more than a second or so for a page to start loading!

    My guess is that most people building them only test on dedicated boxes with terabytes of RAM and one or two users, and have forgotten about the real world, where shared hosts and VPS hosts are much more common and even small sites are constantly getting hit by crawlers.

    Caching helps, but there are many situations where pre-generating static pages might be better.

    E.g., what if you run a web-service API endpoint, one of the users decides to let people enter search queries from a form, and you suddenly find yourself getting hit with lots of unique search terms every second? (And you don't want to block them, because they are not doing anything naughty.) How many CMS platforms could cope with that on shared or VPS hosting, even with caching?

  • battlemidget

    How does this help a clients cost when deploying to the cloud when all system resources are calculated? Wouldn’t it be better to cache the content, and save on the expense of file system operations by doing a less resource intensive action like querying a database? What happens when the clients site grows to 30-40 pages and you’ll have to regenerate those pages when a template change occurs? Sure you can build on a beefier machine locally and transfer over but wouldn’t you save more cycles by changing the template, uploading once, and letting the database provide the content?

  • Blogs are a great place to learn about PHP regardless of your skill level. Excellent work, guys; keep the articles coming, spreading the word via the power of technology.

  • Nice update……

  • bobo007

    uh, is this site dead?

  • web development and design

    Nice content. Thank you for sharing your wonderful thoughts with us.
    This is really awesome and very useful post!!!
