A few weeks ago I wrote about how I moved this website from a proprietary content management system to one where the data is managed from a FileMaker database. In this post I look at how that data is cached to keep the site fast and the load on the server to a minimum (very important for high-traffic sites).
To recap, we pull the data from FileMaker Server using a SQL SELECT statement:
$query = 'SELECT Title, PostSummary, pagelink
FROM Posts
WHERE isOnline = 1 AND Category = \'blog\'
ORDER BY Page_Date DESC';
$blogposts = fmExecuteSQL ( $dbname, $query );
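fmExecuteSQL is a helper function rather than a built-in PHP function, and its internals are not important here. Purely for illustration, here is a minimal sketch of the kind of wrapper it could be, assuming the SQL is run against FileMaker Server's ODBC interface via PDO; the DSN, credentials and fetch mode are illustrative assumptions, not the production code.

// Illustrative sketch only: run a SQL query against FileMaker Server over ODBC.
// The DSN name and the 'web_user' / 'secret' credentials are assumptions.
function fmExecuteSQL ( $dbname, $query ) {
    $pdo  = new PDO ( 'odbc:DSN=' . $dbname, 'web_user', 'secret' );
    $stmt = $pdo->prepare ( $query );
    $stmt->execute ();
    // Each row comes back numerically indexed, in SELECT column order.
    return $stmt->fetchAll ( PDO::FETCH_NUM );
}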
In order to add caching, we first try to get the data from the cache.
$page_cache = fmCacheGet ( $_GET['pagelink'] );
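fmCacheGet is likewise a helper. As a sketch only, here is a simple file-based version, assuming the cache is keyed on the page link and returns the two keys used below; the cache directory is an illustrative assumption.

// Illustrative file-based cache lookup. Returns null when no cache file exists,
// otherwise the cache age in seconds plus the cached contents.
function fmCacheGet ( $pagelink ) {
    $file = '/var/cache/site/' . md5 ( $pagelink ) . '.cache';
    if ( ! is_file ( $file ) ) {
        return null;
    }
    return array (
        'Cache_Age'      => time () - filemtime ( $file ),
        'Cache_Contents' => unserialize ( file_get_contents ( $file ) )
    );
}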
If there is no cache available, if the cache is too old, or if a reload has been forced, we rebuild the data. For reference, some useful cache ages in seconds:
1 hour (3600) | 24 hours (86400) | 1 week (604800) | 4 weeks (2419200)
if ( ( $page_cache === null ) or ( $page_cache['Cache_Age'] > 2419200 ) or ( isset ( $_GET['forcereload'] ) && $_GET['forcereload'] == 1 ) ) {
If any of these is the case, we build and execute the SQL statement.
$query = 'SELECT Title, PostSummary, pagelink
FROM Posts
WHERE isOnline = 1 AND Category = \'blog\'
ORDER BY Page_Date DESC';
$blogposts = fmExecuteSQL ( $dbname, $query );
Then we put the result into the cache.
fmCachePut ( $blogposts, $_GET['pagelink'] );
If the cache exists and is not too old, we get the contents of the cache.
} else {
$blogposts = $page_cache['Cache_Contents'];
}
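fmCachePut is the write side of the cache. A sketch under the same file-based assumption as above, serialising the result set to the file that fmCacheGet reads:

// Illustrative counterpart to fmCacheGet: serialise the data to the
// cache file for this page link (the path is an assumption, as above).
function fmCachePut ( $data, $pagelink ) {
    $file = '/var/cache/site/' . md5 ( $pagelink ) . '.cache';
    file_put_contents ( $file, serialize ( $data ), LOCK_EX );
}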
As before, we then loop over the results to output the data.
foreach ( $blogposts as $row ) {
    // Convert the row into an associative array (makes the results easier to use).
    $columns = fmGetRow ( $row, 'Title, PostSummary, pagelink' );
    echo '<h4>' . $columns['Title'] . '</h4>';
    echo html_entity_decode ( $columns['PostSummary'] );
}
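fmGetRow pairs the column names with the values in the row. A minimal sketch, assuming each row is numerically indexed in the same order as the SELECT column list:

// Illustrative sketch: combine a comma-separated column list with a
// numerically indexed row to give an associative array.
function fmGetRow ( $row, $columnList ) {
    $names = array_map ( 'trim', explode ( ',', $columnList ) );
    return array_combine ( $names, $row );
}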
That’s it, really: caching implemented. Now the thousands of daily visitors to this site will not slow the server down at all!