Pros and Cons of SSI for dynamic-static delivery
Posted on 2004-11-14
We find ourselves gravitating toward turning whole sites into foo.shtml pages and putting the load on Apache to build each page from SSIs; a typical example is below, where (obviously) all elements common to all pages are pulled in just before delivery. Now we are beginning to do more than simple text includes: execs that drive CGIs, which test the URI and other things and return chunks of markup based on lots of other logic that the CGI works out...
So far this site dev path seems like a walk in the park. In fact, our hinduismtoday.com magazine archive site is completely SSI-driven, top to bottom. The advantages are obvious: instantaneous global updates just by editing a few files... the final page *appears* static and is indexable by search engines, as opposed to database-driven pages, which may not be so easily indexed (though I may be making wrong assumptions about that last point, as I don't have experience with database-generated web sites), etc. Of course you know all that...
Also, we find the page load times very fast compared to .asp- or Java-generated sites. So, before I go turning our other web sites into complete foo.shtml parsed-page sites, I wanted to air this out here to see if there are downsides that others have actually experienced. I'm not really looking for "opinions" (though opinions are also welcome) but hard experience from those who have actually done it, or who have answered the questions of others who have done it, and can give real-life, specific "consequences", or "if you go with parsing all the pages, here is what you will face one day" answers...
So far the CPU speed on our box and the current version of Apache seem to have no problems, but then our hits per month are not that high yet. Webalizer reports about 200,000 unique IPs hitting our sites per month; rounding that down for rotating IPs within a single session, it's probably more like 170,000 unique IPs per month across all sites. So one specific question: at what point would parsing all pages start to slow delivery, i.e. what is the hit rate per minute at which parsing pages starts to choke the CPU? And is this an issue at all? I'm talking about roughly 80,000 HTML pages overall, across all sites, delivered from the same box, many of which are accessed only infrequently.
<!--#include virtual="/ssi/copyright.txt" -->
<TITLE>Hinduism Today | Jul 1993</TITLE>
<!--#include virtual="/ssi/article_header.txt" -->
<!--Start article text-->
<!--Current issue date ie. "May/June, 2001"-->
<!--Start exported text here-->
<font size="7">Article Headlines (actual text removed for this post)</font><font size="7"></font>
<p><font size="4">article Text (actual text removed for this post)</FONT>
<!--#exec cgi="/cgi-bin/local_nav_include.cgi" -->
<!--End article text-->
<!-- ======= FOOTER TEXT ======= -->
<!--#include virtual="/ssi/article_footer.txt" -->
<!-- =========================== -->
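About that #exec line: here is roughly what a nav CGI like that does, as a simplified sketch in Python rather than the real local_nav_include.cgi, with the paths and markup made up. One way for an exec'd CGI to see which page it is being included into is the DOCUMENT_URI environment variable that mod_include makes available alongside the standard CGI variables; the script branches on it and prints a nav chunk.

#!/usr/bin/env python3
# Hypothetical sketch only -- not the actual local_nav_include.cgi from this post.
# mod_include exposes the parsed page's URI to an exec'd CGI as DOCUMENT_URI,
# so the script can pick a navigation chunk based on where the page lives.
import os

uri = os.environ.get("DOCUMENT_URI", "")   # e.g. /archives/1993/07/index.shtml (assumed layout)

print("Content-Type: text/html")           # CGI output must begin with a header block
print()

if uri.startswith("/archives/"):           # illustrative path test only
    print('<a href="/archives/">Archive index</a> | <a href="/archives/search.shtml">Search</a>')
else:
    print('<a href="/">Home</a>')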
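On the "when does parsing choke the CPU" question, the practical answer is probably to benchmark it on the box itself: throw concurrent requests at one of the .shtml pages and watch CPU and latency as the concurrency goes up. Apache's own ab tool does exactly this; the rough Python sketch below (URL and request counts are just placeholders) does the same thing.

#!/usr/bin/env python3
# Rough load-test sketch: fetch one parsed page many times with concurrent workers
# and report the request rate and mean latency. Ramp WORKERS up while watching CPU
# on the server to see where mod_include parsing starts to hurt. Placeholders only.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost/archives/1993/07/index.shtml"   # hypothetical test page
REQUESTS = 500
WORKERS = 20

def fetch(_):
    start = time.time()
    with urllib.request.urlopen(URL) as resp:
        resp.read()
    return time.time() - start

started = time.time()
with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    latencies = list(pool.map(fetch, range(REQUESTS)))
elapsed = time.time() - started

print("requests/sec: %.1f" % (REQUESTS / elapsed))
print("mean latency: %.0f ms" % (1000 * sum(latencies) / len(latencies)))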