Hi
I'm building a page which uses arrays to display different content based on the page's URL (dynamic pages).
However, it looks like these arrays will end up getting quite extensive, and I was wondering if this can cause problems or latency later on. In other words: is there a maximum size for arrays (or a point after which it will become too cumbersome for the server to handle properly)?
Thanks!
Do your arrays hold data for ALL the pages? e.g. Say you have an array of page titles, only 1 title exists per page, so when you say "show me the title for page 246", do you get that data from the array which contains all page titles?
If so, then every single page request will have to load a LOT of (relatively) junk data.
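To illustrate the problem (the array contents and variable names here are hypothetical), the "everything in one array" pattern means every entry is parsed on every request, even though only one is used:

```php
<?php
// Hypothetical example: titles for ALL pages live in one array,
// so every request pays the cost of defining all of them.
$aTitles = [
    1   => "Home",
    2   => "About",
    246 => "Contact",
    // ... hundreds more entries, all parsed on every request
];

$iPage = (int) ($_GET["pageIndex"] ?? 1);
echo $aTitles[$iPage] ?? "Unknown page";
?>
```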
There are several ways you could limit the amount of data per page.
1 - Include files with the name being generated by the page request ...
<?php
// Build the include filename from the requested page index.
// basename() strips any directory parts, so a crafted pageIndex
// cannot pull in files from outside this directory.
$sFile = "inc" . basename($_GET["pageIndex"]) . ".php";
include $sFile;
?>
2 - Use a DB (much more efficient).
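A rough sketch of option 2, using PDO. The SQLite file, the "pages" table, and its column names are all assumptions here; the point is that each request fetches only the one row it needs instead of defining every page's data:

```php
<?php
// Assumed schema: pages(id INTEGER, title TEXT, body TEXT).
$oDb = new PDO("sqlite:pages.db");
$oDb->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Prepared statement: the page index is bound, never concatenated in.
$oStmt = $oDb->prepare("SELECT title, body FROM pages WHERE id = ?");
$oStmt->execute([(int) ($_GET["pageIndex"] ?? 1)]);
$aPage = $oStmt->fetch(PDO::FETCH_ASSOC);

echo $aPage ? $aPage["title"] : "Page not found";
?>
```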
You COULD use the page with the arrays to physically generate the other pages. This is not as daft as it sounds.
You currently have 1 page with the arrays in it, which I assume you maintain manually. Use that manually maintained page to generate all the subsequent pages from the arrays and store them on the server. You would need write access to the directory containing the pages, and you would only need to re-generate them when you made a change to the page holding the arrays.
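As a sketch of that generation step (the array contents and the `pages/` directory are assumptions), a script like this would be re-run by hand whenever the master arrays change:

```php
<?php
// Master arrays, normally maintained by hand in the existing page.
$aTitles = [1 => "Home", 2 => "About"];
$aBodies = [1 => "Welcome to the site.", 2 => "Who we are."];

foreach ($aTitles as $iPage => $sTitle) {
    $sHtml = "<h1>" . htmlspecialchars($sTitle) . "</h1>\n"
           . "<p>" . htmlspecialchars($aBodies[$iPage]) . "</p>\n";
    // Requires write access to the pages/ directory on the server.
    file_put_contents("pages/page{$iPage}.html", $sHtml);
}
?>
```

The served pages are then plain static files, so normal requests never touch the arrays at all.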
If the page holding the arrays is also dynamically generated, then instead of using the arrays, simply rely on the source data.
Maybe.
Richard.