I'm working on this perl script for a website - I have basically two types of pages, A and B, each with a title, some text and other details stored in an individual text file. Each A is associated with a number of Bs, up to 100 or so, which are listed on its page. In order to generate the list of Bs for an A page, it has to open each relevant B file to read the title out of it.
So as it stands, to generate certain A pages it might be opening 100+ different B files. Now I don't know jack shit about servers but I am aware that file access takes time. And given that it's on shared hosting, am I likely to run into problems here when it goes public? Would I be better off putting all the Bs in one big file, or splitting them into blocks of, say, 10-20, or even just biting the bullet and using some SQL database? Or is it fine the way it is?
if you want database-like functionality but don't want to use SQL, try Storable:
use Storable;
my %big_hash = (
    ...
);
# writes the whole hash out to a single file
store \%big_hash, 'this_file.db';
and to read it back:
use Storable;
my %hash = %{ retrieve 'this_file.db' };   # retrieve returns a reference, so dereference it
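for your case specifically, a rough sketch of one way to use it - the paths, the ids, and the "title is the first line" bit are all guesses at your setup, so adjust as needed:
use strict;
use warnings;
use Storable;

# build one index file mapping B id -> title
# ('b/*.txt' and 'b_titles.db' are placeholder names for your layout)
sub rebuild_index {
    my %titles;
    for my $file (glob 'b/*.txt') {
        my ($id) = $file =~ m{([^/]+)\.txt$};
        open my $fh, '<', $file or next;
        chomp(my $title = <$fh>);   # assuming the title is the first line of a B file
        close $fh;
        $titles{$id} = $title;
    }
    store \%titles, 'b_titles.db';
}

rebuild_index();                    # in practice, run this only when a B is added or edited

# generating an A page is then a single read instead of 100+ opens:
my $titles = retrieve 'b_titles.db';
my @b_ids  = ('b001', 'b017');      # whatever B ids this particular A lists
print "$titles->{$_}\n" for @b_ids;
the point is you pay the cost of opening every B file once per edit, not once per page view.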
*** DOUBLE POST ***
note that Storable works with scalars and arrays too of course
>>2-3 Thanks, I'll keep that in mind.
do you know if the way I'm doing it now will cause any significant performance issues though? I'm happy enough sticking with it for now if it works just as well, rather than redoing everything.
you will take a significant performance hit from reading 100+ files every time you generate a page - every open/read/close is a separate system call and potentially a disk seek, so 100 small reads cost a lot more than pulling the same data out of one file
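if you want actual numbers for your setup instead of taking my word for it, the Benchmark module can compare the two approaches. rough sketch - it assumes the single index file from >>2-3 already exists and that the paths match yours:
use strict;
use warnings;
use Benchmark qw(cmpthese);
use Storable;

my @b_files = glob 'b/*.txt';   # placeholder path for your B files

cmpthese(-3, {
    # what the script does now: open every B file and read its first line
    many_files => sub {
        for my $file (@b_files) {
            open my $fh, '<', $file or next;
            my $title = <$fh>;
            close $fh;
        }
    },
    # one retrieve from a prebuilt Storable index
    one_store => sub { my $titles = retrieve 'b_titles.db' },
});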
>>5
ok, thanks. I suspected as much - I was just vainly hoping there might be some obscure reason that it'd all be fine and I wouldn't have to rewrite everything. :D
you should really generate the A pages as static html files and only regenerate them when the content changes.
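something along these lines for the "only when the content changes" part, comparing file mtimes - every name here is a placeholder, wire it into whatever your script already does to build the page:
use strict;
use warnings;

# write an A page as static html, but only when one of its source files is
# newer than the existing html (paths and the page-building sub are placeholders)
sub regenerate_if_stale {
    my ($html_file, $build_page, @source_files) = @_;
    my $html_mtime = (stat $html_file)[9] || 0;    # 0 if the html doesn't exist yet
    return unless grep { ((stat $_)[9] || 0) > $html_mtime } @source_files;
    open my $out, '>', $html_file or die "can't write $html_file: $!";
    print {$out} $build_page->();                  # your existing page-generation code
    close $out;
}

# e.g. regenerate_if_stale('a/42.html', sub { build_a_page(42) }, 'a/42.txt', glob 'b/*.txt');
that way a visitor just gets a plain html file served by the web server, and your perl only runs when you edit something.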
I agree with >>7