I'm working on this Perl script for a website - I have basically two types of pages, A and B, each with a title, some text and other details stored in an individual text file. Each A is associated with a number of Bs, up to 100 or so, which are listed on its page. In order to generate the list of Bs for an A page it has to open each relevant B file to read the title out of it.
So as it stands, to generate certain A pages it might be opening 100+ different B files. Now I don't know jack shit about servers, but I am aware that file access takes time. And given that it's on shared hosting, am I likely to run into problems here when it goes public? Would I be better off putting all the Bs in one big file, or splitting them into blocks of, say, 10-20, or even just biting the bullet and using some SQL database? Or is it fine the way it is?
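To be concrete, the list-building part looks roughly like this (simplified, paths made up, and assume the title is the first line of each B file):

my @b_titles;
for my $b_id (@b_ids_for_this_a) {
    open my $fh, '<', "b/$b_id.txt" or next;
    my $title = <$fh>;    # only the title line is needed
    chomp $title;
    close $fh;
    push @b_titles, $title;
}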
If you want database-like functionality but don't want to use SQL, try Storable:
use Storable;

my %big_hash = (
    # ... your data here ...
);

# serialize the whole hash to one file
store \%big_hash, 'this_file.db';
and to read it back:
use Storable;

# retrieve() returns a reference, so dereference it into a hash
my %hash = %{ retrieve 'this_file.db' };
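For your A/B setup you could build one hash of all the B titles keyed by ID whenever a B file changes, then each A page only has to retrieve that one file instead of opening 100. Rough sketch, file names and layout made up:

use Storable;

# rebuild the index whenever a B file is added or edited
my %b_titles;
for my $b_file (glob 'b/*.txt') {
    open my $fh, '<', $b_file or next;
    my $title = <$fh>;                       # assuming the title is the first line
    chomp $title;
    close $fh;
    my ($id) = $b_file =~ m{([^/]+)\.txt$};
    $b_titles{$id} = $title;
}
store \%b_titles, 'b_titles.db';

# then when generating an A page: one read instead of 100 opens
my %titles = %{ retrieve 'b_titles.db' };
print "$titles{$_}\n" for @b_ids_for_this_a;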
Note that Storable works with scalars and arrays too, of course - you just always pass store() a reference.
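For example, with an array (file name made up):

use Storable;

my @list = ('foo', 'bar', 'baz');
store \@list, 'list.db';                  # pass a reference, same as with a hash
my @copy = @{ retrieve 'list.db' };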