Excessive file access in a Perl script? (10)

1 Name: #!/usr/bin/anonymous : 2011-07-22 16:53 ID:5HtMA4vP

I'm working on a Perl script for a website. I have basically two types of pages, A and B, each with a title, some text, and other details stored in an individual text file. Each A is associated with a number of Bs, up to 100 or so, which are listed on its page. To generate the list of Bs for an A page, the script has to open each relevant B file and read the title out of it.
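
For reference, the per-page loop looks roughly like this (a minimal sketch, not my actual code; the paths are made up and I'm assuming the title sits on the first line of each B file):

#!/usr/bin/perl
use strict;
use warnings;

# For one A page, open every associated B file and pull the title
# off its first line. Paths and layout here are hypothetical.
my @b_files = glob("data/b/*.txt");   # the Bs listed on this A page

my @titles;
for my $file (@b_files) {
    open my $fh, '<', $file or die "Can't open $file: $!";
    my $title = <$fh>;                # assuming the title is line one
    close $fh;
    next unless defined $title;       # skip empty files
    chomp $title;
    push @titles, $title;
}

print "$_\n" for @titles;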

So as it stands, generating certain A pages might mean opening 100+ different B files. Now, I don't know jack shit about servers, but I am aware that file access takes time. Given that it's on shared hosting, am I likely to run into problems when the site goes public? Would I be better off putting all the Bs in one big file, splitting them into blocks of, say, 10-20, or just biting the bullet and using an SQL database? Or is it fine the way it is?
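
For what it's worth, if I did go the database route, SQLite through DBI looks like the lightest option on shared hosting: no server process, just one file. A minimal sketch of what the lookup might become (the database file, table, and column names are all made up for illustration):

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# One query replaces 100+ file opens. Database file, table, and
# column names below are hypothetical.
my $dbh = DBI->connect("dbi:SQLite:dbname=site.db", "", "",
                       { RaiseError => 1 });

my $a_id = 42;   # hypothetical id of the A page being generated
my $titles = $dbh->selectcol_arrayref(
    "SELECT title FROM b_pages WHERE a_id = ? ORDER BY title",
    undef, $a_id,
);

print "$_\n" for @$titles;
$dbh->disconnect;

That would turn the whole listing into a single query, at the cost of keeping the table in sync with the text files (or dropping the text files entirely).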
