Excessive file access in a perl script? (10)

1 Name: #!/usr/bin/anonymous : 2011-07-22 16:53 ID:5HtMA4vP

I'm working on this perl script for a website - I have basically two types of pages, A and B, each with a title, some text and other details stored in an individual text file. Each A is associated with a number of Bs, up to 100 or so, which are listed on its page. To generate the list of Bs for an A page, the script has to open each relevant B file and read the title out of it.
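
Roughly what that part looks like, simplified (the paths and the "title on the first line" layout here are just for illustration, not exactly what I have):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Sketch only: an A file lists the IDs of its Bs one per line, and each
    # B file keeps its title on the first line.
    sub list_titles_for_a {
        my ($a_id) = @_;
        open my $a_fh, '<', "a/$a_id.txt" or die "can't open a/$a_id.txt: $!";
        my @b_ids = map { chomp; $_ } <$a_fh>;
        close $a_fh;

        my @titles;
        for my $b_id (@b_ids) {
            # One open() per B - this is the bit that adds up to 100+ file
            # accesses for a single A page.
            open my $b_fh, '<', "b/$b_id.txt" or next;
            my $title = <$b_fh>;
            close $b_fh;
            next unless defined $title;
            chomp $title;
            push @titles, $title;
        }
        return @titles;
    }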

So as it stands, to generate certain A pages it might be opening 100+ different B files. Now I don't know jack shit about servers, but I am aware that file access takes time. And given that it's on shared hosting, am I likely to run into problems here when it goes public? Would I be better off putting all the Bs in one big file, or splitting them into blocks of, say, 10-20, or even just biting the bullet and using some SQL database? Or is it fine the way it is?
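
For reference, this is the sort of thing I mean by "one big file": a single index of B titles, regenerated whenever a B changes, that gets read once per A page instead of opening every B. The file name and tab-separated format here are just made up for the sketch:

    use strict;
    use warnings;

    # Sketch only: "b/titles.idx" and the "b_id<TAB>title" format are
    # invented - the point is one open() per A page instead of ~100.
    sub load_title_index {
        my %title_for;
        open my $fh, '<', 'b/titles.idx' or die "can't open title index: $!";
        while (my $line = <$fh>) {
            chomp $line;
            my ($b_id, $title) = split /\t/, $line, 2;
            $title_for{$b_id} = $title;
        }
        close $fh;
        return \%title_for;
    }

    # An A page would then do:
    #   my $titles = load_title_index();
    #   my @list   = map { $titles->{$_} } @b_ids;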

9 Name: 1 : 2011-08-01 04:57 ID:Heaven

>>7-8 yeah that's a good plan actually. I may well do that.

10 Name: #!/usr/bin/anonymous : 2012-01-15 16:18 ID:2W+UB1mm

Just curious: I have a similar sort of site to >>1, but it uses a MySQL database. Is it worth doing something like what >>7 suggested with that too?
