Hi, and thanks for your response.
I'll try to answer your questions:
- The script uses IDs from a "master index" table to assign a name to each of the new tables it creates (see the sketch after this list).
- The script loads a lot of data, sometimes millions of records - hence my reasoning for splitting the data into smaller tables (which also answers your 5th point).
- The db is not write-only. It is queried by other scripts, which also use the IDs from the "master index" table to determine which set of tables they should be using.
- I want to prevent combining the data from each search (another reason to keep the data in separate tables).
- see points above :)
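To make the layout concrete, here is a minimal sketch of the scheme described above, assuming a Python script using the MySQL Connector/Python driver; the table and column names (master_index, search_name, results_<id>) are hypothetical stand-ins for whatever the real script uses:

```python
import mysql.connector  # MySQL Connector/Python driver

conn = mysql.connector.connect(user="user", password="pass", database="searches")
cur = conn.cursor()

def create_search_table(search_name):
    """Register a search in the master index and create a table named after its id."""
    cur.execute("INSERT INTO master_index (search_name) VALUES (%s)", (search_name,))
    search_id = cur.lastrowid  # the auto-increment id becomes part of the table name
    cur.execute(
        f"CREATE TABLE results_{search_id} ("
        "  record_id INT AUTO_INCREMENT PRIMARY KEY,"
        "  data VARCHAR(255)"
        ")"
    )
    conn.commit()
    return search_id
```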
My whole aim is to speed up the MySQL side of things. I have achieved this so far by reducing the amount of data in each table (i.e. instead of one table with 1,000,000 records, 900,000 of which are not needed for the specified query, I've split it into 10 tables with 100,000 records each).
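On the read side, a querying script only ever touches the one table belonging to its search, which is where the speed-up comes from. Again just a sketch with the same hypothetical names, taking an open cursor like the one in the snippet above:

```python
def query_search(cur, search_id, pattern):
    """Query only the table for one search, skipping the other searches' rows."""
    # The id from master_index selects the table, so at most ~100,000 rows
    # are involved instead of 1,000,000.
    cur.execute(
        f"SELECT record_id, data FROM results_{search_id} WHERE data LIKE %s",
        (pattern,),
    )
    return cur.fetchall()
```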
That has led me to my initial question: will too many tables slow down the db, and if so, would it be quicker to use separate databases?
I guess separate dbs would also have the advantage of better security, avoiding mixing the search data in any way.
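For what it's worth, if the tables were moved into one database per search, the query side would barely change, since MySQL lets you qualify a table as database.table. A hypothetical variant of the function above:

```python
def query_search_per_db(cur, search_id, pattern):
    """Same query, but with one database per search instead of one table per search."""
    # Qualifying the table as database.table avoids needing a USE statement;
    # the search_<id> database name is a hypothetical naming scheme.
    cur.execute(
        f"SELECT record_id, data FROM search_{search_id}.results WHERE data LIKE %s",
        (pattern,),
    )
    return cur.fetchall()
```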