in reply to Writing to many (>1000) files at once

Can you describe your constraints a little more? Is the issue to get the data out as fast as possible and to all targets at once? Or is it more about updating a set of data as a whole?

Anyway, turning the problem around, I was wondering if it would work to keep all your report files in or under a directory, say /reports. Then duplicate it as, say, /reports_updating, where you can take your time opening and writing/appending files. When ready, swap the directory names. Instant update as far as external processes are concerned.
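The swap itself can be done with two rename() calls, each of which is atomic on a POSIX filesystem as long as source and destination are on the same filesystem (rename can't cross mount points). A minimal sketch of the idea, assuming the directory names above; the report filename, its contents, and the reports_old name are made up for illustration:

```perl
use strict;
use warnings;
use File::Path qw(make_path remove_tree);

my $live    = 'reports';           # what external processes read
my $staging = 'reports_updating';  # where we take our time writing
my $retired = 'reports_old';       # hypothetical name for the outgoing copy

make_path($live);                  # stand-in for the existing live tree

# Build the new set of reports in the staging directory.
make_path($staging);
open my $fh, '>', "$staging/daily.txt" or die "open: $!";
print {$fh} "fresh report data\n";
close $fh;

# Publish: each rename() is a single atomic step, so readers see
# either the old tree or the new one, never a half-written file.
rename $live,    $retired or die "retire live dir: $!";
rename $staging, $live    or die "publish staging dir: $!";
remove_tree($retired);             # clean up the outgoing copy
```

One caveat: between the two renames there is a brief instant when /reports doesn't exist at all; if that matters, making /reports a symlink and atomically repointing it avoids the gap.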

--marmot

Re^2: Writing to many (>1000) files at once
by suaveant (Parson) on Aug 16, 2006 at 14:55 UTC
    Yes, trying to get the data out quickly... actually I think what we have will be fast enough; I was just curious whether there was a better way to handle the multiple files than what I was doing. As it is, I generate the files I need in about a minute.

    The directory thing's not really a problem since the program triggers events that a monitor handles to tell the next step to start.

                    - Ant