Sorry it has taken me so long to respond. I had an emergency project come in that has taken my attention away from this one for the last little while.
The reason I am building the large PDF files is that they get passed to a piece of printing software, which accepts only one file per print job. We decided to break things down along the lines I have defined to reduce the processing overhead, both on my end and then again in production.
The way I have the opening/closing structured was meant to minimize those operations by opening/closing only when the page count determined that the PDF needed to go into a different file. After reading your comment and thinking about it more, though, if I assume the worst case, where every set is as mixed as possible, I would indeed be doing a large number of opens/closes. However, do you think that performance hit would be greater than opening/closing each individual PDF twice, as opposed to once but with a higher number of combined-file opens/closes?
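To make that worst case concrete, here is a small, hypothetical Python sketch (not your actual code, and the destination labels are made up) that just counts how many times the combined output file gets opened when the writer switches files only on a destination change:

```python
def output_opens(destinations):
    """Count opens of combined output files when the current output
    stays open and we only close/reopen on a destination change."""
    opens = 0
    current = None
    for dest in destinations:
        if dest != current:
            opens += 1          # close previous output, open the new one
            current = dest
    return opens

# Worst case: every consecutive set lands in a different output file.
worst = ["big", "small"] * 50          # 100 sets, fully interleaved
# Best case: sets arrive already grouped by destination.
best = ["big"] * 50 + ["small"] * 50   # 100 sets, two runs

print(output_opens(worst))  # 100 opens, one per set
print(output_opens(best))   # 2 opens, one per output file
```

So the two-pass approach trades a fixed 2N opens of the individual PDFs for the guaranteed best-case (one open per combined file), while the one-pass approach ranges from near-best to one combined-file open per set, depending on how mixed the input is.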
In reply to Re^2: Am I on the right track?
by Pharazon
in thread Am I on the right track?
by Pharazon