Refactor your code into subroutines (a small sketch appears at the end of this reply). There is nothing wrong with small subroutines, but there IS something wrong with routines that run past 30 lines; 50 lines is certainly a problem. Well, actually, there is one thing wrong with very small routines ... if one is invoked a few million times, the expense of invoking the function can far exceed the time spent executing the actual code. At that point, you have two options:
You could follow the paradigm of DOS programs from the 640K days: group the routines into categories and put each category into a separate file. The modern, clean, reliable and scalable way to do this is to give each file a name ending in ".pm", say "Update.pm", and put package Update; as one of the first lines in the file. That, of course, is a module; see merlyn's articles or perldoc for more. The old-fashioned, unreliable, un-robust, un-scalable way (you didn't think it got replaced simply because the Perl developers got bored, did you?) is to make the file an ordinary Perl script and pull it in with require "update.pl" ... assuming that's the file's name, of course. The subroutines will then be available to be invoked, and any code outside the routines will run at the time you require the script.
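For concreteness, here is a minimal sketch of what such a module might look like. Update.pm and update.pl come from the example above; the update_record routine and what it does are invented purely for illustration.

    # Update.pm -- a minimal module sketch; the sub name is made up
    package Update;

    use strict;
    use warnings;
    use Exporter 'import';

    our @EXPORT_OK = qw(update_record);

    # apply a set of field changes to a hashref record
    sub update_record {
        my ($record, %changes) = @_;
        @{$record}{ keys %changes } = values %changes;
        return $record;
    }

    1;    # a module file must end with a true value

The calling script then says use Update qw(update_record); and gets the routine at compile time, in a proper namespace. The old-fashioned equivalent, require "update.pl";, runs at runtime and leaves the subs in package main (assuming update.pl has no package statement of its own).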
Of course, in the DOS days, the overlay replaced the previous code, making maximum use of the 640KB that no one would ever need more than. Today, as you require one script, then another, your memory usage keeps growing ... but who cares? I've got 512MB on my machine, and even that's old-fashioned.
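Coming back to the first point, about routine length: here is a minimal sketch of breaking one monolithic chunk of code into small, named steps. The file name, the counting logic, and the sub names are all invented for the example.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Instead of one 100-line blob: three routines that each fit on a screen.
    sub read_records {
        my ($file) = @_;
        open my $fh, '<', $file or die "Can't open $file: $!";
        chomp( my @records = <$fh> );
        close $fh;
        return @records;
    }

    sub count_by_value {
        my (@records) = @_;
        my %count;
        $count{$_}++ for @records;
        return \%count;
    }

    sub print_report {
        my ($count) = @_;
        printf "%-20s %d\n", $_, $count->{$_} for sort keys %$count;
    }

    print_report( count_by_value( read_records('data.txt') ) );

Each piece can now be tested, reused, or moved into a module on its own.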
--
TTTATCGGTCGTTATATAGATGTTTGCA
In reply to Re: Split up a perl script by TomDLux
in thread Split up a perl script by debiandude