While I certainly agree with all of the points you've made, what are your experiences with large projects in Perl? I'm not complaining or labeling Perl a slow solution; I'd just like to know what ceiling Perl developers have reached in matters of sheer size.

Currently all of the modules are interpreted each time the program runs (a feature I hope to disable after development). I'm quite happy with the speed of the system at this point, but the bloat makes me wonder whether the program could reach a point where it becomes annoying to use, or even completely unusable.

At one point it calculates a great number of variables to arrive at a total price per night sold. Each per-night iteration takes between 0.5 and 1.5 seconds, so multiplying that by the number of nights requested, and again by the number of relevant results, this thing could take upwards of 50 minutes to return a result! (Gross, eh? And that's the best scenario on average.) I am confident, though, that the loops can be optimized to further reduce the response time.

I guess what I'd like to know specifically is this: what kind of factor does the number of lines of code play (number of instructions, or better said, number of ;'s) when using an interpreter such as Perl? And what about the file-open calls Perl makes to load each module? Of course the end result is what matters most in a production environment, and the code will be fully profiled before release.

What kind of usability (i.e. speed) issues have you run into while working on a large project in Perl?
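For what it's worth, here is a minimal sketch of how I've been thinking about attacking the per-night loop, using the core Benchmark module to compare the current shape of the loop against one with the night-invariant work hoisted out. All the names (`base_rate`, the markup factor) are made-up stand-ins, not the real pricing code:

```perl
use strict;
use warnings;
use Benchmark qw(timethese);

my @nights = (1 .. 30);

# Stand-in for an expensive lookup that does not depend on the night.
sub base_rate { 100 }

timethese(10_000, {
    # Current shape: the invariant lookup runs once per night.
    naive => sub {
        my $total = 0;
        for my $n (@nights) {
            my $rate = base_rate();
            $total += $rate * 1.1;
        }
    },
    # Hoisted: the lookup runs once per request, then the loop
    # only does the per-night arithmetic.
    hoisted => sub {
        my $rate = base_rate();
        my $total = 0;
        $total += $rate * 1.1 for @nights;
    },
});
```

If the real per-night body spends most of its time on work that is the same for every night, this kind of hoisting (plus a proper run under a profiler such as Devel::DProf) should tell me quickly whether the 50-minute worst case is the loop's fault or the algorithm's.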