If the program is short-lived and/or runs on a box dedicated to it, then ignoring memory consumption and CPU utilisation is fine.
However, sulfericacid mentions "5 hashes on a single page", which I take to mean he is working in a webserver environment. Excessive memory and CPU utilisation can be disastrous in environments where the number of copies of the process can be large, especially so when that number is controlled by external forces.
Whilst hardware is cheap, for individuals, as well as for the many companies that rely upon ISPs or hosting providers for their hardware, the cost of purchasing (the use of) boxes large enough to handle peak loads, but that sit idle 90% of the time, is prohibitive. One only has to look at the sluggishness of this site a few months ago, before Pair kindly donated additional hardware resources, to see that it isn't always a simple case of economics.
Taking elementary steps to avoid wasting resources is far from "premature optimisation". Even when writing simple apps, understanding what costs and what doesn't is not "evil"; it is common sense.
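That understanding is easy to acquire, because costs can be measured rather than guessed at. A minimal sketch in Python (the thread concerns Perl hashes, but the idea is language-neutral; `approx_dict_size` is a hypothetical helper, and the numbers are shallow estimates, not exact accounting):

```python
import sys

def approx_dict_size(d):
    """Rough, shallow estimate of a dict's memory footprint:
    the dict's own overhead plus the top-level size of each
    key and value (nested containers are not walked)."""
    total = sys.getsizeof(d)
    for key, value in d.items():
        total += sys.getsizeof(key) + sys.getsizeof(value)
    return total

# A handful of entries versus ten thousand: the difference is
# measurable in a couple of lines, with no profiler required.
small = {i: str(i) for i in range(10)}
large = {i: str(i) for i in range(10_000)}
print(approx_dict_size(small), approx_dict_size(large))
```

Multiply the per-process figure by the number of concurrent server processes and the cost of "just use another hash" stops being hypothetical.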
There is a compromise between bloat and unmaintainable, over-optimised code. The key to finding that compromise is understanding. Branding every request that mentions "efficient", "fast" or "use less" as premature optimisation is to deny that there is a problem, and denying the problem is itself the barrier to understanding it and to finding solutions.
In reply to Re: Re: Memory /speed questions
by BrowserUk
in thread Memory /speed questions
by sulfericacid