Hello Monks,
I am currently reviewing some existing source code and have to estimate the likely cost of a redevelopment, compared to the cost of extending the existing Delphi/Pascal sources.
I am trying to put a value on this functionality with models like COCOMO II. One of the best sources available online is
Cost Estimation, Metrics and Models.
To evaluate the required effort, the more advanced techniques require not only the raw number of lines of code, but also the number of lines per function point.
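For what it is worth, here is how I read the arithmetic in that source: a minimal sketch of the nominal COCOMO II effort equation in Perl, using the published COCOMO II.2000 constants (A = 2.94, B = 0.91) and assuming all effort multipliers are 1.0. The 25 lines per function point for Perl is purely my own guess, which is exactly the number I am asking about below:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Nominal COCOMO II.2000 effort in person-months:
    #   PM = A * KSLOC ** E,  with E = B + 0.01 * sum(scale factors),
    #   A = 2.94, B = 0.91, and all effort multipliers taken as 1.0.
    sub cocomo_effort {
        my ($function_points, $loc_per_fp, $scale_factor_sum) = @_;
        my $ksloc = $function_points * $loc_per_fp / 1000;
        my $e     = 0.91 + 0.01 * $scale_factor_sum;
        return 2.94 * $ksloc ** $e;
    }

    # 100 FP, a guessed 25 LOC/FP for Perl, scale factors summing to ~19 (nominal).
    printf "%.1f person-months\n", cocomo_effort(100, 25, 19);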
I would like to be able to compare these results to my own software development in Perl. This leads me to some questions:
- What is the generally accepted number of lines of code per function point in Perl?
- How would you define this number in Perl? I am fairly generous with whitespace in most of my sources: each curly brace of a loop gets a line of its own, which alone easily extends the number of lines by 5 to 10 percent. Alternatively I could count the number of semicolons, yet some fairly complex lines need no semicolon at all, and a C-style for loop contains three of them. Is there any accepted definition around? (See the counting sketch after the questions.)
- You might be tempted to reply that one extended regexp can do as much as 30 lines of regexp-free code, so here I come and ask: are there "special" lines of code that take more time? I guess that in five lines of Perl you can open a file, read its contents, reformat them completely, and write them out in another format (see the density sketch after the questions). In my understanding that would come to about 0.4 function points per line, whereas Delphi yields only one function point per every ninety lines.
- Alternative methods: are there any? Are they exclusive to Perl? How do you personally estimate the time it takes you to program a script?
- More generally, are there any good sources on software engineering in Perl? It is a slightly uncommon language for large professional standalone programs, but it is my first choice over C and Java, and I would like to find arguments for it as well as help that is more specific to software engineering.
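To illustrate the counting problem from the second question, here is a quick and admittedly naive sketch that reports three crude size metrics for one source file: physical lines, non-blank non-comment lines, and semicolons. It deliberately ignores semicolons inside strings and regexes, so it only shows how far the metrics can drift apart:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Compare three crude size metrics for a Perl source file.
    # Naive: semicolons inside strings and regexes are counted too.
    my $file = shift or die "usage: $0 file.pl\n";
    open my $fh, '<', $file or die "cannot open $file: $!";

    my ($physical, $logical, $semicolons, $in_pod) = (0, 0, 0, 0);
    while (my $line = <$fh>) {
        $physical++;
        $in_pod = 1 if $line =~ /^=\w+/;       # POD starts
        if ($in_pod) {
            $in_pod = 0 if $line =~ /^=cut/;   # POD ends
            next;
        }
        next if $line =~ /^\s*(?:#|$)/;        # skip blanks and comments
        $logical++;
        $semicolons += () = $line =~ /;/g;     # count every semicolon
    }
    printf "physical: %d  code lines: %d  semicolons: %d\n",
           $physical, $logical, $semicolons;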
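And for the third question, the kind of density I mean: a made-up example (the filenames and the comma-to-tab reformatting are placeholders) that opens a file, reformats every record, and writes it out in another format in a handful of lines:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical transform: read comma-separated records, write them tab-separated.
    open my $in,  '<', 'input.csv'  or die "input.csv: $!";
    open my $out, '>', 'output.tsv' or die "output.tsv: $!";
    while (my $line = <$in>) {
        chomp $line;
        print {$out} join("\t", split(/,/, $line, -1)), "\n";
    }

Strict/warnings boilerplate aside, that is about five lines of actual work; if they really stand in for a page of Delphi, that is the intuition behind my 0.4 function points per line.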
As for my previous experience estimating time values: I always used to either add fairly small portions of code to an existing tool or write rather small scripts from scratch. In those cases I had a fairly exact estimate of how much time the work would take me just from thinking about how to write the tool; I usually managed to estimate to within about two hours for a problem that took a week of coding. I then added 50% time for debugging and 50% for wrong user specifications, so a one-week coding estimate became two weeks in total. Debugging usually required only ten minutes, unless (and that happens a bit too often) the user specifications were too unspecific or wrong. Yet this was still a good estimate that stayed within 15% boundaries.