I don't think we can (reasonably easily) do exactly as the AM requested. I am also concerned about encouraging the impression that Perl 6 is not feature complete (for some users and use cases it is now sufficiently complete).
However, I would like to publish a metric that helps the AM and interested parties track Perl 6 development. I would appreciate constructive feedback on my response and thoughts below.
TL;DR: As metrics to help track Perl 6 development, I favor test pass rates, such as spectest passes over time or "64 modules passing 100% of their tests when compiled using the 2012.08 Rakudo compiler". Do you like these?
Any metric will almost certainly have to be based on data that's already being collected. I could see trying one or more of the following:
We could update the compiler features pie chart weekly.
This data currently tracks 143 very high-level compiler "features", most of which have been implemented over roughly five years (Rakudo) or two years (Niecza), for an average implementation rate on the order of 20-40 per year.
Some issues:
I suspect the fact that this feature list doesn't include the debugger shows it is a weak proxy for tracking what the AM cares about.
Do we count Rakudo as having the junction and hyper features? If we do, folk are likely to complain that this is misleading because it doesn't yet do autothreading. Do we include the optional autothreading in the "features not yet implemented" count? If we do, folk are likely to complain that Rakudo isn't ready because it's not "feature complete".
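For anyone unfamiliar with the term: autothreading means that passing a junction to a routine expecting an ordinary value conceptually calls the routine once per junction member (whether those calls actually run in parallel is a separate question). A minimal sketch of what the semantics look like, assuming an implementation that supports it:

    # Passing a junction where a single value is expected autothreads
    # the call across the junction's members.
    sub double($n) { $n * 2 }

    my $j = 1 | 2 | 3;          # an "any" junction
    say double($j);             # any(2, 4, 6)
    say so double($j) == 4;     # True: at least one member matches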
Spectest pass counts generally change at least a little from week to week.
I think this is about the best data we have for quickly conveying progress, but publishing it has tended to elicit criticism ("Test count is a silly metric.").
In the past flussence generated charts of spectest passes over time. He gave up on them recently, perhaps because the pass rate is closing in on 100%, or maybe because of the negative responses we've gotten from publishing such data, but I imagine it would be fairly simple to restart them.
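Restarting it would mostly be a matter of appending a dated pass/fail tally each time the spectests run. A minimal sketch, assuming (hypothetically) one TAP log per spectest file under a tap-logs/ directory, with one row per run appended to a CSV for charting:

    # Tally TAP results and append one dated row to a CSV for later charting.
    # The tap-logs/ layout and spectest-history.csv file are assumptions.
    my ($pass, $fail) = 0, 0;
    for dir('tap-logs') -> $log {
        for $log.lines {
            if    /^ 'not ok' / { $fail++ }
            elsif /^ 'ok' /     { $pass++ }
        }
    }
    spurt 'spectest-history.csv', "{Date.today},$pass,$fail\n", :append;
    say "pass rate: { (100 * $pass / ($pass + $fail)).fmt('%.1f') }%";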
There are also module smoketesting results. We could list how many modules in the Perl 6 ecosystem pass 100% of their tests (64 as of 2012-09-05, tested against the August Rakudo compiler).
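For what it's worth, counting "all tests passing" modules is cheap once a smoke run leaves a machine-readable summary behind. A minimal sketch, assuming (hypothetically) a smoke.json file mapping module names to pass/fail counts, and using the ecosystem's JSON::Tiny module:

    use JSON::Tiny;   # ecosystem JSON parser (assumed installed)

    # smoke.json is a hypothetical summary:
    #   { "Module::Name": { "pass": 12, "fail": 0 }, ... }
    my %results = from-json slurp 'smoke.json';

    my @all-green = %results.keys.grep({ %results{$_}<fail> == 0 }).sort;
    say "{ +@all-green } modules pass 100% of their tests";
    .say for @all-green;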
A couple of days ago mst uploaded Perl 6 to CPAN. That's going to bring in a lot of interesting platform test data. Maybe we should track that?
The roadmap is probably even less suitable as the subject of a weekly update than either of the previous two items, but I thought it worth mentioning here because it is a very important project driver.
A few weeks ago (on August 20th) I wrote a gist, "How things are going compared to rakudo roadmap using jan 1 2012 as baseline".
Since I wrote that gist there have been two roadmap changes: removing (marking as done) the one-star (very easy) task "sigilless variables", and removing the five-star (very difficult) task "design, implement and switch to QAST".
(While we're here, let's summarize the remaining roadmap tasks. There is just one task marked as top priority: "basic Perl 5 interop (use, eval, etc.)"; interestingly, it is not marked as difficult. Then come 38 medium-priority tasks (only two marked difficult), 17 low-priority tasks (only one marked difficult), and one task of unspecified priority and difficulty ("Correct type smiley support (:U, :D, :T, etc.)").)
Do you particularly like or dislike any of the above? Any other ideas?
Edited to correct statements of roadmap task difficulty.