Problems? Is your data what you think it is?
There is an old observation that applies to all sorts of engineering, research, and product development:
Good. Fast. Cheap. Choose only two.
In the real world there are always trade-offs between quality, development time, and cost. A project draws on a finite pool of resources, and if you need more of one thing, at least one of the other two (and possibly both) will suffer.
On a commercial project, if you need reliability*, you will have to pay for it, either with a longer development cycle or by hiring more programmers (or both). If you need working software yesterday, you can hire an army of programmers or skimp on handling the edge cases. And if you want the cheapest software on the planet -- free (as in beer) software -- both development time and quality will suffer as a result.
With free software, the pool of resources is limited to the available time of the volunteers who maintain that particular project. Therefore, if the quality of a particular module isn't up to your standards, you will need to find a way to get the project the resources it needs, and the surest way to do that is to do the work yourself and contribute it back to the community. In terms of the trade-off above, Good trumps Fast and Cheap in your opinion.
That said, most programmers (myself included) are quite content to use what is already available and live with the limitations. There's nothing wrong with that; it's simply a matter of priorities -- Cheap trumps Good and Fast every time.
*Or you're working on something that will kill people if any bugs are left in the final product, in which case you shouldn't be using Perl.