in reply to Re^3: winter games, perl, ice and bling
in thread winter games, perl, ice and bling

What you're suggesting is a return to The Bad Old Days, when module authors couldn't even count on arrays starting at 0 because you might have changed $[. That sort of thing is frowned on these days, because it made writing modules totally unmanageable. A module author has every right to expect that magic variables like $/ will be at their default values and a programmer who fails to localize changes in these variables is the one at fault for the trouble caused.

For reference, see MJD's Sins of Perl Revisited.
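The caller-side discipline argued for above can be sketched in a few lines. This is a minimal illustration, not code from the thread; the `slurp` helper name is hypothetical:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical helper: slurp a whole file by localizing $/, so the
# change cannot leak into any module code called afterwards.
sub slurp {
    my ($path) = @_;
    open my $fh, '<', $path or die "open '$path': $!";
    local $/;                # undef: one <> call reads to EOF
    my $content = <$fh>;
    close $fh;
    return $content;
}   # $/ is restored automatically when the sub exits
```

Because `local` restores the previous value on scope exit, any module called after `slurp()` still sees $/ at its default.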


Re^5: winter games, perl, ice and bling
by BrowserUk (Patriarch) on Mar 27, 2008 at 13:20 UTC
    A module author has every right to expect that magic variables like $/ will be at their default values ...

    Absolute unmitigated rubbish!

    If a variable can only take one known value, it is called a constant. If a module author wants a specific value for $/ or $\, it is his responsibility to ensure that he has that value: local $/ = "\n"; or whatever. Just as your authority figure, MJD, does in his module Tie::File:

    ## from _read_record() in Tie::File
    {
        local $/ = $self->{recsep};
        my $fh = $self->{fh};
        $rec = <$fh>;
    }
    ...
    unless ($self->{rdonly}) {
        local $\ = "";
        my $fh = $self->{fh};
        print $fh $self->{recsep};
    }
    and a programmer who fails to localize changes in these variables is the one at fault for the trouble caused.

    More of the same.

    You are processing a fixed record length file. You set the record separator to that length and loop. Within that loop, you call some functions or methods from within a module.

    What you are suggesting means that you, as application writer and code owner, now have to undo your requirements before you call each function or method--regardless of whether those modules make use of those variables because you don't know--and restore them afterwards, and every time you call those functions, in order that the module writer can be lazy.

    use Some::Module qw[ funcA ];
    use Someother::Module qw[ funcB ];

    local $/ = \64;
    while( <$fh> ) {
        ## do stuff
        { local $/ = "\n"; funcA(); }
        ## Do stuff
        { local $/ = "\n"; funcB(); }
        ## Do stuff
        { local $/ = "\n"; funcA(); }
    }

    And you want to do this for every function or method? Regardless of whether it makes use of these variables, in every module you call, and every time you call it? You're nuts. You have completely inverted the burden of responsibility here.

    If a module author needs a specific value for one of these variables within a subroutine or method,

    1. they know they need it;
    2. they know where they need it;
    3. the scope of the need is naturally bounded by the subroutine;
    4. they need only do it once.

    Any other conclusion defies rationality.


    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.
      All I'm saying, and I don't think this is a stretch, is that the current implementation of things like $/ with no lexical scoping is a mis-feature, and the best thing to do with these is to avoid them as much as possible and use local on them when they are unavoidable. The global scoping of these settings has caused many problems for many perl users, and caution is called for.

        But think about what not having those variables means.

        No other language I know of allows the application programmer to write a standard piece of code:

        while( <$fh> ) { processRecord( $_ ); }

        And with that standard, recognisable, picked-up-in-the-first-week-of-programming construct, you can process any file. Regardless of whether it uses *nix conventions, DOS conventions, Mac conventions or network/socket conventions. Or whether it uses fixed-length records. Or paragraphs.

        The same common construct deals with all these by simply changing the value of one variable.
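        The claim can be sketched concretely. This is an illustrative example, not code from the thread; the format names and `process_file` helper are hypothetical:

```perl
use strict;
use warnings;

# The same while(<$fh>) loop, reconfigured per input format purely by
# setting $/ (the format keys here are illustrative):
my %separator_for = (
    unix_lines => "\n",      # LF-terminated lines
    crlf_lines => "\r\n",    # DOS / network-style lines
    paragraphs => '',        # paragraph mode: blank-line delimited
    fixed_64   => \64,       # fixed-length 64-byte records
);

sub process_file {
    my ($path, $format) = @_;
    local $/ = $separator_for{$format};   # scoped to this sub
    open my $fh, '<', $path or die "open '$path': $!";
    my @records;
    while ( my $rec = <$fh> ) {
        push @records, $rec;              # processRecord() would go here
    }
    close $fh;
    return @records;
}
```

        One loop, four input conventions; only the value of $/ changes.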

        You want scope for errors?

        Most runtime libraries have readline() and writeline() subs or methods, but the record separator is hardwired.

        Want to read or write a *nix file on a DOS machine, or a Mac file on a *nix machine? Then you have to implement it yourself, dropping down to low-level read() or getc() calls and doing your own buffering. And repeat the exercise for each input type. And each output type. And every programmer on every platform has to do the same.
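        A rough sketch of the hand-rolled readline the paragraph describes, written here in Perl for illustration (the name `read_any_line` is hypothetical; real code would buffer in larger chunks rather than call getc per character):

```perl
use strict;
use warnings;

# What you'd have to hand-roll in a language without $/: a readline
# that copes with LF, CRLF and bare-CR line endings itself.
sub read_any_line {
    my ($fh) = @_;
    my $line = '';
    while ( defined( my $ch = getc($fh) ) ) {
        if ( $ch eq "\n" ) {                  # bare LF: *nix ending
            return $line;
        }
        if ( $ch eq "\r" ) {                  # CR: peek for a following LF
            my $next = getc($fh);
            # step back if it wasn't the LF of a CRLF pair
            seek( $fh, -1, 1 ) if defined $next and $next ne "\n";
            return $line;
        }
        $line .= $ch;
    }
    return length($line) ? $line : undef;     # last unterminated line / EOF
}
```

        And this still handles only line-oriented input; fixed-length records and paragraph mode would each need yet another routine.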

        Every facility Perl provides is there for a reason: to simplify the life of the programmer and move as much code into the thoroughly tested and supported RTL and core as possible. The more the core does, the less the programmer has to.

        And all the current vogue for rejecting and deprecating Perl's features, on the basis of OO orthodoxy or one-size-fits-all "best practices", does is create the need for every programmer to re-invent those features themselves.

        And that simply creates the scope for errors, not prevents them! And all for the sake of less need to educate and cheaper programmers. You see a good ROI in that?


        Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
        "Science is about questioning the status quo. Questioning authority".
        In the absence of evidence, opinion is indistinguishable from prejudice.