Why do you think KISS and Occam's Razor is an evil node, written by an evildoer? I subscribe to the Occam's Razor school of thought. Hell, I even programmed in Occam for a transputer in a previous life.
Maybe the external web references in that other node could have been formed using the [http://url|title of web page] method, but I'm not too fussed about that. Hell, a few months ago, a bug in Perl Monks' HTML met a bug in Mozilla and pages suddenly became tens of thousands of pixels wide. Stuff happens, deal with it.
For these reasons and more, I downvoted your node.
Another thing. I don't think the Freshmeat solution is a good one. It fails the print test (print the page and the information is useless). A better alternative would be to break long URLs automatically. Once upon a time, you could do such a thing in Netscape with the <wbr> tag, which would signal an optional word break. Perform a $url =~ s{/}{/<wbr>}g and you're set.
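If you wanted to wrap that up, something like this would do. This is only a quick sketch around the one-liner above; the subroutine name and the example URL are mine, not anything from the original node.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Insert an optional break point after every slash so a
    # <wbr>-aware browser can wrap a long URL at a sensible spot.
    sub breakable_url {
        my ($url) = @_;
        $url =~ s{/}{/<wbr>}g;    # the substitution from above
        return $url;
    }

    # Example URL, made up for illustration.
    my $long = 'http://www.example.com/some/very/deeply/nested/path/to/a/resource.html';
    print breakable_url($long), "\n";
    # prints: http:/<wbr>/<wbr>www.example.com/<wbr>some/<wbr>very/<wbr>...

Naturally you would only run this over the visible link text, not the href attribute itself, otherwise you'd break the link.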
Unfortunately, when the W3C got around to standardising HTML 4.0, in their infinite fooli^Wwisdom they refused to put it in the spec. Too bad. Hyphenation and justification (H&J) is really badly done on the web. Without the hint of something like <wbr>, there is no way to deal with this elegantly.
The real issue is not that long URLs are bad, it's just that browsers suck.
-- In reply to Re: long URLs by grinder, in thread long URLs by TomK32