I hadn't seen Sys::Binmode - that's interesting. It only solves half the problem though. From its own documentation:
Of course, in the end, we want mkdir() to receive 6 bytes of UTF-8, not 4 bytes of Latin-1. To achieve that, just do as you normally do with print(): encode your string before you give it to the OS.
use utf8;
use Encode;
mkdir encode("UTF-8", "épée");
That code is what you want on modern Linux systems, but who wants to write that everywhere? Worse, writing that breaks Win32 support in most cases. Perl does not currently have *any* cross-platform solution for this problem.
My point is that a person using file paths on a modern system wants to work with unicode, not mess with knowing how to encode filenames for their platform. Perl has great support for *handling* unicode, but the user experience for unicode filenames (especially on Windows) is completely broken. I see no way to fix it while preserving back-compat, so the only real option seems to be wrapping the problem inside a module. If a module did fix the problem, then it should be core, because this is an intrinsic problem with Perl that *needs* a fix.
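To make the idea of "wrapping the problem inside a module" concrete, here is a minimal sketch of what such a wrapper might look like. The package name My::PathEncode and its subs are my own hypothetical invention, not an existing CPAN module, and the Win32 branch only encodes to the current ANSI code page (what the A-style filesystem APIs accept), so it still cannot represent filenames outside that code page; a real fix would need the wide-character W APIs.

package My::PathEncode;   # hypothetical name; a sketch, not a real CPAN module
use strict;
use warnings;
use Encode ();

# Pick a filesystem byte encoding per platform: UTF-8 on modern Unix-likes,
# the active ANSI code page on Win32 (still lossy for characters outside
# that code page, which is exactly the brokenness described above).
my $FS_ENCODING = $^O eq 'MSWin32'
    ? do { require Win32; 'cp' . Win32::GetACP() }
    : 'UTF-8';

# Turn a character (decoded) string into the byte string the OS expects,
# croaking rather than silently mangling unrepresentable characters.
sub encode_path {
    my ($path) = @_;
    return Encode::encode($FS_ENCODING, $path, Encode::FB_CROAK);
}

# Thin wrapper so callers can pass character strings straight to mkdir.
sub my_mkdir {
    my ($dir, $mode) = @_;
    return defined $mode
        ? mkdir(encode_path($dir), $mode)
        : mkdir(encode_path($dir));
}

1;

A caller with "use utf8;" in effect could then write My::PathEncode::my_mkdir("épée") and let the module worry about the platform, which is the user experience I'm arguing Perl should provide out of the box.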
On a side note, I'm baffled that Perl is so popular in Japan. I would think they'd be up against this problem nonstop, as opposed to most of Europe, which can get by with Latin-1 or its variants.