in reply to Problem with join'ing utf8 and non-utf8 strings (bug?)
The input hash was a mix of utf8 and non-utf8 strings. At the last stage, XML::Simple::XMLout joins the components together, and I get corrupted data.
Well, if the "non-utf8 strings" happen to be all ascii characters (ord() < 128), then it won't matter: ascii is a proper subset of utf8, so concatenating these with utf8 strings causes no problem.
But if a "non-utf8" string happens to also be "non-ascii", then what would you expect to happen when you concatenate this with a utf8 string? What would you expect to do with the result of such a concatenation? (Hint: unless the answer is something strange and ad-hoc involving pack and unpack, then the real answer is: something incoherent.)
You can't just throw utf8 characters and non-utf8/non-ascii data into a single scalar value and expect to get anything usable. If you combine data this way, the bug you expose is not in perl, but rather in your expectations.
Either keep these data types separate at all times, or, if the latter type is actually character data in some other encoding, decode() it into utf8 characters (refer to the Encode module). Alternatively, encode() the utf8 string into the same character set as the other data before concatenating.
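A minimal sketch of both approaches. Note the encoding used here (KOI8-R) is purely an assumption for illustration; substitute whatever encoding your byte data is actually in:

```perl
use strict;
use warnings;
use Encode qw(decode encode);

# Assumption for illustration only: the non-utf8 data is KOI8-R text.
my $bytes = "\xF0\xD2\xC9\xD7\xC5\xD4";    # 6 KOI8-R bytes (Cyrillic)

# Option 1: decode() the bytes into Perl characters, then concatenate
# freely with other character (utf8) strings.
my $chars  = decode('KOI8-R', $bytes);
my $joined = $chars . " - hello";          # all character data now

# Option 2: encode() the utf8 string down into the byte encoding of the
# rest of the data before concatenating (this only works if every
# character is representable in that encoding).
my $utf8_str  = "caf\x{e9}";
my $as_latin1 = encode('ISO-8859-1', $utf8_str);   # plain byte string
```

Either way, everything in a given scalar is then in one consistent representation, which is the whole point.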
UPDATE: Actually, as pointed out by almut, perl's default behavior (interpreting non-ascii/non-utf8 bytes as Latin-1 characters) means that one of the "more likely" situations -- converting some old single-byte Latin-1 text data to utf8 -- can be handled automatically and produces a coherent result. It's only when the non-utf8 data is neither ascii nor Latin-1 that the trouble starts.
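That automatic Latin-1 upgrade is easy to see in a small sketch:

```perl
use strict;
use warnings;

# A byte string whose only non-ascii byte is 0xE9 (e-acute in Latin-1),
# stored without the internal utf8 flag...
my $latin1 = "caf\xe9";

# ...and a string perl must store as utf8 (the \x{} escape is > 0xFF):
my $utf8 = " \x{263A}";        # space plus WHITE SMILING FACE

# Concatenation upgrades the byte string, interpreting each byte as a
# Latin-1 character, so the result is a coherent 6-character string:
my $joined = $latin1 . $utf8;
print length($joined), "\n";   # prints 6 (characters, not bytes)
```

Had $latin1 actually held, say, KOI8-R bytes, the same upgrade would still happen -- but the bytes would be reinterpreted as the wrong characters, which is exactly the "incoherent result" described above.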