Every part of the handling must be correct, in and out. The scraper is probably doing the right thing, handing UTF-8 decoded strings to your Perl. Your DB layer is the most likely problem. You can enable UTF-8 handling on the Perl/DBI side (a sketch follows), but is the table, or the table's column, actually using that charset? If it's not, you may be stuffing raw UTF-8 bytes into a Latin-1 column. Look at the table in question with the query shown after the sketch; drop the LIMIT or put in a WHERE for your own schema.
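On the Perl side, a minimal sketch of what enabling UTF-8 handling can look like with DBD::mysql; the DSN, database, user, and password here are placeholders, not anything from the original post:

    use strict;
    use warnings;
    use DBI;

    # Placeholder DSN and credentials; substitute your own.
    my $dbh = DBI->connect(
        'DBI:mysql:database=mydb;host=localhost',
        'user', 'password',
        {
            RaiseError        => 1,
            mysql_enable_utf8 => 1,  # driver decodes/encodes UTF-8 at the boundary
        },
    ) or die $DBI::errstr;

With that in place, decoded Perl strings go out as UTF-8 bytes and come back decoded; assuming, that is, the column can actually hold them, which is what the queries below check.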

mysql> SELECT SCHEMA_NAME, DEFAULT_CHARACTER_SET_NAME, DEFAULT_COLLATION_NAME
    -> FROM information_schema.SCHEMATA LIMIT 1;
+--------------------+----------------------------+------------------------+
| SCHEMA_NAME        | DEFAULT_CHARACTER_SET_NAME | DEFAULT_COLLATION_NAME |
+--------------------+----------------------------+------------------------+
| information_schema | utf8                       | utf8_general_ci        |
+--------------------+----------------------------+------------------------+
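
The SCHEMATA row above only shows the database-wide default. To check the actual table, you can query information_schema.COLUMNS instead; 'mydb' and 'mytable' are placeholder names for your own schema and table:

    SELECT COLUMN_NAME, CHARACTER_SET_NAME, COLLATION_NAME
    FROM information_schema.COLUMNS
    WHERE TABLE_SCHEMA = 'mydb' AND TABLE_NAME = 'mytable';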

If you get latin1 and latin1_swedish_ci (one of MySQL's many awful defaults), then you should convert the table, the column, or both to CHARACTER SET utf8 COLLATE utf8_general_ci. This is potentially destructive for data already in the table, so back up your DB first.
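A minimal sketch of the conversion, again with placeholder table and column names: CONVERT TO rewrites every text column in the table, while MODIFY targets a single column (and requires restating the full column definition):

    -- Convert the whole table and all of its text columns:
    ALTER TABLE mytable CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;

    -- Or convert one column, here a hypothetical TEXT column named content:
    ALTER TABLE mytable MODIFY content TEXT CHARACTER SET utf8 COLLATE utf8_general_ci;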

Update: s/encoded utf-8/decoded utf-8/ to maintain pedantic semantics, which are ultimately less confusing.