@Jxn: Yes, that’s what I meant – encodings are the same. So a file created with de_DE.UTF-8 is on a binary level the same as one created with en_GB.UTF-8, so it actually shouldn’t matter if a program creates them with one locale and another program (or another instance of the same one) reads them with the other 😉 (This holds for any Unicode encoding; and to be precise, it’s the Unicode standard itself that comes in many versions, since it was expanded a few times to include more languages and characters – the UTF-8 encoding scheme itself has stayed stable.)
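A quick way to convince yourself of this, as a minimal Python sketch: the locale suffix only *names* the encoding, so encoding the same string produces byte-identical output no matter which locale a program runs under.

```python
# The byte sequence UTF-8 produces for a string does not depend on the
# locale; "de_DE.UTF-8" and "en_GB.UTF-8" merely name the same encoding.
text = "Grüße"  # example string containing non-ASCII characters

german = text.encode("utf-8")   # what a de_DE.UTF-8 program would write
british = text.encode("utf-8")  # what an en_GB.UTF-8 program would write

assert german == british
print(german)  # → b'Gr\xc3\xbc\xc3\x9fe'
```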
And the character encoding is often not explicitly mentioned in locale names; sometimes the country code is lower case as well, and separated by a “-” instead of a “_” (just take a look at the language preferences your browser sends, or at the Linux environment variable LANGUAGE (not LANG))…
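A small sketch of that normalization, with a hypothetical helper name (`browser_tag_to_locale` is my own illustration, not a real API): turning a browser-style tag like “de-de” into a glibc-style locale name like “de_DE.UTF-8”.

```python
def browser_tag_to_locale(tag: str, encoding: str = "UTF-8") -> str:
    """Normalize a browser-style language tag ('de-de') into a
    glibc-style locale name ('de_DE.UTF-8'). Illustrative helper only."""
    lang, _, region = tag.partition("-")
    name = lang.lower()
    if region:
        # Locale names upper-case the country and use "_", not "-".
        name += "_" + region.upper()
    return name + "." + encoding

print(browser_tag_to_locale("de-de"))  # → de_DE.UTF-8
print(browser_tag_to_locale("en-gb"))  # → en_GB.UTF-8
```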
To make this even more complicated, depending on the native byte order of different CPU architectures, UTF-16 and UTF-32 texts can differ on the byte level; so these texts usually start with a BOM, a Byte Order Mark (two bytes for UTF-16, four for UTF-32), telling the application whether they are little-endian or big-endian. UTF-8 has a fixed byte order at least, but an optional BOM can still appear at the start of a file, as a mere signature rather than a byte-order indicator.
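A minimal sketch of such BOM sniffing in Python (my own illustration; real-world code would also need a fallback for BOM-less files):

```python
import codecs

def detect_bom(data: bytes) -> str:
    """Return the encoding suggested by a leading BOM, or '' if none.
    Order matters: the UTF-32 BOMs begin with the UTF-16 ones."""
    boms = [
        (codecs.BOM_UTF32_LE, "UTF-32-LE"),  # ff fe 00 00
        (codecs.BOM_UTF32_BE, "UTF-32-BE"),  # 00 00 fe ff
        (codecs.BOM_UTF16_LE, "UTF-16-LE"),  # ff fe
        (codecs.BOM_UTF16_BE, "UTF-16-BE"),  # fe ff
        (codecs.BOM_UTF8,     "UTF-8"),      # ef bb bf (signature only)
    ]
    for bom, name in boms:
        if data.startswith(bom):
            return name
    return ""

print(detect_bom(codecs.BOM_UTF16_LE + "hi".encode("utf-16-le")))  # → UTF-16-LE
print(detect_bom(b"plain ASCII, no BOM"))                          # → (empty)
```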
You wrote a very detailed description on the topic though 🙂
Back to the problem (which isn’t one any more): I assume mt-daapd should run under the “standard” locale? Which would most likely be the one from the environment; type “env” and look for “LANG=…”, which in my case is “LANG=de_DE.UTF-8”. That environment variable wasn’t set before I created the locale though, so the question is what would be used as the default if it weren’t there.
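As far as I know, POSIX answers that question: if LANG and the LC_* variables are all unset, programs fall back to the “C” (a.k.a. “POSIX”) locale, which is plain ASCII. A small sketch that simulates the empty environment (the variable-clearing is just for illustration):

```python
import locale
import os

# Simulate an environment with no locale settings at all:
for var in ("LC_ALL", "LC_CTYPE", "LANG", "LANGUAGE"):
    os.environ.pop(var, None)

# With nothing set, setlocale(LC_ALL, "") selects the POSIX default,
# typically reported as "C" (some systems report "C.UTF-8" or "POSIX").
print(locale.setlocale(locale.LC_ALL, ""))
```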