
Hi,

On Fri, Jan 10, 2014 at 6:38 PM, Griffin Boyce <griffin@cryptolab.net> wrote:
> I'd like to bring up localization. Right now, translation is done on Transifex and then just put into individual translated pages. The real downside with this is that it's not obvious to users who come to a given page how to get to that page in their language (if it even exists).
IIRC the Tor project is using gettext and Transifex for its programs. There are no translations of the homepage at the moment; back when they existed, they were not managed with gettext. That said, I also like the idea of using gettext for translations.
> What would improve this process is using a passive localization script (like l10n.js) that detects the user's browser language and then swaps out chunks of content based on it. You can designate a fallback language. Considering that translation coverage is spotty for several key languages, I think this approach is better than the current model.
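
(For concreteness, the browser-side swap could look roughly like the sketch below. The translations table, the data-l10n-id attribute and the key names are placeholders of mine, not l10n.js's actual API.)

// Rough sketch of passive, client-side localization with a fallback.
type Catalog = Record<string, string>;

const translations: Record<string, Catalog> = {
  de: { tagline: "Beispieltext auf Deutsch" },
  fr: { tagline: "Exemple de texte en français" },
};

const FALLBACK = "en"; // English text is already baked into the page

function pickLanguage(): string {
  // navigator.languages lists the user's preferred languages in order.
  for (const lang of navigator.languages) {
    const short = lang.toLowerCase().split("-")[0];
    if (short in translations) return short;
  }
  return FALLBACK;
}

function localize(): void {
  const lang = pickLanguage();
  if (lang === FALLBACK) return; // nothing to swap
  const catalog = translations[lang];
  // Every element marked with data-l10n-id gets its text replaced.
  document.querySelectorAll<HTMLElement>("[data-l10n-id]").forEach((el) => {
    const key = el.dataset.l10nId;
    if (key && catalog[key]) el.textContent = catalog[key];
  });
}

document.addEventListener("DOMContentLoaded", localize);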
You don't even need the magic in the browser, though. As an example of what could be done, look at the Tails project's homepage, tails.boum.org. It is static HTML compiled from markdown using ikiwiki (basically a Perl-based static site generator on top of a VCS). The translations are managed using gettext, that is, chunks of the markdown master are translated in PO files. The one downside is that translators need to be able to edit PO files. Unfortunately, I don't think Transifex helps much here: when you are translating a complete web page, you need a lot more context than when translating isolated strings in a program. But perhaps such an interface for translators could be designed?
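
To give a feel for what translators would be editing, a chunk of the markdown master ends up in the PO file roughly like this (a made-up excerpt in the style po4a produces for ikiwiki, not copied from the Tails repository):

#. type: Title =
#: about.mdwn:1
msgid "About this project"
msgstr "Über dieses Projekt"

#. type: Plain text
#: about.mdwn:3
msgid ""
"This page explains what the project does and how you can help translate it."
msgstr ""
"Diese Seite erklärt, was das Projekt macht und wie Sie bei der Übersetzung "
"helfen können."

Translators only touch the msgstr lines; everything else is generated from the master page.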