Developing custom I18n backend for Lokalise in Ruby on Rails


Motivation

For projects intended for multiple platforms (iOS, Android, Web) it’s often a good idea to centrally manage all translations. One of the tools available for the task is Lokalise - A localization and translation management platform for agile teams. Big advantage of using the platform is that there is always a single source of truth for all translations. Using Lokalise allowed our client to modify translations for their projects anytime they need, as opposed to using default static files in Rails, which can be modified only by developers.

At first we created a simple solution to integrate Lokalise translations within the Rails app: a rake task which downloaded new translations using the Lokalise API and replaced all YML files in config/locales/ with the new ones. This was sufficient for initial development, and we updated translations only when releasing a new feature.

However, once most of the project features were released, the client often needed to tweak just the Lokalise translations, and we had to make new releases only to update a few strings. This is the issue we wanted to solve.

Solution

We needed to come up with a solution that would allow changes in Lokalise to immediately propagate to running live instances of the Rails app. The most important part of the requirement was that our solution shouldn't modify the behaviour of the app or slow it down.

We agreed that updating Lokalise translations should be available to admin users in the admin interface. We also considered using Lokalise webhooks feature to trigger updates automatically, but we couldn’t do that because we didn’t use Lokalise versioning system and overall it did not fit our use case.

Planning

Once we knew what we wanted, we started planning. This is where it got interesting. We could build our solution by combining various existing Ruby on Rails features and gems. We agreed to use:

  • The redis_store cache storage to store translations downloaded from Lokalise.
  • Thread.current to keep a fresh copy of translations for each app instance.
  • The RequestStore gem to keep a fresh copy of translations for the lifetime of a request.
  • The I18n::Backend::KeyValue backend to store and retrieve translations.
  • I18n::Backend::Chain to be able to fall back to static YML files in case of an error.
  • The official ruby-lokalise-api gem for working with the Lokalise API.

Initially for storing translations in redis_store we came up with two solutions, but each had its problems:

  1. Using Redis as a key-value store for individual Lokalise keys. While this was very easy to implement, we didn’t like the idea that the app would have to query Redis multiple times on every I18n.t() call, only to receive a tiny piece of translation data. Drawback: overwhelming Redis with many requests during each HTTP request.
  2. Using Redis to store all translations as a single object. This would reduce the number of Redis calls to one per request, but now the size of the data requested from Redis was a problem. Drawback: fetching the whole (potentially large) translation payload from Redis on every HTTP request.

The compromise between these two solutions was to store all translations as a single object, alongside a timestamp indicating when the translations were last updated. When storing data from Lokalise to Redis we use two keys: all translations go to the key “translations/data” and the time of the update to the key “translations/updated_at”. Similarly, when we copy data from Redis to memory (using Thread.current), we also store the timestamp value in Thread.current[:translations_timestamp]. Afterwards we only need to ask Redis for the timestamp and compare it with the timestamp in memory. Only if the timestamps don’t match do we need to ask Redis again for “translations/data”.
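The two-key scheme can be sketched in plain Ruby; the method names are our own, and the cache object stands in for Rails.cache backed by Redis:

```ruby
# Illustrative sketch of the two-key scheme; `cache` is anything
# responding to read/write (e.g. Rails.cache backed by Redis).
def write_translations(cache, data)
  cache.write("translations/data", data)
  cache.write("translations/updated_at", Time.now.to_f)
end

def stale?(cache)
  # Cheap check: compare only the timestamps; fetch the full
  # "translations/data" payload only when they differ.
  cache.read("translations/updated_at") != Thread.current[:translations_timestamp]
end
```

The cheap timestamp comparison runs on every request; the expensive full-payload fetch runs only when the timestamps diverge.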

Next, we identified three requirements:

  1. find the translations once on every request (to ensure they are fresh for a given app instance),
  2. load translations only when they are actually needed (to prevent unnecessary calls to Redis) and
  3. keep translations available in the memory of every app instance (for fast retrieval).

For the first requirement we used the gem RequestStore. It appends a simple Rack middleware which creates a simple hash storage for every request and purges it once the request finishes. This allows us to seed translations into the store at the beginning of a request and reuse them throughout the rest of it.

For the second requirement we used I18n::Backend::KeyValue, to which we passed a custom key-value object. This object would allow us to perform Redis check and seed translations only when I18n.t was explicitly called.

For the third requirement we simply ensured that Thread.current is synced with RequestStore after the seeding process.

Putting all of this together, here is the flow of a simple request when the translations are stale:

  1. Request starts.
  2. I18n.t is called, so it asks Redis when the translations were last updated.
  3. If translations in memory are stale, new translations are requested from Redis and the in-memory timestamp is updated.
  4. Redis either returns cached translations, or we request translations from Lokalise API, writing them into Redis and then returning them.
  5. Fetched translations are stored in I18n key-value backend.
  6. All subsequent I18n.t calls will now use translations in the RequestStore.
  7. Request finishes.

This is the slowest path, since we need to call Redis (and possibly also the Lokalise API) to get the translations, but it’s also very rare.

After the first seeding of an app instance, all subsequent requests are fast:

  1. Request starts.
  2. I18n.t is called, so it asks Redis when the translations were last updated.
  3. Translations in memory are fresh, so request store translations are pointed to in-memory translations.
  4. All subsequent I18n.t calls will now use translations in the request store.
  5. Request finishes.

In this case all RequestStore has to do is to reference translations which are prepared in Thread.current memory.

[Flowchart of a simple request]

Implementation

For each environment we first defined the I18n backend:
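The original snippet is not shown here, but a minimal sketch of such an initializer, assuming the custom class is named I18n::RequestStoreTranslation as described below, might look like:

```ruby
# config/initializers/i18n.rb — a minimal sketch, not the exact original code.
require "i18n/backend/key_value"

I18n.backend = I18n::Backend::Chain.new(
  # Key-value backend fed by our custom per-request store.
  I18n::Backend::KeyValue.new(I18n::RequestStoreTranslation.new),
  # Fallback to the static YML files in config/locales/.
  I18n::Backend::Simple.new
)
```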

Here I18n::RequestStoreTranslation is a custom key-value object which must implement the methods [], []= and keys so that I18n::Backend::KeyValue can use it.

This ensures that once I18n.t is called for the first time during a request, it attempts to seed itself with translations, which then stay accessible until the request ends. Here we also initialize the store with an empty hash, so that if the seeding process fails, I18n.t can fall back to the static files.
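A self-contained sketch of such a key-value object could look as follows; the real class uses the RequestStore gem for per-request storage, while Thread.current stands in here so the example runs on its own:

```ruby
# Hypothetical sketch of the custom key-value object. The real class
# delegates to RequestStore.store; Thread.current is used here only to
# keep the example self-contained.
class RequestStoreTranslation
  def [](key)
    store[key]
  end

  def []=(key, value)
    store[key] = value
  end

  def keys
    store.keys
  end

  private

  def store
    # Initialized with an empty hash, so if seeding fails I18n can
    # fall back to the static files; the seeding itself is omitted here.
    Thread.current[:translation_store] ||= {}
  end
end
```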

The seeding class implements the logic we defined earlier. Its main goal is to check whether the translations in memory are fresh, based on the timestamp retrieved from Redis. What is interesting is that we use the method I18n.backend.store_translations, which ensures that our translation data is stored exactly how I18n needs it.
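A hedged sketch of such a seeding class, with the class name and injected collaborators as our assumptions, could be:

```ruby
# Hypothetical seeding class; the cache (e.g. Rails.cache backed by Redis)
# and the backend (e.g. I18n.backend) are injected to keep it testable.
class TranslationSeeder
  def initialize(cache:, backend:)
    @cache = cache
    @backend = backend
  end

  def call
    redis_timestamp = @cache.read("translations/updated_at")
    # Fresh: nothing to do, the in-memory copy is still valid.
    return if Thread.current[:translations_timestamp] == redis_timestamp

    # Stale: re-store translations and remember the new timestamp.
    @cache.read("translations/data").each do |locale, translations|
      @backend.store_translations(locale, translations)
    end
    Thread.current[:translations_timestamp] = redis_timestamp
  end
end
```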

Internally, store_translations:

  1. adds the locale to each key,
  2. flattens the nested hash and
  3. converts values to JSON.
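This transformation can be sketched in plain Ruby (the real logic lives in I18n::Backend::KeyValue inside the i18n gem; the helper names here are ours):

```ruby
require "json"

# Standalone illustration of what the key-value backend does to the data.
def flatten_translations(locale, hash)
  flatten_pairs(hash, locale.to_s).to_h
end

def flatten_pairs(hash, prefix)
  hash.flat_map do |key, value|
    full_key = "#{prefix}.#{key}"      # 1. locale/parents prefixed onto the key
    if value.is_a?(Hash)
      flatten_pairs(value, full_key)    # 2. nested hashes are flattened
    else
      [[full_key, JSON.generate(value)]] # 3. values end up serialized as JSON
    end
  end
end

flatten_translations(:en, { greeting: { hello: "Hi" } })
# => { "en.greeting.hello" => "\"Hi\"" }
```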

To retrieve translations from the Redis storage we use a custom service class, FetchLokaliseCache. All it does is fetch the translation data and timestamp from Redis and pass them on in a result object. Should the data not be cached, it uses the Lokalise API to fetch it. It also accepts an option to refresh the cache, so the service can be reused for manually updating translations in our /admin section.
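A hedged sketch of the service, with the API wrapper and method names as assumptions (the real service uses Rails.cache and the official Lokalise client), might look like:

```ruby
require "json"

# Hypothetical sketch of FetchLokaliseCache; collaborators are injected.
class FetchLokaliseCache
  Result = Struct.new(:translations, :updated_at)

  def initialize(cache:, api:)
    @cache = cache  # anything responding to read/write, e.g. Rails.cache
    @api = api      # thin wrapper around the Lokalise API client (assumed)
  end

  def call(refresh: false)
    if refresh || @cache.read("translations/data").nil?
      # Cache miss or forced refresh: download from Lokalise and cache it.
      data = @api.download_translations
      @cache.write("translations/data", JSON.generate(data))
      @cache.write("translations/updated_at", Time.now.to_f)
    end
    Result.new(
      JSON.parse(@cache.read("translations/data")),
      @cache.read("translations/updated_at")
    )
  end
end
```

Passing refresh: true from the /admin action forces a re-download even when the cache is warm.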

Caveats

During implementation we encountered a few interesting caveats which needed solving.

Frontend translations

We use React as the web frontend, which also uses custom translations, tagged differently in Lokalise and read as static JSON files by React. For this we had to modify the flow to also include frontend translations, and we created an API endpoint to provide fresh JSON files for the frontend. For simplicity the code above covers only the backend flow.

Translations being loaded during the Rails build

We noticed that I18n.t() may be called not only at runtime, but also during the Rails build, with some translations used as frozen constants (for example, we found that the active_admin gem does this).

This was a potential problem during certain tasks which needed to build Rails but couldn’t connect to the internet to fetch the Lokalise translations. Luckily it resulted only in rather ugly error messages in the logs. We resolved it by adding an additional check for the Lokalise configuration, ensuring that if the credentials are not defined in ENV, the app skips fetching translations.
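Such a guard can be as simple as the following; the ENV variable names here are assumptions:

```ruby
# Hypothetical configuration guard; skip fetching from Lokalise entirely
# when credentials are missing (e.g. during an offline build).
def lokalise_configured?
  !ENV["LOKALISE_API_TOKEN"].to_s.empty? &&
    !ENV["LOKALISE_PROJECT_ID"].to_s.empty?
end
```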

Too many Lokalise requests in development mode

In development we used the :memory_store caching strategy, so each server restart would wipe the cache. Not great, not terrible, but it could potentially result in too many requests to Lokalise from a developer's machine, so to prevent this we changed the default development cache strategy to :file_store. With :file_store, translations are fetched and persisted locally only once.
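The change is a one-liner in the environment config; the cache path below is illustrative:

```ruby
# config/environments/development.rb — persist the cache across restarts
# so translations are downloaded from Lokalise only once per machine.
Rails.application.configure do
  config.cache_store = :file_store, Rails.root.join("tmp", "cache").to_s
end
```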

Testing end-to-end with rspec

We wanted to test the new feature thoroughly in automated tests, so we included chained I18n backends for each environment.

We then used mocking to test what we needed:

  1. In most of our test cases we didn’t need to test translations directly, so we mocked the Lokalise API service to return an empty result by default.
  2. In the few test cases where we wanted to test this new feature, we used VCR to replay a recorded successful request to the Lokalise API.

Conclusion

In this article we described the process of coming up with a solution for integrating an external translation service, such as Lokalise, with a Rails app without slowing it down. The solution can easily be reused for any other external translation management tool.
