The top 25 global websites from the 2017 Web Globalization Report Card

I’m excited to announce the publication of The 2017 Web Globalization Report Card. This is the most ambitious report I’ve written so far and it sheds light on a number of new and established best practices in website globalization.

Here are the top-scoring websites from the report:

For regular readers of this blog, you’ll notice that Google is yet again ranked number one. But Google isn’t resting on its laurels. While many software companies are happy to support 20 or 30 languages on their websites, Google continues to add languages across its many products. Consider Gmail, with support for 72 languages, and YouTube, with 75. And let’s not overlook Google Translate, now at 100+ languages.

Google could still stand to improve in global navigation, though I am seeing positive signs of harmonization across its many product silos. But I do maintain the recommendation that Google present a more traditional global gateway to visitors across its sites and apps.

Other highlights from the top 25 list include:

  • Consumer goods companies such as Pampers and Nestlé are a positive sign that non-tech companies are making real strides in improving their website globalization skills.
  • IKEA returned to the list this year after making a welcome change to its global gateway strategy.
  • Nissan made the top 25 list for the first time. BMW slipped off the list.
  • As a group, the top 25 websites support an average of 54 languages (up from 52 last year); if we removed Wikipedia from the language counts, the average would still be an impressive 44 languages.
  • GoDaddy, a new addition to the Report Card, wasted little time in making this list. Its global gateway is worth studying.
  • Luxury brands such as Gucci and Ralph Lauren continue to lag in web globalization — from poor support for languages to inadequate localization.
  • The average number of languages supported by all 150 global brands is now 31.

But as you can see here, the rate of language growth, on average, is slowing. That’s not necessarily a bad thing. Companies are telling me that they are investing more in the depth and quality of localization — which is of huge importance.

The data underlying the Report Card is based on studying the leading global brands and the world’s largest companies — 150 companies across more than 20 industry sectors. I began tracking many of the companies included in this report more than a decade ago and am happy to share insights into what works and what doesn’t. Time is often the greatest indicator of best practices.

I’ll have much more to share in the weeks and months ahead. If you have any questions about the report, please let me know.

Congratulations to the top 25 companies and the people within these companies who have long championed web globalization.

The 2017 Web Globalization Report Card

Click here to download a PDF brochure for the report.

Managing language expectations when you can’t translate everything

I don’t know of any large company that translates all of its content into all of its target languages.

I won’t go into the many reasons why this is — money being the major one — but I will say that if this is an issue you struggle with, you’re not alone.

The key to success is in managing user expectations. Few companies do this well.

I wrote a bit about this for a sponsored blog post here.

An excerpt:

Some links are better left un-translated
Let’s suppose that you’ve translated 25% of your website — you’re bound to have plenty of web pages that link to other web pages that are still in the source language. Should you remove those links, or should you translate them?

I recommend leaving these links un-translated. That way, users see English text before they click, which is a subtle but important way to manage user expectations.

An alternative, which many content management systems support, is to go ahead and translate the links but also append text that reads “In English” next to each link. Make sure “In English” is also translated!
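To make that second approach concrete, here is a minimal sketch of how such a label might be applied in a template or rendering layer. This is my illustration, not code from the post or from any particular CMS; the label_link function, the translated_paths set and the label dictionary are hypothetical stand-ins for whatever your system exposes.

```python
# A minimal sketch (illustration only): append a localized "In English" label
# to links whose target pages have not yet been translated.

IN_ENGLISH_LABELS = {
    "de": "Auf Englisch",
    "es": "En inglés",
    "fr": "En anglais",
}

# Pages that already exist in the user's language (hypothetical data).
translated_paths = {"/de/products", "/de/support"}

def label_link(href: str, link_text: str, ui_language: str) -> str:
    """Return the link text, adding a localized 'In English' note when the
    target page is still available only in English."""
    if href in translated_paths:
        return link_text
    note = IN_ENGLISH_LABELS.get(ui_language, "In English")
    return f"{link_text} ({note})"

print(label_link("/de/blog", "Unser Blog", "de"))    # Unser Blog (Auf Englisch)
print(label_link("/de/products", "Produkte", "de"))  # Produkte
```

The detail that matters is that the “In English” note itself is localized, so the expectation is set in the reader’s own language.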

Link

Measuring translation quality: A Q&A with TAUS founder Jaap van der Meer

Every translation vendor offers the highest-quality translations.

Or so they say.

But how do you know for sure that one translation is better than another translation?

And, for that matter, how do you fairly benchmark machine translation engines?

TAUS has worked on this challenge for the past three years along with a diverse network of translation vendors and buyers, including Intel, Adobe, Google, Lionbridge, and Moravia (among many others).

They’ve developed something they call the Dynamic Quality Framework (DQF) and they took it live earlier this month with a website, knowledge base and evaluation tools.

TAUS DQF

To learn more, I recently interviewed TAUS founder and director Jaap van der Meer.

Q: Why is a translation quality framework needed?
In 2009 and 2010 we did a number of workshops with large enterprises with the objective of better understanding the changing landscape for translation and localization services. As part of these sessions we always do a SWOT analysis, and quality assurance and translation quality consistently popped up on the negative side of the charts: as weaknesses and threats. All the enterprises we worked with mentioned that the lack of clarity on translation quality led to disputes, delays and extra costs in the localization process. Our members asked us to investigate this area further and to assess the possibilities for establishing a translation quality framework.

Q: You have an impressive list of co-creators. It seems that you’ve really built up momentum for this service. Were there any key drivers for this wave of interest and involvement?
Well, on top of the fact that translation quality has never been well defined for as long as there has been a translation industry, the challenges of the last few years have become so much greater because of the emergence of new content types and the increasing interest in technology and translation automation.

Q: What if the source content is poorly written (full of grammatical errors, passive voice, run-on sentences)? How does the DQF take this into account?
We work with a user group that meets every two months and reviews new user requirements. Assessing source content quality has, of course, come up as a concern, and we are now studying how to take this into account in the Dynamic Quality Framework.

Q: Do you have any early success stories to share of how this framework has helped companies improve quality or efficiency?
We have a regular user base now of some 100 companies. They use DQF primarily to get an objective assessment of the quality of their MT systems. Before, they worked only with BLEU scores, which are not very helpful in a practical environment and are not a real measure of the usability of translations. Many companies also work with review comments from linguists, which tend to be subjective and biased.

Q: How can other companies take part? Do they need to be TAUS members?
Next month (December) we will start making the DQF tools and knowledge bases available for non-members. Users will then be able to sign up for just one month (to try it out) or for a year without becoming members of TAUS.

Q: The DQF can be applied not only to the more structured content used in documentation and knowledge bases but also to marketing content. How do you measure quality when content must be liberally transcreated into the target language? And what value does the DQF offer for this type of scenario?
We have deliberately chosen the name “Dynamic” Quality Framework, because of the many variables that determine how to evaluate the quality. The type of content is one of the key variables indeed. An important component of the Dynamic Quality Framework is an online wizard to profile the user’s content and to decide – based on that content profile – which evaluation technique and tool to use. For marketing text this will be very different than for instructions for use.

Q: Do you see DQF having an impact on the creation of source content as well?
Yes, even today the adequacy and fluency evaluation tools – that are part of DQF – could already be applied to source content. But as we continue working with our user group to add features and improve the platform, we will ‘dynamically’ evolve to become more effective for source content quality evaluation as well.

Q: An argument against quality benchmarks is that they can be used to suck the life (or art) out of text (both source and translated text). What would you say in response to this?
No, I don’t think so. You must realize that DQF is not a mathematical approach that assesses quality only by counting errors (as most professionals in the industry have been doing for the longest time now with the old LISA QA model or derivatives thereof). For a nice and lively marketing text, the DQF content profiler will likely recommend a ‘community feedback’ type of evaluation.

Q: Where do you see the DQF five years from now in terms of functionality?
Our main focus is now on integration and reporting. Next year we will provide the APIs that allow users to integrate DQF into their own editors and localization workflows. This will make it so much easier for a much larger group of users to add DQF to their day-to-day production environment. In our current release we provide many different reports for users, but what we would like to do next year is let users define their own reports and views of the data in a personalized dashboard.
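A quick aside on the BLEU scores Jaap mentions above: BLEU measures n-gram overlap between machine output and one or more reference translations, which is why a perfectly usable translation that is merely reworded can still score poorly. Here is a minimal sketch using NLTK (my example, not part of the DQF tools):

```python
# Minimal BLEU illustration using NLTK. BLEU rewards n-gram overlap with a
# reference translation, so a fluent rewording can score low even when it is
# perfectly usable.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = "the cat is on the mat".split()
hypothesis = "there is a cat on the mat".split()

score = sentence_bleu(
    [reference],                                     # reference translation(s)
    hypothesis,                                      # tokenized MT output
    smoothing_function=SmoothingFunction().method1,  # avoid zero scores on short segments
)
print(f"BLEU: {score:.2f}")
```

A single number like this says nothing about adequacy, fluency or fitness for the content type, which is exactly the gap a framework like DQF aims to fill.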

TAUS Link

Transcreation is here to stay

In 2005, I wrote that transcreation was gaining momentum.

I predicted that we’d see a lot more use of this word in the years ahead. Why? Because “translation sounds like a commodity; transcreation sounds like a service.”

So here we are in 2013, and a Google search on “transcreation” brings up 392,000 results.

Translators often cringe when hearing this word. And I have often felt the urge to do the same because, frankly, good translators and translation agencies have been providing this service all along.

The idea that literal, word-for-word translation is the only service translators provide is simply wrong, and it has to some extent been propagated by a translation industry built upon stressing quality (as in literal translation) over more marketing-oriented translation.

So now we have a number of marketing firms and advertising agencies who use this term quite liberally to promote their unique brand of translation services. Here is a screen grab from the website of Hogarth:

Hogarth and Transcreation

By the way, Hogarth is looking to hire a Transcreation Account Manager to “manage the transcreation and production of advertising for major global brands.” Here is the link.

Transcreation is here to stay.