I’m excited to announce the publication of The 2018 Web Globalization Report Card. This is the most ambitious report I’ve written so far and it sheds light on a number of new and established best practices in website globalization.
First, here are the top-scoring websites from the report:
For regular readers of this blog, you’ll notice that Google was unseated this year by Wikipedia. Wikipedia, with support for an amazing 298 languages, improved its global navigation over the past year, which pushed it into the top spot. And because Wikipedia is entirely user-supported, its breadth of languages indicates that there is great demand for languages on the Internet, a demand very few companies have yet answered in kind.
Google could still stand to improve in global navigation, as could Facebook.
Other highlights from the top 25 list include:
Consumer goods companies such as Pampers and Nestlé are a welcome sign that non-tech companies are making real strides in website globalization.
As a group, the top 25 websites support an average of more than 80 languages (up from 54 last year); but note that we added a few websites that made a big impact on that average.
Luxury brands such as Gucci and Ralph Lauren continue to lag in web globalization — from poor support for languages to inadequate localization.
The average number of languages supported by all 150 global brands is now 32.
The data underlying the Report Card is based on studying the leading global brands and world’s largest companies — 150 companies across more than 20 industry sectors. I began tracking many of the companies included in this report more than a decade ago and am happy to share insights into what works and what doesn’t.
I’ll have much more to share in the weeks and months ahead. If you have any questions about the report, please let me know.
Congratulations to the top 25 companies and to the people within these companies who have long championed web globalization.
Intel emerged on top for the second year in a row, followed by Cisco Systems and Autodesk.
A new entrant this year is HP Enterprise, which ranked relatively low, due in large part to limited language coverage, but is notable for a world-ready architecture and above-average global gateway.
Intel held steady over the year with support for 23 languages. Intel modified its web design to support a “fly in” navigational menu, and the support section is also better integrated into the design this year.
As before, Intel does an excellent job of supporting global consistency. Shown below is the Brazil home page, which shares the same underlying template as other country sites.
The nice thing about placing the Intel logo in the middle of the design is that you don’t have to worry about the logo shifting from side to side when the layout mirrors for right-to-left scripts, such as Arabic, shown below.
Notice the globe icon in the header — easy to find and use for anyone who wishes to navigate to a different locale. This is a relatively new (and valuable) addition to the mobile site, shown here:
Cisco remains the language leader of this category with 40 languages. Cisco debuted a new web design over the past year. Shown below are the before and after designs.
The most noticeable improvement is the addition of a globe icon in the header to indicate the global gateway. This is a small but important step forward in ensuring that users more easily find where they need to go.
Oracle most recently added support for Ukrainian and Arabic, increasing its language total to 32. Meanwhile, SAP dropped two languages over the past year, lowering its language total to 35 languages.
IBM is on year two of its new web design. It remains steady with 38 languages. Unfortunately, the global gateway is buried in the footer of both the desktop and mobile websites.
HP Enterprise is a new global website born of a spinoff from HP. The web design uses a lightweight, responsive template and includes the perfect global gateway icon in the header — yes, the globe icon.
Unfortunately, I found the global gateway menu to be buggy and difficult to use — and it is demoted to the footer on the mobile website.
To learn more about these websites along with best practices and emerging trends, check out the 2017 Report Card.
It’s hard to believe that this is the twelfth edition of the Report Card. Over the past decade I’ve seen the average number of languages supported by global brands increase from just 10 languages to 30 languages today.
And, of course, the top 25 websites go well beyond 30 languages. Google supports 90 languages via Google Translate and 75 languages on YouTube. And Facebook stands at 88 languages.
But it’s not just languages that make a website succeed globally. Companies need to support fast-loading mobile websites, locally relevant content, and user-friendly navigation.
Notable highlights among the top 25:
Wikipedia is far and away the language leader, with content in more than 270 languages. The organization also now supports a mobile-friendly layout that is considerably lighter (in kilobytes) than most Fortune 100 mobile websites.
NIVEA provides an excellent example of a company that localizes its models for local websites — one of the few companies to do so.
Nike made this top 25 list for the first time, having added languages and improved global consistency and navigation.
As a group, the top 25 websites support an average of 52 languages.
For 2016, we studied 150 websites across 15 industry categories — and more than 80% of the Interbrand Best Global Brands. Websites were graded according to languages supported, global navigation, global and mobile website architecture, and localization.
I’m happy to announce the publication of the 2012 Web Globalization Report Card. This year, we reviewed 105 websites across 17 industries; the websites comprise 70% of the Interbrand Best Global Brands of 2011. This year, we also reviewed mobile websites and mobile apps, to better understand how companies were balancing global and mobile strategies.
Out of the websites reviewed, here are the top 25 overall:
Last year, Facebook emerged (barely) as number one. This year, Google reclaims the top spot. Although Google continues to struggle to harmonize its global navigation across its many applications, the company also continues to invest in globalization. Google now supports more than 140 languages on its search engine and its new Google+ app supports an impressive 40 languages. Facebook’s mobile app, by comparison, supports just 13 languages. Though Facebook continues to improve its global navigation, its language growth stalled in 2011.
As a group, the top 10 websites support an average of more than 50 languages. They also demonstrate a high degree of global design consistency across most, if not all, localized websites. This degree of consistency allows them to focus their energies on content and mobile localization. Two new companies on this list – Hotels.com and Booking.com – exhibit an impressive commitment to mobile devices. Any company that is developing a global mobile strategy should study these two companies.
Why didn’t Apple make the top 10?
I anticipated this question, as I was asked the same thing last year. After all, how can a company with nearly $100 billion in the bank not be in the top 10? It seems that Apple has been rather tightfisted with its translation spending; the company supports far fewer languages on its website than on its mobile operating system, iOS. Does it make sense for the iPad and iPhone to support Arabic and Hebrew while Apple’s website does not?
Language parity between mobile and PC is a key component of the 2012 Report Card and Apple did not fare well in this regard.
It’s worth noting that of the websites reviewed, roughly half now support Arabic and/or Hebrew.
In the Report Card, languages account for 25% of a web site’s score. We also evaluate a web site’s depth and breadth of local content, support for local-language social networks, the effectiveness of the global gateway, and global consistency across PC and mobile platforms. Beginning in 2010, we began tracking how companies promote local social platforms such as Facebook and Twitter around the world. In 2010, only a handful of companies supported a Twitter or Facebook page outside of English. Today, more than half of all companies reviewed support a social network outside of English.
Cisco Systems is worth studying for its Social@Cisco pages. This social aggregation page was first launched in 2010. It is now available in more than 30 markets, with local feeds incorporated.
Hard for me to believe, but this is the eighth edition of the Report Card. It’s the largest report ever, with 40 website profiles and a special section on “taking mobile global.” I’ll have lots more to say in the weeks ahead.
But translation memory requires translation memory software, which has for years largely meant SDL Trados.
When a company hires a translation agency and requires that it use translation memory, not only must that agency have Trados software, but so must the freelance translators, who are often located all around the world. This is a nice business model for SDL, but it has been a pain point for translators and agencies for years.
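For readers new to the concept, a translation memory is at its core a store of previously translated segment pairs: when a new source segment arrives, the tool looks for an exact or fuzzy match and offers the stored translation for reuse. Here is a minimal sketch of that lookup in Python, not any particular product’s implementation; the segments and the `lookup` helper are invented for illustration.

```python
from difflib import SequenceMatcher

# A translation memory maps previously translated source segments
# to their stored target-language translations (here, English -> French).
translation_memory = {
    "Click the Save button.": "Cliquez sur le bouton Enregistrer.",
    "Your changes have been saved.": "Vos modifications ont été enregistrées.",
}

def lookup(segment, threshold=0.75):
    """Return (score, stored_translation) for the best fuzzy match,
    or None if nothing scores at or above the threshold."""
    best = None
    for source, target in translation_memory.items():
        score = SequenceMatcher(None, segment, source).ratio()
        if score >= threshold and (best is None or score > best[0]):
            best = (score, target)
    return best

# An exact match scores 1.0; a near match still surfaces the translation.
print(lookup("Click the Save button."))
```

Real TM tools add leverage reporting, terminology handling, and format filters on top of this core idea, which is why the choice of software matters so much across a supply chain.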
For agencies, the more acute pain point has been that SDL not only sells TM software but also sells translation services, putting it in direct competition with the agencies that depend on its tools. Nearly every translation exec I have spoken to has openly asked for an open-source alternative to Trados.
Well, now we have one.
IBM has partnered with LISA (Localization Industry Standards Association), Welocalize, Cisco, and Linux Solution Group e.V. (LiSoG) to launch an open source project that provides a “full-featured, enterprise-level translation workbench environment for professional translators.”
It’s called Open TM2 — and it’s basically a scaled-down version of what IBM has developed and used internally for years. I haven’t used the product yet and there’s understandably quite a bit of work involved to get this software to a point where it’s easy for translators, agencies, etc. to consume.
I’m not prepared to say Open TM2 is going to put an end to Trados. After all, Linux didn’t exactly put Windows or OS X out of business. But I am excited to see it out there in the world. Open source keeps software vendors on their toes. I’ll be very curious to see if developers embrace the code, and what they come up with.
To learn more, I interviewed one of the partners behind Open TM2, Smith Yewell, CEO of Welocalize.
Here is what he had to say:
Q: Why did IBM decide to open source its software in this fashion? What does it hope to gain?
Bill Sullivan can answer this question better than I, but as he stated, “Freelance translators are the backbone of the localization industry. These translators have longed for free and open translation tools to increase their productivity. There is a recognized and growing need for standards in the localization industry. Despite our best intentions, however, standards themselves can often be vague and open to multiple interpretations. What is needed are reference implementations and reference platforms that serve as concrete and unambiguous models in support of the standard.”
In my opinion, productivity and standardization go hand-in-hand. By releasing Open TM2 as an open source product with a standards-based, data-exchange goal, not only is there potential for increased productivity – flexibility and freedom of choice also increase.
Q: And what do you hope to gain from this effort?
I like to use the mobile phone analogy. I can travel just about anywhere in the world, turn my phone on, and it works. This is possible, because competing carriers and hardware manufacturers collaborated to be able to offer that seamless user experience across global networks and handset protocols. Consider the user experience in our industry. There is really no ability for a client to turn on a translation supply chain and have it work out of the box across various content types, tools and translation vendors. The clients I speak with are demanding that this change.
GlobalSight, Joomla and Open TM2 are being used to demonstrate an example of a seamless data exchange based upon a set of standards. LISA will play an important role in documenting and sharing these standards so that they can be applied uniformly to other integrations. To put it simply, we need a variety of tools to be able to talk to each other in an automated way. This is where I think we can improve time, cost and quality results and greatly improve the user experience. Ultimately, I expect Welocalize to gain an increase in productivity, interoperability and freedom of choice in configuring the best set of tools for each client’s unique translation supply chain needs. If we can get under the hood, we can tune the engine; otherwise, it is becoming increasingly difficult to gain time, cost and quality advantages from the old way of doing business.
Q: Who is going to use this software? And what software will it replace?
Many translators are already using TM2 in delivering work to IBM. I expect Open TM2, as its features grow, will appeal to more translators as a desktop workbench. This is only an initial release of the open source product, and there is much work to be done. But the potential is there to collaborate and improve. Ultimately, I think Open TM2 has the potential to replace the Trados desktop workbench.
Q: When you talk open source, stability and support are common pain points. Who will be actively supporting this effort?
The members of the Steering Committee are currently supporting the effort, and the goal is to build a community that can support itself. This open source initiative is not unlike others: what one puts into it will determine the benefits one can pull from it. I wouldn’t be surprised to see a company create a business model to offer Open TM2 support. Support, training and customization are typical services that bloom around open source initiatives.
Q: What would stop a technology company from taking the source code and creating a competing TM product?
It is an open source product, so there is potential for companies to build a business model around it. However, I doubt that will take the form of a proprietary fork of the code. The appeal is an open source product with growing standards compliance, not yet another proprietary product. What is more likely are support, training and integration services. Anyone investing in the product naturally expects a return, and the better the return, the healthier and more diverse the community will be. I think that is a good thing. Competition drives innovation. However, if we can’t get the standard data-exchange protocols right, productivity across the supply chain will continue to lag the increasing velocity of change in the marketplace. Rapidly evolving time, cost and quality demands already exceed what the traditional translation supply chain can deliver.
Q: The source code is available now but documentation is lacking. What is your timetable for launching a more translator- and agency-friendly product?
I think the first step for the Steering Committee is to take the feedback that is already coming in about the product, good and bad, and use it to set priorities, responsibilities and a timeline. The idea is sound, but it must be tested in practical use and refined according to what the market really needs. Translators have the answers to many challenges in our supply chain; they are just not asked very often.
Q: How will this software be integrated? Is there a goal of integrating it with the open source GlobalSight CMS?
Content creation, translation, workflow and performance metrics reporting – there are many systems and tools for accomplishing each of these requirements. However, very few of them can pass necessary data in an automated way. A lot can be accomplished with web services and open APIs, but widespread integration possibilities can only be realized with a critical mass actively using an industry-supported data-exchange standard.
In order to demonstrate this possibility in a live use case scenario, Joomla, GlobalSight and Open TM2 will be integrated with the resultant standards published by LISA. I think additional standards organizations will also need to participate to gain wider understanding, agreement and adoption. If enough of the industry’s thought leaders and leading practitioners get behind this standard data-exchange and tools integration challenge, I think all boats will rise. Without it, the industry will never be able to approach the growing volume of content which current production and cost models can’t support.
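The kind of standards-based data exchange Yewell describes is exactly what LISA’s TMX (Translation Memory eXchange) format was designed for: an XML vocabulary any TM tool can read or write. As a rough illustration, here is a Python sketch that extracts segment pairs from a minimal TMX document; the document content itself is invented for the example.

```python
import xml.etree.ElementTree as ET

# A minimal, hand-written TMX 1.4 document. TMX groups translations into
# <tu> (translation unit) elements, each holding per-language <tuv> variants.
TMX = """<?xml version="1.0" encoding="UTF-8"?>
<tmx version="1.4">
  <header srclang="en" datatype="plaintext" segtype="sentence"
          creationtool="example" creationtoolversion="0.1"
          adminlang="en" o-tmf="none"/>
  <body>
    <tu>
      <tuv xml:lang="en"><seg>Save your work.</seg></tuv>
      <tuv xml:lang="fr"><seg>Enregistrez votre travail.</seg></tuv>
    </tu>
  </body>
</tmx>"""

def load_tmx_pairs(tmx_text, source_lang, target_lang):
    """Extract (source, target) segment pairs from a TMX document."""
    # ElementTree exposes xml:lang under the XML namespace URI.
    xml_lang = "{http://www.w3.org/XML/1998/namespace}lang"
    pairs = []
    root = ET.fromstring(tmx_text)
    for tu in root.iter("tu"):
        segs = {tuv.get(xml_lang): tuv.findtext("seg") for tuv in tu.iter("tuv")}
        if source_lang in segs and target_lang in segs:
            pairs.append((segs[source_lang], segs[target_lang]))
    return pairs

print(load_tmx_pairs(TMX, "en", "fr"))
```

Because every compliant tool can emit and consume this structure, a client can in principle move memories between GlobalSight, Open TM2, and other workbenches without lock-in, which is the interoperability argument made above.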
Over the past six months, Twitter went from mostly serving people based inside the US to mostly serving people based outside of the US.
Today, 60% of Twitter’s 105 million registered users are based outside of the United States.
And half of all tweets are in a language other than English.
This is a remarkable trend, particularly since Twitter has only been localized into five languages so far.
A few months ago, I set out to better understand how large, multinational companies are using Twitter to reach users around the world.
I studied more than 225 companies across 21 industry verticals (representing 80% of the Interbrand 100). And I interviewed a number of people who manage Twitter feeds in different markets.
This work resulted in the report Twittering in Tongues. This report is a first stab at a phenomenon that is very much in its early days, so it’s hard to draw any sweeping conclusions. But there are some clearly emerging trends, which I discuss. I also highlight a number of Twitter’s inherent international limitations and provide some recommendations for companies considering localized Twitter feeds.
Here are a few findings/recommendations from the report:
Most companies have yet to launch international Twitter feeds. Only one-third of the 225 companies studied support one or more Twitter feeds outside of their domestic markets. What makes this ratio interesting is that every one of the 225 companies studied supports two or more localized web sites, so these are all companies that do business in three or more countries. A number of companies that support more than 20 local web sites still use Twitter only for their domestic markets.
Sony leads the pack with support for 20 international Twitter feeds, mostly through its Sony Music division. Microsoft, Cisco Systems, and PricewaterhouseCoopers are also out in front with support for 10 or more country-specific Twitter feeds. CAVEAT: Counting feeds is a tricky business. Not all corporate feeds are actively managed (I did not count inactive feeds) and not all local feeds are easy to find.
Brazil rules. Brazil is by far the most popular Twitter market outside of the US. Nearly half of the companies that support one or more international feeds have targeted Brazil. Not surprisingly, Brazilian Portuguese is the second most popular language used on Twitter.
Local Twitter success depends on local web site promotion. It’s also no surprise that the local feeds with some of the highest numbers of followers also had high visibility on their local web sites. Companies such as Dell and Samsung lead in this respect. Below is a screen shot from Samsung’s Brazil home page; Twitter gets prime real estate.
Twitter is local by design. Based on my interviews, most of the in-country Twitter feeds have been launched without any central approval process or even awareness. This also applies to local Facebook and YouTube pages. The evolution of local Twitter feeds is similar to the evolution of local web sites in the 1990s. Back then, local offices often created their own sites, with their own designs and platforms. Over the years, the central offices reined in these disparate sites, sometimes going too far and dampening local enthusiasm. The key challenge I see executives facing now is balancing local control with global consistency. While consistency is important, it should not come at the expense of local enthusiasm and innovation. In the end, the success of local Twitter feeds depends on the local offices.