You Say Falkland Islands. I Say Islas Malvinas.

Remember the Falklands War?

I do and, yes, this does make me feel a little old.

For those of you who don’t remember, the war was fought over a group of small islands far off the Patagonian coast of Argentina.

The British won the war but the Argentines are still very attached to the islands.

So what we have here is a disputed territory, always a challenge for mapmakers.

Here’s a screen grab from Google Maps. Notice how “Islas Malvinas” is in parentheses.

As a test, I switched my language preference on Google Maps to Spanish, thinking maybe I’d see Falkland Islands placed within the parentheses instead. But no.

However, Bing does localize the map based on language. When I switched Bing Maps to Spanish, here’s what I saw:

This is map localization at work.
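The real map services handle this server-side, of course, but the basic idea is easy to sketch: pick which name leads, and which goes in parentheses, based on the viewer’s language preference. The labels and the little helper below are purely illustrative, not any map provider’s actual API.

```python
# Minimal sketch of language-based label selection for a disputed place name.
# The label strings and this helper are illustrative, not any map provider's API.
PLACE_LABELS = {
    "en": "Falkland Islands (Islas Malvinas)",
    "es": "Islas Malvinas (Falkland Islands)",
}

def pick_label(preferred_language, default="en"):
    """Return the label for the viewer's language, falling back to English."""
    return PLACE_LABELS.get(preferred_language, PLACE_LABELS[default])

print(pick_label("es"))  # Islas Malvinas (Falkland Islands)
```

Switch the language preference and the parenthetical flips, which is all that “map localization” amounts to from the viewer’s side.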

I hope to one day visit these islands — and I hope they can survive the next looming (environmental) conflict. The Falklands would not be in the news today if not for great quantities of oil buried deep below the ocean floor. Make no mistake, oil is at the center of this current dispute, not the natural wildlife, which neither government seems too terribly concerned about.

If it were up to me — and if only it were — I would hand over the islands to the one government that promised to leave the islands free of oil derricks. The Falklands are of enormous importance to penguins, albatrosses, and many other creatures that are running out of safe places to nest.

PS: Here’s a recent article in the NYT about the islands.


Think your translator is cutting corners? Try the machine translation detector…

Lior Libman of One Hour Translation has released a web tool that you can use to quickly determine if text was translated by one of the three major machine translation (MT) engines: Google Translate, Yahoo! Babel Fish, and Bing Translate.

It’s called the Translation Detector.

To use it, you input your source text and target text, and it tells you the probability that each of the three MT engines was the culprit.

How does it know this? Simple. Behind the scenes, it runs your source text through the three MT engines and compares each engine’s output to your target text. So the caveat here is that this tool only compares against those three MT engines.
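The tool’s own code isn’t public, but the idea is easy to sketch in Python. The translate_with_engine() function below is a hypothetical stand-in for calls to each engine’s API, and the similarity score just uses difflib from the standard library rather than whatever scoring One Hour Translation actually uses.

```python
# Minimal sketch of the detection idea (not the actual Translation Detector code).
from difflib import SequenceMatcher

ENGINES = ["Google Translate", "Bing Translator", "Yahoo! Babel Fish"]

def translate_with_engine(engine, source_text, source_lang, target_lang):
    """Hypothetical stand-in: call the given MT engine's API and return its translation."""
    raise NotImplementedError("Wire this up to the real MT APIs.")

def detect_mt_engine(source_text, target_text, source_lang="en", target_lang="de"):
    """Compare the suspect translation against each engine's output."""
    scores = {}
    for engine in ENGINES:
        mt_output = translate_with_engine(engine, source_text, source_lang, target_lang)
        # Ratio of matching characters: 1.0 means the texts are identical.
        scores[engine] = SequenceMatcher(None, mt_output, target_text).ratio()
    return scores

# A score near 1.0 for any engine suggests the "translation" came straight from it.
```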

Being the geek that I am, I couldn’t help but give it a test drive.

It correctly distinguished text translated by Google Translate from text translated by Bing Translate (I didn’t try Yahoo!). Below is a screenshot of what I found after inputting the Google Translate text:

Next, I input source and target text that I had copied from the Apple web site (US and Germany). I would be shocked if the folks at Apple were crunching their source text through Google Translate.

And, sure enough, here’s what the Translation Detector spit out:

So if you suspect your translator is taking shortcuts with Google Translate or another engine, this might be just the tool to test that theory.

Though in defense of translators everywhere, I’ve never heard of anyone resorting to an MT engine to cut corners.

I actually see this tool as part of something bigger — the emergence of third-party tools and vendors that evaluate, benchmark, and optimize machine translation engines. Right now, these three engines are black boxes. I wrote a while back about one person’s effort to compare the quality of these three engines. But there are lots of opportunities here. As more people use these engines, there will be a greater need for intelligence about which engine works best for which types of text. And hopefully we’ll see vendors arise that leverage these MT engines for industry-specific functions.

UPDATE: As commenters noted below, the quality of the results degrades if you input more than roughly 130 words; the tool is constrained by the word-length caps of the underlying MT APIs.
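If you need to check a longer passage, one workaround (my suggestion, not a feature of the tool) is to break the source and target into smaller aligned pieces, say paragraph by paragraph, and test each pair separately. A rough sketch, assuming your texts line up at the paragraph level:

```python
# Rough sketch: break a long text into paragraph-sized pieces so each piece
# stays under the ~130-word cap reported by commenters. Paragraph-level
# alignment between source and target is an assumption about your texts.
MAX_WORDS = 130

def split_into_pieces(text, max_words=MAX_WORDS):
    """Split text on blank lines, flagging any paragraph that exceeds the cap."""
    pieces = []
    for paragraph in text.split("\n\n"):
        paragraph = paragraph.strip()
        if not paragraph:
            continue
        if len(paragraph.split()) > max_words:
            print("Warning: paragraph exceeds the cap; split it further by sentence.")
        pieces.append(paragraph)
    return pieces

# Then test each source paragraph against its corresponding target paragraph.
```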

Google, Bing and Babelfish: What’s the best translation engine?

Two months ago I wrote about an effort to evaluate the quality of the three major free machine translation (MT) engines:

  • Google Translate
  • Bing (Microsoft) Translator
  • Yahoo! Babelfish

Ethan Shen has wrapped up the project, which drew input from more than 1,000 reviewers. He summed up his findings here.

Here are the findings that jumped out at me:

  • Google wins, hands down, when translating longer text passages. No big surprise here.
  • Bing and Babelfish are competitive when translating shorter texts (150 or fewer characters). Bing did quite well with Italian and German, while Babelfish did well with Chinese.
  • Google’s brand trumps all. About halfway through his test, Ethan removed the brand names from the translation engines, so the reviewers did not know which engine was doing which translation. The change in results was significant. Reviewers were 21% more likely to say Google was better than Microsoft when they knew the brand names. And reviewers were 136% more likely to say Google was better than Babelfish.

This last finding is what poses the greatest hurdle for Microsoft and Yahoo!

When it comes to machine translation — perception is (almost) everything. If people think you’re the best translation engine, then you are the best.

Integration is the other key element of success, and Google Translate is doing well here also — I absolutely love the Chrome browser integration.

Ethan is not done with his research. This is only stage one. To help him with stage two, click here.