I used Meta Ray-Ban in Montreal to test its AI translation skills. It didn’t go well



Imagine you’ve just arrived in another country, you don’t speak the language, and you come across a construction zone. The air is filled with dust. You’re tired. You still stink of airplane air. You’re trying to ignore the jackhammers to figure out what the signs are saying: Do you need to cross the street, walk another block, or turn around?

I was in exactly such a situation this week, but I came prepared. I flew to Montreal to spend two days testing the new AI translation feature on Meta’s Ray-Ban smart sunglasses. Within 10 minutes of setting out on my first walk, I was faced with an avalanche of confusing orange detour signs.

The AI translation feature is meant to give users a quick, hands-free way to understand text written in foreign languages, so I couldn’t have come up with a better pop quiz on how it works in real time.

As a bulldozer rumbled by, I looked at a sign and began asking my sunglasses to tell me what it said. Before I could finish, a harried Quebec construction worker started yelling at me and pointing north, and I ran across the street.

Photography: Kate Knibbs

Right at the beginning of my adventure with AI, I encountered the biggest limitation of this translation software: at the moment, it doesn’t tell you what people say. It can only analyze the written word.

I already knew at this point that the feature only handles written text, so that wasn’t a surprise. But I soon ran into other, less obvious limitations. Over the next 48 hours, I tested AI translation on a variety of road signs, business signs, advertisements, historical plaques, religious literature, children’s books, tourist brochures, and menus — with wildly varying results.

Sometimes it was competent, like when it told me that the book I bought for my son, Trois beaux bébés, was about three beautiful babies. (Correct.) It told me repeatedly that ouvert meant “open,” which, to be honest, I already knew, but I wanted to throw it some softballs.

Other times, my translation robot was not up to the task. It told me that the sign for the famous adult cinema Cinéma L’Amour translated as… “Cinéma L’Amour.” (F for effort: Google Translate at least changed it to “Cinema Love.”)

Courtesy of Kate Knibbs

At restaurants, I had trouble getting it to read me every single item on the menu. For example, instead of telling me all the different burger options at a brewery, it simply told me there were “burgers and sandwiches” and refused to be more specific despite my cajoling.

When I went to an Italian restaurant the next night, it likewise gave me a broad summary of the offerings rather than breaking them down in detail: it told me there were “grilled meat skewers” but not, for example, that there were duck confit, lamb, and beef options, or how much they cost.

All in all, at this point, AI translation is more of a temperamental gimmick than a genuinely useful travel tool for foreign climes.

How it works (or doesn’t)

To use AI translation, the glasses wearer must say the following magic words: “Hey Meta, look…” and then ask it to translate what they are looking at.

The glasses take a snapshot of whatever is in front of you and then read you the text after a few seconds of processing. I was expecting straightforward translations, but it rarely offers word-for-word breakdowns. Instead, it paraphrases what it sees or offers a broad summary.
