
Google Surprises with Augmented Reality Translation Glasses at the End of Its I/O Presentation

Google had a “one more thing”-style surprise at the end of its I/O presentation on Wednesday: a short video showing off a pair of augmented reality glasses. The glasses have a single purpose: to show translations of the language you hear, right in front of your eyes. In the YouTube video, Google product manager Max Spear calls the prototype’s capability “subtitles for the world,” and we see family members communicating with each other for the first time.

The Most Impressive Aspect of Google’s Glasses

The most impressive aspect of Google’s new glasses was their ability to translate spoken language in real time, right before the wearer’s eyes. This strikes me as a genuinely practical application of augmented reality glasses. Even though a significant amount of capital is being poured into making AR glasses a reality, no one has yet developed a “killer app” compelling enough to make you forget the numerous privacy concerns that come with the technology. A tool that provides a real-time translation of what is being said would be an excellent candidate.

The company did not indicate when the glasses might be available, showed them off only in a pre-recorded video that never revealed the display or how you would actually use them, and said nothing about whether they would be priced reasonably. Still, what was demonstrated in the video paints a fascinating picture of what the future of AR might look like.

Subtitles for the world


During one of the demonstrations, a Google product manager says to a participant who is wearing the glasses, “You ought to be able to see exactly what I’m saying typed out for you in real-time as I say it. It’s almost like having subtitles for the world.” Later, the video demonstrates what you might see while wearing the glasses: as the speaker stands directly in front of you, the translated text appears in your field of vision in real time.
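For readers curious about what would have to happen under the hood, here is a minimal, hypothetical sketch of the kind of pipeline such glasses imply: capture speech, transcribe it, translate it, and draw the result as a caption in the wearer’s view. The function names and canned outputs below are placeholders of my own invention, not anything Google has described.

```python
# Hypothetical sketch of a live-captioning translation loop.
# The three helpers are stand-ins for real speech-to-text, machine-translation,
# and display services; none of this is Google's actual code or API.

import time


def transcribe_chunk(audio_chunk: bytes) -> str:
    """Placeholder: a real system would stream audio to a speech recognizer."""
    return "¿Dónde está la estación de tren?"


def translate_text(text: str, target_lang: str = "en") -> str:
    """Placeholder: a real system would call a translation model or service."""
    return "Where is the train station?"


def render_subtitle(text: str) -> None:
    """Placeholder: on real glasses this would draw text in the wearer's view."""
    print(f"[subtitle] {text}")


def caption_loop(audio_stream, target_lang: str = "en") -> None:
    """Continuously transcribe, translate, and display incoming speech."""
    for chunk in audio_stream:
        heard = transcribe_chunk(chunk)                   # speech -> source-language text
        translated = translate_text(heard, target_lang)   # source -> target language
        render_subtitle(translated)                       # show the caption immediately
        time.sleep(0.5)                                   # stand-in for real-world latency


if __name__ == "__main__":
    fake_stream = [b"chunk-1", b"chunk-2"]                # two fake audio chunks
    caption_loop(fake_stream)
```

Even in this toy form, the loop makes the hard part obvious: every stage adds latency and potential error, and the wearer sees whatever comes out the other end.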

You can’t be blamed if you missed this momentous occasion at Google I/O. It came at the very tail end of a marathon presentation and was over in a matter of minutes. The smart glasses were never given a name, they appeared only in a demo video, and even then they were more of a concept than a product. Google showed no user interface and gave no indication that the glasses would ever be made available for purchase.

We won’t know how well they perform until they are turned into actual products we can test. It is also unclear whether this is Google’s Project Iris, which we wrote about in January, or something else entirely. Still, if the vision Google showed at I/O were to materialize, it would be genuinely useful. You can watch the video for yourself to see it in action.

During the keynote for Wednesday’s I/O conference, Google CEO Sundar Pichai discussed the company’s perspective on augmented reality before showing the video. His remarks left the audience with the impression that the company believes augmented reality can be useful in a variety of settings beyond smartphones. And we shouldn’t forget that the company says this translation will happen inside a pair of augmented reality glasses. Not to pick a fight, but augmented reality hasn’t even caught up to the company’s concept video from ten years ago. You know, the one that preceded the much-criticized and awkward-to-wear Google Glass?

AR translation glasses seem to have a much clearer goal than Glass

To be fair, AR translation glasses seem to have a much clearer goal than Glass. From what Google showed, they’re only meant to do one thing: show translated text. They’re not meant to be an all-around computer that could replace a smartphone. But even so, it’s not easy to make AR glasses.

Even a small amount of background light can make it hard to read text on see-through screens. It’s hard enough to read subtitles on a TV when the sun is shining through a window. Now imagine that the subtitles are strapped to your face while you try to hold a conversation with someone whose language you don’t understand.

But technology changes quickly, so Google might be able to get around a problem that has stumped its competitors. That wouldn’t change the fact that Google Translate isn’t a perfect way to talk with people who speak other languages. If you’ve ever used a translation app to carry on a real conversation, you know that you need to speak slowly, deliberately, and clearly, unless you want the translation garbled. One slip of the tongue and you might be done for.

We use simpler sentences when dealing with machine translation

People don’t talk to each other in a vacuum or like machines. Just as we adjust the way we speak to voice assistants like Alexa, Siri, or the Google Assistant, we know we have to use much simpler sentences when dealing with machine translation. And even when we speak carefully, the translation can still come out wrong or awkward.

As Rami Ismail and Sam Ettinger pointed out on Twitter, Google’s Translate presentation showed more than half a dozen scripts that were backward, broken, or otherwise wrong on a single slide, which is an embarrassing slip for a translation demo. (Android Police says a Google employee has acknowledged the mistake and that the YouTube version of the keynote has been updated to fix it.) To be clear, we don’t expect perfection. But Google is trying to tell us that it’s close to cracking real-time translation, and mistakes like these make that seem very unlikely.

Google is trying to solve a very hard problem. Translating individual words is easy; figuring out grammar is hard but doable. But language and communication involve far more than those two things. As a simple example, Antonio’s mother speaks three languages (Italian, Spanish, and English). She will sometimes borrow words from one language in the middle of a sentence in another, including her regional Italian dialect (which is practically a fourth language).

People can usually make sense of that kind of thing, but could the company’s prototype glasses? Never mind the messier parts of a conversation: references that aren’t clear, thoughts that aren’t finished, innuendo.

More About Augmented Reality

Augmented reality is billed as the next step in the evolution of computing, and it has the potential to make all of this even more helpful. Google has invested a significant amount of both money and time in the sector and has been working to integrate augmented reality into a wide variety of its products.

These augmented reality capabilities are already helpful on mobile devices, but the real magic won’t begin to emerge until you can use them in the real world without having to worry about the technology getting in the way.

The most promising aspect of AR is its potential to free up more time for us to focus on activities that are meaningful in the real world and in our actual lives, because the world we live in is an interesting place to be.

Conclusion:

Using the company’s improved language and translation technology through a pair of smart glasses is a powerful idea. I can see right away how good that would be for me and the people I care about, and I can only imagine how exciting it will be for someone who can’t hear. You can keep the Pixel 7 and Pixel Watch. The reason I sit through more than two hours of Google I/O keynote presentations is to see demonstrations of amazing future technology like this, and the chills I get when I start to think about how it could change the world are my reward.
