Google Lens Gets ‘AR Translate’ and Expands Multisearch to More Languages
Currently, any text that’s converted into a different language is overlaid with coloured blocks that mask parts of the background image. AR Translate better preserves the image by removing those blocks and swapping in the translated text directly, so the result looks as though it were the original photo.

Google has also expanded multisearch language support. Currently, when you take a picture and ask a question, the answer is given in English; over the next few months, multisearch will become available in 70 languages. Meanwhile, the ability to add a “near me” location filter to a visual query is coming to the US later this fall. For example, Google imagines using Lens to identify a food dish and then appending “near me” to find where it’s sold nearby.

You can translate words into your preferred language using Google Lens. Here is how to use it.
At the bottom, tap Translate (you may need to swipe right to find it). Point your camera at the words you don’t understand, then tap Capture photo to translate them.