See six new features that Google added to Search


Google has unveiled six new Search features and innovations designed to help users find and explore content in new ways.

This was disclosed in a statement released on Thursday in Lagos by Mr. Taiwo Kola-Ogunlade, Communications and PR Manager, West Africa.

According to Kola-Ogunlade, the new machine-learning-powered capabilities would enable people to access information in new ways.

The six features he highlighted were multisearch, "multisearch near me", translation in a flash, updates to the Google app for iOS, faster ways to find what you're looking for, and new ways to explore information.

People use Lens to answer more than eight billion queries each month, according to Kola-Ogunlade.

He pointed out that earlier this year, Google made a significant advancement in information search by introducing multisearch.

With multisearch, you can add text to a picture or screenshot, much as you might naturally point at something and ask a question about it.

"Multisearch is now available globally in English and is rolling out in 70 languages.

"With 'multisearch near me', you can take a screenshot or a picture of an object and then find it nearby.

"So, if you're craving your favourite regional meal, all you have to do is screenshot it, and Google will connect you to eateries serving it in the area," he explained.

He said that one of the most powerful aspects of visual understanding is its ability to break down language barriers through near-instant translation.

With Google translating text in images more than one billion times each month across more than 100 languages, Kola-Ogunlade said, "Google has gone beyond translating text to translating pictures."

He said that, thanks to significant progress in machine learning, Google can now blend translated text into complex images.

Kola-Ogunlade continued, "Google has also improved its machine learning models to achieve all this in just 100 milliseconds, shorter than the blink of an eye."

He said this was achieved using generative adversarial networks (GAN models), the same technology that powers Magic Eraser on Pixel.

To help you find what you're looking for, Google is working to make it possible to ask questions with fewer words, or even none at all.

"If you're one of those people who doesn't know what you're looking for until you see it, Google will help you refine your query," he said.

According to Kola-Ogunlade, "Google is rethinking how it presents search results to better match the ways people explore topics, so they see the most relevant content."

