Google I/O: AI Test Kitchen, LaMDA 2 and other key AI announcements

Google’s I/O keynote last night saw several key announcements across products, from the upcoming Android 13 to a more immersive view in Google Maps and, of course, the new Pixel 6a. But Google also showcased its new machine learning and AI-focused features at the conference, and while their direct impact might not be clear to all users, they play a key role in improving and enhancing the company’s products. Here’s a look at the key AI-related announcements.

LaMDA 2 and AI Test Kitchen

Last year, Google introduced LaMDA (Language Model for Dialogue Applications), its generative language model for open-ended conversation, designed to let the Assistant converse on virtually any topic. At last year’s I/O, for instance, Google showed how a LaMDA-powered Assistant could hold a conversation about which shoes to wear while hiking in the snow. So far, LaMDA has been tested internally with Google employees, which, according to Pichai, has “yielded significant quality improvements, and led to a reduction in inaccurate or offensive responses.”

LaMDA 2 builds on these capabilities, and to help more users explore them, Google announced AI Test Kitchen, an app that will give users access to the model. Pichai stressed that this is still at a very early stage and that some of the responses could be offensive during testing.

AI Test Kitchen will let users explore these AI features and get a sense of what LaMDA is capable of. There are three demos users can try: ‘Imagine It’, ‘List It’ and ‘Talk About It’. All three interfaces are meant to be very simple.

The ‘Imagine It’ demo tests whether the AI can take a creative idea and generate what Pichai called “imaginative and relevant descriptions.” In Google’s example, you ask LaMDA to imagine something like living at the bottom of the ocean, and LaMDA writes out a description of what the experience might be like. It can also generate follow-up questions. Pichai said during his keynote that users can “ask about almost any topic: Saturn’s rings or even being on a planet made of ice cream.”

The second demo, ‘Talk About It’, is designed to show that the language model can stay on topic, which can be a challenge. In the example Google showed on stage, the conversation started with dogs. Pichai stressed that users can take the “conversation anywhere” and will “get a unique response for that too.” When Pichai tried to switch topics to cricket, the model steered the conversation back to dogs.

Finally, the third demo, ‘List It’, asks the model to break a task down into a list of items. For instance, if a user wants to start gardening, the model can list the things they will need and the steps they should take.

But keep in mind that this is all a work in progress. Pichai stressed throughout the keynote that there is still a chance the model could generate “inaccurate, inappropriate, or offensive responses.” That is also why Google is inviting feedback in the app, so people can help report problems. The app will be rolled out gradually to small groups of people, and there is no clear date for when that will happen.

PaLM, or Pathways Language Model

PaLM is Google’s new model for natural language processing and AI, and the company says it is its largest model to date, with 540 billion parameters. It works with a new technique called chain-of-thought prompting, in which a multi-step problem is described to the model as a series of intermediate reasoning steps. With it, the model is able to answer math word problems or explain a joke.
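Chain-of-thought prompting is, at its core, a way of writing the prompt so the model first sees a worked example that spells out its reasoning before being given a new question. Below is a minimal, hypothetical sketch in Python of how such a prompt might be assembled; the exemplar question and helper names are illustrative only, not Google’s actual training data or any PaLM API, and the finished prompt would still need to be sent to a large language model.

```python
# Minimal sketch of chain-of-thought prompting (illustrative only).
# The few-shot exemplar shows the model *how* an answer is reasoned out,
# so it is encouraged to produce intermediate steps for the new question too.

FEW_SHOT_EXEMPLAR = (
    "Q: Roger has 5 tennis balls. He buys 2 more cans of tennis balls. "
    "Each can has 3 tennis balls. How many tennis balls does he have now?\n"
    "A: Roger started with 5 balls. 2 cans of 3 tennis balls each is 6 balls. "
    "5 + 6 = 11. The answer is 11.\n\n"
)

def build_chain_of_thought_prompt(question: str) -> str:
    """Prepend a worked example, including its reasoning, to a new question."""
    return FEW_SHOT_EXEMPLAR + f"Q: {question}\nA:"

prompt = build_chain_of_thought_prompt(
    "A cafeteria had 23 apples. They used 20 to make lunch and bought 6 more. "
    "How many apples do they have?"
)
print(prompt)

# A model prompted this way is expected to reply with its intermediate steps
# before the final answer, e.g. "The cafeteria had 23 apples. They used 20,
# leaving 3. They bought 6 more, so 3 + 6 = 9. The answer is 9."
```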

During training, the AI model is given not just a question-answer pair but also an explanation of how the answer was derived, which Google says “increases accuracy by a large margin.” One example shown with PaLM had the model answering questions in both Bengali and English. For instance, Pichai asked the model about popular pizza toppings in New York City, and the answer appeared in Bengali. What is important to note is that “PaLM has never seen parallel sentences between Bengali and English,” according to Google, nor has it been explicitly taught to translate. Google hopes to extend these capabilities and techniques to more languages and other complex tasks.

World’s largest publicly available machine learning hub

Google also announced what it calls the “world’s largest, publicly-available machine learning hub for our Google Cloud customers” at its data centre in Mayes County, Oklahoma. It also stressed that this machine learning hub is already operating at 90 per cent carbon-free energy.
