Google Assistant might have made its debut at Google I/O 2016, but this year's event was its real coming-out party. Over the past several months, Google has been steadily adding features to its AI helper, and now we can finally see a fuller vision for Assistant—one that goes far beyond answering questions about our day.
For one, you can now type your queries rather than speaking them. It’s my most-wanted feature, and it will easily double the amount of time I spend using Assistant.
Now, if I’m in the office or in bed, I won’t have to break the silence with an inopportune “OK Google.” I tried it out at I/O, and typing is just as fast as speaking—and in some cases even faster. You can type fragments like “weather today” or “flight info,” and it’ll bring up the relevant info just as if you had used the natural language engine. And you don’t have to say “OK Google” every time either.
I also played with Assistant's third-party actions, which will now work on your phone just like they do on Google Home. But it’s about more than just tracking your Domino’s order or sorting your Todoist tasks. With a phone screen, developers can tap Assistant to build much richer actions that may all but eliminate the need to open some of your most-used apps.
For example, I played around with the Tender Cocktails action on the Pixel, and it greatly expanded the breadth of Assistant’s knowledge. Once I said, “OK Google, talk to Tender,” it brought up an app layer on top of the Assistant screen that let me interact with the service without needing to open a separate app. I asked for drink recipes using tequila, and it brought up a series of options I could scroll through. Once I found a picture that enticed me, tapping on it not only showed me the ingredients, but also gave me the option to have Assistant read the instructions aloud.
I can see actions on the phone being far more useful than they are on Google Home. The screen adds a whole new element to the process, including mobile payment, which Google demonstrated during its keynote but wasn’t ready to show in person. Indeed, soon Assistant will be able to walk you through a full purchase right on your phone, asking questions as if you were speaking to a live person and showing you exactly what you’ll be buying, including a summary of your bill.
And with Google Lens integration, coming later in the summer, Assistant will be able to look into your photos too. Assistant is at the epicenter of Google’s newfound push into AI and machine learning, and before long it could be our first and last stop for search.