Google have shown off a number of new features due to come to their Android operating system and the company's expansive range of apps at their annual I/O developer conference.
Speaking at the opening keynote, CEO Sundar Pichai emphasized the need to push the boundaries of AI to solve real-world problems.
Writing on the official Google Blog, Pichai says that "I/O gives us a great chance to share some of Google’s latest innovations and show how they’re helping us solve problems for our users. We’re at an important inflection point in computing, and it’s exciting to be driving technology forward. It’s clear that technology can be a positive force and improve the quality of life for billions of people around the world. But it’s equally clear that we can’t just be wide-eyed about what we create."
"There are very real and important questions being raised about the impact of technology and the role it will play in our lives. We know the path ahead needs to be navigated carefully and deliberately—and we feel a deep sense of responsibility to get this right. It’s in that spirit that we’re approaching our core mission."
Android P - Slices, Swipe Controls, Wellbeing Dashboard and More
The most significant of these is likely to be the next major update for the company's mobile operating system Android. Currently branded as Android P, the update will bring with it a host of new features and revisions.
In line with the event's broader theme of AI-assisted empowerment, Android P promises to extend battery life and enhance everyday usage through the new Adaptive Battery - which prioritizes battery power for the apps and services you use the most - and Adaptive Brightness - which uses machine learning to learn how you like to set the brightness slider given both your surroundings and habits.
There are also new predictive App Actions and a new feature called Slices, which allows you to surface functions within an application without fully opening it.
"If you search for “Lyft” in Google Search, you can see an interactive Slice that gives you the price and time for a trip to work, and it’s interactive so you can quickly order the ride." In some ways, Slices appears to be a fresh take on the multi-step and inter-application shortcuts that Samsung offer with Bixby.
Another highlight here is an increased focus on user well-being through the new Dashboard. According to Google, this feature "shows you how you’re spending time on your device, tracking time spent in apps, how many times you’ve unlocked your phone, and how many notifications you’ve received. App Timer lets you set time limits on apps, and will nudge you when you’re close to your limit and then gray out the icon to remind you of your goal."
"The new Do Not Disturb mode silences not just the phone calls and notifications, but also all the visual interruptions that pop up on your screen. And to make it even easier to use, we created a new gesture: if you turn your phone over on the table, it automatically enters Do Not Disturb so you can focus on being present."
"Finally, Wind Down will switch on Night Light when it gets dark, and it will turn on Do Not Disturb and fade the screen to grayscale at your chosen bedtime to help you remember to get to sleep at the time you want."
Lastly, Android P will bring with it some major changes to the way that most users navigate Android. While Google didn't have much to share in the way of native notch support, as had previously been rumored, they did announce a new swipe-based control scheme. In concept, there are definite similarities to the gesture controls of Apple's iPhone X to be found here. However, even at first glance, the overall execution has more than enough differences to stand out.
These navigation changes come accompanied by a new Smart Text Selection feature that Google promise will be a boon for multitasking users by allowing them to quickly and easily copy and paste from one app to another.
Although Android P won't launch in full until later in the year, Google have released an early version of the software for public testing. However, this beta test is only available to the owners of specific Android devices. At the moment, the Android P beta is only compatible with the Google Pixel 2 and Pixel 2 XL plus a number of other flagships including the Sony Xperia XZ2, Xiaomi Mi Mix 2S, Nokia 7 Plus, Oppo R15 Pro, Vivo X21, OnePlus 6, and Essential PH‑1.
Google Maps - AR Augmentations, Group Planning, Match Scores and Personal Recommendations
Most of the changes coming to Google Maps in 2018 concern the Explore tab in the Maps app.
Google say the updated section will now recommend dining, event, and activity options based on the area you’re looking at. These recommendations themselves will be based on a combination of prominent "tastemakers", local recommendations, Google's own algorithms and input from trusted foodie publishers.
In addition, when you tap on any food or drink venue in the app, it will display your “match” - a number that Google says "suggests how likely you are to enjoy a place and reasons explaining why. We use machine learning to generate this number, based on a few factors: what we know about a business, the food and drink preferences you’ve selected in Google Maps, places you’ve been to, and whether you’ve rated a restaurant or added it to a list."
The Maps app now also supports group planning. A long press on a place you’re interested in visiting will add it to a shareable shortlist that your friends and family can add more places to and vote on. Once you’ve made a decision together, Google Maps will then help you quickly book a reservation and even find a ride through selected ride-sharing services.
Google also demonstrated several new in-development integrations between the AI and AR-powered Google Lens and the Maps app, which allowed users to identify locations in real time using their phone's camera and overlay navigation directions onto the live camera view.
Google Assistant - Duplex, Multiple Actions, Routines, New Voices
Google's Assistant is set to receive a suite of major software improvements in the coming months.
The first and most surface-level of these comes in the form of new voices. As of today, six new voices are available for the Assistant. What's more, Google say that a John Legend voice is due later in the year.
Interestingly, Google revealed that, thanks to "advancements in AI and WaveNet technology from DeepMind, we can now create new voices in just a few weeks and are able to capture subtleties like pitch, pace, and all the pauses that convey meaning, so that voices are natural-sounding and unique." Expect additional voices to be released more frequently in the future.
Google are also making the Assistant more responsive via a new Continued Conversations feature, due later in the year. Once enabled, the Assistant should be better at understanding "when you’re talking to it versus someone else, and will respond accordingly." This feature will also put an end to the need to say "OK, Google" for each consecutive command after the first.
Multiple Actions, which Google say is already starting to roll out, will also allow the Google Assistant to understand more complex questions like “What’s the weather like in New York and in Austin?” and let users string together multiple requests for the Assistant into a single phrase.
Google are also rolling out new Custom Routines, which allow you to create your own Routine with any of the Google Assistant’s one million Actions, and start your routine with a phrase of your choice. Later down the track, Google say you’ll even be able to schedule Routines for a specific day or time either using the Assistant app or through the Google Clock app for Android.
Google are promising that the Assistant will be able to play a more active role in saving you time on everyday tasks via a new technology called Google Duplex. Due to arrive later in the year, Duplex allows the Google Assistant to make restaurant reservations, schedule hair salon appointments, and get holiday hours.
"With Google Duplex, businesses can operate as they always have; and people can easily book with them through the Google Assistant. We’re just getting started, but we’re excited for how Google Duplex can connect small businesses with Google Assistant users."
Google Photos - Active Lens, Colorization, Partner Program
In line with the broader improvements Google are making to the AI-powered Google Lens, Google Photos will now offer active one-tap improvements when you open up an image.
"Today, you’ll start to see a range of suggested actions show up on your photos right as you’re viewing them, such as the option to brighten, share, rotate or archive a picture. These suggested actions are powered by machine learning, which means you only see them on relevant photos. You can easily tap the suggestions to complete the action."
Google Photos users will also now begin to encounter Assistant-powered photos that play with color in more interesting ways. Google say they use AI to detect the subject of your photo, keeping it in color while setting the background to black and white - offering up a stark contrast.
According to Google, this is only the beginning, with the goal being for Google Photos to eventually offer full re-colorization of black and white images.
Google also say they are looking at integrating Google Photos into new digital photo frame hardware through a new Google Photos partner program "that gives developers the tools to support Google Photos in their products, so people can choose to access their photos whenever they need them."
The company says apps and devices that work with Google Photos will become available in the coming months.
Google Lens - Style Matching, Smart Text Selection, More Supported Devices
Google have announced that Lens will now be available directly in the camera app on supported devices from LGE, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ, Asus in addition to the Google Pixel.
There are also three new capabilities here that the company are using I/O 2018 to highlight. The first of these is improved Smart Text Selection, which will make it easier to copy and paste text captured in an image. Since the goal here is to actually make AI useful, Google say that Lens will go the extra step and try to provide additional context for what you've selected.
"Say you’re at a restaurant and see the name of a dish you don’t recognize—Lens will show you a picture to give you a better idea."
The second improvement here is Style Matching. Google say that "if an outfit or home decor item catch your eye, you can open Lens and not only get info on that specific item—like reviews—but see things in a similar style that fit the look you like." Like similar features in both Bixby and Oppo's new R15, there will also be a shopping component here - allowing you to buy those products right then and there.
Last but not least, Google say that Lens will now play a more active role within your phone's camera. Working in real time, it'll be "able to proactively surface information instantly—and anchor it to the things you see."