Five iOS 16 features that Android phones can already do

Every year when Google and Apple roll out their latest OS updates, there are features that clearly take inspiration from the other side. Whether it’s a new customization option, a new design, or an accessibility tool, one company usually got there first. Here are the five biggest iOS 16 features that Google rolled out first and that Android phones can already do.

Smart lock screen

I’ll be the first to admit the new iOS 16 lock screen looks amazing. The way Apple uses the depth effect to add realism to photos and lets them overlap the clock is genuinely impressive. It’s great to have Apple Watch-style complications for things like health, activity, battery, and weather. You can also change the lock screen font and the clock/complication color to match either the wallpaper or a color of your choice. Everything is in your hands.

Google did a lot of this first. The At a Glance widget intelligently surfaces similar information by predicting what you need. It always shows the weather and date, and it smartly adds things like upcoming events, tropical storm warnings, or your boarding pass before a flight. In that sense it’s more powerful than what Apple offers, though you can’t always manually pick what appears. The clock color can also change: it’s pulled from the Material You color palette that matches your wallpaper. You get four color palette options on Android 12 and up to 12 options on Android 13.

A much smaller feature Apple added is Live Activities, which lets apps pin a widget to the bottom of the lock screen with live information like sports scores or how far away your Uber is. This is essentially what Android’s persistent, updating notifications have offered app developers for years.

The new iOS 16 lock screen is great for iOS users; it looks and works well, but it’s also something Android users have had for years. iOS users are lucky to be getting it now, though it’s safe to say Apple took heavy inspiration from Google.

Automatic sharing in Photos

iCloud Shared Photo Library

In iOS 16, the Photos app can now automatically share photos of your family to a shared library that everyone has access to. You can choose to share all photos taken after a certain date or all photos that include those people. There’s even a button in the Camera app that sends new shots straight into the shared library. Everyone in it gets equal access to add, edit, and delete photos, and everything is shared with everyone.

Google Photos has been doing this for at least two years. With partner sharing, Google Photos’ equivalent feature, you can automatically share photos of a specific person. It offers the same capabilities as Apple’s, except it isn’t limited to Apple devices. Because Google Photos is web-based, you can upload and share photos from a DSLR using any computer.

Set up Google Photos partner sharing

In addition, Google also offers automatic albums that you can share. These automatically add any photos you take of a specific person or pet to an album that can be shared with a link or directly from the app. You can even enable collaboration so others can add their own photos. A whole group of friends can set it up so that everyone’s photos are automatically added to the album and everyone has access to them.

Google’s feature has been around longer and is still a bit more powerful than Apple’s. Luckily for iOS users, you can simply download the Google Photos app on your iPhone to get these features without waiting for iOS 16.

Smarter dictation with punctuation and user interaction

With iOS 16, you can now edit and interact with text as you dictate it. You can tap to select and remove things, or simply tell the phone what you want changed, and it will do it. Dictation also now adds punctuation automatically.

These dictation features are almost a direct clone of Assistant voice typing on the Pixel 6 and 6 Pro. It offers the same sort of interaction with text as you dictate, voice control over what you’ve already typed, and automatic punctuation.

Based on my use of both iOS 16 dictation and Assistant voice typing, Google still has a big lead with this feature. iOS 16 likes to put punctuation where it shouldn’t and still struggles to understand me correctly. However, this is the first beta of iOS 16, so the feature will likely improve.

Multiple stops in Maps

Apple Maps now supports adding up to 15 stops along a route. This seemingly simple feature has been present in Google Maps for years at this point. The only real difference is that Apple Maps supports up to 15 stops while Google Maps tops out at 10. In the meantime, if you want multiple stops on iOS, you can always download the Google Maps app on your iPhone.

Live Captions

Live Caption launched at Google I/O 2019, using Google’s speech recognition technology to provide captions for on-device content that didn’t already have them. It works in real time and generates captions for any audio except phone calls. In March of this year, Google announced the same capability for phone calls.

iOS 16 brings exactly this feature. It captions audio in real time in any app, including phone calls and FaceTime. The user interface even looks identical. However, after a quick test, it seems a bit slower and less accurate than Google’s version.
