Five iOS 16 features Android phones can already do

Every year when Google and Apple release their latest OS updates, there are always features that seem inspired by the other side. Whether it's a new customization option, a design change, or broader availability, someone always did it first. Here are the five biggest iOS 16 features that Google did first and that Android phones can do right now.

Smarter lock screen

I'll be the first to admit that the new lock screen in iOS 16 looks fantastic. The way Apple uses the depth effect to add realism to images and let them overlap the clock is ingenious. Having Apple Watch-style complications for health, storage, battery, and weather is great. You can change the clock font and the clock/complication color to either match the background or any color you want. Everything is up to you.

However, Google did a lot of this first. Google has the At a Glance widget, which surfaces similar information intelligently by predicting what you need. It always shows the weather and the date, and it adds other information contextually, such as upcoming events, tropical storm warnings, or a boarding pass before a flight. It's more powerful than what Apple offers; you just can't always manually choose what it shows. The clock color can also be changed: it's pulled from the Material You color palette, which is generated from your wallpaper. You get four color palette options on Android 12 and up to 12 on Android 13.

A much smaller feature Apple added is Live Activities, which lets apps place a widget at the bottom of the lock screen with live information such as sports scores or how far away your Uber is. This is basically Android's ongoing notifications, which app developers have been able to use for years.
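For context, here's a minimal Kotlin sketch of the kind of ongoing, silently updating notification Android apps have long used to surface live information like sports scores. The channel ID, game title, and score text are hypothetical, and it assumes the app already has notification permission.

```kotlin
import android.app.NotificationChannel
import android.app.NotificationManager
import android.content.Context
import androidx.core.app.NotificationCompat
import androidx.core.app.NotificationManagerCompat

fun showLiveScoreNotification(context: Context, score: String) {
    val channelId = "live_scores" // hypothetical channel ID

    // Channels are required on Android 8.0+; re-creating an existing one is a no-op.
    context.getSystemService(NotificationManager::class.java).createNotificationChannel(
        NotificationChannel(channelId, "Live scores", NotificationManager.IMPORTANCE_LOW)
    )

    val notification = NotificationCompat.Builder(context, channelId)
        .setSmallIcon(android.R.drawable.ic_menu_info_details)
        .setContentTitle("Warriors vs. Celtics") // hypothetical game
        .setContentText(score)
        .setOngoing(true)       // keeps the notification pinned while the game is live
        .setOnlyAlertOnce(true) // score changes update silently
        .build()

    // Re-posting with the same ID updates the notification in place
    // (assumes POST_NOTIFICATIONS permission is granted on Android 13+).
    NotificationManagerCompat.from(context).notify(1, notification)
}
```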

The new iOS 16 lock screen is great for iOS users: it looks good and works well. But it's also something Android users have experienced for years. iOS users are lucky to get it now, though it's safe to say Apple was heavily inspired by Google.

Automatic photo sharing

iOS iCloud Photo Library

In iOS 16, the Photos app can now automatically share photos of your family to a shared library you all have access to. You can choose to include every photo taken after a certain date or only photos with those people in them. There is even a button in the Camera app that adds new shots to the shared library automatically. Everyone in it has equal access to add, edit, and delete photos, and everything is shared with everyone.

Google Photos has been doing this for at least two years. Partner Sharing in Google Photos automatically shares photos that include a specific person with your partner. It has all the same features as Apple's version, except it isn't limited to Apple devices. And since Google Photos is web-based, you can upload photos from a DSLR on any computer and they get shared as well.

Google Photo Partner Sharing Setup

In addition, Google Photos has automatic albums you can share. These collect every photo you take of a particular person or pet into an album that can be shared with a link or directly through the app. You can even enable collaboration so others can add their own photos. A whole group of friends can set it up so that every photo of one another is added automatically, and everyone has access to the album.

Google's feature has been around longer and is still a bit more powerful than Apple's. Fortunately for iOS users, you can simply download the Google Photos app on iPhone to get these features today instead of waiting for iOS 16.

Smarter dictation with punctuation and user interaction

In iOS 16, dictation lets you edit and interact with text as you dictate it. You can tap to select and delete things, or just tell the phone what you want it to do. It also fills in punctuation automatically.

These dictation features are an almost direct clone of Assistant Voice Typing on the Pixel 6 and 6 Pro, which offers the same kind of interaction with text as you type, voice control over what you've already typed, and automatic punctuation.

From my time with both iOS 16 and Assistant Voice Typing, Google still has a big lead here. iOS 16 likes to put punctuation where it shouldn't and still struggles to understand me correctly. However, this is the first iOS 16 beta, so the feature will likely improve.

More stops in Maps

Apple Maps now supports adding up to 15 stops along a route. This seemingly simple feature has been in Google Maps for years at this point. The only real difference is that Apple Maps supports up to 15 stops while Google Maps tops out at 10. If you want multi-stop routes on iOS right now, you can always download the Google Maps app on iPhone.

Live captions

Live Caption was introduced at Google I/O 2019, using Google's speech recognition to generate real-time captions for on-device content that doesn't already have them. It works for all audio except phone calls, and in March of this year Google announced support for phone calls as well.

iOS 16 has essentially the same feature. It captions audio in real time across all apps, including phone calls and FaceTime, and the user interface even looks identical. After a quick test, however, it seems quite a bit slower than Google's version and not as accurate.
