Every year, when Google and Apple roll out their latest operating system updates, some features clearly inspire each other. Whether it's customization, design, or accessibility, someone always does it first. Here are five iOS 16 features that Android phones have been able to do for a while.
Smarter lock screen
To be fair, the new lock screen in iOS 16 looks great. Apple uses a depth effect to add realism to photos, and the way the subject overlaps the clock is novel. Widgets bring Apple Watch-style complications for health, stocks, battery, and weather. You can change the clock's font and color to match the wallpaper or pick a color of your own. Everything is up to you.
Google got there first, though. Its At a Glance widget intelligently surfaces similar information by predicting what you need. It always shows the weather and date, but the smart part is the contextual information it adds, such as upcoming events, storm warnings, or a boarding pass before a flight.
These are arguably more powerful than what Apple offers; they just can't be selected manually. The clock color can also change: it is drawn from the Material You palette generated from your wallpaper, with four palette options on Android 12 and up to 12 on Android 13.
Another addition is Live Activities, which let apps pin a widget to the bottom of the lock screen with live information such as sports scores or the distance to your Uber. That's essentially what Android's persistent, updating notifications have offered developers for years.
The new iOS 16 lock screen is a genuine improvement for iPhone users. It looks great and works well, but it's also something Android users have been experiencing for years.
Automatic sharing in photos
In iOS 16, the Photos app can now automatically share family photos to a shared library that everyone can access. You can include all photos taken after a certain date, or only photos that contain the people you choose. There's even a button in the Camera app to send shots straight to the shared library. Everyone in it can add, edit, and delete photos equally; everything is shared with everyone in the album.
Google Photos has been doing this for at least two years. Its equivalent, partner sharing, automatically shares photos that include a chosen person. It offers much the same functionality as Apple's version but is not limited to Apple products: because Google Photos is web-based, you can upload and share photos from your DSLR from any computer.
In addition, Google has automatic photo albums that can be shared. This will automatically add any photos you’ve taken of a person or pet to an album that can be shared via a link or directly through the app. You can even enable collaboration so others can add their photos to it too. A whole group of friends can set it up to automatically add every photo of each other to an album, and it’s accessible to all.
Google's features have been around for a while, yet they remain more powerful than Apple's new ones. Fortunately, iOS users can access them today by simply downloading the Google Photos app on their iPhone, rather than waiting for iOS 16.
Smarter Dictation: Punctuation and User Interaction
In iOS 16, Dictation now lets you edit and interact with text while you're dictating. You can tap to select or delete content, then tell the phone what you want to do, and it will do it. Dictation also now inserts punctuation automatically.
These dictation features are almost direct clones of Assistant voice typing on the Google Pixel 6 and 6 Pro, which offers the same kind of interaction with the text you dictate: voice commands for editing what you've already typed, automatic punctuation, and more.
In side-by-side use of iOS 16 dictation and Assistant voice typing, Google still leads with this feature. iOS 16 likes to put punctuation where it shouldn't and still struggles to understand speech accurately. Of course, this is only the first iOS 16 beta, and the feature should improve over time.
Multiple stops in Maps
Apple Maps now supports adding up to 15 stops along a route. This deceptively simple feature has existed in Google Maps for years. The only real difference is that Apple Maps supports up to 15 stops, while Google Maps tops out at 10.
Live captions
Introduced at Google I/O 2019, Live Caption uses Google's speech recognition technology to caption content on the phone that doesn't have closed captions. It works in real time on any audio, except incoming and outgoing calls. Google announced a further expansion of the feature in March of this year.
iOS 16 brings much the same functionality. It supports real-time captioning of audio in any app, including calls and FaceTime, and the UI even looks nearly identical. In a quick test, though, it appears to be slower and less accurate than Google's alternative.