Apple Adds Notable New Features to Dictation, SharePlay, and Live Text in iOS 16

With iOS 16, introduced at WWDC 2022, Apple is adding considerable new features to SharePlay, Dictation, and Live Text.

SharePlay now offers a shared experience with your peers while chatting in Messages, Dictation lets you switch between voice and touch instantly, and Live Text now works in videos as well.


SharePlay

With iOS 16, Apple is making it easier to discover all of the awesome SharePlay experiences from within your FaceTime call. With just a tap on Share My Screen, you can jump into SharePlay-supported apps that you already have on your phone or discover new experiences that you can share with your friends.

Apple is also bringing SharePlay to Messages. When you find something to share, like a movie on Disney+, you can now kick off SharePlay right there and enjoy it together while chatting in Messages. You can watch in sync, and you get the same instantaneous shared playback controls that make SharePlay magical.


Dictation

Dictation lets you type just by speaking and can be much faster than typing with a keyboard, which is why it’s so popular for taking notes, sending messages, and more. In fact, Dictation is used over 18 billion times each month, and it’s designed to protect your privacy by happening entirely on your device.

Apple is introducing an all-new on-device Dictation experience that lets you fluidly move between voice and touch. When you start dictating, the keyboard stays open so you can switch between using voice and touch. You can even select text with touch and replace it just by speaking.

For developers, there is a new Swift API called App Intents. Shortcuts built with it work with zero setup, so you can use Siri to get things done in supported apps.
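As a rough illustration of how App Intents works, a developer defines an intent type and Siri and Shortcuts can run it with no setup. This is a minimal sketch; the intent name "StartWorkoutIntent" and its title string are hypothetical examples, not from Apple's documentation.

```swift
import AppIntents

// Hypothetical intent exposing an in-app action to Siri and Shortcuts.
struct StartWorkoutIntent: AppIntent {
    // The title Shortcuts displays for this action.
    static var title: LocalizedStringResource = "Start Workout"

    // Runs when the user invokes the shortcut — no setup step required.
    func perform() async throws -> some IntentResult {
        // App-specific work would go here.
        return .result()
    }
}
```

Because the intent is declared in code, it appears in the Shortcuts app automatically once the app is installed.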

Live Text

iOS 16 brings Live Text to video. You can now pause a video on any frame and interact with its text, convert currency with just a tap, or translate a foreign language using quick actions in Live Text.

For developers, there is a Live Text API for grabbing text straight from photos. And with Visual Look Up, when you touch and hold the subject of an image, you can lift it away from the background and place it in apps like Messages; it works across iOS, iPadOS, and macOS.
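The Live Text API is exposed through VisionKit's `ImageAnalyzer` in iOS 16. The sketch below shows one way to pull recognized text out of an image; the function name and error handling here are illustrative, not a definitive implementation.

```swift
import UIKit
import VisionKit

// Sketch: extract recognized text from a UIImage using the iOS 16 ImageAnalyzer.
func recognizedText(in image: UIImage) async throws -> String {
    let analyzer = ImageAnalyzer()
    // Ask the analyzer to look for text only.
    let configuration = ImageAnalyzer.Configuration([.text])
    let analysis = try await analyzer.analyze(image, configuration: configuration)
    // The transcript contains the text found in the image.
    return analysis.transcript
}
```

The same `ImageAnalysis` result can also be handed to an `ImageAnalysisInteraction` to get the selectable, tappable Live Text overlay users see in Photos.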

You can read more about the newly released iOS 16 features on the company’s official website.
