Accessibility features are designed around a specific disability, but they often improve the experience for all users. The new Live Captions feature, currently in beta testing on certain devices and in certain countries, is a prime example. Apple aims to turn all audio produced by your device into accurately transcribed, legible text, much as Live Text can extract text from bitmapped images.
To enable the feature, you must have an iPhone 11 or later with iOS 16 installed, a relatively recent iPad running iPadOS 16 (see this list), or an Apple silicon (M1 or M2) Mac with macOS Ventura installed. For iPhones and iPads, Apple says Live Captions works only when the device language is set to English (U.S.) or English (Canada). The macOS description says more broadly that the beta is "not available in all languages, countries, or regions."
If you want to use Live Captions (or want to check whether you can), go to Settings (iOS/iPadOS) or System Settings (Ventura) > Accessibility. If you see a Live Captions (Beta) item, you can use it. Tap or click Live Captions to enable it. You can then tap Appearance in iOS/iPadOS, or use the top-level menu items in macOS, to modify how captions appear. You can separately enable or disable Live Captions in FaceTime to have captions appear in that app.
Live Captions appears as an overlay that shows its interpretation, in English, of any sound produced by your device. A live audio waveform tracks the sound Live Captions "hears." In iOS and iPadOS, you can tap the overlay to access additional controls: minimize, pause, mic, and full screen; in macOS, only pause and the mic button are available. If you tap or click the mic button, you can speak and have what you say appear onscreen. This can be handy if you're trying to show someone the text of what you're saying.
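Apple hasn't published a developer API for Live Captions itself, but the same kind of live speech-to-text shown by the mic button can be sketched with the public Speech framework. The snippet below is an illustrative sketch, not Apple's implementation: it streams microphone audio into `SFSpeechRecognizer` and reports partial transcriptions as they form, preferring on-device recognition where supported.

```swift
import Speech
import AVFoundation

// Illustrative sketch of live on-device transcription using the public
// Speech framework (this is NOT the Live Captions implementation).
final class LiveTranscriber {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_US"))
    private let audioEngine = AVAudioEngine()
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func start(onText: @escaping (String) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.shouldReportPartialResults = true      // stream captions as they form
        if recognizer?.supportsOnDeviceRecognition == true {
            request.requiresOnDeviceRecognition = true // keep audio on the device
        }
        self.request = request

        // Feed microphone buffers into the recognition request.
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Each partial result replaces the previous caption text.
        task = recognizer?.recognitionTask(with: request) { result, _ in
            if let result { onText(result.bestTranscription.formattedString) }
        }
    }

    func stop() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
        request?.endAudio()
        task?.cancel()
    }
}
```

A real app would also call `SFSpeechRecognizer.requestAuthorization` and request microphone permission before starting; those steps are omitted here for brevity.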

The text produced in Live Captions is ephemeral: you can't copy or paste it. It's also resistant to screen captures: the overlay is apparently generated in such a way that iOS and iPadOS can't capture it.
Live Captions shows a lot of promise, and it's something to keep an eye on as it improves and expands. I tested Live Captions with podcasts, YouTube, and Instagram audio. It wasn't as good as some AI-based transcription I've seen, such as in videoconferencing apps, but it made a valiant effort, and it was far better than having no captions at all.

Apple could eventually tie Live Captions into its built-in translation feature. You might then be able to speak in your own language and show a translated version to someone in theirs, or get live transcriptions of video streams, podcasts, and other audio in a language other than one you speak.
This Mac 911 article is in response to a question submitted by Macworld reader Kevin.
Ask Mac 911
We've compiled a list of the questions we get asked most frequently, along with answers and links to columns: read our super FAQ to see if your question is covered. If not, we're always looking for new problems to solve! Email yours to [email protected], including screen captures as appropriate and whether you want your full name used. Not every question will be answered, we don't reply to email, and we can't provide direct troubleshooting advice.
Source: www.macworld.com