Tag Archives: AVFoundation

How to use a specific voice for text-to-speech in iOS

There are two ways of creating voices with which we can make iOS talk: creating a voice using a locale (classic), or creating a voice using a specific voice identifier (futuristic).

Let me show you both options here.

Classic and Easy

In this snippet we’re creating a voice with a certain dialect, British English in this case:
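Something along these lines, where the phrase to speak is just a placeholder:

```objc
// create an utterance and give it a British English voice
AVSpeechUtterance *utterance = [[AVSpeechUtterance alloc] initWithString:@"Hello, old chap!"];
utterance.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-GB"];

// hand it to a synthesizer (keep a strong reference in a real app)
AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init];
[synthesizer speakUtterance:utterance];
```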

While very straightforward, we don’t know if the voice is going to be male or female. All we can specify is the language and dialect.

Futuristic and Specific

The above works fine and was probably enough when the speech synthesiser framework was introduced in iOS 7, but since then a myriad of other voices have become available for us to use in our applications. To specify one of them, we need a voice identifier.

Here’s how to do it:
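The identifier below (Daniel, one of the British English voices) is only an example; pick one from the list described in the next section:

```objc
// create an utterance as before
AVSpeechUtterance *utterance = [[AVSpeechUtterance alloc] initWithString:@"I have a very specific voice."];

// but pick the voice by its identifier rather than by language
utterance.voice = [AVSpeechSynthesisVoice voiceWithIdentifier:@"com.apple.ttsbundle.Daniel-compact"];

AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init];
[synthesizer speakUtterance:utterance];
```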

The setup is almost the same, but instead of the voiceWithLanguage method, we’re using the voiceWithIdentifier method here.

Finding Voices

To see a list of all available voices on the device, we can access the speechVoices method of the AVSpeechSynthesisVoice class. This will return an array of AVSpeechSynthesisVoice objects, each of which has a name, quality and identifier property. The latter is what we’re looking for so we can create a specific voice.

Here’s a method that lists all available voices on the current device:
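Something like this should do the trick (the method name is arbitrary):

```objc
- (void)listAvailableVoices {
    // speechVoices returns every voice installed on this device
    NSArray<AVSpeechSynthesisVoice *> *voices = [AVSpeechSynthesisVoice speechVoices];
    for (AVSpeechSynthesisVoice *voice in voices) {
        NSLog(@"Name: %@ | Quality: %ld | Identifier: %@",
              voice.name, (long)voice.quality, voice.identifier);
    }
}
```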

Not all voices may be installed on all devices. For example, Alex is an optional high-quality voice that the user needs to download first before it will show up in this array.

The quality property returns either 1 for standard/low-res or 2 for enhanced/hi-res voices. Again it is up to the user to enable the hi-res version of a voice under Settings.

How to play videos in iOS 9

Until iOS 8 we could use the trusty old MPMoviePlayerViewController class to play videos on our devices, but that’s been deprecated in iOS 9. From now on, Apple recommend we use the AVPlayerViewController class instead. It has many advantages and even supports picture-in-picture out of the box.

Although AVPlayerViewController is a subclass of UIViewController, instances of it cannot be presented using presentViewController – but Apple make sure not to mention this little tidbit. It’s much more “fun” to figure this out on our own.

Here’s how we can use it with a local video from the main bundle:
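A sketch of the whole thing, assuming a video called clip.mp4 in the main bundle:

```objc
#import <AVKit/AVKit.h>               // AVPlayerViewController lives in AVKit
#import <AVFoundation/AVFoundation.h>

- (void)playLocalVideo {
    // grab a URL to a video in the main bundle
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"clip" withExtension:@"mp4"];

    // create a player and hand it to an AVPlayerViewController
    AVPlayer *player = [AVPlayer playerWithURL:url];
    AVPlayerViewController *playerController = [[AVPlayerViewController alloc] init];
    playerController.player = player;

    // optional: start playback straight away
    [player play];

    // present the controller by adding its view as a subview of our own
    [self addChildViewController:playerController];
    playerController.view.frame = self.view.bounds;
    [self.view addSubview:playerController.view];
    [playerController didMoveToParentViewController:self];
}
```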

First we grab a URL to either a local or remote video. Next we create an AVPlayer with this URL, and add said player to the newly created AVPlayerViewController instance. You can auto-play a video by using the player’s play method, or remove it and leave it up to the user to start the video.

Next we’ll present the controller by adding its view as a subview of our current view, making sure it has the same frame size.

The class works equally well with local and remote URLs. To play a remote asset, construct the URL like this:
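For example (the address below is only a placeholder):

```objc
// a remote asset just needs an http(s) URL instead of a bundle URL
NSURL *url = [NSURL URLWithString:@"https://www.example.com/videos/clip.mp4"];
```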

How to extract a UIImage from a video in iOS 9

Here are two ways to extract the first frame of a video and turn it into a UIImage. Both methods are from the generous Stack Overflow community.

Both methods make use of the AV Foundation framework. There’s no need to import it; UIKit will take care of that for us.

Video URLs can be obtained in many ways, one of which is from something like an image picker. You can grab the URL in its didFinishPickingMediaWithInfo delegate method:
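Roughly like this, assuming the picker was set up to pick movies:

```objc
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // the URL of the picked movie is in the info dictionary
    NSURL *videoURL = info[UIImagePickerControllerMediaURL];
    [picker dismissViewControllerAnimated:YES completion:nil];

    // hand videoURL on to the extraction code below
}
```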

Now that we have a video URL, let’s see how we can extract a UIImage from it.
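One common approach uses AVAssetImageGenerator; a sketch of it (the method name is made up) could look like this:

```objc
#import <AVFoundation/AVFoundation.h>

- (UIImage *)firstFrameFromVideoAtURL:(NSURL *)videoURL {
    // create an asset and an image generator for it
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES; // respect the video's orientation

    // grab the frame at time zero
    NSError *error = nil;
    CGImageRef cgImage = [generator copyCGImageAtTime:CMTimeMake(0, 600) actualTime:NULL error:&error];
    if (!cgImage) {
        NSLog(@"Could not extract frame: %@", error);
        return nil;
    }

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}
```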


How to play audio in iOS

We can play back an audio file we’ve previously recorded or provided in the main bundle, again thanks to the AV Foundation Framework. Much like recording, it’s not as straightforward as “hitting play” somewhere. Here are the steps involved:

  • create an AVAudioSession
  • create an AVAudioPlayer
  • prepare for and start playing

For the method below to work we need to import and link the AVFoundation Framework to our project. We also need a property that holds the player object.
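Here’s a sketch of such a method, assuming an audioPlayer property and a file called sound.caf in the main bundle:

```objc
// assumes: @property (strong, nonatomic) AVAudioPlayer *audioPlayer;

- (IBAction)playAudio:(id)sender {
    // set up the audio session for playback
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:nil];
    [session setActive:YES error:nil];

    // grab a file from the main bundle
    NSURL *audioURL = [[NSBundle mainBundle] URLForResource:@"sound" withExtension:@"caf"];

    // create the player, then prepare and start playback
    NSError *error = nil;
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:audioURL error:&error];
    self.audioPlayer.delegate = self;
    [self.audioPlayer prepareToPlay];
    [self.audioPlayer play];
}
```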

Stop playback at any time by calling:
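```objc
// stop the AVAudioPlayer we created above
[self.audioPlayer stop];
```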

Two delegate methods can inform us of the progress here:
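They come from the AVAudioPlayerDelegate protocol:

```objc
// called when the file has finished playing
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    NSLog(@"Playback finished (successfully: %d)", flag);
}

// called if the audio data could not be decoded
- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError *)error {
    NSLog(@"Decode error: %@", error);
}
```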

There are two other delegate methods that can inform us of an interruption to the playback, such as a phone call. If you’re interested in playing an MP3 file from the music library, check out the MPMediaPickerController.

How to record audio from the microphone in iOS

Recording audio is a complex process – for any computer. iOS makes it simple-ish with several classes courtesy of the AV Foundation Framework.

In this example we’ll record audio from the device microphone. This happens in four steps:

  • create an AVAudioSession
  • set up a dictionary with audio recorder settings
  • create an AVAudioRecorder
  • prepare for and start recording

For the method below to work we need to import and link the AVFoundation Framework to our project. We also need a property to hold our AVAudioRecorder object, otherwise it will no longer exist by the time the method reaches the end and nothing will work. Again.

Here’s a method that is called by a “Record Button”:
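A sketch of the four steps, assuming an audioRecorder property (the file name and the settings values are just examples):

```objc
// assumes: @property (strong, nonatomic) AVAudioRecorder *audioRecorder;

- (IBAction)recordAudio:(id)sender {
    // 1. set up the audio session for recording
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
    [session setActive:YES error:nil];

    // 2. a dictionary with the recorder settings
    NSDictionary *settings = @{ AVFormatIDKey: @(kAudioFormatMPEG4AAC),
                                AVSampleRateKey: @44100.0,
                                AVNumberOfChannelsKey: @1 };

    // 3. create the recorder, writing to a file in the Documents directory
    NSURL *documents = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask] lastObject];
    NSURL *fileURL = [documents URLByAppendingPathComponent:@"recording.m4a"];

    NSError *error = nil;
    self.audioRecorder = [[AVAudioRecorder alloc] initWithURL:fileURL settings:settings error:&error];
    self.audioRecorder.delegate = self;

    // 4. prepare for and start recording
    [self.audioRecorder prepareToRecord];
    [self.audioRecorder record];
}
```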


Additional delegate methods are available to check for a “higher power” interruption, such as a phone call.

How to use the speech synthesiser in iOS 7

New to iOS 7’s AVFoundation Framework is the speech synthesiser. It’s what Siri is using when he/she speaks to you. We can use this as well in our apps by passing a string into an AVSpeechUtterance object, which in turn we pass into an AVSpeechSynthesizer instance.

Here’s how:
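The phrase below is just an example, of course:

```objc
// create an utterance from a string...
AVSpeechUtterance *utterance = [AVSpeechUtterance speechUtteranceWithString:@"Hello, I am your iPhone."];

// ...and pass it to a synthesizer to have it spoken
AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init];
[synthesizer speakUtterance:utterance];
```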

To tweak the voice of the synthesiser you can specify the language (defaults to the user locale if not specified), as well as pitchMultiplier and rate values. It’s quite fun to play around with!
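For example (the values here are just something to start experimenting with):

```objc
utterance.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-GB"]; // British English
utterance.pitchMultiplier = 1.2; // between 0.5 and 2.0, default is 1.0
utterance.rate = 0.3;            // between the minimum and maximum speech rates
```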