Thursday, February 3, 2011

Create a Basic iPhone Audio Player with AV Foundation Framework


Last week we received an email requesting a tutorial that covers playing sounds on the iPhone. I started digging into the documentation, and it appears the SDK offers two choices: the AV Foundation framework, which makes it very easy, or Audio Queue Services, which is about the most difficult thing in the world. The person who emailed mentioned buffers and such, so it's clear they wanted a tutorial covering Audio Queue Services. I'll be getting to a tutorial on that in the near future, but for now I'm going to create a simple audio player using the AV Foundation framework that should cover most people's needs.

I think my biggest criticism of programming for the iPhone is simply finding documentation for the things I want to do. I actually got pretty far into Audio Queue Services before I even discovered the AV Foundation framework, cursing Apple's name the entire time for making something so simple - playing audio - so difficult to program. But that time's over; having discovered the AV Foundation framework, I'm once again a content iPhone developer.

Below is a screenshot of what we'll be building today. It's simply a small application with two buttons - Play and Stop. Play begins playing an audio file I've embedded into the app as a resource, and Stop stops the playback.

Example Application Screenshot

I guess the first thing you're going to need to do is get some audio to play and add it as a resource. The playback is based around the AVAudioPlayer class, which supports lots of different formats. As far as actually adding it, it's as simple as right-clicking on the Resources folder, selecting Add > Existing Files, and selecting your sound file.

Now we need to bring the AV Foundation framework into our project. For me, this wasn't in the ordinary list of frameworks you see when you right-click on Frameworks and select Add > Existing Frameworks. I added the framework from here:

/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS3.0.sdk/System/
Library/Frameworks/AVFoundation.framework

The next thing we need to do is create the user interface. I added a new UIViewController subclass called AudioPlayer to handle this. I then used Interface Builder to slap a couple of buttons on the screen and hook up the button presses to my code. If you're not familiar with how to create user interfaces using Interface Builder, I would recommend checking out our getting started tutorial. Here's the header file for my completed view controller.

#import <UIKit/UIKit.h>

@class AVAudioPlayer;

@interface AudioPlayer : UIViewController {
    IBOutlet UIButton *playButton;
    IBOutlet UIButton *stopButton;
    AVAudioPlayer *audioPlayer;
}

@property (nonatomic, retain) IBOutlet UIButton *playButton;
@property (nonatomic, retain) IBOutlet UIButton *stopButton;
@property (nonatomic, retain) AVAudioPlayer *audioPlayer;

-(IBAction)play;
-(IBAction)stop;

@end

Here I simply have some properties to hold my buttons, and another to hold the AVAudioPlayer that will actually be playing the sounds. I also stuck in a couple of methods that are invoked when the buttons are pressed.

Most of the work happens in the viewDidLoad method of this view controller. Here are its contents.

- (void)viewDidLoad {
    [super viewDidLoad];

    // Get the file path to the song to play.
    NSString *filePath = [[NSBundle mainBundle] pathForResource:@"TNG_Theme"
                                                         ofType:@"mp3"];

    // Convert the file path to a URL.
    NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:filePath];

    // Initialize the AVAudioPlayer.
    AVAudioPlayer *player = [[AVAudioPlayer alloc]
        initWithContentsOfURL:fileURL error:nil];
    self.audioPlayer = player;
    [player release];

    // Preload the buffers and prepare the audio hardware for playing.
    [self.audioPlayer prepareToPlay];

    // filePath is autoreleased; only fileURL was alloc'd by us.
    [fileURL release];
}

The first thing we do is get a file path for the audio file. The file I'm using is the theme song for Star Trek: The Next Generation, and it's an mp3. I then convert the path to an NSURL object, which is what AVAudioPlayer needs when it's being initialized. Next up is actually initializing the audio player; I pass nil for the error since I don't care about error information for this tutorial. Finally, I call prepareToPlay, which initializes the buffers and prepares the hardware for audio playback, reducing the delay between pressing the play button and actually hearing audio.

The only thing left is to actually start and stop the audio when the buttons are pressed.

-(IBAction)play {
    // Make sure the audio is at the start of the stream.
    self.audioPlayer.currentTime = 0;

    [self.audioPlayer play];
}

-(IBAction)stop {
    [self.audioPlayer stop];
}

These calls should be very straightforward. The only oddity is setting the currentTime property: since stop does not reset the song to the beginning, this call ensures the audio plays from the start any time the play button is pressed.
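One thing the snippets above skip is cleanup. Since the properties are retained and this code predates ARC, a matching dealloc is worth adding; a minimal sketch under manual reference counting:

```objc
// Manual reference counting (pre-ARC): release the retained ivars
// when the view controller goes away.
- (void)dealloc {
    [playButton release];
    [stopButton release];
    [audioPlayer release];
    [super dealloc];
}
```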

And that's it for this tutorial. I've attached my entire Xcode project below. As I mentioned before, I'm still working my way through Audio Queue Services, and I'll be creating a tutorial on that in the near future. If you've got any questions, feel free to ask in the comments.

AVAudio Example programs

http://www.modejong.com/iPhone/

Using the iPhone to Record Audio: a guide to the steps required to get it working in the SDK

Today my experiment is going to be learning how to use the iPhone SDK to record and play back audio.

To begin, all I want to do is have a button that starts and stops the recording, and another button that plays back the recorded sound.

Sounds simple enough. Here is a small screenshot of the finished result.

The classes we are using for this project are AVAudioSession, AVAudioRecorder, and AVAudioPlayer.

These can be located in the AVFoundation framework.

It took some work to figure out exactly how to add the AVFoundation framework to the iPhone project; if you don't do this, you will get linker errors when you try to compile. You will also want to add CoreAudio.framework, since some of the constants are defined in its headers.

From the Xcode interface, select the Frameworks folder, Control-click, choose Add > Existing Frameworks…, and then choose both CoreAudio.framework and AVFoundation.framework.

The AVFoundation framework was not originally available in my selections and it took me some work to actually locate the proper folder where the framework was located. I ended up using the search tool to find the proper location for the framework.

Once you have added the frameworks, you then need to #import the headers into your project.

The two headers you need to #import into your controller's .h file are:

#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudioTypes.h>

This gives you access to the classes and allows the linker to properly find everything to build the binaries.

Also in the class definition for the controller class you will need to be sure that you adopt the AVAudioRecorderDelegate protocol. I also have a few member variables that I use in a couple of places, defined here as well. That looks like this:

@interface record_audio_testViewController : UIViewController <AVAudioRecorderDelegate> {

    //Variables set up for access in the class:
    NSURL *recordedTmpFile;
    AVAudioRecorder *recorder;
    NSError *error;
}
Here is a screen shot of the controller.h file for reference:
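Since the controller adopts the recorder's delegate protocol, it can also implement the optional callbacks. A minimal sketch of the finished-recording callback (the body here is illustrative, not from the original project):

```objc
// Optional AVAudioRecorderDelegate callback, invoked when recording
// stops or finishes on its own (e.g. when recordForDuration: elapses).
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)aRecorder
                           successfully:(BOOL)flag {
    NSLog(@"Recording finished, success: %d", flag);
}
```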

Now that we have the basics set for the class, let's start to work on the meat of getting recording to work with AVAudioRecorder.

In my viewDidLoad you will see where I set up the audio session. You have to do this to let the iPhone OS know that your application is going to handle audio and what kind of session it will be. There is a larger discussion around this, but for this example I am using the Play and Record session type.

// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
- (void)viewDidLoad {
    [super viewDidLoad];

    //Start the toggle in true mode.
    toggle = YES;
    btnPlay.hidden = YES;

    //Instantiate an instance of the AVAudioSession object.
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    //Set up the audioSession for playback and record.
    //We could just use record and then switch it to playback later, but
    //since we are going to do both, let's set it up once.
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    //Activate the session.
    [audioSession setActive:YES error:&error];
}

See the screen shot for reference:

Here we have the AVAudioRecorder in action. When the button is pressed, we create a recorder with settings that tell it what kind of audio file to make.

This is a little squished… I will work on this to make things easier in the future.


//Begin the recording session.
//Error handling removed. Please add to your own code.

//Set up the dictionary object with all the recording settings that this
//recording session will use.
//It's not clear to me which of these are required and which are the bare minimum.
//This is a good resource: http://www.totodotnet.net/tag/avaudiorecorder/
NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
[recordSetting setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
[recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
[recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];

//Now that we have our settings, instantiate the recorder.
//Generate a temp file for use by the recording.
//This approach was one I found online and seems to be a good choice for making
//a tmp file that will not overwrite an existing one.
//I know this is a mess of collapsed things into 1 call. I can break it out if need be.
//Note the retain: the ivar must outlive this method so playback can use it later.
recordedTmpFile = [[NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%.0f.%@", [NSDate timeIntervalSinceReferenceDate] * 1000.0, @"caf"]]] retain];
NSLog(@"Using File called: %@", recordedTmpFile);
//Set up the recorder to use this file and record to it.
recorder = [[AVAudioRecorder alloc] initWithURL:recordedTmpFile settings:recordSetting error:&error];
[recordSetting release];
//Setting the delegate lets us receive callbacks such as
//audioRecorderDidFinishRecording:successfully:.
[recorder setDelegate:self];
//We call this to start the recording process and initialize
//the subsystems so that when we actually say "record" it starts right away.
[recorder prepareToRecord];
//Start the actual recording.
[recorder record];
//There is an optional method for recording for a limited time, see
//[recorder recordForDuration:(NSTimeInterval)10]

We are recording the audio into a temp file saved on the device. Later we will use this same file to play back the audio when the play button is pressed.

This is the example from the play_button_pressed code:

//The play button was pressed…
//Set up the AVAudioPlayer to play the file that we just recorded.
AVAudioPlayer *avPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile error:&error];
[avPlayer prepareToPlay];
[avPlayer play];

Armed with these basics, you should be able to get sound recording and playback working on your iPhone pretty easily.

In this sample I have removed the error handling so it is easier to read, so remember to add back in your error handling when you put this code into action.
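One step the snippets above don't show is stopping the recorder when the record button is toggled off, which finalizes the temp file before playback. A minimal sketch, with the method name and toggle flow assumed rather than taken from the original project:

```objc
// Hypothetical record-button handler: stop the in-progress recording
// and reveal the play button. "toggle", "btnPlay", and "recorder" are
// the members set up earlier in this post.
- (IBAction)record_button_pressed {
    if (toggle) {
        [recorder stop];     // finalizes the file at recordedTmpFile
        btnPlay.hidden = NO; // allow playback of the new recording
        toggle = NO;
    }
}
```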

Questions? Please feel free to send them along.

To download the entire sample project for AVAudioSession with source code

Click Here

Wednesday, February 2, 2011

Map Kit with Annotations

Hello readers, in today's post we will see how to add annotations to the map; in case you forgot some basics of the Map Kit framework, you can always refer to my earlier post.

Design Phase: The design phase is quite simple: we will be showing a pin on our map, just like the ones present in Google Maps. Here's a view of our final output in case you are getting confused.


Step 1: In order to display an annotation we first have to create an NSObject subclass, give it an appropriate name, and then add some code to it. Before even doing that, make sure you have added the Map Kit framework to your project. Coming to the coding section now.

Create a class which inherits from NSObject and save it with the name LocationwithAnnotation.






Import the Map Kit framework and add a protocol to your class called MKAnnotation. Here's how the entire code looks:

#import <Foundation/Foundation.h>

//import the Map Kit header file
#import <MapKit/MapKit.h>

@interface LocationwithAnnotation : NSObject <MKAnnotation> {

}
@end

MKAnnotation is a protocol that is used to provide annotation-related information to the map view.


Step 2: Select the LocationwithAnnotation.m file and add this piece of code; you can always refer to MKAnnotation in the documentation to see its methods.

@implementation LocationwithAnnotation

- (NSString *)title
{
    return @"Current Location";
}

- (NSString *)subtitle
{
    return @"Here i am";
}

- (CLLocationCoordinate2D)coordinate
{
    CLLocationCoordinate2D coord = {19.12, 73.02};
    return coord;
}
@end



Code Explanation: In the documentation for the MKAnnotation protocol there are two methods and one property.

The methods are quite simple: they just return the title and subtitle that will appear once the user taps the annotation.

The property says, in effect, "fine, you want to display an annotation, then tell me the location where you want to place it." Since it's a read-only property, I have implemented its getter myself so the map view knows where to place the annotation.

Step 3: Create a map interface by referring to my earlier Map Kit post, then select the .h file of myviewController, import the LocationwithAnnotation header file, and declare its object in the .h file.

#import <UIKit/UIKit.h>
#import <MapKit/MapKit.h>
#import "LocationwithAnnotation.h"

@interface myviewController : UIViewController {

    MKMapView *mymapview;
    LocationwithAnnotation *mylocation;
}
@end

Now in the .m file of myviewController, go to the init method and allocate memory for the LocationwithAnnotation object.

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil {
    if (self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil]) {
        // Custom initialization
        mylocation = [[LocationwithAnnotation alloc] init];
    }
    return self;
}

Now the final step is to add the annotation to the MKMapView object in the loadView method:

[mymapview addAnnotation:mylocation];


Step 4: Carry out the basic steps to show the view in the iPhone Simulator by selecting the AppDelegate.m file and then pressing Build and Go.

Step 5: You will get the below output.



Extra Points: In the above example I have hard-coded the coordinates. If you want to let the user specify the values instead, you can do this by adding some parameters to the init method of the NSObject subclass you made, in this case LocationwithAnnotation.
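As a sketch of that idea (the initializer name and the ivars below are my own additions, not from the original post), the annotation class could accept its data at init time instead:

```objc
@interface LocationwithAnnotation : NSObject <MKAnnotation> {
    CLLocationCoordinate2D coord;
    NSString *titleText;
}
- (id)initWithCoordinate:(CLLocationCoordinate2D)c title:(NSString *)t;
@end

@implementation LocationwithAnnotation

- (id)initWithCoordinate:(CLLocationCoordinate2D)c title:(NSString *)t {
    if (self = [super init]) {
        coord = c;            // caller-supplied location
        titleText = [t copy]; // caller-supplied title
    }
    return self;
}

- (NSString *)title { return titleText; }

- (CLLocationCoordinate2D)coordinate { return coord; }

- (void)dealloc {
    [titleText release];
    [super dealloc];
}
@end
```

Then in myviewController you would allocate it with something like [[LocationwithAnnotation alloc] initWithCoordinate:coord title:@"Current Location"].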


I hope that this post helped you out. Happy iCoding and have a great day!

How To Make A Simple RSS Reader iPhone App Tutorial


Drawing polylines or routes on an MKMapView with annotations.

http://spitzkoff.com/craig/