Extracting audio channel from Linear PCM

I would like to extract a single audio channel from a raw LPCM file, i.e. extract the left or right channel of a stereo LPCM file. The LPCM is 16-bit depth, interleaved, 2 channels, little-endian. From what I gather, the byte order is {LeftChannel, RightChannel, LeftChannel, RightChannel, ...}, and since it is 16-bit depth there will be 2 bytes per sample for each channel, right?

So my question is: if I want to extract the left channel, would I take the samples at indices 0, 2, 4, ..., 2n, while the right channel would be 1, 3, 5, ..., (2n + 1)?

Also, after extracting an audio channel, should I set the format of the extracted channel to 16-bit depth, 1 channel?

Thanks in advance.
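To make the indexing concrete, here is a minimal sketch of what I mean, assuming the interleaved {L, R, L, R, ...} layout above is correct (`ExtractChannel` is just a hypothetical helper I made up for illustration, not part of my project):

```c
#include <stdint.h>
#include <stddef.h>

// Hypothetical helper: copies one channel out of an interleaved 16-bit
// stereo buffer. Assumes the {L, R, L, R, ...} layout described above.
// channel: 0 = left, 1 = right; frameCount = number of stereo frames.
static void ExtractChannel(const int16_t *interleaved,
                           int16_t *mono,
                           size_t frameCount,
                           int channel)
{
    for (size_t frame = 0; frame < frameCount; frame++) {
        // Each frame holds two 16-bit samples; the sample index is
        // frame * 2 + channel, i.e. 0, 2, 4, ... for left and
        // 1, 3, 5, ... for right (byte addresses are twice these values).
        mono[frame] = interleaved[frame * 2 + channel];
    }
}
```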
This is the code I currently use to extract PCM audio with an AVAssetReader. The code works fine when writing the music file without extracting a channel, so the problem might be caused by the format or something similar...

```objc
NSURL *assetURL = [song valueForProperty:MPMediaItemPropertyAssetURL];
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];

NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
    [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
    [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
//  [NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)], AVChannelLayoutKey,
    [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
    [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
    nil];

NSError *assetError = nil;
AVAssetReader *assetReader = [[AVAssetReader assetReaderWithAsset:songAsset
                                                            error:&assetError] retain];
if (assetError) {
    NSLog(@"error: %@", assetError);
    return;
}

AVAssetReaderOutput *assetReaderOutput =
    [[AVAssetReaderAudioMixOutput assetReaderAudioMixOutputWithAudioTracks:songAsset.tracks
                                                             audioSettings:outputSettings] retain];
if (![assetReader canAddOutput:assetReaderOutput]) {
    NSLog(@"can't add reader output... die!");
    return;
}
[assetReader addOutput:assetReaderOutput];

NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectoryPath = [dirs objectAtIndex:0];

// CODE TO SPLIT STEREO
[self setupAudioWithFormatMono:kAudioFormatLinearPCM];
NSString *splitExportPath = [[documentsDirectoryPath stringByAppendingPathComponent:@"monoleft.caf"] retain];
if ([[NSFileManager defaultManager] fileExistsAtPath:splitExportPath]) {
    [[NSFileManager defaultManager] removeItemAtPath:splitExportPath error:nil];
}

AudioFileID mRecordFile;
NSURL *splitExportURL = [NSURL fileURLWithPath:splitExportPath];
OSStatus status = AudioFileCreateWithURL((CFURLRef)splitExportURL,
                                         kAudioFileCAFType,
                                         &_streamFormat,
                                         kAudioFileFlags_EraseFile,
                                         &mRecordFile);
NSLog(@"status os %d", status);

[assetReader startReading];

CMSampleBufferRef sampBuffer = [assetReaderOutput copyNextSampleBuffer];
UInt32 countsamp = CMSampleBufferGetNumSamples(sampBuffer);
NSLog(@"number of samples %d", countsamp);

SInt64 countByteBuf = 0;
SInt64 countPacketBuf = 0;
UInt32 numBytesIO = 0;
UInt32 numPacketsIO = 0;
NSMutableData *bufferMono = [NSMutableData new];

while (sampBuffer) {
    AudioBufferList audioBufferList;
    CMBlockBufferRef blockBuffer;
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampBuffer, NULL,
        &audioBufferList, sizeof(audioBufferList), NULL, NULL, 0, &blockBuffer);

    for (int y = 0; y < audioBufferList.mNumberBuffers; y++) {
        AudioBuffer audioBuffer = audioBufferList.mBuffers[y];
        // frames = audioBuffer.mData;
        NSLog(@"the number of channels for buffer number %d is %d", y, audioBuffer.mNumberChannels);
        NSLog(@"The buffer size is %d", audioBuffer.mDataByteSize);

        // Append the left channel (every other 16-bit sample) to the mono buffer:
        // step 4 bytes per stereo frame, copy the first 2 bytes (left sample).
        for (int i = 0; i < audioBuffer.mDataByteSize; i = i + 4) {
            [bufferMono appendBytes:(audioBuffer.mData + i) length:2];
        }

        // The number of bytes in the mutable data containing the mono audio.
        numBytesIO = [bufferMono length];
        numPacketsIO = numBytesIO / 2;
        NSLog(@"numpacketsIO %d", numPacketsIO);

        // Write the de-interleaved mono bytes (not the original stereo buffer).
        status = AudioFileWritePackets(mRecordFile, NO, numBytesIO, &_packetFormat,
                                       countPacketBuf, &numPacketsIO, [bufferMono bytes]);
        NSLog(@"status for writebyte %d, packets written %d", status, numPacketsIO);
        if (numPacketsIO != (numBytesIO / 2)) {
            NSLog(@"Something wrong");
            assert(0);
        }
        countPacketBuf = countPacketBuf + numPacketsIO;
        [bufferMono setLength:0];
    }

    sampBuffer = [assetReaderOutput copyNextSampleBuffer];
    if (sampBuffer) { // copyNextSampleBuffer returns NULL at the end of the track
        countsamp = CMSampleBufferGetNumSamples(sampBuffer);
        NSLog(@"number of samples %d", countsamp);
    }
}

AudioFileClose(mRecordFile);
[assetReader cancelReading];
[self performSelectorOnMainThread:@selector(updateCompletedSizeLabel:)
                       withObject:0
                    waitUntilDone:NO];
```

The output format for Audio File Services is as follows:

```objc
_streamFormat.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked;
_streamFormat.mBitsPerChannel   = 16;
_streamFormat.mChannelsPerFrame = 1;
_streamFormat.mBytesPerPacket   = 2;
_streamFormat.mBytesPerFrame    = 2; // (_streamFormat.mBitsPerChannel / 8) * _streamFormat.mChannelsPerFrame
_streamFormat.mFramesPerPacket  = 1;
_streamFormat.mSampleRate       = 44100.0;

_packetFormat.mStartOffset = 0;
_packetFormat.mVariableFramesInPacket = 0;
_packetFormat.mDataByteSize = 2;
```
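For comparison, here is a sketch of a fully populated AudioStreamBasicDescription for packed 16-bit signed-integer mono LPCM. I notice that `mFormatID` is never assigned in my snippet above, so this is my guess at what the complete mono output format should look like (`MonoPCMFormat` is a made-up helper name, and the field values are my assumption of what the mono CAF target needs):

```c
#include <AudioToolbox/AudioToolbox.h>

// Sketch of a complete output format for packed 16-bit signed-integer
// mono LPCM at 44.1 kHz. MonoPCMFormat is a hypothetical helper name;
// the values are an assumption, not a confirmed fix.
static AudioStreamBasicDescription MonoPCMFormat(void)
{
    AudioStreamBasicDescription asbd = {0};
    asbd.mSampleRate       = 44100.0;
    asbd.mFormatID         = kAudioFormatLinearPCM; // not set in the snippet above
    asbd.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger |
                             kLinearPCMFormatFlagIsPacked;
    asbd.mBitsPerChannel   = 16;
    asbd.mChannelsPerFrame = 1;  // mono
    asbd.mFramesPerPacket  = 1;  // always 1 for uncompressed PCM
    asbd.mBytesPerFrame    = (asbd.mBitsPerChannel / 8) * asbd.mChannelsPerFrame; // 2
    asbd.mBytesPerPacket   = asbd.mBytesPerFrame * asbd.mFramesPerPacket;         // 2
    return asbd;
}
```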
 
