
Writing and encoding remote I/O output to file - Core Audio
<p>I'm coding a music recording app using Audio Units and I'm having some issues getting my resulting M4A file to play anything other than a not so awesome buzzing noise. I've used these SO <a href="https://stackoverflow.com/questions/10113977/recording-to-aac-from-remoteio-data-is-getting-written-but-file-unplayable">sources</a> <a href="https://stackoverflow.com/questions/8615358/can-anybody-help-me-in-recording-iphone-output-sound-through-audio-unit">as</a> <a href="https://stackoverflow.com/questions/7118429/how-to-record-sound-produced-by-mixer-unit-output-ios-core-audio-audio-graph">references</a>, and I've tried everything to troubleshoot.</p> <p>I've got an <code>AUGraph</code> with 2 nodes: a multichannel mixer and a remote I/O. I've got two input callbacks on my mixer: one that pulls input from the mic, and one that pulls from an audio file. The mixer output is connected to the input scope of the I/O unit's output element. This enables simultaneous I/O.</p> <p>To capture the output I've added a callback and two methods:</p> <p><strong>The callback</strong></p> <pre><code>static OSStatus recordAndSaveCallback (void                        *inRefCon,
                                       AudioUnitRenderActionFlags  *ioActionFlags,
                                       const AudioTimeStamp        *inTimeStamp,
                                       UInt32                       inBusNumber,
                                       UInt32                       inNumberFrames,
                                       AudioBufferList             *ioData)
{
    Mixer *THIS = (__bridge Mixer *)inRefCon;

    AudioBufferList bufferList;
    OSStatus status;
    status = AudioUnitRender(THIS.ioUnit, ioActionFlags, inTimeStamp, 0, inNumberFrames, &amp;bufferList);

    SInt16 samples[inNumberFrames]; // A large enough size to not have to worry about buffer overrun
    memset(&amp;samples, 0, sizeof(samples));

    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mData = samples;
    bufferList.mBuffers[0].mNumberChannels = 1;
    bufferList.mBuffers[0].mDataByteSize = inNumberFrames * sizeof(SInt16);

    OSStatus result;
    if (*ioActionFlags == kAudioUnitRenderAction_PostRender) {
        result = ExtAudioFileWriteAsync(THIS.extAudioFileRef, inNumberFrames, &amp;bufferList);
        if (result) printf("ExtAudioFileWriteAsync %ld \n", result);
    }
    return noErr;
}
</code></pre> <p><strong>Recording Method:</strong></p> <pre><code>- (void)recordFile
{
    OSStatus result;

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *recordFile = [documentsDirectory stringByAppendingPathComponent:@"audio.m4a"];
    CFURLRef destinationURL = CFURLCreateWithFileSystemPath(kCFAllocatorDefault,
                                                            (__bridge CFStringRef)recordFile,
                                                            kCFURLPOSIXPathStyle, false);

    AudioStreamBasicDescription destinationFormat;
    memset(&amp;destinationFormat, 0, sizeof(destinationFormat));
    destinationFormat.mChannelsPerFrame = 1;
    destinationFormat.mFormatID = kAudioFormatMPEG4AAC;
    UInt32 size = sizeof(destinationFormat);
    result = AudioFormatGetProperty(kAudioFormatProperty_FormatInfo, 0, NULL, &amp;size, &amp;destinationFormat);
    if (result) printf("AudioFormatGetProperty %ld \n", result);

    result = ExtAudioFileCreateWithURL(destinationURL, kAudioFileM4AType, &amp;destinationFormat,
                                       NULL, kAudioFileFlags_EraseFile, &amp;extAudioFileRef);
    if (result) printf("ExtAudioFileCreateWithURL %ld \n", result);

    AudioStreamBasicDescription clientFormat;
    memset(&amp;clientFormat, 0, sizeof(clientFormat));
    UInt32 clientsize = sizeof(clientFormat);
    result = AudioUnitGetProperty(ioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output,
                                  0, &amp;clientFormat, &amp;clientsize);
    if (result) printf("AudioUnitGetProperty %ld \n", result);

    UInt32 codec = kAppleHardwareAudioCodecManufacturer;
    result = ExtAudioFileSetProperty(extAudioFileRef, kExtAudioFileProperty_CodecManufacturer,
                                     sizeof(codec), &amp;codec);
    if (result) printf("ExtAudioFileSetProperty %ld \n", result);

    result = ExtAudioFileSetProperty(extAudioFileRef, kExtAudioFileProperty_ClientDataFormat,
                                     sizeof(clientFormat), &amp;clientFormat);
    if (result) printf("ExtAudioFileSetProperty %ld \n", result);

    result = ExtAudioFileWriteAsync(extAudioFileRef, 0, NULL);
    if (result) { [self printErrorMessage:@"ExtAudioFileWriteAsync error" withStatus:result]; }

    result = AudioUnitAddRenderNotify(ioUnit, recordAndSaveCallback, (__bridge void *)self);
    if (result) { [self printErrorMessage:@"AudioUnitAddRenderNotify" withStatus:result]; }
}
</code></pre> <p><strong>Saving Method:</strong></p> <pre><code>- (void)saveFile
{
    OSStatus status = ExtAudioFileDispose(extAudioFileRef);
    NSLog(@"OSStatus(ExtAudioFileDispose): %ld\n", status);
}
</code></pre> <p><strong>This is what I see in my console:</strong></p> <pre><code>Stopping audio processing graph
OSStatus(ExtAudioFileDispose): 0
ExtAudioFileWriteAsync -50
ExtAudioFileWriteAsync -50
ExtAudioFileWriteAsync -50
</code></pre> <p>It seems to me that my code is very similar to that of people who have gotten this to work, but clearly I've made a crucial error. I'm sure there must be others struggling with this.</p> <p>Does anyone have any insight?</p> <p>Thanks.</p>