Extracting ID3 tags from MP3 over HTTP Live Streaming

I've been having quite a difficult time extracting ID3 information from an MP3 being streamed over HTTP Live Streaming (using the Wowza media server, if anyone is curious). I know that the tags (right now the album tag and the album artwork tag) are properly embedded in each of the file segments, because when I manually download the segments listed in the `.m3u` index file generated by the server, I can see the tags in each one.

I am using the `AVFoundation` classes to do this, set up as follows:

```objc
- (void)initializeAudioStream
{
    NSURL *streamUrl = [NSURL URLWithString:self.urlField.text];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:streamUrl];

    self.musicPlayer = [AVPlayer playerWithPlayerItem:playerItem];
    self.musicPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
    [self.musicPlayer addObserver:self
                       forKeyPath:@"status"
                          options:NSKeyValueObservingOptionNew
                          context:NULL];
}
```

Once my KVO method is triggered, I start playing `self.musicPlayer` and call `addPeriodicTimeObserverForInterval` on it with an interval of a quarter of a second. It is in that observer's block that I try to extract the ID3 metadata.

I have tried everything I can think of on the iOS side to accomplish this, including printing out

```objc
self.musicPlayer.currentItem.asset.commonMetadata
```

as well as iterating over each of the `AVAssetTrack` instances and printing out *their* metadata:

```objc
for (AVAssetTrack *track in self.musicPlayer.currentItem.asset.tracks) {
    NSLog(@"Media type of track: %@", track.mediaType);
    NSLog(@"Track metadata: %@", track.commonMetadata);
}
```

What's interesting is that the asset always says it has two tracks. When I print out their `mediaType` property I get "soun" for the first one and "tmet" for the second. My assumption is that the first track is the audio data itself and the second track is metadata. However, I only ever see an empty array in `commonMetadata`.

I also check the status of the properties using `statusOfValueForKey:error:` on the tracks, and the `commonMetadata` key always comes back as `AVKeyValueStatusLoaded`.

Any ideas? I'm at a complete loss here.

Also, I am currently running this in the iPhone 4 simulator on iOS 4.2.1. I can't yet put it on a device since Apple is still approving my company's developer account.
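In case it helps, this is roughly what the KVO handler and periodic observer described above look like (a simplified sketch; error handling and observer-token bookkeeping are omitted, and the block just logs the same metadata properties I already mentioned):

```objc
#import <AVFoundation/AVFoundation.h>

// Simplified sketch: when the player reports it is ready, start playback and
// poll for metadata every quarter of a second.
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"status"] &&
        self.musicPlayer.status == AVPlayerStatusReadyToPlay) {

        [self.musicPlayer play];

        // CMTimeMake(1, 4) == 0.25 seconds between callbacks.
        [self.musicPlayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 4)
                                                        queue:dispatch_get_main_queue()
                                                   usingBlock:^(CMTime time) {
            AVPlayerItem *item = self.musicPlayer.currentItem;

            // Dump whatever metadata is currently visible on the asset.
            NSLog(@"Asset commonMetadata: %@", item.asset.commonMetadata);

            for (AVAssetTrack *track in item.asset.tracks) {
                NSLog(@"Media type of track: %@", track.mediaType);
                NSLog(@"Track metadata: %@", track.commonMetadata);
            }
        }];
    }
}
```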