1. iOS: capturing CAEmitterLayer particles on screen
Is there a way to capture CAEmitterCells (generated using a CAEmitterLayer) when capturing the iOS device screen?

**UIGetScreenImage()** works, but since it's a private method I'm not allowed to use it. **UIGraphicsBeginImageContext** doesn't seem to work; the particles are simply omitted from the resulting image.

**EDIT:** Here is the code I'm currently using to capture the view. I'm actually recording a 30-second video of the screen, using the code provided by aroth [here](http://codethink.no-ip.org/wordpress/archives/673). It works by recording 25 images per second of itself (it's a UIView subclass) and its subviews (in our case, including the UIView whose layer is the CAEmitterLayer) and uses AVAssetWriter to compose the recording.

It's quite a mouthful, so I'll just place the relevant lines here. I ran the code through the ARC conversion tool in Xcode, so it may differ a bit memory-management-wise from the original.

```objc
- (CGContextRef)createBitmapContextOfSize:(CGSize)size {
    CGContextRef context = NULL;
    CGColorSpaceRef colorSpace;
    int bitmapByteCount;
    int bitmapBytesPerRow;

    bitmapBytesPerRow = (size.width * 4);
    bitmapByteCount = (bitmapBytesPerRow * size.height);
    colorSpace = CGColorSpaceCreateDeviceRGB();

    if (bitmapData != NULL) {
        free(bitmapData);
    }
    bitmapData = malloc(bitmapByteCount);
    if (bitmapData == NULL) {
        fprintf(stderr, "Memory not allocated!");
        CGColorSpaceRelease(colorSpace); // avoid leaking the color space on failure
        return NULL;
    }

    context = CGBitmapContextCreate(bitmapData,
                                    size.width,
                                    size.height,
                                    8, // bits per component
                                    bitmapBytesPerRow,
                                    colorSpace,
                                    kCGImageAlphaNoneSkipFirst);
    if (context == NULL) {
        free(bitmapData);
        bitmapData = NULL; // avoid a double free on the next call
        fprintf(stderr, "Context not created!");
        CGColorSpaceRelease(colorSpace);
        return NULL;
    }
    CGContextSetAllowsAntialiasing(context, NO);
    CGColorSpaceRelease(colorSpace);

    return context;
}

//static int frameCount = 0; //debugging

- (void)drawRect:(CGRect)rect {
    NSDate *start = [NSDate date];
    CGContextRef context = [self createBitmapContextOfSize:self.frame.size];

    // Not sure why this is necessary... the image renders upside-down and mirrored otherwise.
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, self.frame.size.height);
    CGContextConcatCTM(context, flipVertical);

    [self.layer renderInContext:context];

    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *background = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    self.currentScreen = background;

    //debugging
    //if (frameCount < 40) {
    //    NSString *filename = [NSString stringWithFormat:@"Documents/frame_%d.png", frameCount];
    //    NSString *pngPath = [NSHomeDirectory() stringByAppendingPathComponent:filename];
    //    [UIImagePNGRepresentation(self.currentScreen) writeToFile:pngPath atomically:YES];
    //    frameCount++;
    //}

    // NOTE: to record a scroll view while it is scrolling you need to implement your
    // UIScrollViewDelegate such that it calls 'setNeedsDisplay' on the ScreenCaptureView.
    if (_recording) {
        float millisElapsed = [[NSDate date] timeIntervalSinceDate:startedAt] * 1000.0;
        [self writeVideoFrameAtTime:CMTimeMake((int)millisElapsed, 1000)];
    }

    float processingSeconds = [[NSDate date] timeIntervalSinceDate:start];
    float delayRemaining = (1.0 / self.frameRate) - processingSeconds;

    CGContextRelease(context);

    // Redraw at the specified frame rate.
    [self performSelector:@selector(setNeedsDisplay)
               withObject:nil
               afterDelay:delayRemaining > 0.0 ? delayRemaining : 0.01];
}
```

Really hope this helps. Thanks for your support!
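The `writeVideoFrameAtTime:` method called above isn't shown in the excerpt. In aroth's ScreenCaptureView it hands `self.currentScreen` to an AVAssetWriter. Below is a rough sketch of that step, not the original code: it assumes hypothetical `videoWriterInput` (an `AVAssetWriterInput`) and `avAdaptor` (an `AVAssetWriterInputPixelBufferAdaptor`) ivars that were configured when recording started.

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>

// Illustrative sketch only: assumes `videoWriterInput` and `avAdaptor`
// ivars were set up elsewhere when recording started.
- (void)writeVideoFrameAtTime:(CMTime)time {
    if (![videoWriterInput isReadyForMoreMediaData]) {
        return; // drop the frame rather than block the drawing loop
    }

    CGImageRef cgImage = self.currentScreen.CGImage;
    size_t width = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);

    // Obtain a pixel buffer from the adaptor's pool and draw the frame into it.
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                       avAdaptor.pixelBufferPool,
                                       &pixelBuffer);
    if (pixelBuffer == NULL) {
        return;
    }

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                                                 width, height, 8,
                                                 CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                 colorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    [avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
    CVPixelBufferRelease(pixelBuffer);
}
```

Dropping frames when the input isn't ready keeps the drawing loop from stalling, at the cost of occasional missing frames in the output video.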
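As an aside on the underlying problem: `renderInContext:` draws the layer tree's model values on the CPU, so content composited by the render server, such as in-flight CAEmitterLayer particles, tends not to appear, which is consistent with the behavior described above. If you can target iOS 7 or later, UIKit's snapshot API goes through the render server and can pick up content that `renderInContext:` misses. A minimal sketch follows; `drawViewHierarchyInRect:afterScreenUpdates:` is the real UIKit method, while the helper function itself is hypothetical.

```objc
#import <UIKit/UIKit.h>

// Hypothetical helper (iOS 7+): snapshots a view via the render server,
// which can include content that -renderInContext: omits.
UIImage *SnapshotViewHierarchy(UIView *view) {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
    // afterScreenUpdates:NO captures the most recently rendered frame,
    // which is usually what you want when sampling repeatedly for video.
    [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:NO];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
```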
 
