If you're willing to drop support for pre-iPhone 3G S devices (the iPhone and iPhone 3G), I'd suggest using OpenGL ES 2.0 shaders for this. While it may be easy to overlay a CALayer containing a pixelated version of the image, I think you'll find the performance to be lacking.

In [my tests](http://www.sunsetlakesoftware.com/2010/10/22/gpu-accelerated-video-processing-mac-and-ios), performing a simple CPU-based calculation on every pixel of a 480 x 320 image led to a framerate of about 4 FPS on an iPhone 4. You might be able to sample only a fraction of those pixels to achieve the desired effect, but redrawing a pixelated image fast enough to match the live video will still be a slow operation.

Instead, if you use an OpenGL ES 2.0 fragment shader to process the incoming live video image, you should be able to take in the raw camera image, apply the filter selectively over the desired area, and either display or save the resulting image. This processing takes place almost entirely on the GPU, which I've found to handle simple operations like this at 60 FPS on the iPhone 4.

While getting a fragment shader to work just right can require a little setup, you might be able to use [this sample application](http://www.sunsetlakesoftware.com/sites/default/files/ColorTracking.zip) I wrote for processing camera input and doing color tracking as a decent starting point. You might also look at the touch gesture I use there, where the initial touch-down point sets the location to center an effect around, and the subsequent drag distance controls the strength or radius of the effect.
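As a rough illustration of that selective filtering, here is a minimal GLSL ES fragment shader sketch that pixelates only a circular region of the camera frame. The names here (`textureCoordinate`, `inputImageTexture`, `center`, `radius`, `pixelSize`) are assumptions for this example, not necessarily what the linked sample application uses; in practice you'd upload the camera frame as `inputImageTexture` and set `center` and `radius` from the touch gesture described above.

```glsl
// Sketch only: a circular pixelation filter for an OpenGL ES 2.0
// fragment shader. All uniform/varying names are hypothetical.
precision mediump float;

varying vec2 textureCoordinate;       // assumed to be passed in from the vertex shader
uniform sampler2D inputImageTexture;  // the current camera frame
uniform vec2 center;                  // touch-down point, in texture coordinates
uniform float radius;                 // drag distance, in texture coordinates
uniform float pixelSize;              // width of one pixelated block, e.g. 0.02

void main()
{
    // Note: this assumes square texture coordinates; a non-square
    // aspect ratio would stretch the circle slightly.
    if (distance(textureCoordinate, center) < radius)
    {
        // Snap the sampling coordinate to a coarse grid so every
        // fragment within a block reads the same source texel,
        // which is what produces the pixelated look.
        vec2 blockCoordinate = floor(textureCoordinate / pixelSize) * pixelSize;
        gl_FragColor = texture2D(inputImageTexture, blockCoordinate);
    }
    else
    {
        // Outside the selected area, pass the camera image through untouched.
        gl_FragColor = texture2D(inputImageTexture, textureCoordinate);
    }
}
```

Because the branch runs per fragment, the whole effect stays on the GPU; no pixel data ever has to round-trip through the CPU, which is what makes the 60 FPS figure plausible for simple filters like this.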
 
