<h2>Core Image</h2> <p>Since the image in the screenshot is static, you could use <code>CIGaussianBlur</code> from Core Image (requires iOS 6). Here is a sample: <a href="https://github.com/evanwdavis/Fun-with-Masks/blob/master/Fun%20with%20Masks/EWDBlurExampleVC.m">https://github.com/evanwdavis/Fun-with-Masks/blob/master/Fun%20with%20Masks/EWDBlurExampleVC.m</a> </p> <p>Mind you, this is slower than the other options on this page.</p> <pre><code>#import &lt;QuartzCore/QuartzCore.h&gt;

- (UIImage *)blur:(UIImage *)theImage
{
    // If you need re-orienting (e.g. blurring a photo taken with the
    // device's front-facing camera in portrait mode):
    // theImage = [self reOrientIfNeeded:theImage];

    // Create our blurred image.
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *inputImage = [CIImage imageWithCGImage:theImage.CGImage];

    // Set up a Gaussian blur (we could use any of the many filters offered by Core Image).
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:15.0f] forKey:@"inputRadius"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];

    // CIGaussianBlur has a tendency to shrink the image a little;
    // this ensures it matches up exactly to the bounds of our original image.
    CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];

    // Create a UIImage for this function to return so that ARC can manage the
    // memory of the blur. ARC can't manage CGImageRefs, so we need to release
    // cgImage before this function returns.
    UIImage *returnImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    return returnImage;

    // If you need scaling:
    // return [[self class] scaleIfNeeded:cgImage];
}

+ (UIImage *)scaleIfNeeded:(CGImageRef)cgimg
{
    bool isRetina = [[[UIDevice currentDevice] systemVersion] intValue] &gt;= 4
                    &amp;&amp; [[UIScreen mainScreen] scale] == 2.0;
    if (isRetina) {
        return [UIImage imageWithCGImage:cgimg scale:2.0 orientation:UIImageOrientationUp];
    } else {
        return [UIImage imageWithCGImage:cgimg];
    }
}

- (UIImage *)reOrientIfNeeded:(UIImage *)theImage
{
    if (theImage.imageOrientation != UIImageOrientationUp) {
        CGAffineTransform reOrient = CGAffineTransformIdentity;

        switch (theImage.imageOrientation) {
            case UIImageOrientationDown:
            case UIImageOrientationDownMirrored:
                reOrient = CGAffineTransformTranslate(reOrient, theImage.size.width, theImage.size.height);
                reOrient = CGAffineTransformRotate(reOrient, M_PI);
                break;
            case UIImageOrientationLeft:
            case UIImageOrientationLeftMirrored:
                reOrient = CGAffineTransformTranslate(reOrient, theImage.size.width, 0);
                reOrient = CGAffineTransformRotate(reOrient, M_PI_2);
                break;
            case UIImageOrientationRight:
            case UIImageOrientationRightMirrored:
                reOrient = CGAffineTransformTranslate(reOrient, 0, theImage.size.height);
                reOrient = CGAffineTransformRotate(reOrient, -M_PI_2);
                break;
            case UIImageOrientationUp:
            case UIImageOrientationUpMirrored:
                break;
        }

        switch (theImage.imageOrientation) {
            case UIImageOrientationUpMirrored:
            case UIImageOrientationDownMirrored:
                reOrient = CGAffineTransformTranslate(reOrient, theImage.size.width, 0);
                reOrient = CGAffineTransformScale(reOrient, -1, 1);
                break;
            case UIImageOrientationLeftMirrored:
            case UIImageOrientationRightMirrored:
                reOrient = CGAffineTransformTranslate(reOrient, theImage.size.height, 0);
                reOrient = CGAffineTransformScale(reOrient, -1, 1);
                break;
            case UIImageOrientationUp:
            case UIImageOrientationDown:
            case UIImageOrientationLeft:
            case UIImageOrientationRight:
                break;
        }

        CGContextRef myContext = CGBitmapContextCreate(NULL, theImage.size.width, theImage.size.height,
                                                       CGImageGetBitsPerComponent(theImage.CGImage), 0,
                                                       CGImageGetColorSpace(theImage.CGImage),
                                                       CGImageGetBitmapInfo(theImage.CGImage));
        CGContextConcatCTM(myContext, reOrient);

        switch (theImage.imageOrientation) {
            case UIImageOrientationLeft:
            case UIImageOrientationLeftMirrored:
            case UIImageOrientationRight:
            case UIImageOrientationRightMirrored:
                CGContextDrawImage(myContext, CGRectMake(0, 0, theImage.size.height, theImage.size.width), theImage.CGImage);
                break;
            default:
                CGContextDrawImage(myContext, CGRectMake(0, 0, theImage.size.width, theImage.size.height), theImage.CGImage);
                break;
        }

        CGImageRef CGImg = CGBitmapContextCreateImage(myContext);
        theImage = [UIImage imageWithCGImage:CGImg];
        CGImageRelease(CGImg);
        CGContextRelease(myContext);
    }
    return theImage;
}
</code></pre> <h2>Stack blur (Box + Gaussian)</h2> <ul> <li><a href="https://github.com/tomsoft1/StackBluriOS">StackBlur</a> implements a mix of box and Gaussian blur. It is about 7x faster than a non-accelerated Gaussian blur, yet not as ugly as a box blur. See a demo <a href="http://incubator.quasimondo.com/processing/fast_blur_deluxe.php">here</a> (Java plugin version) or <a href="http://www.quasimondo.com/StackBlurForCanvas/StackBlurDemo.html">here</a> (JavaScript version). This algorithm is used in KDE, Camera+, and others. It doesn't use the Accelerate framework, but it's fast.</li> </ul> <h2>Accelerate Framework</h2> <ul> <li><p>In the session “Implementing Engaging UI on iOS” from <a href="https://developer.apple.com/wwdc/videos/">WWDC 2013</a>, Apple explains how to create a blurred background (at 14:30) and mentions the <a href="https://developer.apple.com/downloads/download.action?path=wwdc_2013/wwdc_2013_sample_code/ios_uiimageeffects.zip"><code>applyLightEffect</code></a> method implemented in the sample code using Accelerate.framework. </p></li> <li><p><a href="https://github.com/BradLarson/GPUImage">GPUImage</a> uses OpenGL shaders to create dynamic blurs.
It has several types of blur: <code>GPUImageBoxBlurFilter</code>, <code>GPUImageFastBlurFilter</code>, <code>GaussianSelectiveBlur</code>, and <code>GPUImageGaussianBlurFilter</code>. There is even a <code>GPUImageiOSBlurFilter</code> that “should fully replicate the blur effect provided by iOS 7's control panel” (<a href="https://twitter.com/bradlarson/status/391684261369368576/">tweet</a>, <a href="http://www.sunsetlakesoftware.com/2013/10/21/optimizing-gaussian-blurs-mobile-gpu">article</a>). The article is detailed and informative.</p></li> </ul> <pre>- (UIImage *)blurryGPUImage:(UIImage *)image withBlurLevel:(NSInteger)blur
{
    GPUImageFastBlurFilter *blurFilter = [GPUImageFastBlurFilter new];
    blurFilter.blurSize = blur;
    UIImage *result = [blurFilter imageByFilteringImage:image];
    return result;
}
</pre> <ul> <li><p>From indieambitions.com: <a href="http://indieambitions.com/idevblogaday/perform-blur-vimage-accelerate-framework-tutorial/">Perform a blur using vImage</a>. The algorithm is also used in <a href="https://github.com/alexdrone/ios-realtimeblur">iOS-RealTimeBlur</a>.</p></li> <li><p>From Nick Lockwood: <a href="https://github.com/nicklockwood/FXBlurView">https://github.com/nicklockwood/FXBlurView</a> The example shows the blur over a scroll view. It blurs asynchronously with <code>dispatch_async</code>, then syncs to schedule updates using <code>UITrackingRunLoopMode</code> so the blur doesn't lag when UIKit gives more priority to the scrolling of the <code>UIScrollView</code>. This is explained in Nick's book <a href="http://www.informit.com/store/ios-core-animation-advanced-techniques-9780133440751">iOS Core Animation</a>, which, by the way, is great.</p></li> <li><p><a href="https://github.com/JagCesar/iOS-blur">iOS-blur</a> takes the blurring layer of the <code>UIToolbar</code> and puts it elsewhere. Apple will reject your app if you use this method.
See <a href="https://github.com/mochidev/MDBlurView/issues/4">https://github.com/mochidev/MDBlurView/issues/4</a></p></li> <li><p>From Evadne's blog: <a href="http://blog.radi.ws/post/61836396624/livefrost-fast-synchronous-uiview-snapshot-convolving">LiveFrost: Fast, Synchronous UIView Snapshot Convolving</a>. Great code and a great read. Some ideas from this post: </p> <ul> <li>Use a serial queue to throttle updates from <code>CADisplayLink</code>.</li> <li>Reuse bitmap contexts unless the bounds change.</li> <li>Draw smaller images using <code>-[CALayer renderInContext:]</code> with a 0.5f scale factor.</li> </ul></li> </ul> <h2>Other stuff</h2> <p>Andy Matuschak <a href="http://twitter.com/andy_matuschak/status/345677479534542848">said</a> on Twitter: “you know, a lot of the places where it looks like we're doing it in real time, it's static with clever tricks.” </p> <p>At <a href="http://www.doubleencore.com/2013/09/the-essential-ios-7-design-guide/">doubleencore.com</a> they say “we’ve found that a 10 pt blur radius plus a 10 pt increase in saturation best mimics iOS 7’s blur effect under most circumstances”. </p> <p>For a look at how Apple implements it, take a peek at the private headers of Apple's <a href="https://github.com/JaviSoto/iOS7-Runtime-Headers/blob/master/PrivateFrameworks/SpringBoardFoundation.framework/SBFProceduralWallpaperView.h">SBFProceduralWallpaperView</a>.</p> <p>Finally, this isn't a real blur, but remember that you can set <code>rasterizationScale</code> to get a pixelated image: <a href="http://www.dimzzy.com/blog/2010/11/blur-effect-for-uiview/">http://www.dimzzy.com/blog/2010/11/blur-effect-for-uiview/</a></p>
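<p>For the StackBlur library mentioned above, usage is essentially a one-liner through its <code>UIImage</code> category. A sketch (the category header name and <code>stackBlur:</code> selector are taken from the library's repo; check the headers of the version you pull in, as they may differ):</p> <pre><code>#import "UIImage+StackBlur.h" // category shipped with StackBluriOS

// CPU-based stack blur with a 20 px radius; no Accelerate framework needed.
UIImage *blurred = [originalImage stackBlur:20];
</code></pre>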
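<p>For the Accelerate route, Apple's WWDC 2013 sample code ships a <code>UIImage+ImageEffects</code> category. A minimal sketch, assuming you add that category to your project: snapshot the content behind your view, then blur the snapshot (the tint and saturation values below are illustrative):</p> <pre><code>#import "UIImage+ImageEffects.h" // from Apple's WWDC 2013 sample code

// Snapshot the view hierarchy you want to appear blurred (iOS 7).
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0);
[self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// The convenience preset from the sample code...
UIImage *blurred = [snapshot applyLightEffect];

// ...or the full-control variant with explicit parameters:
// UIImage *blurred = [snapshot applyBlurWithRadius:10
//                                        tintColor:[UIColor colorWithWhite:1 alpha:0.3]
//                            saturationDeltaFactor:1.8
//                                        maskImage:nil];
</code></pre> <p>Note that the snapshot is static: if the content behind it changes, you must re-snapshot and re-blur.</p>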
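<p>FXBlurView wraps the same vImage approach in a drop-in <code>UIView</code> subclass. A usage sketch based on its documented <code>dynamic</code> and <code>blurRadius</code> properties (the radius value here is arbitrary):</p> <pre><code>#import "FXBlurView.h"

FXBlurView *blurView = [[FXBlurView alloc] initWithFrame:self.view.bounds];
blurView.dynamic = YES;   // keep re-blurring as the content behind it changes
blurView.blurRadius = 10; // illustrative value
[self.view addSubview:blurView];
</code></pre>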
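<p>The rasterization trick from that last link can be sketched in two lines (the scale factor is illustrative):</p> <pre><code>#import &lt;QuartzCore/QuartzCore.h&gt;

// Rasterize the layer at a fraction of its natural scale; the cached
// bitmap is stretched back to full size, producing a pixelated look.
view.layer.shouldRasterize = YES;
view.layer.rasterizationScale = 0.25f;
</code></pre>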