Swift: Bitmap data and filters

Note: Source code (FilterExample) is available on GitHub

I was wondering how bitmap programming works on iOS. Just like with BitmapData in Flash, I wanted to perform simple painting operations, but I was also curious about how filters work. In AS3, you would use the applyFilter API on BitmapData, so here is how things work on iOS/macOS with Swift and the Quartz/Core Image APIs.

Graphics Context and the Quartz drawing engine

In Flash, to perform drawing operations, you create a BitmapData object and use the pixel APIs it defines. On iOS/macOS, things are different: you work with an offscreen graphics context that you manipulate through high-level functions like CGContextSetRGBFillColor. The CG prefix stands for Core Graphics, which leverages the powerful Quartz drawing engine behind the scenes.

To initiate the drawing, we create a context by specifying its size, whether it is opaque, and its scale factor:

UIGraphicsBeginImageContextWithOptions(CGSize(width: 200, height: 200), true, 1)

Note that our bitmap will be 200 by 200 points and opaque. We pass 1 for the scale factor: to get the size of the bitmap in pixels, you multiply the width and height by the scale value. If we had passed 0, the scale factor of the device's screen would have been used.
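
As a quick aside, you can verify the effect of the scale factor yourself. The sketch below is my own throwaway check, not part of the example we are building; it pushes a second context at scale 2, grabs an image from it and pops it again, leaving our original context untouched:

// throwaway check: a 200x200-point context with a scale factor of 2
UIGraphicsBeginImageContextWithOptions(CGSize(width: 200, height: 200), true, 2)
let check = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
// check.size is 200x200 (points), check.scale is 2,
// so the underlying bitmap is 400x400 pixels
println("\(check.size.width * check.scale) pixels wide") // prints 400.0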

We now have our context created offscreen, ready for drawing commands, but we still don't have a reference to it. The high-level function UIGraphicsBeginImageContextWithOptions creates the context but does not return it; for that, we call the UIGraphicsGetCurrentContext API:

let context = UIGraphicsGetCurrentContext()

OK, now we are ready to draw, so we use the high-level drawing APIs; their names are pretty explicit about their purpose:

CGContextSetRGBFillColor(context, 1, 1, 0, 1)
CGContextFillRect(context, CGRectMake(0, 0, 200, 200))
CGContextSetRGBFillColor(context, 1, 0, 0, 1)
CGContextFillRect(context, CGRectMake(0, 0, 100, 100))
CGContextSetRGBFillColor(context, 1, 1, 0, 1)
CGContextFillRect(context, CGRectMake(0, 0, 50, 50))
CGContextSetRGBFillColor(context, 0, 0, 1, 0.5)
CGContextFillRect(context, CGRectMake(0, 0, 50, 100))
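
Filled rectangles are not the only primitive available; the same context accepts strokes, ellipses, paths and so on. For instance, here is a small sketch (not part of the original drawing) that strokes a circle on top of our rectangles:

// stroke a black circle on the same context
CGContextSetRGBStrokeColor(context, 0, 0, 0, 1)
CGContextSetLineWidth(context, 4)
CGContextStrokeEllipseInRect(context, CGRectMake(100, 100, 80, 80))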

We are now drawing offscreen. Note that at this point we don't have a displayable bitmap yet; these are just raw pixels painted into the context. To display the result on screen, we need a high-level wrapper, just like the relationship between Bitmap and BitmapData in Flash. So we use the UIGraphicsGetImageFromCurrentImageContext API, which basically takes a snapshot/raster of our drawing:

let image = UIGraphicsGetImageFromCurrentImageContext()
// we are done drawing, so we balance the begin call
UIGraphicsEndImageContext()

At this point, we could just display the UIImage object returned here. Because I am using SpriteKit for my experiments, we need to wrap the UIImage in an SKSpriteNode object that holds an SKTexture object, so that gives us:

// we create a texture, pass the UIImage
let texture = SKTexture(image: image)
// wrap it inside a sprite node
let sprite = SKSpriteNode(texture: texture)
// we scale it a bit
sprite.setScale(0.5)
// we position it
sprite.position = CGPoint(x: 510, y: 280)
// let's display it
self.addChild(sprite)

This is what you get:

Simple bitmap data

Here is the full code:

import SpriteKit

class GameScene: SKScene {
    override func didMoveToView(view: SKView) {
        /* Setup your scene here */
        
        // we create the graphics context
        UIGraphicsBeginImageContextWithOptions(CGSize(width: 200, height: 200), true, 1)
        
        // we retrieve it
        let context = UIGraphicsGetCurrentContext()
        
        // we issue drawing commands
        CGContextSetRGBFillColor(context, 1, 1, 0, 1)
        CGContextFillRect(context, CGRectMake(0, 0, 200, 200))
        CGContextSetRGBFillColor(context, 1, 0, 0, 1)
        CGContextFillRect(context, CGRectMake(0, 0, 100, 100))
        CGContextSetRGBFillColor(context, 1, 1, 0, 1)
        CGContextFillRect(context, CGRectMake(0, 0, 50, 50))
        CGContextSetRGBFillColor(context, 0, 0, 1, 0.5)
        CGContextFillRect(context, CGRectMake(0, 0, 50, 100))
        
        // we query an image from it
        let image = UIGraphicsGetImageFromCurrentImageContext()
        // we are done drawing, so we balance the begin call
        UIGraphicsEndImageContext()
        
        // we create a texture, pass the UIImage
        let texture = SKTexture(image: image)
        // wrap it inside a sprite node
        let sprite = SKSpriteNode(texture: texture)
        // we scale it a bit
        sprite.setScale(0.5)
        // we position it
        sprite.position = CGPoint(x: 510, y: 380)
        // let's display it
        self.addChild(sprite)
    }
    
    override func update(currentTime: CFTimeInterval) {
        /* Called before each frame is rendered */
    }
}

Pretty simple, right? Now you can move your sprite, animate it, scale it, etc. But what if we have an existing image and we want to apply filters to it? In Flash, a loaded bitmap resource would give us a Bitmap object whose bitmapData property pointed to the bitmap data we could work with. How does that work here? This is where Core Image comes into play.

Core Image

This is where it gets really cool. If you need to apply filters or perform any real-time video or image processing, you use the powerful Core Image APIs. So let's take the image below, unprocessed:

Ayden no filter

Now, let's apply a filter with the code below. In this example we use CIPhotoEffectTransfer, which applies a nice Instagram-like effect; take a look at all the filters available, the capabilities are pretty endless:

// we create a Core Image context
let ciContext = CIContext(options: nil)
// we create a CIImage; think of a CIImage as image data for processing, nothing is displayed or can be displayed at this point
let coreImage = CIImage(image: image)
// we pick the filter we want
let filter = CIFilter(name: "CIPhotoEffectTransfer")
// we pass our image as input
filter.setValue(coreImage, forKey: kCIInputImageKey)
// we retrieve the processed image
let filteredImageData = filter.valueForKey(kCIOutputImageKey) as CIImage
// returns a Quartz image from the Core Image context
let filteredImageRef = ciContext.createCGImage(filteredImageData, fromRect: filteredImageData.extent())
// this is our final UIImage ready to be displayed
let filteredImage = UIImage(CGImage: filteredImageRef)

This gives us the following result:

Ayden filtered
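
By the way, you don't have to guess at filter names: Core Image can enumerate its built-in filters at runtime. Here is a quick exploration snippet (my addition, handy for browsing what's available):

// print the names of all built-in Core Image filters
let filterNames = CIFilter.filterNamesInCategory(kCICategoryBuiltIn)
for name in filterNames {
    println(name)
}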

And here is the full code:

import SpriteKit

class GameScene: SKScene {
    override func didMoveToView(view: SKView) {
        /* Setup your scene here */
        
        // we reference our image (path)
        let data = NSData(contentsOfFile: "/Users/timbert/Documents/Ayden.jpg")
        // we create a UIImage out of it
        let image = UIImage(data: data)
        
        // we create a Core Image context
        let ciContext = CIContext(options: nil)
        // we create a CIImage; think of a CIImage as image data for processing, nothing is displayed or can be displayed at this point
        let coreImage = CIImage(image: image)
        // we pick the filter we want
        let filter = CIFilter(name: "CIPhotoEffectTransfer")
        // we pass our image as input
        filter.setValue(coreImage, forKey: kCIInputImageKey)
        // we retrieve the processed image
        let filteredImageData = filter.valueForKey(kCIOutputImageKey) as CIImage
        // returns a Quartz image from the Core Image context
        let filteredImageRef = ciContext.createCGImage(filteredImageData, fromRect: filteredImageData.extent())
        // this is our final UIImage ready to be displayed
        let filteredImage = UIImage(CGImage: filteredImageRef)
        
        // we create a texture, pass the UIImage
        let texture = SKTexture(image: filteredImage)
        // wrap it inside a sprite node
        let sprite = SKSpriteNode(texture: texture)
        // we scale it a bit
        sprite.setScale(0.5)
        // we position it
        sprite.position = CGPoint(x: 510, y: 380)
        // let's display it
        self.addChild(sprite)
    }
    
    override func update(currentTime: CFTimeInterval) {
        /* Called before each frame is rendered */
    }
}
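
One remark on the loading code above: I read the image from an absolute path on my machine, which is fine for a quick experiment but won't fly in a real app. If the image ships with the app, you would load it from the bundle instead; here is a minimal sketch (assuming an Ayden.jpg resource added to the target):

// load the image from the app bundle instead of an absolute path
if let path = NSBundle.mainBundle().pathForResource("Ayden", ofType: "jpg") {
    let image = UIImage(contentsOfFile: path)
    // ... then process it with Core Image as before
}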

We can also play with filter parameters to customize the effects; we could even use shaders for more flexibility, but more on that later :) In the code below, we apply a pinch distortion effect to our initial image, which gives us the following:

Simple distortion

And here is the full code:

import SpriteKit

class GameScene: SKScene {
    override func didMoveToView(view: SKView) {
        /* Setup your scene here */
        
        // we reference our image (path)
        let data = NSData(contentsOfFile: "/Users/timbert/Documents/Ayden.jpg")
        // we create a UIImage out of it
        let image = UIImage(data: data)
        
        // we create a Core Image context
        let ciContext = CIContext(options: nil)
        // we create a CIImage; think of a CIImage as image data for processing, nothing is displayed or can be displayed at this point
        let coreImage = CIImage(image: image)
        // we pick the filter we want
        let filter = CIFilter(name: "CIPinchDistortion")
        // we pass our image as input
        filter.setValue(coreImage, forKey: kCIInputImageKey)
        // we pass a custom value for the inputCenter parameter, note the use of the CIVector type here
        filter.setValue(CIVector(x: 300, y: 200), forKey: kCIInputCenterKey)
        // we retrieve the processed image
        let filteredImageData = filter.valueForKey(kCIOutputImageKey) as CIImage
        // returns a Quartz image from the Core Image context
        let filteredImageRef = ciContext.createCGImage(filteredImageData, fromRect: filteredImageData.extent())
        // this is our final UIImage ready to be displayed
        let filteredImage = UIImage(CGImage: filteredImageRef)
        
        // we create a texture, pass the UIImage
        let texture = SKTexture(image: filteredImage)
        // wrap it inside a sprite node
        let sprite = SKSpriteNode(texture: texture)
        // we scale it a bit
        sprite.setScale(0.5)
        // we position it
        sprite.position = CGPoint(x: 510, y: 380)
        // let's display it
        self.addChild(sprite)
    }
    
    override func update(currentTime: CFTimeInterval) {
        /* Called before each frame is rendered */
    }
}
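
Each filter documents its input parameters. For CIPinchDistortion, besides inputCenter, you can also tweak the area and strength of the pinch through kCIInputRadiusKey and kCIInputScaleKey. A short sketch continuing the example above (the values here are arbitrary, just for illustration):

// customize the pinch: area of effect and intensity
filter.setValue(300, forKey: kCIInputRadiusKey)
filter.setValue(0.8, forKey: kCIInputScaleKey)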

Now, can we apply a filter to the first bitmap we created through drawing commands? Sure. Here is the code for a blur effect:

import SpriteKit

class GameScene: SKScene {
    override func didMoveToView(view: SKView) {
        /* Setup your scene here */
        
        // we create the graphics context
        UIGraphicsBeginImageContextWithOptions(CGSize(width: 200, height: 200), true, 1)
        
        // we retrieve it
        let context = UIGraphicsGetCurrentContext()
        
        // we issue drawing commands
        CGContextSetRGBFillColor(context, 1, 1, 0, 1)
        CGContextFillRect(context, CGRectMake(0, 0, 200, 200))
        CGContextSetRGBFillColor(context, 1, 0, 0, 1)
        CGContextFillRect(context, CGRectMake(0, 0, 100, 100))
        CGContextSetRGBFillColor(context, 1, 1, 0, 1)
        CGContextFillRect(context, CGRectMake(0, 0, 50, 50))
        CGContextSetRGBFillColor(context, 0, 0, 1, 0.5)
        CGContextFillRect(context, CGRectMake(0, 0, 50, 100))
        
        // we query an image from it
        let image = UIGraphicsGetImageFromCurrentImageContext()
        // we are done drawing, so we balance the begin call
        UIGraphicsEndImageContext()
        
        // we create a Core Image context
        let ciContext = CIContext(options: nil)
        // we create a CIImage; think of a CIImage as image data for processing, nothing is displayed or can be displayed at this point
        let coreImage = CIImage(image: image)
        // we pick the filter we want
        let filter = CIFilter(name: "CIGaussianBlur")
        // we pass our image as input
        filter.setValue(coreImage, forKey: kCIInputImageKey)
        // we retrieve the processed image
        let filteredImageData = filter.valueForKey(kCIOutputImageKey) as CIImage
        // returns a Quartz image from the Core Image context
        let filteredImageRef = ciContext.createCGImage(filteredImageData, fromRect: filteredImageData.extent())
        // this is our final UIImage ready to be displayed
        let filteredImage = UIImage(CGImage: filteredImageRef)
        
        // we create a texture, pass the UIImage
        let texture = SKTexture(image: filteredImage)
        // wrap it inside a sprite node
        let sprite = SKSpriteNode(texture: texture)
        // we scale it a bit
        sprite.setScale(0.5)
        // we position it
        sprite.position = CGPoint(x: 510, y: 380)
        // let's display it
        self.addChild(sprite)
    }
    
    override func update(currentTime: CFTimeInterval) {
        /* Called before each frame is rendered */
    }
}

And here is the result:

Simple Bitmap data filtered
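
One detail worth knowing about CIGaussianBlur: it samples beyond the original bounds, so the output's extent is larger than the input's and the edges fade out. If you want to keep the original 200 by 200 size, a common fix (my addition, continuing the code above) is to crop the result back to the input image's extent before rendering it:

// crop the blurred output back to the original extent
let cropped = filteredImageData.imageByCroppingToRect(coreImage.extent())
let croppedRef = ciContext.createCGImage(cropped, fromRect: cropped.extent())
let croppedImage = UIImage(CGImage: croppedRef)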

I hope you guys enjoyed it! Lots of possibilities, lots of fun with these APIs.

Comments (6)

  1. dan wrote:

    it would be nice if you could share one of these as a xcode project.

    Wednesday, June 11, 2014 at 8:58 am
  2. Thibault Imbert wrote:

    Hi Dan,

    Sure, good point. I will do that for the next posts. I will update this post with the Xcode project ASAP too and let you know here.

    Thibault

    Wednesday, June 11, 2014 at 9:06 am
  3. Thibault Imbert wrote:

    There you go dan: https://github.com/thibaultimbert/swift-experiments

    Thibault

    Wednesday, June 11, 2014 at 9:34 am
  4. Zsolt wrote:

    Hi Thibault,

    I am working on my game using the Adobe AIR SDK; it uses a lot of dynamic cubic Bézier drawing, and I am considering giving the native route a try with Swift, since after a certain point it obviously gets slow.

    I found some Cocoa examples for dynamic drawing here:
    http://www.knowstack.com/cocoa_drawing/

    Yesterday I gave it a try, but I cannot even use the Cocoa framework in Xcode 6, so I am stuck at the moment. I realized that I could use the SKShapeNode class as well, but I haven't spent any time with that yet.

    Anyway, my question is: what do you think, is it worth moving from Adobe AIR to the iOS SDK and Swift, considering the extra amount of code and the loss of multiplatform possibilities, in return for the performance gain?

    Thanks for your answer in advance.

    Kind regards,
    Zsolt

    Thursday, June 26, 2014 at 5:56 pm
  5. Thibault Imbert wrote:

    Hi Zsolt,

    Indeed, you can use the SKShapeNode API for that through the path property.

    For your question about moving to the iOS SDK: you do lose multiplatform support, but you get the best performance and the ability to leverage all of the latest APIs available on iOS.

    Now, if you don’t have a performance problem with AIR, it is sure tempting to stick with it. The thing is that by going native, you will probably never feel limited in terms of performance. I know it’s been a big change for a lot of Flash devs who have moved to native, the sandbox is much bigger suddenly when it comes to performance limitations. That is tempting too :)

    If I were you, I would try to develop it with Swift, see how it goes and make it an awesome game on iOS, then if successful, see how to port it to Android. Learning Swift is a good thing for your skills/experience too.

    But I am not you! :) So take this with a grain of salt ;)

    Hope that is helpful.

    Thibault

    Thursday, June 26, 2014 at 11:15 pm
  6. Zsolt wrote:

    Thanks for your answer Thibault, I appreciate your help very much.

    I think your blog is a unique resource for AS3 developers who would like to give Swift a try and are looking for solutions to the kinds of problems we faced in our Flash/AIR projects.

    At the moment I am looking for answers on how an event bus system works in Swift, how to load and unload assets, how to handle (or not) garbage collection, how to achieve Flash-style drawing API curves and shapes, and how to implement some kind of MVC approach in Swift projects.

    I assume you will touch some of these topics in the future.

    Thanks again and I wish you good luck for your work!

    Kind regards,
    Zsolt

    Friday, June 27, 2014 at 5:43 pm