Preamble: Core Image is a powerful framework that lets you easily apply filters to images, producing effects such as modified vibrance, tone, or exposure. It can process image data on the CPU or the GPU, and it's fast: fast enough for real-time processing of video frames. Core Image filters can also be chained together to apply multiple effects to an image or video frame at once. The chained filters are combined into a single operation that's applied to the image, which is much more efficient than running the image through each filter separately.
#### Getting Started
Before we get started, let's go over the most important classes in the Core Image framework:
- CIContext: all Core Image processing is done through a CIContext. It's somewhat similar to a Core Graphics or OpenGL context.
- CIImage: this class holds image data. It can be created from a UIImage, from an image file, or from raw pixel data.
- CIFilter: the CIFilter class holds a dictionary that defines the attributes of the particular filter it represents. Examples of filters are vibrance, color inversion, and cropping.
#### Basic Image Filtering
You'll start by simply running your image through a CIFilter and displaying the result on screen. Every time you want to apply a CIFilter to an image, there are four steps:
1. Create a CIImage object, using one of the constructors such as CIImage(contentsOf:), CIImage(data:), CIImage(cgImage:), or CIImage(bitmapData:bytesPerRow:size:format:colorSpace:).
2. Create a CIContext. A CIContext can be CPU- or GPU-based. Initializing a CIContext is relatively expensive, so reuse one instead of creating it over and over; you'll always need one when outputting a CIImage.
3. Create a CIFilter. When creating a filter, you configure a number of properties that depend on the filter you use.
4. Get the filter output. The filter gives you its output image as a CIImage, which you can convert to a UIImage using the CIContext, as shown below:
```swift
// 1
let fileURL = Bundle.main.url(forResource: "beauty", withExtension: "jpg")

// 2
let beginImage = CIImage(contentsOf: fileURL!)

// 3
let filter = CIFilter(name: "CISepiaTone")
filter?.setValue(beginImage, forKey: kCIInputImageKey)
filter?.setValue(0.5, forKey: kCIInputIntensityKey)

// 4
let newImage = UIImage(ciImage: (filter?.outputImage)!)
self.imageView.image = newImage
```
Let's go through this section step by step:
1. This line creates a URL object that holds the path to the image file.
2. Next, you create your CIImage with the CIImage(contentsOf:) constructor.
3. Next, you create your CIFilter object. The CIFilter constructor takes the name of the filter and a dictionary of keys and values for that filter. Each filter has its own unique keys and set of valid values. The CISepiaTone filter takes only two values: kCIInputImageKey (a CIImage) and kCIInputIntensityKey, a float between 0 and 1, which you give 0.5. Most filters have default values that are used if no value is provided. One exception is the input image, which must be provided because there is no default.
4. Getting the CIImage back out of the filter is as simple as reading its outputImage property. Once you have the output CIImage, you need to convert it to a UIImage; the UIImage(ciImage:) constructor does exactly that. Once you've converted it to a UIImage, you simply display it in the image view you added earlier.
Run the project and you'll see your image filtered with a sepia tone.
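As an aside, if you're not sure which input keys a filter accepts or what its default values are, you can ask the filter itself. This is a minimal sketch using the CIFilter `attributes`, `inputKeys`, and `filterNames(inCategory:)` APIs; the exact printed contents vary by filter and OS version:

```swift
import CoreImage

// List the built-in color-effect filter names (CISepiaTone is among them).
let names = CIFilter.filterNames(inCategory: kCICategoryColorEffect)
print(names)

// Inspect CISepiaTone's inputs: attributes describes each key's
// type, default value, and allowed range.
if let sepia = CIFilter(name: "CISepiaTone") {
    print(sepia.inputKeys)                        // e.g. ["inputImage", "inputIntensity"]
    print(sepia.attributes["inputIntensity"] ?? "n/a")
}
```

This is handy when the documentation for a filter's parameters isn't at hand.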
#### Putting the Context in Place
Before you go any further, you should know about an optimization. I mentioned earlier that you need a CIContext to apply a CIFilter, yet no such object appears in the example above. As it turns out, the UIImage(ciImage:) constructor does all that work for you: it creates a CIContext and uses it to filter the image. This makes the Core Image API very easy to use, but it has one major drawback: it creates a new CIContext every time it's used. CIContext instances are meant to be reused to improve performance. If you use a slider to update the filter value, as you will in this tutorial, creating a new CIContext on every change would be far too slow. Let's fix that now. Remove step 4 from the code you added to viewDidLoad() and replace it with the following:
```swift
// 1
let context = CIContext(options: nil)

// 2
let cgimg = context.createCGImage(filter!.outputImage!, from: filter!.outputImage!.extent)

// 3
let newImage = UIImage(cgImage: cgimg!)
self.imageView.image = newImage
```
Again, let's take a look at this section. Here, you set up a CIContext object and use it to render a CGImage. The CIContext(options:) constructor takes a dictionary of options, such as the color format or whether the context should run on the CPU or GPU. For this app the defaults are fine, so you pass nil for that argument. createCGImage(_:from:), called on the context with the supplied CIImage and rect, returns a new CGImage instance. Next, you use the UIImage(cgImage:) constructor to create a UIImage from the newly created CGImage, rather than directly from the CIImage as before. Note that there's no need to explicitly release the CGImage afterward, as there would be in Objective-C: in Swift, ARC releases Core Foundation objects automatically. Build and run, and make sure everything works as before. In this example, handling the CIContext creation yourself doesn't make much difference. But in the next section you'll see why it matters for performance, as you implement the ability to modify the filter dynamically!

#### Changing Filter Values
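As a sketch of the options you could pass instead of nil (the keys shown, `.useSoftwareRenderer` and `.workingColorSpace`, are real CIContextOption values; whether you need them depends on your app):

```swift
import CoreImage

// Default: Core Image picks the renderer, usually the GPU.
let defaultContext = CIContext(options: nil)

// Force CPU rendering (useful for background processing) and
// use sRGB as the working color space.
let cpuContext = CIContext(options: [
    .useSoftwareRenderer: true,
    .workingColorSpace: CGColorSpace(name: CGColorSpace.sRGB)!
])
```

Either context can then be reused for every createCGImage(_:from:) call.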
Let's add a slider; each time it changes, you'll redo the image filtering with a different value. You don't want to redo the whole process each time, though; that would be very inefficient and far too slow. In particular, you need to reuse the CIContext, because recreating it on every change would slow the app to a crawl. To make this possible, change the class so that some of the objects created in viewDidLoad() are preserved as instance variables. Add the following three properties to the ViewController class:
```swift
var context: CIContext!
var filter: CIFilter!
var beginImage: CIImage!
```
Change the code so that viewDidLoad() uses these properties instead of declaring new local variables, as follows:
```swift
beginImage = CIImage(contentsOf: fileURL!)

filter = CIFilter(name: "CISepiaTone")
filter?.setValue(beginImage, forKey: kCIInputImageKey)
filter?.setValue(0.5, forKey: kCIInputIntensityKey)

let outputImage = filter.outputImage

context = CIContext(options: nil)
let cgimg = context.createCGImage(outputImage!, from: outputImage!.extent)
```
Next, implement the slider's action method, which changes the inputIntensity value in the CIFilter's dictionary. Once you change this value, you need to repeat several steps:
1. Get the output CIImage from the CIFilter.
2. Convert the CIImage to a CGImage.
3. Convert the CGImage to a UIImage and display it in the image view.
Create the method amountSliderValueChanged(_:):
```swift
@IBAction func amountSliderValueChanged(_ sender: UISlider) {
    let sliderValue = sender.value
    filter.setValue(sliderValue, forKey: kCIInputIntensityKey)
    let outputImage = filter.outputImage
    let cgimg = context.createCGImage(outputImage!, from: outputImage!.extent)
    let newImage = UIImage(cgImage: cgimg!)
    self.imageView.image = newImage
}
```
#### Old Photo Effect
In this demo, you'll build a more refined old-photo effect, complete with sepia tone, a bit of noise, and a vignette:
```swift
func oldPhoto(img: CIImage, withAmount intensity: Float) -> CIImage {
    // 1: CISepiaTone sepia filter
    let sepia = CIFilter(name: "CISepiaTone")
    sepia?.setValue(img, forKey: kCIInputImageKey)
    sepia?.setValue(intensity, forKey: "inputIntensity")

    // 2: set up a filter that creates a random noise pattern
    let random = CIFilter(name: "CIRandomGenerator")

    // 3: alter the output of the random noise generator
    let lighten = CIFilter(name: "CIColorControls")
    lighten?.setValue(random?.outputImage, forKey: kCIInputImageKey)
    lighten?.setValue(1 - intensity, forKey: "inputBrightness")
    lighten?.setValue(0, forKey: "inputSaturation")

    // 4: cropped(to:) crops the output CIImage to the supplied rect
    let croppedImage = lighten?.outputImage?.cropped(to: beginImage.extent)

    // 5: combine the sepia output with the CIRandomGenerator output
    let composite = CIFilter(name: "CIHardLightBlendMode")
    composite?.setValue(sepia?.outputImage, forKey: kCIInputImageKey)
    composite?.setValue(croppedImage, forKey: kCIInputBackgroundImageKey)

    // 6: run a vignette filter on the composite output to darken the edges
    let vignette = CIFilter(name: "CIVignette")
    vignette?.setValue(composite?.outputImage, forKey: kCIInputImageKey)
    vignette?.setValue(intensity * 2, forKey: "inputIntensity")
    vignette?.setValue(intensity * 30, forKey: "inputRadius")

    // 7: return the output of the last filter
    return vignette!.outputImage!
}
```
Effect:
Let's parse the code above:
1. Set up the sepia filter as you did in the simpler scenario earlier. The floating-point value passed into the method sets the intensity of the sepia effect; this value will come from the slider.
2. Set up a filter that creates a random noise pattern.
The random generator doesn't take any parameters. You'll use this noise pattern to add texture to the final "old photo" look.
3. Alter the output of the random noise generator: grayscale it and lighten it up a little so it's less dramatic. Notice that the input image key is set to the outputImage property of the random filter. This is a convenient way to pass the output of one filter as the input of the next.
4. cropped(to:) takes an output CIImage and crops it to the supplied rect. Here you must crop the output of the CIRandomGenerator filter because it tiles on endlessly. If you don't crop it at some point, you'll get an error saying the filter has "infinite extent." CIImages don't actually contain image data; they describe a "recipe" for creating it, and nothing is computed until you call a method on a CIContext to actually process the data.
5. Combine the output of the sepia filter with the output of the CIRandomGenerator filter. This filter performs exactly the same operation as the "Hard Light" blend mode in Photoshop layers; most of the blend modes in Photoshop can be reproduced with Core Image.
6. Run a vignette filter on this composite output to darken the edges of the photo, using the slider's value to set the radius and intensity of the effect.
7. Finally, return the output of the last filter.
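As an aside, CIImage also provides applyingFilter(_:parameters:), a convenience API that makes chains like the one above more compact by applying a named filter and returning its output directly. A minimal sketch of a simplified sepia-plus-vignette version of the effect (same filter names and keys as in this tutorial; the function name is just for illustration):

```swift
import CoreImage

func quickOldPhoto(img: CIImage, intensity: Float) -> CIImage {
    return img
        .applyingFilter("CISepiaTone", parameters: ["inputIntensity": intensity])
        .applyingFilter("CIVignette", parameters: [
            "inputIntensity": intensity * 2,
            "inputRadius": intensity * 30
        ])
}
```

This skips the noise overlay, but shows how each call feeds one filter's output straight into the next without any intermediate CIFilter variables.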