Archive for category CoreImage

Thinking about my tests

I’ve installed Yosemite and of course the first thing I did was to run my tests.

Almost every test failed. The generated images are all different. They look the same to my poor eyesight, but pixel values can be quite different: the compare tolerance had to be increased from 0 to 26* before an image would be identified as the same. I had previously only needed to do this when comparing images created from windows on different monitors. I think perhaps I need to have a think about exactly what it is I’m testing. These tests have saved me a lot of time and given me confidence that I’ve not been breaking stuff, but having so many break with an OS upgrade doesn’t help.
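For the curious, this is roughly the kind of comparison my tests do. A minimal sketch, not the actual test code: two 8-bit buffers count as the same image if no component differs by more than a tolerance. The function names and flat buffer layout are my own; the real tests pull the bytes out of a CGBitmapContext.

```swift
// Compare two 8-bit pixel buffers component by component.
func maxComponentDifference(_ a: [UInt8], _ b: [UInt8]) -> Int {
    precondition(a.count == b.count, "buffers must be the same size")
    var maxDiff = 0
    for (x, y) in zip(a, b) {
        maxDiff = max(maxDiff, abs(Int(x) - Int(y)))
    }
    return maxDiff
}

// Images "match" if every component is within tolerance.
func imagesMatch(_ a: [UInt8], _ b: [UInt8], tolerance: Int) -> Bool {
    return maxComponentDifference(a, b) <= tolerance
}
```

A tolerance of 0 means bit-identical; the Yosemite results needed it raised to 26.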

For now, beyond the image generation differences described above, the failing tests have informed me about the following changes to ImageIO and the CoreImage filters.

Information returned about functionality provided by ImageIO and CoreImage

ImageIO can now import three new formats: “public.pbm”, “public.pvr”, “”
ImageIO has lost one import format: “public.xbitmap-image”

I’ve no idea what these formats are and I’ve been unsuccessful at finding information about them.

ImageIO has added export formats: “public.pbm”, “public.pvr”, “”
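A sketch of how lists like these can be produced (this is an assumption about method, not my actual test code): ImageIO reports the UTIs it can read and write, so dumping and diffing these lists across OS releases shows added or removed formats.

```swift
import Foundation
import ImageIO

// UTIs ImageIO can import (read) on the current OS.
let importUTIs = (CGImageSourceCopyTypeIdentifiers() as NSArray as? [String] ?? []).sorted()
// UTIs ImageIO can export (write) on the current OS.
let exportUTIs = (CGImageDestinationCopyTypeIdentifiers() as NSArray as? [String] ?? []).sorted()

print("Import formats:", importUTIs.joined(separator: ", "))
print("Export formats:", exportUTIs.joined(separator: ", "))
```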

Apple has added these new CoreImage filters:

  • CIAccordionFoldTransition
  • CIAztecCodeGenerator
  • CICode128BarcodeGenerator
  • CIDivideBlendMode
  • CILinearBurnBlendMode
  • CILinearDodgeBlendMode
  • CILinearToSRGBToneCurve
  • CIMaskedVariableBlur
  • CIPerspectiveCorrection
  • CIPinLightBlendMode
  • CISRGBToneCurveToLinear
  • CISubtractBlendMode
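A sketch of how the new-filter list can be derived, assuming the same diffing approach as for the ImageIO formats: ask CoreImage for every built-in filter name on each OS release and compare the results.

```swift
import CoreImage

// All built-in CoreImage filter names on the current OS, sorted for diffing.
let filterNames = CIFilter.filterNames(inCategory: kCICategoryBuiltIn).sorted()
for name in filterNames {
    print(name)
}
```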

There are minor configuration or filter property changes to the filters listed below with a brief description of the change:

  • CIBarsSwipeTransition inputAngle given updated values for default and max. Identity attributes removed for inputWidth and inputBarOffset.
  • CIVignetteEffect inputIntensity slider min changed from 0 to -1.
  • CIQRCodeGenerator has spaces added to description of one property, and a description added for another.
  • CILanczosScaleTransform has a fix for the filter display name.
  • CIHighlightShadowAdjust inputRadius has minimum slider value changed from 1 to 0.
  • CICMYKHalftone inputWidth attribute minimum changed from 2 to -2. inputSharpness attribute type is CIAttributeTypeDistance not CIAttributeTypeScalar.
  • CICircleSplashDistortion inputRadius has a new identity attribute with value 0.1.
  • CIBumpDistortionLinear inputScale, inputRadius and inputCenter given slightly more rational default values.
  • CIBumpDistortion inputScale, and inputRadius are given slightly more rational defaults.
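The property changes listed come from each filter’s attributes dictionary; a sketch of how to inspect one of them (the key names are the real CoreImage constants, the choice of CIVignetteEffect is just an example from the list):

```swift
import CoreImage

// Dump the slider min/max for CIVignetteEffect's inputIntensity attribute.
// Diffing dictionaries like this across OS releases surfaces the changes above.
if let filter = CIFilter(name: "CIVignetteEffect"),
   let intensity = filter.attributes["inputIntensity"] as? [String: Any] {
    print("inputIntensity slider min:", intensity[kCIAttributeSliderMin] ?? "none")
    print("inputIntensity slider max:", intensity[kCIAttributeSliderMax] ?? "none")
}
```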

*This is comparing images created from an 8-bit-per-color-component bitmap context. So out of a range of 256 possible values, images generated on Mavericks can differ from ones generated on Yosemite by up to 26 of those 256 values. That’s huge.

Core Image Filter Rendering. Performance & color profiles

The Apple documentation for rendering a Core Image filter chain notes that allowing the filter chain to render in the generic linear color space is faster: if you need better performance and are willing to trade it off against better color matching, let the filter chain render in generic linear.

I thought I’d better look at the impact of this, both for performance and for color matching. I also wanted to see what difference it made whether the Core Graphics context that the filter chain rendered to was created with an sRGB color profile or a Generic Linear RGB profile, when the context bitmap was saved out to an image file.
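A sketch of the two destination bitmap contexts being compared; identical apart from the color space they are created with. The sizes are arbitrary, and the named color space constants are the modern Swift spellings of kCGColorSpaceSRGB and kCGColorSpaceGenericRGBLinear.

```swift
import CoreGraphics

// Make an 8-bit RGBA bitmap context tagged with the given color space.
func makeBitmapContext(space name: CFString) -> CGContext? {
    guard let space = CGColorSpace(name: name) else { return nil }
    return CGContext(data: nil, width: 640, height: 480,
                     bitsPerComponent: 8, bytesPerRow: 0, space: space,
                     bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
}

let srgbContext = makeBitmapContext(space: CGColorSpace.sRGB)
let linearContext = makeBitmapContext(space: CGColorSpace.genericRGBLinear)
```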

All the tests were done on my laptop with the following configuration:

OS: Mavericks 10.9.2
System information: MacBookPro non retina, model: MacBookPro9,1
Chipset Model:	NVIDIA GeForce GT 650M 500MByte.
Chipset Model:	Intel HD Graphics 4000
A 512GByte SSD, 16GByte RAM.

I installed the gfxCardStatus tool some time ago, which allows me to manually switch which card is used, and also informs me when the system automatically changes which card is in use. I used to get changes reported regularly, but after one of the Mavericks updates this happened much less. After that update the only consistent way to get the system to turn the discrete card on automatically was to have an external monitor plugged in. I think the OS is trying much harder to keep the discrete graphics card turned off.

I have the NSSupportsAutomaticGraphicsSwitching key in my Info.plist set to YES. I have tried setting the value to NO, and if I then run the tests, as long as a software render is not specified I’m informed that the system has turned the discrete graphics card on, but the CoreImage filter render performance is still poor. The consequence is that I’m not really sure the discrete graphics card is being used for these tests. Perhaps I’d get different results as to whether GPU rendering or software rendering was faster if I had a more complex filter chain; what I might be seeing here is the time needed to push the data to the graphics card and pull it back dominating the timing results.

First up: when the only difference in image generation is whether the images are rendered to a CGContext with an sRGB profile or a Generic Linear RGB profile, the images look identical when I view them in Preview. The reported profiles are different, though: the image generated from a context with Generic Linear RGB has a reported profile of Generic HDR profile, while the image from a context with an sRGB profile has a reported profile of sRGB IEC61966-2.1.

When the filter chain has the straighten filter rotate the image 180 degrees, the colors of the output image are exactly the same as the input image when viewed in Preview, no matter which options are used to generate the output image.

When the filter chain applies the box blur filter with a radius of 10 pixels, the image rendered in the Generic Linear RGB profile is lighter than the one rendered using the sRGB profile when viewing the output images in Preview. The sRGB render looks to be a better match for the original colors of the image; the generic linear RGB profile appears to lighten the image. The color change is not large and would probably be acceptable for real-time rendering purposes.

Setting kCIContextUseSoftwareRenderer to YES or NO when creating the CIContext makes no difference in terms of the color changes.
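A sketch of the context options being toggled in these tests. I’m using the modern typed CIContextOption keys here; at the time this was written the string constants kCIContextUseSoftwareRenderer and kCIContextWorkingColorSpace were used directly.

```swift
import CoreImage

// A CIContext forced onto the CPU path, rendering in generic linear RGB.
let linearSpace = CGColorSpace(name: CGColorSpace.genericRGBLinear)!
let softwareContext = CIContext(options: [
    .useSoftwareRenderer: true,
    .workingColorSpace: linearSpace
])

// A CIContext allowed to use the GPU, with the default working space.
let gpuContext = CIContext(options: [.useSoftwareRenderer: false])
```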

However, I get the opposite of what I’d expect with speed.

Asking the filter chain with the CIBoxBlur filter (radius 10) to render 200 times to a Core Graphics context with an sRGB color profile:

Software render using sRGB profile: 4.1 seconds
Software render using Linear Generic RGB profile: 5.3 seconds
GPU render using sRGB profile: 7.0 seconds
GPU render using Linear Generic RGB profile: 7.5 seconds

If I create a Core Graphics context with a Generic Linear RGB color profile then:

Software render using sRGB profile: 4.0 seconds
Software render using Linear Generic RGB profile: 5.3 seconds
GPU render using sRGB profile: 7.3 seconds
GPU render using Linear Generic RGB profile: 7.7 seconds
These results are completely turned around from the results I’d expect. If I were to accept them as unquestioned truth, then I’d have to decide to always just work using the sRGB profile and do all rendering in software, not worrying about the GPU unless I needed to offload work from the CPU.
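A hedged sketch of the timing loop behind the numbers above: build the CIBoxBlur filter once, then render its output repeatedly and time the loop. The function and parameter names here are mine, not the test suite’s actual code.

```swift
import CoreImage
import Foundation

// Time `iterations` renders of a CIBoxBlur (radius 10) applied to `image`.
func timeBoxBlurRenders(context: CIContext, image: CIImage,
                        iterations: Int = 200) -> CFTimeInterval {
    let blur = CIFilter(name: "CIBoxBlur", parameters: [
        kCIInputImageKey: image,
        kCIInputRadiusKey: 10.0
    ])!
    let start = CFAbsoluteTimeGetCurrent()
    for _ in 0..<iterations {
        // Creating a CGImage forces the filter chain to actually render.
        _ = context.createCGImage(blur.outputImage!, from: image.extent)
    }
    return CFAbsoluteTimeGetCurrent() - start
}
```

Running this against contexts created with different options (software versus GPU, sRGB versus generic linear working space) gives timings of the shape reported above.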

A later observation (Friday 2nd May 2014): when drawing text into a bitmap context while running off battery power, I’m informed that the system has switched temporarily to the discrete graphics card, and then informed soon after that it has switched back.


MovingImages CoreImage Transition Filter Example

I’ve written a number of ruby scripts that use MovingImages. One of the recent ones takes advantage of the CoreImage filter functionality that I’ve recently hooked into MovingImages. You’ll get to see this in the second alpha release, which I’m pleased to say will be out soon.

The script is called exactly as shown below:

./dotransition --count 30 --sourceimage "/Users/ktam/Pictures/20140422 Pictures/DSC01625Small2.JPG" --destinationimage "/Users/ktam/Pictures/20140422 Pictures/DSC01625Small2.JPG" --basename PageCurl --outputdir "~/Desktop/deleteme/" --backsideimage "/Users/ktam/Pictures/20140422 Pictures/DSC01625SmallCropped.jpg" --angle=-3.07 --transitionfilter CIPageCurlTransition --extent 0,0,463,694 --verbose

I then used a script, which you can download and which works with the first alpha release of MovingImages; it was called exactly as shown below:

./createanimation --delaytime 0.1 --outputfile ~/Desktop/PageTurningAnimation.gif --verbose ~/Desktop/deleteme/PageCurl*.tiff

The result of running both those scripts:

Gif animation where the same page is turned over forever

The MovingImages documentation shows the output images at each step of the filter chain. Scroll down past the generated json code to see the images.

The create animation script can be viewed here.

The do transition script can be viewed here.

And an earlier demonstration using the embossmask script

Moving Images

Backside image supplied to CIPageCurlTransition filter doesn’t take

Please see note at end.

I’ve not been able to get the “inputBacksideImage” key to work when setting the image to be displayed on the reverse side as a page is curled over. I’ve not seen any reports anywhere on the internet that this is broken, so I thought I’d just let people know here.

As of OS X 10.9.2 using Xcode 5.1 developer tools, this option doesn’t work.

I’ve posted sample code for a command-line tool that demonstrates the problem. This is the same code I used in my bug report to Apple. The sample code can be viewed as a gist on GitHub:

Note: This is not broken. The circle in the shading image needs to be partially transparent. The shading image is applied on top of the backside image and covers it up unless it is partially transparent. I’ve updated the code in the gist and everything works as it should.

CoreImage, CIPageCurlTransition, Cocoa, inputBacksideImage, broken, OS X


SSD versus HDD, Movie Frame Grabs and the importance of profiling

There was an e-mail to Apple’s cocoa-dev e-mail list that provoked a bit of discussion. The discussion thread starts with this e-mail to cocoa dev by Trygve.

Basically Trygve wanted to get better performance from his code for taking frame grabs from movies and drawing those frame grabs as thumbnails to what I call a cover sheet. He was using NSImage to do the drawing, and complained that, based on profiling, his code was spending 90% of its time in a method called drawInRect.



Using CoreImage on iOS and Mac OS X

WWDC 2011 Session 422 

You use CoreImage to filter images on a per-pixel basis.
Filters can be chained together.

Filters can have their filter matrices concatenated when they work with the same type of matrix. For example, the hue adjust filter and the modify contrast filter can be combined into a single processing matrix. When filters are very different and their processing is of a different type (cropping versus hue adjust, etc.), they can still be combined, and evaluation will still be delayed until the final output image is requested, but you won’t get the same level of optimisation as when the filters manipulate the pixel data in similar ways. CoreImage has a set of built-in filters that can be combined in various interesting ways.
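A small sketch of the lazy evaluation described above: chaining a hue adjust into a contrast adjust just builds a recipe; no pixels are processed until the final output image is rendered. The filter parameters are arbitrary example values.

```swift
import CoreImage

// A solid-color input image, cropped to a finite extent.
let input = CIImage(color: CIColor(red: 0.5, green: 0.3, blue: 0.2))
    .cropped(to: CGRect(x: 0, y: 0, width: 64, height: 64))

// First filter: hue adjust.
let hue = CIFilter(name: "CIHueAdjust", parameters: [
    kCIInputImageKey: input,
    kCIInputAngleKey: 1.0
])!

// Second filter: contrast, fed the first filter's (unrendered) output.
let contrast = CIFilter(name: "CIColorControls", parameters: [
    kCIInputImageKey: hue.outputImage!,
    kCIInputContrastKey: 1.2
])!

// Nothing has rendered yet; this call is where the work happens, and
// CoreImage is free to fuse the two per-pixel operations into one pass.
let rendered = CIContext().createCGImage(contrast.outputImage!, from: input.extent)
```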
