This week I've been mostly trying to draw the actual NSView contents into the attached CALayer, and have it rendered via CARenderer.

Last week I had a proof of concept that this is possible, built on the work from the previous phases: I managed to draw some custom graphics into a CALayer, which was then drawn into an NSView-managed OpenGL context using a CARenderer. You can read that report here.
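For context, the CARenderer side of that proof of concept looks roughly like this. This is a hedged sketch, not my exact code: it assumes `glContext`, `layer`, and `view` already exist, and it follows the macOS CARenderer method names, which the GNUstep implementation mirrors.

```objc
// Sketch: render a CALayer tree into an already-current OpenGL context.
// `glContext`, `layer`, and `view` are assumed to be set up elsewhere.
CARenderer *renderer = [CARenderer rendererWithCGLContext: glContext
                                                  options: nil];
renderer.layer = layer;
renderer.bounds = NSRectToCGRect([view bounds]);

[renderer beginFrameAtTime: CACurrentMediaTime() timeStamp: NULL];
[renderer addUpdateRect: renderer.bounds];
[renderer render];
[renderer endFrame];
```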

The concept is relatively simple. In the drawLayer:inContext: delegate method, the NSView needs to draw its contents into the given context. However, the issue is that CALayer hands us a Core Graphics context, while an NSView only knows how to draw itself into an NSGraphicsContext.

There is a neat solution for this: the graphicsContextWithCGContext:flipped: method, which takes a CGContext and returns an NSGraphicsContext wrapping it. We then pass the created context into the displayRectIgnoringOpacity:inContext: method on the NSView, which causes it to draw itself into that context.
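Put together, the delegate method described above looks something like this sketch (assuming the NSView itself acts as the layer's delegate, so `self` is the view):

```objc
// Sketch: bridge CALayer's CGContext to AppKit drawing.
- (void) drawLayer: (CALayer *)layer inContext: (CGContextRef)ctx
{
  // Wrap the Core Graphics context in an NSGraphicsContext so that
  // AppKit drawing code can target it.
  NSGraphicsContext *gc =
    [NSGraphicsContext graphicsContextWithCGContext: ctx
                                            flipped: [self isFlipped]];

  // Ask the view to draw itself into the wrapped context.
  [self displayRectIgnoringOpacity: [self bounds]
                         inContext: gc];
}
```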

But it doesn't work. So what exactly doesn't work? Is the graphicsContextWithCGContext:flipped: method broken, or is the problem somewhere else?

Ivan came to help with a nice idea: let's dump the context's contents into a PNG image. This came out:

While it is ugly, a huge NSButton is exactly what I was trying to draw. So that part works. The issue isn't in drawing the NSView into a context; it is somewhere else.
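The debugging trick itself is worth sketching: draw the view into an offscreen bitmap context instead of the layer's context, then write the bitmap out as a PNG. This is an assumption-laden sketch, not the exact code we used; `view`, the dimensions, and the output path are all placeholders, and it relies on ImageIO's CGImageDestination API being available in the backend.

```objc
// Sketch: dump what the view draws into an offscreen context as a PNG.
size_t width = 400, height = 300;          // placeholder dimensions
CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(NULL, width, height, 8, 0, cs,
                                         kCGImageAlphaPremultipliedFirst);

NSGraphicsContext *gc =
  [NSGraphicsContext graphicsContextWithCGContext: ctx flipped: NO];
[view displayRectIgnoringOpacity: [view bounds] inContext: gc];

// Snapshot the bitmap and write it to disk.
CGImageRef img = CGBitmapContextCreateImage(ctx);
CFURLRef url = CFURLCreateWithFileSystemPath(NULL, CFSTR("/tmp/dump.png"),
                                             kCFURLPOSIXPathStyle, false);
CGImageDestinationRef dest =
  CGImageDestinationCreateWithURL(url, kUTTypePNG, 1, NULL);
CGImageDestinationAddImage(dest, img, NULL);
CGImageDestinationFinalize(dest);

CFRelease(dest);
CFRelease(url);
CGImageRelease(img);
CGContextRelease(ctx);
CGColorSpaceRelease(cs);
```

If the PNG shows the expected contents, the AppKit-side drawing is fine and the bug must be downstream, in how the backend consumes the context.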

Where? Who knows. It's somewhere in the Opal backend. There are a couple of remaining candidates. I've been digging through call stacks and working on finding the culprit, with some help from Ivan, who knows Opal.

Until next time,