Apart from getting all my existing CA examples onto the iPhone, I've also been toying with the best way to build out the 'photo city' demo from WWDC 2008 (my next Core Animation screencast series). The basic idea of the demo was that you had a set of perhaps 30 or 40 images; the images were combined into cubes, and the cubes were used to make a 'city'. After getting a basic cube working I got distracted by some of the supporting code I wrote for the demo. Namely, I finally got around to porting the OpenGL trackball example code to Core Animation.
For those who are not familiar with the trackball example: the idea is that you have a transparent sphere around your scene, and you move the scene around by moving the trackball. As you move your finger to the right, it pushes this imaginary sphere around its center to the right (exposing the left side of the scene).
I'm not 100% sure this is the right API for such an object, but I was able to use it in a couple of examples for a course I'm working on. I will also be using it in one of the demos for my talk at iPhone Live. So while it might not be perfect, I figure it's good enough to post now. Please feel free to comment with what you think would be better.
Now on to documenting the TrackBall class. The idea is that you have a 2D viewport into a 3D scene; this viewport has a width and height (i.e., the CGRect that defines the layer's bounds). In this 3D world you construct an imaginary sphere, centered on the center of your scene, with a radius based on the smaller of the viewport's width and height. When the event begins (in touchesBegan:withEvent:) you initialize the trackball with the touch's location as the starting point. A vector is constructed from the center of the sphere to the touch (the depth dimension is calculated from the radius of the sphere). As the user moves her finger around on the screen, another vector is constructed from the center of the sphere to the current touch location (as received in touchesMoved:withEvent:). The cross product of these two vectors is the axis of rotation, and the angle between them is the magnitude of the rotation.
Practically, what all this means is that in the touchesBegan:withEvent: method you call the setStartPointFromLocation: method with the location of the touch (if you don't have multi-touch turned on for the view there will be only one touch in the touches set, so you can use the anyObject method to get the touch; code to follow shortly). That initializes the trackball so it knows the first vector (from the center of the sphere to the starting point). As the user drags his finger around on the screen you call rotationTransformForLocation: to get a CATransform3D. This transform encapsulates the rotation axis and angle, so you don't really have to grok it to use it (although it helps:). Next you set your layer's sublayerTransform property to this transform.
The scene contained in your layer will now rotate as if it existed in a sphere and you were moving that sphere around. It's a cool effect if you've never seen it before. If you want the trackball to remember where it is when the user lifts her finger, you simply call finalizeTrackBallForLocation: with the touch location from the touchesEnded:withEvent: method. Now on to some code. Here is the code to initialize the trackball:
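A sketch of what this initialization likely looks like, based on the description above. The post only names setStartPointFromLocation:, so the TrackBall initializer (initWithRect:) and the trackBall instance variable are my assumptions.

```objc
// Assumed: the view keeps a TrackBall ivar so it can be finalized
// later in touchesEnded:withEvent:.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Multi-touch is off, so there is only one touch in the set.
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    if (trackBall == nil) {
        // initWithRect: is a hypothetical initializer; the sphere's
        // radius would come from the view's bounds as described above.
        trackBall = [[TrackBall alloc] initWithRect:self.bounds];
    }
    [trackBall setStartPointFromLocation:location];
}
```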
In this example I'm keeping the trackball around and finalizing it in the touchesEnded:withEvent: method (we will see that shortly). Next up, I get the transform from the trackball in the touchesMoved:withEvent: method.
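A sketch of that method, assuming the trackBall ivar from the previous step. The rotationTransformForLocation: call is the one named in the text; applying its result to sublayerTransform is what rotates the scene.

```objc
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    // The transform encapsulates the rotation axis and angle.
    CATransform3D transform = [trackBall rotationTransformForLocation:location];
    self.layer.sublayerTransform = transform;
}
```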
Then in the touchesEnded:withEvent: method I finalize the trackball so it knows where it left off for the next event cycle.
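Again as a sketch under the same assumptions, finalizeTrackBallForLocation: records the accumulated rotation so the next drag continues from where this one ended:

```objc
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    // Remember the current orientation for the next touch sequence.
    [trackBall finalizeTrackBallForLocation:location];
}
```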
And finally, here is the code. Happy hacking!