This is the second post in my TimeStory for iPad Dev Notes series. The first post, many months ago, was called “Dev Journal”. Well, the app’s nearly done, and I didn’t end up journaling as I worked, but I did take notes which I still want to write up, so “Dev Notes” it is.
An iPad app can get “touch” input three ways: from direct touches on the screen, from an Apple Pencil, and from a pointer driven by an external mouse or trackpad. These each behave differently, but are all delivered by the system as touch events.
App code often doesn’t care. A button is pressed, some text is selected, or a scroll view is panned, and it doesn’t matter how. But when building custom gestures, or a direct drag-and-drop canvas like TimeStory’s main timeline view, these differences often matter very much. Here are a few of the things I learned during development.
(The code-level notes all deal with UIKit; while I do use SwiftUI to construct most of the rest of my app’s UI, the main timeline view and its gesture recognizer code is all UIKit.)
Dealing with touch precision
Any form of custom hit testing needs to be aware of how precise and accurate each touch actually is. For example, Apple Pencil can be treated almost like a Mac mouse cursor. The user can easily hit small targets and perform precise drags. Direct touch, on the other hand, is both imprecise (the touch has a rather large radius around its center) and inaccurate (neither that center point, nor that radius, are exact). Pointer actions sit in between; the iPadOS pointer is a touch-like disk, not a Mac-like arrow.
The key is to read UITouch.majorRadius (for touch precision) and .majorRadiusTolerance (for touch accuracy) and use them to test a radius around the center of the tap or drag or other gesture, rather than just testing that center point. This can produce multiple hits when targets are close together; resolving that may be as simple as choosing the target closest to the center, but sometimes an app has specific targets (like a draggable resize handle) which should take precedence over more general ones (like the image the handle is attached to). I iterated on this part of the code several times; it was useful having testers with different preferences (touch, Pencil, mouse) and different types of documents.
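In sketch form, the basic idea looks something like this. HitTarget and its priority field are illustrative stand-ins, not the app’s real model; the point is testing a disk around the touch center rather than the center alone.

```swift
import UIKit

// Illustrative only: a target is a frame plus a priority (e.g. resize
// handles outrank the items they're attached to).
struct HitTarget {
    let frame: CGRect
    let priority: Int

    func distance(to point: CGPoint) -> CGFloat {
        hypot(frame.midX - point.x, frame.midY - point.y)
    }
}

func target(for touch: UITouch, in view: UIView, among targets: [HitTarget]) -> HitTarget? {
    let point = touch.location(in: view)
    // majorRadius is the touch's size; majorRadiusTolerance is how far off
    // that estimate may be. Their sum gives a conservative hit radius.
    let radius = touch.majorRadius + touch.majorRadiusTolerance
    let hits = targets.filter {
        $0.frame.insetBy(dx: -radius, dy: -radius).contains(point)
    }
    // Specific targets beat general ones; ties go to the closest center.
    return hits.min { a, b in
        if a.priority != b.priority { return a.priority > b.priority }
        return a.distance(to: point) < b.distance(to: point)
    }
}
```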
Showing the right kind of menu
Per Apple’s guidelines, we should show edit menus in the horizontal style when tapping with a finger and in the vertical style when right-clicking with a mouse. I initially presented my (horizontal) edit menu directly, via presentEditMenu(with:), from the tap gesture callback, but that same callback fires on a left click. This meant that mouse users got the menu in both styles: horizontal from left click, vertical from right click! The fix was simply to inspect UITouch.type first.
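The check itself is small. Here’s a rough sketch, assuming the touch type was stashed in a lastTouchType property using the delegate trick from the next section, and that editMenuInteraction is a UIEditMenuInteraction already attached to the view:

```swift
@objc private func handleTap(_ recognizer: UITapGestureRecognizer) {
    // A left click arrives as the same tap gesture as a finger or Pencil tap,
    // but pointer users already get the vertical menu from a right click, so
    // only present the horizontal edit menu for direct and Pencil touches.
    guard lastTouchType != .indirectPointer else { return }

    let point = recognizer.location(in: recognizer.view)
    let configuration = UIEditMenuConfiguration(identifier: nil, sourcePoint: point)
    editMenuInteraction.presentEditMenu(with: configuration)
}
```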
Getting UITouch info from gesture recognizers
Unfortunately, UIGestureRecognizer and its subclasses don’t expose the underlying UITouch objects when delivering gesture events. The easiest way to capture this information is by implementing gestureRecognizer(_:shouldReceive:) in the delegate and saving the radius and type values from the incoming UITouch objects.
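A sketch of such a delegate (the class and property names here are illustrative, not the app’s actual code):

```swift
import UIKit

final class TimelineGestureDelegate: NSObject, UIGestureRecognizerDelegate {
    private(set) var lastTouchType: UITouch.TouchType = .direct
    private(set) var lastHitRadius: CGFloat = 0

    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldReceive touch: UITouch) -> Bool {
        // The recognizer won't hand us the UITouch later, so grab what we
        // need (type and radius) while it's still available.
        lastTouchType = touch.type
        lastHitRadius = touch.majorRadius + touch.majorRadiusTolerance
        return true
    }
}
```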
Ensuring Scribble works
Scribble is the feature of iPadOS which lets you handwrite into any text input area with your Apple Pencil. (I love this feature.) Normally, there’s nothing you have to do to support it; it just works.
In my case, however, I support editing event titles directly in the timeline. This places an editable UITextView with a transparent background right on top of my scrolling, draggable document view. When I first tried Scribble there, the Pencil’s movement would pan the scroll view or drag the underlying event, making the whole thing unusable.
Fortunately, this is easy to fix. UIScribbleInteractionDelegate offers callbacks when writing starts and ends; use them to disable and re-enable the appropriate gestures.
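The shape of the fix looks roughly like this, assuming a UIScribbleInteraction has been added to the text view, and with TimelineViewController, scrollView, and dragRecognizer standing in for the real names:

```swift
extension TimelineViewController: UIScribbleInteractionDelegate {
    func scribbleInteractionWillBeginWriting(_ interaction: UIScribbleInteraction) {
        // Keep Pencil strokes from panning the timeline or dragging the event.
        scrollView.panGestureRecognizer.isEnabled = false
        dragRecognizer.isEnabled = false
    }

    func scribbleInteractionDidFinishWriting(_ interaction: UIScribbleInteraction) {
        scrollView.panGestureRecognizer.isEnabled = true
        dragRecognizer.isEnabled = true
    }
}
```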
Distinguishing pans from drags
Unlike a Mac, an iPad uses exactly the same gesture for panning a scroll view as for dragging an object within that scroll view: you slide your finger across the screen. I found it frustrating when the interaction between the two wasn’t clear.
I decided to only allow dragging a selected object; if you want to drag something without selecting it first, a long press selects it and starts the drag in one motion. I then told the scroll view’s pan gesture recognizer not to start until my own drag recognizer failed (via require(toFail:)).
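The wiring is essentially one line, sketched here with dragRecognizer standing in for my custom drag recognizer:

```swift
// The custom drag recognizer lives on the timeline's content view.
contentView.addGestureRecognizer(dragRecognizer)

// The scroll view's built-in pan must wait until the drag recognizer fails,
// which it does when the touched object isn't selected.
scrollView.panGestureRecognizer.require(toFail: dragRecognizer)
```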
(This is one of many areas where I often went back to Keynote as my favorite benchmark app. Zoom in on a slide in Keynote, then drag your finger across an object; it scrolls if the object isn’t selected, and drags if it is. It’s always nice to have good benchmark apps.)
Taking advantage of double taps on the Apple Pencil (2nd generation)
The Apple Pencil (2nd generation) is the one with wireless charging and magnetic attachment. It has the ability to detect when you tap twice on the side of it; this interaction was designed to quickly switch tools within a drawing app or similar context.
My app has a variety of editing tools, arranged as a strip of buttons in the center of the toolbar. I made double-tapping the pencil cycle through them, left to right (with “no tool” being the N+1th state). Combined with Scribble, this makes it possible to do quite a lot of timeline editing without using anything but the pencil.
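The hookup for this is small; roughly the following, with TimelineViewController and advanceTool() as stand-ins for the real view controller and tool-cycling logic:

```swift
extension TimelineViewController: UIPencilInteractionDelegate {
    func enablePencilDoubleTap() {
        let interaction = UIPencilInteraction()
        interaction.delegate = self
        view.addInteraction(interaction)
    }

    func pencilInteractionDidTap(_ interaction: UIPencilInteraction) {
        // Cycle left to right through the toolbar's tools, with "no tool"
        // as the extra N+1th state before wrapping around.
        advanceTool()
    }
}
```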
(Unfortunately, Apple excluded this feature from the newer and cheaper USB-C model, creatively named “Apple Pencil (USB-C)”. I don’t know if this was purely for cost savings or if they’ve decided to phase the feature out; I hope it’s the former!)