How to track finger position and touch force in iOS 9

In iOS 9 and 9.1, the UITouch class has been extended with several new capabilities:

  • the force property tracks how hard the user presses onto the screen (iOS 9)
  • the preciseLocationInView: method tracks positions more accurately (iOS 9.1)
  • altitude and azimuth properties track the orientation of the Apple Pencil (iOS 9.1)

Here’s how to track all these options in our view.

Intercepting Touch Events

A UIViewController inherits from the UIResponder class, so we can override the following three methods, which are called when a touch event begins, moves, or ends:

#pragma mark - Responders

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    
    [self handleTouches:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    
    [self handleTouches:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    
    [self handleTouches:touches withEvent:event];
}
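
For completeness, there is a fourth responder method: the system can cancel a touch sequence (for example, when an incoming call interrupts the app). A minimal sketch that routes cancellations through the same handler:

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    
    // the system cancelled this touch sequence (e.g. an incoming call)
    [self handleTouches:touches withEvent:event];
}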

These methods are called automatically for every touch event, so there’s no need for us to grab a reference to it ourselves. In my simple app I’ll display the numeric output during the touch event, but in real-world apps it would be easy to animate or track objects using the same principle.

It’s easy to get confused here, so please bear with me:
Each of these methods delivers a set of UITouch objects describing the position and other properties of the current touches. In addition, it also delivers a UIEvent that tells us when the touches happened and what type of event it was (aside from touches, there are also remote-control and motion events).
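
If you need the timing, the UIEvent also carries a timestamp property – the moment the event occurred, measured in seconds since system boot. A quick sketch:

// log when the event occurred and what kind it was
NSLog(@"Event type %ld at %.3f seconds since boot", (long)event.type, event.timestamp);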

To handle each touch event we can use a method like this:

- (void)handleTouches:(NSSet *)touches withEvent:(UIEvent *)event {
    
    // is it a touch?
    if (event.type == UIEventTypeTouches) {
        
        // for each touch update the label
        for (UITouch *currentTouch in touches) {
            [self trackTouch:currentTouch];
        }
    }
}

Because I’m a belt-and-braces guy, I’ll first ask if it’s actually a touch event. This is probably not strictly necessary, since these responder methods only ever receive touch events – but hey, old coding habits die hard. All our touches come in as an NSSet, so we may potentially have more than one touch at a time.

To react to all of them, we’ll fast enumerate through the set and track each touch in a separate method.
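
One caveat: by default, a UIView reports only a single touch at a time. If you want the set to contain several simultaneous touches, enable multi-touch on the view first – for example in viewDidLoad:

// allow the view to report several simultaneous touches
self.view.multipleTouchEnabled = YES;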

Reacting to touches (position)

Let’s see where the current touch event is in our view:

- (void)trackTouch:(UITouch *)touch {
    
    // track the current position
    CGPoint position;
    NSString *xPosition;
    NSString *yPosition;
        
    // use regular precision (iOS 9.0 and lower)
    position = [touch locationInView:self.view];
    xPosition = [NSString stringWithFormat:@"%f", position.x];
    yPosition = [NSString stringWithFormat:@"%f", position.y];
    
    // update position labels
    NSString *positionText = [NSString stringWithFormat:@"Position: X = %@ / Y = %@", xPosition, yPosition];
    self.positionLabel.text = positionText;
}

Each UITouch delivers its current position via locationInView, given as a CGPoint with x and y coordinates. We’ll read that out and populate some labels with this knowledge. While the user drags around on the screen, those labels are continually updated with the most recent position.
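
If you want to react to the movement itself rather than the absolute position, UITouch also remembers where it was during the previous event via previousLocationInView. A small sketch that computes the drag delta:

// how far did the touch move since the last event?
CGPoint current = [touch locationInView:self.view];
CGPoint previous = [touch previousLocationInView:self.view];
NSLog(@"Moved by %.1f / %.1f points", current.x - previous.x, current.y - previous.y);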

iOS 9.1 allows more precise tracking, both with a finger and with the Apple Pencil. This will undoubtedly come in handy on larger and higher-density screens (such as those of the iPhone 6s Plus and iPad Pro). Receiving this extra accuracy is easy:

// use precise location (iOS 9.1 and above)
position = [touch preciseLocationInView:self.view];
xPosition = [NSString stringWithFormat:@"%f", position.x];
yPosition = [NSString stringWithFormat:@"%f", position.y];

Instead of locationInView, we’ll use preciseLocationInView – everything else stays the same.
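
Note that preciseLocationInView only exists from iOS 9.1 onwards, so an app that also supports earlier versions should check for it at runtime before calling it. One defensive way to do that:

// fall back to the regular location on iOS 9.0 and earlier
if ([touch respondsToSelector:@selector(preciseLocationInView:)]) {
    position = [touch preciseLocationInView:self.view];
} else {
    position = [touch locationInView:self.view];
}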

How did the user touch the screen?

To find out whether the touch was created by a finger or the Apple Pencil, we can check the type property. I’m updating a label with this knowledge:

// how did the user touch the screen?
if (touch.type == UITouchTypeDirect) {
    self.touchTypeLabel.text = @"Touch Type: Finger";
} else if (touch.type == UITouchTypeStylus) {
    self.touchTypeLabel.text = @"Touch Type: Apple Pencil";
} else if (touch.type == UITouchTypeIndirect) {
    self.touchTypeLabel.text = @"Touch Type: Indirect";
}

The type property returns three possible values:

  • UITouchTypeDirect (created with a finger)
  • UITouchTypeStylus (created with the Apple Pencil)
  • UITouchTypeIndirect (purpose not yet documented)

Sadly I can’t tell you exactly what an indirect touch is. Thanks to Rick Byers, who told me that this could be used for tracking mouse clicks, or perhaps clicks generated by game controllers & co. Or perhaps it’s for mind control (a UIMindControl class?) – only the future will tell us more about this type of touch, as it is currently undocumented.

In the meantime, we can at least distinguish between fingers and styluses 🙂

Tracking 3D Touch Force

In iOS 9 we can also check how hard the user presses down on the screen:

// update force label (if 3D Touch is available)
if (self.traitCollection.forceTouchCapability == UIForceTouchCapabilityAvailable) {
    self.forceLabel.text = [NSString stringWithFormat:@"Force: %f", touch.force];
}

Since we can only do that on compatible devices, I’ll check if 3D Touch is available on this device before updating another label.
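
The raw force value is hardware-dependent. To get a device-independent reading between 0 and 1, divide it by the touch’s maximumPossibleForce property – a small sketch:

// normalise the force to a 0.0 – 1.0 range across devices
if (touch.maximumPossibleForce > 0) {
    CGFloat normalizedForce = touch.force / touch.maximumPossibleForce;
    self.forceLabel.text = [NSString stringWithFormat:@"Force: %.2f", normalizedForce];
}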

Tracking Apple Pencil

At the time of writing, the Apple Pencil has not been released yet, but the UITouch class already has properties to track its azimuth and altitude. I’ll tell you more about those in another article, as soon as I can get my hands on an Apple Pencil.
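
Based on the iOS 9.1 API alone (untested here, since I don’t have a Pencil yet), reading those values should look something like this – altitudeAngle describes how steeply the Pencil is held, and azimuthAngleInView gives its direction within the view’s plane:

// only stylus touches report meaningful angles
if (touch.type == UITouchTypeStylus) {
    CGFloat altitude = touch.altitudeAngle; // 0 = flat on the screen, π/2 = perpendicular
    CGFloat azimuth = [touch azimuthAngleInView:self.view];
    NSLog(@"Altitude: %.2f rad / Azimuth: %.2f rad", altitude, azimuth);
}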

Demo Project

Feel free to check out my demo project on GitHub. It implements all these events in one handy app.
