One of the areas where SwiftUI's declarative architecture departs most fundamentally from UIKit's object-oriented approach is gesture handling. Today we will explore a particular difficulty this causes, and a potential solution.
First, we need an example of a view with some kind of drag interaction. For this, we will create a view that shows a colour, and allows the user to modify the opacity of the colour by dragging horizontally:
import SwiftUI

struct OpacityChangingView: View {
    let colour: Color

    @State private var opacity = 1.0

    var body: some View {
        GeometryReader { geometryReader in
            Rectangle()
                .foregroundColor(colour.opacity(opacity))
                .gesture(DragGesture(minimumDistance: 0)
                    .onChanged { value in
                        opacity = 1.0 - (value.location.x / geometryReader.size.width)
                    })
        }
        .frame(idealHeight: 80)
    }
}
We use GeometryReader to find the width of our view, and set the opacity according to how far across the view the user has touched. Note that our drag gesture has a minimum distance of 0. This allows it to recognise taps as well as drags.
One colour is fine, but multiple colours is more fun, so let's add some more inside a scroll view:
struct ContentView: View {
    private let colours: [Color] = [.black, .gray, .red, .orange, .yellow, .green, .mint, .teal, .cyan, .blue, .indigo, .purple, .pink, .brown]

    var body: some View {
        NavigationStack {
            ScrollView {
                VStack(spacing: 20) {
                    ForEach(colours, id: \.description) { colour in
                        OpacityChangingView(colour: colour)
                    }
                }
            }.navigationTitle("Colours")
        }
    }
}
Now we have a veritable smorgasbord of colours to play with.
Lovely stuff. Now let's scroll down our list of colours to see them in all their glory…
Oh dear. Our attempts to scroll the ScrollView fail as the touches are greedily gobbled up by our OpacityChangingViews.
UIGestureRecognizer?

What we have here is a good old-fashioned case of clashing gesture recognisers. We want our scroll view's pan gesture recogniser to take precedence when the user is scrolling vertically, so we need a way to cancel our OpacityChangingView's drag gesture when the user drags vertically instead of horizontally. In UIKit we could achieve this trivially by subclassing UIPanGestureRecognizer and setting the state property to .failed when detecting a vertical pan. By requiring the subclassed pan gesture to fail in order for the scroll view's pan gesture to recognise (which can be achieved via UIGestureRecognizerDelegate), we get the desired behaviour: horizontal panning recognised by our view, with vertical scrolling falling back to the scroll view's pan gesture recogniser.
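As a rough illustration of that approach (not code from this project; HorizontalPanGestureRecognizer, handlePan, opacityView, and scrollView are all invented for the sketch), the UIKit version might look something like this:

import UIKit

// Illustrative sketch: a pan gesture recogniser that fails as soon as it
// detects a predominantly vertical pan, freeing the scroll view's own pan
// gesture recogniser to take over.
final class HorizontalPanGestureRecognizer: UIPanGestureRecognizer {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
        super.touchesMoved(touches, with: event)
        // Only make the decision while the gesture is still undecided.
        guard state == .possible else { return }
        let translation = self.translation(in: view)
        if abs(translation.y) > abs(translation.x) {
            state = .failed
        }
    }
}

// Hypothetical wiring: the scroll view's pan gesture waits for ours to fail.
// The same failure requirement can also be expressed through
// UIGestureRecognizerDelegate's gestureRecognizer(_:shouldRequireFailureOf:).
let horizontalPan = HorizontalPanGestureRecognizer(target: self, action: #selector(handlePan))
opacityView.addGestureRecognizer(horizontalPan)
scrollView.panGestureRecognizer.require(toFail: horizontalPan)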
However, in SwiftUI we have no way to programmatically cancel gesture recognition: once a gesture recogniser begins, only it can decide when to stop. Creating a custom gesture is also out of the question, as the Gesture protocol contains functions with private type parameters. As such, we will have to find a different way to solve the problem.
First things first, we need to stop our DragGesture from immediately capturing touch input. We can do this by modifying the minimumDistance parameter, changing it from 0 to 20.
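The gesture declaration is otherwise identical to before; only the first argument changes:

.gesture(DragGesture(minimumDistance: 20)
    .onChanged { value in
        opacity = 1.0 - (value.location.x / geometryReader.size.width)
    })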
Now our scrolling works, so that's a step in the right direction. However, because our drag gesture's minimumDistance is no longer 0, it does not recognise taps. As such, we will also need to add a SpatialTapGesture to handle this case:
struct OpacityChangingView: View {
    let colour: Color

    @State private var opacity = 1.0

    var body: some View {
        GeometryReader { geometryReader in
            let dragGesture = DragGesture(minimumDistance: 20)
                .onChanged { value in
                    setOpacity(forTouchLocation: value.location, width: geometryReader.size.width)
                }
            let tapGesture = SpatialTapGesture()
                .onEnded { value in
                    setOpacity(forTouchLocation: value.location, width: geometryReader.size.width)
                }
            Rectangle()
                .foregroundColor(colour.opacity(opacity))
                .gesture(tapGesture.simultaneously(with: dragGesture))
        }
        .frame(idealHeight: 80)
    }

    private func setOpacity(forTouchLocation touchLocation: CGPoint, width: CGFloat) {
        opacity = 1.0 - (touchLocation.x / width)
    }
}
Now we have working scrolling, and we can change the opacity by tapping as well as dragging.
Whilst this appears to solve the problem in its entirety, there's one interaction unaccounted for: if the user taps and holds on one of the OpacityChangingViews, nothing happens. When our drag gesture's minimumDistance was 0, this action would have set the opacity on the touched view. However, a long press is recognised by neither SpatialTapGesture nor DragGesture, so there is nothing for us to respond to. In theory we could solve this problem by adding a LongPressGesture; however, this recogniser only reports whether a long press was detected, not where in the view the touch occurred. As such, we can't handle this interaction. Until SwiftUI gains a SpatialLongPressGesture, this problem will remain unsolved.
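To make the limitation concrete, here is a minimal sketch: LongPressGesture's value is simply a Bool indicating that the press completed, so there is no CGPoint we could pass to setOpacity(forTouchLocation:width:).

let longPressGesture = LongPressGesture(minimumDuration: 0.5)
    .onEnded { value in
        // `value` is just `true` when the press completes; the gesture
        // exposes no touch location, so we have nothing to set the opacity with.
        print("Long press detected: \(value)")
    }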
We may want a similar interaction in another part of our app, where dragging a view is used to customise a setting or value. As such, we will need to extract this gesture-handling behaviour to make it reusable. Fortunately, we can do this relatively easily with an extension on View and a ViewModifier:
extension View {
    func tapAndDragGestureRecognition(action: @escaping (CGPoint, TapAndDragViewModifier.GestureKind) -> Void) -> some View {
        modifier(TapAndDragViewModifier(action: action))
    }
}

struct TapAndDragViewModifier: ViewModifier {
    var action: (CGPoint, GestureKind) -> Void

    enum GestureKind {
        case tap
        case drag
    }

    func body(content: Content) -> some View {
        let dragGesture = DragGesture(minimumDistance: 20)
            .onChanged { value in
                action(value.location, .drag)
            }
        let tapGesture = SpatialTapGesture()
            .onEnded { value in
                action(value.location, .tap)
            }
        content.gesture(tapGesture.simultaneously(with: dragGesture))
    }
}
Here we have a TapAndDragViewModifier that performs the same gesture recognition as we saw earlier, but calls an action handler in response to the tap and drag gestures. It also passes a GestureKind along with the detected touch location, in case the type of touch is important at the call site. We can then update our OpacityChangingView as follows:
struct OpacityChangingView: View {
    let colour: Color

    @State private var opacity = 1.0

    var body: some View {
        GeometryReader { geometryReader in
            Rectangle()
                .foregroundColor(colour.opacity(opacity))
                .tapAndDragGestureRecognition { location, _ in
                    opacity = 1.0 - (location.x / geometryReader.size.width)
                }
        }
        .frame(idealHeight: 80)
    }
}
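If the kind of gesture does matter at the call site, the GestureKind parameter lets us branch. As a hypothetical example (the withAnimation choice here is our own, not part of the original design), inside the GeometryReader above we could animate discrete taps while tracking continuous drags directly:

Rectangle()
    .foregroundColor(colour.opacity(opacity))
    .tapAndDragGestureRecognition { location, kind in
        let newOpacity = 1.0 - (location.x / geometryReader.size.width)
        switch kind {
        case .tap:
            // A discrete tap animates to the new value.
            withAnimation { opacity = newOpacity }
        case .drag:
            // A continuous drag tracks the finger with no animation.
            opacity = newOpacity
        }
    }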
Solving layout, performance, and gesture-interaction issues in SwiftUI requires a fundamentally different approach from the one we take with UIKit. As we have seen today, sometimes we can't replicate exactly what we could do in UIKit, so we have to make the best of what is available. Perhaps with iOS 17 just around the corner, SwiftUI will reach feature parity with the UIKit gesture recognition system 🤞. Until then, we'll just have to be creative.