Qt Quick 3D - XR Simple Touch Example

Demonstrates hand-tracking input in Qt Quick 3D Xr.

This example shows how to use the hand-tracking API in Qt Quick 3D Xr to interact with 2D and 3D objects in the scene. It follows the basic structure of the xr_simple example.

We start by creating a reusable component, TouchHand, which encapsulates all the logic we need for each hand. It is located in the separate sub-project xr_shared, so it can easily be reused in other projects.

 // Copyright (C) 2025 The Qt Company Ltd.
 // SPDX-License-Identifier: LicenseRef-Qt-Commercial OR BSD-3-Clause
 import QtQuick
 import QtQuick3D
 import QtQuick3D.Xr

 Node {
     id: root
     property color color: "#ddaa88"
     required property int touchId
     required property int hand
     required property XrView view

     property alias touchPosition: handController.scenePos
     property bool touchActive: false

     XrController {
         id: handController
         controller: root.hand

         property vector3d scenePos: view.xrOrigin.mapPositionToScene(pokePosition)

         onScenePosChanged: {
             const touchOffset = view.processTouch(scenePos, root.touchId)
             handModel.position = touchOffset
             root.touchActive = view.touchpointState(root.touchId).grabbed
         }
     }

     XrHandModel {
         id: handModel
         hand: root.hand
         materials: PrincipledMaterial {
             baseColor: root.color
             roughness: 0.5
         }
     }
 }

The component contains an XrController, giving us the 3D position of the index finger, and an XrHandModel to show where the hand is. The onScenePosChanged handler is where the magic happens. We call XrView.processTouch(), which does the heavy lifting: it tries to map the 3D touch position to a 2D position on an XrItem, and sends a touch event if it finds one. It then returns the offset from the 3D position to the touch point on the 2D surface. We use that offset to shift the position of the XrHandModel, giving the illusion that the hand is stopped by the surface.

Note: This effect will not work on the Apple Vision Pro, since the system shows the user's real hands, and the XrHandModel is not shown.
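
The component also exposes touchActive, which is set from the grabbed state returned by XrView.touchpointState(). This is a convenient hook for visual feedback. As a minimal sketch (not part of the example itself; the colors are arbitrary), the hand model could be tinted while its touch point is grabbed:

 TouchHand {
     view: xrView
     touchId: 0
     hand: XrHandModel.RightHand
     // Hypothetical feedback: highlight the hand while the touch point is grabbed
     color: touchActive ? "#ffcc66" : "#ddaa88"
 }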

We create two instances of the TouchHand component inside the XrOrigin, one for each hand:

 XrOrigin {
     id: theOrigin
     z: 50
     TouchHand {
         id: rightHandModel
         hand: XrHandModel.RightHand
         view: xrView
         touchId: 0
         onTouchPositionChanged: buttons.handleTouch(touchPosition)
     }
     TouchHand {
         id: leftHandModel
         hand: XrHandModel.LeftHand
         view: xrView
         touchId: 1
         onTouchPositionChanged: buttons.handleTouch(touchPosition)
     }
 }
 xrOrigin: theOrigin

Here, buttons is a group of 3D buttons that has a handleTouch function. (The implementation is not XR specific, so the details are not documented here.)
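To make the data flow concrete, here is a minimal sketch of what such a handleTouch function could look like. Everything in it is hypothetical, including the stand-in button and the distance threshold; the example's real button group is more elaborate.

 Node {
     id: buttons

     // A stand-in "button": a cube that lights up while the finger is near it
     Model {
         id: exampleButton
         source: "#Cube"
         scale: Qt.vector3d(0.1, 0.1, 0.1) // the built-in cube is 100 units wide
         property bool pressed: false
         materials: PrincipledMaterial {
             baseColor: exampleButton.pressed ? "orange" : "gray"
         }
     }

     function handleTouch(touchPos: vector3d) {
         // Treat the button as touched while the finger is within 10 cm of its center
         exampleButton.pressed = touchPos.minus(exampleButton.scenePosition).length() < 10
     }
 }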

We position the XrOrigin 50 centimeters from the origin of the scene, which should be a comfortable touching distance.

The rest of the scene contains some 3D models and an XrItem that reacts to touch events.
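An XrItem is a plane in the 3D scene that hosts regular 2D Qt Quick content, and the touch events synthesized by XrView.processTouch() are delivered to that content as ordinary touch events. A minimal sketch of such an item, with placeholder sizes and content rather than the example's actual scene:

 XrItem {
     width: 50 // size of the surface in the scene, in centimeters
     height: 50
     contentItem: Rectangle {
         width: 500 // 2D size in pixels, scaled to fit the item
         height: 500
         color: handler.pressed ? "lightsteelblue" : "white"
         TapHandler { id: handler }
         Text {
             anchors.centerIn: parent
             text: handler.pressed ? "Touched!" : "Touch me"
         }
     }
 }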

Example project @ code.qt.io