CEN ISO/TS 9241-430:2023
Technical Specification
CEN/TC 122 Ergonomics
ICS: 13.100 Occupational safety. Industrial hygiene | 13.180 Ergonomics | 35.180 IT terminal and other peripheral equipment
Status: Published (stage 60.60, Standard published Mar 29, 2023)
This document provides guidance on the design, selection and optimization of non-contacting hand and arm gestures for human-computer interaction. It addresses the assessment of usability and fatigue associated with different gesture set designs and provides recommendations for approaches to evaluating the design and selection of gestures. This document also provides guidance on the documentation of the process for selecting gesture sets.
This document applies to gestures expressed by humans. It does not consider the technology for detecting gestures or the system response when interpreting a gesture. Non-contacting hand gestures can be used for input in a variety of settings, including workplaces and public settings, and when using fixed screens, mobile devices, or virtual reality, augmented reality or mixed reality devices.
Some limitations of this document are:
— The scope is limited to non-contacting gestures and does not include other forms of input. For example, combining gesture with speech, gaze or head position can reduce input error, but these combinations are not considered here.
— The scope is limited to non-contacting arm, hand and finger gestures, either unilateral (one-handed) or bilateral (two-handed).
— The scope assumes that all technological constraints are surmountable. Therefore, there is no consideration of technological limitations in interpreting ultra-rapid gestures, or gestures performed by people with different skin tones or wearing clothing of different colours or patterns.
— The scope is limited to UI-based command-and-control human-computer interaction (HCI) tasks and does not include gaming scenarios, although the traversal of in-game menus and navigation of UI elements are within scope.
— The scope does not include HCI tasks for which a clearly more suitable input method exists. For example, speech input is superior to gesture input for entering text.
— The scope includes virtual reality (VR), augmented reality (AR) and mixed reality (MR) and the use of head-mounted displays (HMDs).
— The scope does not include the discoverability of gestures but does include the learnability and memorability of gestures. It is assumed that product documentation and tutorials will adequately educate end users about which gestures are possible. Therefore, assessing gesture discoverability is not a primary goal of the recommendations in this document.