
Modeling User Behaviors in Machine Operation Tasks for Adaptive Guidance

An adaptive guidance system that supports equipment operators requires a comprehensive model covering a variety of user behaviors, accounting for different skill and knowledge levels as well as rapidly changing task situations. In this paper, we introduce a novel method for modeling operational tasks that integrates visual operation records provided by users with diverse experience levels and personal characteristics. To this end, we investigated the relationships between visually observable user behavior patterns and skill levels under machine operation conditions. We analyzed 144 samples of two sewing tasks performed by 12 operators, recorded with a head-mounted RGB-D camera and a static gaze tracker. Behavioral features, such as the operator's gaze and head movements, hand interactions, and hotspots, exhibited significant trends associated with continuous skill improvement. We used a two-step method to model the diversity of user behavior: prototype selection followed by experience integration based on skill ranking. The experimental results showed that several features can serve as appropriate indices for evaluating user skill, as well as providing valuable clues for revealing personal behavioral characteristics. Integrating records from users with different skills and operational habits enabled the construction of a rich, inclusive task model that can flexibly adapt to diverse user-specific needs.
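To make the two-step modeling idea more concrete, the following is a minimal sketch, assuming each operation record is summarized as a fixed-length behavioral feature vector (e.g., gaze, head movement, and hand-interaction statistics). The greedy farthest-point prototype selection and the rank-weighted averaging used for experience integration are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def select_prototypes(records, n_prototypes):
    """Greedy prototype selection over behavioral feature vectors.

    records: (n_records, n_features) array of per-record behavioral features.
    Returns indices of records chosen as behavior prototypes, starting from
    the most central record and adding the record farthest from the current
    prototype set so that diverse behavior patterns are covered.
    """
    dists = np.linalg.norm(records[:, None, :] - records[None, :, :], axis=-1)
    prototypes = [int(np.argmin(dists.sum(axis=1)))]
    while len(prototypes) < n_prototypes:
        min_dist_to_set = dists[:, prototypes].min(axis=1)
        prototypes.append(int(np.argmax(min_dist_to_set)))
    return prototypes

def integrate_by_skill(records, skill_ranks):
    """Experience integration based on skill ranking.

    skill_ranks: 1 = most skilled, larger = less skilled. Records from
    higher-ranked operators receive larger weights in the averaged model.
    """
    weights = 1.0 / np.asarray(skill_ranks, dtype=float)
    weights /= weights.sum()
    return (weights[:, None] * records).sum(axis=0)

# Usage with random stand-in features (144 records, 16 hypothetical features).
rng = np.random.default_rng(0)
features = rng.normal(size=(144, 16))
proto_idx = select_prototypes(features, n_prototypes=5)
skill_ranks = rng.permutation(np.arange(1, 145))
task_model = integrate_by_skill(features, skill_ranks)
```

In this sketch, prototype selection keeps a small set of representative behavior records, while the skill-ranked weighting biases the integrated task model toward records from more proficient operators without discarding the behavioral diversity of less experienced users.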
