Understanding human tasks from wearable sensor signals

June 9th, 2021

If you missed it, a recording is available here: https://csiro.webex.com/recordingservice/sites/csiro/recording/59fc60f194fa1039b6bb005056bab5c0/playback

May 12, 2021, 13:00 AEST

Title: Understanding human tasks from wearable sensor signals

Speaker: Prof. Julien Epps  https://research.unsw.edu.au/people/professor-julien-epps

Bio: Julien Epps received the BE and PhD degrees in Electrical Engineering from UNSW Australia in 1997 and 2001, respectively. After working as a Senior Research Engineer at Motorola Labs and then as a Senior Researcher and Project Leader at National ICT Australia, he was appointed as a Senior Lecturer in UNSW Electrical Engineering and Telecommunications in 2007, where he is now Professor and Head of School. Prof Epps is also a Contributed Principal Researcher at Data61, CSIRO and a Scientific Advisor for Sonde Health. He has authored more than 250 publications and three patents, which have been cited more than 8000 times, and has supervised 17 PhD students to completion. He has delivered invited tutorials at the major international conferences INTERSPEECH and IEEE SMC, and invited keynotes at the Int. Workshop on Context-Based Affect Recognition (part of IEEE ACII), the Int. Workshop on Audio-Visual Emotion Challenge (part of ACM Multimedia) and the Workshop on Eye Gaze in Intelligent Human Machine Interaction (part of ACM ICMI). Prof Epps serves as an Associate Editor for IEEE Transactions on Affective Computing, recently coordinated an invited chapter on “Task Load and Stress” in the Wiley Handbook of Human-Computer Interaction, and had his team’s work profiled by Nature News in 2020.

Abstract: A ‘task’ is arguably the most basic unit of human activity, and perhaps the most logical basis from which computing systems can understand humans. Yet at present we have only extremely limited means of detecting a change in task and of estimating the level of physical, mental and other types of load experienced during tasks. Recent activity in wearable computing promises change, with the opportunity to position non-invasive near-field sensors directly where they are most useful for task analysis, for example in front of the eye, near the mouth and fixed to the head. Head-mounted wearable sensors allow ‘always-on’ automatic analysis of tasks and behaviour based on these sensor signals, and hence automatic approaches to the long-studied but historically labour-intensive problems of workload assessment and task switching analysis. This seminar introduces a framework for automatic task analysis and highlights a number of findings from automatic processing of eye activity, speech and head movement that advance the state of the art towards realising it. Finally, some methods for automatically detecting and modelling low-level behaviour indicators in a longitudinal manner will be outlined, which pave the way towards a big-data understanding of human behaviour.