The Human Centric AI Seminars Series

The Human Centric Security team is running a new monthly series, “The Human Centric AI Seminars”, focusing on research topics in human-centred AI.
For more info, contact Kristen Moore or Tina Wu.
Free access for anyone interested in humans and AI.

Next seminar:

June 9, 13:00 AEST

Title: From perception to decision: what could human behaviours tell

Speaker: Dr Kun Yu (UTS)

Bio: Kun Yu received his BE degree from Beihang University in 2001 and his Master's by research from the Chinese Academy of Sciences in 2004. He joined Nokia Research Centre (Asia) as a research fellow and later a senior research scientist. In 2010 he moved to Australia to undertake his PhD at UNSW, and after graduation worked as a research fellow at National ICT Australia and then Data61, CSIRO. He has published in various HCI venues including IUI, CHI and Interact, holds over 30 international patents, and has accumulated more than 900 citations to date. Dr Yu currently leads the Human Performance Analytics (HPA) team at the Data Science Institute (DSI), UTS, and manages the Predator Lab, UTS. He also serves on the distinguished reviewer board of the ACM TIIS journal. For more info: https://profiles.uts.edu.au/Kun.Yu

Abstract: Immersed in a world of data and models, humans find their own ways to process the huge amount of information surrounding them. However, increasingly complicated machine learning methods and their dynamic nature make it difficult for humans to understand how these systems work and to take full advantage of their capabilities. Incorrect judgements about data systems occur from time to time, and in the worst cases users may avoid a data science technique altogether. By investigating human behaviours, we can identify the problems arising in human-data interaction and devise resolutions that improve the efficiency of human-system interaction. This seminar introduces our exploration of this field, in particular how human behaviours can be linked to perceptions and decisions, and applied in contexts such as human-machine collaboration and cybersecurity examination to guide better data system design.

Previous seminar:

If you missed it, the recording is available here: https://csiro.webex.com/recordingservice/sites/csiro/recording/59fc60f194fa1039b6bb005056bab5c0/playback

May 12, 13:00 AEST

Title: Understanding human tasks from wearable sensor signals

Speaker: Prof. Julien Epps https://research.unsw.edu.au/people/professor-julien-epps

Bio: Julien Epps received the BE and PhD degrees in Electrical Engineering from UNSW Australia, in 1997 and 2001 respectively. After working as a Senior Research Engineer at Motorola Labs and then as a Senior Researcher and Project Leader at National ICT Australia, he was appointed as a Senior Lecturer at UNSW Electrical Engineering and Telecommunications in 2007, where he is now Professor and Head of School. Prof Epps is also a Contributed Principal Researcher at Data61, CSIRO and a Scientific Advisor for Sonde Health. He has authored more than 250 publications and three patents, which have been cited more than 8000 times, and has supervised 17 PhD students to completion. He has delivered invited tutorials to major international conferences INTERSPEECH and IEEE SMC, and invited keynotes to the Int. Workshop on Context-Based Affect Recognition (part of IEEE ACII), the Int. Workshop on Audio-Visual Emotion Challenge (part of ACM Multimedia) and the Workshop on Eye Gaze in Intelligent Human Machine Interaction (part of ACM ICMI). Prof Epps is serving as an Associate Editor for IEEE Transactions on Affective Computing, recently coordinated an invited chapter on “Task Load and Stress” in the Wiley Handbook for Human-Computer Interaction, and had his team’s work profiled by Nature News in 2020.

Abstract: A ‘task’ is arguably the most basic unit of human activity and perhaps the most logical basis from which computing systems can understand humans, yet at present we have only extremely limited means by which to detect a change in task and to estimate the level of physical, mental and other types of load experienced during tasks. Recent activity in wearable computing promises change, with the opportunity to position non-invasive near-field sensors directly where they are most useful for task analysis, for example in front of the eye, near the mouth and fixed to the head. Head-mounted wearable sensors allow ‘always-on’ automatic analysis of tasks and behaviour based on these sensor signals, and hence automatic approaches to the long-studied but historically manually-intensive problems of workload assessment and task switching analysis. This seminar introduces a framework for automatic task analysis and highlights a number of findings from automatic processing of eye activity, speech and head movement that advance the state of the art towards realising it. Finally, some methods for automatically detecting and modelling low-level behaviour indicators in a longitudinal manner will be outlined, which pave the way towards a big data understanding of human behaviour.