Human Factors

Human Factors, or Human Factors Engineering (HF/HFE), is the systematic assessment and accommodation of the human element in a human-machine interface.  It is a broad field, spanning both physical human factors (e.g. the operation of industrial machinery, or maintaining correct posture in an office environment) and cognitive human factors (e.g. human-centred design of in-vehicle information systems; also known as cognitive ergonomics).  The Cognitive Informatics team has a number of capabilities in this category.

Human-Machine Interaction

As automation becomes increasingly ubiquitous in everyday life, we investigate the design of systems in which humans and computer-based systems (or machines in the broader sense) co-operate in a functional and safe way.  In this sense, we strive to design systems that account for the human element, and in which the human can leverage their natural abilities through the appropriate use of automation.

One of the main goals of our team is to scientifically test assumptions and interpretations from the literature. To do this, we design and conduct experimental studies with human participants to test our hypotheses. We have published articles on subjects such as Fitts' Law, Visual Indexing Theory, and various image processing concepts, and how these relate to Human-Machine Interaction. Our Human Factors capability also contributes to many internal and external collaborations.
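For readers unfamiliar with Fitts' Law, the sketch below shows the Shannon formulation of the model, which predicts movement time from the distance to a target and the target's width. The intercept and slope values used here are purely illustrative assumptions, not parameters fitted from our studies.

import math

def fitts_movement_time(distance: float, width: float, a: float, b: float) -> float:
    """Predicted movement time (s) under Fitts' Law (Shannon formulation).

    distance -- amplitude of the movement to the target
    width    -- target width along the axis of motion
    a, b     -- empirically fitted intercept and slope
    """
    index_of_difficulty = math.log2(distance / width + 1.0)  # in bits
    return a + b * index_of_difficulty

# Illustrative parameters only; real values are fitted to experimental data.
print(fitts_movement_time(distance=0.30, width=0.02, a=0.1, b=0.15))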

Figure 1: Robot tele-operation pick and place task: a) experimental set-up (left), and b) actual system (right)

Assessment of operator workload

Over the past two years we have been developing a methodology for the assessment of cognitive workload that can be applied in a number of operational environments.  Traditional workload testing techniques, such as Instantaneous Situation Awareness and Subjective Workload Assessment, generally interrupt the task being assessed in order to obtain a rating.  As an alternative, we have been investigating methodologies including Electro-Dermal Activity, Electroencephalography, the Detection Response Task (DRT), and Pupillometry as means of gathering workload data without interrupting the task. In an ongoing collaborative effort (internal and external collaborations) we are developing tools for lightweight workload assessment to further investigate the relationship between automation, trust and workload.
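As a concrete illustration of the kind of lightweight workload measure we are interested in, the sketch below baseline-corrects a pupil-diameter trace, a common first step in pupillometry. The sampling rate, baseline window and synthetic data are illustrative assumptions, not our actual analysis pipeline.

import numpy as np

def baseline_corrected_dilation(pupil, fs, baseline_s=1.0):
    """Subtract the mean pupil diameter in a pre-stimulus baseline window.

    pupil      -- pupil-diameter samples for one trial (baseline, then task)
    fs         -- sampling rate in Hz
    baseline_s -- length of the pre-stimulus baseline in seconds
    Larger positive values in the result indicate stronger task-evoked dilation,
    which is commonly read as higher cognitive load.
    """
    n_baseline = int(baseline_s * fs)
    baseline = np.nanmean(pupil[:n_baseline])  # NaNs mark blink gaps
    return pupil - baseline

# Example with a synthetic 60 Hz trace: 1 s baseline followed by 5 s of task.
trial = np.random.default_rng(0).normal(3.5, 0.05, 360)  # diameters in mm
print(baseline_corrected_dilation(trial, fs=60.0).mean())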

Psycho-physiology

Psycho-physiology deals with measuring and interpreting physiological signals from humans. Such signals can be used, for example, to obtain an objective measure of a user study participant's reaction to certain events or stimuli. Furthermore, systems can be designed to leverage physiological signals as novel input modalities. We investigate both of these areas. Used as a measurement tool, such signals give direct insight into how the body reacts. While this reaction still has to be interpreted by researchers (e.g. an increase in sweat level can indicate an affective response), psycho-physiology allows direct, real-time measurement and user monitoring. Used as an interface, these signals afford the design of novel interaction modalities such as Brain-Computer Interfaces (BCI) (see Figure 2) or contact-less interaction based on eye-gaze tracking. Such systems may serve as a novel interface to computers or robots in situations where other interaction options are not available (e.g. interfaces for severely physically disabled people).
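To make the measurement use concrete: a skin-conductance (GSR/EDA) trace is commonly decomposed into a slow tonic level and fast phasic skin conductance responses (SCRs) that follow arousing events. The sketch below counts SCR peaks in the phasic component; the filter cut-off and amplitude threshold are illustrative assumptions rather than values from our studies.

import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def count_scrs(eda, fs, min_amplitude=0.02):
    """Count skin conductance responses (SCRs) in an EDA trace.

    eda           -- skin conductance samples in microsiemens
    fs            -- sampling rate in Hz
    min_amplitude -- minimum peak prominence (uS) counted as an SCR
    """
    # Isolate the fast phasic component with a gentle high-pass filter.
    b, a = butter(2, 0.05 / (fs / 2.0), btype="highpass")
    phasic = filtfilt(b, a, eda)
    peaks, _ = find_peaks(phasic, prominence=min_amplitude)
    return len(peaks)

# More (or larger) SCRs per minute are typically read as higher arousal.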

Figure 2: Electroencephalography (EEG) measurements to monitor brain activity.

Our work with psycho-physiological sensors includes:

  • Designing an EEG-based BCI for a motor rehabilitation system: This system also includes eye-gaze tracking as an additional input and interaction modality (a minimal sketch of the underlying EEG feature follows this list).
  • Eye-gaze tracking to measure changes in pupil dilation: We conduct research on using this information, also called pupillometry, to improve the accuracy of the EEG-based BCI, and as a measure for cognitive processes and operator workload.
  • Eye-gaze tracking for measuring visual-cognitive performance in response to natural as well as artificial, fully controlled stimuli.
  • Galvanic skin response (GSR) as an interactive measure of the user's affective state and a real-time system input (e.g. measuring fear responses to virtual spiders); and GSR as a real-time measure of operator workload as well as team workload.
  • Psycho-physiology-based human-robot interaction (HRI): we use physiological measures both to quantify user performance in HRI and as a control modality for HRI (see Figure 1).
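The EEG-based BCI work in the first item above typically relies on features such as the power in the mu band (roughly 8-12 Hz) over the motor cortex, which drops when a movement is imagined (event-related desynchronisation). Below is a minimal sketch of that feature; the band limits, window length and electrode choice are illustrative assumptions, and this is not the team's actual classifier.

import numpy as np
from scipy.signal import welch

def mu_band_power(eeg, fs, band=(8.0, 12.0)):
    """Mean spectral power in the mu band for one EEG channel.

    eeg  -- single-channel EEG samples (e.g. an electrode over motor cortex)
    fs   -- sampling rate in Hz
    band -- frequency band of interest in Hz
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return float(np.mean(psd[in_band]))

# A motor-imagery BCI compares this value between rest and imagery windows:
# a relative drop during imagery is the event-related desynchronisation it detects.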