Humans are born into a three-dimensional world in which we constantly see and interact with 3D objects. Our perception is therefore strongly conditioned to respond to visual stimuli when we are immersed in interactive systems. With advances in display technology for augmented and virtual reality, multimodal immersive interfaces have the potential to be widely used to support visual analytics work. Immersive analytics allows large amounts of data to be investigated concurrently and complex data structures to be closely visualised in space. Immersion can therefore support users' visual perception in a natural way, which in turn helps them quickly identify areas of interest, meaningful patterns, anomalies, and relationships between artefacts that are hard to discover without spatial representations.
The integration of interaction techniques is one of the fundamental requirements of immersive analytics systems. These techniques allow users to interact directly with data through exploration, manipulation, filtering, and annotation activities. Furthermore, automatic data analysis processes based on learning models can be iteratively improved through users' feedback on their performance. Beyond classical 2D input and output devices such as monitors, keyboards, and mice, immersive analytics systems have to provide human-computer interfaces that respond to natural, intuitive user actions.
We are developing a collaborative immersive analytics system (CoDIVA) that supports various levels of immersion for visual analytics according to users' working conditions. Our system will provide:
- Interaction tools and user interfaces that facilitate visual analytics processes for users with different levels of expertise, ranging from interested novices and industrial partners to experienced experts and scientists.
- Distributed data storage and information management that help users collect, access, and share data sets at local and global scope.
- Multimodal interfaces for ubiquitous data access in a mixed reality framework for visual analytics, with the capability to support different levels of immersion in data visualisation and exploration.
In order to build a mixed reality framework for visual analytics supporting multimodal user interfaces, we must address several design challenges and technical issues. Many of these stem from limitations of current technologies in the Augmented Reality (AR) and Virtual Reality (VR) domains.
- One of the biggest challenges in immersive big-data visualisation is interactively and immersively visualising massive data sets on portable display devices (e.g., HMDs, portable AR glasses, tablets). Since these data sets can reach several terabytes, maintaining the high rendering frame rate that immersive visualisation requires is difficult. The problem stems from the costly computational tasks needed to let users explore and manipulate data and artefacts collectively, and to generate and render them on display devices in real time.
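One common way to keep frame rates interactive on resource-limited devices is to reduce geometric complexity before rendering, for example through level-of-detail techniques. The sketch below is illustrative only and not part of the system described here: it shows a minimal voxel-grid downsampling of a 3D point cloud, keeping one centroid per voxel so that far fewer primitives reach the renderer.

```python
import math
import random
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    """Reduce a point cloud by keeping one representative point
    (the centroid) per voxel of the given edge length."""
    buckets = defaultdict(list)
    for p in points:
        # Map each point to the integer index of its voxel.
        key = tuple(math.floor(c / voxel_size) for c in p)
        buckets[key].append(p)
    # Replace each voxel's points with their centroid.
    centroids = []
    for pts in buckets.values():
        n = len(pts)
        centroids.append(tuple(sum(axis) / n for axis in zip(*pts)))
    return centroids

# Illustrative data: 100,000 random points in a 10 m cube,
# reduced before being handed to the rendering pipeline.
random.seed(0)
cloud = [(random.uniform(0, 10),
          random.uniform(0, 10),
          random.uniform(0, 10)) for _ in range(100_000)]
reduced = voxel_downsample(cloud, voxel_size=0.5)
```

With a 0.5 m voxel over a 10 m cube, the output is bounded by 20³ = 8000 points regardless of input size; the voxel size becomes a tunable trade-off between visual fidelity and frame rate.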
- AR and VR interfaces can improve users' data exploration and engage them in visual analytics problem solving. However, an all-in-one-screen interface with the many menus, buttons, and configuration options available in a desktop application is difficult to reproduce in mixed reality. Moreover, portable and affordable display devices have several limitations: a limited field of view and obstruction problems that cause loss of situation awareness; unsuitability of traditional I/O devices such as mouse and keyboard; lagging and juddering in VR with HMDs; and limited computational performance.
- Collaboration in ubiquitous systems requires a careful design of interaction techniques between distant participants using various I/O devices with widely varying computational capabilities. Furthermore, the interaction techniques must ensure that limitations of hardware and infrastructure do not severely degrade the quality of collaboration.
Figure 2: Virtual Reality user interfaces for immersive analytics of honey bee behaviour: a CAVE-like system (left) and an Oculus Rift head-mounted display (right).