We are world leaders in 3D Lidar-based simultaneous localisation and mapping (3D SLAM) in terms of research, commercialisation and impact. Our research in 3D Lidar-based SLAM led to the development of the Zebedee system and a suite of related technologies. GeoSLAM, which CSIRO co-founded, is our worldwide technology commercialisation partner.
The 3D SLAM algorithms and software we have developed allow us to generate highly accurate 3D maps of indoor and outdoor environments, including both built (artificial) and natural environments. Our system also automatically provides a highly accurate estimate of the trajectory followed by the sensor. As a result, this technology enables direct digitisation of real 3D landscapes into information that can be used for analysis, synthesis and decision-making.
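The scan-registration step at the heart of lidar SLAM can be illustrated with a minimal point-to-point ICP sketch. This is purely illustrative (function names, brute-force matching and the fixed iteration count are assumptions for the example, not our production pipeline):

```python
import numpy as np

def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # correct an improper (reflected) solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t

def icp(src, dst, iters=20):
    """Iterative closest point: align a new scan `src` to the map `dst`."""
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # nearest-neighbour correspondences (brute force, for clarity only)
        d = np.linalg.norm(cur[:, None] - dst[None], axis=2)
        matched = dst[d.argmin(axis=1)]
        R, t = best_fit_transform(cur, matched)
        cur = cur @ R.T + t                # apply the incremental transform
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

Chaining the per-scan transforms recovered this way is what yields the sensor-trajectory estimate described above.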
Application examples include Hovermap (a flying 3D mapping system), HeatWave (combined 3D and thermal mapping), autonomous unmanned ground vehicles (UGVs) and the Guardian suite of technologies, all of which build on our SLAM technology.
We have developed dependable, fully autonomous UAV systems suitable for real-world tasks. To allow UAVs to sense their surroundings and react accordingly, we added on-board sensing and autonomy. Data61 is one of three civilian research groups in the world to have demonstrated robust beyond-line-of-sight autonomous mission execution for UAVs, including a world-first autonomous beyond-line-of-sight (BLOS) infrastructure inspection mission in Project ResQu.
Data61 has also demonstrated fully autonomous invasive plant surveys in challenging, mountainous rainforest terrain. The Miconia UAV survey is more effective at detecting miconia than manned helicopter surveys, and Data61 is the sole provider of unmanned solutions for these surveys. Our autonomous UAVs range from smaller-scale electric multirotors (quadcopters or drones) to medium- and larger-scale helicopters (such as the 3.4 m Yamaha RMAX). Another highly successful technology we have developed is Hovermap, a lidar mapping payload that generates 3D point clouds of infrastructure, power lines, construction sites, stockpiles and telecom towers.
We are world leaders in combining high-accuracy 3D lidar mapping with other sensor modalities. We have created a new area in robotics and computer vision called Augmented World Models (AWMs), or 3D++. We are developing a comprehensive set of sensor fusion algorithms that generate world models combining 3D models obtained from lidar, stereo or other sensors with data from RGB, thermal, hyperspectral, environmental, gas and many other sensors. This allows us to generate AWMs with augmented 3D data for enhanced understanding of scenes, quantifying the world and collecting data beyond “our eyes”.
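One building block of this kind of fusion is projecting a 3D point cloud into a co-registered camera image and attaching per-point colour (or thermal, or hyperspectral) values. The sketch below shows a toy pinhole-camera version; the parameter names and calling convention are assumptions for illustration, not our actual API:

```python
import numpy as np

def colourise_points(points, image, K, R, t):
    """Attach per-pixel values from `image` to 3D `points`.

    points: (N, 3) in the lidar frame; image: (H, W, 3);
    K: 3x3 camera intrinsics; R, t: lidar-to-camera extrinsics.
    Returns (M, 6) rows of x, y, z, r, g, b for points that land in the image.
    """
    cam = points @ R.T + t                 # lidar frame -> camera frame
    in_front = cam[:, 2] > 0               # keep only points ahead of the camera
    uv = cam[in_front] @ K.T               # homogeneous pixel coordinates
    uv = uv[:, :2] / uv[:, 2:3]            # perspective divide -> (u, v)
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    H, W = image.shape[:2]
    ok = (u >= 0) & (u < W) & (v >= 0) & (v < H)
    return np.hstack([points[in_front][ok], image[v[ok], u[ok]]])
```

Swapping the RGB image for a thermal or gas-concentration raster augments the same geometry with a different modality, which is the essence of the 3D++ idea.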
Examples of the 3D++ technologies we have developed include HeatWave (3D + thermal mapping).
We develop autonomous unmanned ground vehicles (UGVs) for mining, industrial, manufacturing, agricultural, biosecurity/biodiversity, science survey and other applications. These vehicles provide autonomous smart sensing, mapping and inspection, and our mobile manipulators add adaptive sensing and object manipulation. This R&D area combines localisation and mapping methods, motion planning, obstacle detection and avoidance, and situation awareness, translating these capabilities into operational platforms that increase productivity and safety across a wide range of applications.
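The motion-planning side of this combination can be sketched with a textbook A* search over a 2-D occupancy grid. Real deployments plan with vehicle kinematics, 3D terrain and live sensor updates; this minimal grid version only illustrates the principle:

```python
import heapq

def astar(grid, start, goal):
    """A* over a 2-D occupancy grid (0 = free, 1 = obstacle).

    Returns the list of (row, col) cells from start to goal, or None.
    """
    rows, cols = len(grid), len(grid[0])
    openq = [(0, start)]                   # priority queue of (f-cost, cell)
    g = {start: 0}                         # best-known cost-to-come
    parent = {}
    while openq:
        _, cur = heapq.heappop(openq)
        if cur == goal:                    # reconstruct the path backwards
            path = [cur]
            while cur in parent:
                cur = parent[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0):
                ng = g[cur] + 1
                if ng < g.get(nxt, float("inf")):
                    g[nxt] = ng
                    parent[nxt] = cur
                    # admissible Manhattan-distance heuristic
                    h = abs(goal[0] - nxt[0]) + abs(goal[1] - nxt[1])
                    heapq.heappush(openq, (ng + h, nxt))
    return None                            # goal unreachable
```

In practice the occupancy grid is rebuilt continuously from the SLAM map, so obstacle avoidance and planning feed off the same localisation and mapping pipeline.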
The Robotics team has a very strong track record, with pull from the mining and manufacturing industries. In recent years its autonomous navigation capability has expanded to include industrial environments and environmental sensing. Examples of systems developed include the world-first automation of a mining dragline, a fully autonomous Load Haul Dump (LHD) vehicle, a fully autonomous 20-tonne Hot Metal Carrier (HMC) vehicle, autonomous vehicles for on-road and off-road navigation based on the Gator platform, the Seeker Science Rover and many other systems.
Our legged robot research focuses on the development of autonomous robots capable of traversing extreme environments (uneven, unstable and multi-type terrain) and complex indoor or confined spaces, with the goal of providing remote in situ sensing, mapping, sample acquisition and actuation. We have two foci: 1) small- to medium-scale legged robots designed for indoor operation or confined spaces, and 2) large-scale ultralight legged robots for challenging indoor and outdoor environments.
Our legged robots enable sensing, mapping (3D and 3D++) and actuation in extreme environments, including:
- Forest and rainforest floors
- Muddy riverbanks, beaches and mud flats
- Mountainous, rocky and unstable terrain
- Thick undergrowth, crops, and easily damaged terrain
- Confined spaces: inside ships, power plants, aeroplane wings
We develop advanced mechatronics systems (integrating communications, sensing, processing, actuation and other subsystems) to support all of our R&D areas, as well as other groups within Data61 and other CSIRO Business Units.
We have developed the robots and systems for 3D Lidar SLAM, 3D++, UAVs, Autonomous Vehicles, legged robotics, Guardian, etc. The team also supports the Sensor Networks Group, as well as projects for the Agriculture, Energy, Biosecurity, Oceans & Atmosphere and other CSIRO Business Units. The Advanced Mechatronics team also plays a core role in the design and implementation of the I3Hub at QCAT, as well as supporting I3Hubs in Melbourne and Tasmania.
A small subset of the world-leading research hardware we have developed includes the Miconia survey UAV research platform, the Zebedee 3D scanner, the world-first automated Load Haul Dump (LHD) vehicle, an intra-rumen gas sensing system for cattle, the Starbug submarine and the small Camazotz device used for continent-wide tracking of fruit bats.