Research

Capability Overview

|Advanced 3D Lidar-based Mapping and Localisation

We are world leaders in 3D Lidar-based simultaneous localisation and mapping (3D SLAM), in terms of research, commercialisation and impact. Our research in 3D Lidar-based SLAM led to the development of the Zebedee system and a suite of related technologies. GeoSLAM, which CSIRO co-founded, is our worldwide technology commercialisation partner.

The 3D SLAM algorithms and software we have developed allow us to generate highly accurate 3D maps of indoor and outdoor environments, covering both built (artificial) and natural settings. The system also automatically provides a highly accurate estimate of the trajectory followed by the sensor. As a result, this technology directly digitises real 3D landscapes into information that can be used for analysis, synthesis and decision-making.
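
To make the map-plus-trajectory idea concrete, the sketch below chains scan-to-scan alignments into a sensor trajectory and a global point map. It is a deliberately simplified 2D illustration using textbook point-to-point ICP; the function names and parameters are ours for illustration only and do not describe the actual CSIRO or GeoSLAM algorithms.

    import numpy as np

    def best_rigid_transform(src, dst):
        """Least-squares rotation R and translation t mapping src onto dst (2D)."""
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:        # guard against a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        return R, dst_c - R @ src_c

    def icp(src, dst, iterations=20):
        """Align scan src to scan dst; returns the cumulative rigid transform."""
        R_total, t_total = np.eye(2), np.zeros(2)
        current = src.copy()
        for _ in range(iterations):
            # Brute-force nearest neighbours: fine for a sketch, not for real scans.
            dists = np.linalg.norm(current[:, None, :] - dst[None, :, :], axis=2)
            matches = dst[dists.argmin(axis=1)]
            R, t = best_rigid_transform(current, matches)
            current = current @ R.T + t
            R_total, t_total = R @ R_total, R @ t_total + t
        return R_total, t_total

    def build_map(scans):
        """Chain scan-to-scan alignments into a trajectory and a global point map."""
        pose_R, pose_t = np.eye(2), np.zeros(2)
        trajectory, world_points = [pose_t.copy()], [scans[0]]
        for prev_scan, curr_scan in zip(scans, scans[1:]):
            R, t = icp(curr_scan, prev_scan)   # motion of the new scan relative to the last
            pose_R, pose_t = pose_R @ R, pose_R @ t + pose_t
            trajectory.append(pose_t.copy())   # estimated sensor position
            world_points.append(curr_scan @ pose_R.T + pose_t)  # scan in the map frame
        return np.array(trajectory), np.vstack(world_points)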

Application examples include:

  • Recording of homicide crime scenes (Queensland Police)
  • Mapping caves for the first time (Jenolan, Koonalda, Wellington, Cliefden)
  • Mapping dinosaur footprints and trackways (Perth)
  • Assessing beach erosion
  • Monitoring structural changes in mines (UK, Australia, Indonesia)
  • Mapping buildings for security (ahead of G20 summit)
  • Mapping nuclear facilities and UN building (Switzerland)
  • Generating art (Heide Museum project)
  • Assessing forest density and vineyard yields (Adelaide)
  • Recording fragile cultural heritage sites (Italy: the Tower of Pisa; Australia: prison barracks, old leper colonies, forts, woolsheds and a WWI submarine, among others; India: temples in Kashmir; historic towns in Hong Kong and Osaka, Japan)

Most notably, our SLAM technology underpins Hovermap (a flying 3D mapping system), HeatWave (3D + thermal mapping), our autonomous unmanned ground vehicles (UGVs) and the Guardian suite of technologies.

 

|Autonomous Vehicles: Unmanned Aerial Vehicles (UAVs)

We have developed dependable, fully autonomous UAV systems suitable for real-world tasks. Onboard sensing and autonomy allow these UAVs to perceive their surroundings and react accordingly. We are one of only three civilian research groups in the world to have demonstrated robust beyond-line-of-sight (BLOS) autonomous mission execution for UAVs, and we demonstrated a world-first autonomous BLOS infrastructure inspection mission in Project ResQu.

We have also demonstrated fully autonomous invasive plant surveys in challenging, mountainous rainforest terrain. The Miconia UAV survey is more effective at detecting miconia than manned helicopter surveys, and we are the sole provider of these unmanned survey solutions. Our autonomous UAVs range from smaller-scale electric multirotors (quadcopters or drones) to medium-scale helicopters and larger-scale helicopters (the 3.4 m Yamaha RMAX). Another highly successful technology we have developed is Hovermap: its lidar mapping payload generates 3D point clouds of infrastructure, power lines, construction sites, stockpiles, telecom towers and much more.

 

|Augmented World Models

We are world leaders in combining high-accuracy 3D lidar mapping with other sensor modalities, and we have created a new area in robotics and computer vision called Augmented World Models (AWMs), or 3D++. We are developing a comprehensive set of sensor fusion algorithms that allow us to generate world models combining 3D models obtained from lidar, stereo or other sensors with data obtained from RGB, thermal, hyperspectral, environmental, gas and many other sensors. This allows us to generate AWMs with augmented 3D data for enhanced scene understanding, quantifying the world and collecting data beyond what “our eyes” can see.
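
As a rough illustration of one such fusion step, the sketch below projects lidar points into a co-registered camera image (RGB, thermal or otherwise) and attaches the sampled pixel value to each point. The function name, calibration inputs and data layout are assumptions made for this example and are not CSIRO software.

    import numpy as np

    def augment_points(points_xyz, image, K, R, t):
        """Attach a per-point image value (e.g. colour or temperature) to lidar points.

        points_xyz : (N, 3) lidar points in the map frame
        image      : (H, W) or (H, W, C) co-registered sensor image
        K          : (3, 3) camera intrinsic matrix
        R, t       : rotation (3, 3) and translation (3,) taking map frame -> camera frame
        """
        cam = points_xyz @ R.T + t                 # points in the camera frame
        in_front = cam[:, 2] > 0                   # ignore points behind the camera
        uvw = cam @ K.T                            # pinhole projection (homogeneous)
        uv = uvw[:, :2] / np.clip(uvw[:, 2:3], 1e-9, None)
        u = np.round(uv[:, 0]).astype(int)
        v = np.round(uv[:, 1]).astype(int)
        h, w = image.shape[:2]
        valid = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
        values = np.zeros((len(points_xyz),) + image.shape[2:], dtype=image.dtype)
        values[valid] = image[v[valid], u[valid]]  # sample the image at each projection
        # Each augmented point is its 3D position plus the fused sensor channel(s).
        return np.hstack([points_xyz, values.reshape(len(points_xyz), -1)]), valid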

Examples of the 3D++ technologies we have developed include:

  • Colour Zeb: 3D Lidar + RGB. Broad application in the 3D mapping area.
  • HeatWave: 3D + Thermal + RGB. We are exploring applications in medicine, thermal mapping for manufacturing, construction and infrastructure surveys, and disaster assessment and recovery.
  • 3D + Gas: Can be used for mapping of gas emissions and plumes in indoor/underground structures (LNG plants, mines, etc.) and outdoor environments (coal seam gas fugitive emissions mapping, airborne pollution assessments, etc.).
  • In-Situ Hyperspectral Mapping (ISHM): 3D + Hyperspectral + RGB + Thermal. This technology has been identified by other CSIRO Business Units (Agriculture and Biosecurity) as essential for phenotyping assessment and for the identification of pests and diseases. One of our systems, AgScan3D+ (which combines 3D + Hyperspectral + RGB), is being used in joint projects with QLD DAF funded by HIA, in projects funded by Vinyculture Australia, and in phenotyping projects funded by the CSIRO Agriculture Business Unit.

 

|Autonomous Vehicles: Unmanned Ground Vehicles (UGVs)

We develop autonomous unmanned ground vehicles (UGVs) for mining, industrial, manufacturing, agricultural, biosecurity/biodiversity, science survey and other applications. These vehicles provide autonomous smart sensing, mapping and inspection, while our mobile manipulators add adaptive sensing and object manipulation. This R&D area combines localisation and mapping methods, motion planning, obstacle detection and avoidance, and situational awareness, translating these capabilities into operational, useful platforms that can increase productivity and safety across a wide range of applications.
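
The sketch below shows, in highly simplified form, how such a sense-plan-act loop can combine goal seeking with obstacle avoidance: a potential-field style rule on a 2D point robot, with the robot model, sensor input and gains invented purely for illustration.

    import numpy as np

    def avoidance_step(pose, goal, obstacle_points, v_max=1.0, influence=2.0, dt=0.1):
        """One control cycle: attract towards the goal, repel from nearby obstacles."""
        to_goal = goal - pose
        attract = to_goal / (np.linalg.norm(to_goal) + 1e-9)
        repel = np.zeros(2)
        for obstacle in obstacle_points:             # e.g. points detected by a lidar
            away = pose - obstacle
            distance = np.linalg.norm(away)
            if distance < influence:                 # only nearby obstacles contribute
                repel += (away / (distance + 1e-9)) * (influence - distance) / influence
        direction = attract + repel
        direction /= np.linalg.norm(direction) + 1e-9
        return pose + v_max * dt * direction         # new pose after this control cycle

    # Drive towards a goal while skirting a single detected obstacle.
    pose, goal = np.array([0.0, 0.0]), np.array([10.0, 0.0])
    obstacles = [np.array([5.0, 0.2])]
    for _ in range(300):
        pose = avoidance_step(pose, goal, obstacles)
        if np.linalg.norm(goal - pose) < 0.2:
            break
    print("final pose:", pose)

On a real platform, the pose would come from the SLAM capability described above, and this simple potential field would be replaced by the motion planning and situational awareness components listed in the paragraph.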

The Robotics team has a very strong track record, with strong pull from the mining and manufacturing industries. In recent years the capability in autonomous navigation has expanded to include industrial environments and environmental sensing. Examples of systems we have developed include the world-first automation of a mining dragline, a fully autonomous Load Haul Dump (LHD) vehicle, a fully autonomous 20-tonne Hot Metal Carrier (HMC) vehicle, autonomous vehicles for on-road and off-road navigation based on the Gator platform (such as the Woodside autonomous vehicle), the Seeker Science Rover, and many other systems.

 

|Autonomous Vehicles: Legged Robots

Our legged robot research focuses on developing autonomous robots capable of traversing extreme environments (uneven, unstable and mixed terrain) and complex indoor or confined spaces, with the goal of providing remote in situ sensing, mapping, sample acquisition and actuation.

We have two focus areas: small- to medium-scale legged robots designed for indoor operation or confined spaces, and large-scale ultralight legged robots for challenging indoor and outdoor environments.

Our legged robots enable sensing, mapping (3D and 3D++) and actuation in extreme environments, including:

  • forest and rainforest floors
  • muddy riverbanks, beaches and mudflats
  • mountainous, rocky and unstable terrain
  • thick undergrowth, crops and easily damaged terrain
  • confined spaces, such as the insides of ships, power plants and aeroplane wings.

 

|Advanced Mechatronics Systems

We develop advanced mechatronics systems (integrated systems combining communications, sensing, processing, actuation and other subsystems) to support all of our R&D areas, as well as other groups within Data61 and other CSIRO Business Units.

We have developed the robots and systems behind our 3D Lidar SLAM, 3D++, UAV, autonomous vehicle, legged robotics and Guardian work. The team also supports the Sensor Networks Group, as well as projects for the Agriculture, Energy, Biosecurity, Oceans & Atmosphere and other CSIRO Business Units. The Advanced Mechatronics team also plays a core role in the design and implementation of the I3Hub at QCAT, and supports the I3Hubs in Melbourne and Tasmania.

A small subset of the world-leading research hardware we have developed includes the Miconia survey UAV research platform, the Zebedee 3D Scanner, the world-first automated Load Haul Dump (LHD) vehicle, an intra-rumen gas-sensing system for cattle, the Starbug submarine and the small Camazotz device used for continent-wide tracking of fruit bats.

 

|Evolutionary Robotics

In conjunction with the Autonomous Design Testbed in the Active Integrated Matter Future Science Platform, we are actively pursuing several evolutionary robotics themes:

  • Evolution of environmentally specialised robot end effectors, for example legs and arms, that can be printed and attached to our robots on a per-mission basis for enhanced mission performance (a minimal sketch of this kind of evolutionary loop follows this list).
  • Integration of evolutionary techniques to perform ‘design exploration’ in a space of possible robot configurations.
  • Evolutionary approaches to sim-to-real transfer and closing the reality gap.
  • Experimentation with soft robotic systems, which can morph and flex to navigate extreme environments.
  • Development of testbeds that allow evolutionary robotics experiments to run reliably and repeatedly in hardware:
    • a testbed for aerial robots, allowing mission-specific controllers to be evolved safely and repeatedly for arbitrary UAVs with no human intervention required;
    • a testbed that performs high-dimensional optimisation on legged robots.
  • Automatic design of spiking neural networks (SNNs) for control in hardware, e.g. on FPGAs.
  • Design of spiking ensemble controllers.
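
To show the shape of the underlying loop, here is a minimal evolutionary sketch: a population of candidate leg designs (reduced here to segment-length vectors) is mutated and selected against a fitness score. The toy fitness function and all parameter values are placeholders; in practice the score comes from simulation or from the hardware testbeds listed above.

    import numpy as np

    rng = np.random.default_rng(0)

    def fitness(design):
        # Toy objective standing in for a simulated or hardware trial:
        # prefer a target total leg length while penalising heavy designs.
        return -abs(design.sum() - 1.2) - 0.1 * np.square(design).sum()

    def evolve(pop_size=20, n_genes=3, generations=50, mutation_scale=0.05):
        population = rng.uniform(0.1, 0.8, size=(pop_size, n_genes))
        for _ in range(generations):
            scores = np.array([fitness(design) for design in population])
            parents = population[np.argsort(scores)[-pop_size // 2:]]    # keep the best half
            children = parents + rng.normal(0.0, mutation_scale, parents.shape)
            children = np.clip(children, 0.05, 1.0)       # respect fabrication limits
            population = np.vstack([parents, children])   # elitist (mu + lambda) step
        return max(population, key=fitness)

    best = evolve()
    print("best segment lengths:", best)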

Our work has been featured on SCOPE and the ABC, in The Australian Financial Review and in Wired magazine, with publications in top IEEE and ACM conferences as well as in Nature Machine Intelligence.

 

 
