About

The Spark toolkit

Spark is a toolkit for the end-to-end processing, simulation and analysis of wildfires. Users can design custom fire propagation models by building on Spark’s computational fire propagation solver and incorporating various input, processing and visualisation components, each tailored for wildfire modelling.

The Spark fire simulation workflow: input, processing, visualisation and output.

Easily source input data

Many input and output data formats are supported, including text files, comma-separated values (CSV) files, most raster image types, ESRI raster and vector formats (vector data are converted to raster within the framework) and GDAL raster data types. Once the data layers are imported, the first (optional) step of the processing stage runs a user-defined initialisation model over the data layers. Possible uses of this stage include defining land classifications from input data sets or creating fuel map layers ready for use in the propagation solver.
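As a rough illustration of the kind of work an initialisation model can do, the sketch below maps hypothetical land-cover codes onto fuel classifications cell by cell. It is written as a standalone C function so it can be compiled on its own; the codes, class values and function name are assumptions for illustration, not part of Spark's API.

```c
#include <stddef.h>

/*
 * Illustrative sketch only: a per-cell mapping from a hypothetical
 * land-cover code layer to a fuel classification layer, the kind of
 * derivation an initialisation model might perform before the
 * propagation solver runs. Codes and class values are invented.
 */
void classify_fuel(const int *land_cover, int *fuel_class, size_t n_cells)
{
    for (size_t i = 0; i < n_cells; ++i) {
        switch (land_cover[i]) {
            case 10: fuel_class[i] = 1; break;  /* grassland  */
            case 20: fuel_class[i] = 2; break;  /* forest     */
            default: fuel_class[i] = 0; break;  /* unburnable */
        }
    }
}
```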

Configurable fire propagation model

The Spark propagation solver can simulate the propagation of any number of closed fire perimeters over an unbounded spatial area. Under the hood, the solver uses high-performance OpenCL technology, allowing it to run models quickly and scalably on both GPU-accelerated desktops and compute-cluster hardware. The underlying algorithm uses the raster-based level set method. A key feature of this method is the ability to control exactly how the fire perimeter is propagated: Spark allows the user to define the speed function that controls the outward propagation at each point on the perimeter. The speed function is input directly as a C function in the Spark framework. Examples of speed functions for several fuel types are available in the model library.
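To give a feel for what a speed function looks like, the sketch below writes one out as a standalone C function so it can be compiled and tested in isolation. In Spark itself the body is supplied directly to the framework; the layer names (wind_speed, slope) and the coefficients here are illustrative assumptions, not values from the model library.

```c
/*
 * Sketch of a speed function, written as a standalone C function.
 * The layer names (wind_speed, slope) and coefficients are
 * illustrative assumptions only.
 */
double perimeter_speed(double wind_speed, double slope)
{
    /* Base rate of spread plus a wind contribution (m/s). */
    double speed = 0.2 + 0.05 * wind_speed;

    /* Simple slope adjustment: spread faster uphill, slower downhill. */
    speed *= 1.0 + 0.1 * slope;

    /* The solver advances each perimeter point outward at this speed. */
    return speed;
}
```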

User-defined input layers

The speed function can be linked to any number of named user-defined layers. For wildfire modelling these layers will typically include fuel parameters, topography and meteorological conditions such as temperature and relative humidity. The speed function can be written concisely and clearly by referring to each layer by its given name. For example, if the propagation depends on the current moisture level from a user-defined layer named moisture, the speed function could be set to speed = 1 + 1/moisture. User-defined layers can be static, such as elevation maps, or time varying, such as gridded temperature or moisture predictions. The framework handles all spatial and temporal interpolation of the layers, ensuring that the speed function uses the correct value for each point on the perimeter.
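The moisture example above, filled out as a compilable sketch assuming two hypothetical layers (moisture and a time-varying temperature layer). The temperature adjustment is invented for illustration; Spark interpolates each layer in space and time before the speed function is evaluated.

```c
/*
 * The speed = 1 + 1/moisture example from the text, written out as a
 * standalone C function. `moisture` and `temperature` stand for
 * user-defined layers; the temperature adjustment is illustrative only.
 */
double perimeter_speed(double moisture, double temperature)
{
    /* Drier fuel spreads faster, as in the example in the text. */
    double speed = 1.0 + 1.0 / moisture;

    /* Illustrative adjustment for hot conditions (degrees C). */
    if (temperature > 30.0)
        speed *= 1.2;

    return speed;
}
```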


Spark also uses a classification layer, in which each value corresponds to a particular fuel type or other land classification within the domain. Each classification can be associated with a corresponding speed function, allowing multiple speed functions for different fuel types to be defined over the domain. For example, cells with a classification value of 1 may be grassland, with a corresponding grassland speed function, while cells with a classification value of 2 may be forest, with a different speed function. The framework automatically uses the speed function corresponding to the classification within each cell to update the fire perimeter.
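Spark performs this per-classification dispatch automatically; the standalone sketch below simply mimics that behaviour to show how one speed function per fuel class might fit together, using the grassland/forest example from the text. The formulas themselves are illustrative, not calibrated models.

```c
/*
 * Mimics the per-classification dispatch that Spark performs
 * automatically: one speed function per fuel class, selected by the
 * cell's classification value. Formulas are illustrative only.
 */
double grassland_speed(double moisture)
{
    return 1.0 + 1.0 / moisture;   /* fast-spreading grass fire  */
}

double forest_speed(double moisture)
{
    return 0.5 + 0.2 / moisture;   /* slower spread under canopy */
}

double perimeter_speed(int classification, double moisture)
{
    switch (classification) {
        case 1:  return grassland_speed(moisture);  /* grassland cells */
        case 2:  return forest_speed(moisture);     /* forest cells    */
        default: return 0.0;                        /* unburnable      */
    }
}
```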

On-the-fly visualisation and analysis

The framework can periodically suspend the simulation to allow analysis, visualisation or output of the simulation results. There are a number of built-in modules for data analysis, ranging from inspection of point values, data plotting and statistical calculations over the domain through to image analysis using OpenCV. This analysis can be performed on individual simulations or over ensemble sets of simulations. Both the propagation and the resulting analysis can be interactively visualised in a graphical user interface (GUI) using OpenLayers or a range of two- and three-dimensional display types.
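As a concrete example of a simple domain-wide statistic, the sketch below computes the burnt area at a given time from an arrival-time grid. The array layout, sentinel value and cell size are assumptions; Spark's built-in analysis modules would normally handle this kind of calculation.

```c
#include <stddef.h>

/*
 * Illustrative only: burnt area at time t from an arrival-time grid.
 * Assumes unburnt cells hold a negative sentinel value and that each
 * cell covers cell_area square metres.
 */
double burnt_area(const double *arrival_time, size_t n_cells,
                  double t, double cell_area)
{
    double area = 0.0;
    for (size_t i = 0; i < n_cells; ++i) {
        if (arrival_time[i] >= 0.0 && arrival_time[i] <= t)
            area += cell_area;
    }
    return area;
}
```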

Spark GUI example. Colour scheme shows arrival time.

Solid foundation

Spark is built on Geostack, an open-source platform for accelerated geospatial modelling and simulation, and is written in Python for compatibility with common GIS processing and data workflows. Spark uses scalable GPU-based processing, allowing it to run on computers ranging from desktops to supercomputer clusters. This processing speed allows rapid evaluation of changing input conditions such as wind changes, testing of suppression scenarios and construction of risk metrics.