FAQ

What is Spark?

Spark is a computational framework for wildfire spread prediction. It contains parts to read, write, analyse and display geospatial data, as well as a computational solver to predict the spread of a wildfire. You can read more about the system on the about page.

Are the applications on the download page ‘Spark’?

No, these are applications that demonstrate how Spark can be used and show what an end product built on Spark can look like. Spark is made up of many parts which can be customised into different applications, depending on how it needs to be used.

How is Spark customisable?

The different parts can be connected together to perform reading, writing and display for different applications. The spread rate of the fire can be programmed directly as computer code, and any number of named user-defined data layers can be used in the fire spread function. Users can also write customised libraries to add functionality to Spark.
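As an illustration only, the sketch below shows the kind of spread-rate function a user might write. The layer names (windSpeed, slope, fuelLoad) and the formula are hypothetical and do not reflect Spark's actual interface or any published fire-behaviour model.

    /* Illustrative only: a simple spread-rate function driven by named data
     * layers. The names and the formula are hypothetical, not Spark's API. */
    #include <math.h>
    #include <stdio.h>

    double spreadRate(double windSpeed, double slope, double fuelLoad)
    {
        double base = 0.05 * fuelLoad;              /* rate in still air on flat ground */
        double windFactor = 1.0 + 0.3 * windSpeed;  /* faster spread with the wind */
        double slopeFactor = exp(0.07 * slope);     /* exponential increase upslope */
        return base * windFactor * slopeFactor;     /* metres per minute (illustrative) */
    }

    int main(void)
    {
        printf("Rate: %.2f\n", spreadRate(5.0, 10.0, 1.5));
        return 0;
    }

In Spark itself a spread-rate function like this would be supplied to the solver rather than compiled as a standalone program, and any additional named data layers (for example fuel moisture) could be passed to it in the same way.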

How can I create a custom application built on Spark?

Currently, we are only allowing downloads of the demonstration applications. Spark is built on the CSIRO Workspace framework, and we hope to release the Spark Workspace plugin in the future. The Spark parts are individual operations in the Workspace, which can be connected into a logical workflow within the Workspace editor.

Which version of the Spark demonstrators should I use?

If you have a 64-bit operating system, use the 64-bit releases; otherwise, use the 32-bit releases.

How do I know whether my system meets the Spark system requirements?

The core part of Spark will run on almost any recent computer or server. However, there is no easy way to check whether a computer supports OpenCL without a third-party tool (such as OpenCL-Z). Most computers with dedicated graphics cards come with OpenCL support. Otherwise, free OpenCL drivers can be installed; please see the system requirements page for more information.
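If you are comfortable compiling a small program, the following sketch enumerates OpenCL platforms and devices using the standard OpenCL host API. It assumes an OpenCL SDK and driver are already installed (link with -lOpenCL on Linux and Windows, or the OpenCL framework on OS X).

    /* Lists the OpenCL platforms and the number of devices on each.
     * If this reports no platforms (or fails to link), OpenCL is not available. */
    #include <stdio.h>
    #ifdef __APPLE__
    #include <OpenCL/cl.h>
    #else
    #include <CL/cl.h>
    #endif

    int main(void)
    {
        cl_uint numPlatforms = 0;
        if (clGetPlatformIDs(0, NULL, &numPlatforms) != CL_SUCCESS || numPlatforms == 0) {
            printf("No OpenCL platforms found.\n");
            return 1;
        }

        cl_platform_id platforms[8];
        cl_uint count = numPlatforms < 8 ? numPlatforms : 8;
        clGetPlatformIDs(count, platforms, NULL);

        for (cl_uint i = 0; i < count; ++i) {
            char name[256];
            cl_uint numDevices = 0;
            clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof(name), name, NULL);
            clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_ALL, 0, NULL, &numDevices);
            printf("Platform %u: %s (%u device(s))\n", i, name, numDevices);
        }
        return 0;
    }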

Why does the Spark GUI demonstrator just show a white screen?

To run the GUI demonstrator, you will need to open a project. Two are included: click ‘Open’, then choose ‘proj1’ or ‘proj2’. To run the project, press the ‘Start’ button.

Why does pressing ‘Start’ in the Spark GUI just show a white screen?

It takes a few moments for the OpenCL driver to initialise. If you still don’t see anything after a few moments, there may be an OpenCL driver issue. To see any output, click ‘View’, then select ‘Show log’. The log window will appear at the bottom of the screen; it can be detached and resized for a better view.

When I try to run Spark I get an error message saying OpenCL.dll is missing. How do I fix this?

This means you do not have OpenCL support, which is required to run Spark GUI. If your machine has a graphics card (e.g. an NVIDIA or ATI graphics processor), the latest graphics drivers from the vendor’s website come packaged with OpenCL, so installing these should fix the problem. Alternatively, to run the program without a graphics card (i.e. on the CPU), you can install an appropriate software-based OpenCL platform; in this case we recommend the AMD APP SDK. Once this has been done, reinstall the version of Spark appropriate to your machine.

I’m getting an OpenCL error in the log, what does this mean?

Error ‘-11’: Some graphics drivers report OpenCL capabilities beyond what the graphics card can actually handle, so when the program is run the graphics card returns an error. We’re aware this is an issue with older NVIDIA cards and we’re looking into a solution.

Error ‘-54’: This appears to be an issue with running OpenCL on the CPU on the OS X platform. We are currently investigating it. If possible, it is recommended that Spark is run using the GPU on OS X.

Error ‘-1001’: This may occur if Spark is run on a Linux system using the beta AMD 3.0 APP SDK driver. We strongly recommend using the stable 2.9.1 driver if possible.

Please post any other errors on the forum, where our team will be able to look into the issue.