Developing technologies for SKA-Low

An instrument as large and complex as the SKA project requires a broad range of innovative technologies to make it work. We both lead and contribute to a number of technology development areas associated with the SKA telescopes.

We work with international partners on:

  • developing digital signal processing systems for the central signal processor (CSP), which will process the massive amounts of data generated by SKA-Low
  • writing software for the supercomputers that will process the high volume of data flowing from the SKA, contributing to the science data processing (SDP) and the SKA-Low monitor, control and calibration system (MCCS) teams
  • working in the assembly, integration and verification team, which is responsible for coordinating the testing and integration of all the telescope parts and its complex systems
  • inventing and testing new technologies for the SKA telescopes using precursor instruments including our ASKAP radio telescope.

Central signal processor (CSP)

What is the CSP?

The SKA-Low central signal processing (CSP) team is building the correlator and beamformer, colloquially known as the backend of the telescope. This system combines incoming digital signals from the telescope’s 512 antenna stations (each station is made up of 256 individual antennas) to generate 500,000 visibilities, the data that is processed to generate astronomical images.
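As a rough sanity check on the figures above, counting every station pair (including each station correlated with itself) and then the four polarisation products lands close to the quoted visibility count. Note that treating the 500,000 figure as pairs-times-polarisations is our assumption, not something the source states:

```python
# Rough check of the visibility count quoted above (ASSUMPTION: the
# figure counts station pairs, including auto-correlations, times the
# four polarisation products -- this counting is ours, not the source's).
n_stations = 512
n_pairs = n_stations * (n_stations + 1) // 2  # cross- plus auto-correlations
n_pols = 4                                    # XX, XY, YX, YY

n_visibilities = n_pairs * n_pols
print(n_pairs)          # 131328 station pairs
print(n_visibilities)   # 525312, i.e. ~500,000 as quoted
```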

The Central Signal Processor (CSP) for the SKA-Low telescope is responsible for accepting data from the 512 stations, generating a range of astronomical data products (e.g. visibility spectra and Very Long Baseline Interferometry tied-array voltage beams), performing Pulsar Search and Pulsar Timing processing in real time, and forwarding the resulting data to the Science Data Processor.

The CSP will allow SKA-Low to operate concurrently in two modes, ‘imaging’ and ‘non-imaging’. The SKA-Low telescope will operate either as one homogeneous array or as up to 16 separate ‘sub-arrays’, each programmable as a separate conceptual telescope in terms of beams, frequency selection and the setting of configurable imaging and non-imaging parameters.

“Imaging mode” refers to dividing the full 300 MHz of bandwidth (50 MHz to 350 MHz) into more than 55,000 frequency channels and cross-correlating every pair of stations in the array (including correlating each station with itself, known as auto-correlation) for each frequency channel.

These correlation products are then delivered to the Science Data Processor (SDP) for processing into high-quality continuum and/or spectral-line ‘images’. In addition, each subarray can be configured in frequency ‘zoom mode’ (to zoom in on a smaller range of frequencies at much higher frequency resolution), or operate on sub-stations (dividing a single station into smaller units).
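The per-channel cross-correlation described above can be sketched as a toy ‘FX’ correlator: channelise each station’s voltage stream with an FFT (the “F” stage), then cross-multiply and time-average (the “X” stage). The sizes, signals and NumPy implementation below are purely illustrative; the real CSP performs this in FPGA firmware at vastly larger scale (the quoted >55,000 channels across 300 MHz implies channel widths of roughly 5.5 kHz):

```python
import numpy as np

# Toy FX correlator for one pair of stations (illustrative sizes only).
rng = np.random.default_rng(42)
n_channels = 64     # real SKA-Low CSP: >55,000 channels
n_spectra = 1000    # number of FFT blocks averaged together

# Simulate two stations seeing a common sky signal plus independent noise.
common = rng.normal(size=n_channels * n_spectra)
x = common + 0.5 * rng.normal(size=common.shape)  # station A voltages
y = common + 0.5 * rng.normal(size=common.shape)  # station B voltages

def channelise(v):
    # "F" stage: split the stream into blocks and FFT each block.
    return np.fft.rfft(v.reshape(n_spectra, n_channels), axis=1)

X, Y = channelise(x), channelise(y)

# "X" stage: cross-multiply and accumulate over time.
visibility = (X * np.conj(Y)).mean(axis=0)  # cross-correlation spectrum
autocorr = (X * np.conj(X)).mean(axis=0)    # station A auto-correlation

print(visibility.shape)  # (33,) channels for a 64-point real FFT
```

The correlated sky signal survives the averaging as real-valued cross-power, while the independent noise averages away; that is what makes the visibilities usable for imaging.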

The “non-imaging mode” comprises a range of special purpose astronomical signal processing tasks:

  • Pulsar Search, where the telescope will be capable of forming up to 504 individual pulsar search beams
  • Pulsar Timing, where the telescope will be capable of forming 16 individual pulsar timing beams
  • VLBI, where the telescope will be capable of forming 4 VLBI beams.
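The tied-array beams used for pulsar and VLBI work are, in essence, phased sums of the station signals: each station is phase-shifted to compensate for its geometric delay toward the target, then all stations are added coherently. The sketch below uses made-up geometry, a single test tone and a narrowband approximation purely to show the idea:

```python
import numpy as np

# Delay-and-sum (tied-array) beamforming sketch. All numbers are
# illustrative, not SKA-Low design values.
rng = np.random.default_rng(0)
n_stations, n_samples = 8, 4096
freq = 100e6                        # 100 MHz tone, within the 50-350 MHz band
t = np.arange(n_samples) / 800e6    # assumed 800 MS/s sample clock

# Each station sees the same sky signal, delayed by its geometry,
# plus its own receiver noise (narrowband: delay modelled as a phase).
delays = rng.uniform(0, 50e-9, n_stations)      # per-station delay (s)
signal = np.exp(2j * np.pi * freq * t)
voltages = np.array([signal * np.exp(-2j * np.pi * freq * d) for d in delays])
voltages += 0.3 * (rng.normal(size=voltages.shape)
                   + 1j * rng.normal(size=voltages.shape))

# Beamform: conjugate-phase each station back into alignment, then sum.
weights = np.exp(2j * np.pi * freq * delays)[:, None]
beam = (weights * voltages).sum(axis=0) / n_stations

# For comparison, an unphased sum suppresses the signal.
unphased = voltages.sum(axis=0) / n_stations
print(np.abs(beam).mean(), np.abs(unphased).mean())
```

The phased sum recovers roughly unit signal amplitude while averaging down the per-station noise, which is why beamforming concentrates sensitivity in one direction.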

CSP Low accepts commands and ‘metadata’ (configuration and status information) from, and sends status and other information and ‘metadata’ to, the Telescope Monitoring and Control (see the following figure).

A single CSP chassis, comprising 20 Xilinx™ Alveo™ U50 FPGA cards installed in a computer server. The SKA-Low CBF will consist of many such servers.

The SKA-Low Correlator and Beamformer is called ‘Atomic COTS’; instead of using bespoke hardware, Atomic COTS is based on commercial-off-the-shelf (COTS) equipment, at the heart of which is the Xilinx™ Alveo™ Field Programmable Gate Array (FPGA) processing card designed for data-centre computing.

The system also includes firmware and control software. The firmware is the ‘program’, loaded onto each Alveo™ card, that describes the logic elements implemented on the FPGA hardware to carry out the necessary signal processing functions. It is mostly developed in VHDL, a low-level hardware description language that provides efficient utilisation of FPGA resources and low power consumption.

The control software, Local Monitor and Control (LMC), is being developed separately using the SKA Tango™ monitoring and control framework.

A team called “Perentie”, led by our Dr Grant Hampson, works as an international collaboration of CSIRO and ASTRON to design and prototype the SKA-Low Correlator and Beamformer (CBF) system. This system consists of FPGA hardware, firmware (code that executes on the FPGA) and software to control the system. ASTRON and CSIRO both have a long history in the development of astronomical signal processing systems.

Pulsar Search (PSS) and Pulsar Timing (PST) are being developed in partnership with institutions around the world, and the focus for CSIRO is on the SKA-Low CBF system.

In the lead-up to construction, the engineering teams are connecting up, or ‘integrating’, the various parts, or ‘sub-elements’, of the backend which the team has developed, and then running tests to see if the system works on a small scale, before scaling up to the full system.


Our SKA-Low Prototype System Integration facility (SKA-Low PSI) in Marsfield, Sydney, aimed at accelerating the development of SKA-Low.

CSIRO’s SKA engineering team transformed a space in its Marsfield facility, previously used to test technologies for the ASKAP radio telescope, into the SKA-Low Prototype System Integration facility (SKA-Low PSI) – aimed at accelerating the development of SKA-Low.

The SKA-Low PSI mimics the Murchison Radio-astronomy Observatory’s (MRO) ‘super-computing’ control building – a major centre of telescope control, monitoring, signal processing and communications. The MRO is located in outback Western Australia, and for now, this Sydney-based facility offers an accessible location for prototype testing.

In March 2020, the CSP team integrated two SKA-Low backend ‘sub-elements’: the Correlator and Beamformer, and the Pulsar Timing Engine (which measures the arrival time, frequency and pulse characteristics of known pulsars to high accuracy). The test was successful and the team was delighted to receive positive feedback on their progress from the SKA Deputy Director General.

Science data processing (SDP)

What is SDP?

One of the major technological challenges of the SKA is to be able to process the vast amount of data that comes from each of the telescopes. At the Observatory sites, each telescope will produce up to ~5 Tb/s (~700 GB/s) of measurement data, equivalent to downloading ~200 high-definition movies in one second.
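A quick unit check on those figures (note the terabits/gigabytes distinction: 5 Tb/s divides by 8 to give bytes, so the ~700 GB/s quoted is a generous rounding of the same order of magnitude; the ~3 GB per HD movie is our illustrative assumption):

```python
# Back-of-envelope check of the data rate quoted above.
rate_bits = 5e12             # ~5 Tb/s per telescope (terabits per second)
rate_bytes = rate_bits / 8   # convert bits to bytes

hd_movie_gb = 3              # ASSUMPTION: ~3 GB per HD movie, for illustration
movies_per_second = rate_bytes / (hd_movie_gb * 1e9)

print(rate_bytes / 1e9)           # 625.0 GB/s
print(round(movies_per_second))   # 208, i.e. ~200 movies per second
```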

In Australia these data will be transported via dedicated fibre from the Murchison Radio-astronomy Observatory to the SKA-Low science processing centre located in Perth. SKA-Mid’s science processing centre will be located in Cape Town.

Each of the science processing centres for the SKA telescopes will have dedicated supercomputers running bespoke software applications that produce high-quality images of the radio sky and other science products for further analysis.


SKA Big Data Flow challenge. Image Credit: P. Diamond, Shanghai SKA Meeting Nov 2019

It is estimated that these supercomputers will be producing a combined 600 PB of science data per year. This data will be distributed to a worldwide network of Regional Centres (See diagram below). These Regional Centres will be the main point of contact for astronomers around the world to access the science-ready data from the SKA telescopes.
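To put 600 PB per year in network terms, the implied sustained rate for distributing that output to the Regional Centres is on the order of 150 Gb/s (illustrative arithmetic only; it ignores replication, bursts and overheads):

```python
# Back-of-envelope: sustained rate implied by 600 PB of science data per year.
pb_per_year = 600
bytes_per_year = pb_per_year * 1e15
seconds_per_year = 365.25 * 24 * 3600

rate_gbps = bytes_per_year * 8 / seconds_per_year / 1e9
print(round(rate_gbps))   # ~152 Gb/s sustained, before any replication
```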

CSIRO has teamed up with ICRAR’s Data Intensive Astronomy group to form team YANDA (a Wajarri word for ‘picture’), designing and developing key components of the SKA Science Data Processing system during the pre-construction and bridging phases of the SKA.

CSIRO has contributed its existing calibration and imaging software, named YANDAsoft, used mainly to process ASKAP data and run on a dedicated supercomputer at the Pawsey Supercomputing Research Centre in Perth. CSIRO has also made significant contributions to software architecture and radio interferometry algorithm research.

SKA-Low Monitor control and calibration system (MCCS)

What is the MCCS?

The monitor, control and calibration system (MCCS) is the software system that monitors and controls the receiver front-end electronics of the SKA-Low telescope, as well as implementing real-time calibration and providing complex diagnostic support. The MCCS is part of a complex monitoring and control software system in charge of the overall data acquisition of the telescope.


SKA Low Telescope software context diagram, showing where MCCS system sits within the complex monitoring and control system of the telescope. Image Credit: MCCS Team

The MCCS will take in data from across the SKA-Low telescope antenna systems and feed back to each element within the signal flow, adjusting systems and controlling the telescope to ensure data quality.
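To give a flavour of the real-time calibration such a system performs, the toy example below solves for per-antenna complex gains from observations of a unit-flux point-source calibrator, where the visibilities satisfy V_ij ≈ g_i·conj(g_j). The damped alternating update is in the spirit of published StEFCal-style solvers, but the specific iteration, sizes and numbers here are our illustration, not the MCCS implementation:

```python
import numpy as np

# Toy per-antenna gain calibration (illustrative only, not the MCCS code).
rng = np.random.default_rng(1)
n_ant = 16

# Unknown true gains: amplitude and phase errors per antenna.
g_true = (1 + 0.2 * rng.normal(size=n_ant)) * np.exp(1j * rng.uniform(-1, 1, n_ant))

# Noise-free visibilities of a unit-flux calibrator: V = g g^H.
V = np.outer(g_true, np.conj(g_true))

g = np.ones(n_ant, dtype=complex)   # initial gain estimate
for _ in range(50):
    # Least-squares fit of each g_i from row i of V, given current g.
    g_new = (V @ g) / (np.abs(g) ** 2).sum()
    g = 0.5 * (g + g_new)           # damped update for stable convergence

# Gains are only recoverable up to a global phase; align, then compare.
phase = np.angle(np.vdot(g, g_true))
err = np.abs(g * np.exp(1j * phase) - g_true).max()
print(err < 1e-6)
```

Once the gains are known, each antenna's signal can be corrected before the data flow onward, which is the "adjusting systems ... to ensure data quality" feedback loop described above.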

Since 2019, CSIRO has been part of the MCCS software team responsible for designing prototypes during the SKA bridging phases in preparation for construction. CSIRO has contributed expertise in developing monitoring and control software systems for radio astronomy instrumentation and other embedded and real-time systems.

Assembly integration and verification (AIV)

What is AIV?

Inside the SKA Integration Facility at CSIRO, the Assembly, Integration and Verification (AIV) team has been testing SKA prototype subsystems. Recently expanded, the facility is being configured in a way to mimic the real system. Engineers have been feeding simulated telescope data into the system and producing a valid measurement set – which means it is working as modelled!

Much of this SKA-Low engineering work is focussed at the integration centre, currently located in Sydney, where the various parts of the ‘back-end’ are being brought together on a small scale. This effort involves CSIRO collaborating with SKA partner institutes such as ASTRON in the Netherlands, SKA industry partners, and university partners in Australia and overseas.

The SKA-Low PSI is enabling engineers to set up the telescope’s electronic systems and experience some real-world challenges in an environment that mimics the site central facility. They run continual tests, which allows them to resolve issues and iron out unforeseen bugs.

The PSI installed base will further expand over the coming months to continue to support the SKA prototype subsystems as they develop. The skills, knowledge and experience developed by the AIV team in this process will be invaluable for the setup and operation of the Integrated Test Facility (ITF) and the subsequent roll-out of systems on site once construction commences.


Dr Grant Hampson, CSIRO, leading a tour of the SKA-Low engineering test facility.

Managing mega-data

The SKA will generate data on the scale of petabits, or a million billion bits, per second – more than the global internet rate today. We’re building supercomputers to process the enormous amounts of data.

The data from SKA-Low will flow into a custom supercomputing facility at the MRO, where initial processing of the data will take place. From there, it is expected to be transferred via fibre-optic cable to the Pawsey Supercomputing Centre, in Perth.

Data is likely to be transferred to international telescope users via new and existing undersea cables from Perth to Singapore, and then onwards around the world.

CSIRO’s ASKAP radio telescope is demonstrating the high-performance processing required to meet the SKA data challenges. Using the Pawsey supercomputer and custom-written software developed at CSIRO, ASKAP produces science-ready datasets of many terabytes for each observation, served to astronomers through ASKAP’s science archive.

  • The SKA-Low telescope will generate 300 petabytes of data per year.
  • The SKA-Mid telescope will generate 300 petabytes of data per year.

Case Studies