SP-4: Quantum head
Summary: This pattern describes an architecture in which a quantum inference engine operates in the pipeline after a classical neural network. The classical component reduces the dimensionality of the input data.

Figure 1. Graphical representation of the quantum head pattern.
Problem: In real-world applications, input data is often too high-dimensional to be encoded into the small number of qubits available on modern NISQ computers. This limitation makes such data difficult to process directly on quantum hardware.
Solution: To leverage a potential quantum advantage on high-dimensional graphical data (e.g., images), a hybrid architecture has been proposed. In this architecture, the dimensionality of the input data is first reduced through pre-processing and feature extraction on a classical computer; inference is then performed on a quantum computer. The pattern works well with pre-trained deep neural networks by taking advantage of the quantum transfer learning technique.
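The pipeline above can be sketched in a few lines of NumPy. The sketch below is purely illustrative: the "pre-trained" extractor is a random matrix standing in for a real network, and the quantum head is simplified to a product-state circuit (angle encoding followed by one trainable RY rotation per qubit, no entangling gates), for which each Pauli-Z expectation reduces analytically to cos(x_i + theta_i).

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Classical part: feature extractor (hypothetical stand-in weights). ---
# Reduces a 64-dimensional input to 4 features, one per qubit.
W = rng.normal(scale=0.1, size=(4, 64))

def classical_features(x):
    return np.tanh(W @ x)          # bounded features, usable as encoding angles

# --- Quantum part: 4-qubit variational head (product-state simulation). ---
# Angle-encode each feature with RY(x_i), then apply a trainable RY(theta_i).
# Without entangling gates, <Z_i> of qubit i is simply cos(x_i + theta_i).
def quantum_head(features, theta):
    return np.cos(features + theta)   # per-qubit Pauli-Z expectation values

x = rng.normal(size=64)               # high-dimensional classical input
theta = np.zeros(4)                   # trainable circuit parameters
logits = quantum_head(classical_features(x), theta)
print(logits.shape)                   # 4 expectation values feed the decision
```

A real implementation would replace the random matrix with a pre-trained network and the closed-form expectations with an actual (entangling) circuit executed on a simulator or NISQ device; the data flow, however, is the same.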
Benefits:
- Scalability for NISQ computers. Implementations of this pattern require quantum components with a small number of qubits and shallow quantum circuits, making the system compatible with most existing NISQ computers. This pattern makes it possible to use NISQ-era quantum computers for practical applications that process high-dimensional data.
- Efficiency. This architecture replaces the last several layers of a classical neural network, most often the final fully-connected layers, with a quantum neural network, improving performance and reducing the number of trainable parameters.
- Trainability with transfer learning. For the gradient descent methods used to train neural networks, computing gradients for quantum components and transferring this information between classical and quantum elements presents a challenge. To overcome these difficulties, the quantum transfer learning technique can be utilised with this pattern.
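One standard way to obtain gradients of a quantum layer is the parameter-shift rule: for circuit expectations generated by rotations, the exact derivative comes from two extra circuit evaluations with the parameter shifted by ±π/2, so no backpropagation through the quantum device is needed. The sketch below demonstrates this for the single-qubit expectation f(θ) = cos(x + θ), a simplified model of an RY-based circuit, and checks it against the analytic derivative:

```python
import numpy as np

# Expectation of Pauli-Z after RY(x) RY(theta) on |0>: f = cos(x + theta).
def expectation(x, theta):
    return np.cos(x + theta)

# Parameter-shift rule: evaluate the same circuit at theta +/- pi/2.
# For rotation-generated gates this recovers the exact gradient.
def param_shift_grad(x, theta):
    return 0.5 * (expectation(x, theta + np.pi / 2)
                  - expectation(x, theta - np.pi / 2))

x, theta = 0.3, 0.7
g = param_shift_grad(x, theta)
print(np.isclose(g, -np.sin(x + theta)))  # matches the analytic derivative
```

In a hybrid training loop, the classical optimiser treats these shifted-evaluation gradients exactly like any other layer's gradients, which is what makes end-to-end training across the classical-quantum boundary feasible.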
Drawbacks:
- Portability and deployability. The quantum head pattern requires broadband, high-speed communication channels between the classical and quantum components, which are not always available by default.
Known uses:
- (Jahin et al. 2023) and (Schetakis et al. 2022) tested a hybrid architecture consisting of an input layer, several classical hidden layers, and one quantum layer, applying it to supply chain backorder prediction and abstract binary classification, respectively. To compute the gradients of the quantum layers, the authors utilised PennyLane (Bergholm et al. 2022), which introduced the concept of differentiable quantum nodes that can be used in conjunction with classical layers.
- (Kim et al. 2023) utilised the quantum head pattern with transfer learning for classifying the MNIST dataset.
- (Chao et al. 2023) employed the quantum head pattern and a tree tensor network as part of reinforcement learning algorithms for AlphaZero.