1. Introduction
Computers are omnipresent in almost every aspect of the music industry nowadays [1,2]. Emerging quantum computing technologies are likely to continue this trend. They are bound to impact how we create, perform, listen to, and commercialize music in the future.
Researchers and practitioners in the nascent field of Quantum Computer Music have begun to explore ways to leverage the quantum-mechanical nature of quantum computing for applications in music [3,4,5]. It is expected that this new technology will lead to new instruments and approaches to creating music. This paper presents examples of such developments.
The majority of quantum computer music research to date has focused on using quantum computers to generate music algorithmically. For instance, ref. [6] adapted a quantum natural language processing (QNLP) method to implement a system for creating music that explores the relationship between music and language. Additionally, ref. [7] introduced the Basak–Miranda generative music algorithm, which leverages a property of quantum mechanics known as constructive and destructive interference to compose tunes. Furthermore, ref. [8] introduced methods to generate music using a type of quantum computing known as adiabatic quantum computing.
We are interested, however, in developing new musical instruments. Research on this front has been less common, probably because the quantum representation of audio is not trivial: there is no agreed method to date for representing audio quantumly [9]. Moreover, at present, there is no hardware available for the fully-fledged implementation of quantum audio.
Nevertheless, there have been some initiatives to synthesise sounds using quantum computing. For instance, ref. [10] developed a system for controlling the parameters of a digital sound synthesiser with results from quantum computations. Another approach was proposed in [11]. In this case, data acquired from the workings of a quantum processor were converted into audio signals in real time during a performance.
This paper introduces Q1Synth and an example of a performance using it. Q1Synth is a novel software-based musical instrument that renders sounds from quantum state vectors representing the properties of a qubit and its measurement.
Q1Synth is presented on the computer screen as a Bloch sphere, which represents a qubit (Figure 1). The performer plays the instrument by rotating and measuring the qubit. It can be rotated using the computer’s mouse or an external MIDI controller. While the qubit is rotated, a continuously changing sound is produced. Additionally, when the qubit is measured, the system also produces a dynamic sound.
The paper begins with a brief introduction to quantum computing, focusing on the qubit and the Bloch sphere. This introduction is limited to what is deemed necessary to understand how Q1Synth works. For more detailed explanations of quantum computing, please refer to [12,13]. Then, the paper explains how the instrument synthesises sound, focusing on just one of its synthesis techniques: frequency modulation. Next, it shows how we networked three Q1Synths to form a trio for a live performance. The paper ends with a discussion of ongoing developments.
2. The Qubit and the Bloch Sphere
A qubit is to a quantum computer what a bit is to a digital one: it is a basic unit for representing information. The state of a qubit is characterised within a two-dimensional vector space, known as a 2D Hilbert space. The canonical basis vectors in this space are notated as $|0\rangle$ and $|1\rangle$, which is an abbreviated way to represent such vectors. In quantum mechanics, this bra–ket notation, or Dirac notation, is used to represent quantum states.
The state of a qubit is expressed mathematically as a linear combination of the basis vectors as follows: $|\psi\rangle = \alpha|0\rangle + \beta|1\rangle$, with $\alpha, \beta \in \mathbb{C}$ and $|\alpha|^2 + |\beta|^2 = 1$. This linear combination expresses a state of superposition.
Simply put, a quantum computer processes information with qubits in a state of superposition. But it returns binary digits (0s and 1s) when we read the qubits. In quantum computing terminology, the act of reading qubits is referred to as projective measurement [12].
A single qubit is represented visually as a sphere with opposite poles, with $|0\rangle$ at the north pole and $|1\rangle$ at the south. This sphere is called the Bloch sphere. From its centre, a unit vector can point anywhere on the surface. This vector, which represents the state of the qubit, is referred to as a state vector.
In quantum mechanics, the variables $\alpha$ and $\beta$ represent amplitudes. In practice, we can say that $|\alpha|^2$ and $|\beta|^2$ encode the probabilities for the possible results of measurements made on a quantum system.
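As a minimal illustration (our sketch, not part of Q1Synth’s codebase), the superposition and measurement behaviour described above can be reproduced with Qiskit, the quantum programming package mentioned in Section 5:

```python
import numpy as np
from qiskit.quantum_info import Statevector

# An equal superposition: alpha = beta = 1/sqrt(2)
psi = Statevector([1 / np.sqrt(2), 1 / np.sqrt(2)])

# |alpha|^2 and |beta|^2 give the measurement probabilities
print(psi.probabilities())   # [0.5 0.5]

# Reading the qubit collapses it and returns a classical bit
outcome, post_state = psi.measure()
print(outcome)               # '0' or '1', at random
```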
Quantum computers are programmed by applying sequences of operations to qubits. Programming languages for quantum computing provide a number of operations, referred to as gates, which act on qubits. For instance, the X gate rotates the state vector of a qubit by 180 degrees around the x-axis of the Bloch sphere geometry: if the qubit vector is pointing to $|0\rangle$, then this gate flips it to $|1\rangle$, or vice versa (Figure 2).
Essentially, quantum gates perform rotations. A generic rotation gate $U(\theta, \phi, \lambda)$ is typically available to programmers, parameterised by three angles called Euler angles. Thus, any rotation gate can be specified in terms of $(\theta, \phi, \lambda)$. For instance, the X gate is equivalent to $U(\pi, 0, \pi)$.
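This equivalence can be verified directly; a minimal sketch using Qiskit’s Operator class:

```python
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Operator

# One circuit applies the X gate, the other the generic rotation U(pi, 0, pi)
qc_x = QuantumCircuit(1)
qc_x.x(0)

qc_u = QuantumCircuit(1)
qc_u.u(np.pi, 0, np.pi, 0)

# The two unitaries are equivalent
print(Operator(qc_x).equiv(Operator(qc_u)))   # True
```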
A quantum program is often depicted as a circuit with sequences of quantum gates operating on qubits (Figure 3). Typically, the qubits start in the ground state $|0\rangle$.
3. The Q1Synth System
Q1Synth works with one qubit. Figure 1 shows a screenshot of the system in ‘Simple’ mode. It shows an artistic representation of a Bloch sphere and its respective axes: z, y, and x. Note the labels $|0\rangle$ on the north and $|1\rangle$ on the south of the vertical z axis. The labels $|+\rangle$ and $|-\rangle$ mark the opposing ends of the x axis, as $|i\rangle$ and $|-i\rangle$ denote the y axis. The meanings of these labels come from quantum mechanics definitions [12] that are not so important to understand right now.
The coordinates of the Bloch sphere are given in the form of Euler angles $(\theta, \phi, \lambda)$. They describe the rotation angles necessary to position the state vector around the sphere, starting from the north pole. The position of the state vector is represented by a red dot.
The angle $\theta$ determines the inclination (or latitude) of the vector, whereas $\phi$ is responsible for its azimuth (or longitude). The angle $\lambda$ does not change the position of the red dot, but it influences the orientation of the sphere. We refer to $\lambda$ as the phase of the vector.
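For concreteness, the red dot’s position follows the standard spherical-to-Cartesian mapping for the Bloch sphere; a minimal sketch (the function name is ours, not Q1Synth’s):

```python
import numpy as np

def bloch_to_cartesian(theta: float, phi: float) -> tuple:
    """Map inclination theta and azimuth phi to a point on the unit sphere."""
    x = np.sin(theta) * np.cos(phi)
    y = np.sin(theta) * np.sin(phi)
    z = np.cos(theta)
    return (x, y, z)

print(bloch_to_cartesian(0.0, 0.0))      # north pole: (0.0, 0.0, 1.0)
print(bloch_to_cartesian(np.pi, 0.0))    # south pole: (~0.0, 0.0, -1.0)
```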
Once the button ‘rotate’ is pressed, the instrument begins to make a sound. Rotating the sphere (e.g., by clicking and dragging the mouse) moves the state vector and modifies the sound continuously. The red dot indicates where the state vector is pointing.
When the ‘measure’ button is activated, the system takes control from the user and moves the vector to either the north or the south pole, depending on the result of the measurement. The vector moves in slow motion and is accompanied by a respective sound. Once the vector reaches its destination, the sound ends. Activating the ‘rotate’ button again recommences the process, and so on. This is how the instrument is played.
When the measuring command is detected, the system builds a quantum circuit that implements a $U(\theta, \phi, \lambda)$ gate and sends it to a quantum computer over the cloud (Figure 4). The coordinates of the sphere at the moment the ‘measure’ button is activated define the angles for the $U$ gate. Then, the quantum computer returns the measurement result (0 or 1), which defines whether the vector moves north or south. Currently, Q1Synth connects to an IBM Quantum processor located in the USA.
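A sketch of this measurement step, simulated locally with Qiskit’s Aer simulator rather than a cloud-based IBM Quantum processor (Q1Synth’s actual client code is not reproduced here):

```python
import numpy as np
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

def measure_qubit(theta: float, phi: float, lam: float) -> int:
    """Rotate |0> by the sphere's Euler angles, then take one measurement."""
    qc = QuantumCircuit(1, 1)
    qc.u(theta, phi, lam, 0)        # position the state vector
    qc.measure(0, 0)                # projective measurement
    backend = AerSimulator()
    counts = backend.run(transpile(qc, backend), shots=1).result().get_counts()
    return int(next(iter(counts)))  # 0 or 1

# With the vector on the equator, the outcome is an even coin toss
print(measure_qubit(np.pi / 2, 0.0, 0.0))
```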
At the time of writing, Q1Synth uses three sound synthesis techniques [14]: frequency modulation (FM), additive synthesis, and granular synthesis. This paper focuses on the method for controlling FM. The methods for controlling the other two techniques are identical to the FM one.
FM synthesis is based upon the same principles used for FM radio transmission: the frequency of a waveform (referred to as the carrier) is altered with a modulating signal (referred to as the modulator) [15]. There are a number of variations in FM synthesiser design. The most basic comprises two sine wave oscillators: one acting as the modulator and the other as the carrier (Figure 5).
In FM sound synthesis, the carrier is the signal that we hear directly. By contrast, the modulator is heard only indirectly, because its output is added to the base frequency of the carrier (Figure 5). When the frequency of the modulator is in the audio range, numerous additional partials, or sidebands, are added to the spectrum of the carrier wave.
Let us consider the vibrato effect as a starting example to illustrate the FM technique. Vibrato is a tremulous effect imparted to vocal or instrumental notes, often used to add expressivity. It is produced by periodic variations in pitch; in other words, vibrato is itself a form of frequency modulation. The fundamental difference, however, is that vibrato uses a sub-audio signal to modulate the pitch of a sound. A sub-audio signal is a low-frequency signal, below the lower threshold of human hearing, which is approximately 20 Hz. The resulting sound, in this case, has a perceptibly slow variation in its pitch. If the modulator’s frequency is set to a value above this threshold, then new components are added to the sound spectrum of the carrier. The frequencies and amplitudes of these new components influence our perception of timbre.
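The basic two-oscillator design follows the classic formulation $y(t) = \sin(2\pi f_c t + I \sin(2\pi f_m t))$, where $f_c$ is the carrier frequency, $f_m$ the modulator frequency, and $I$ the modulation index. A minimal NumPy sketch (not Q1Synth’s Tone.js implementation):

```python
import numpy as np

SR = 44100                          # sample rate in Hz
t = np.linspace(0, 1.0, SR, endpoint=False)

fc = 440.0                          # carrier frequency (perceived pitch)
fm = 220.0                          # modulator in the audio range -> sidebands
index = 5.0                         # modulation index (spectral richness)

# The modulator's output is added to the carrier's phase
y = np.sin(2 * np.pi * fc * t + index * np.sin(2 * np.pi * fm * t))
```

Setting fm below 20 Hz instead yields the vibrato effect described above.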
The Q1Synth implementation of FM requires values for seven parameters:
Frequency (freq): defines the pitch of the sound.
Amplitude (amp): defines the loudness of the sound.
Reverberation (reverb): defines the amount of reverberation to add to the sound.
Modulation index (mod index): defines the number of components in the sound spectrum. The higher this index, the higher the number of components.
Harmonicity ratio (harmonicity): working alongside mod index, this parameter defines the richness of the sound. The higher this ratio, the richer the sound.
LFO frequency (lfo freq): the low-frequency oscillator (LFO) creates a vibrato or tremolo effect. The higher the frequency, the faster the vibrato.
LFO depth (lfo depth): defines the intensity of the vibrato.
In a nutshell, the system uses the angles ($\theta$, $\phi$, and $\lambda$) of the Bloch sphere to interpolate parameter values for the FM algorithm. First, the user defines which synthesis parameters each angle will control. For instance, by default the system associates the following parameters:
inclination = {freq, amp, reverb},
azimuth = {mod index, harmonicity},
phase = {lfo freq, lfo depth}.
In ‘Advanced’ mode view, the system shows sliders on either side of the Bloch sphere. They are used to specify the range of values for each parameter (Figure 6). Each sphere axis is assigned a pair of slider banks, enabling the values at each extreme to be interpolated as the sphere is rotated. For example, when the state vector is pointing north (the $|0\rangle$ position), the frequency, amplitude, and reverb parameter values correspond to the sliders on the left side of the screen. When the state vector is pointing south (the $|1\rangle$ position), these parameters correspond to the sliders on the right. When the sphere is between these poles, we hear an interpolation between each slider pair, corresponding to the angle of inclination. It is possible to save presets.
Similarly, pairs of slider banks are assigned to the azimuth (interpolating between the $|+\rangle$ and $|-\rangle$ directions) and the sphere’s phase. By re-orienting the sphere around these planes, the user interpolates between the slider settings at each extreme. All interpolations are linear. For example, if a pair of controls is set to $a$ on the left and $b$ on the right, when the sphere is rotated to the middle, the parameter value will be equal to $(a+b)/2$.
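A sketch of the inclination mapping (our illustration; the parameter ranges below are arbitrary, not Q1Synth’s defaults):

```python
import numpy as np

def lerp(a: float, b: float, w: float) -> float:
    """Linear interpolation between slider values a and b, with w in [0, 1]."""
    return a + (b - a) * w

def params_from_inclination(theta: float, left: dict, right: dict) -> dict:
    """Interpolate each slider pair as theta sweeps from |0> (0) to |1> (pi)."""
    w = theta / np.pi
    return {k: lerp(left[k], right[k], w) for k in left}

left  = {"freq": 220.0, "amp": 0.2, "reverb": 0.1}   # sliders at the |0> pole
right = {"freq": 880.0, "amp": 0.8, "reverb": 0.5}   # sliders at the |1> pole

print(params_from_inclination(np.pi / 2, left, right))
# midway values: freq 550.0, amp ~0.5, reverb ~0.3
```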
We note that in quantum computing, a measurement causes the state vector to ‘collapse’ instantly to either $|0\rangle$ or $|1\rangle$ [16]. However, we took the liberty to convey this concept artistically: we added a virtual time delay to sonify the ‘collapse’. Thus, after a measurement, Q1Synth traces an animated trajectory of the red dot towards the corresponding pole. In doing so, the synthesiser performs a complex, multidimensional modulation, interpolating between all parameters simultaneously until the red dot reaches its final destination, at which point the sound fades out.
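A sketch of how such a timed trajectory might be produced (the function is our illustration; five seconds is the default duration mentioned in Section 4):

```python
import numpy as np

def collapse_trajectory(theta: float, outcome: int,
                        duration: float = 5.0, fps: int = 60) -> np.ndarray:
    """Sequence of inclination values moving the red dot from its current
    position to the measured pole: 0 for outcome 0, pi for outcome 1."""
    target = 0.0 if outcome == 0 else np.pi
    return np.linspace(theta, target, int(duration * fps))

# Each frame's theta is fed through the parameter interpolation shown earlier
frames = collapse_trajectory(np.pi / 3, outcome=1)
```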
4. Q1Synth Networks for Group Performance
This section introduces the setup developed for a performance entitled Spinnings, with three Q1Synth units. A MIDI gesture controller is used to rotate the qubit with hand movements.
Three Q1Synth instruments are networked in a star topology with an additional machine acting as a hub (Figure 7). As each player rotates their sphere, the instruments produce their respective sounds, which are mixed and relayed to the speakers (Figure 7c). The hub tracks the state vectors of each instrument and dynamically simulates a combined three-qubit quantum state (Figure 7b).
The hub is programmed to recognise the thumbs-up gesture as a command for measurement (Figure 7b). When a player makes a thumbs-up, the hub stops tracking the instruments and builds a quantum circuit expressing the current status of the simulated state vector. Then, it sends the circuit to the quantum computer for measurement (Figure 7d,e).
The measurement, which in this case will be three digits long (one binary digit per qubit), is relayed back to the hub. The hub informs each instrument of its respective outcome and synthesises the ‘measurement sound’. This sound lasts for the period it takes for the vectors (i.e., the red dots) to move to the north or south poles of the respective instruments. It is programmed to take five seconds by default, but this is customisable. As soon as the sound ends, the players regain control of the spheres and another cycle commences.
Even though each player has control over their own sound as they rotate the sphere, the measurement sound will be surprising: it depends on the measurement of the combined quantum state.
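A sketch of the hub’s combined measurement, again simulated locally with Qiskit (the real hub dispatches its circuit via OSC-Qasm, described in Section 5):

```python
import numpy as np
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

def measure_trio(angles: list) -> str:
    """Position three qubits from each instrument's (theta, phi, lam)
    angles, then measure all of them in one shot."""
    qc = QuantumCircuit(3, 3)
    for q, (theta, phi, lam) in enumerate(angles):
        qc.u(theta, phi, lam, q)
    qc.measure(range(3), range(3))
    backend = AerSimulator()
    counts = backend.run(transpile(qc, backend), shots=1).result().get_counts()
    return next(iter(counts))   # e.g. '010': one digit per instrument

print(measure_trio([(np.pi / 2, 0, 0), (np.pi / 4, np.pi, 0), (np.pi, 0, 0)]))
```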
Spinnings was premiered on 8 November 2022. The concert took place at the Goethe-Institut in London, during a Living in a Quantum State event [17] to launch the book Quantum Computer Music, edited by E.R.M. [5]. The event was organised with Moth Quantum [18].
The instruments and the hub were projected on a cinema screen, which enabled the audience to follow the actions of the performers (Figure 8). A live recording of the performance is available [19].
5. Implementation Technicalities
Q1Synth was developed as a Web application. First introduced in 2011, the Web Audio API provides a powerful system for handling audio on the Internet, turning any Web browser into a widely available, and potentially versatile, musical tool [20].
Q1Synth’s sound synthesis engines were implemented with the Web Audio framework Tone.js [21]. The user interface is built with React.js [22] and written in TypeScript [23]. The Bloch sphere visualisation is rendered using P5.js [24].
The instruments were networked using OSC-Qasm, a Python package developed at ICCMR, University of Plymouth, as part of the QuTune Project [25]. OSC-Qasm connects music programming environments with quantum hardware [26]. It is based on the Open Sound Control (OSC) communication protocol, which is widely used by the music technology community [27].
OSC-Qasm includes server and client applications. For Spinnings, the OSC-Qasm Client was developed with Max/MSP [28] and QuTune’s QAC Toolkit package [29]. The client builds the quantum circuits from the Q1Synth coordinates in OpenQASM (Open Quantum Assembly Language) [30]. The OSC-Qasm Server is a cross-platform, Python-based application that reads quantum circuits written in OpenQASM and runs them on quantum backends, either simulators or real hardware. It is powered by the Python-based quantum programming package Qiskit [31].
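To illustrate the protocol, the sketch below sends an OpenQASM circuit to an OSC server using the python-osc package. The address pattern, port, and message layout here are illustrative assumptions, not OSC-Qasm’s documented interface:

```python
from pythonosc.udp_client import SimpleUDPClient

# An OpenQASM 2.0 circuit: rotate one qubit with u3, then measure
qasm = """OPENQASM 2.0;
include "qelib1.inc";
qreg q[1];
creg c[1];
u3(1.5708, 0, 0) q[0];
measure q[0] -> c[0];
"""

# Address pattern and port are illustrative, not OSC-Qasm's actual defaults
client = SimpleUDPClient("127.0.0.1", 1416)
client.send_message("/circuit", qasm)
```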
Finally, note that the Bloch sphere coordinates captured by Q1Synth’s interface use the ZYZ convention [32] for defining the Euler angles passed to the quantum circuit. However, for visualisation, P5.js uses a different definition for its coordinate system. To achieve the desired visual result, it is necessary to use the coordinates in the YZY convention for rotating the sphere.
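For reference, a ZYZ rotation composes as $R = R_z(\phi)\,R_y(\theta)\,R_z(\lambda)$, whereas YZY swaps the axis pattern. A minimal NumPy sketch of the two compositions (our illustration; the exact angle roles in P5.js may differ):

```python
import numpy as np

def rz(a: float) -> np.ndarray:
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

def ry(a: float) -> np.ndarray:
    return np.array([[ np.cos(a), 0.0, np.sin(a)],
                     [ 0.0,       1.0, 0.0      ],
                     [-np.sin(a), 0.0, np.cos(a)]])

theta, phi, lam = np.pi / 3, np.pi / 5, np.pi / 7

R_zyz = rz(phi) @ ry(theta) @ rz(lam)   # convention for the quantum circuit
R_yzy = ry(phi) @ rz(theta) @ ry(lam)   # convention for the P5.js visuals
```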
6. Conclusions and Ongoing Work
This paper introduced a quantum computer-based musical instrument. To the best of our knowledge, this is an unprecedented concept.
Q1Synth and Spinnings are examples of how creative practices open the doors to new application pathways for quantum computing technology. Conversely, they illustrate how emerging quantum computing technology is leading to new approaches to musical instrument design and musical creativity.
Currently, quantum hardware is commonly available only through the cloud, geared towards operating in batch processing regimes. Musical interaction, however, requires real-time processing. With Q1Synth, we have started to tackle issues in real-time interaction with quantum computers.
Developing the instrument as a Web application paves the way for a Quantum Internet of Things (QuIT); Figure 9 shows Q1Synth running on an iPhone. Additionally, Spinnings offered a glimpse of how QuIT might be harnessed for creative practices in the not-so-distant future.
Quantum computers need classical ones to function. Seamless integration and connectivity of different computing technologies will be of paramount importance to harness quantum computers for novel software solutions. Practical connectivity requirements for Q1Synth and Spinnings were resolved through OSC-Qasm [26]. Although we have been developing OSC-Qasm to connect music programming environments with quantum hardware, the OSC protocol is generic enough to embrace domains other than music.
Currently, we are working on Spookings, a musical performance with dozens of networked instruments distributed geographically, e.g., across different cities.
The qubits in Spinnings do not interact with each other. For Spookings, we are advancing ways to entangle instruments, e.g., putting instruments in a Bell state [12], whereby the measurement of one dictates the behaviour of another. This is a musical concept for group performance that only quantum technology can truly afford. We intend to stage a performance of Spookings in the Metaverse [33].
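For illustration, a Bell state can be prepared with just two gates; a minimal Qiskit sketch of the correlated-measurement idea behind Spookings (not its actual implementation):

```python
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

# Hadamard + CNOT entangle two qubits into the Bell state (|00> + |11>)/sqrt(2)
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

backend = AerSimulator()
counts = backend.run(transpile(qc, backend), shots=100).result().get_counts()
print(counts)   # only '00' and '11': measuring one qubit dictates the other
```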
A lite version of Q1Synth is freely available online [34]. As it works only in quantum simulation mode, it does not require a connection to a quantum hardware backend. This version makes sounds solely with the FM synthesis technique.