Dataset Creation¶
Goal
Before any characterization or control calibration, we must gather data from which to extract information about the target quantum device. The central object for holding this data is the ExperimentData instance. It is responsible for saving and loading datasets to and from disk, and it helps us organize the dataset. Thus, the goal of this tutorial is to create an ExperimentData instance.
First and foremost, let's import the packages necessary for our purposes. We also suppress the UserWarning raised by the ODE solver.
import warnings
warnings.filterwarnings("ignore", category=UserWarning)
import jax
import jax.numpy as jnp
import inspeqtor.v1 as sq
import pandas as pd
key = jax.random.key(0)
If you are not familiar with jax.random.key, we refer you to the JAX documentation. For brevity, you can think of it as the way JAX generates pseudo-random numbers.
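As a brief aside, the split-then-consume key pattern can be illustrated in plain JAX (independent of inspeqtor):

```python
import jax
import jax.numpy as jnp

# Create a reproducible root key from an integer seed.
key = jax.random.key(0)

# Split the key: keep `key` for future splits, consume `subkey` exactly once.
key, subkey = jax.random.split(key)
x = jax.random.uniform(subkey, (3,))

# The same seed and split sequence always reproduces the same numbers.
key2 = jax.random.key(0)
_, subkey2 = jax.random.split(key2)
y = jax.random.uniform(subkey2, (3,))
assert jnp.allclose(x, y)
```

Never reuse a consumed subkey; always split again before the next random call.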
Prepare experiments¶
We break the experiment preparation phase into the following steps.
- Gathering "prior" information about the quantum device.
- Defining the control action.
Quantum device specification¶
In inspeqtor, we focus on characterizing quantum devices. At the finest level, users most likely want to perform control calibration on an individual qubit that is part of the full system. Thus, we provide a dedicated dataclass responsible for holding "prior" information about the qubit. This information is often necessary for constructing the subsystem Hamiltonian used in open-loop optimization. Below is a code snippet that initializes a QubitInformation object.
qubit_info = sq.data.QubitInformation(
unit="GHz",
qubit_idx=0,
anharmonicity=-0.2,
frequency=5.0,
drive_strength=0.1,
)
qubit_info
QubitInformation(unit='GHz', qubit_idx=0, anharmonicity=-0.2, frequency=5.0, drive_strength=0.1, date='2025-11-15 16:34:35')
Define the control¶
To make control actions composable, we let users define an "atomic" control action by inheriting from the Control dataclass and then compose them via the ControlSequence class. Below is an example of defining the total control action with only a single predefined DragPulseV2.
total_length = 320
dt = 2 / 9
pulse = sq.predefined.DragPulseV2(
duration=total_length,
qubit_drive_strength=qubit_info.drive_strength,
dt=dt,
max_amp=0.5,
min_theta=0,
max_theta=2 * jnp.pi,
min_beta=-5.0,
max_beta=5.0,
)
control_sequence = sq.control.ControlSequence(
controls=[
pulse,
],
total_dt=total_length,
)
By default, ControlSequence performs validation during object instantiation to detect bugs early. We can see our control in action by sampling from it.
key, subkey = jax.random.split(key)
params = control_sequence.sample_params(subkey)
params
[{'theta': Array(4.3855351, dtype=float64),
'beta': Array(-4.46933642, dtype=float64)}]
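The validation-on-instantiation behavior mentioned above follows a common Python dataclass idiom. A minimal sketch of the pattern (ToySequence is hypothetical, not inspeqtor's actual implementation):

```python
from dataclasses import dataclass


@dataclass
class ToySequence:
    # Hypothetical fields standing in for a list of control durations
    # and the total sequence duration in dt units.
    durations: list[int]
    total_dt: int

    def __post_init__(self) -> None:
        # Validate eagerly at construction time to surface bugs early,
        # instead of failing later deep inside a solver call.
        if sum(self.durations) > self.total_dt:
            raise ValueError(
                f"controls ({sum(self.durations)} dt) exceed total_dt={self.total_dt}"
            )


seq = ToySequence(durations=[320], total_dt=320)  # passes validation
```

Failing fast here means an inconsistent sequence never makes it into an experiment.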
We can also visualize the control sequence waveform:
import matplotlib.pyplot as plt
fig, ax = plt.subplots(figsize=(6, 3))
t_eval = jnp.linspace(0, total_length, total_length)
waveform = jax.vmap(control_sequence.get_envelope(params))(t_eval)
sq.visualization.plot_control_envelope(waveform, t_eval, ax)
Perform experiment¶
Now, we are ready to perform an experiment on the quantum device. In this tutorial, we will use a local simulator to generate the data. In fact, most of the time, we want to work with a local simulator first before committing to experiments on the actual quantum device.
Either way, inspeqtor provides a unified and flexible pipeline for users to populate the data and store it in an ExperimentData object. For now, let us shift our attention to the quantum device simulator.
As an example, we define our device as a single transmon qubit. We can also transform the Hamiltonian into a rotating frame as follows.
from functools import partial
signal_fn = sq.physics.signal_func_v3(
get_envelope=control_sequence.get_envelope,
drive_frequency=qubit_info.frequency,
dt=dt,
)
hamiltonian = partial(
sq.predefined.transmon_hamiltonian, qubit_info=qubit_info, signal=signal_fn
)
frame = (jnp.pi * qubit_info.frequency) * sq.constant.Z
hamiltonian = sq.physics.auto_rotating_frame_hamiltonian(hamiltonian, frame=frame)
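For reference, with a frame operator $F$, the rotating-frame transformation conventionally maps the Hamiltonian as

$$H_{\text{rot}}(t) = e^{iFt}\, H(t)\, e^{-iFt} - F,$$

so with $F = \pi f_q Z$ the fast rotation at the qubit frequency $f_q$ is removed from the dynamics. (This is the textbook convention; consult the auto_rotating_frame_hamiltonian documentation for the exact sign convention inspeqtor uses.)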
Then, we can solve the Schrödinger equation using sq.physics.solver. By itself, sq.physics.solver is a function of multiple arguments that solves the system dynamics when called. Most of the time, however, we just want to solve the dynamics with different control parameters while keeping everything else fixed. So the common pattern is to use sq.physics.solver with partial as follows.
solver = partial(
sq.physics.solver,
t_eval=t_eval * dt, # nanosecond.
hamiltonian=hamiltonian,
y0=jnp.eye(2, dtype=jnp.complex_),
t0=0.0, # nanosecond.
t1=total_length * dt, # nanosecond.
)
As per the inspeqtor specification, the hamiltonian function required by solver should be a function of two arguments. The first argument can be arbitrary, while the second argument should be a scalar time.
signal_params = sq.physics.SignalParameters(pulse_params=params, phase=0)
hamiltonian(signal_params, jnp.array(0))
Array([[0.00000000e+00+0.j, 1.06878303e-05+0.j],
[1.06878303e-05+0.j, 0.00000000e+00+0.j]], dtype=complex128)
Now, we can solve the system dynamics by using the solver as a single-argument function.
unitary = solver(signal_params)
We can visualize the trajectory of the system using the visualization module for quick inspection.
fig, axes = sq.visualization.plot_expectation_values(
sq.visualization.format_expectation_values(
sq.predefined.calculate_expectation_values(unitary).T
),
title="Sample trajectory",
)
We saw that using sq.physics.signal_func_v3 as the signal function with the Hamiltonians in the predefined module requires passing sq.physics.SignalParameters as the first argument of solver. The advantage of this approach is the explicitness of the control parameters, though it takes more setup to work properly.
Alternatively, we can use sq.physics.signal_func_v5, which results in a jnp.ndarray as the first argument. This approach might be preferred for model training, vectorization, and compatibility with other libraries.
signal_fn = sq.physics.make_signal_fn(
# We have to transform the usual envelope with helper function.
get_envelope=sq.control.get_envelope_transformer(control_sequence),
drive_frequency=qubit_info.frequency,
dt=dt,
)
hamiltonian = partial(
sq.predefined.transmon_hamiltonian, qubit_info=qubit_info, signal=signal_fn
)
hamiltonian = sq.physics.auto_rotating_frame_hamiltonian(hamiltonian, frame=frame)
solver = partial(
sq.physics.solver,
t_eval=t_eval * dt, # nanosecond.
hamiltonian=hamiltonian,
y0=jnp.eye(2, dtype=jnp.complex_),
t0=0.0, # nanosecond.
t1=total_length * dt, # nanosecond.
)
We also provide a helper function to convert the list[ParameterDictType] returned from the sample_params method to a jnp.ndarray and vice versa.
a2l_fn, l2a_fn = sq.control.get_param_array_converter(control_sequence)
array_params = l2a_fn(params)
array_params
Array([ 4.3855351 , -4.46933642], dtype=float64)
reverted_params = a2l_fn(array_params)
reverted_params
[{'theta': Array(4.3855351, dtype=float64),
'beta': Array(-4.46933642, dtype=float64)}]
Now, we can simply solve the system dynamics with solver and obtain the same result.
unitary_a = solver(array_params)
# Assert that they yield the same unitary 😎
assert jnp.allclose(unitary_a, unitary)
fig, axes = sq.visualization.plot_expectation_values(
sq.visualization.format_expectation_values(
sq.predefined.calculate_expectation_values(unitary_a).T
),
title="Sample trajectory",
)
What we can measure in an experiment is not the unitary but finite-shot expectation values. inspeqtor provides the helper function sq.predefined.shot_quantum_device to turn the solver into exactly that! Again, we use partial to fix the solver and the number of shots SHOTS. Note that it is up to the user to decide on a partial strategy that suits their use case.
SHOTS = 1000
quantum_device = partial(
sq.predefined.shot_quantum_device,
solver=solver,
SHOTS=SHOTS,
expectation_value_receipt=sq.constant.default_expectation_values_order,
)
# Since shot_quantum_device is a stochastic function, we have to provide a random key.
key, subkey = jax.random.split(key)
# The function accepts a batch of control parameters, so we reshape to add a batch dimension.
expvals = quantum_device(subkey, array_params.reshape(1, -1))
expvals
Array([[ 0.924, -0.91 , -0.17 , 0.242, -0.272, 0.272, 0.216, -0.228,
-0.368, 0.328, 0.896, -0.934, -0.26 , 0.32 , -0.918, 0.922,
-0.27 , 0.266]], dtype=float64)
We saw that the output is an array of shape (batch, 18), where 18 is the number of combinations of the six initial states and three Pauli observables. The expectation values are ordered according to the provided expectation_value_receipt argument; in this case, the default order used throughout inspeqtor.
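A quick sanity check on that count (the state and observable labels mirror those appearing in the dataframe later in this tutorial; the exact ordering of the 18 entries is fixed by expectation_value_receipt, not by this sketch):

```python
from itertools import product

# Six single-qubit initial states (the ±X, ±Y, ±Z eigenstates)
# and three Pauli observables give 6 × 3 = 18 expectation values.
initial_states = ["+", "-", "r", "l", "0", "1"]
observables = ["X", "Y", "Z"]

combos = [(state, obs) for obs, state in product(observables, initial_states)]
len(combos)  # 18
```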
It is important to note that inspeqtor exposes its functionality as plain functions, so users can provide custom behavior simply by defining functions with the same interface as those provided by inspeqtor. For example, the solver that sq.predefined.shot_quantum_device uses is just a function of control parameters that returns the unitary at each time step. Thus, the user can swap it out for a user-defined solver such as a physics-informed neural network.
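To make that concrete, a drop-in replacement only needs to map control parameters to a stack of unitaries. A toy sketch under that assumption (toy_solver is hypothetical and ignores the real dynamics; any model with this interface, e.g. a trained neural network, could be substituted):

```python
import jax.numpy as jnp


def toy_solver(array_params: jnp.ndarray) -> jnp.ndarray:
    """Stand-in solver: returns a Z-rotation whose angle array_params[0]
    is accumulated over 320 evaluation times."""
    theta = array_params[0]
    t = jnp.linspace(0.0, 1.0, 320)
    phases = jnp.exp(-0.5j * theta * t)  # shape (320,)
    zeros = jnp.zeros_like(phases)
    # Assemble diagonal 2x2 unitaries, one per time step: shape (320, 2, 2).
    row0 = jnp.stack([phases, zeros], axis=-1)
    row1 = jnp.stack([zeros, jnp.conj(phases)], axis=-1)
    return jnp.stack([row0, row1], axis=-2)


unitaries = toy_solver(jnp.array([0.5, 0.0]))
unitaries.shape  # (320, 2, 2)
```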
With the quantum device ready to use, we proceed with experimental data collection. To handle experiments performed all at once, in batches, or one at a time, we model each control parameter set as a single row of a pd.DataFrame table (you can think of it as an SQL table as well). Here, we provide sq.data.make_row to enforce the schema. Let us see it in action by considering the experiment with initial state $|+\rangle$ measured in the $\hat{X}$ basis.
exp = sq.constant.default_expectation_values_order[0]
exp
ExpectationValue(initial_state="+", observable="X", expectation_value=None)
Don't worry about expectation_value=None; we merely use the ExpectationValue entries in sq.constant.default_expectation_values_order to specify the state/observable combinations, so no measured value is attached yet.
sq.data.make_row(
expectation_value=expvals[0][0].item(),
initial_state=exp.initial_state,
observable=exp.observable,
parameters_id=0,
parameters_list=params,
custom_id="stardust",
)
{'expectation_value': 0.924,
'initial_state': '+',
'observable': 'X',
'parameters_id': 0,
'parameter/0/theta': 4.385535096915008,
'parameter/0/beta': -4.469336421734063,
'custom_id': 'stardust'}
The returned value is a Python dictionary whose keys follow the inspeqtor dataset schema. Note that you can store additional custom information by passing kwargs to the function. Here we provided custom_id="stardust" as an example.
As a good practice, we define the experiment configuration before performing the experiment, since it is required later. However, users can define it after the experiment as appropriate.
config = sq.data.ExperimentConfiguration(
qubits=[qubit_info],
expectation_values_order=sq.constant.default_expectation_values_order,
parameter_names=control_sequence.get_parameter_names(),
backend_name="red_demon",
shots=SHOTS,
EXPERIMENT_IDENTIFIER="0001",
EXPERIMENT_TAGS=["tutorial", "for", "you", "😉"],
description="One impossible step at a time",
device_cycle_time_ns=dt,
sequence_duration_dt=control_sequence.total_dt,
instance="black_rose",
sample_size=100,
)
Now, we proceed to perform a series of experiments and store the data in a pd.DataFrame instance. First, we sample the control parameters and store them in a ready-to-use format.
control_params_list = []
temp_control_params: list[jnp.ndarray] = []
key = jax.random.key(0)
for control_idx in range(config.sample_size):
key, subkey = jax.random.split(key)
params = control_sequence.sample_params(subkey)
control_params_list.append(params)
temp_control_params.append(l2a_fn(params))
control_params = jnp.array(temp_control_params)
control_params.shape
(100, 2)
For simplicity, we perform the experiments in a single batch.
key, subkey = jax.random.split(key)
expectation_values = quantum_device(subkey, control_params)
expectation_values.shape
(100, 18)
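Had the batch been too large for one call (simulator memory or hardware job-size limits), the same array could be assembled chunk by chunk. A sketch with a stand-in device function (fake_device is hypothetical; it only mimics the (key, batch) -> (batch, 18) signature used above):

```python
import jax
import jax.numpy as jnp


def fake_device(key, batch_params):
    # Stand-in for quantum_device: one 18-wide row per parameter set.
    return jax.random.uniform(key, (batch_params.shape[0], 18))


params_array = jnp.zeros((100, 2))
chunk_size = 32
key = jax.random.key(0)

chunks = []
for start in range(0, params_array.shape[0], chunk_size):
    key, subkey = jax.random.split(key)
    chunks.append(fake_device(subkey, params_array[start : start + chunk_size]))

expectation_values = jnp.concatenate(chunks, axis=0)
expectation_values.shape  # (100, 18)
```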
Next, we store the experimental data using sq.data.make_row in an intermediate pd.DataFrame instance.
rows = []
for sample_idx in range(config.sample_size):
for exp_idx, exp in enumerate(sq.constant.default_expectation_values_order):
row = sq.data.make_row(
expectation_value=float(expectation_values[sample_idx, exp_idx]),
initial_state=exp.initial_state,
observable=exp.observable,
parameters_list=control_params_list[sample_idx],
parameters_id=sample_idx,
custom_id=f"{sample_idx}/{exp_idx}",
)
rows.append(row)
df = pd.DataFrame(rows)
df
| expectation_value | initial_state | observable | parameters_id | parameter/0/theta | parameter/0/beta | custom_id | |
|---|---|---|---|---|---|---|---|
| 0 | 0.914 | + | X | 0 | 4.385535 | -4.469336 | 0/0 |
| 1 | -0.932 | - | X | 0 | 4.385535 | -4.469336 | 0/1 |
| 2 | -0.176 | r | X | 0 | 4.385535 | -4.469336 | 0/2 |
| 3 | 0.238 | l | X | 0 | 4.385535 | -4.469336 | 0/3 |
| 4 | -0.324 | 0 | X | 0 | 4.385535 | -4.469336 | 0/4 |
| ... | ... | ... | ... | ... | ... | ... | ... |
| 1795 | -0.040 | - | Z | 99 | 6.077224 | 1.334615 | 99/13 |
| 1796 | -0.206 | r | Z | 99 | 6.077224 | 1.334615 | 99/14 |
| 1797 | 0.196 | l | Z | 99 | 6.077224 | 1.334615 | 99/15 |
| 1798 | 0.978 | 0 | Z | 99 | 6.077224 | 1.334615 | 99/16 |
| 1799 | -0.982 | 1 | Z | 99 | 6.077224 | 1.334615 | 99/17 |
1800 rows × 7 columns
Finally, we can store our data using sq.data.ExperimentData!
exp_data = sq.data.ExperimentData(experiment_config=config, preprocess_data=df)
exp_data.postprocessed_data
| parameters_id | expectation_value/+/X | expectation_value/-/X | expectation_value/r/X | expectation_value/l/X | expectation_value/0/X | expectation_value/1/X | expectation_value/+/Y | expectation_value/-/Y | expectation_value/r/Y | ... | expectation_value/r/Z | expectation_value/l/Z | expectation_value/0/Z | expectation_value/1/Z | expectation_value | initial_state | observable | parameter/0/theta | parameter/0/beta | custom_id | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 0.914 | -0.932 | -0.176 | 0.238 | -0.324 | 0.280 | 0.244 | -0.194 | -0.336 | ... | -0.924 | 0.926 | -0.254 | 0.232 | 0.914 | + | X | 4.385535 | -4.469336 | 0/0 |
| 1 | 1 | 1.000 | -1.000 | -0.042 | -0.006 | 0.030 | -0.006 | -0.042 | 0.000 | 0.958 | ... | 0.254 | -0.218 | 0.980 | -0.958 | 1.000 | + | X | 0.276427 | 3.608326 | 1/0 |
| 2 | 2 | 0.974 | -0.972 | 0.214 | -0.206 | 0.124 | -0.088 | -0.162 | 0.180 | 0.462 | ... | -0.848 | 0.888 | 0.426 | -0.434 | 0.974 | + | X | 5.154778 | 2.333456 | 2/0 |
| 3 | 3 | 0.966 | -0.968 | -0.098 | 0.080 | -0.226 | 0.218 | 0.106 | -0.148 | -0.604 | ... | -0.838 | 0.792 | -0.542 | 0.598 | 0.966 | + | X | 4.094562 | -2.803718 | 3/0 |
| 4 | 4 | 1.000 | -0.994 | 0.034 | 0.002 | 0.080 | -0.068 | 0.098 | -0.032 | -0.754 | ... | 0.590 | -0.608 | -0.770 | 0.772 | 1.000 | + | X | 2.458046 | 2.477768 | 4/0 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 95 | 95 | 1.000 | -1.000 | 0.016 | -0.004 | 0.034 | 0.010 | -0.008 | -0.008 | 1.000 | ... | 0.042 | 0.006 | 1.000 | -1.000 | 1.000 | + | X | 0.007379 | 0.258322 | 95/0 |
| 96 | 96 | 1.000 | -1.000 | 0.054 | -0.016 | 0.000 | 0.010 | -0.106 | 0.018 | 0.456 | ... | 0.898 | -0.890 | 0.424 | -0.414 | 1.000 | + | X | 1.112030 | -4.666942 | 96/0 |
| 97 | 97 | 1.000 | -1.000 | 0.080 | -0.014 | 0.012 | 0.026 | 0.016 | -0.012 | 0.992 | ... | 0.066 | -0.104 | 0.994 | -0.998 | 1.000 | + | X | 0.119057 | -3.652233 | 97/0 |
| 98 | 98 | 1.000 | -1.000 | -0.048 | -0.004 | 0.000 | 0.040 | 0.004 | 0.016 | 0.902 | ... | 0.426 | -0.450 | 0.898 | -0.920 | 1.000 | + | X | 0.430843 | 3.605063 | 98/0 |
| 99 | 99 | 0.984 | -0.996 | 0.044 | -0.108 | -0.026 | 0.028 | -0.106 | 0.102 | 0.974 | ... | -0.206 | 0.196 | 0.978 | -0.982 | 0.984 | + | X | 6.077224 | 1.334615 | 99/0 |
100 rows × 25 columns
Note that exp_data.postprocessed_data is a dataframe created by aggregating exp_data.preprocess_data, combining rows that share the same parameters_id. Consequently, any extra user-provided fields keep only the value from the first row of each expectation-value combination.
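A rough sketch of this aggregation in plain pandas (not inspeqtor's actual implementation): group by parameters_id, pivot the expectation values into wide columns, and keep the first value of every other field.

```python
import pandas as pd

# Toy long-format data: two parameter sets, two measurement settings each.
df = pd.DataFrame(
    {
        "parameters_id": [0, 0, 1, 1],
        "initial_state": ["+", "-", "+", "-"],
        "observable": ["X", "X", "X", "X"],
        "expectation_value": [0.9, -0.9, 1.0, -1.0],
        "custom_id": ["0/0", "0/1", "1/0", "1/1"],
    }
)

# Wide columns named like "expectation_value/<state>/<observable>".
wide = df.pivot_table(
    index="parameters_id",
    columns=["initial_state", "observable"],
    values="expectation_value",
)
wide.columns = [f"expectation_value/{s}/{o}" for s, o in wide.columns]

# Non-pivoted fields: keep the first entry per parameters_id.
firsts = df.groupby("parameters_id").first()
out = wide.join(firsts).reset_index()
```

This reproduces the shape of the postprocessed table: one row per parameters_id, with the user's extra fields (here custom_id) reduced to their first occurrence.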
Save the experiment¶
There are several ways to save sq.data.ExperimentData. For the most common usage pattern, we provide a pair of save and load functions in the predefined module. The sq.predefined.save_data_to_path function saves the dataset in a format that can be easily loaded back using sq.predefined.load_data_from_path.
import tempfile
from pathlib import Path
# path = Path("./test_data_v1")
tmpdir = tempfile.TemporaryDirectory()
path = Path(tmpdir.name)
# Create the path (with parents) if it does not exist already
path.mkdir(parents=True, exist_ok=True)
# Save the experiment with a one-liner 😉.
sq.predefined.save_data_to_path(path, exp_data, control_sequence)
Load the experiment¶
Loading back just the sq.data.ExperimentData is often not enough. We therefore load the data into a sq.utils.LoadedData dataclass instance. This dataclass provides access to the sq.data.ExperimentData, the sq.control.ControlSequence, a solver built from the provided Hamiltonian specification sq.predefined.HamiltonianSpec, and the jnp.ndarrays of control parameters, ideal unitary operators, and corresponding expectation values.
loaded_data = sq.predefined.load_data_from_path(
path,
hamiltonian_spec=sq.predefined.HamiltonianSpec(
method=sq.predefined.WhiteboxStrategy.TROTTER,
trotter_steps=1_000,
),
)
tmpdir.cleanup()
Note that in the case of a custom solver, the user has to manually load the experimental dataset back via the primitive load method of sq.data.ExperimentData and solve for the unitaries with the custom solver. The user can also bundle the data by manually instantiating sq.utils.LoadedData for further usage.