25th–29th March 2024
Warning
These notes were taken as the lectures were happening. I need to revise them for errors. All errors are mine, all credit goes to the lecturers.
Day 1
Lecture 1: Introduction to Python
Lecture 2: Theory of two mode Quantum Interferometers
We will consider the lossless / ideal case
We will work with the Mach-Zehnder interferometer
The goal is to count the number of particles at the outputs ( screens above ), and, based on those counts, estimate the phase.
The overall goal is to reduce the measurement uncertainty of the operator in our Sample Beam (SB). We need a Reference Beam (RB) to compare against, which gives the 2 modes of the interferometer.
Our system satisfies the Canonical Commutation Relations $[a, a^\dagger] = [b, b^\dagger] = 1$, with all other commutators vanishing.
We have
where we have the relation , which defines the angle .
We have
We require
The condition is solved as
The condition is solved as
Which results in the relation .
So we can write as
satisfies the properties
- $U \in SU(2)$, the Special Unitary Group of dimension $2$
Up to a global phase, this allows us to see this interferometer as applying rotations in space, as we are now working in the Special Orthogonal Group $SO(3)$.
- See the result by Schwinger
We define the Angular Momentum Operators ( the Schwinger representation ): $J_x = \frac{1}{2}(a^\dagger b + b^\dagger a)$, $J_y = \frac{1}{2i}(a^\dagger b - b^\dagger a)$, $J_z = \frac{1}{2}(a^\dagger a - b^\dagger b)$, whose Casimir invariant is $J^2 = \frac{N}{2}\left(\frac{N}{2} + 1\right)$ for $N = a^\dagger a + b^\dagger b$ particles.
Which satisfy the commutation relations $[J_i, J_j] = i\,\epsilon_{ijk} J_k$ and $[J^2, J_i] = 0$.
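These relations can be sanity-checked numerically in the smallest (spin-1/2) representation, where $J_k = \sigma_k / 2$. The following sketch is my own, not from the lecture; it verifies $[J_x, J_y] = i J_z$:

```python
# Numerical check of [Jx, Jy] = i Jz in the spin-1/2 representation,
# where J_k = sigma_k / 2 (Pauli matrices divided by 2).

def matmul(A, B):
    """2x2 complex matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def commutator(A, B):
    """[A, B] = AB - BA for 2x2 matrices."""
    AB, BA = matmul(A, B), matmul(B, A)
    return [[AB[i][j] - BA[i][j] for j in range(2)] for i in range(2)]

# Angular momentum operators: Pauli matrices over 2.
Jx = [[0, 0.5], [0.5, 0]]
Jy = [[0, -0.5j], [0.5j, 0]]
Jz = [[0.5, 0], [0, -0.5]]

C = commutator(Jx, Jy)
iJz = [[1j * Jz[i][j] for j in range(2)] for i in range(2)]
assert all(abs(C[i][j] - iJz[i][j]) < 1e-12 for i in range(2) for j in range(2))
print("[Jx, Jy] = i Jz verified")
```

The same check works for the cyclic permutations $[J_y, J_z] = i J_x$ and $[J_z, J_x] = i J_y$.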
We can write our operator as mapping
Our Phase Shift Operator is given by $e^{-i\phi J_z}$, which is the rotation around the $z$-axis by the angle $\phi$.
We use the Baker-Campbell-Hausdorff formula $e^{A} B\, e^{-A} = B + [A, B] + \frac{1}{2!}[A, [A, B]] + \dots$ to expand conjugated operators. This describes the evolution of the operators in the Heisenberg picture.
If we’re interested in the Expectation
Giving us the relations
We describe our Beam Splitter as $e^{-i\frac{\pi}{2} J_x}$
Which gives the matrix in 3D
Which allows us to correct the initial splitters as
The Mach-Zehnder interferometer aggregates these operations as $U_{MZ} = e^{i\frac{\pi}{2} J_x}\, e^{-i\phi J_z}\, e^{-i\frac{\pi}{2} J_x} = e^{-i\phi J_y}$, as per the image on top.
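The standard result is that the whole interferometer acts as a single rotation $e^{-i\phi J_y}$. A quick numerical check of this composition in the spin-1/2 representation (my own sketch, using the closed form $e^{-i\alpha\,\sigma/2} = \cos\frac{\alpha}{2} I - i \sin\frac{\alpha}{2}\,\sigma$ for a Pauli matrix $\sigma$):

```python
import math

def expJ(sigma, alpha):
    """e^{-i alpha sigma/2} = cos(alpha/2) I - i sin(alpha/2) sigma."""
    c, s = math.cos(alpha / 2), math.sin(alpha / 2)
    I = [[1, 0], [0, 1]]
    return [[c * I[i][j] - 1j * s * sigma[i][j] for j in range(2)] for i in range(2)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

sx = [[0, 1], [1, 0]]
sy = [[0, -1j], [1j, 0]]
sz = [[1, 0], [0, -1]]

phi = 0.7  # arbitrary test phase
# Beam splitter, phase shift, inverse beam splitter (right to left).
lhs = matmul(expJ(sx, -math.pi / 2), matmul(expJ(sz, phi), expJ(sx, math.pi / 2)))
rhs = expJ(sy, phi)  # single rotation about Jy
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12 for i in range(2) for j in range(2))
print("Mach-Zehnder = rotation about Jy verified")
```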
Setup
We set up $N$ particles in $2$ modes, creating the $(N+1)$-dimensional space spanned by the basis $|\tfrac{N}{2}, m\rangle$, for $m = -\tfrac{N}{2}, \dots, \tfrac{N}{2}$.
For simplicity we write $|m\rangle$ instead of $|\tfrac{N}{2}, m\rangle$; $J_\pm = J_x \pm i J_y$ act as the raising operator and lowering operator, respectively.
We define the polarised state as $|-\tfrac{N}{2}\rangle$, i.e. all particles are in the ground state. This is an eigenstate of $J_z$, with eigenvalue $-\tfrac{N}{2}$.
- Our Phase Shift represents a rotation around the $z$-axis ( green above )
- Our Beam Splitter represents a rotation around the $x$-axis ( red above )
Hence
Hence by making the measurement we can recover information on $\phi$.
- Apply Maximum Likelihood Estimation with $m$ observations of the above, where we want the value of $\phi$.
- Find the parameterized distribution for the errors of $\hat{\phi}$.
Letting $\hat{\phi}$ be our estimate of the true value of $\phi$, by the Central Limit Theorem the estimated error scales as $1/\sqrt{m}$ in the number of observations $m$.
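A toy simulation of this scaling (my own sketch; the outcome probability $p = (1+\cos\phi)/2$ and the inversion estimator are assumptions, not from the lecture):

```python
import math, random

# Simulate m binary measurements whose mean encodes cos(phi), estimate
# phi by inversion, and watch the spread of the estimate shrink ~ 1/sqrt(m).
random.seed(0)
phi = 0.8                      # hypothetical true phase
p = (1 + math.cos(phi)) / 2    # assumed probability of outcome +1

def estimate(m):
    """Estimate phi from m simulated +-1 outcomes via acos of the mean."""
    mean = sum(1 if random.random() < p else -1 for _ in range(m)) / m
    return math.acos(max(-1.0, min(1.0, mean)))

def spread(m, trials=200):
    """Empirical standard deviation of the estimator over many repeats."""
    est = [estimate(m) for _ in range(trials)]
    mu = sum(est) / trials
    return math.sqrt(sum((e - mu) ** 2 for e in est) / trials)

# Quadrupling m should roughly halve the error.
s100, s400 = spread(100), spread(400)
print(f"error(m=100) = {s100:.4f}, error(m=400) = {s400:.4f}")
assert s400 < s100
```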
Exercises: Introduction to Python
Day 2
Lecture 1: More Python
No notes taken
Lecture 2: Monte Carlo Methods
Setup
We consider a well model with Hamiltonian $H$, and we want to build a high-precision model of how it behaves.
We want to describe the transition rate matrix for our Monte Carlo process.
We describe our system as a set of spherical particles in a box
We define
- Our partition function as $Z = \sum_{x} e^{-\beta E(x)}$, where $x$ represents a particle configuration, and
- The Potential is described as
We can consider a few different cases:
- Classical fluids/solids
- Spin models, such as the Ising Model ( $H = -J \sum_{\langle i, j \rangle} s_i s_j$ )
- Lattice fields
We can consider the system
The Metropolis–Hastings algorithm consists of two steps:
- Generate a new sample based on our known but incomplete information on how the system evolves
- Accept/Reject the new sample based on its properties. The probability of acceptance is usually based on some invariants of the configuration that we want to preserve. This could be
- ensuring that the new energy is similar to the previous energy
- The momentum is maintained through time
- etc
Setup
- Data $x$.
- Parameters $\theta$. The goal is to find $p(\theta \mid x)$ ( i.e. find the true underlying parameters based on our observations )
By Bayes' theorem, $p(\theta \mid x) \propto p(x \mid \theta)\, p(\theta)$. We assume that
- is normally distributed
- , where is the Expectation of given a certain distribution of . This can be as simple as .
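A minimal sketch of this Bayesian setup, assuming a Gaussian likelihood with unit variance and a flat prior (the data values and grid are made up by me):

```python
import math

# Posterior over a parameter theta on a grid, proportional to the
# Gaussian likelihood since the prior is flat.
data = [1.9, 2.2, 2.0, 2.4, 1.8]   # made-up observations

def log_likelihood(theta):
    """log p(x | theta) for i.i.d. Gaussian observations, unit variance."""
    return -0.5 * sum((x - theta) ** 2 for x in data)

grid = [i * 0.01 for i in range(0, 401)]            # theta in [0, 4]
weights = [math.exp(log_likelihood(t)) for t in grid]
Z = sum(weights)                                     # normalisation constant
posterior = [w / Z for w in weights]

# With a flat prior the posterior mean sits at the sample mean of the data.
post_mean = sum(t * p for t, p in zip(grid, posterior))
print(f"posterior mean ≈ {post_mean:.3f}")
```

The normalisation $Z$ here plays the same role as the partition function in statistical mechanics: a sum of unnormalised weights over all configurations.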
I’m not familiar with the concept of a partition function in statistical mechanics? Help me ❓
In quantum systems, we can have , using the position basis. We always have that
( What is a mean field ?? Help me ❓)
With this
This describes our “evolution” of the algorithm. We want convergence to the target distribution, but without getting stuck locally. In order to iterate, we need to calculate the acceptance ratio. This is calculated as
Improvement by considering . We get that ( I didn’t get this )
Lecture 3: Phase estimation numerics
We use the Pauli matrices.
With this we get
We make a single measurement to get
By measuring on we get the expectation
Day 3
Lecture 1: Phase estimation numerics ( continued )
- No notes taken. I need to add the blackboard photos taken during class
Lecture 2: Monte Carlo Methods ( continued )
We want to calculate the integral
Where
We can reduce the Variance by rewriting
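A toy illustration of this variance-reduction idea (my own example, not the lecturer's): rewrite the integrand as $f = (f/g)\, g$ and sample from $g$, a density that roughly follows $f$.

```python
import math, random

# Estimate I = ∫_0^1 e^x dx = e - 1 two ways and compare variances.
random.seed(1)
N = 20000
target = math.e - 1

def variance(xs):
    mu = sum(xs) / len(xs)
    return sum((x - mu) ** 2 for x in xs) / len(xs)

# Plain Monte Carlo: x ~ Uniform(0, 1), average f(x) = e^x.
plain = [math.exp(random.random()) for _ in range(N)]

# Importance sampling with g(x) = (1 + x) / 1.5, a rough linear
# approximation of e^x on (0, 1); sample from g via the inverse CDF.
def sample_g():
    u = random.random()
    return -1 + math.sqrt(1 + 3 * u)

# Average the weighted values f(x) / g(x), which vary much less than f.
weighted = [math.exp(x) / ((1 + x) / 1.5) for x in (sample_g() for _ in range(N))]

print(f"plain:      mean={sum(plain)/N:.4f}, var={variance(plain):.4f}")
print(f"importance: mean={sum(weighted)/N:.4f}, var={variance(weighted):.4f}")
assert variance(weighted) < variance(plain)
```

Both estimators are unbiased for $e - 1 \approx 1.718$, but the weighted one has a flatter integrand and hence a smaller variance.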
As a Markov Chain
Our transition rate matrix gives us
It is unclear to me why detailed balance is needed, i.e. equality without the sums ( see https://en.wikipedia.org/wiki/Detailed_balance#Reversible_Markov_chains ). Help me ❓
Applying the Metropolis–Hastings algorithm
With the detailed balance equation above satisfied, we can apply the Metropolis–Hastings algorithm with
- Acceptance rate given by $A(x \to x') = \min\left(1, \frac{\pi(x')}{\pi(x)}\right)$, with $\pi$ being the stationary distribution of the Markov Chain.
- New state generated by $x' = x + \delta$, where we generate $\delta \sim \mathcal{N}(0, \sigma^2)$ ( i.e. normally distributed ).
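The two ingredients above can be sketched as a minimal Metropolis–Hastings sampler; the standard-normal target is my choice for illustration:

```python
import math, random

# Random-walk Metropolis–Hastings: Gaussian proposal x' = x + delta,
# acceptance probability min(1, pi(x') / pi(x)).
random.seed(2)

def pi_unnorm(x):
    """Unnormalised target density: standard normal."""
    return math.exp(-0.5 * x * x)

def mh_chain(n_steps, sigma=1.0):
    x, samples = 0.0, []
    for _ in range(n_steps):
        x_new = x + random.gauss(0, sigma)                  # propose x' = x + delta
        if random.random() < min(1.0, pi_unnorm(x_new) / pi_unnorm(x)):
            x = x_new                                        # accept; else keep x
        samples.append(x)
    return samples

samples = mh_chain(50000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"mean ≈ {mean:.3f}, variance ≈ {var:.3f}")  # should approach 0 and 1
```

Note that only the ratio $\pi(x')/\pi(x)$ is needed, so the normalisation of $\pi$ (the partition function) never has to be computed.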
Examples
In all the approaches below we apply Markov Chain Monte Carlo:
- We initialise the configuration
- We take a step using the transition matrix and acceptance rate, à la the Metropolis–Hastings algorithm
- We record an observable
Ising
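A minimal Metropolis sketch for a 1D Ising chain with periodic boundaries (the couplings, temperature, and sizes are my own choices; the lecture only named the model):

```python
import math, random

# Metropolis single-spin-flip dynamics for a 1D Ising chain,
# J = 1, no external field, periodic boundary conditions.
random.seed(3)
L, beta, steps = 50, 0.3, 20000
spins = [random.choice([-1, 1]) for _ in range(L)]

def delta_E(i):
    """Energy change from flipping spin i against its two neighbours."""
    left, right = spins[(i - 1) % L], spins[(i + 1) % L]
    return 2 * spins[i] * (left + right)

for _ in range(steps):
    i = random.randrange(L)
    dE = delta_E(i)
    # Accept the flip with probability min(1, e^{-beta * dE}).
    if dE <= 0 or random.random() < math.exp(-beta * dE):
        spins[i] *= -1

m = sum(spins) / L  # magnetisation per spin, the recorded observable
print(f"magnetisation per spin ≈ {m:.3f}")
```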
VMC
aka Variational Monte Carlo
Day 4
Lecture 1: Learning the noise fingerprint of quantum devices
Presentation based on the paper Learning the noise fingerprint of quantum devices.
Overall idea:
- Run the circuit above in different quantum computers
- This is done in batches of 1000 runs, to get an observed probability distribution function.
- We consider the probability distributions as our features, used to predict the machine they ran on.
- We use Support Vector Machines to make this prediction, which falls under binary classification in Machine Learning.
Lecture 2:
Added the information directly to the Grover’s Algorithm /
- Add a note on the quantum threshold theorem
Day 5
Presentation on Qiskit