# Workflow (QAOA Algorithm)

In the previous article in this series, we saw the QAOA algorithm and the MaxCut problem with a detailed explanation. You can look into the same through this link: QAOA explained with the MaxCut problem.

Here we will discuss the workflow of the algorithm in simple terms.

- We begin with an initial trial state, a uniform superposition of the 2ⁿ basis states for the corresponding n bits: |ψ> = ∑ A(z)|z>, where A(z) is the amplitude of the basis state |z> and |A(z)|² is its probability. Since the state is normalised, A(z) = 1/(2ⁿ)^(1/2).
- We initialize the 2p parameters β and γ (these are the angles for the gates we discussed in the last article). This step is performed on a classical computer.
- Now we construct a new state by applying the alternating series of unitary gates to the initial trial state, |γ,β> = U(B,βₚ)U(C,γₚ)……U(B,β₂)U(C,γ₂)U(B,β₁)U(C,γ₁)|ψ>, with the angles taking the values defined in the previous step. We get an updated state |ψ>ɴᴇᴡ = |γ,β>. This step is performed on a quantum computer.
- Next we use the quantum computer again to measure the state |γ,β> in the computational basis, obtaining a bit string |z>.
- We use a classical computer to evaluate the cost C(z) = ∑ Cₐ(z), where the summation runs from 1 to m and m is the number of clauses.
- We keep repeating the steps above. Measuring |γ,β> many times yields a distribution of states |z>. Each state |z> corresponds to an eigenvalue C(z), so we also get a distribution of eigenvalues, with the largest being **C(z)ᴍᴀx** and the average being the expectation value **<γ,β|C|γ,β>** of the operator C defined in the previous point. The desired result, for the given 2p parameters, is the largest eigenvalue **C(z)ᴍᴀx** together with its state |z>ᴍᴀx.
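The quantum steps above can be sketched as a small statevector simulation in plain NumPy. Everything here is an illustrative assumption, not code from the original series: the triangle graph, the function names, and the direct computation of the expectation value from the exact statevector (a real device would estimate it from repeated measurements).

```python
import numpy as np
from itertools import product

# Hypothetical example: MaxCut on a triangle graph (an assumption for illustration).
n = 3
edges = [(0, 1), (1, 2), (0, 2)]

def maxcut_cost(bits):
    """C(z): number of edges cut by the bit assignment z."""
    return sum(1 for i, j in edges if bits[i] != bits[j])

# Cost C(z) for each of the 2^n basis states, in lexicographic order.
costs = np.array([maxcut_cost(b) for b in product([0, 1], repeat=n)], dtype=float)

def qaoa_state(gammas, betas):
    """Apply p layers of U(C, gamma) then U(B, beta) to the uniform superposition."""
    psi = np.full(2**n, 1 / np.sqrt(2**n), dtype=complex)  # |psi> = sum_z A(z)|z>
    for gamma, beta in zip(gammas, betas):
        psi = np.exp(-1j * gamma * costs) * psi  # U(C, gamma): a diagonal phase
        for q in range(n):                       # U(B, beta): e^{-i beta X} on each qubit
            psi = np.moveaxis(psi.reshape([2] * n), q, 0)
            a, b = psi[0].copy(), psi[1].copy()
            psi[0] = np.cos(beta) * a - 1j * np.sin(beta) * b
            psi[1] = -1j * np.sin(beta) * a + np.cos(beta) * b
            psi = np.moveaxis(psi, 0, q).reshape(2**n)
    return psi

def expectation(gammas, betas):
    """<gamma,beta| C |gamma,beta>, computed exactly from the statevector."""
    probs = np.abs(qaoa_state(gammas, betas)) ** 2
    return float(probs @ costs)
```

At zero angles the circuit does nothing, so `expectation([0.0], [0.0])` is just the average cost over all bit strings; nonzero angles shift probability toward high-cost strings.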

Next we select a new set of 2p parameters β and γ and repeat the quantum and classical steps. This gives a distribution of **C(z)ᴍᴀx** values over the parameter sets, and we select the largest of them all. The termination condition depends on the optimization criterion and on how the angles are being updated.

A difficulty arises in the optimization over the 2p parameters, since the objective function is evaluated on a quantum computer: a brute-force search over a 2p-dimensional grid grows as O(r^(2p)), where r is the number of grid points per parameter. Instead, we can update the angles with any classical optimization method, such as gradient descent, or use the optimizers in the scipy library for the same.
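As a sketch of this classical outer loop, here is a toy p = 1 optimization of the two angles with `scipy.optimize.minimize`. The problem instance (MaxCut on a single edge), the starting point, and the choice of the Nelder-Mead optimizer are all assumptions made for illustration:

```python
import numpy as np
from scipy.optimize import minimize

# Toy instance: MaxCut on a single edge (two qubits), p = 1.
# costs[z] = 1 if the two bits differ (the edge is cut), else 0.
costs = np.array([0.0, 1.0, 1.0, 0.0])

def neg_expectation(angles):
    """-<gamma,beta|C|gamma,beta>; negated because minimize() minimizes."""
    gamma, beta = angles
    psi = np.full(4, 0.5, dtype=complex)     # uniform superposition |++>
    psi = np.exp(-1j * gamma * costs) * psi  # U(C, gamma): diagonal phase
    rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                   [-1j * np.sin(beta), np.cos(beta)]])
    psi = np.kron(rx, rx) @ psi              # U(B, beta) on both qubits
    probs = np.abs(psi) ** 2
    return -float(probs @ costs)

# Classical outer loop: a derivative-free optimizer updates the 2p angles,
# avoiding the O(r^(2p)) cost of an exhaustive grid search.
res = minimize(neg_expectation, x0=[0.5, 0.5], method="Nelder-Mead")
best_expectation = -res.fun  # approaches 1 for this instance
```

The quantum evaluation here is simulated classically; in practice `neg_expectation` would submit a circuit to a device or simulator and return an estimate from measurement samples.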

Next, I will be programming QAOA for MaxCut on various graph examples using the **Blueqat Quantum Library** and plotting the results for the problems.

# References:

- Edward Farhi, Jeffrey Goldstone, and Sam Gutmann. A Quantum Approximate Optimization Algorithm. arXiv:1411.4028.
- UMD Course on QAOA
- Evaluating Quantum Approximate Optimization Algorithm