
[review article] Extractive sentence summarization performed with XY-QAOA and L-VQE on a trapped-ion quantum computer.

Yuichiro Minato

2022/10/02 13:09

I found a rather fun paper by JPMorgan on solving an optimization problem with a trapped-ion quantum computer. The sentence-extraction method is called ES / MMR (Extractive Summarization / Maximal Marginal Relevance), but it looks very similar to a combinatorial optimization problem such as portfolio optimization and can be solved with QAOA.

Reference Papers

Constrained Quantum Optimization for Extractive Summarization on a Trapped-ion Quantum Computer

Pradeep Niroula, Ruslan Shaydulin, Romina Yalovetzky, Pierre Minssen, Dylan Herman, Shaohan Hu, and Marco Pistoia. JPMorgan Chase Bank, N.A., New York, NY; Joint Center for Quantum Information and Computer Science, NIST/University of Maryland, College Park, MD; Joint Quantum Institute, University of Maryland, College Park, MD. Dated June 14, 2022.

https://arxiv.org/abs/2206.06290

Extractive Summarization

There seem to be several methods for summarizing a document while preserving its meaning. This time, they focus on the Extractive Summarization (ES) method, whose formulation closely resembles a QUBO, and the formulation seems to be taken from the following paper.

A Study of Global Inference Algorithms in Multi-Document Summarization

Ryan McDonald

Google Research, 76 Ninth Avenue, New York, NY 10011. ryanmcd@google.com

https://ryanmcd.github.io/papers/globsumm.pdf

MMR seems to be the standard ES algorithm in this area, and I am interested in it because it is relatively close to the QUBO formulation. I would like to look at the whole picture when I have time, but in this article I will look at the formulation and how it is solved in this paper.
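For reference, here is a minimal sketch of greedy MMR selection. This is my own illustration of the standard MMR idea, not code from the paper, and the function names and toy numbers are made up:

```python
import numpy as np

def mmr_select(relevance, similarity, m, lam=0.7):
    """Greedily pick m sentences, trading off relevance against
    redundancy with the sentences already selected (standard MMR idea)."""
    n = len(relevance)
    selected, remaining = [], list(range(n))
    while len(selected) < m and remaining:
        def score(i):
            # redundancy = max similarity to anything already chosen
            red = max((similarity[i][j] for j in selected), default=0.0)
            return lam * relevance[i] - (1 - lam) * red
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

# toy example: 5 sentences, pick 2
rel = np.array([0.9, 0.7, 0.8, 0.3, 0.5])
sim = np.array([[0.0, 0.8, 0.1, 0.2, 0.3],
                [0.8, 0.0, 0.2, 0.1, 0.4],
                [0.1, 0.2, 0.0, 0.3, 0.2],
                [0.2, 0.1, 0.3, 0.0, 0.1],
                [0.3, 0.4, 0.2, 0.1, 0.0]])
print(mmr_select(rel, sim, m=2))  # e.g. [0, 2]
```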

Formulation

The formulation is as in the above paper. It has a relatively simple structure.

µ is the information content assigned to each sentence, and β is the matrix of pairwise similarities between sentences. At the bottom is the constraint that the summary contain exactly M of the N sentences. That last equation can be treated as a hard constraint in XY-QAOA, so the formulation differs slightly from the soft-constraint case.
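The equations themselves appear as an image in the original post. Reconstructed from the description above (essentially McDonald's formulation), the problem should look roughly like the following, where $x_i \in \{0,1\}$ indicates whether sentence $i$ is included and $\lambda$ is my shorthand for a trade-off weight between informativeness and redundancy:

$$\max_{x \in \{0,1\}^N} \ \sum_{i=1}^{N} \mu_i x_i \;-\; \lambda \sum_{i<j} \beta_{ij}\, x_i x_j \qquad \text{subject to} \quad \sum_{i=1}^{N} x_i = M$$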

Constraints

The constraint is on the number of sentences extracted. Besides solving with QAOA and a soft (penalty) constraint, they also use an XY mixer to enforce it as a hard constraint. They also try Layer VQE (L-VQE) as the ansatz; L-VQE is not discussed deeply in this review and will be covered at another time. The idea of using a shallow-depth circuit that prepares a Dicke state as the initial state is also considered, which is quite nice. A small classical illustration of the soft versus hard constraint handling follows below.
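To make the soft vs. hard distinction concrete, here is a small classical brute-force sketch (my own illustration, not from the paper). The hard-constraint version only scores bitstrings with Hamming weight M, which is exactly the subspace an XY mixer keeps the QAOA state in, while the soft version adds a penalty term to the cost; the penalty weight here is an arbitrary assumption:

```python
import itertools
import numpy as np

def es_objective(x, mu, beta):
    """Extractive-summarization objective: reward information content,
    penalize pairwise similarity among the selected sentences."""
    x = np.asarray(x)
    return mu @ x - x @ np.triu(beta, k=1) @ x

def best_hard(mu, beta, m):
    """Hard constraint: search only bitstrings with Hamming weight m."""
    n = len(mu)
    candidates = (x for x in itertools.product([0, 1], repeat=n) if sum(x) == m)
    return max(candidates, key=lambda x: es_objective(x, mu, beta))

def best_soft(mu, beta, m, penalty=10.0):
    """Soft constraint: penalize deviation from m selected sentences."""
    n = len(mu)
    def score(x):
        return es_objective(x, mu, beta) - penalty * (sum(x) - m) ** 2
    return max(itertools.product([0, 1], repeat=n), key=score)

rng = np.random.default_rng(0)
n, m = 6, 3
mu = rng.random(n)
beta = rng.random((n, n))
beta = (beta + beta.T) / 2  # symmetric similarity matrix
print("hard constraint:", best_hard(mu, beta, m))
print("soft constraint:", best_soft(mu, beta, m))
```

With a sufficiently large penalty the two agree; with a weak penalty the soft version can drift away from exactly m sentences, which is the usual argument for constraint-preserving mixers.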

Calculation Results

QAOA, XY-QAOA, and L-VQE are run on real devices at 14 and 20 qubits. I thought it was rare to run QAOA with hard constraints on a real machine. As for the results, the hard constraints seem a little too demanding to be executed fully on a real machine, but a clear trend could still be seen.

Considerations

The results of QAOA, and especially XY-QAOA, on the real machine were not as good as I expected, but the problem setting is interesting and I found this paper quite enjoyable. The QAOA runs on the actual Honeywell machine were also quite promising. That's all for now.
