# Stating the Problem¶

Once properly stated, a problem can be formulated as an objective function to be solved on a D-Wave solver. The example in the Introduction chapter states a communication-networks problem as the well-known graph problem, vertex cover.

D-Wave provides several resources containing many reference examples:

- Ocean software’s collection of code examples on GitHub.
- D-Wave corporate website papers (for example, [dwave2]) and links to user applications.
- Leap Jupyter notebooks.

For beginners, Ocean documentation provides a series of code examples, for different levels of experience, that explain various aspects of quantum computing in solving problems.

This chapter provides a sample of problems in various fields, and the available resources for each (at the time of writing); having a relevant reference problem may enable you to use similar solution steps when solving your own problems on D-Wave solvers.

Many of these are discrete optimization problems, also known as combinatorial optimization problems: the optimization of an objective function defined over a set of discrete values, such as Booleans.

Keep in mind that there are different ways to model a given problem; for example, a constraint satisfaction problem (CSP) can have various domains, variables, and constraints. Model selection can affect solution performance, so it may be useful to consider various approaches.

| Problem | Beginner Content Available? | Solvers | Content |
|---|---|---|---|
| Circuits & Fault Diagnosis | | QPU, hybrid | Code, papers |
| Database Queries (SAT Filters) | | QPU | Papers |
| Factoring | Yes | QPU | Code, papers |
| Finance | | QPU, hybrid | Papers |
| Graph Partitioning | Yes | QPU, hybrid | Code, papers |
| Map Coloring | Yes | QPU, hybrid | Code, paper |
| Machine Learning | | QPU | Papers |
| Protein Folding | | QPU | Papers |
| Scheduling | Yes | QPU, hybrid | Code, papers |
| Traffic Flow | | QPU, hybrid | Papers |

Whether or not you see a relevant problem here, it’s recommended that you check D-Wave’s collection of code examples on GitHub and the corporate website for the latest examples of problems from all fields of study and industry.

## Circuits & Fault Diagnosis¶

Fault diagnosis attempts to quickly localize failures as soon as they are detected in systems such as sensor networks, process monitoring, and safety monitoring. Circuit fault diagnosis attempts to identify failed gates during manufacturing, under the assumption that gate failure is rare enough that the minimum number of gates failing is the most likely cause of the detected problem.

The Example Reformulation: Circuit Fault Diagnosis section in the Reformulating a Problem chapter shows the steps of solving a circuit fault diagnosis problem on a D-Wave QPU.
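As a minimal illustration of the penalty-model idea behind these formulations, the snippet below encodes a single AND gate, \(z = x_1 \wedge x_2\), as the standard QUBO penalty and verifies it by brute force in plain Python (the brute-force check stands in for a sampler; a full diagnosis problem would sum such penalties over every gate in the circuit):

```python
from itertools import product

def and_gate_penalty(x1, x2, z):
    # Standard QUBO penalty for the constraint z = x1 AND x2:
    # 0 for every consistent configuration, >= 1 for every fault.
    return 3*z + x1*x2 - 2*x1*z - 2*x2*z

for x1, x2 in product([0, 1], repeat=2):
    assert and_gate_penalty(x1, x2, x1 & x2) == 0        # consistent states
    assert and_gate_penalty(x1, x2, 1 - (x1 & x2)) >= 1  # faulty states
```

Because valid configurations sit at zero energy and faults raise the energy, minimizing the total penalty over a circuit finds the smallest set of gate faults consistent with the observed outputs.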

### Code Examples¶

Ocean documentation example: Multiple-Gate Circuit

Solves a logic circuit problem using Ocean tools to demonstrate solving a CSP on a D-Wave QPU solver.

dwave-examples code example: circuit-fault-diagnosis

Demonstrates the use of D-Wave solvers to solve a three-bit multiplier circuit.

dwave-examples code example: circuit-equivalence

Verifies equivalence of two representations of electronic circuits using a discrete quadratic model (DQM).

### Papers¶

- [Bia2016] discusses embedding fault diagnosis CSPs on the D-Wave system.
- [Bis2017] discusses a problem of diagnosing faults in an electrical power-distribution system.
- [Pap1976] discusses decomposing complex systems for the problem of generating tests for digital-faults detection.
- [Per2015] maps fault diagnosis to a QUBO and embeds onto a QPU.

## Database Queries (SAT Filters)¶

A satisfiability (SAT) filter is a small data structure that enables fast querying over a huge dataset by allowing for false positives (but not false negatives).

## Factoring¶

The factoring problem is to decompose a number into its factors. There is no known method to quickly factor large integers—the complexity of this problem has made it the basis of public-key cryptography algorithms.
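The optimization view of factoring can be sketched in a few lines of plain Python: minimize \(E = (N - pq)^2\) over the bits of candidate factors \(p\) and \(q\), so that zero energy means an exact factorization. This brute-force sketch stands in for a sampler; real QUBO formulations must additionally reduce the higher-order terms produced by expanding the square over bit products.

```python
from itertools import product

N = 15  # number to factor

def energy(bits):
    # Three bits each for p and q, offset by 1 so candidate factors are 1..8
    p = 1 + bits[0] + 2 * bits[1] + 4 * bits[2]
    q = 1 + bits[3] + 2 * bits[4] + 4 * bits[5]
    return (N - p * q) ** 2

best = min(product([0, 1], repeat=6), key=energy)
p = 1 + best[0] + 2 * best[1] + 4 * best[2]
q = 1 + best[3] + 2 * best[4] + 4 * best[5]
assert energy(best) == 0 and p * q == N  # zero energy <=> p*q = N
```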

### Code Examples¶

dwave-examples code example: factoring

Demonstrates the use of D-Wave QPU solvers to solve a small factoring problem.

dwave-examples code example: factoring-notebook

A Jupyter Notebook that demonstrates the use of D-Wave QPU solvers to solve a small factoring problem.

### Papers¶

- [Dwave3] discusses integer factoring in the context of using the D-Wave Anneal Offsets feature; see also the Anneal Offsets section.
- [Bur2002] discusses factoring as optimization.

## Finance¶

Portfolio optimization is the problem of optimizing the allocation of a budget to a set of financial assets.
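As a toy illustration of the Markowitz-style trade-off between expected return and covariance risk (the returns, covariances, and risk weight below are made-up numbers, and brute force stands in for a solver):

```python
from itertools import product

returns = [0.10, 0.07, 0.12]            # expected returns (made up)
cov = [[0.05, 0.01, 0.04],              # return covariances (made up)
       [0.01, 0.02, 0.01],
       [0.04, 0.01, 0.06]]
theta = 1.0                             # risk-aversion weight

def energy(x):
    # QUBO objective: risk minus return over a binary asset selection x
    ret = sum(r * xi for r, xi in zip(returns, x))
    risk = sum(cov[i][j] * x[i] * x[j]
               for i in range(3) for j in range(3))
    return -ret + theta * risk

best = min(product([0, 1], repeat=3), key=energy)
```

Varying `theta` shifts the optimum between high-return and low-risk selections; real formulations also add a budget constraint as a penalty term.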

### Papers¶

- [Els2017] discusses using Markowitz’s optimization of the financial portfolio selection problem on the D-Wave system.
- [Gra2021] uses portfolio optimization as a case study by which to benchmark quantum annealing controls.
- [Mug2020] implements dynamic portfolio optimization on quantum and quantum-inspired algorithms and compares them with D-Wave hybrid solvers.
- [Oru2019] looks at forecasting financial crashes.
- [Ros2016a] discusses solving a portfolio optimization problem on the D-Wave system.

## Graph Partitioning¶

Graph partitioning is the problem of reducing a graph into mutually exclusive sets of nodes.
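A minimal two-way partition can be written as a QUBO in plain Python (the graph and penalty weight below are invented, and brute force replaces a sampler): minimize the number of cut edges plus a penalty that keeps the two halves equal in size.

```python
from itertools import product

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # small example graph
n, gamma = 4, 2                                    # nodes, balance weight

def energy(x):
    # Each term x_i + x_j - 2*x_i*x_j contributes 1 exactly when edge (i, j)
    # crosses the partition; the balance term penalizes unequal halves.
    cut = sum(x[i] + x[j] - 2 * x[i] * x[j] for i, j in edges)
    balance = (sum(x) - n // 2) ** 2
    return cut + gamma * balance

best = min(product([0, 1], repeat=n), key=energy)
```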

### Code Examples¶

- dwave-examples code example: graph-partitioning
Solves a graph partitioning problem on the QPU.

- dwave-examples code example: graph-partitioning-dqm
Solves a graph partitioning problem on Leap’s DQM solver.

- dwave-examples code example: maximum-cut
Solves a maximum cut problem on the QPU.

- dwave-examples code example: clustering
Identifies clusters in a data set.

- dwave-examples code example: immunization-strategy
Fragments a population into separate groups via a “separator” on Leap’s DQM solver.

## Map Coloring¶

Map coloring is an example of a constraint satisfaction problem (CSP). CSPs require that all of a problem’s variables be assigned values, from a finite domain, that satisfy all constraints. The map-coloring CSP is to assign a color to each region of a map such that any two regions sharing a border have different colors.

The Example Reformulation: Map Coloring section in the Reformulating a Problem chapter is an example of map coloring on the D-Wave system.
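For illustration, here is the CSP itself solved by brute force in plain Python (the region names and borders are made up; a QPU formulation would instead one-hot encode each region’s color into binary variables and penalize constraint violations):

```python
from itertools import product

regions = ["A", "B", "C", "D"]
borders = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]

def violations(coloring):
    # Number of borders whose two regions share a color
    return sum(coloring[u] == coloring[v] for u, v in borders)

solution = None
for colors in product(range(3), repeat=len(regions)):
    candidate = dict(zip(regions, colors))
    if violations(candidate) == 0:  # a valid 3-coloring
        solution = candidate
        break
```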

### Code Examples¶

Ocean documentation example: Large Map Coloring

Demonstrates an out-of-the-box use of an Ocean classical-quantum hybrid sampler solving a problem of arbitrary structure and size.

Ocean documentation example: Map Coloring: Hybrid DQM Sampler

Demonstrates Leap’s hybrid discrete quadratic model (DQM) solver.

Ocean documentation example: Map Coloring

Demonstrates solving a map-coloring CSP on a QPU.

dwave-examples code example: map-coloring

Demonstrates the use of D-Wave QPU solvers to solve a map-coloring problem.

## Machine Learning¶

Machine learning algorithms operate by constructing a model with parameters that can be
learned from a large amount of example input so that the trained model can make
predictions about unseen data. Most implementations to date have been based on deterministic
machine learning models such as feed-forward neural networks. The real world, however,
is nondeterministic and filled with uncertainty. *Probabilistic*
models explicitly handle this uncertainty by accounting for gaps in our knowledge and
errors in data sources.

In probabilistic models, *probability distributions* represent the unobserved quantities in
a model (including noise effects) and how they relate to the data. The distribution of
the data is approximated based on a finite set of *samples*. The model infers from the
observed data, and learning occurs as it transforms the *prior* distributions, defined
before observing the data, into *posterior* distributions, defined afterward. If the
training process is successful, the learned distribution resembles the actual distribution
of the data to the extent that the model can make correct predictions about unseen
situations—correctly interpreting a previously unseen handwritten digit, for example.

A *Boltzmann distribution* is a discrete energy-based distribution that
defines probability, \(p\), for each of the discrete states in a binary vector.
For a set of \(N\) binary random variables, \(\vc{x}\), the Boltzmann
distribution defines a probability for each possible state that \(\vc{x}\)
can take using[1]

\[p(\vc{x}) = \frac{1}{Z(\theta)} \exp\left(-E(\vc{x};\theta)\right),\]

[1] | \(\beta\) is omitted from this equation because usually, in the context of machine learning, it is assumed to be 1. |

where \(E(\vc{x};\theta)\) is an energy function parameterized by \(\theta\) and

\[Z(\theta) = \sum_{\vc{x}} \exp\left(-E(\vc{x};\theta)\right)\]

is the *normalizing coefficient*, also known as the *partition function*, that ensures that \(p(\vc{x})\) sums to
1 over all the possible states of \(\vc{x}\); that is,

\[\sum_{\vc{x}} p(\vc{x}) = 1.\]

Note that because of the negative sign for energy, \(E\), the states with high probability correspond to states with low energy.

The energy function \(E(\vc{x};\theta)\) can be represented as a QUBO: the linear coefficients bias the probability of individual binary variables in \(\vc{x}\) and the quadratic coefficients represent the correlation weights between the elements of \(\vc{x}\). The D-Wave architecture, which natively processes information through the Ising/QUBO models (linear coefficients are represented by qubit biases and quadratic coefficients by coupler strengths), can help discrete energy-based machine learning.
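To make the distribution concrete, the following computes Boltzmann probabilities for a two-variable QUBO energy (coefficients chosen purely for illustration) and checks that probability mass concentrates on low-energy states:

```python
from itertools import product
from math import exp

def E(x):
    # Illustrative two-variable QUBO energy
    return x[0] + x[1] - 2 * x[0] * x[1]

states = list(product([0, 1], repeat=2))
Z = sum(exp(-E(x)) for x in states)        # partition function
p = {x: exp(-E(x)) / Z for x in states}    # Boltzmann probabilities

# p sums to 1, and the two ground states (energy 0) are most probable
assert abs(sum(p.values()) - 1.0) < 1e-12
assert p[(0, 0)] == p[(1, 1)] > p[(0, 1)] == p[(1, 0)]
```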

Probabilistic Sampling: RBM

A Boltzmann machine is a stochastic recurrent neural network that defines a probability distribution over binary visible and hidden variables. A restricted Boltzmann machine has no intra-connections among the visible and hidden variables.

An RBM is shown in Figure 31.

Energy-Based Models

Machine learning with energy-based models (EBMs) minimizes an objective function by lowering scalar energy for configurations of variables that best represent dependencies for probabilistic and nonprobabilistic models.

For an RBM as a generative model, for example, where the gradient needed to maximize the log-likelihood of the data is intractable (due to the partition function in the energy objective function), you can use samples from the D-Wave system in place of standard Gibbs sampling. The training then has steps like these:

a. Initialize variables.
b. Teach visible nodes with training samples.
c. Sample from the D-Wave system.
d. Update and repeat as needed.
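The loop below sketches those steps in plain Python, with a random stub standing in for the D-Wave sampler and with the data’s hidden states given directly rather than sampled from \(p(h|v)\); all helper names are illustrative, not Ocean APIs. The update follows the standard log-likelihood gradient, \(\Delta w_{ij} = \eta\,(\langle v_i h_j \rangle_{\text{data}} - \langle v_i h_j \rangle_{\text{model}})\).

```python
import random
random.seed(0)

def correlations(samples, nv, nh):
    # Average v_i * h_j over a list of (visible, hidden) sample pairs
    m = [[0.0] * nh for _ in range(nv)]
    for v, h in samples:
        for i in range(nv):
            for j in range(nh):
                m[i][j] += v[i] * h[j]
    return [[s / len(samples) for s in row] for row in m]

def sample_model(nv, nh, num_reads):
    # Stub for drawing Boltzmann samples from the QPU (here: random states)
    return [([random.randint(0, 1) for _ in range(nv)],
             [random.randint(0, 1) for _ in range(nh)])
            for _ in range(num_reads)]

def train_step(weights, data_samples, lr=0.1):
    nv, nh = len(weights), len(weights[0])
    pos = correlations(data_samples, nv, nh)               # positive phase (b)
    neg = correlations(sample_model(nv, nh, 100), nv, nh)  # negative phase (c)
    # Gradient step: lr * (<v_i h_j>_data - <v_i h_j>_model)
    return [[w + lr * (p - n) for w, p, n in zip(wr, pr, nr)]
            for wr, pr, nr in zip(weights, pos, neg)]

weights = [[0.0, 0.0], [0.0, 0.0]]            # (a) initialize 2x2 weights
data = [([1, 0], [1, 1]), ([0, 1], [1, 0])]   # training (v, h) pairs
weights = train_step(weights, data)           # (d) update; repeat as needed
```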

Support Vector Machines

Support vector machines (SVM) find a hyperplane separating data into classes with maximized margin to each class; structured support vector machines (SSVM) assume structure in the output labels; for example, a beach in a picture increases the chance the picture is of a sunset.

Boosting

In machine learning, *boosting* methods are used to combine a set of simple, “weak” predictors in such a way as to produce a more powerful, “strong” predictor.

### Code Examples¶

- Qboost is an example of formulating boosting as an optimization problem for solution on a QPU.

### Papers¶

General machine learning and sampling:

- [Bia2010] discusses using quantum annealing for machine learning applications in two modes of operation: zero-temperature for optimization and finite-temperature for sampling.
- [Ben2017] discusses sampling on the D-Wave system.
- [Vah2017] discusses label noise in neural networks.

RBMs:

- [Ada2015] describes implementing an RBM on the D-Wave system to generate samples for estimating model expectations of deep neural networks.
- [Dum2013] discusses implementing an RBM using physical computation.
- [Hin2012] is a tutorial on training RBMs.
- [Kor2016] benchmarks quantum hardware on Boltzmann machines.
- [Mac2018] discusses mutual information and renormalization group using RBMs.
- [Rol2016] describes discrete variational autoencoders.
- [Sal2007] describes RBMs used to model tabular data, such as users’ ratings of movies.
- [Vin2019] describes using D-Wave quantum annealers as Boltzmann samplers to perform quantum-assisted, end-to-end training of QVAE.

Energy-Based Models:

- [Lec2006] describes EBMs.

Support Vector Machines:

- [Boy2007] gives a concise introduction to subgradient methods.
- [Wil2019] gives a method to train SVMs on a D-Wave 2000Q, and applies it to data from biology experiments.

Boosting:

- [Nev2012] describes the Qboost formulation.

## Protein Folding¶

Protein folding refers to the way protein chains structure themselves in the context of providing some biological function. Although their constituent amino acids enable multiple configurations, proteins rarely misfold (misfolded proteins are a cause of disease) because the standard configuration has lower energy and so is more stable.

## Scheduling¶

The well-known job-shop scheduling
problem is to maximize priority or minimize schedule
length (known as a *makespan*, the time interval between starting the first
job and finishing the last) of multiple jobs done on several machines, where a
job is an ordered sequence of tasks performed on particular
machines, with constraints that a machine executes one task at a time and
must complete started tasks.
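A miniature time-indexed version of this formulation, in the spirit of [Ven2015] but with made-up sizes and plain-Python brute force in place of a sampler, uses binary variables \(x_{j,t} = 1\) if job \(j\) starts at time \(t\), with penalties for multiple starts and machine conflicts (a full formulation would also add makespan terms):

```python
from itertools import product

durations = [2, 1]   # two jobs sharing one machine
horizon = 3          # candidate start times 0..2

def energy(x):
    # x[j][t] = 1 if job j starts at time t
    e = 0
    for j in range(2):
        e += 3 * (sum(x[j]) - 1) ** 2      # each job starts exactly once
    for t1 in range(horizon):               # machine runs one task at a time
        for t2 in range(horizon):
            if x[0][t1] and x[1][t2]:
                overlap = not (t1 + durations[0] <= t2 or
                               t2 + durations[1] <= t1)
                e += 3 * overlap
    return e

best = min(product([0, 1], repeat=2 * horizon),
           key=lambda b: energy([b[:horizon], b[horizon:]]))
x = [best[:horizon], best[horizon:]]        # zero energy: a valid schedule
```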

### Code Examples¶

Ocean documentation example: Vertex Cover

Shows new users how to formulate a constraint satisfaction problem (CSP) using Ocean tools and solve it on a D-Wave QPU solver.

dwave-examples code example: job-shop-scheduling

An implementation of [Ven2015] for D-Wave QPU solvers.

dwave-examples code example: employee-scheduling

A formulation of a discrete quadratic model (DQM) for solution on Leap’s hybrid DQM solver.

dwave-examples code example: nurse-scheduling

An implementation of [Ike2019] that forms a QUBO for solution by Leap’s hybrid BQM solver.

### Papers¶

- [Ike2019] describes an implementation of nurse scheduling.
- [Kur2020] describes an implementation of job-shop scheduling on a D-Wave QPU solver.
- [Ven2015] describes an implementation of job-shop scheduling on the D-Wave system, which includes formulating the problem, translating to QUBO, and applying variable reduction techniques. It also talks about direct embedding of local constraints.

## Traffic Flow¶

One form of the traffic-flow optimization problem is to minimize the travel time of a group of vehicles from their sources to destinations by minimizing congestion on the roads being used.
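A toy version of this route-assignment formulation (the road segments and candidate routes below are invented, and brute force replaces a hybrid solver): each car picks one of two candidate routes, and congestion is the squared number of cars sharing each segment, so the minimum spreads traffic across the network.

```python
from itertools import product

# routes[c][r] = set of road segments used by car c's candidate route r
routes = [[{"s1", "s2"}, {"s3"}],
          [{"s1", "s4"}, {"s3", "s5"}],
          [{"s2"}, {"s5"}]]

def congestion(choice):
    # Sum of (cars per segment)^2 for a tuple of route choices
    load = {}
    for c, r in enumerate(choice):
        for seg in routes[c][r]:
            load[seg] = load.get(seg, 0) + 1
    return sum(n * n for n in load.values())

best = min(product([0, 1], repeat=3), key=congestion)
```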