Tech Gaming Report


Quantum computing for data science: a practical test

Quantum computing has caused a real stir. The technology could open new horizons in data science and machine learning. But how practical is it today? Based on a test of three use cases, the authors examine the specific benefits for data scientists.


Unlike classical computers, quantum computers operate not on bits but on qubits. A bit can take the state 0 or 1, and measuring a set of bits repeatedly always yields the same result. A qubit behaves differently: strange as it sounds, it can in principle take the values 0 and 1 at the same time. If you measure the value of a qubit repeatedly, the values 0 and 1 each occur with a certain probability. In the initial state, the value 0 usually occurs with a probability of one hundred percent. Other probability distributions for a qubit can be generated by superposing states. This is possible thanks to quantum mechanics, which follows laws we do not encounter in this form in everyday life.
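The measurement behavior described above can be sketched with a small NumPy simulation (NumPy stands in here for a real quantum framework; the state vector and the Hadamard gate matrix are the standard textbook objects):

```python
import numpy as np

# A single qubit starts in state |0>: probability 1.0 for the value 0.
state = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts the qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ state

# Measurement probabilities are the squared amplitudes (Born rule);
# after the Hadamard, each value occurs with probability 0.5.
probs = np.abs(state) ** 2
print(probs)

# Repeated measurements therefore yield 0 and 1 roughly equally often.
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=1000, p=probs)
```

In the initial state `[1, 0]` every measurement would return 0; only the superposition produces the 50/50 distribution described in the text.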

The decisive advantage of a quantum computer lies in these probabilities. Classical computers are strong on problems that require a single definite result. Quantum computers, in contrast, handle probabilities naturally and can compute with multiple values at the same time: an operation performed once on a qubit in superposition applies to both the value 0 and the value 1, because the qubit represents both states simultaneously. The more qubits involved in the calculation, the greater the advantage over a classical computer. For example, a computer with three qubits can cover up to eight (corresponding to 2³) possible states at the same time: the binary numbers 000, 001, 010, 011, 100, 101, 110, and 111.


There is broad agreement in the scientific literature that quantum computers will help solve previously intractable problems, including in data science and artificial intelligence. However, no fault-tolerant quantum computers are available yet. The current generation is called Noisy Intermediate-Scale Quantum (NISQ): these machines are limited in their number of qubits and are susceptible to interference, that is, they are subject to noise. By 2021, the first companies, among them IBM and QuEra Computing, had built quantum computers with more than 100 qubits. But what is the practical benefit of this NISQ generation? The following test addresses this question: the authors implement three use cases with the Qiskit and PennyLane frameworks and assess their practical suitability. Compared to alternatives such as Cirq (Google) and Q# (Microsoft), IBM's Qiskit framework offers very good documentation and the advantage that circuits can be run on a real quantum computer free of charge.