Quantum computing: The future of computing?

24/10/2017

We rely increasingly on computers in today's world. We use them to share our information and store our most precious data. In fact, the very idea of living without computers would baffle most of us.

However, if this trend of reliance, in place since computers were first introduced, continues, we will no longer be able to power all the machines around the globe by 2040[1]. Moreover, Moore's law (which states that the number of transistors in an integrated circuit doubles roughly every two years) cannot keep pace if we continue to build traditional computers. To circumvent this, the industry is focused on making computing more energy efficient, yet classical computers are fundamentally limited by the minimum amount of energy it takes them to perform a single operation.
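
As a rough, purely illustrative calculation of the doubling described above (the starting transistor count and time span below are made-up values, not figures from the article), a few lines of Python show how quickly such exponential growth compounds:

start_count = 1_000_000      # transistors in a hypothetical chip in year 0 (arbitrary)
years = 20
doublings = years / 2        # Moore's law as stated above: one doubling every two years
projected = start_count * 2 ** doublings
print(f"After {years} years: {projected:,.0f} transistors")  # After 20 years: 1,024,000,000 transistors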

Therefore, researchers are turning to radically different ways of computing, and this is where quantum computing enters the fray.

What is quantum computing?

To grasp the concept of quantum computing, one needs to accept something that few can fully understand: subatomic particles can exist in more than one state at the same time. Quantum computing takes advantage of this strange ability, allowing certain operations to be performed much faster and with less energy than on classical computers.

Here is how it works: in classical computing, a bit is a single piece of information that can exist in one of two states, on (1) or off (0). Quantum computing instead uses quantum bits, or "qubits". Whereas a classical bit must be in exactly one of its two states at any given time, a qubit can be in 0, in 1, or in any superposition of 0 and 1, giving it an effectively infinite number of possible values.
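
To make the distinction concrete, here is a minimal sketch in plain Python with NumPy (an illustration, not part of the original article): it represents a classical bit as a single 0/1 value and a qubit as a pair of complex amplitudes, one per basis state, whose squared magnitudes sum to one. The equal-superposition amplitudes used here are just an example choice.

import numpy as np

# A classical bit: exactly one of two states at any time.
classical_bit = 0  # or 1

# A qubit: a superposition a|0> + b|1> with |a|^2 + |b|^2 = 1.
# An equal superposition is used here purely as an example.
a = 1 / np.sqrt(2)
b = 1 / np.sqrt(2)
qubit = np.array([a, b], dtype=complex)

# The squared magnitudes give the probabilities of measuring 0 or 1.
probabilities = np.abs(qubit) ** 2
print(probabilities)                          # [0.5 0.5]
print(np.isclose(probabilities.sum(), 1.0))   # True: the state is normalised

Measuring the qubit collapses it to 0 or 1 with these probabilities, which is why a single measurement never reveals the full superposition.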

Thus, a qubit can be visualised using a sphere. Whereas a classical bit can sit only at one of the two poles, a qubit can lie at any point on the sphere. Using qubits, one can therefore store vastly more information while using less energy than with classical bits.

Figure 1: A representation of a classical bit (left) vs a qubit (right)
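
Following the sphere picture in Figure 1, the short sketch below (again plain Python with NumPy, added here as an illustration rather than taken from the article) uses the standard parameterisation of a point on the sphere by a polar angle theta and an azimuthal angle phi and maps it to qubit amplitudes; the two poles recover the classical states 0 and 1.

import numpy as np

def point_on_sphere_to_qubit(theta, phi):
    """Map a point on the sphere (polar angle theta, azimuth phi) to the
    amplitudes cos(theta/2)|0> + e^(i*phi) * sin(theta/2)|1>."""
    a = np.cos(theta / 2)
    b = np.exp(1j * phi) * np.sin(theta / 2)
    return np.array([a, b], dtype=complex)

# The north pole (theta = 0) is the classical state |0>,
# the south pole (theta = pi) is |1>,
# and a point on the equator is an equal superposition of the two.
print(point_on_sphere_to_qubit(0.0, 0.0))        # |0>
print(point_on_sphere_to_qubit(np.pi, 0.0))      # (approximately) |1>
print(point_on_sphere_to_qubit(np.pi / 2, 0.0))  # equal superposition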

 

What has been done so far in quantum computing?

D-Wave Systems unveiled their first quantum computer, a 16-qubit machine, in 2007, although it cannot entangle all of its qubits, nor can the qubits be programmed individually. Nevertheless, major companies such as Google have invested in D-Wave computers, claiming that certain processes can run thousands of times faster than on classical computers[2].

Figure 2: A D-Wave 2X quantum computer. It looks as big as the first vacuum tube computers!

At the same time, moving quantum computing to an industrial scale is proving to be a challenge. The main reason quantum computers are so hard to manufacture is that scientists have yet to find a simple way to control complex systems of qubits.

IBM claims that quantum computers with around 50 qubits will be built within the next few years[3], although they will not yet be commercially available.

Applications of quantum computing

Quantum computing is best suited to tackling complex optimisation problems that would take many years to solve on classical computers, in fields such as machine learning, pattern recognition and anomaly detection, cyber security, image analysis, financial analysis, bioinformatics and cancer research. A toy example of such an optimisation problem is sketched below.
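
As a hedged illustration of what an optimisation problem looks like in this setting, the following sketch brute-forces a tiny QUBO (quadratic unconstrained binary optimisation) instance in plain Python with NumPy. QUBO is the general form of problem that quantum annealers such as D-Wave's machines are designed to tackle, but the matrix values below are arbitrary illustrative numbers, and a real quantum device would explore the search space very differently.

import itertools
import numpy as np

# A tiny QUBO instance: minimise x^T Q x over binary vectors x.
# The matrix entries are arbitrary illustrative values.
Q = np.array([
    [-1.0,  2.0,  0.0],
    [ 0.0, -1.0,  2.0],
    [ 0.0,  0.0, -1.0],
])

def qubo_energy(x, Q):
    """Energy of a candidate bit-string x under the QUBO matrix Q."""
    x = np.array(x)
    return float(x @ Q @ x)

# Brute force is feasible only for a handful of bits, which is exactly
# why large instances motivate approaches such as quantum annealing.
best = min(itertools.product([0, 1], repeat=3), key=lambda x: qubo_energy(x, Q))
print(best, qubo_energy(best, Q))  # (1, 0, 1) -2.0

The brute-force search grows exponentially with the number of bits, which is why problems of realistic size are so attractive as targets for quantum hardware.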

Conclusion

It is predicted[4] that, when quantum computers become mainstream, the time it takes to discover life-saving drugs will be reduced to a fraction of what it is today. Quantum computing could also unlock new facets of artificial intelligence by vastly accelerating machine learning, and it could help make cloud computing systems far more resilient to cyber-attack. With such rewarding prospects ahead, one would hope that commercially viable quantum computing becomes a reality sooner rather than later.

 

Vincent Farrugia