Quantum Computers

What You Need To Know

“Those who are not shocked when they first come across quantum theory cannot possibly have understood it.”

— Niels Bohr

Quantum computing is the new buzzword in the technology market. With Google claiming to have performed, in a matter of seconds, a computation that would take a classical supercomputer thousands of years, and IBM still trying to prove that claim false, most of us are stuck somewhere between not understanding what a quantum computer is and wondering why everybody is suddenly talking about it. If you want to learn more about them, read on….

So how does a computer understand the language of humans? Programming languages like Java and C++ were developed to communicate with computers. The computer translates a program written in such a language into its own language of 0’s and 1’s, and the output it produces is also in the form of 0’s and 1’s. If you are familiar with the concept of time complexity, you will know that when a program must examine a huge number of paths from the input to the output, even a simple program can take thousands or millions of years to run, because the machine is deterministic: every bit is a discrete 0 or 1, so the candidate paths have to be tried one at a time. Such a computer is called a classical computer. It is based on the Turing machine model, and it is the same kind of machine you are probably reading this article on.
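To make the “millions of years” point concrete, here is a minimal Python sketch of a brute-force search. The brute_force helper and the “secret” input are illustrative assumptions, not a real workload; the point is that a deterministic program must, in the worst case, examine all 2^n candidates one by one.

```python
# A toy brute-force search: find the one n-bit input that passes a check.
# A classical, deterministic program tries the candidates one at a time,
# so the worst case grows as 2**n.
from itertools import product

def brute_force(n, secret):
    """Try every n-bit string until the secret one is found."""
    for i, candidate in enumerate(product([0, 1], repeat=n), start=1):
        if candidate == secret:
            return i  # number of candidates examined

n = 20
secret = tuple(1 for _ in range(n))   # worst case: the last string tried
print(brute_force(n, secret), "of", 2 ** n, "candidates examined")
```

Each extra bit doubles the search space, which is why this kind of brute force becomes hopeless at scale on a classical machine.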

Quantum computers defy this one-path-at-a-time principle; they are based on superposition [more on this later], after all. A quantum bit, or qubit, is not limited to being a 0 or a 1: it can exist in a superposition of both at once, with a probability attached to each outcome. Instead of walking every unique path from the input to the output, a quantum computer can superimpose the various paths and let them interfere, so a computation-intensive task that would overwhelm a classical computer can become almost trivial.
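A toy simulation can make “a probability attached to each outcome” concrete. The NumPy sketch below is an illustration of the math, not how real quantum hardware is programmed: it prepares a qubit in an equal superposition using a Hadamard gate, then samples measurements. Each individual measurement still yields a discrete 0 or 1; the superposition lives in the amplitudes.

```python
# Minimal single-qubit simulation with NumPy (a toy sketch, not real hardware).
import numpy as np

# A qubit state is a vector of two complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. |alpha|^2 is the probability of measuring 0,
# |beta|^2 the probability of measuring 1.
zero = np.array([1, 0], dtype=complex)            # the classical bit 0

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ zero                                  # now (1/sqrt 2, 1/sqrt 2)
probs = np.abs(state) ** 2                        # Born rule: [0.5, 0.5]
print("P(0), P(1) =", probs)

# Measurement collapses the superposition: each shot gives a discrete 0 or 1.
shots = np.random.choice([0, 1], size=1000, p=probs)
print("measured 0 in", np.sum(shots == 0), "of 1000 shots")
```

With n qubits the state holds 2^n amplitudes at once, which is where the advantage over enumerating paths one by one comes from.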

Computers have been around for the better part of the 20th century, yet quantum computing was first theorized only about 30 years ago, by a physicist at the Argonne National Laboratory, which makes quantum computers all the more interesting to study. The relation between computing and quantum physics has long been established, although very little attention was paid to this area until recently, when Google announced the successful development and testing of its first quantum computer.

If, as Moore’s Law states, the number of transistors on a microprocessor continues to double every 18 months, then somewhere between 2020 and 2030 the circuits on a microprocessor will be measured on an atomic scale. But scaling microprocessors down to that extent will not always be possible, because of heat constraints among other things. It may well be that silicon chip processors will eventually be replaced by quantum computers. Blockchain consensus algorithms like Proof of Work reportedly consume as much energy as would power the entire country of Switzerland, so we may need to fall back on alternatives like quantum computers, because changing the consensus algorithm is not always possible. Error correction, made harder by the fragility of superposition, is still a facet that needs to be looked after, but who knows: quantum computing may well be our next need of the hour.
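For a rough sense of that timeline, here is a back-of-the-envelope projection of Moore’s Law. The starting figures (a 32 nm process and about two billion transistors around 2010) are illustrative assumptions, not measured data; the point is only how quickly repeated doubling reaches atomic dimensions.

```python
# Back-of-the-envelope Moore's Law projection. The 2010 starting figures
# are illustrative assumptions, not measured data.
transistors = 2e9       # assume ~2 billion transistors on a 2010-era chip
feature_nm = 32.0       # assume a 32 nm process node in 2010
year = 2010.0
while feature_nm > 0.2:        # ~0.2 nm is roughly the size of a silicon atom
    year += 1.5                # one doubling every 18 months
    transistors *= 2
    feature_nm /= 2 ** 0.5     # doubling density shrinks features by sqrt(2)
print(f"~{transistors:.0e} transistors, atomic scale around {year:.0f}")
```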

This blog post was only meant to lightly introduce you to quantum computers and why they are being talked about, from a layperson’s perspective. I will cover more in-depth details in my future blog posts.

Published by arunikayadav

I am a full-stack developer in my senior year, pursuing my bachelor’s at the Indian Institute of Technology Patna. I am interested in AI, Machine Learning, Blockchain, and Quantum Computing; anything related to tech piques my interest.
