The first programmable computer was an electro-mechanical hybrid of a machine, put together in his parents’ living room in the late 1930s by the German wunderkind Konrad Zuse. By the first half of the 1940s a fully digital machine, the Atanasoff-Berry Computer, had been built at Iowa State College (now Iowa State University) by the physicist John Vincent Atanasoff and his graduate student Clifford Berry, after whom it is named. It then took several more decades before computers became efficient and cheap enough to be the norm, first in workplaces and then in homes.
In the 25 years since the economy and much of the entertainment sector went digital, the pace of development has accelerated rapidly. The latest computer hardware and software can achieve some quite remarkable feats. Even the smartphones we carry in our pockets contain more powerful processors than the laptops and PCs of ten years ago.
We’ve undoubtedly come a long, long way since the computers of Zuse, Atanasoff and Berry. We’re on the verge of AI algorithms taking control of driverless cars and creating virtual reality worlds in which, some forecast, people will earn their primary income within a decade or so. But despite the wonders modern software developers and hardware engineers are achieving at a seemingly ever-accelerating rate, today’s most powerful computers and programs retain one key similarity to Zuse’s first machine: they are all still programmed using exactly the same basic building blocks of binary code.
Software code is, and always has been, represented by strings of ‘1’s and ‘0’s. If you know where to look, the code that CERN scientists in Switzerland use to crunch the data produced by the Large Hadron Collider, the photos on your smartphone, the PowerPoint presentation you made at work last week and your favourite computer game are all just endless streams of ‘0’s and ‘1’s. Each digit is held by a tiny circuit that can be switched off or on, representing ‘0’ or ‘1’ respectively. These units of information are called ‘bits’.
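As a small illustrative sketch (the helper function here is our own, just for demonstration), any piece of data a computer stores can be viewed as exactly this kind of stream of ‘0’s and ‘1’s:

```python
# Sketch: render the bytes of a short piece of text as the bits
# that actually sit in the machine's circuits.

def to_bits(data: bytes) -> str:
    """Show each byte as its 8-digit binary representation."""
    return " ".join(f"{byte:08b}" for byte in data)

message = "Hi"
print(to_bits(message.encode("ascii")))  # → 01001000 01101001
```

The two characters of ‘Hi’ become sixteen bits; a photo or a game is the same idea, just billions of bits long.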
This binary system of bits has made modern technology possible, and for most of the tasks we use computing for it is more than enough. For the most complex tasks we now want to compute, however, it is beginning to prove a bottleneck. Some problems, such as modelling traffic flows, weather systems or the human genome, become exponentially harder with each variable added. Ever faster classical processors allow algorithms to sift very quickly through the possible combinations of variables on the way to an answer. But in cases like these, the sheer number of combinations means a binary computer would take far too long to arrive at an answer to be a practical solution.
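A rough sketch of the combinatorial explosion described above (the numbers, not any real traffic or genome model): with n yes/no variables there are 2^n possible assignments for a brute-force search to check, so every variable added doubles the work.

```python
# Each binary variable added doubles the number of possible
# assignments an exhaustive search has to consider.

def assignments_to_check(n_variables: int) -> int:
    return 2 ** n_variables

for n in (10, 20, 40, 80):
    print(f"{n} variables -> {assignments_to_check(n):,} assignments")

# Even at a billion checks per second, 80 variables would take
# roughly 38 million years to search exhaustively.
```

This is why faster classical chips alone cannot keep up: the problem grows exponentially while hardware improves only by constant factors.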
It is hoped that quantum processors will remove that bottleneck.
So What is Quantum Computing?
Quantum mechanics is the fundamental theory of physics that explains nature at the smallest possible scale – the behaviour of atoms and sub-atomic particles. Quantum computing is made possible by scientists’ ability to use particles such as electrons as quantum bits, or qubits. Qubits are distinct from classical bits not only in their microscopic size but in their very nature. A qubit does not have to be definitely ‘1’ or definitely ‘0’; it can exist in a ‘superposition’ of the two states to varying degrees. It’s something that even the greatest minds in theoretical physics grapple with, but in the simplest possible terms, by maintaining qubits at extremely cold temperatures and manipulating them with electromagnetic fields, scientists can make a qubit represent both ‘1’ and ‘0’ at the same time.
Rather than processing one string of ‘1’s and ‘0’s at a time, qubits in superposition can be linked together – ‘entangled’ – so that every qubit added doubles the number of states the machine can represent at once: two qubits span four states, three span eight, and n qubits span 2^n. This means, in theory, a quantum computer can work across an enormous set of possibilities simultaneously and at spectacular speed, exponentially increasing its processing power over a comparable classical system.
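That doubling can be made concrete with a toy state-vector sketch. This is not a real quantum simulator – just the bookkeeping: a register of n qubits is described by 2^n complex amplitudes, one per classical bit string, and an equal superposition spreads amplitude over all of them at once.

```python
# Toy sketch (not a real quantum simulator): n qubits are described
# by 2**n complex amplitudes. An equal superposition gives every
# basis state the same amplitude, so all 2**n states coexist.

from math import sqrt

def equal_superposition(n_qubits: int) -> list[complex]:
    dim = 2 ** n_qubits
    amp = 1 / sqrt(dim)              # equal amplitude per basis state
    return [complex(amp, 0)] * dim

state = equal_superposition(3)
print(len(state))                    # 8 basis states from just 3 qubits
print(abs(state[0]) ** 2)            # each measured with probability ~1/8
```

Add a tenth qubit and the list grows to 1,024 amplitudes; classical memory for the full description doubles with every qubit, which is exactly the resource a quantum machine gets ‘for free’.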
At What Stage is Quantum Computing Technology?
So that’s the theory, but how far along is quantum computing technology in practice? Computer processors that have been proven to exploit quantum phenomena do already exist. Tech giants such as IBM, Microsoft and Alphabet are all working intensely on quantum computing R&D, and one company, D-Wave Systems, is manufacturing commercially available quantum computing hardware.
However, the technology is still at an early stage, is far from perfect and can’t, as yet, do anything that classical binary processors cannot. Nonetheless, it has already been demonstrated that these early quantum processors can solve problems created specifically to suit their current structure and limitations far more quickly than classical processors can. Microsoft’s research lab predicts working quantum computers will be widely available within a decade. Many other experts broadly agree, give or take a decade or so. Others are sceptical and think that quantum computing faces obstacles that mean it will never be genuinely practical.
The biggest problem researchers working on quantum processors face is keeping enough electrons in a qubit state for long enough. It takes a huge amount of power to maintain the temperature and electromagnetic control necessary for a qubit to exist for even a fraction of a second. When an electron loses its qubit state, the information held in it is also lost unless it has been passed on in time. This means the same information has to be held on multiple qubits at once and moved through the network quickly. It’s a bit like a movie scene where the hero or heroine is running across a crumbling bridge and has to reach the other side of the ravine before it falls away beneath their feet.
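A loose classical analogy for the redundancy described above (real quantum error correction is far subtler, since quantum states cannot simply be copied): hold the same value on several unreliable carriers and recover it by majority vote before too many fail.

```python
# Classical analogy only: keep the same bit on several carriers and
# recover it by majority vote when one of them fails. Real quantum
# error correction must work without copying, due to the no-cloning
# theorem, but the goal is the same: outrun the loss of information.

from collections import Counter

def majority_vote(copies: list[int]) -> int:
    """Recover the stored bit from possibly corrupted copies."""
    return Counter(copies).most_common(1)[0][0]

# One copy flips (its carrier 'decoheres'), but the bit survives:
print(majority_vote([1, 1, 0]))  # → 1
```

The race the article describes is exactly this: refresh and forward the information faster than the individual qubits give out.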
The challenge is in creating processors that can maintain enough qubits in a connected state for long enough. D-Wave’s most recent working prototype, the 2000Q system, is said to have made a ‘critical breakthrough’ in this respect.
One has been bought and installed in the Quantum AI Lab run jointly by Alphabet, NASA and the Universities Space Research Association. IBM also believes its research lab will succeed in connecting 50 qubits in a processor before the end of the year. The company has already extended the time its qubits survive to 100 microseconds, and expects to multiply that tenfold, to a millisecond, within five years.
Within the next few years, quantum processors are expected to reach a ‘good enough’ approximation of the technology. ‘Good enough’ will mean a system that is based on quantum phenomena and, at least in certain applications, achieves processing speeds that classical binary processors cannot match.
The Future for Quantum Computing
The ‘full’ quantum revolution is thought to be many years away still, but plenty of encouraging breakthroughs are taking place. Key to increasing the stability of qubits may be recent achievements by a team of quantum scientists at Harvard University led by Professor Kang-Kuen Ni.
The team have, for the first time, succeeded in creating a ‘bespoke molecule’. Optical tweezers were used to pick up single sodium and caesium atoms and combine them into a single molecule – the first molecule assembled atom by atom in this way. The molecule also just happens to have an electrical structure well suited to being maintained in a qubit state.
We don’t yet really know how quickly developments in quantum computing may come about. It may be that, as in many fields of science, the modern era will see us make leaps in years that would previously have taken multiple decades.
As an almost completely new science, quantum computing may equally need the decades of development that classical binary computing went through before progress started to speed up. We may also hit a bottleneck that means quantum computing turns out to be a dead end.
If we do get there, though, and there are plenty of positive indicators that we may, quantum computing could be the most significant step yet towards humanity stripping back the Veil of Maya and seeing the secrets of the universe revealed.