Quantum Computing in 2013

The introduction of classical computing took the language of classical physics (electricity and magnetism) and gave rise to a new community of people called computer scientists. Like most technologies, classical computers such as ENIAC (Electronic Numerical Integrator and Computer) began under the purview of engineers and progressed to a shared-services setting, where businesses could purchase time on the computer. With the assistance of a common simplified language and operating environments, traditional computing moved from the scientific/government domain to use by large enterprises, in anticipation of what could be considered general availability for both content (data and program) creators and content consumers.

The starting point of this simplified language for classical computing was the definition of the bit, the smallest unit of information representation. The bit was a language of abstraction, a representation of electrical and/or magnetic physical properties: zero when the voltage was off and one when voltage was applied. Bits are used to represent both data and instructions. To build instructions, voltages were combined using circuits called gates (such as AND, OR and NAND; NAND together with COPY, or fan-out, makes up a universal set for classical logic). These were physical representations, i.e., combinations of voltages, of logic operations that combine bits in different ways.
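The gate behavior described above can be sketched in a few lines of Python. The sketch below treats bits as the integers 0 and 1 and builds AND, OR and NOT purely out of NAND, illustrating why NAND is called universal; the function names are illustrative, not from any particular library.

```python
# Classical logic gates modeled as functions on bits (0 or 1).
# NAND alone is universal: NOT, AND and OR below are built from it.

def NAND(a, b):
    return 0 if (a and b) else 1

def NOT(a):
    return NAND(a, a)          # NAND of a bit with itself inverts it

def AND(a, b):
    return NOT(NAND(a, b))     # AND is an inverted NAND

def OR(a, b):
    return NAND(NOT(a), NOT(b))  # De Morgan: a OR b = NOT(NOT a AND NOT b)

# Truth table for AND built purely from NAND gates
for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b))
```

Running the loop reproduces the familiar AND truth table, confirming that the single NAND primitive suffices to express the other gates.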

As programming advanced through this evolutionary sequence, not only were objects on lower foundation layers abstracted away, but new languages of representation were produced. Nowadays it is safe to assume that a Java programmer writing object-oriented code does not concern himself with how the bits are flipped.1


When I interviewed Dr. Vinton “Vint” Cerf, I asked him, “What are your views on quantum computing in today’s world in comparison to classical computers?”


He stated, “Quantum computing (see also D-Wave web site) has the promise of getting answers much faster FOR CERTAIN KINDS OF PROBLEMS than conventional computing. It is not a general purpose method, however, and is extremely sensitive to maintaining entanglement coherence for long enough for the computation to be performed. It appears to have application for factoring and for optimization (e.g. traveling salesman problem). Computing is becoming a key element of everyday life, especially in conjunction with mobiles – together they harness the power of the Internet, World Wide Web and cloud computing from virtually anywhere on the globe. I am very excited about the ‘internet of things’ and also about computers that hear and see and can be part of the traditional human dialog. I like the idea of being able to have a conversation with a search engine or a discussion with a control system. Of course, Google Glass and Google self-driving cars are capturing attention where ever one goes. I am also quite excited about the extension of the Internet to interplanetary operation, as you may discover if you google ‘interplanetary internet’.”

The quantum computer is a computer that harnesses the behavior of atoms and molecules to perform memory and processing tasks. It has the potential to perform certain calculations billions of times faster than any silicon-based computer. The field of quantum computing was first introduced in 1980 and 1981.

The classical desktop computer functions by manipulating bits, binary digits that can signify either a zero or a one. Everything from numbers and letters to the status of the modem or mouse is expressed as a collection of bits in combinations of ones and zeros. These bits correspond well with the way classical physics describes the world. Quantum computers are not restricted by this binary nature of the classical physical world. Instead, they rely on inspecting the state of quantum bits, or qubits, which might represent a one or a zero, might appear as a combination of the two, or might carry an amplitude indicating that the state of the qubit lies somewhere between 1 and 0.

In the classical model of a computer, the most essential building block, the bit, can exist in only one of two distinct states, a ‘0’ or a ‘1’. In a quantum computer the rules change. Not only can the qubit occupy the classical ‘0’ and ‘1’ states, it can also be in a superposition of both. In this coherent state, the qubit exists as both a ‘0’ and a ‘1’ at the same time. Consider a register of three classical bits: it can represent any one of the numbers from 0 to 7 at any one time. A register of three qubits, by contrast, with each qubit in the superposition or coherent state, can represent all the numbers from 0 to 7 simultaneously.
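The three-qubit register can be made concrete with a short sketch. A three-qubit state is described by eight complex amplitudes, one per basis state |000⟩ through |111⟩; the uniform superposition below is a minimal illustrative example, with each of the numbers 0 to 7 observed with probability 1/8 on measurement.

```python
import math

# A register of three qubits is described by 2**3 = 8 complex amplitudes,
# one per basis state |000> through |111>. Putting every qubit into an
# equal superposition yields a state that represents all eight numbers
# 0..7 at once, each observed with probability 1/8 on measurement.
state = [1 / math.sqrt(8)] * 8

# Measurement probabilities are the squared magnitudes of the amplitudes.
probabilities = [abs(a) ** 2 for a in state]
print(probabilities)          # eight entries, each approximately 0.125
print(sum(probabilities))     # amplitudes are normalized: total is 1
```

A classical three-bit register would instead hold exactly one of these eight values at a time.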

A processor that can use registers of qubits will in effect be able to perform calculations using all the possible values of the input registers simultaneously. This phenomenon is known as quantum parallelism, and it is the driving force behind the research currently being carried out in quantum computing.

Quantum computers also differ in how they encode a bit, the fundamental unit of information. A number, 0 or 1, specifies the state of a bit in a classical digital computer. An n-bit binary word in a conventional computer is therefore described by a string of n zeros and ones. A qubit may be represented by an atom in one of two distinct states, which can likewise be denoted 0 or 1. Two qubits, like two classical bits, can attain four different well-defined states (0 and 0, 0 and 1, 1 and 0, or 1 and 1).

In contrast to classical bits, however, qubits can exist simultaneously as 0 and 1, with the probability of each state given by a numerical coefficient. Describing a two-qubit quantum computer therefore requires four coefficients. In general, n qubits demand 2^n numbers, which quickly becomes a sizeable set for larger values of n. For example, if n equals 50, about 10^15 numbers (2^50) are needed to describe all the probabilities for the possible states of the quantum machine, a number that exceeds the capacity of the largest conventional computers. A quantum computer promises to be enormously powerful because it can be in a superposition of, and can act on, all its possible states simultaneously. As a result, this sort of computer could naturally perform myriad tasks in parallel, using only a single processing unit.
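The exponential growth described above is easy to tabulate. The sketch below counts the 2^n amplitudes an n-qubit state requires and the memory a dense classical representation would need, assuming 16 bytes per complex number (a common double-precision complex size).

```python
# An n-qubit state needs 2**n complex amplitudes. Assuming 16 bytes per
# complex number, a dense classical representation grows exponentially.
for n in (3, 10, 50):
    amplitudes = 2 ** n
    print(f"n={n}: {amplitudes} amplitudes, {amplitudes * 16} bytes")

# 2**50 is 1,125,899,906,842,624 (about 1.1e15) amplitudes, roughly
# 18 petabytes: far beyond what a conventional machine can hold.
```

This is why simulating even a modest number of qubits classically becomes infeasible, while the quantum machine carries those amplitudes physically.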

Quantum computing is the art of using all of the possibilities that the laws of quantum mechanics offer to solve computational problems. Conventional, or “classical”, computers use only a small subset of these possibilities. In essence, they calculate in the same way that people compute by hand. There are numerous results about the remarkable things humanity would be able to do with a sufficiently large quantum computer. The most significant of these is that we would be able to perform simulations of quantum mechanical processes in chemistry, biology and physics that will never come within the range of classical computers.3


This figure shows the Bloch sphere, a depiction of a qubit, the fundamental building block of quantum computers.

Both practical and theoretical research continues, and a number of national government and military funding agencies support quantum computing research to develop quantum computers for both civilian and national-security purposes, for example cryptanalysis.

There exist a number of quantum computing models, distinguished by the basic elements into which the computation is decomposed. The four main models of practical significance are:

  1. One-way quantum computer (computation decomposed into a sequence of one-qubit measurements applied to a highly entangled initial state, or cluster state)
  2. Quantum gate array (computation decomposed into a sequence of few-qubit quantum gates)
  3. Adiabatic quantum computer, or computer based on quantum annealing (computation decomposed into a slow continuous transformation of an initial Hamiltonian into a final Hamiltonian whose ground state contains the solution)
  4. Topological quantum computer (computation decomposed into the braiding of anyons in a 2D lattice)

The quantum Turing machine is theoretically important, but direct implementation of this model is not pursued. The four models of computation above have been shown to be equivalent to one another, in the sense that each can simulate the others with no more than polynomial overhead.

Recently there has been a great deal of controversy about the world’s only commercial quantum computer. The concern with this machine is the difficulty of determining whether it is truly a quantum device or just a conventional computer. The Canadian company D-Wave created the device, and evidence has been presented that it operates at a quantum level.

Unlike a common computer, this kind of machine, called an “annealer”, cannot answer any question tossed at it. Instead, it can only solve ‘discrete optimization’ problems. This is the sort of problem in which a set of criteria all compete to be satisfied at the same time, and there is one best solution that meets the most of them. One example is the simulation of protein folding, in which the structure seeks a state of minimal free energy. The hope is that a quantum annealer can solve such problems much faster than a classical machine.

Professor Scott Aaronson, a theoretical computer scientist at MIT, has historically been skeptical of D-Wave’s assertions. He has said that he is fairly persuaded by the data but that plenty of important questions remain, including whether current or future versions of the D-Wave computer will truly be any faster than classical machines.

An Australian team led by researchers at the University of New South Wales has accomplished a breakthrough in quantum science that brings the prospect of a network of ultra-powerful quantum computers, joined via a quantum internet, closer to reality. The team is the first to have detected the spin, or quantum state, of a single atom using a combined optical and electrical approach. The study is a collaboration between investigators from the ARC Centre of Excellence for Quantum Computation and Communication Technology based at UNSW, the Australian National University and the University of Melbourne.

UNSW’s Professor Sven Rogge said the technical feat was achieved with a single atom of erbium, a rare-earth element commonly used in communications, embedded in silicon. “We have the best of both worlds with our combination of an electrical and optical system. This is a revolutionary new technique, and people had doubts it was possible. It is the first step towards a global quantum internet,” Professor Rogge said.

Quantum computers promise an exponential increase in processing power over conventional computers by using a single electron or atomic nucleus as the basic processing unit, the qubit. By carrying out multiple calculations simultaneously, quantum computers are projected to have applications in economic modeling, fast database searches, and the modeling of quantum materials, biological molecules and drugs, in addition to the encryption and decryption of information.


In quantum computing, information is stored in quantum bits, or qubits. A qubit can be in the states labeled |0⟩ and |1⟩, but it can also be in a superposition of these states, a|0⟩ + b|1⟩, where a and b are complex numbers. If the state of a qubit is viewed as a vector, then superposition of states is just vector addition. Every extra qubit doubles the number of coefficients stored: with 3 qubits, for example, there are coefficients for |000⟩, |001⟩, |010⟩, |011⟩, |100⟩, |101⟩, |110⟩ and |111⟩. Calculations are performed by unitary transformations on the state of the qubits. Combined with the principle of superposition, this creates possibilities that are not available in hand calculations (as in the QNOT), which translates into more efficient algorithms for factoring, searching and the simulation of quantum mechanical systems, among others. Consider the QNOT: the classical NOT gate flips its input bit, so NOT(1) = 0 and NOT(0) = 1. The quantum analogue, the QNOT, also does this, but it flips all states in a superposition at the same time. So if we start with 3 qubits in the state |000⟩ + |001⟩ + 2|010⟩ - |011⟩ - |100⟩ + 3i|101⟩ + 7|110⟩ and apply QNOT to the first qubit, we get |100⟩ + |101⟩ + 2|110⟩ - |111⟩ - |000⟩ + 3i|001⟩ + 7|010⟩. The quantum computer also differs through entanglement and quantum teleportation.
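The QNOT example above can be checked with a plain state-vector sketch. The eight amplitudes of a 3-qubit state are stored in a list indexed by the binary value of the basis label, assuming (as the example does) that the first, leftmost qubit is the most significant bit; flipping that qubit then amounts to toggling the top bit of every index.

```python
# The QNOT (Pauli-X on one qubit) example as a plain state-vector sketch.
# Amplitudes are indexed by the integer value of |abc>, with the first
# (leftmost) qubit as the most significant bit.
psi = [0j] * 8
psi[0b000] = 1
psi[0b001] = 1
psi[0b010] = 2
psi[0b011] = -1
psi[0b100] = -1
psi[0b101] = 3j
psi[0b110] = 7

# Applying QNOT to the first qubit flips the most significant bit of
# every basis index, moving each amplitude from |abc> to |(1-a)bc>.
flipped = [0j] * 8
for i in range(8):
    flipped[i ^ 0b100] = psi[i]

print(flipped)
```

The result matches the state given in the text: the amplitude 1 on |000⟩ moves to |100⟩, the 3i on |101⟩ moves to |001⟩, and so on, all in one operation.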

The quantum property of entanglement has a fascinating history. Einstein, who claimed that “God does not play dice with the universe”, used the property of entanglement in 1935 in an attempt to show that quantum theory was incomplete. Albert Einstein, Boris Podolsky and Nathan Rosen observed that the state vectors of certain quantum systems were correlated, or “entangled”, with each other. If one modifies the state vector of one system, the corresponding state vector of the other system changes instantaneously, independently of any medium through which a communicating signal would have to travel. Since nothing can move faster than the speed of light, how could one system arbitrarily far away affect the other? Einstein termed this “spooky action at a distance”, and it demanded a view of reality contrary to the science of the day. He favored the notion that some unknown or “hidden variables” were determining the results, and since they were not known, quantum theory must be incomplete.

In 1964, John Bell showed that no local hidden-variable theory could reproduce all the predictions of quantum mechanics, which implied that spooky action at a distance could be real. Later, in 1982, Alain Aspect performed experiments demonstrating that Bell’s theorem, as it came to be known, had experimental validity. Either faster-than-light communication was occurring or some other mechanism was at work. This basic result marks the difference between traditional ideas of reality and quantum ideas of reality.

Throughout all of history before, physical phenomena were understood to depend on some force and some particle to carry that force, so the speed-of-light restriction applied: electromagnetic forces are carried by the photon, gravitational forces are hypothesized to be carried by the graviton, and so on. With entanglement, however, quantum systems are connected in some manner that does not involve a force, and the speed-of-light restriction does not apply. The actual mechanism by which one system affects the other is still unknown.


1. Collapse of the State Vector

When two quantum systems are generated in a way that conserves some property, their state vectors are correlated, or entangled. For example, when two photons are created with their total spin conserved, one photon must have spin +1 and the other spin -1. On measuring one photon, its state vector collapses into a definite state. Instantaneously and automatically, the state vector of the other photon collapses into the other identifiable state. When one photon’s spin is measured and found to be +1, the other photon’s spin of -1 immediately becomes known as well. There are no forces involved and no known description of the mechanism.
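The perfect anticorrelation described above can be illustrated with a classical toy model. This sketch only reproduces the fact that the two measured spins always sum to zero; it is not a real quantum simulation and cannot reproduce the Bell-inequality violations that distinguish entanglement from shared hidden variables.

```python
import random

# A classical toy model of the correlation described above: two photons
# created with total spin conserved at zero. Measuring one outcome
# immediately fixes the other to its opposite.
def measure_entangled_pair():
    first = random.choice([+1, -1])   # outcome of measuring photon 1
    second = -first                   # photon 2 collapses to the opposite
    return first, second

outcomes = [measure_entangled_pair() for _ in range(1000)]
print(all(a + b == 0 for a, b in outcomes))   # spins always sum to zero
```

In the genuine quantum case the correlation also holds for measurements along different axes, which is precisely where hidden-variable models like this one break down.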

2. Quantum Teleportation

The principle of entanglement enables a phenomenon termed “quantum teleportation”. This type of teleportation does not involve moving an object from one physical position to another, as shown in popular science fiction stories, but rather the disassembly of the original and the recreation of an exact duplicate at another location.

3. Brassard’s Theoretical Circuit

In 1996, Gilles Brassard devised a quantum circuit that could build and entangle two pairs of qubits, with one qubit entangled with two others. In outline, “Alice’s” circuit entangles three bits (M, A, and B) and communicates M to “Bob”. Bob’s circuit, using information from M, produces a replica of bit B. The immediate effect on B of measuring M is effectively a teleportation of qubit B.

For purposes of discussion, and at the risk of oversimplification, the gates marked L, R, S, and T are referred to as left-rotation, right-rotation, forward-phase-shift, and backward-phase-shift gates, respectively. The XOR gate is drawn as a circled cross. These gates can produce entanglement when qubits are passed through them.

Classical computers, by contrast, differ from quantum computers in that information is stored in bits, which take the discrete values 0 and 1. If storing one number takes 64 bits, then storing N numbers takes N times 64 bits. Calculations are done essentially in the same way as by hand. As a result, the class of problems that can be solved efficiently is the same as the class that can be solved efficiently by hand. Here “efficiently” means that the computation time does not grow too quickly with the size of the input.

Applications that cannot be attempted now become possible with quantum computers. Spin-off concepts, like quantum teleportation, open prospects only imagined before. To conclude, quantum computers are approaching maturity, and they will require a new way of looking at computing.

The New Age of Computing

“In attempting to construct such (artificially intelligent) machines we should not be irreverently usurping His (God’s) power of creating souls, any more than we are in the procreation of children. Rather we are, in either case, instruments of His will providing mansions for the souls that He creates.” ― Alan Turing

When Windows 8 launched last year, a wave of hybrid notebooks designed to take advantage of the touch-optimized operating system began to appear on the computing market. Evidently the trend will continue well into 2013. The new HP Envy x2 is one such product; it was first presented at CES 2013 back in January and made its way around the globe earlier this month.

With its smooth metallic colorway, the HP Envy x2’s physical design suits its place in the company’s premium Envy series. With a hybrid design that lets it work as a tablet as well as a normal notebook when attached to its keyboard, the Envy x2 is equipped with an 11.6-inch IPS touch display with a resolution of 1366×768 and support for up to 5 simultaneous touch points.


When I interviewed Professor Leonard Adleman, I asked him generally “What motivates you?”

He stated, “I am motivated by the beauty of mathematics.”

When I reviewed this product, currently on the market, I thought of it in terms of applied mathematics: it is well suited to mathematicians who work on practical problems. Most people, whether young or elderly, know in one way or another how to calculate things. For example, whenever someone purchases or sells an item of any sort, he or she uses the logical and reasoning parts of the brain to carry out the activity.

In a scenario where a software company wants a feasibility study, that is, a report that aims to detail the factors that will determine the success or failure of a project, the following is significant to note. The various elements that constitute the system requirements must be examined and thoroughly assessed. The HP Envy x2 is a suitable product to use in a situation where a ‘High Tech Restaurant & Bar’ needs a technological upgrade of the establishment. Clearly, in order for any business to operate efficiently on a daily or weekly basis, certain individuals are involved in this process. These include:

  1. The end-users: The prospective users would include the chef, possibly the kitchen staff, waiters, bartenders, managers and the accounting staff.
  2. The managers: The client’s management comprises the manager of the kitchen staff and the serving staff.
  3. Indirect beneficiaries: The client’s customers, both average and business, are indirectly affected by the system.
  4. Maintenance and support people: The software development company and its technical staff will continue to serve and support this system after launch for the client.
  5. Regulators and standards people: This includes a systems auditor.


Furthermore, it is critical to apply elicitation techniques such as:

  1. Interviews: Business data, business practices, business goals, technical information and skill levels of users can best be extracted by the method of interviewing.
  2. Observation: Physical environment, business practices, skill levels of users and interfaces with other systems are all best obtained through this process.
  3. Scenarios or walkthroughs: Technical information and business practices are best attained in this way.
  4. Questionnaires: These are best used when gathering business data even though they are not always 100% correct.
  5. Brainstorming: This is good for understanding what the clients’ preferences are.


If a software development company has decided to use the spiral model of requirements engineering (RE) to dictate the stages of the work, then it would be wise to implement a system in which each table in the ‘High Tech Restaurant & Bar’ has this product for the use of the customers. The sequence of stages includes:

Quadrant 1: Information gathering makes use of the aforementioned elicitation techniques and provides an understanding of what is to be built.

Quadrant 2: Analysis and modeling pulls together the data from elicitation to determine whether additional information is needed.

Quadrant 3: The purpose of the feasibility stage is to determine the likely success of the project. This stage is fundamental.

Quadrant 4: The feasibility document is presented to the stakeholders for validation of the requirements’ specification.

The technological upgrade of such an establishment would require this sequence of stages, as in the case of the HP Envy x2 product.



The output of requirements engineering would be constructed as a contractual agreement, in which the client specifies their needs and the software company states how it will meet them. The systems specification document would comprise the requirements needed by the designer. The evolutionary model should be used to construct a product, as mentioned before, that meets the needs of the client. This feasibility assessment has been conducted to confirm the practicality of the system upgrade.


From my perspective, this is a new endeavor, but based on case studies it can be done. With the upgrading of the network, the installation of hardware and software, and proper training, the solution is viable for the client.

With the results from elicitation, the upgrade of the ‘High Tech Restaurant & Bar’ in its data processing and other services is feasible.

Given that the software company is creating an addition to the current system, the input information required already exists. For this upgrade to be successful, new hardware and software would need to be purchased and changes made to the existing equipment. Despite the intricacies of this project, it can be delivered in accordance with a detailed project plan. Training of users, changes in the hardware and the practice of safety procedures would address possible dependencies. As a result, the system would be used to its fullest potential.