Almost 40 years ago, Gordon Moore observed that the number of transistors that could be packed onto an integrated circuit had doubled every year since the integrated circuit was invented. This observation generally has been extrapolated into the future to predict our seemingly endless ability to create smaller and smaller circuits.
Current technology is creating chipsets with features that are 130 nanometers in size (a human hair is about 60,000 to 100,000 nanometers in diameter). The atoms in a silicon wafer (on which microcircuits are built) are spaced about 0.235 nanometers apart. Obviously, we rapidly are approaching technologies that will be able to create structures at the atomic level.
Which brings us to a very interesting problem: current computer technologies are based upon Newtonian physics, and Newtonian physics doesn't work at the subatomic level. Interactions among subatomic particles are fundamentally different from interactions among particles of molecular size and larger. Now, I suspect miniaturization of conventional computer circuits will be limited by economics, not science. But notice I said conventional computer circuits. If we look at the physics of subatomic particles, we discover there may be other very powerful ways to process data. One such approach is broadly known as quantum computing.
Physics 101
I think I can safely say that nobody understands quantum mechanics.
Richard P. Feynman
That's reassuring, coming from arguably the most brilliant post-Einstein physicist. You can probably assume I do not intend to present an in-depth study of quantum theory in these two pages. Fortunately, we can understand the basic principles of quantum computing with just a little bit of knowledge (which we all know is a dangerous thing). Max Planck started all this in 1900 when he was investigating black-body radiation. Traditional physics teaches that energy transfers among particles are continuous: apply a little more energy to a body, and its motion changes by a correspondingly small amount. What Planck discovered is that at the subatomic level, energy transfers are discrete.
When I was first introduced to atomic structure in school, we were taught electrons orbited the nucleus like the earth around the sun. Newtonian physics teaches that if the velocity of an orbiting body increases because additional energy is applied, the radius of the orbit will increase. Not so with electrons. There is no continuum of energy levels, and thus no continuum of orbits, around the atomic nucleus. Electrons are observed only at discrete levels around a nucleus, and those levels are reached by the transfer of a discrete packet, or quantum, of energy. Planck's discovery states that any energy transfer between two bodies is the sum of elementary but finite transfers of quanta.
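To put a number on the size of one such packet: the energy of a single quantum of light is Planck's constant times its frequency (E = hf). A minimal sketch with the constants filled in:

```python
# Energy of a single quantum (photon): E = h * f
PLANCK_H = 6.626e-34  # Planck's constant, in joule-seconds

def quantum_energy(frequency_hz: float) -> float:
    """Return the energy, in joules, of one quantum at the given frequency."""
    return PLANCK_H * frequency_hz

# A photon of green light (frequency roughly 5.4e14 Hz) carries about 3.6e-19 J.
# Energy is exchanged only in whole multiples of this tiny but finite amount.
print(quantum_energy(5.4e14))
```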
Let me make one more observation. When we observe a subatomic particle, our observation forces that particle into a particular state. Prior to our observation, there is simply a probability the particle will be in that state. Interactions such as observation reveal a static subatomic universe, but the reality of that universe is not static. In fact, it can be in a multitude of states at the same time. We will come back to this later and explain some of this apparent contradiction.
Classic Computing
All present-day computing is based on principles and algorithms that predate the microprocessor. Data is represented as a series of 1s and 0s and is processed by being acted upon by gates. A "not" gate changes a 0 to a 1 and vice versa. A "and" gate takes two inputs and returns a 1 only if both inputs are 1, and so on. All the systems we currently use are built from these concepts. Data (that series of 1s and 0s) could just as easily be represented by a string of toggle switches or marks on a rock as by voltages in a microprocessor.
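Because a bit is just an abstraction, those gates can be written down in a few lines of any programming language. A minimal sketch in Python:

```python
# Classical logic gates operating on bits (0 or 1)
def not_gate(a: int) -> int:
    return 1 - a  # flips 0 to 1 and 1 to 0

def and_gate(a: int, b: int) -> int:
    return a & b  # returns 1 only if both inputs are 1

# Every digital circuit, no matter how large, reduces to combinations like these.
print(not_gate(0))     # 1
print(and_gate(1, 1))  # 1
print(and_gate(1, 0))  # 0
```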
Modern computers have achieved great efficiencies of scale and speed but are still working in a serial world. Take this piece of data and do this to it; then take it and do that to it. That is why current cryptographic schemes are considered secure. The sites on which we carry out e-commerce are secured by some form of public-key encryption whose security rests on the difficulty of factoring large numbers. At bottom, algorithms for factoring a number are search procedures that try one candidate factor after another once the obvious losers are eliminated.
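To see why that search is expensive, here is a minimal trial-division sketch. Real factoring algorithms are far more sophisticated, but they share the one-candidate-at-a-time character:

```python
def trial_division(n: int) -> int | None:
    """Return the smallest nontrivial factor of n, or None if n is prime."""
    # Skip the obvious losers: after 2, only odd candidates need testing,
    # and the smallest factor of a composite n cannot exceed sqrt(n).
    if n % 2 == 0:
        return 2
    candidate = 3
    while candidate * candidate <= n:
        if n % candidate == 0:
            return candidate
        candidate += 2
    return None  # n is prime

print(trial_division(2021))  # 43, since 2021 = 43 * 47
```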
Parallel computing may let us run complex processes faster but is still limited: we are still processing everything serially, only with multiple processors or machines. Assume we are trying to factor a 100-digit number. At some point in the process, we are going to test a particular 32-digit number, then another 32-digit number, then another, and so on. What if we could test all possible 32-digit numbers at once? We could reduce the time needed to factor large numbers from years to hours (or less). And that is what we hope to gain from quantum computing.
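Classical parallelism only divides that search among workers; each worker still grinds through its share one candidate at a time. A minimal sketch of the idea, splitting a candidate range across processes (the number to factor, the chunk size, and the worker count are made-up illustration values):

```python
from multiprocessing import Pool

def search_chunk(bounds: tuple[int, int, int]) -> int | None:
    """Test candidate factors in [start, stop) against n, one at a time."""
    n, start, stop = bounds
    for candidate in range(start | 1, stop, 2):  # odd candidates only
        if n % candidate == 0:
            return candidate
    return None

if __name__ == "__main__":
    n = 1_000_003 * 1_000_033            # a composite to factor (illustrative)
    chunks = [(n, lo, lo + 250_000) for lo in range(3, 1_000_003 + 1, 250_000)]
    with Pool(4) as pool:                # 4 workers, each still serial inside
        hits = [f for f in pool.map(search_chunk, chunks) if f]
    print(min(hits))                     # 1000003
```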
Bits and Qubits
This is the tricky part. We all understand bits. They are 0 or 1. There are no other alternatives. Enter the qubit, or quantum bit. Just as the bit is the fundamental unit of information in digital systems, the qubit is the fundamental unit of information in a quantum system. A qubit can exist in a state that corresponds to the classical logical state of 0 or 1. It also can exist in a state called a superposition, which may be described as a blend of those states: in a superposition, the qubit exists simultaneously as 0 and 1. Let me explain.
Imagine a qubit as a sphere, not unlike the earth. Now imagine a vector (or arrow) drawn from the center of the sphere to the North Pole. Let that vector represent 1. Imagine a second vector drawn from the center of the sphere to the South Pole. Let that vector represent 0. According to classic quantum theory, if we apply a certain amount of energy to the qubit in state 0, we can flip it to state 1. Thus far we have described a system that could be used for present-day digital computing. We can switch the qubit from 0 to 1 and back.
Now, let's assume that instead of applying the proper quantum of energy to flip the state from 0 to 1, we apply an amount of energy less than that discrete packet. What we have created is a new state that is neither 0 nor 1. In reality, subatomic particles exist in states that are probabilities. It is only when they are acted upon by an external agent (as when we observe them) that they appear in a discrete state. If our vector were pointing to a spot on the surface of the sphere at a latitude of 45 degrees North (where it would be in a state of superposition), then the probability we would observe that qubit as a 1 is much greater than the probability we would observe it as a 0.
Now, assume we apply just the right amount of energy so that our vector points to a spot at 0 degrees latitude (on the equator). The probability that it will be observed as a 1 or a 0 is equal, and the qubit is actually in both states at once. Really! This is not conjecture or speculation. Experiments using light (photons) and half-silvered mirrors (beam splitters) show subatomic particles are able to travel different paths simultaneously.
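The arithmetic behind those probabilities is simple enough to simulate. In the picture above, with 1 at the North Pole, a qubit whose vector sits at polar angle theta from the pole will be read as a 1 with probability cos²(theta/2). A minimal sketch, with the latitude-to-angle bookkeeping following the sphere as described above:

```python
import math
import random

def measurement_probabilities(latitude_deg: float) -> tuple[float, float]:
    """P(observe 1), P(observe 0) for a qubit vector at the given latitude.

    Following the sphere above: 1 at the North Pole, 0 at the South Pole.
    """
    theta = math.radians(90.0 - latitude_deg)  # polar angle from the North Pole
    p_one = math.cos(theta / 2) ** 2           # squared amplitude for reading a 1
    return p_one, 1.0 - p_one

def observe(latitude_deg: float) -> int:
    """Observation forces the qubit into a definite 0 or 1."""
    p_one, _ = measurement_probabilities(latitude_deg)
    return 1 if random.random() < p_one else 0

print(measurement_probabilities(45.0))  # about (0.854, 0.146): mostly 1s
print(measurement_probabilities(0.0))   # (0.5, 0.5): the equator, a fair coin
```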
You will notice I have defined a qubit abstractly, just as a bit is an abstract concept. It doesn't matter whether a bit is represented as a voltage, light, or a stroke on a chalkboard. The understanding of a bit as a two-state device is enough to build working algorithms. Likewise, it doesn't matter how a qubit actually is realized. Whether it is done by measuring the spin of an electron or the polarization of a photon is completely immaterial to our ability to imagine and design quantum computing systems.
This Is the Cool Part
Let's hope we have a rudimentary understanding of a qubit. We will start with a 512-bit piece of data: a string of 0s and 1s 512 units long. This chunk of data has 2^512 possible states. Using digital technology, we can perform an operation on a single state of that data at a time. If we want to perform that operation on every possible state of that data, we need to perform 2^512 separate operations, one at a time.
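To get a feel for that number, you can ask any interpreter to spell 2^512 out. It runs to 155 digits, comfortably more than the estimated number of atoms in the observable universe (roughly a 1 followed by 80 zeros):

```python
# How big is 2**512, really?
states = 2 ** 512
print(len(str(states)))  # 155 digits
print(states)            # the full number, for the curious
```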
Pardon me while I fire up my quantum box. It now is possible to set each of the 512 qubits to represent both 0 and 1. It now is possible to perform our operation on all 2^512 states at once. Since each qubit is in a state of 0 and 1 simultaneously, every possible state of that data can be acted on in a single process. That's parallel computing without multiple processors but with the data in multiple possible states. A complex, processor-intensive operation like factoring large numbers becomes easy. Current encryption schemes become worthless; computing is changed forever.
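You can watch this happen, in miniature, with an ordinary statevector simulation. A minimal sketch with 3 qubits instead of 512 (a classical simulation, of course, so the exponential cost is merely hidden in the array size): putting each qubit into its equal blend of 0 and 1 produces all 2^3 = 8 states at once, and a single operation then touches every one of them.

```python
import numpy as np

n = 3                                          # 3 qubits instead of 512
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate: equal blend of 0 and 1

# Start in the all-zeros state |000>
state = np.zeros(2 ** n)
state[0] = 1.0

# Put every qubit into superposition: the full-register operator is H (x) H (x) H
full_h = H
for _ in range(n - 1):
    full_h = np.kron(full_h, H)
state = full_h @ state

print(np.round(state, 3))  # 8 equal amplitudes: all 8 states present at once

# One operation now acts on every state simultaneously, e.g. flipping the
# sign of any state whose value is divisible by 3 (an arbitrary example):
marked = np.array([1 if k % 3 == 0 else 0 for k in range(2 ** n)])
state = state * (1 - 2 * marked)
print(np.round(state, 3))
```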
OK, there you have a grossly oversimplified example of one possibility that quantum computing presents.
Niels Bohr commented on Max Planck's research, saying: "In the history of science there are few events, which, in the brief span of a generation, have had such extraordinary consequences as Planck's discovery of the elementary quantum of action. It has brought about a complete revision of the foundations underlying our description of natural phenomena." The same sort of statement could be made regarding quantum computing.
Fact or Fiction?
Did I mention quantum computers don't exist? True, true, but small-scale qubit operations have been demonstrated. Quantum computing began as a mental exercise. Richard Feynman first proposed the idea of a quantum computer in 1981. The question was, "If we use quantum states to encode and process data instead of macroscopic states, will it change our understanding of information theory?" That early speculation has evolved into reality. Corporate America is allocating R&D money to build these things. Lucent Technologies (at the famed Bell Laboratories) and IBM are just two of the players. We may be years away from working machines, but with the enormous revenues anticipated from such a radical change, research will continue. Every so often one sees an article about a 5-qubit device that actually did something it was supposed to do for a brief period of time. To me, that is encouraging.
What About Us?
Hey, we sell insurance. What the heck do we care about factoring 200-digit numbers? Do you have gigabytes of data that you can't properly analyze? Do you want to reduce the risk on each policy you write? Quantum computing will support incredibly complex data mining. It will give you the ability to better identify homogeneous risk groups. It will allow you to determine patterns in claims instantaneously. Who knows what possibilities it may bring? The insurance industry always has been at the forefront of computer technology, and I suspect it will be there when quantum computing is a reality. If not, it is a wonderful mental exercise to imagine the possibilities.