“God does not play dice.” These famous words of Albert Einstein encapsulate his lifelong opposition to the radical indeterminism lying at the heart of the Copenhagen Interpretation of quantum mechanics. Developed in the first quarter of the 20th century by (among others) Niels Bohr, Erwin Schrödinger, Louis de Broglie, and, ironically, Einstein himself, quantum mechanics represented nothing less than a wholesale re-imagining of the character and behavior of the physical world at the atomic and subatomic scale.

Light, classically conceived as an electromagnetic wave, was discovered to possess a peculiar combination of wave-like and particle-like properties. Stranger still, this wave-particle duality appeared to apply not only to light but to any and all subatomic entities, including the very protons, neutrons, and electrons of which all macro-scale objects, humans included, are composed.

The most interesting and troubling consequence of this wave-particle duality is the existence in quantum mechanics of a series of uncertainty relations between measurable characteristics of particles. Perhaps the best known of these is the relation between position and momentum (the product of a particle’s mass and its velocity). It implies that there is an inexorable limit to the precision with which one can simultaneously know a particle’s position and its momentum.
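For reference, this relation (due to Werner Heisenberg) is commonly written as

\[
\Delta x \, \Delta p \ge \frac{\hbar}{2},
\]

where \(\Delta x\) and \(\Delta p\) are the spreads (standard deviations) in position and momentum, and \(\hbar\) is the reduced Planck constant. The smaller one spread is made, the larger the other must become.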

To get a sense for the strangeness of such a limit, imagine that, instead of protons and electrons, you wish to determine the velocity and position of a small metal ball rolling down an inclined plane. You establish a starting point from which the ball will be released and then, supposing you are very ambitious, set up a laser system one yard down the ramp, designed to emit an audible beep at the very instant the ball passes the one-yard mark. You then stand at the finish line, speed gun in hand, ready to measure, immediately after hearing the beep, exactly how fast the ball is going (we will assume you can visually verify the ball’s direction and so ascertain its velocity).

You run the test a number of times and, to your dismay, observe that despite the apparent simplicity of the system, your velocity measurements cover an unnervingly wide spread; strangely, a few measurements even indicate the ball was rolling up the ramp as it passed the one-yard mark. Shaken, you decide instead to devise a system that measures the ball’s position only after the ball has reached a certain velocity, say 5 mph. Again you run numerous tests, and this time you find it is the position measurements that are all over the place.

In both cases, the precise determination of one variable results in an ambiguous spreading of possible values for the other. To clarify: in each individual experiment, definite values for both variables are found, but the values of the second variable vary significantly from one test to the next. Ultimately you find that the combined uncertainty in the ball’s position and momentum, the product of the two spreads, stays stubbornly above some minimum limit. Disillusioned, you conclude that you are helpless to predict the precise behavior of the system on any given run, and must resort to a mere specification of the average values toward which its position and velocity will gravitate over many iterations.

Thankfully, such peculiar happenings never occur in the relatively large-scale world of everyday human experience; the minimum uncertainty is so tiny that it is utterly negligible for macroscopic objects. Yet these uncertainty relations are inherent to the theoretical framework of quantum mechanics. More perplexing still, the theory offers no explanation as to why they exist.

By the 1930s, the vast majority of physicists had accepted the theory as a critical tool for predicting physical phenomena; however, the question of what the theory actually meant remained an object of serious contention. One stance, championed by Niels Bohr, maintained that the uncertainty relations of quantum mechanics correspond to a fundamental uncertainty inherent in any attempt to describe atomic and subatomic systems.

This stance, which came to be known as the Copenhagen or Statistical Interpretation, became the dominant view among early and mid-century physicists. However, a substantial number of physicists, including Einstein, regarded the uncertainty relations as an indication that quantum mechanics was somehow incomplete.

These physicists maintained that the apparent statistical behavior of subatomic systems was, in reality, the result of ‘hidden variables’: subtle mechanisms that remained to be discovered and explicated. And theirs is surely a valid objection. This chair I sit in certainly seems solid; so do this floor, these walls, this whole world, all present and determinate before my eyes and to my hands. Why accept these uncertainty relations? They are nothing but inelegant, unintuitive, lazy approximations of an actual system that is out there and in need of explication.

The hidden variables stance rests upon the fundamental belief that the universe unfolds in a manner that is, in its entirety, accessible to the human intellect. This assumption of intelligibility constitutes, in many ways, a kind of faith, one that has underpinned scientific inquiry since its inception in the Age of Enlightenment and largely continues to guide it to this day.

However, in a rather shocking turn of events, a group of proofs known collectively as Bell’s Theorem, the first of which was published in 1964 by the Northern Irish physicist John S. Bell, dealt a serious blow to all hidden variables theories and to the belief in the world’s fundamental intelligibility that they embodied. Bell demonstrated, by way of rigorous mathematical proof, that as long as no influences are allowed to travel faster than the speed of light, no hidden variables theory could possibly make predictions entirely consistent with those of quantum mechanics.
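One widely cited form of the result, the CHSH inequality (a 1969 refinement by Clauser, Horne, Shimony, and Holt), makes the conflict concrete. If \(E(a,b)\) denotes the correlation between measurement outcomes on a pair of particles at detector settings \(a\) and \(b\), then any local hidden variables theory requires

\[
|E(a,b) - E(a,b') + E(a',b) + E(a',b')| \le 2,
\]

whereas quantum mechanics predicts values as large as \(2\sqrt{2} \approx 2.83\) for suitably entangled pairs, a violation that experiments have repeatedly observed.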

Thus, if one wishes to maintain that the universe, and matter in particular, is fundamentally determinate and intelligible, one is forced to dismiss quantum mechanics in its entirety. As should perhaps not be surprising, the vast majority of contemporary physicists side with quantum mechanics, embracing its limitations as an inescapable consequence of seeking to describe atomic and subatomic systems.

The sciences, and physics in particular, were founded upon the assumption that the workings of the universe are, in a fundamental way, accessible to the human intellect. With quantum mechanics, and Bell’s Theorem in particular, we find a peculiar instance in which that assumption of intelligibility was pursued to its furthest-reaching implications, and there, amidst the irreducible strangeness of the minuscule, revealed to be fundamentally false. There is something hauntingly beautiful about a world that always and forever resists the prying of the human mind; and perhaps there is an equal beauty in the individual willing to admit this fact.

Nick Lammers

Guest Writer
