This book is a review edition, reporting on research by Steve Faulkner into the foundations of *quantum randomness* and *quantum indeterminacy*.

184 pages

Readers will find quantum indeterminacy to be a mathematically technical subject, yet one that makes intuitive sense of unintuitive facts witnessed in experiments.

The book provides answers for theorists working in the Philosophy of Physics and the Foundations of Quantum Theory, and especially for scientists wanting insight into quantum indeterminacy and quantum randomness. In particular, it will interest researchers investigating the freedoms and limitations of Mathematical Logic and their impact on physical processes in Quantum Mechanics. Its revelations will also interest all who study physics, all students of Quantum Mechanics, and all philosophers researching the Foundations of Physics.

The book is a rigorous dissection of Quantum Mathematics from the standpoint of Mathematical Logic: self-reference, undecidability and information content. It follows up on the experimental evidence of Tomasz Paterek et al., published under the title “Logical Independence and Quantum Randomness”, and exposes the internal workings of quantum indeterminacy, with answers explaining *uncausedness* as lying in self-reference, and *indefiniteness* as arising in inaccessible lost history, together with ambiguity in the orientational attitude of the reference frame, due to perfect symmetry.

The answers to quantum indeterminacy lie in mathematical information and the limitations on how it may be conveyed through physical processes. Physicists, philosophers and students of Quantum Information who believe the question of quantum indeterminacy and quantum randomness will be resolved through the discovery of some currently unknown missing Physical Principle or Postulate will find their expectations misplaced. And those expecting quantum indeterminacy and quantum randomness to be found irreducible, or predetermined in hidden variables, will find themselves wrong also.

Physicists and philosophers will find the book’s revelations groundbreaking and exciting. They open up a rich new area for research. Those learning from it will have the opportunity to spread the word to their peers, consolidate it, utilise it in their own research, extend it and further advance physical theory. The author predicts this will be helpful in progressing the Measurement Problem and the EPR Paradox, and may also offer help to those wishing to link unitary quantum theory with a curved spacetime metric.

The book is substantively based on empirical evidence gained from 'The Vienna Experiments' performed by Tomasz Paterek et al. on polarised photons, and published in their paper “Logical Independence and Quantum Randomness”. Those experiments will be seen by future physicists as a monumental turning point. They reveal a flow of mathematical information, implying definite logical consequence for pure states and, for mixed states, a freedom permitting specific indefiniteness.

At the heart of this freedom is logical independence: the absence of dependency between unconnected variables. In axiom systems prescribing the infinite fields of scalars (the complex plane, the real line and the rationals), it is synonymous with the undecidability of statements made famous by Kurt Gödel.

This independence, or undecidability, is non-existent information, neither specified nor asserted. In quantum systems it cannot be provided by the combined union of the Axioms upon which Applied Mathematics rests, together with the Postulates of Quantum Theory, neither for the derivation of proofs nor of their negations. If needed, it is information that must be brought in as ingression from elsewhere.

The book shows how the Vienna Team’s language of qubits and Boolean formalism translates into the language of Quantum Mechanics proper, and importantly, how textbook Quantum Mechanics adapts to accommodate the Team’s experiments.

In *classical physics,* experiments of chance, such as coin-tossing and dice-throwing, are not truly random but *deterministic*, in the sense that perfect knowledge of the initial conditions would render outcomes perfectly predictable. Put another way: if the initial conditions are guaranteed perfectly identical, the outcomes of different throws will be identical also. The degree of randomness relates to the degree of ignorance of the detail of the initial toss or throw. Accordingly, *classical randomness* stems from ignorance of *physical information*.
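This determinism can be made concrete with a toy model (purely illustrative; the simplified coin physics and the numbers are assumptions, not anything from the book). The outcome is a pure function of the initial conditions, so identical inputs always give identical outputs, while small ignorance of those inputs is what makes the result look random:

```python
import math

def coin_outcome(launch_speed, spin_rate):
    """Idealised deterministic coin toss: the outcome is a pure function
    of the initial conditions (a toy model, not real coin dynamics)."""
    flight_time = 2 * launch_speed / 9.81          # seconds in the air
    half_turns = int(2 * spin_rate * flight_time)  # completed half-rotations
    return "heads" if half_turns % 2 == 0 else "tails"

# Perfectly identical initial conditions give identical outcomes...
assert coin_outcome(3.0, 12.0) == coin_outcome(3.0, 12.0)

# ...but slight ignorance of the initial spin can flip the result.
print(coin_outcome(3.0, 12.0), coin_outcome(3.0, 12.3))
```

Here the randomness of an ordinary toss is entirely a matter of not knowing `launch_speed` and `spin_rate` precisely enough.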

In diametrical contrast, in *quantum physics,* the theorems of Kochen and Specker, the inequalities of John Bell, and the experimental evidence of Alain Aspect all indicate that *quantum randomness* does not stem from any such *physical information,* often referred to as *‘hidden variables’* or *‘predetermined properties’*.

Motivated by that negatory evidence, in 2008 experiments were conducted in Vienna by Tomasz Paterek et al., designed to demonstrate that quantum randomness originates in *mathematical information*. Their research revealed that quantum randomness results only in experiments where logical independence is involved: a logical disconnect standing between items of information which neither prove nor disprove one another.

The inference we can make is that quantum randomness is a matter of *conveyance processes* and *communication* of physical information, rather than the substance-content of physical information itself.

Randomness refers to statistical distribution in *large* samples; doing statistics on a sample of *one* is meaningless and can never tell us about randomness. Yet each single sample of *one* must convey an ‘intrinsic randomness’; we call this *indeterminacy*. The following is a simple example illustrating quantum indeterminacy, given by Richard Feynman in his book *QED: The Strange Theory of Light and Matter*.

Quantum Indeterminacy is illustrated in light reflected by a glass sheet. The experiment concerns a beam of red light. Blue light would do just as well; the important point is that all the light is the same colour.

A very sensitive detector produces ‘noise’ when hit by this beam. As the beam intensity is lowered, the noise becomes discernible as *separate* clicks. The separate clicks are explained as registering discrete *photons*. As the beam intensity is lowered further, to something of the order of weak starlight, the clicks happen less and less often, but their loudness never weakens. The clear separation of clicks indicates that one photon at a time is present in the experiment.

The stream of photons is now aimed at a glass sheet, with detectors placed in front of and behind it, facing it. Clicks from the different detectors are found never to be simultaneous. Counting clicks reveals the ratio of photons *reflected* to those *transmitted*. Out of every 100 clicks, those reflected average some definite number in the range 0–16, depending on the glass thickness. For a glass sheet of a particular thickness, the reflected clicks might average 5, say. This average remains constant as the beam intensity is varied.

For the thinnest of glass sheets, the number reflected is almost always zero. As the thickness is increased, the reflections rise to an average of 16 and then fall back to zero. This pattern repeats in cycles over and over as the thickness is gradually increased. Newton knew of these cycles. Modern experiments using monochromatic lasers reveal them continuing past 100,000,000 repetitions, corresponding to 50 metres of glass.
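Feynman explains these cycles as interference between two ‘arrows’, one from each surface of the glass. A minimal numerical sketch (the per-surface amplitude of 0.2, the helium–neon wavelength and the refractive index 1.5 are illustrative assumptions, not figures from the book) reproduces a reflectance cycling between 0 and 16 per cent as the thickness grows:

```python
import math

def reflectance(thickness_nm, wavelength_nm=632.8, n_glass=1.5, r=0.2):
    """Two-arrow model of partial reflection: front- and back-surface
    contributions interfere with a phase set by the round trip in the glass."""
    # phase accumulated over the round trip inside the glass
    delta = 4 * math.pi * n_glass * thickness_nm / wavelength_nm
    # two arrows of length r with opposite sign (external vs internal reflection)
    return 4 * r**2 * math.sin(delta / 2) ** 2

# Reflectance cycles between 0% (thinnest sheets) and 16% as thickness grows.
for t in range(0, 301, 50):
    print(f"{t:4d} nm: {100 * reflectance(t):5.1f} %")
```

With `r = 0.2` the two arrows can combine to anything from 0 to `(0.2 + 0.2)**2 = 0.16`, matching the 0–16-per-100 range described above.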

Irrespective of all that, there exists no rule by which we can ever predict whether the *next* photon will reflect or transmit.

The detectors demonstrate *discrete decisions* made by *discrete* objects. But the ratios and cycles are perfectly explained by interference in a *wave continuum* — expressing no decision. The waves are viewed as expressing *probability* for decisions individual photons *will* make. But they do not determine, predict, imply or cause the decision of any individual photon.
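The division of labour between wave and photon can be sketched in a few lines (a toy simulation only: a seeded pseudo-random generator stands in for the photon’s ‘uncaused’ choice, which is itself a disanalogy, since the generator is deterministic). The fixed probability reproduces the stable large-sample ratio, yet nothing in it determines any single decision:

```python
import random

def photon_decision(p_reflect, rng):
    """One photon arrives: the wave assigns a probability, but the
    individual reflect/transmit decision is drawn, not derived."""
    return rng.random() < p_reflect

rng = random.Random(42)  # seeded only so the demonstration is repeatable
p = 0.05                 # e.g. a sheet whose reflected clicks average 5 per 100

for n in (100, 10_000, 1_000_000):
    reflected = sum(photon_decision(p, rng) for _ in range(n))
    print(f"{n:>9} photons: reflected fraction = {reflected / n:.4f}")
```

The reflected fraction settles ever closer to 5% as the sample grows, while the question "will the *next* photon reflect?" remains unanswerable from `p` alone.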

And so, the question of *Quantum Indeterminacy* is this: prior to encountering the glass plate, if photons are all understood to be perfectly identical, by what mechanism does any individual photon have the freedom either to transmit or to reflect? And *The Measurement Problem:* by what mechanism is that freedom lost as the decision is made?