# The State Space of Quantum Mechanics is Redundant

There was some water-cooler talk around the office this past week about a paper by Masanes, Galley and Müller that hit the arXiv, and I decided to write up my thoughts about it for ease of future reference. In short, I have no reason yet to think that the math is wrong, but what they present as a condition on states seems to me more naturally read as a condition on measurement outcomes. Upon making this substitution, the Masanes, Galley and Müller result comes much closer to resembling Gleason’s theorem than they say it does.

So, if you’re wanting for some commentary on quantum mechanics, here goes:

# What I Do

At the moment, I’m taking a quick break from reading some rather dense mathematical prose, and I spent yesterday plugging away at a draft of my research group’s next technical publication. This led me to reflect on a lesson that I think a lot of science education leaves out: Even in a technical article, you have to have a story to carry the progression through. “These are all the boffo weird roadside attractions we found while proving the theorems in our last paper” is honest, but not adequate.

Our research project is the reconstruction of the mathematical formalism of quantum theory from physical principles. We tease apart the theory, identify what is robustly strange about it — for many more quantum phenomena can be emulated with classical stochasticity than is often appreciated — and try to build a new representation that brings the most remarkable features of the physics to the forefront. In special relativity, we have Einstein’s postulates, and the dramatic tension between them: Inertial observers can come to agree upon the laws of physics, but they cannot agree upon a standard of rest. In thermodynamics, we have the Four Laws, which come with their own dramatic tension, in that energy is conserved while entropy is nondecreasing. Both of these theories are expressed in terms of what agents can and cannot do, yet they are more than “mere” engineering, because they apply to all agents. Or, to say it another way, it is to the benefit of any agent to pick up the theory and use it as a guide.

What, then, is the analogue for quantum theory? If the textbook presentation of quantum physics is like the formulae for the Lorentz transform, with all those square roots and whatnot, or the Maxwell relations in thermo, with all those intermingling partial derivatives that we invent hacks about determinants to remember, what is quantum theory’s version of Einstein’s postulates or the Four Laws?

That’s the grandiose version, anyway. The reason I got invited to speak at an American Mathematical Society meeting is that the geometric structures that arise in this work are vexingly fascinating. Want to hear about Galois fields and Hilbert’s 12th problem? We’ve got ’em! How about sphere packing and unexpected octonions? We’ve got those, too! And the structure that leads down the latter path turns out, on top of that, to yield a new way of thinking about Mermin’s 3-qubit Bell inequality. It is all lovely, and it is all strange.

The SIC problem gives us the opportunity to travel all throughout mathematics, because, while the definition looks pretty small, the question is bigger on the inside.

# New Paper Dance, Encore

This time, it’s another solo-author outing.

B. C. Stacey, “Is the SIC Outcome There When Nobody Looks?” [arXiv:1807.07194].

Informationally complete measurements are a dramatic discovery of quantum information science, and the symmetric IC measurements, known as SICs, are in many ways optimal among them. Close study of three of the “sporadic SICs” reveals an illuminating relation between different ways of quantifying the extent to which quantum theory deviates from classical expectations.

# New Papers Dance

In spite of the “everything, etc.” that is life these days, I’ve managed to do a bit of science here and there, which has manifested as two papers. First, there’s the one about quantum physics, written with the QBism group at UMass Boston:

J. B. DeBrota, C. A. Fuchs and B. C. Stacey, “Symmetric Informationally Complete Measurements Identify the Essential Difference between Classical and Quantum” [arXiv:1805.08721].

We describe a general procedure for associating a minimal informationally-complete quantum measurement (or MIC) and a set of linearly independent post-measurement quantum states with a purely probabilistic representation of the Born Rule. Such representations are motivated by QBism, where the Born Rule is understood as a consistency condition between probabilities assigned to the outcomes of one experiment in terms of the probabilities assigned to the outcomes of other experiments. In this setting, the difference between quantum and classical physics is the way their physical assumptions augment bare probability theory: Classical physics corresponds to a trivial augmentation — one just applies the Law of Total Probability (LTP) between the scenarios — while quantum theory makes use of the Born Rule expressed in one or another of the forms of our general procedure. To mark the essential difference between quantum and classical, one should seek the representations that minimize the disparity between the expressions. We prove that the representation of the Born Rule obtained from a symmetric informationally-complete measurement (or SIC) minimizes this distinction in at least two senses—the first to do with unitarily invariant distance measures between the rules, and the second to do with available volume in a reference probability simplex (roughly speaking a new kind of uncertainty principle). Both of these arise from a significant majorization result. This work complements recent studies in quantum computation where the deviation of the Born Rule from the LTP is measured in terms of negativity of Wigner functions.
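To give a taste of the “disparity” being minimized — my gloss in the standard QBist notation, not an excerpt from the paper — suppose $p(i)$ are the probabilities an agent assigns to the $d^2$ outcomes of a reference SIC measurement in dimension $d$, and $r(j|i)$ are the conditional probabilities for a subsequent experiment. Classical intuition would glue these together with the Law of Total Probability, $q(j) = \sum_i p(i)\, r(j|i)$, while the SIC representation of the Born Rule instead reads

```latex
q(j) \;=\; \sum_{i=1}^{d^2} \left[ (d+1)\, p(i) - \frac{1}{d} \right] r(j|i) ,
```

a deformation of the LTP by the dimension-dependent factor $(d+1)\,p(i) - 1/d$. The theorem-proving in the paper concerns how small that deformation can be made.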

To get an overall picture of our results without diving into the theorem-proving, you can watch John DeBrota give a lecture about our work.

Second, there’s the more classical (in the physicist’s sense, if not the economist’s):

B. C. Stacey and Y. Bar-Yam, “The Stock Market Has Grown Unstable Since February 2018” [arXiv:1806.00529].

On the fifth of February, 2018, the Dow Jones Industrial Average dropped 1,175.21 points, the largest single-day fall in history in raw point terms. This followed a 666-point loss on the second, and another drop of over a thousand points occurred three days later. It is natural to ask whether these events indicate a transition to a new regime of market behavior, particularly given the dramatic fluctuations — both gains and losses — in the weeks since. To illuminate this matter, we can apply a model grounded in the science of complex systems, a model that demonstrated considerable success at unraveling stock-market dynamics from the 1980s through the 2000s. Using large-scale co-movement of stock prices as an early indicator of unhealthy market dynamics, that earlier work found that abrupt drops in a certain parameter U provide an early warning of single-day panics and economic crises. Decreases in U indicate regimes of “high co-movement”, a market behavior that is not the same as volatility, though market volatility can be a component of co-movement. Applying the same analysis to stock-price data from the beginning of 2016 until now, we find that the U value for the period since 5 February is significantly lower than for the period before. This decrease entered the “danger zone” in the last week of May, 2018.
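For readers who want a hands-on feel for what “co-movement” means, here is a purely illustrative toy — my own sketch, emphatically not the parameter U defined in the paper: measure co-movement as the average pairwise correlation of daily log returns across a basket of stocks.

```python
import numpy as np

def average_pairwise_correlation(prices):
    """Toy co-movement indicator: the mean off-diagonal entry of the
    correlation matrix of daily log returns. Illustration only; this is
    NOT the parameter U from Stacey & Bar-Yam."""
    returns = np.diff(np.log(prices), axis=0)   # shape: (days - 1, stocks)
    corr = np.corrcoef(returns, rowvar=False)   # stocks x stocks
    n = corr.shape[0]
    return corr[~np.eye(n, dtype=bool)].mean()

# Synthetic data: a shared "market factor" induces co-movement
# among five otherwise-independent stocks.
rng = np.random.default_rng(0)
common = rng.normal(0.0, 0.01, size=(250, 1))
idiosyncratic = rng.normal(0.0, 0.01, size=(250, 5))
log_returns = 0.8 * common + idiosyncratic
prices = 100 * np.exp(np.cumsum(log_returns, axis=0))
prices = np.vstack([100 * np.ones(5), prices])  # prepend day-0 prices

print(average_pairwise_correlation(prices))
```

Cranking up the weight on the common factor drives the indicator toward 1, mimicking the “high co-movement” regime; the paper’s actual analysis is considerably more refined than this.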

The weekend before last, I overcame my reluctance to travel and went to a mathematics conference, the American Mathematical Society’s Spring Central Sectional Meeting. I gave a talk in the “Recent Advances in Packing” session, spreading the word about SICs. My talk followed those by Steve Flammia and Marcus Appleby, who spoke about the main family of known SIC solutions while I covered the rest (the sporadic SICs). The co-organizer of that session, Dustin Mixon, has posted an overall summary and the speakers’ slides over at his blog.

# To Thems That Have

Occasionally, I think of burning my chances of advancing in the physics profession — or, more likely, just burning my bridges with Geek Culture(TM) — by writing a paper entitled, “Richard Feynman’s Greatest Mistake”.

I did start drafting an essay I call “To Thems That Have, Shall Be Given More”. There are a sizable number of examples where Feynman gets credit for an idea that somebody else discovered first. It’s the rich-get-richer of science.

# Multiscale Structure of More-than-Binary Variables

When I face a writing task, my two big failure modes are either not starting at all and dragging my feet indefinitely, or writing far too much and having to cut it down to size later. In the latter case, my problem isn’t just that I go off on tangents. I try to answer every conceivable objection, including those that only I would think of. As a result, I end up fighting a rhetorical battle that only I know about, and the prose that emerges is not just overlong, but arcane and obscure. Furthermore, if the existing literature on a subject is confusing to me, I write a lot in the course of figuring it out, and so I end up with great big expository globs that I feel obligated to include with my reporting on what I myself actually did. That’s why my PhD thesis set the length record for my department by a factor of about three.

I have been experimenting with writing scientific pieces that are deliberately bite-sized to begin with. The first such experiment that I presented to the world, “Sporadic SICs and the Normed Division Algebras,” was exactly two pages long in its original form. The version that appeared in a peer-reviewed journal was slightly longer; I added a paragraph of context and a few references.

My latest attempt at a mini-paper (articlet?) is based on a blog post from a few months back. I polished it up, added some mathematical details, and worked in a comparison with other research that was published since I posted that blog item. The result is still fairly short:

A few years ago, I noticed a glitch in a paper that colleagues of mine had published back in 2002. A less-than sign in an inequality should have been a less-than-or-equals. This might have been a transcription error during the typing-up of the work, or it could have entered during some other phase of the writing process. Happens to the best of us! Algebraically, it was equivalent to solving an equation
$ax^2 + bx + c = 0$ with the quadratic formula,
$x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a},$ and neglecting the fact that if the expression under the square root sign equals zero, you still get a real solution.
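Spelled out (my gloss), the boundary case that a strict inequality throws away is the one where the discriminant vanishes:

```latex
b^2 - 4ac = 0 \quad \Longrightarrow \quad x = -\frac{b}{2a} ,
```

a single repeated real root rather than no real root at all.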

This sort of glitch is usually not worth a lot of breath, though I do tend to write in when I notice one, to keep down the overall confusingness of the scientific literature. In this case, however, there’s a surprise bonus. The extra solutions you pick up turn out to have a very interesting structure to them, and they include mathematical objects that were already interesting for other reasons. So, I wrote a little note explaining this. In order to make it self-contained, I had to lay down a bit of background, and with one thing and another, the little note became more substantial. Too substantial, I learned: The journal that published the original paper wouldn’t take it as a Comment on that paper, because it said too many new things! Eventually, after a little more work, it found a home:

The number of citations that Google Scholar lists for this paper (one officially published in a journal, mind) fluctuates between 5 and 6. I think it wavers on whether to include a paper by Szymusiak and Słomczyński (Phys. Rev. A 94, 012122 = arXiv:1512.01735 [quant-ph]). Also, if you compare against the NASA ADS results, it turns out that Google Scholar is missing other citations, too, including a journal-published item by Bellomo et al. (Int. J. Quant. Info. 13, 2 (2015), 1550015 = arXiv:1504.02077 [quant-ph]).

As I said in 2014, this would be a rather petty thing to care about, if people didn’t rely on these metrics to make decisions! And, as it happens, all the problems I noted then are still true now.

# Multiscale Structure, Information Theory, Explosions

I’d like to talk a bit about using information theory to quantify the intuition that a complex system exhibits structure at multiple scales of organization. My friend and colleague Ben Allen wrote an introduction to this a while ago:

Ben’s blog post is a capsule introduction to this article that he and I wrote with Yaneer Bar-Yam:

I also cover this topic, as well as a fair bit of background on how to relate probability and information, in my PhD thesis:

In this post, I’ll carry the ideas laid out in these sources a little bit farther in a particular direction.

# New Paper Dance

M. Appleby, C. A. Fuchs, B. C. Stacey and H. Zhu, “Introducing the Qplex: A Novel Arena for Quantum Theory,” arXiv:1612.03234 [quant-ph] (2016).

Abstract:

We reconstruct quantum theory starting from the premise that, as Asher Peres remarked, “Unperformed experiments have no results.” The tools of modern quantum information theory, and in particular the symmetric informationally complete (SIC) measurements, provide a concise expression of how exactly Peres’s dictum holds true. That expression is a constraint on how the probability distributions for outcomes of different, mutually exclusive experiments mesh together, a type of constraint not foreseen in classical thinking. Taking this as our foundational principle, we show how to reconstruct the formalism of quantum theory in finite-dimensional Hilbert spaces. Along the way, we derive a condition for the existence of a $d$-dimensional SIC.

Also available through SciRate, where I have a whole profile.

# New Paper Dance

B. C. Stacey, “Geometric and Information-Theoretic Properties of the Hoggar Lines” (2016), arXiv:1609.03075 [quant-ph].

We take a tour of a set of equiangular lines in eight-dimensional Hilbert space. This structure defines an informationally complete measurement, that is, a way to represent all quantum states of three-qubit systems as probability distributions. Investigating the shape of this representation of state space yields a pattern of connections among a remarkable spread of mathematical constructions. In particular, studying the Shannon entropy of probabilistic representations of quantum states leads to an intriguing link between the questions of real and of complex equiangular lines. Furthermore, we will find relations between quantum information theory and mathematical topics like octonionic integers and the 28 bitangents to a quartic curve.

All things told, SIC-POVMs are just about the damnedest things I’ve ever studied in mathematics.

(Also listed on SciRate.)

# New(-ish) Publications

I’ve had a few scholarly items come out in the past several weeks—new stuff, and updated versions of old stuff. Here are their coordinates:

# Multiscale Structure via Information Theory

We have scienced:

B. Allen, B. C. Stacey and Y. Bar-Yam, “An Information-Theoretic Formalism for Multiscale Structure in Complex Systems” [arXiv:1409.4708].

We develop a general formalism for representing and understanding structure in complex systems. In our view, structure is the totality of relationships among a system’s components, and these relationships can be quantified using information theory. In the interest of flexibility we allow information to be quantified using any function, including Shannon entropy and Kolmogorov complexity, that satisfies certain fundamental axioms. Using these axioms, we formalize the notion of a dependency among components, and show how a system’s structure is revealed in the amount of information assigned to each dependency. We explore quantitative indices that summarize system structure, providing a new formal basis for the complexity profile and introducing a new index, the “marginal utility of information”. Using simple examples, we show how these indices capture intuitive ideas about structure in a quantitative way. Our formalism also sheds light on a longstanding mystery: that the mutual information of three or more variables can be negative. We discuss applications to complex networks, gene regulation, the kinetic theory of fluids and multiscale cybernetic thermodynamics.
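The abstract’s remark that the mutual information of three or more variables can be negative has a classic toy illustration — my example here, not one drawn from the paper: the XOR gate. With $X$ and $Y$ independent fair bits and $Z = X \oplus Y$, the three-variable mutual information $I(X;Y;Z) = H(X)+H(Y)+H(Z)-H(X,Y)-H(X,Z)-H(Y,Z)+H(X,Y,Z)$ comes out to $-1$ bit.

```python
import itertools
from math import log2

def entropy(joint, axes):
    """Shannon entropy (in bits) of the marginal over the given axes,
    where `joint` maps outcome tuples (x, y, z) to probabilities."""
    marginal = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in axes)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marginal.values() if p > 0)

# XOR: X, Y independent fair bits, Z = X xor Y.
joint = {(x, y, x ^ y): 0.25 for x, y in itertools.product((0, 1), repeat=2)}

H = lambda *axes: entropy(joint, axes)
tri_mutual = (H(0) + H(1) + H(2)
              - H(0, 1) - H(0, 2) - H(1, 2)
              + H(0, 1, 2))
print(tri_mutual)  # -1.0: three-variable mutual information can be negative
```

Any two of the three bits are statistically independent, yet any two jointly determine the third; the negative value is a signature of exactly that kind of irreducibly three-way dependency.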

There’s much more to do, but for the moment, let this indicate my mood.

# #WhyTheQuantum

One day, I’ll be able to explain the story behind how I got into this, but looking back on all the oddities of it, I’m not sure that a medium other than manga could do it justice.

My Struggles with the Block Universe [arXiv:1405.2390]

Christopher A. Fuchs, Maximilian Schlosshauer (foreword), Blake C. Stacey (editor)

This document is the second installment of three in the Cerro Grande Fire Series. Like its predecessor arXiv:quant-ph/0105039, “Notes on a Paulian Idea,” it is a collection of letters written to various friends and colleagues, most of whom regularly circuit this archive. The unifying theme of all the letters is that each has something to do with the quantum. Particularly, the collection chronicles the emergence of Quantum Bayesianism as a robust view of quantum theory, eventually evolving into the still-more-radical “QBism” (with the B standing for no particular designation anymore), as it took its most distinctive turn away from various Copenhagen Interpretations. Included are many anecdotes from the history of quantum information theory: for instance, the story of the origin of the terms “qubit” and “quantum information” from their originator’s own mouth, a copy of a rejection letter written by E. T. Jaynes for one of Rolf Landauer’s original erasure-cost principle papers, and much more. Specialized indices are devoted to historical, technical, and philosophical matters. More roundly, the document is an attempt to provide an essential ingredient, unavailable anywhere else, for turning QBism into a live option within the vast spectrum of quantum foundational thought.

As the comment field says, “CAUTION, do not unthinkingly print from a printer: 2,348 pages, 4 indices, 6 figures, with extensive hyperlinking.”

MSwtBU was originally submitted to the arXiv on 10 May 2014, the anniversary of the predecessor volume and before that of the Cerro Grande Fire, which started the whole business. To my knowledge, it is the longest item currently on the arXiv.

omg 2000+ pages. There goes my free time.