# A Picture for the Mind: the Bloch Ball

Now and then, stories will pop up in the news about the latest hot new thing in quantum computers. If the story makes any attempt to explain why quantum computing is special or interesting, it often recycles a remark along the lines of, “A quantum bit can be both 0 and 1 simultaneously.” This, well, ehhhhh… It’s rather like saying that Boston is at both the North Pole and the South Pole simultaneously. Something important has been lost. I figured I should take a stab at explaining what.

Our goal today is to develop a mental picture for a qubit, the basic unit that quantum computers are typically regarded as built out of. To be more precise, we will develop a mental picture for the mathematics of a qubit, not for how to implement one in the lab. There are many ways to do so, and getting into the details of any one method would, for our purposes today, be a distraction. Instead, we will be brave and face the issue on a more abstract level.

A qubit is a thing that one prepares and that one measures. The mathematics of quantum theory tells us how to represent these actions algebraically. That is, it describes the set of all possible preparations, the set of all possible measurements, and how to compute the probability of getting a particular result from a chosen measurement given a particular preparation. To do something interesting, one would typically work with multiple qubits together, but we will start with a single one. And we will begin with the simplest kind of measurement, the binary ones. A binary test has two possible outcomes, which we can represent as 0 or 1, “plus” or “minus”, “ping” and “pong”, et cetera. In the lab, this could be sending an ion through a magnetic field and registering whether it swerved up or down; or, it could be sending a blip of light through a polarizing filter turned at a certain angle and registering whether there is or is not a flash. Or any of many other possibilities! The important thing is that there are two outcomes that we can clearly distinguish from each other.

For any physical implementation of a qubit, there are three binary measurements of special interest, which we can call the $X$ test, the $Y$ test and the $Z$ test. Let us denote the possible outcomes of each test by $+1$ and $-1$, which turns out to be a convenient choice. The expected value of the $X$ test is the average of these two possibilities, weighted by the probability of each. If we write $P(+1|X)$ for the probability of getting the $+1$ outcome given that we do the $X$ test, and likewise for $P(-1|X)$, then this expected value is $$x = P(+1|X) \cdot (+1) + P(-1|X) \cdot (-1).$$ Because this is a weighted average of $+1$ and $-1$, it will always lie somewhere in the interval $[-1, 1]$. If for example we are completely confident that an $X$ test will return the outcome $+1$, then $x = 1$. If instead we lay even odds on the two possible outcomes, then $x = 0$. Likewise, $$y = P(+1|Y) \cdot (+1) + P(-1|Y) \cdot (-1),$$ and $$z = P(+1|Z) \cdot (+1) + P(-1|Z) \cdot (-1).$$
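To make the arithmetic concrete, here is a minimal sketch of how an expected value comes out of the outcome probabilities (the function name is mine, not standard):

```python
# Expected value of a binary test whose outcomes are +1 and -1,
# given p_plus, the probability of the +1 outcome.
def expected_value(p_plus):
    p_minus = 1.0 - p_plus
    return p_plus * (+1) + p_minus * (-1)

print(expected_value(1.0))  # complete confidence in +1: prints 1.0
print(expected_value(0.5))  # even odds: prints 0.0
```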

To specify the preparation of a single qubit, all we have to do is pick a value for $x$, a value for $y$ and a value for $z$. But not all combinations $(x,y,z)$ are physically allowed. The valid preparations are those for which the point $(x,y,z)$ lies on or inside the ball of radius 1 centered at the origin: $$x^2 + y^2 + z^2 \leq 1.$$ We call this the Bloch ball, after the physicist Felix Bloch (1905–1983). The surface of the Bloch ball, at distance exactly 1 from the origin, is the Bloch sphere. The points where the axes intersect the Bloch sphere — $(1,0,0)$, $(-1,0,0)$, $(0,1,0)$ and so forth — are the preparations where we are perfectly confident in the outcome of one of our three tests. Points in the interior of the ball, not on the surface, imply uncertainty about the outcomes of all three tests. But look what happens: If I am perfectly confident of what will happen should I choose to do an $X$ test, then my expected values $y$ and $z$ must both be zero, meaning that I am completely uncertain about what might happen should I choose to do either a $Y$ test or a $Z$ test. There is an inevitable tradeoff between levels of uncertainty, baked into the shape of the theory itself. One might even call that a matter… of principle.
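The constraint is easy to check numerically. A small sketch (the helper name is my own invention): the point $(1,0,1)$ would mean certainty about both the $X$ and $Z$ tests at once, and sure enough it falls outside the ball.

```python
def is_valid_preparation(x, y, z):
    # Physical qubit preparations are exactly the points (x, y, z)
    # in the closed unit ball: x^2 + y^2 + z^2 <= 1.
    return x**2 + y**2 + z**2 <= 1.0

print(is_valid_preparation(0, 0, 1))  # True: certainty about the Z test
print(is_valid_preparation(1, 0, 1))  # False: certainty about X and Z at once
```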

We are now well-poised to improve upon the language in the news stories. The point that specifies the preparation of a qubit can be at the North Pole $(0,0,1)$, the South Pole $(0,0,-1)$, or anywhere in the ball between them. We have a whole continuum of ways to be intermediate between completely confident that the $Z$ test will yield $+1$ (all the way north) and completely confident that it will yield $-1$ (all the way south).

Now, there are other things one can do to a qubit. For starters, there are other binary measurements beyond just the $X$, $Y$ and $Z$ tests. Any pair of points exactly opposite each other on the Bloch sphere defines a test, with each point standing for an outcome. The closer the preparation point is to an outcome point, the more probable that outcome. To be more specific, let’s write the preparation point as $(x,y,z)$ and the outcome point as $(x',y',z')$. Then the probability of getting that outcome given that preparation is $$P = \frac{1}{2}(1 + x x' + y y' + z z').$$
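In code, that rule reads as follows (a sketch; the function name is mine). Preparing at the North Pole and asking the $Z$ question gives certainty; asking the $X$ question from the same preparation gives even odds, just as the ball picture promised.

```python
def outcome_probability(prep, outcome):
    # Bloch-ball form of the qubit probability rule:
    # P = (1 + r . r') / 2, with r the preparation point
    # and r' the point standing for the measurement outcome.
    x, y, z = prep
    xp, yp, zp = outcome
    return 0.5 * (1 + x * xp + y * yp + z * zp)

print(outcome_probability((0, 0, 1), (0, 0, 1)))  # Z test at the North Pole: prints 1.0
print(outcome_probability((0, 0, 1), (1, 0, 0)))  # X test at the North Pole: prints 0.5
```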

An interesting conceptual thing has happened here. We have encoded the preparation of a qubit by a set of expected values, i.e., a set of probabilities. Consequently, all those late-night jazz-cigarette arguments over what probability means will spill over into the arguments about what quantum mechanics means. Moreover, and not unrelatedly, we can ask, “Why three probabilities? Why is it the Bloch sphere, instead of the Bloch disc or the Bloch hypersphere?” It would be perfectly legitimate, mathematically, to require probabilities for only two tests in order to specify a preparation point, or to require more than three. That would not be quantum mechanics; the fact that three coordinates are needed to nail down the preparation of the simplest possible system is a structural fact of quantum theory. But is there a deeper truth from which that can be deduced?

One could go in multiple directions from here: What about tests with more than two outcomes? Systems composed of more than one qubit? Very quickly, the structures involved become more difficult to visualize, and familiarity with linear algebra — eigenvectors, eigenvalues and their friends — becomes a prerequisite. People have also tried a variety of approaches to understand what quantum theory might be derivable from. Any of those topics could justify something in between a blog post and a lifetime of study.

• E. Rieffel and W. Polak, Quantum Computing: A Gentle Introduction (MIT Press, 2011), chapter 2
• J. Rau, Quantum Theory: An Information Processing Approach (Oxford University Press, 2021), section 3.3
• M. Weiss, “Python tools for the budding quantum bettabilitarian” (2022)

# Complex Equiangular Lines: The Unusual Shapes of Quantum Physics

How many intersecting lines can you draw such that the angle made by any pair is the same as the angle made by any other pair? What if you try in 3 dimensions, or 4, or 5? What if you let your coordinates become complex numbers? And what does all this have to do with quantum probability?!

As the kids say, “Like, quantum and subscribe!”

# New Textbook

B. C. Stacey, A First Course in the Sporadic SICs. SpringerBriefs in Mathematical Physics volume 41 (2021).

This book focuses on the Symmetric Informationally Complete quantum measurements (SICs) in dimensions 2 and 3, along with one set of SICs in dimension 8. These objects stand out in ways that have earned them the moniker of “sporadic SICs”. By some standards, they are more approachable than the other known SICs, while by others they are simply atypical. The book uses them as examples for forays into quantum information theory and explores their connections with other exceptional objects like the Leech lattice and integral octonions. The sporadic SICs take readers from the classification of finite simple groups to Bell’s theorem and the discovery that “hidden variables” cannot explain away quantum uncertainty.

While no one department teaches every subject to which the sporadic SICs pertain, the topic is approachable without too much background knowledge. The book includes exercises suitable for an elective at the graduate or advanced undergraduate level.

ERRATA:

In the preface, on p. v, there is a puzzling appearance of “in references [77–80]”. This is due to an error in the process of splitting the book into chapters available for separate downloads. These references are arXiv:1301.3274, arXiv:1311.5253, arXiv:1612.07308 and arXiv:1705.03483.

Page 6: “5799” should be “5779” (76 squared plus 3), and M. Harrison should be added to the list of co-credited discoverers. The most current list of known solutions, exact and numerical, is to my knowledge this presentation by Grassl.

Page 58: “Then there are 56 octavians” should be “Then there are 112 octavians”.

# Thoughts on “Relational Quantum Mechanics”

Recently, the far-flung QBism discussion group nominally centered at UMass Boston has been conversing about Carlo Rovelli’s relational interpretation of quantum mechanics. Trying to think all this through halfway clearly, I wrote some notes. They don’t seem to be moving in the direction of a paper, and they’re too chatty for the arXiv even by my standards, so this seems the best place to host them.

EDIT TO ADD (8 September): To my surprise, I was able to edit those notes in the direction of being a paper. A few items came out after my post which lifted the burden of discussing certain topics and let a theme come together. Accordingly, see arXiv:2109.03186.

# Canonical Probabilities by Directly Quantizing Thermodynamics

I’ve had this derivation kicking around a while, and today seemed like as good a day as any to make a fuller write-up of it:

• B. C. Stacey, “Canonical probabilities by directly quantizing thermodynamics” (PDF).

The idea is that Boltzmann’s rule $p(E_n) \propto e^{-E_n / k_B T}$ pops up really naturally when you ask for a rule that plays nicely with the composing-together of uncorrelated systems. This, in turn, gives a convenient expression to the idea that classical physics is what you get when you handle quantum systems sloppily.
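The composition property at the heart of the argument is easy to check numerically. In this sketch (the helper is my own, not from the paper), two uncorrelated systems are composed by adding their energies, and the joint Boltzmann probabilities factor into the product of the marginals — exactly the sense in which the exponential rule “plays nicely” with composition.

```python
import math

def boltzmann_probs(energies, kT):
    # Boltzmann's rule: weights proportional to exp(-E / kT),
    # normalized so the probabilities sum to 1.
    weights = [math.exp(-E / kT) for E in energies]
    Z = sum(weights)
    return [w / Z for w in weights]

levels_a = [0.0, 1.0]
levels_b = [0.0, 2.0]
pA = boltzmann_probs(levels_a, kT=1.0)
pB = boltzmann_probs(levels_b, kT=1.0)

# Joint system: energies add, and the joint distribution factorizes.
joint = boltzmann_probs([Ea + Eb for Ea in levels_a for Eb in levels_b], kT=1.0)
for i, pa in enumerate(pA):
    for j, pb in enumerate(pB):
        assert abs(joint[2 * i + j] - pa * pb) < 1e-12
print("joint Boltzmann distribution factorizes into marginals")
```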

# More on Bohr

This post carries further on in the vein of my earlier writings on how the way most physicists talk about “the Copenhagen interpretation of quantum mechanics” is largely ahistorical.

It’s common to present “the Copenhagen interpretation” as a kind of dynamical collapse model, in which wavefunctions are ontic entities (like a sophomore’s picture of the electromagnetic field) that evolve according to the Schrödinger equation, except in moments of “measurement” that take place in unspecified conditions. This portrayal is typically intended to make “the Copenhagen interpretation” sound like a mutant form of Newtonian mechanics where $F = ma$ almost always, except at peculiar instants when $F$ suddenly becomes $ma/2$ and then switches back again. Of course, this is abhorrent and pathological.

When I was a child, my parents bought me a magnet from a museum gift shop. It had a long handle, likely made deliberately to resemble a magic wand, and as educational toys go, it served its function, since I went around poking all sorts of things to see if the magnet would grab them. I suspect this is a common enough type of learning experience. One discovers, for example, that it will pick up paperclips but not pennies. Having calibrated one’s understanding of the magnet, one can then use it as a tool — say, by telling which of two matchboxes is filled with paperclips, or that something is different about a wire coil connected to a battery versus one that is not.

What concerned Bohr himself was that this transition — between the calibration phase, when an object is under scrutiny, and its later use as a laboratory instrument — is conceptually nontrivial. First a lens is a strangely curved block of glass we must work to comprehend, and then it is a means to overthrow Aristotle. There are not two different dynamical laws, but two different languages.

Here’s how John Wheeler put it:

“Bohr stresses […] that the stick we hold can itself be an object of investigation, as when we run our fingers over its surface. The same stick, when grasped firmly and used to explore something else, becomes an extension of the observer or—when we depersonalize—a part of the measuring equipment. As we withdraw the stick from the one role, and recast it in the other role, we transpose the line of demarcation from one end of it to the other. The distinction between the probed and the probe, so evident at this scale of the everyday, is the without-which-nothing of every elementary phenomenon, of every closed quantum process.”

[From “Law Without Law”, in the Wheeler–Zurek collection, p. 206]

The commonalities and contrasts with QBism should be evident enough. Extension of the observer, yes; depersonalize to mere dead “equipment”, no, for it is the latter move that gets one into trouble with Wigner’s Friend. And, on a perhaps more practical level where the choice of research problems is concerned, Bohr takes the quantum formalism pretty much as given and leaves “the quantum principle” not explicitly defined.

It may also be illustrative to consider how Rovelli’s “Relational Quantum Mechanics” treats this point. I tentatively infer that Rovelli thinks giving a special role to an agent means imposing two different dynamical laws, one for systems of agent-type and another for all nonagent physical entities. Even if he doesn’t spell it out, that seems to be the mindset he operates with, and the background he relies upon. Of course, he balks at that dichotomy. I would, too!

# Underappreciated

Some time ago, I had one of those odd little thoughts that could be the spark of an essay. But in this particular case, the point I wanted to make felt like it could be made most clearly by demonstration, rather than explication. So, I wrote a concise report on “An Underappreciated Exchange in the Bohr–Einstein Debate.” Judging by the modest splash of positive e-mail that I received after posting it, I think I layered the whimsy and the serious point adequately well.

# My 2019 in Science

First, of course, there was the doubt and the pain.

Let’s talk about the papers I managed to get out the door and into public view. In retrospect, the list is pleasingly not insubstantial:

There was also From Gender to Gleason, my review of Adam Becker’s book What Is Real? (2018). By the time I was done, it was as lengthy as a paper, but the arXiv isn’t really a host for book reviews, so I just posted it here at Sunclipse and moved on.

# On Being a Quantum Physicist in Autumn 2019

(a friendly warning for police violence, transphobia and philosophy of physics)

The way I see it, the two big Why? questions about quantum mechanics are, first, why do we use the particular mathematical apparatus of quantum theory, as opposed to any alternative we might imagine? And second, why do we only find it necessary to work with the full perplexities of quantum physics some of the time? These two questions are related. In order to understand how imprecise measurements might wash out quantum weirdness, we need to characterize which features of quantum theory really are fundamentally weird. And this, in turn, requires separating deep principles from convenient conventions and illuminating the true core of the physics. My own research has focused on the first question, but the second is never too far from my mind.

Of course, I have a lot on my mind these days, but I don’t think I’m special in that regard.

If you ask me, a “quantum system” can be any part of nature that is subject to an agent’s inquiry. A “quantum measurement” is, in principle, any action that an agent takes upon a quantum system. The road between Boston’s City Hall and the Holocaust Memorial is a quantum system. When the police use their bicycles as battering rams against queer kids and street medics, running towards the trouble is a quantum measurement. Being threatened with pepper spray, while secondhand exposure already stings the eye and throat, one human thrown to the pavement in the intersection in front of you while another arrest happens on the sidewalk just behind you, is an outcome of that measurement. Unsurprisingly, textbooks provide little guidance on casting that event into the algebraic formalism of density matrices, and in the moment, other types of expertise are more immediately useful.

I first encountered quantum physics in a serious way during the spring of my second year at university — 2003, that would have been. I did not particularly care about the conceptual or philosophical “foundations” of it until the summer of 2010. The interval in between encompassed six semesters of quantum mechanics and subjects dependent upon it, along with my first attempts to find a research problem in the area. Once my curiosity had been provoked, it took the better part of a year to find an “interpretation” of quantum mechanics that was at all satisfying, and longer than that to make the transition from “this is how a member of that school would answer that question” to “this is what I declare myself”. Part of that transition was my discovery that I could put my own stamp on the ideas: The concepts and the history provoked new mathematical questions, which I could approach with a background that nobody else had.

The interpretation I adopted was the QBism of Chris Fuchs and Rüdiger Schack, later joined by N. David Mermin.

QBism is

an interpretation of quantum mechanics in which the ideas of agent and experience are fundamental. A “quantum measurement” is an act that an agent performs on the external world. A “quantum state” is an agent’s encoding of her own personal expectations for what she might experience as a consequence of her actions. Moreover, each measurement outcome is a personal event, an experience specific to the agent who incites it. Subjective judgments thus comprise much of the quantum machinery, but the formalism of the theory establishes the standard to which agents should strive to hold their expectations, and that standard for the relations among beliefs is as objective as any other physical theory.

That’s how we put it in the FAQ. Any physicist who is weird enough to endorse an interpretation of quantum mechanics will naturally get inquiries about it. Many of these, we get often enough that we try to compile good answers together into a nicely portable package — with the proviso that the quantum is a project, and some answers are not final because if physics were easy, we’d be done by now.

There’s a question which seems particularly suited to answering in the blog format, though: “Why don’t you believe in the Many Worlds Interpretation?”

# Concerning Wigner’s Former Roommate

I attended a workshop on the mini-genre of Extended Wigner’s Friend “paradoxes” but did not think that I’d write much on the topic myself. And, indeed, the comment I eventually produced is mostly bibliography.

B. C. Stacey, “On QBism and Assumption (Q)” [arXiv:1907.03805].

I correct two misapprehensions, one historical and one conceptual, in the recent literature on extensions of the Wigner’s Friend thought-experiment. Perhaps fittingly, both concern the accurate description of some quantum physicists’ beliefs by others.

Also available via SciRate.

# On Reconstructing the Quantum

It’s manifesto time! “Quantum Theory as Symmetry Broken by Vitality” [arXiv:1907.02432].

I summarize a research program that aims to reconstruct quantum theory from a fundamental physical principle that, while a quantum system has no intrinsic hidden variables, it can be understood using a reference measurement. This program reduces the physical question of why the quantum formalism is empirically successful to the mathematical question of why complete sets of equiangular lines appear to exist in complex vector spaces when they do not exist in real ones. My primary goal is to clarify motivations, rather than to present a closed book of numbered theorems, and consequently the discussion is more in the manner of a colloquium than a PRL.

Also available via SciRate.

# New Paper Dance

Another solo-author outing by me: “Invariant Off-Diagonality: SICs as Equicoherent Quantum States” [arXiv:1906.05637].

Coherence, treated as a resource in quantum information theory, is a basis-dependent quantity. Looking for states that have constant coherence under canonical changes of basis yields highly symmetric structures in state space. For the case of a qubit, we find an easy construction of qubit SICs (Symmetric Informationally Complete POVMs). SICs in dimension 3 and 8 are also shown to be equicoherent.

Also available via SciRate.
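For the qubit case, the SIC condition can be verified directly from the Bloch-ball picture: the four states sit at the vertices of a regular tetrahedron inscribed in the Bloch sphere, and the rule $P = (1 + r \cdot r')/2$ gives every distinct pair the same overlap, $1/3$. A quick numerical check (a sketch of that geometric fact, not the paper’s construction):

```python
import itertools
import math

# Bloch vectors of the four qubit SIC states: vertices of a
# regular tetrahedron inscribed in the unit sphere.
s = 1 / math.sqrt(3)
tetrahedron = [(s, s, s), (s, -s, -s), (-s, s, -s), (-s, -s, s)]

def overlap(r1, r2):
    # Transition probability between two pure qubit states,
    # written in terms of their Bloch vectors: (1 + r1 . r2) / 2.
    return 0.5 * (1 + sum(a * b for a, b in zip(r1, r2)))

# Every distinct pair of SIC states overlaps with probability 1/3.
for r1, r2 in itertools.combinations(tetrahedron, 2):
    assert abs(overlap(r1, r2) - 1 / 3) < 1e-12
print("all pairwise overlaps equal 1/3")
```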

# From Gender to Gleason

… or, The Case of Adam Becker’s What Is Real? (2018).

It is easy to argue that the founders of quantum mechanics made statements which are opaque and confusing. It is fair to say that their philosophical takes on the subject are not infrequently unsatisfying. We can all use reminders that human flaws and passions are a part of physics. So, it would be nice to have a popular book on these themes, one that makes no vital omissions, represents its sources accurately and lives up to its own ideals.

# In Re “Copenhagen” and “Collapse”

I was having an e-mail conversation the other day with a friend from olden days — another MIT student who made it out with a physics degree the same year I did — and that led me to set down some thoughts about history and terminology that may be useful to share here.

My primary claim is the following:

We should really expunge the term “the Copenhagen interpretation” from our vocabularies.

What Bohr thought was not what Heisenberg thought, nor was it what Pauli thought; there was no single unified “Copenhagen interpretation” worthy of the name. Indeed, the term does not enter the written literature until the 1950s, and that was mostly due to Heisenberg acting like he and Bohr were more in agreement back in the 1920s than they actually had been.

For Bohr, the “collapse of the wavefunction” (or the “reduction of the wave packet”, or whatever you wish to call it) was not a singular concept tacked on to the dynamics, but an essential part of what the quantum theory meant. He considered any description of an experiment as necessarily beginning and ending in “classical language”. So, for him, there was no problem with ending up with a measurement outcome that is just a classical fact: You introduce “classical information” when you specify the problem, so you end up with “classical information” as a result. “Collapse” is not a matter of the Hamiltonian changing stochastically or anything like that, as caricatures of Bohr would have it, but instead, it’s a question of what writing a Hamiltonian means. For example, suppose you are writing the Schrödinger equation for an electron in a potential well. The potential function $V(x)$ that you choose depends upon your experimental arrangement — the voltages you put on your capacitor plates, etc. In the Bohrian view, the description of how you arrange your laboratory apparatus is in “classical language”, or perhaps he’d say “ordinary language, suitably amended by the concepts of classical physics”. Getting a classical fact at your detector is just the necessary flipside of starting with a classical account of your source.

(Yes, Bohr was the kind of guy who would choose the yin-yang symbol as his coat of arms.)

To me, the clearest expression of all this from the man himself is a lecture titled “The causality problem in atomic physics”, given in Warsaw in 1938 and published in the proceedings, New Theories in Physics, the following year. This conference is notable for several reasons, among them the fact that Hans Kramers, speaking both for himself and on behalf of Heisenberg, suggested that quantum mechanics could break down at high energies. More than a decade after what we today consider the establishment of the quantum theory, the pioneers of it did not all trust it in their bones; we tend to forget that nowadays.

As to how Heisenberg disagreed with Bohr, and what all this has to do with decoherence, I refer to Camilleri and Schlosshauer.

Do I find the Bohrian position that I outlined above satisfactory? No, I do not. Perhaps the most important reason why, the reason that emotionally cuts the most deeply, is rather like the concern which Rudolf Haag raised while debating Bohr in the early 1950s:

I tried to argue that we did not understand the status of the superposition principle. Why are pure states described as [rays] in a complex linear space? Approximation or deep principle? Niels Bohr did not understand why I should worry about this. Aage Bohr tried to explain to his father that I hoped to get inspiration about the direction for the development of the theory by analyzing the existing formal structure. Niels Bohr retorted: “But this is very foolish. There is no inspiration besides the results of the experiments.” I guess he did not mean that so absolutely but he was just annoyed. […] Five years later I met Niels Bohr in Princeton at a dinner in the house of Eugene Wigner. When I drove him afterwards to his hotel I apologized for my precocious behaviour in Copenhagen. He just waved it away saying: “We all have our opinions.”

Why rays? Why complex linear space? I want to know too.

# Sporadic SICs and exceptional Lie algebras

A while back, I had a bit of a sprawling conversation about certain geometrical oddities over multiple threads at the n-Category Café. I finally got organized enough to gather these notes together, incorporating edits for clarity and recording one construction I haven’t found written in the literature anywhere.

Sometimes, mathematical oddities crowd in upon one another, and the exceptions to one classification scheme reveal themselves as fellow-travelers with the exceptions to a quite different taxonomy.

UPDATE (30 March 2019): Thanks to a kind offer by John Baez, we’re going through this material step-by-step over at a blog with a community, the n-Category Café:

• Part 1: Definitions and preliminaries
• Part 2: Qutrits and E6
• Part 3: The Hoggar lines, E7 and E8

# Triply Positive Matrices

One more paper to round out the year!

J. B. DeBrota, C. A. Fuchs and B. C. Stacey, “Triply Positive Matrices and Quantum Measurements Motivated by QBism” [arXiv:1812.08762].

We study a class of quantum measurements that furnish probabilistic representations of finite-dimensional quantum theory. The Gram matrices associated with these Minimal Informationally Complete quantum measurements (MICs) exhibit a rich structure. They are “positive” matrices in three different senses, and conditions expressed in terms of them have shown that the Symmetric Informationally Complete measurements (SICs) are in some ways optimal among MICs. Here, we explore MICs more widely than before, comparing and contrasting SICs with other classes of MICs, and using Gram matrices to begin the process of mapping the territory of all MICs. Moreover, the Gram matrices of MICs turn out to be key tools for relating the probabilistic representations of quantum theory furnished by MICs to quasi-probabilistic representations, like Wigner functions, which have proven relevant for quantum computation. Finally, we pose a number of conjectures, leaving them open for future work.

This is a sequel to our paper from May, and it contains one minor erratum for an article from 2013.