A dilemma of the auto-didact

Do publishers imagine that there are no people who could teach themselves particle physics? Why else would they conceive cheap preliminary textbooks and ridiculously expensive advanced ones? Learning vector physics for classical mechanics costs Rs. 245, while progressing from there to analytical mechanics costs Rs. 4,520. Does the cost barrier exist because the knowledge is more specialized? If that were the case, such books should have become cheaper over time. They have not: Analytical Mechanics, which a good friend recommended, has stayed in the vicinity of $75 for the last three years (right now, it’s $78.67 for the original paperback and $43 for a used copy). This is just a handy example. There are a host of textbooks that detail concepts in advanced physics and cost a fortune: all you have to do is look for those that contain “hadron”, “accelerator”, “QCD”, etc., in their titles.

Getting to the point where a student is capable of understanding these subjects is cheap. In other words, the cost of aspiration is low while the price of execution is prohibitive.

Sure, alternatives exist, such as libraries and university archives. However, that misses the point: it seems the books are priced high precisely to prevent their ubiquitous consumption. No other reason seems evident, although I am loath to reach this conclusion. If you, the publisher, want me to read such books only in universities, then you are effectively requiring me either to abstain from reading them altogether if my professional interests lie elsewhere, or to depend on universities and university-publisher relationships, rather than on myself, for my progress in advanced physics. The resulting gap between the layman and the specialist eventually becomes unbridgeable, leading to ridiculous results ranging from not understanding the “God” in “God particle” to questioning the necessity of the LHC without quite understanding what it does and how that helps mankind.

The Indian Bose in the universal boson

Read this article.

Do you think Indians are harping too much about the lack of mention of Satyendra Nath Bose’s name in the media coverage of the CERN announcement last week? The articles in Hindustan Times and Economic Times seemed to be taking things too far with anthropological analyses that have nothing to do with Bose’s work. The boson was so named around 1945 by the great Paul Dirac to commemorate Bose’s work with Einstein. Much has happened since; why would we want to celebrate the Bose in the boson again and again?

Dr. Satyendra Nath Bose

The stage now belongs to the ATLAS and the CMS collaborations, and to Higgs, Kibble, Englert, Brout, Guralnik, and Hagen, and to physics itself as a triumph of worldwide cooperation in the face of many problems. Smarting because an Indian’s mention was forgotten is jejune. Then again, this is mostly the layman and the media, because the physicists I met last week seemed to fully understand Bose’s contribution to the field itself, instead of counting the frequency of his name’s mention.

Priyamvada Natarajan, as she writes in the Hindustan Times, is wrong (and the Economic Times article’s heading is just irritating). That Bose is not a household name like Einstein’s is not because of post-colonialism – the exceptions are abundant enough to warrant inclusion – but because we place too much faith in a name instead of remembering what the man behind the name did for physics.

Ramblings on partons

When matter and anti-matter meet, they annihilate each other in a “flash” of energy. Usually, this release of energy is in the form of high-energy photons, or gamma rays, which are then detected, analysed, and interpreted to understand the collision’s other properties. In nature, however, matter/anti-matter collisions are ultra-rare, if not altogether non-existent, because of the scarcity of anti-matter.
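To put a number on that “flash” for the simplest case – an electron and a positron annihilating at rest – each of the two photons carries away E = m_e c², about 511 keV. A quick sketch of the arithmetic, using standard constants:

```python
# Energy released when an electron and a positron annihilate at rest.
# Each of the two photons carries away E = m_e * c^2.

m_e = 9.109_383_7015e-31   # electron mass, kg
c   = 2.997_924_58e8       # speed of light, m/s
eV  = 1.602_176_634e-19    # joules per electronvolt

photon_energy_J   = m_e * c**2
photon_energy_keV = photon_energy_J / eV / 1e3

print(f"Each photon:    {photon_energy_keV:.1f} keV")   # ~511 keV
print(f"Total released: {2 * photon_energy_keV:.1f} keV")
```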

Such annihilation processes are important not just to supplement our understanding of particle physics but also because they play a central role in the design of hadron colliders. Such colliders use strongly interacting particles (the superficial definition of hadrons), such as protons and neutrons, and smash them into each other. The target particles, depending on experimental necessities, may be stationary – in which case the collider is said to employ a fixed target – or moving. The Large Hadron Collider (LHC) is the world’s largest and most powerful hadron collider, and it uses moving targets, i.e., both the incident and target hadrons are moving toward each other.

Currently, it is known that a hadronic collision is explicable in terms of the hadrons’ constituent particles, quarks and gluons. Quarks are the snowcloned fundamental building blocks of all matter, and gluons are particles that allow two quarks to “stick” together, behaving like glue. More specifically, gluons mediate the strong force (one of the four fundamental forces of nature): in other words, quarks interact by exchanging gluons.

Parton distribution functions

Before the quark-gluon model was established, a hadronic collision was broken down in terms of hypothetical particles called partons, an idea suggested by Richard Feynman in 1969. At very high energies – such as the ones at which collisions occur at the LHC – the parton model, which approximates the hadrons as collections of point-like targets, is formulated in terms of parton distribution functions (PDFs). PDFs, in turn, allow for the prediction of the composition of the debris resulting from the collisions. Theoretical calculations pertaining to different collision environments and outcomes are used to derive different PDFs for each process, which are then used by technicians to design hadron colliders accordingly.

(If you can work on FORTRAN, here are some PDFs to work with.)
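Since real PDF sets live in external libraries, here is only a toy sketch of what a parton distribution looks like numerically: a made-up valence-quark density of the commonly used shape x f(x) ∝ x^a (1 − x)^b, normalized so that the “proton” contains exactly two such quarks. The exponents are invented purely for illustration, not taken from any fitted set.

```python
import numpy as np

# Toy valence-quark distribution: x*f(x) = N * x**a * (1 - x)**b.
# The exponents are illustrative only, not from any fitted PDF set.
a, b = 0.5, 3.0

x  = np.linspace(1e-4, 1.0 - 1e-4, 200_000)
dx = x[1] - x[0]

unnormalised_f = x**(a - 1.0) * (1.0 - x)**b   # f(x) itself, since x*f(x) = x**a * (1-x)**b

# Fix N so the "number sum rule" ∫ f(x) dx = 2 holds: the toy proton
# contains exactly two valence quarks of this kind.
N = 2.0 / np.sum(unnormalised_f * dx)
f = N * unnormalised_f

print("number sum   ∫ f(x) dx   ≈", round(float(np.sum(f * dx)), 3))       # ≈ 2.0
print("momentum sum ∫ x f(x) dx ≈", round(float(np.sum(x * f * dx)), 3))   # momentum fraction carried
```

The momentum-sum line is the kind of bookkeeping real PDF fits must respect: all the partons together carry exactly the proton’s momentum.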

Once the quark-gluon model was in place, there were no significant deviations from the parton model. At the same time, because quarks have a corresponding anti-matter “form”, anti-quarks, a model had to be developed that could study quark/anti-quark collisions during the course of a hadronic collision, especially one that could factor in the production of pairs of leptons during such collisions. Such a model was developed by Sidney Drell and Tung-Mow Yan in 1970, and was called the Drell-Yan (DY) process; it was further complemented by a phenomenon called Bjorken scaling (Bsc).

(In Bsc, when the energy of an incoming lepton is sufficiently high during a collision process, the cross-section available for collision becomes independent of the lepton’s momentum. In other words, a lepton, say an electron, at very high energies interacts with a hadron not as if the latter were a single particle but as if it were composed of point-like targets called partons.)

In a DY process, a quark from one hadron collides with an anti-quark from another hadron; they annihilate each other to produce a virtual photon (γ*). The γ* then decays to form a dilepton pair, which, if we treat it as a single entity instead of as a pair, can be said to have a mass M.
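Concretely, M is the invariant mass of the pair, computed from the two leptons’ energies and momenta. A minimal sketch with made-up four-momenta (the numbers carry no physical significance; they are only there to show the formula):

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of two particles given four-momenta (E, px, py, pz) in GeV."""
    E  = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Two illustrative (made-up) lepton four-momenta, in GeV.
lepton1 = (45.0,  20.0,  30.0, 26.9)
lepton2 = (50.0, -18.0, -33.0, 32.7)

print(f"Dilepton invariant mass M ≈ {invariant_mass(lepton1, lepton2):.1f} GeV")
```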

Now, if M is large, then Heisenberg’s uncertainty principle tells us that the time of interaction between the quark/anti-quark pair must have been small, essentially limiting its interaction with any other partons in the colliding hadrons. Meanwhile, over a timeframe that is long in comparison to the timescale of the annihilation, the other, spectator, partons rearrange themselves into the resultant hadrons. However, in most cases, it is the dilepton that is detected and momentum-analysed, not the properties of the outgoing hadrons. The DY process results in the production of dilepton pairs at finite energies, but these energies are very closely spaced, resulting in an energy band, or continuum, within which a dilepton pair might be produced.
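To attach a rough number to that “small time of interaction”: the energy-time uncertainty relation gives a timescale of order ħ/(Mc²), so the heavier the virtual state, the more fleeting it is. A quick sketch, with a few illustrative masses chosen only to show the trend:

```python
# Rough interaction timescale from the energy-time uncertainty relation:
# delta_t ~ hbar / (M c^2). Larger M => shorter timescale.

hbar_GeV_s = 6.582_119_569e-25   # reduced Planck constant, in GeV*s

for M_GeV in (1.0, 10.0, 100.0):          # illustrative masses only
    dt = hbar_GeV_s / M_GeV
    print(f"M = {M_GeV:6.1f} GeV  ->  delta_t ~ {dt:.2e} s")
```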

In quantum chromodynamics and quark-parton transitions

Quark/anti-quark annihilation is of special significance in quantum chromodynamics (QCD), which studies the colour force – the force between quarks, anti-quarks, and gluons – inside hadrons. The strong field that gluons mediate is, in quantum mechanical terms, called the colour field. Unlike QED (quantum electrodynamics) or classical mechanics, QCD allows for two strange kinds of behaviour from quarks and gluons. The first, called confinement, holds that the force between two interacting quarks does not diminish as they are separated. This doesn’t mean that quarks are strongly interacting at large distances! No, it means that once two quarks have come together, no amount of energy can take them apart. The second, called asymptotic freedom (AF), holds that quarks and gluons interact weakly at high energies.

(If you think about it, colour confinement implies that gluons can emit gluons, and as the separation between two quarks increases, the rate of gluon emission also increases. Axiomatically, as the separation decreases – or, equivalently, as the relative four-momentum squared increases – the force holding the quarks together decreases monotonically in strength, leading to asymptotic freedom.)
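Asymptotic freedom can be put in numbers with the standard one-loop running of the strong coupling. The sketch below assumes α_s(M_Z) ≈ 0.118 and, for simplicity, a fixed five active quark flavours; both are reasonable but approximate inputs, and the point is only that α_s shrinks as the energy scale Q grows.

```python
import math

# One-loop running of the strong coupling alpha_s, illustrating
# asymptotic freedom: alpha_s(Q) falls as the energy scale Q rises.
alpha_s_MZ = 0.118          # alpha_s at the Z mass (approximate value)
M_Z        = 91.19          # GeV
n_f        = 5              # active quark flavours (held fixed here for simplicity)
b0         = (33 - 2 * n_f) / (12 * math.pi)

def alpha_s(Q):
    return alpha_s_MZ / (1 + alpha_s_MZ * b0 * math.log(Q**2 / M_Z**2))

for Q in (10, 91.19, 1000, 7000):   # energy scales in GeV
    print(f"Q = {Q:7.2f} GeV  ->  alpha_s ≈ {alpha_s(Q):.3f}")
```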

The definitions for both properties are deeply rooted in experimental ontology: colour-confinement was chosen to explain the consistent failure of free-quark searches, while asymptotic freedom doesn’t yield any phase-transition line between high- and low-energy scales while still describing a property transition between the two scales. Therefore, the DY process seemed well-poised to provide some indirect proof for the experimental validity of QCD if some relation could be found between the collision cross-section and the particles’ colour-charge, and this is just what was done.

The QCD factorization theorem can be read (schematically) as:

F_i(x, Q^2) = \sum_a f_a(x, \mu) \otimes \hat{\sigma}_i^{a}\big(x, Q^2, \alpha_s(\mu)\big)

where ⊗ denotes a convolution over the parton’s momentum fraction. Here, α_s(μ) is the effective chromodynamic (quark-gluon-quark) coupling at a factorization scale μ. Further, f_a(x, μ) defines the probability of finding a parton a within a nucleon with the Bjorken scaling variable x at the scale μ. Also, \hat{\sigma}_i^{a} is the hard-scattering cross-section of the electroweak vector boson on the parton. The physical implication is that the measurable nucleonic structure function is obtained by folding, for each kind of parton, the probability of finding that parton inside the nucleon with the cross-section for the hard scattering off it, and then summing over all the partons.
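To make that sum-and-fold structure concrete, here is a small numerical sketch. Everything in it – the toy parton densities and the toy hard-scattering cross-section – is invented purely to show how the pieces combine; it does not model any real process or PDF set.

```python
import numpy as np

# Toy illustration of the factorization structure:
#   sigma = sum over partons a of  ∫ dx  f_a(x, mu) * sigma_hat_a(x)
# The densities and the hard cross-section below are invented, chosen
# only for their rough shapes (rising at small x, vanishing as x -> 1).

x  = np.linspace(1e-3, 1.0 - 1e-3, 100_000)
dx = x[1] - x[0]

toy_pdfs = {
    "u": 2.0 * x**-0.5 * (1 - x)**3,   # toy up-quark density
    "d": 1.0 * x**-0.5 * (1 - x)**4,   # toy down-quark density
    "g": 3.0 * x**-1.1 * (1 - x)**5,   # toy gluon density
}

def sigma_hat(x):
    # Toy parton-level cross-section (arbitrary units), largest at small x.
    return 1.0 / (1.0 + 100.0 * x)

sigma = sum(np.sum(f * sigma_hat(x) * dx) for f in toy_pdfs.values())
print(f"toy hadronic cross-section (arbitrary units): {sigma:.3f}")
```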

This scaling behaviour enabled by QCD makes possible predictions about future particle phenomenology.

Putting particle physics research to work

In the whole gamut of comments regarding the Higgs boson, there is a depressingly large number decrying the efforts of the ATLAS and CMS collaborations. Why? Because a lot of people think the Large Hadron Collider (LHC) is a yawning waste of time and money, an investment that serves mankind no practical purpose.

Well, here and here are some cases in point that demonstrate the practical good that the LHC has made possible in the material sciences. Another big area of application is medical diagnostics: one article makes the point with the hunt for the origin of Alzheimer’s, and another with the very similar technology shared by particle accelerators and medical imaging devices, as well as its uses in meteorology, VLSI, large-scale networking, cryogenics, and X-ray spectroscopy.

Moving on to more germane applications: arXiv has reams of papers that discuss the deployment of

… amongst others.

The LHC, above all else, is the brainchild of the European Organization for Nuclear Research, popularly known as CERN. These guys invented the World Wide Web, developed the first touch-screen devices, and pioneered the earliest high-energy medical imaging techniques.

With experiments like those being conducted at the LHC, it’s easy to forget every other development in such laboratories apart from the discovery of much-celebrated particles. All the applications I’ve linked to in this post were conceived by scientists working with the LHC – proof, if nothing else, that everyone, from the man whose tax money pays for these giant labs to the man who uses that money to work in them, is mindful of practical concerns.

Gunning for the goddamned: ATLAS results explained

Here are some of the photos from the CERN webcast yesterday (July 4, Wednesday), with an adjoining explanation of the data presented in each one and what it signifies.

This first image shows the data accumulated post-analysis of the diphoton decay mode of the Higgs boson. In simpler terms, physicists first put together all the data they had that resulted from previously known processes. This constituted what’s called the background. Then, they looked for signs of any particle that seemed to decay into two energetic photons, or gamma rays, in a specific energy window; in this case, 100-160 GeV.

Finally, knowing how the number of events would vary in a scenario without the Higgs boson, a curve was plotted that fit the data: the number of events at each energy level vs. the energy level itself. This way, a bump in the curve during measurement would mean there was a particle previously unaccounted for that was causing an excess of diphoton decay events at a particular energy.
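The logic of that “bump hunt” can be sketched in a few lines of Python. Every number below is invented; it only mimics the shape of such a search – a smoothly falling background with a small excess near 125 GeV – and the naive s/√b figure is not how the collaborations actually quote significance.

```python
import math

# Toy bump hunt: a smoothly falling background plus a small excess
# near 125 GeV. All numbers are invented for illustration.

def background(m):                  # expected background events per bin
    return 500.0 * math.exp(-(m - 100.0) / 20.0)

def signal(m):                      # a small Gaussian bump centred at 125 GeV
    return 30.0 * math.exp(-0.5 * ((m - 125.0) / 2.0) ** 2)

for m in range(100, 161, 5):        # scan the 100-160 GeV window
    b, s = background(m), signal(m)
    print(f"m = {m:3d} GeV: expected ≈ {b:6.1f}, 'observed' ≈ {b + s:6.1f}, "
          f"naive s/sqrt(b) ≈ {s / math.sqrt(b):.2f}")
```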

This is the plot of the mass of the particle being looked for (x-axis) versus the confidence level with which it has (or has not, depending on how you look at it) been excluded as an event to focus on. The dotted horizontal line, corresponding to μ = 1, marks off a 95% exclusion limit: any events registered above the line can be claimed as having been observed with “more than 95% confidence” (colloquial usage).

Toward the top-right corner of the image are some numbers. 7 TeV and 8 TeV are the values of the total energy going into each collision before and after March 2012, respectively. The beam energy was driven up to increase the incidence of decay events corresponding to Higgs-boson-like particles, which, given the extremely high energy at which they exist, are viciously short-lived. In experiments that were run between March and July, physicists at CERN reported an increase of almost 25-30% in such events.

The two other numbers indicate the particle accelerator’s integrated luminosity. In particle physics, luminosity is measured as the number of particles passing through a unit area per second. The integrated luminosity is the same value but accumulated over a period of time. In the case of the LHC, after the collision energy was ramped up, the luminosity, too, had to be increased: from about 4.7 fb-1 to 5.8 fb-1. You’ll want to Wiki the unit of area called barn. Some lighthearted physics talk there.
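What integrated luminosity buys you is event counts: the expected number of events of a given kind is the production cross-section multiplied by the integrated luminosity, N = σ × ∫L dt. A back-of-the-envelope sketch, assuming a Higgs production cross-section of roughly 20 pb at 8 TeV and a diphoton branching ratio of about 0.2% – both approximate figures plugged in only to show the arithmetic:

```python
# Expected event count: N = cross-section x integrated luminosity.
# The cross-section and branching ratio are rough, illustrative figures.

sigma_pb    = 20.0                    # assumed Higgs production cross-section at 8 TeV, picobarns
lumi_fb_inv = 5.8                     # integrated luminosity quoted above, inverse femtobarns
lumi_pb_inv = lumi_fb_inv * 1000.0    # 1 fb^-1 = 1000 pb^-1

n_produced = sigma_pb * lumi_pb_inv
n_diphoton = n_produced * 0.002       # assumed branching ratio to two photons, ~0.2%

print(f"Higgs bosons produced (rough estimate):  {n_produced:,.0f}")
print(f"... decaying to two photons (rough):     {n_diphoton:,.0f}")
```

The second number is why the diphoton search is a needle-in-a-haystack exercise: only a few hundred signal decays sit on top of a vastly larger background.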

In this plot, the y-axis on the left shows the chances of error, and the corresponding statistical significance on the right. When the chances of an error stand at 1, the results are not statistically significant at all because every observation is an error! But wait a minute, does that make sense? How can all results be errors? Well, when looking for one particular type of event, any event that is not this event is an error.

Thus, as we move toward the ~125 GeV mark, the number of statistically significant results shoots up drastically. Looking closer, we see two results registered just beyond the 5-sigma mark, where the chances of error are 1 in 3.5 million. This means that if the physicists created just those conditions that resulted in this >5σ (five-sigma) observation 3.5 million times, only once would a random fluctuation play impostor.

Also, notice how the difference between successive levels of statistical significance grows as the significance increases? In terms of chances of error: (5σ – 4σ) > (4σ – 3σ) > … > (1σ – 0σ). This means that the closer physicists get to a discovery, the exponentially more precise they must be!
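The mapping between “n sigma” and “chances of error” quoted throughout these posts is just the one-sided tail probability of a normal distribution, which you can check with nothing but the standard library:

```python
import math

def one_sided_p(n_sigma):
    """One-sided tail probability of a standard normal beyond n_sigma."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

for n in range(1, 6):
    p = one_sided_p(n)
    print(f"{n} sigma: p ≈ {p:.2e}  (about 1 in {1 / p:,.0f})")
# 5 sigma works out to roughly 1 in 3.5 million, as quoted above.
```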

OK, this is a graph showing the mass-distribution for the four-lepton decay mode, referred to as a channel by those working on the ATLAS and CMS collaborations (because there are separate channels of data-taking for each decay mode). The plotting parameters are the same as in the first plot in this post except for the scale of the x-axis, which goes all the way from 0 to 250 GeV. Now, between 120 GeV and 130 GeV, there is an excess of events (light blue). Physicists know it is an excess and not on par with expectations because theoretical calculations made after discounting a Higgs-boson-like decay event show that, in that 10 GeV window, only around 5.3 events are to be expected, as opposed to the 13 that turned up.
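To see why 13 events against an expected 5.3 counts as a genuine excess, one can ask how often a background of 5.3 events would fluctuate up to 13 or more on its own. A minimal Poisson sketch (it ignores systematic uncertainties, so treat it as indicative only):

```python
import math

def poisson_tail(k_obs, mean):
    """P(N >= k_obs) for a Poisson-distributed count with the given mean."""
    return 1.0 - sum(math.exp(-mean) * mean**k / math.factorial(k)
                     for k in range(k_obs))

expected, observed = 5.3, 13
p = poisson_tail(observed, expected)
print(f"Chance that background alone gives >= {observed} events: {p:.4f}")
```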

After the Higgs-boson-like particle, what’s next?

This article, written by me, appeared in print in The Hindu on July 5, 2012.

The ATLAS (A Toroidal LHC Apparatus) collaboration at CERN has announced the sighting of a Higgs boson-like particle in the energy window of 125.3 ± 0.6 GeV. The observation has been made with a statistical significance of 5 sigma. This means the chances of error in their measurements are 1 in 3.5 million, sufficient to claim a discovery and publish papers detailing the efforts in the hunt.

Rolf-Dieter Heuer, Director General of CERN since 2009, said at the special conference called by CERN in Geneva, “It was a global effort, it is a global effort. It is a global success.” He expressed great optimism and concluded the conference saying this was “only the beginning.”

With this result, collaborations at the Large Hadron Collider (LHC), the atom-smashing machine, have vastly improved on their previous announcement on December 13, 2011, where the chance of an error was 1-in-50 for similar sightings.

A screenshot from the Dec 13, 2011, presentation by Fabiola Gianotti, leader of the ATLAS collaboration, that shows a global statistical significance of 2.3 sigma, which translates to a 1-in-50 chance of the result being erroneous.

Another collaboration, called CMS (Compact Muon Solenoid), announced the mass of the Higgs-like particle with a 4.9 sigma result. While insufficient to claim a discovery, it does indicate only a one-in-two-million chance of error.

Joe Incandela, CMS spokesman, added, “We’re reaching into the fabric of the universe at a level we’ve never done before.”

The LHC will continue to run its experiments so that results revealed on Wednesday can be revalidated before it shuts down at the end of the year for maintenance. Even so, by 2013, scientists, such as Dr. Rahul Sinha, a participant of the Belle Collaboration in Japan, are confident that a conclusive result will be out.

“The LHC has the highest beam energy in the world now. The experiment was designed to yield quick results. With its high luminosity, it quickly narrowed down the energy-ranges. I’m sure that by the end of the year, we will have a definite word on the Higgs boson’s properties,” he said.

However, even though the Standard Model, the framework of all fundamental particles and the dominant explanatory model in physics today, predicted the particle’s existence, slight deviations have been observed in terms of the particle’s predicted mass. Even more: zeroing in on the mass of the Higgs-like particle doesn’t mean the model is complete when, in fact, it is far from it.

While an answer to the question of mass formation took 50 years to be reached, physicists are yet to understand many phenomena. For instance, why aren’t the four fundamental forces of nature equally strong?

The strong nuclear, weak nuclear, electromagnetic, and gravitational forces were born in the first few moments succeeding the Big Bang 13.75 billion years ago. Of these, the weak force is, for some reason, almost a billion trillion trillion times stronger than the gravitational force! This discrepancy, called the hierarchy problem, evades a Standard Model explanation.

In response, many theories were proposed. One, called supersymmetry (SUSY), proposed that every fermion, a particle with half-integer spin, is paired with a corresponding boson, a particle with integer spin. Particle spin is the intrinsic angular momentum that quantum mechanics attributes to a particle, loosely analogous to rotation around an axis.

Technicolor was the second framework. It rejects the Higgs mechanism, a process through which the Higgs field couples more strongly with some particles and more weakly with others, making them heavier and lighter, respectively.

Instead, it proposes a new form of interaction with initially-massless fermions. The short-lived particles required to certify this framework are accessible at the LHC. Now, with a Higgs-like particle having been spotted with a significant confidence level, the future of Technicolor seems uncertain.

However, “significant constraints” have been imposed on the validity of these and such theories, labeled New Physics, according to Prof. M.V.N. Murthy of the Institute of Mathematical Sciences (IMS), whose current research focuses on high-energy physics.

Some other important questions include why there is more matter than antimatter in this universe, why fundamental particles manifest in three generations and not more or fewer, and what the masses of the weakly interacting neutrinos are. State-of-the-art technology worldwide has helped physicists design experiments to study each of these problems better.

For example, the India-based Neutrino Observatory (INO), under construction in Theni, will house the world’s largest static particle detector to study atmospheric neutrinos. Equipped with its giant iron-calorimeter (ICAL) detector, physicists aim to discover which neutrinos are heavier and which lighter.

The LHC currently operates at the Energy Frontier, with high energy being the defining constraint on experiments. Two other frontiers, Intensity and Cosmic, are also seeing progress. Project X, a proposed proton accelerator at Fermilab in Chicago, Illinois, will push the boundaries of the Intensity Frontier by trying to look for ultra-rare processes. On the Cosmic Frontier, dark matter holds the greatest focus.

Hunt for the Higgs boson: A quick update

And it was good news after all! In an announcement made earlier today at the special conference called by CERN near Geneva, the discovery of a Higgs-boson-like particle was announced by physicists from the ATLAS and CMS collaborations that spearheaded the hunt. I say discovery because the ATLAS team spotted an excess of events near the 125-GeV mark with a statistical significance of 5 sigma. This puts the chances of the observation being a random fluctuation at 1 in 3.5 million, a precision that asserts (almost) certainty.

Fabiola Gianotti announced the preliminary results of the ATLAS detector, as she did in December, while Joe Incandela was her CMS counterpart. The CMS results showed an excess of events around 125 GeV (give or take 0.6 GeV) at 4.9 sigma. While the chances of error in this case are 1 in 2 million, it can’t be claimed a discovery. Even so, physicists from both detectors will be presenting their efforts in the hunt as papers in the coming weeks. I’ll keep an eye out for their appearance on arXiv, and will post links to them.

After the beam energy in the Large Hadron Collider (LHC) was increased from 3.5 TeV/beam to 4 TeV/beam in March, only so many collisions could be conducted until July. As a result, the sample set available for detailed analysis was smaller than could be considered sufficient. This is the reason some stress is placed on saying “boson-like” instead of attributing the observations to the boson itself. Before the end of the year, when the LHC will shut down for routine maintenance, however, scientists expect a definite word on whether the particle is the Higgs boson itself.

(While we’re on the subject: too many crass comments have been posted on the web claiming a religious element in the naming of the particle as the “God particle”. To those for whom this moniker makes sense: know that it doesn’t. When it was first suggested by a physicist, it stood as the “goddamn particle”, which a sensitive publisher corrected to the “God particle”).

The mass of the boson-like particle seems to deviate slightly from Standard Model (SM) predictions. This does not mean that SM stands invalidated. In point of fact, SM still holds strong because it has been incredibly successful in being able to predict the existence and properties of a host of other particles. One deviation cannot and will not bring it down. At the same time, it’s far from complete, too. What the spotting of a Higgs-boson-like particle in said energy window has done is assure physicists and others worldwide that the predicted mechanism of mass-generation is valid and within the SM ambit.

Last: the CERN announcement was fixed for today for another reason, too. The International Conference on High Energy Physics (ICHEP) is scheduled to commence tomorrow in Melbourne. One can definitely expect discussions on the subject of the Higgs mechanism to be held there. Further, other topics also await dissection, with their futures to be laid out – in terms vague or concrete. So, the excitement in the scientific community is set to continue until July 11, when ICHEP is scheduled to close.

Be sure to stay updated. These are exciting times!

So, is it going to be good news tomorrow?

As the much-anticipated lead-up to the CERN announcement on Wednesday unfolds, the scientific community is rife with many speculations and a few rumours. In spite of this deluge, we may well expect a confirmation of the God particle’s existence in the seminar called by physicists working on the Large Hadron Collider (LHC).

The most prominent indication of good news is that five of the six physicists who theorized the Higgs mechanism in a seminal paper in 1964 have been invited to the meeting. The sixth physicist, Robert Brout, passed away in May 2011. Peter Higgs, the man for whom the mass-giving particle is named, has also agreed to attend.

The other indication is much more subtle but just as effective. Dr. Rahul Sinha, a professor of high-energy physics and a participant in the Japanese Belle collaboration, said, “Hints of the Higgs boson have already been spotted in the energy range in which LHC is looking. If it has to be ruled out, four times as much statistical data should have been gathered to back it up, but this has not been done.”

The energy window which the LHC has been combing through was based on previous searches for the particle at the detector during 2010 and at the Fermilab’s Tevatron before that. While the CERN-based machine is looking for signs of two-photon decay of the notoriously unstable boson, the American legend looked for signs of the boson’s decay into two bottom quarks.

Last year, on December 13, CERN announced in a press conference that the particle had been glimpsed in the vicinity of 127 GeV (GeV, or giga-electron-volt, is used as a measure of particle energy and, by extension of the mass-energy equivalence, its mass).

However, scientists working on the ATLAS detector, which is heading the search, could establish only a statistical significance of 2.3 sigma then, or a 1-in-50 chance of error. To claim a discovery, a 5-sigma result is required, where the chances of errors are one in 3.5 million.

Scientists, including Dr. Sinha and his colleagues, are hoping for a 4-sigma result announcement on Wednesday. If they get it, the foundation stone will have been set for physicists to explore further into the nature of fundamental particles.

Dr. M.V.N. Murthy, who is currently conducting research in high-energy physics at the Institute of Mathematical Sciences (IMS), said, “Knowing the mass of the Higgs boson is the final step in cementing the Standard Model.” The model is a framework of all the fundamental particles and dictates their behaviour. “Once we know the mass of the particle, we can move on and explore the nature of New Physics. It is just around the corner,” he added.

The philosophies in physics

As a big week for physics comes up–a July 4 update by CERN on the search for the Higgs boson followed by ICHEP ’12 at Melbourne–I feel really anxious as a small-time proto-journalist and particle-physics-enthusiast. If CERN announces the discovery of evidence that rules out the existence of such a thing as the Higgs particle, not much will be lost apart from years of theoretical groundwork set in place for the post-Higgs universe. Physicists obeying the Standard Model will, to invoke the snowclone, scramble back to their blackboards and come up with another hypothesis that explains mass-formation in quantum-mechanical terms.

For me… I don’t know what it means. Sure, I will have to unlearn the Higgs mechanism, which does make a lot of sense, and scour through the outpouring of scientific literature that will definitely follow to keep track of new directions and, more fascinatingly, new thought. The competing supertheories–loop quantum gravity (LQG) and string theory–will have to have their innards adjusted to make up for the change in the mechanism of mass-formation. Even then, their principal bone of contention will remain unchanged: whether there exists an absolute frame of reference. All this while, the universe, however, will have continued to witness the rise and fall of stars, galaxies and matter.

It is easier to consider the non-existence of the Higgs boson than its proven existence: the post-Higgs world is dark, riddled with problems more complex and, unsurprisingly, more philosophical. The two theories that dominated the first half of the previous century, quantum mechanics and special relativity, will still have to be reconciled. While special relativity holds causality and locality close to its heart, quantum mechanics’ tendency to violate the latter made it disagreeable, at the philosophical level, to Einstein (in a humorous and ironic turn, his attempts to illustrate this “anomaly” numerically opened up the very field that eventually made the implications of quantum mechanics acceptable).

The theories’ impudent bickering continues in mathematical terms as well. While one prohibits travel at the speed of light, the other appears to permit correlations established faster than light could carry them. While one keeps all objects nailed to one place in space and time, the other allows for the occupation of multiple regions of space at a time. While one operates in a universe wherein gods don’t play with dice, the other can exist at all only if there are unseen powers that gamble every second. If you ask me, I’d prefer one with no gods; I also have a strange feeling that that’s not a physics problem.

Speaking of causality, physicists of the Standard Model believe that the four fundamental forces–nuclear, weak, gravitational, and electromagnetic–cause everything that happens in this universe. However, they are at a loss to explain why the weak force is 10^32 times stronger than the gravitational force (even the finding of the Higgs boson won’t fix this–assuming the boson exists). An attempt to explain this anomaly exists in the name of supersymmetry (SUSY) or, together with the Standard Model, MSSM. If an entity in the (hypothetical) likeness of the Higgs boson cannot exist, then MSSM will also fall with it.

Taunting physicists everywhere all the way through this mesh of intense speculation, Werner Heisenberg’s tragic formulation remains indefatigable. In a universe in which the scale at which physics is born is only hypothetical, in which energy in its fundamental form is thought to be a result of probabilistic fluctuations in a quantum field, determinism plays a dominant role in determining the future as well as, in some ways, contradicting it. The quantum field, counter-intuitively, is antecedent to human intervention: Heisenberg postulated that physical quantities such as position and particle spin come in conjugate pairs, and that making a measurement of one quantity renders the other indeterminable. In other words, one cannot simultaneously know the position and momentum of a particle, or its spins around two different axes.

To me, this seems like a problem of scale: humans are macroscopic in the sense that they can manipulate objects using the laws of classical mechanics and not the laws of quantum mechanics. However, a sense of scale is rendered incontextualizable when it is known that the dynamics of quantum mechanics affect the entire universe through a principle called the collapse postulate (i.e., collapse of the state vector): if I measure an observable physical property of a system that is in a particular state, I subject the entire system to collapse into a state that is described by the observable’s eigenstate. Even further, there exist many eigenstates for collapsing into; which eigenstate is “chosen” depends on its observation (this is an awfully close analogue to the anthropic principle).
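As a concrete (and heavily simplified) version of the collapse postulate, here is a small sketch for a single spin-1/2 particle: write the state in the basis of the observable’s eigenstates, read off the outcome probabilities from the squared overlaps, and let the state “collapse” to whichever eigenstate corresponds to the outcome.

```python
import numpy as np

# Measuring the z-spin of a spin-1/2 particle prepared "along +x".
# Born rule: P(outcome) = |<eigenstate|psi>|^2; the state then collapses
# to that eigenstate.

psi = np.array([1.0, 1.0]) / np.sqrt(2)     # |+x> = (|up_z> + |down_z>) / sqrt(2)

up_z   = np.array([1.0, 0.0])               # eigenstates of the z-spin observable
down_z = np.array([0.0, 1.0])

for name, eigenstate in (("spin up", up_z), ("spin down", down_z)):
    prob = abs(np.dot(eigenstate.conj(), psi)) ** 2
    print(f"P({name} along z) = {prob:.2f}  ->  state collapses to that eigenstate")
```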

xkcd #45

That reminds me. The greatest unsolved question, in my opinion, is whether the universe houses the brain or the brain houses the universe. To be honest, I started writing this post without knowing how it would end: there were multiple eigenstates it could “collapse” into. That it would collapse into this particular one was unknown to me, too, and, in hindsight, there was no way I could have known about any aspect of its destiny. Having said that, given the nature of the universe – and the brain/universe protogenesis problem – together with deterministic causality and mensural antecedence: if the universe conceived the brain, the brain must inherit the characteristics of the universe, and therefore must not allow for free will.

Now, I’m faintly depressed. And yes, this eigenstate did exist in the possibility-space.