Hunt for the Higgs boson: A quick update

And it was good news after all! At a special conference called by CERN near Geneva earlier today, physicists from the ATLAS and CMS collaborations, which spearheaded the hunt, announced the discovery of a Higgs-boson-like particle. I say discovery because the ATLAS team spotted an excess of events near the 125-GeV mark with a statistical significance of 5 sigma. This puts the chances of the observation being a random fluctuation at about 1 in 3.5 million, which amounts to (almost) certainty.

Fabiola Gianotti announced the preliminary results from the ATLAS detector, as she did in December, while Joe Incandela was her CMS counterpart. The CMS results showed an excess of events around 125 GeV (give or take 0.6 GeV) at 4.9 sigma. While the chances of error in this case are about 1 in 2 million, the result falls just short of the 5-sigma threshold needed to claim a discovery. Even so, physicists from both detectors will be publishing papers on their searches in the coming weeks. I’ll keep an eye out for their appearance on arXiv, and will post links to them.
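For those curious where the “1 in 3.5 million” and “1 in 2 million” figures come from, here is a minimal sketch (assuming SciPy is available; the output in the comments is approximate) that converts a significance quoted in sigma into the one-sided tail probability of a standard normal distribution, which is the convention particle physicists use:

    # Convert a discovery significance (in sigma) into a one-sided tail probability.
    from scipy.stats import norm

    for sigma in (4.9, 5.0):
        p = norm.sf(sigma)  # survival function: P(Z > sigma) for a standard normal Z
        print(f"{sigma} sigma -> p = {p:.2e} (about 1 in {1/p:,.0f})")

    # Prints roughly:
    # 4.9 sigma -> p = 4.8e-07 (about 1 in 2 million)
    # 5.0 sigma -> p = 2.9e-07 (about 1 in 3.5 million)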

After the beam energy in the Large Hadron Collider (LHC) was increased from 3.5 TeV/beam to 4 TeV/beam in March, only a limited number of collisions could be recorded before July. As a result, the sample available for detailed analysis is smaller than what would be considered sufficient. This is the reason some stress is placed on saying “boson-like” instead of attributing the observations to the Higgs boson itself. However, scientists expect a definite word on whether the particle is the Higgs boson before the end of the year, when the LHC shuts down for routine maintenance.

(While we’re on the subject: too many crass comments have been posted on the web claiming a religious element in the naming of the particle as the “God particle”. To those for whom this moniker makes sense: know that it doesn’t. When the name was first suggested by a physicist, it stood as the “goddamn particle”, which a sensitive publisher corrected to the “God particle”).

The mass of the boson-like particle seems to deviate slightly from Standard Model (SM) predictions. This does not mean the SM stands invalidated. In fact, the SM still holds strong because it has been incredibly successful in predicting the existence and properties of a host of other particles; one deviation cannot and will not bring it down. At the same time, it is far from complete. What the spotting of a Higgs-boson-like particle in this energy window has done is assure physicists and others worldwide that the predicted mechanism of mass-generation is valid and within the SM’s ambit.

Last: the CERN announcement was not fixed for today without another reason. The International Conference on High Energy Physics (ICHEP) commences tomorrow in Melbourne, and discussions of the Higgs mechanism can certainly be expected there. Other topics, too, await dissection, their futures to be laid out in terms vague or concrete. So the excitement in the scientific community is set to continue until July 11, when ICHEP closes.

Be sure to stay updated. These are exciting times!

So, is it going to be good news tomorrow?

As the much-anticipated lead-up to Wednesday’s CERN announcement unfolds, the scientific community is rife with much speculation and a few rumours. Amid this deluge, we may well expect a confirmation of the God particle’s existence at the seminar called by physicists working on the Large Hadron Collider (LHC).

The most prominent indication of good news is that five of the six physicists who theorized the Higgs mechanism in a set of seminal papers in 1964 have been invited to the meeting. The sixth, Robert Brout, passed away in May 2011. Peter Higgs, the man for whom the mass-giving particle is named, has agreed to attend.

The other indication is much more subtle but just as effective. Dr. Rahul Sinha, a professor of high-energy physics and a participant in the Japanese Belle collaboration, said, “Hints of the Higgs boson have already been spotted in the energy range in which LHC is looking. If it has to be ruled out, four-times as much statistical data should have been gathered to back it up, but this has not been done.”

The energy window the LHC has been combing through was narrowed down by previous searches for the particle, at the collider itself during 2010 and at Fermilab’s Tevatron before that. While the CERN-based machine is looking for signs of the notoriously unstable boson decaying into two photons, the American legend looked for signs of its decay into two bottom quarks.

Last year, on December 13, CERN announced in a press conference that the particle had been glimpsed in the vicinity of 127 GeV (GeV, or giga-electron-volt, is used as a measure of particle energy and, by extension of the mass-energy equivalence, its mass).
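As an aside on the unit, a quick worked conversion (using standard values, nothing specific to the announcement) shows what a mass of 125 GeV corresponds to in everyday units:

$$m = \frac{125\ \mathrm{GeV}}{c^2} \approx 125 \times 1.78 \times 10^{-27}\ \mathrm{kg} \approx 2.2 \times 10^{-25}\ \mathrm{kg},$$

or roughly 133 times the mass of a proton (0.938 GeV).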

However, scientists working on the ATLAS detector, which is heading the search, could establish only a statistical significance of 2.3 sigma then, or about a 1-in-50 chance of error. To claim a discovery, a 5-sigma result is required, where the chance of error is one in 3.5 million.

Scientists, including Dr. Sinha and his colleagues, are hoping for a 4-sigma result to be announced on Wednesday. If they get it, the foundation stone will have been laid for physicists to explore further into the nature of fundamental particles.

Dr. M.V.N. Murthy, who is currently conducting research in high-energy physics at the Institute of Mathematical Sciences (IMS), said, “Knowing the mass of the Higgs boson is the final step in cementing the Standard Model.” The model is a framework of all the fundamental particles and dictates their behaviour. “Once we know the mass of the particle, we can move on and explore the nature of New Physics. It is just around the corner,” he added.

The philosophies in physics

As a big week for physics comes up–a July 4 update by CERN on the search for the Higgs boson followed by ICHEP ’12 at Melbourne–I feel really anxious as a small-time proto-journalist and particle-physics enthusiast. If CERN announces evidence that rules out the existence of such a thing as the Higgs particle, not much will be lost apart from years of theoretical groundwork set in place for the post-Higgs universe. Physicists loyal to the Standard Model will, to borrow the snowclone, scramble back to their blackboards and come up with another hypothesis that explains mass-formation in quantum-mechanical terms.

For me… I don’t know what it means. Sure, I will have to unlearn the Higgs mechanism, which does make a lot of sense, and scour the outpouring of scientific literature that will definitely follow to keep track of new directions and, more fascinatingly, new thought. The competing supertheories–loop quantum gravity (LQG) and string theory–will have to have their innards adjusted to make up for the change in the mechanism of mass-formation. Even then, their principal bone of contention will remain unchanged: whether there exists an absolute frame of reference. All the while, the universe will have carried on witnessing the rise and fall of stars, galaxies and matter.

It is easier to consider the non-existence of the Higgs boson than its proven existence: the post-Higgs world is dark, riddled with problems more complex and, unsurprisingly, more philosophical. The two theories that dominated the first half of the previous century, quantum mechanics and special relativity, will still have to be reconciled. While special relativity holds causality and locality close to its heart, quantum mechanics’ tendency to violate the latter made it disagreeable, at the philosophical level, to Albert Einstein (in a humorous and ironic turn, his attempts to expose this “anomaly” mathematically opened up the very field that eventually made the implications of quantum mechanics more acceptable).

The theories’ impudent bickering continues in mathematical terms as well. While one prohibits travel at the speed of light, the other allows for the conclusive demonstration of superluminal communication. While one keeps all objects nailed to one place in space and time, the other allows for the occupation of multiple regions of space at a time. While one operates in a universe wherein gods don’t play with dice, the other can exist at all only if there are unseen powers that gamble every second. If you ask me, I’d prefer the one with no gods; I also have a strange feeling that that’s not a physics problem.

Speaking of causality, physicists of the Standard Model believe that the four fundamental forces–the strong nuclear, weak, gravitational, and electromagnetic–cause everything that happens in this universe. However, they are at a loss to explain why the weak force is 10^32 times stronger than the gravitational force (even the finding of the Higgs boson won’t fix this–assuming the boson exists). One attempt to explain this anomaly goes by the name of supersymmetry (SUSY) or, combined with the Standard Model, the MSSM. If an entity in the (hypothetical) likeness of the Higgs boson cannot exist, then the MSSM will also fall with it.

Taunting physicists everywhere all the way through this mesh of intense speculation, Werner Heisenberg’s tragic formulation remains indefatigable. In a universe in which the scale at which physics is born is only hypothetical, in which energy in its fundamental form is thought to be the result of probabilistic fluctuations in a quantum field, determinism plays a dominant role in shaping the future as well as, in some ways, contradicting it. The quantum field, counter-intuitively, is antecedent to human intervention: Heisenberg postulated that physical quantities such as position and particle spin come in conjugate pairs, and that measuring one member of a pair makes the other indeterminable. In other words, one cannot simultaneously know the position and momentum of a particle, or its spins around two different axes.
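For concreteness, here is the textbook form of those two statements (standard symbols, not notation from this post): conjugate or incompatible observables fail to commute, and each non-zero commutator sets a lower bound on how precisely both quantities can be known at once.

$$[\hat{x}, \hat{p}] = i\hbar \;\Rightarrow\; \Delta x\,\Delta p \ge \frac{\hbar}{2}, \qquad [\hat{S}_x, \hat{S}_y] = i\hbar\,\hat{S}_z \;\Rightarrow\; \Delta S_x\,\Delta S_y \ge \frac{\hbar}{2}\,\big|\langle \hat{S}_z \rangle\big|$$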

To me, this seems like a problem of scale: humans are macroscopic in the sense that they can manipulate objects using the laws of classical mechanics and not the laws of quantum mechanics. However, any such sense of scale loses its footing once it is known that the dynamics of quantum mechanics affect the entire universe through a principle called the collapse postulate (i.e., collapse of the state vector): if I measure an observable physical property of a system that is in a particular state, I cause the entire system to collapse into one of that observable’s eigenstates. Further, there are many eigenstates the system could collapse into; which one is “chosen” is settled only upon observation (an awfully close analogue to the anthropic principle).
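A minimal worked example, using a generic two-state system rather than anything specific to the discussion above: a state written as a superposition of two eigenstates collapses into one of them upon measurement, with probabilities given by the squared amplitudes (the Born rule).

$$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \quad |\alpha|^2 + |\beta|^2 = 1; \qquad \text{measurement yields } |0\rangle \text{ with probability } |\alpha|^2 \text{ and } |1\rangle \text{ with probability } |\beta|^2.$$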

xkcd #45

That reminds me. The greatest unsolved question, in my opinion, is whether the universe houses the brain or the brain houses the universe. To be honest, I started writing this post without knowing how it would end: there were multiple eigenstates it could “collapse” into. That it would collapse into this particular one was unknown to me, too, and, in hindsight, there was no way I could have known about any aspect of its destiny. Having said that, given the nature of the universe–and the brain/universe protogenesis problem–together with what we know of deterministic causality and mensural antecedence: if the universe conceived the brain, the brain must inherit the characteristics of the universe, and therefore must not allow for free will.

Now, I’m faintly depressed. And yes, this eigenstate did exist in the possibility-space.

Eigenstates of the human mind

  1. Would a mind’s computing strength be determined by its ability to make sense of counter-intuitive principles (Type I) or by its ability to solve an increasing number of simple problems in a second (Type II)?
  2. Would Type I and Type II strengths translate into the same computing strength?
  3. Does either Type I or Type II metric possess a local inconsistency that prevents its state-function from being continuous at all points?
  4. Does either Type I or Type II metric possess an inconsistency that manifests as probabilistic eigenstates?

Necessity of the interdisciplinary

(Image: see the Nature article.)

A strange cosmic “crucifix” seen in 774 AD, recorded in the Anglo-Saxon Chronicle, could be explained by a supernova, perhaps rendered nearly unobservable by a dense cloud of gas between Earth and the dying star that scattered all but a little of the light. The real story, however, is that of Jonathon Allen, who came up with this idea after listening to a radio talk-show that mentioned a strange spike in the C-14 content of three tree-rings in Japan. Because increased C-14 generation in the atmosphere can happen only with incoming cosmic radiation from a supernova or a vicious solar flare, the two strange phenomena could be related. Such a development also does well to justify giving scientists an interdisciplinary background (spanning, say, astronomy and history), because it would simply be hypocritical to assume that the laws of physics apply in one field but not in another.
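For the record, the dominant production channel is a standard textbook reaction, not something specific to the new study: cosmic rays striking the upper atmosphere liberate neutrons, which convert atmospheric nitrogen into radiocarbon.

$$n + {}^{14}\mathrm{N} \;\rightarrow\; {}^{14}\mathrm{C} + p$$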

Understanding the Solar System from within a shard

(Image: see the Caltech press release.)

A new titanium oxide mineral was discovered less than two days ago at Caltech, embedded within the famous Allende meteorite, which crashed to Earth in 1969. Using electron diffraction, mineralogists found the mineral, named panguite after the Chinese legend of Pan Gu, in a refractory inclusion (RI). In the early stages of the formation of the Solar System, condensation of pre-solar gases at high temperature and pressure resulted in refractory liquids, which then solidified into RIs, so called because of their stability in extreme conditions. Thus, the panguite and its surroundings, studied in situ, play a crucial role in understanding the birth of our extended home.

What’s allowed and disallowed in the name of SUSY

The International Conference on High Energy Physics (ICHEP) is due to begin on July 7 in Melbourne. This is the 26th edition of the most prestigious scientific conference on particle physics. In keeping with its stature, scientists from the ATLAS and CMS collaborations at the LHC plan to announce, on July 4, the results of preliminary tests conducted to look for the Higgs boson. Although speculation will still run rife within the high-energy and particle physics communities, it will be subdued; after all, nobody wants to be involved in another OPERAtic fiasco.

Earlier this year, CERN announced that the beam energy at the LHC would be increased from 3.5 TeV/beam to 4 TeV/beam. This means the collision energy jumps from 7 TeV to 8 TeV, increasing the chances of recreating the elusive Higgs boson, the “God particle”, and confirming whether the Standard Model can explain the mechanism of mass-formation in this universe. While this was the stated goal when the LHC was being constructed, another particle physics hypothesis had been taking shape that lent itself to the LHC’s purpose.

In 1981, Howard Georgi and Savas Dimopoulos proposed a correction to the Standard Model to solve what is called the hierarchy problem. Specifically, the question is why the weak force (mediated by the W± and Z bosons) is 10^32 times stronger than gravity. Both forces are characterized by natural constants: Fermi’s constant for the weak force and Newton’s constant for gravity. However, when quantum corrections to Fermi’s constant are calculated within the Standard Model, its natural value turns out to lie much closer to Newton’s constant, not at the much, much higher value that is actually measured.

Savas Dimopoulos (L) and Howard Georgi
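To put rough numbers on the two constants just mentioned (a back-of-the-envelope comparison in natural units; how many orders of magnitude one quotes for the hierarchy depends a little on how the comparison is framed):

$$G_F \approx 1.17 \times 10^{-5}\ \mathrm{GeV}^{-2}, \qquad G_N = \frac{1}{M_{\mathrm{Pl}}^2} \approx 6.7 \times 10^{-39}\ \mathrm{GeV}^{-2},$$

a gap of over thirty orders of magnitude.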

By the late 1960s, the propositions of the Standard Model had been cemented strongly enough into the psyche of mathematicians and scientists the world over: it described with remarkable accuracy most naturally occurring processes and predicted the existence of other particles, too, which were later discovered at experiments such as the Tevatron, ATLAS, CMS, and ZEUS. In other words, it was inviolable. At the same time, there were no provisions within it to account for this discrepancy, indicating that there could be certain entities – particles and forces – yet to be discovered that could solve the hierarchy problem, and perhaps explain the nature of dark matter, too.

So, the 1981 Georgi-Dimopoulos solution was called the Minimal Supersymmetric Standard Model (MSSM), a special formulation of supersymmetry – first proposed in 1966 by Hironari Miyazawa – that paired particles of half-integer spin with those of integer spin and vice versa. (The spin of a particle is its intrinsic angular momentum, the quantum mechanical counterpart of orbital angular momentum, although the one has never been representative of the other. Expressed in multiples of the reduced Planck constant, particle spin is denoted in natural units as simply an integer or half-integer.)

Particles of half-integer spin are called fermions and include the leptons and quarks. Particles with integer spin are called bosons and comprise the photon, the W± and Z bosons, eight gluons, and the hypothetical scalar boson named after co-postulator Peter Higgs. The principle of supersymmetry (SUSY) states that for each fermion there is a corresponding boson, and for each boson a corresponding fermion. Also, if SUSY is an unbroken symmetry, a particle and its superpartner will have the same mass. The superpartners are yet to be discovered, and if anyone has a chance of finding them, it has to be at the LHC.

The MSSM solved the hierarchy problem – which can be restated as the question of why the Higgs boson’s mass is so much lower than the mass at which new physics appears (the Planck mass) – by exploiting the effects of what is called the spin-statistics theorem (SST). The SST implies that quantum corrections to the Higgs mass-squared are positive if they come from a boson and negative if they come from a fermion. In the MSSM, however, because every particle has a superpartner, the net contribution to the correction, Δm²_H, cancels out. This result leaves the Higgs mass far below the Planck mass.
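The sign argument can be made explicit with the standard one-loop expressions found in SUSY primers (the notation below is the usual one and is not taken from this post): a fermion loop pulls the Higgs mass-squared down by an amount proportional to the square of the ultraviolet cutoff Λ, a scalar loop pushes it up, and with two scalar superpartners per fermion and related couplings, λ_S = |λ_f|², the cutoff-dependent pieces cancel.

$$\Delta m_H^2 \Big|_{\text{fermion loop}} = -\,\frac{|\lambda_f|^2}{8\pi^2}\,\Lambda^2 + \ldots, \qquad \Delta m_H^2 \Big|_{\text{scalar loop}} = +\,\frac{\lambda_S}{16\pi^2}\,\Lambda^2 + \ldots$$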

The existence of extra dimensions has also been proposed to explain the hierarchy problem. However, as long as SUSY seems testable, the law of parsimony prevents physicists from turning so radical.

The MSSM didn’t just stabilize the weak scale: it also necessitated the existence of more than one Higgs field for mass-coupling, since the Higgs boson would have a superpartner, the fermionic Higgsino. For all other particles, this doubling didn’t involve invoking special fields or extrinsic parameters and was fairly simple. A single Higgsino added to the existing Higgs field, however, would supply an extra degree of freedom (DoF) and leave the theory inconsistent – the so-called gauge anomaly. The presence of two Higgsinos instead of one doesn’t lead to this anomaly.
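In slightly more concrete terms (the hypercharge convention below is the standard one used in SUSY reviews, not something from this post): anomaly cancellation requires sums of the fermions’ hypercharges, and of their cubes, to vanish. The Standard Model fermions already satisfy this; a single Higgsino doublet with hypercharge Y = +1/2 would spoil the cancellation, while a pair of Higgsino doublets with Y = +1/2 and Y = −1/2 keeps both sums at zero.

$$\sum_{\text{fermions}} Y = 0, \qquad \sum_{\text{fermions}} Y^3 = 0$$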

The necessity of a second Higgs field was reinforced by another aspect of the Higgs mechanism: mass-coupling. The Higgs boson couples more strongly to heavier particles, which means there must be a coupling constant describing the proportionality. This constant was named after Hideki Yukawa, a Japanese theoretical physicist, and termed λ_f. When a Higgs boson couples with an up-quark, λ_f = +1/2; when it couples with a down-quark, λ_f = −1/2. SUSY, however, prohibits this switch to the value’s complex conjugate (a mass-reducing move), and necessitates a second Higgs field to describe the interactions.
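The underlying relation between coupling and mass can be sketched as follows (normalization conventions vary between references, so take the factors here as indicative): in the Standard Model a fermion’s mass is its Yukawa coupling times the Higgs field’s vacuum expectation value, while in the MSSM the up-type and down-type fermions draw their masses from two different doublets, H_u and H_d.

$$m_f = \frac{y_f\, v}{\sqrt{2}}\ \ (\text{SM},\ v \approx 246\ \mathrm{GeV}); \qquad m_{\text{up-type}} \propto y_u\, v_u, \quad m_{\text{down-type}} \propto y_d\, v_d, \quad \tan\beta \equiv \frac{v_u}{v_d}\ \ (\text{MSSM})$$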

A “quasi-political” explanation of the Higgs mechanism surfaced in 1993, likening the process to a political leader entering a room full of party members. As she moved through the room, the members moved out of their evenly spaced “slots” and towards her, forming a cluster around her. The leader’s speed was then restricted because there was always a knot of people around her, and she was slowed down (like a heavy particle). Finally, as she moved away, the members returned to their original positions in the room.

The MSSM-predicted superpartners are thought to have masses 100- to 1,000-times that of the proton, and require extremely large energies to be recreated in a hadronic collision. The sole, unambiguous way to validate the MSSM theory is to spot the particles in a laboratory experiment (such as those conducted at CERN, not in a high-school chemistry lab). Even as the LHC prepares for that, however, there are certain aspects of MSSM that aren’t understood even theoretically.

The first is the mu problem, which arises in describing the superpotential, and hence the mass, of the Higgsino. Mu appears in the term μH_uH_d, and in order to correctly describe the vacuum expectation values of the Higgs fields after electroweak symmetry breaking (and, again, the Higgsino’s mass), mu’s value must be of an order of magnitude close to the electroweak scale. (As an analogue of electroweak symmetry breaking, the MSSM also introduces soft SUSY-breaking terms, which must likewise be of the order of the electroweak scale.) The question is where these magnitudes, so far below the Planck scale, come from, whether they are natural, and if they are, then how.
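For reference, the term in question, with rough numbers only meant to indicate the scales involved (standard order-of-magnitude estimates, not values from this post): μ is the lone dimensionful SUSY-conserving parameter in the superpotential, and electroweak symmetry breaking works out only if it sits near the electroweak scale, with no obvious reason why it shouldn’t instead be as large as the Planck scale.

$$W_{\text{MSSM}} \;\supset\; \mu\, H_u H_d, \qquad \mu \sim 10^{2}\text{–}10^{3}\ \mathrm{GeV} \;\ll\; M_{\mathrm{Pl}} \sim 10^{19}\ \mathrm{GeV}$$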

The second is the problem of flavour mixing. Neutrinos and quarks exhibit a property called flavour, which they can change through a mechanism called flavour-mixing. Since no instances of this phenomenon have been observed outside the ambit of the Standard Model, the new terms introduced by the MSSM must not interfere with it. In other words, the MSSM must be flavour-invariant and, by an extension of the same logic, CP-invariant.

Because of its involvement in determining which particle has how much mass, the MSSM plays a central role in clarifying our understanding of gravity as well as, it has been theorized, in unifying gravity with the other fundamental forces. Even though it exists only in the theoretical realm, and even though physicists are attracted to it because its consequences look like favourable solutions, the mathematics of the MSSM does explain many of the anomalies that threaten the Standard Model. To wit, dark matter is hypothesized to be the gravitino, the superpartner of the graviton, the particle that mediates the gravitational force. (Here’s a paper from 2007 that attempts to explain the thermal production of gravitinos in the early universe.)

The beam energies were increased in pursuit of the Higgs boson after CERN’s landmark announcement of December 13, 2011; let’s hope that the folks at ATLAS, CMS, ALICE, and the other detectors have something to say about opening the next big chapter in particle physics, one that will bring humankind a giant leap closer to understanding the universe and the stuff we’re made of.

The post-reporter era

One of the foundation stones of journalism is the process of reporting. That there is a messenger working the gap between an event and a story is what allows news to exist, and to exist with myriad nuances attached to it. There are ethical and moral issues, technical considerations, writing styles, and presentation formats to perfect. The entire news-publishing industry is centered on the activities of reporters and on streamlining them.

What the reporter requires most is… well, a few things. The first is a domain of events, from which he picks issues to talk about. The second is a domain of stories, into which he publishes his reports. The third is a platform with which he can make this process worthwhile for himself and acquire the tools to publish his stories efficiently and effectively. This last entity is most commonly understood in the form of a publishing house.

The reason I’ve broken the working of a reporter into these categories is to understand what makes a reporter at all. Today, a reporter is most commonly understood to be an individual who is employed by a publishing house and publishes stories for it. Ideally, however, everyone is a reporter: the creation of knowledge by people, based on the experiences around them, should be qualification enough. This calls into question the role of a publishing house: is it a platform with which reporters may function efficiently, or is it an employer of reporters?

If it’s an employer of reporters, then a publishing house needn’t worry about where the course of journalism will take the organization itself. Reporters will have to change the way they work – how they spot issues, how they evolve writing styles to suit their audiences, and so forth – but the publishing house will retain ownership of the reporters themselves. As long as it’s not merely a platform that individuals use to function as reporters, things are going to be fine.

Now, let’s move to the post-reporter era, where everyone is a reporter (an idealized image, of course, but even so). In this world, a reporter is not someone who works for a publishing house – that aspect of the word’s meaning is left behind in the age of the publishing house. In this world, a reporter is simply a messenger between the domains of events and stories, and the publishing house’s role as the owner of reportage is absent.

The nature of such a world throws light on the valuation of information. When multiple reporters cover different events and return to HQ to file their stories, the house decides which stories make the cut and which don’t on the basis of a set of parameters. In other words, the house creates and assigns a particular value to each story, and then compares the values of different stories to determine their destiny.

In the post-reporter era, which is likely to be occupied by channels of individual presentation – ranging from word-of-mouth to full-scale websites – houses that thrive today on the valuation of information, and on the importance their readers place on it, will steadily fade out. What exists will be an all-encompassing form of what is known today as citizen journalism (CJ). Houses take to CJ because of the mutually beneficial relationship available therein: the citizen journalist gets coverage and the advantage of the issue no longer being under wraps; the reporter gets a story that has both civic/criminal and human-interest angles to it.

However, when the citizen journalist voids this relationship by refusing the intervention of a publishing or broadcasting house, and chooses to take his story straight to the people through a channel he finds effective enough, the house-level valuation of stories is replaced by a democratic institution that may or may not be guided by a paternalistic attitude.

Therefore, if a particular house is to survive into the post-reporter era, it must discard issue-valuation as its engine and instead rely on some other entity, say one represented by a parameter whose efficiency can be maximized. This can be conceived of as a fourth domain which, upon maximization, becomes the superset of which the other three domains are subsets.

A counter-productive entity in this situation is property, which a high-achieving house accrues in great quantities in the present but which delays the onset of change in the future. Even when the house starts to experience slightly rougher weather, its first move will be to pump in more money, thereby putting off change for a while. Only when the amount of property invested in delaying change becomes considerable will the house start to consider alternatives, by which time competing organizations will have moved into the future.

Fizzed-out futures

Initiatives are arising to plug holes in the Indian education system, or so they claim. Many are ambitious, some even overreaching, but there are honest ones among them, too. The cause for concern, however, is that such projects are being viewed as extracurricular to the prevailing education system–even by those who founded the initiatives. Thoughtful engagement is sought after; an awareness of the “outside world”–a summation of the realities extraneous to the student’s chosen field–is deemed lacking and designated a goal.

Most such initiatives are run by students or recent graduates, who carry with them fresh memories of incomplete lessons and half-mentored theses. As their activities grow in scope–which they surely do–there is friction between the tendency to remain experimental and the certainty provided by going commercial, by installing a secure source of support and a fundamental incentive. The latter is necessary even though many students remain in denial of it: one person’s idea cannot be shared with the same intensity throughout unless there is a need to depend on it. Money, many fail to realize, maintains currency, too.

The prevalent belief is that the Indian way of learning sidelines the humanities: if a job doesn’t fetch a fat cheque, it concludes, there is no point in studying for it. Unfortunately, such a view also degrades the pros of technical learning. Consequently, the responses are disappointingly reactionary. If a student has found it difficult to inculcate a skill, he simply joins the overarching institution of frustration and dissatisfaction, and assumes the problem is faced by everyone. That is never true, and has never been. However, the assumption finds enough purchase to surface as proposed fixes.

In many parts of the country, young graduates and final-year students gather in small rooms on terraces and in garages. For the most part, they discuss the different activities they could take up to compensate for what they think they ought to have learned in the classroom but didn’t. They quickly conclude that original thought is missing–which is very true–and proceed to talk about what they’d need to inculcate it. These are, obviously, surface-level problems. As time passes, the incentive to meet each subsequent week, to debate and act, peters out. Essentially, such students’ and graduates’ concerns have been for the short term.

The long-term concern, it seems, can be addressed more effectively at the individual level than at the systemic level. The institution can encourage extracurricular tasks, point at the dearth of invention and the abundance of innovation, and build up an army of youngsters to fix the nation’s most pressing problems. However, the only solution that can pluck India out of this moshpit of unoriginality is to do what is required of all youngsters no matter where they are these days: ideate. Ideas, whether original or otherwise, are necessary; even better when they are distilled from a vast pool of knowledge.

Whatever the most dollar-guzzling problems are, the ones that are solved by continuous ideation are what will keep the machine from grinding to a standstill. May the humanities be sidelined, may the rote-learner be celebrated, may technical learning signify the staple diet that deprives most Indian students of the indulgence of the arts–we are not in need of a paradigm shift to rectify matters. What we need most is to build ourselves to achieve even in the absence of expectations. What we need most is to transcend our cubicles and classrooms and disintegrate the institutionalized frustration. By not doing so, we are letting our communal objectives be defined by a chance mistake.

A simplification of superfluidity

“Once people tell me what symmetry the system starts with and what symmetry it ends up with, and whether the broken symmetries can be interchanged, I can work out exactly how many bosons there are and if that leads to weird behavior or not,” Murayama said. “We’ve tried it on more than 10 systems, and it works out every single time.”

– Hitoshi Murayama, co-author of the paper

To those who’ve followed studies on superfluidity and spontaneous symmetry-breaking, a study by Hitoshi Murayama and Haruki Watanabe at UC Berkeley will come as a boon. It simplifies our understanding of symmetry-breaking for practical purposes by unifying the behaviour of supercooled matter – such as Bose-Einstein condensates and superfluids – and provides a workable formula for deriving the number of Nambu-Goldstone bosons given the symmetries of the system before and after the breaking!
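Schematically, the counting rule is the following (I’m glossing over the precise normalization of ρ, which the paper defines carefully): the number of Nambu-Goldstone modes equals the number of broken symmetry generators minus half the rank of the matrix of ground-state expectation values of the broken charges’ commutators.

$$n_{\mathrm{NG}} \;=\; n_{\mathrm{BG}} \;-\; \tfrac{1}{2}\,\mathrm{rank}\,\rho, \qquad \rho_{ab} \;\propto\; \langle 0 |\, [Q_a, Q_b] \,| 0 \rangle$$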

This is the R&D article that serves as a lead-in to the issue.

This is a primer on spontaneous symmetry-breaking (and the origins of the Higgs boson).

Finally, and importantly, the pre-print paper (from arXiv) can be viewed here. Caution: don’t open the paper if you’re not seriously good at math.