AAB asked on Theoretical Physics Stack Exchange:
I was wondering what the opinion about the importance of the hierarchy problem is in the HEP community. I'm still a student and I don't really understand why there is so much attention around this issue.
One-loop corrections to the Higgs mass are divergent in the cutoff regularization, proportional to \(\Lambda^2\), and therefore require a large fine-tuning between the parameters to make those corrections small. But this kind of problem does not appear in dimensional regularization.
The hierarchy problem is the mystery why the Higgs is so thin even though he could eat all the tasty, virtual, GUT-scale, heavy stuff.
People like the value of \(\Lambda\) to be very large, with the argument that it should correspond to some energy scale at which our theory breaks down. I don't think that we should treat the scale \(\Lambda\) as some kind of physical cutoff scale of our model, as it is just a parameter used to regularize the integral – just like the \(4+\epsilon\) dimensions in dimensional regularization are not a physical thing. Why do we assign a physical meaning to \(\Lambda\)? Not to mention the troubles with Lorentz invariance.
Maybe the hierarchy problem is an argument that the cutoff regularization scheme is simply not the right one to use?
LM answers:
Whether you do your calculations using a cutoff regularization or dimensional regularization or another regularization is just a technical detail that has nothing to do with the existence of the hierarchy problem. Order by order, you will get the same results whatever your chosen regularization or scheme is.
The schemes and algorithms may differ by the precise moment at which you subtract some unphysical infinite terms etc. Indeed, dimensional regularization cures power-law divergences from scratch. But the hierarchy problem may be expressed in a way that is manifestly independent of these technicalities.
The hierarchy problem is the problem that one has to fine-tune the actual physical parameters of a theory expressed at a high energy scale with a huge accuracy – with error margins smaller than \((E_{low}/E_{high})^k\) where \(k\) is a positive power – in order for this high-energy theory to produce the low-energy scale and light objects at all.
If I formulate the problem in this way, it's clear that it doesn't matter what scheme you are using to do the calculations. In particular, your miraculous "cure" based on dimensional regularization may hide the explicit \(\Lambda^2\) in intermediate results. But it doesn't change anything about the dependence on the high-energy parameters.
What you would really need for a "cure" of the physical problem is to pretend that no high-energy physics exists at all. But it does. It's clear that the Standard Model breaks down before we reach the Planck energy and probably way before that. There have to be more detailed physical laws that operate at the GUT scale or the Planck scale, and those new laws have new parameters.
The low-energy parameters such as the LHC-measured Higgs mass of 125 GeV are complicated functions of the more fundamental high-energy parameters governing the GUT-scale or Planck-scale theory. And if you figure out what condition is needed for the high-scale parameters to make the Higgs \(10^{15}\) times lighter than the reduced Planck scale, you will see that they're unnaturally fine-tuned conditions requiring some dimensionful parameters to be in some precise ranges.
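To get a feeling for the numbers, here is a schematic sketch – my simplification for illustration, not the actual renormalization-group computation – in which the physical Higgs mass-squared arises as a tiny difference between two GUT-scale quantities:

```python
# Toy illustration of quadratic sensitivity (a schematic sketch, not the
# full RG treatment): suppose the physical Higgs mass-squared arises as
#     m_H^2 = m0_sq - delta,   with delta of order M_GUT^2.
m_H = 125.0            # GeV, the LHC-measured Higgs mass
M_GUT = 1.0e16         # GeV, an assumed GUT scale
delta = M_GUT ** 2     # schematic size of the quantum correction

m0_sq = m_H ** 2 + delta   # the bare high-scale parameter this requires

# Relative precision to which m0_sq must be chosen so that the
# difference lands at the electroweak scale rather than the GUT scale:
tuning = m_H ** 2 / m0_sq
print(f"required relative tuning ~ {tuning:.1e}")   # of order 1e-28
```

The output is of order \(10^{-28}\), i.e. the square of the \(10^{-15}\) mass ratio, which is what "dimensionful parameters in some precise ranges" means quantitatively.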
More generally, it's very important to distinguish true physical insights and true physical problems from artifacts of a formalism. One common misconception is the belief of some people that if space is discretized, converted to a lattice, a spin network, or whatever, one cures the problem of non-renormalizability of theories such as gravity.
But this is a deep misunderstanding. The actual physical problem hiding under the "non-renormalizability" label isn't the appearance of the symbol \(\infty\), which is just a symbol that one should interpret rationally. We know that this \(\infty\) as such isn't a problem because at the end, it gets subtracted in one way or another; it is unphysical. The main physical problem is the need to specify infinitely many coupling constants – coefficients of the arbitrarily-high-order terms in the Lagrangian – to uniquely specify the theory.

The cutoff approach makes this clear because there are many kinds of divergences that differ from one another, and each of these divergent expressions has to be "renamed" as a finite constant, producing a finite unspecified parameter along the way. But even if you avoid infinities and divergent terms from scratch, the unspecified parameters – the finite remainders of the infinite subtractions – are still there. A theory with infinitely many terms in the Lagrangian has infinitely many pieces of data that must be measured before one may predict anything: it remains unpredictive at any point.
In a similar way, the fine-tuning required for the high-energy parameters is a problem because, using Bayesian inference, one may argue that it was "highly unlikely" for the parameters to conspire in such a way that the high-energy physical laws produce e.g. the light Higgs boson. The degree of fine-tuning (parameterized by a small number) is therefore translated into a small probability (given by the same small number) that the original theory (a class of theories with some parameters) agrees with the observations.
When this fine-tuning is of order \(0.1\) or even \(0.01\), it's probably OK. Physicists have different tastes regarding what degree of fine-tuning they're ready to tolerate. For example, many phenomenologists have thought that even a \(0.1\)-style fine-tuning is a problem – the little hierarchy problem – that justifies the production of hundreds of complicated papers. Many others disagree that the term "little hierarchy problem" deserves to be viewed as a real problem at all. But pretty much everyone who understands the actual "flow of information" in quantum field theory calculations as well as basic Bayesian inference seems to agree that fine-tuning and the hierarchy problem become a real problem when they are too severe. The problem isn't necessarily an "inconsistency", but it does mean that there should exist an improved explanation of why the Higgs is so unnaturally light. The role of this explanation is to modify the naive Bayesian measure – with a uniform probability distribution for the parameters – that made the observed Higgs mass look very unlikely. Using a better conceptual framework, the prior probabilities are modified so that the small parameters observed at low energies are no longer unnatural, i.e. unlikely.
Symmetries such as supersymmetry and new physics near the electroweak scale are two major representatives of the solutions to the hierarchy problem. They eliminate the huge "power-law" dependence on the parameters describing the high-energy theory. One still has to explain why the parameters at the high energy scale are such that the Higgs is much lighter than the GUT scale, but the amount of fine-tuning needed to explain such a thing may be just "logarithmic", i.e. "one in \(15\ln 10\)" where 15 is the base-ten logarithm of the ratio of the mass scales. And this is of course a huge improvement over fine-tuning at the precision of "1 in 1 quadrillion".
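The contrast between power-law and logarithmic fine-tuning can be put into two numbers (using the \(10^{-15}\) mass ratio from the text; treating the tuning degree directly as a probability is the heuristic described above):

```python
import math

ratio = 1.0e-15   # rough Higgs mass / reduced Planck mass, from the text

# Power-law sensitivity: the probability of the required conspiracy
# among the high-scale parameters is of order the ratio itself.
p_power = ratio

# Logarithmic sensitivity (as with supersymmetry): the "price" drops to
# roughly one part in ln(1/ratio) = 15 ln(10), about 35.
p_log = 1.0 / math.log(1.0 / ratio)

print(f"power-law fine-tuning   ~ {p_power:.0e}")   # ~ 1e-15
print(f"logarithmic fine-tuning ~ {p_log:.3f}")     # ~ 0.03
```

A one-in-\(10^{15}\) conspiracy demands an explanation; a one-in-35 coincidence is the kind of thing that happens all the time.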
snail feedback (3):
Could you explain why the error margin should be smaller than \((E_{low}/E_{high})^k\) rather than \(\log(E_{low}/E_{high})\)?
Because the first number is of order \(10^{-15}\) or even smaller (for the Higgs/GUT case), which produces the same tiny probability, and such a tiny probability of something special is equivalent to a more-than-5-sigma (one part per million chance) proof that something is going on.
The logarithm is just of order 15 or a small multiple of it, and things occurring with a chance of one in 15 are common; the corresponding confidence level is even smaller than 2 sigma in this case and may be ignored.
The difference is all about the size of the numbers. Certain things may be expected to occur by chance, others are less likely and have to have a more detailed reason.
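The sigma language in the reply above can be checked against the one-sided Gaussian tail probability (the 5-sigma and 2-sigma values are standard statistics; mapping the fine-tuning numbers onto them is the heuristic from the comment):

```python
import math

def one_sided_p(sigma):
    """One-sided Gaussian tail probability at `sigma` standard deviations."""
    return 0.5 * math.erfc(sigma / math.sqrt(2.0))

# A chance well below one part per million lies beyond particle physics'
# 5-sigma discovery threshold; a one-in-35 chance doesn't even reach 2 sigma.
print(f"p(5 sigma) ~ {one_sided_p(5.0):.1e}")   # ~ 2.9e-7
print(f"p(2 sigma) ~ {one_sided_p(2.0):.1e}")   # ~ 2.3e-2
```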
Thanks for this post, it's really a nice survey (I wasn't aware of the hierarchy problem).