A recurring topic in our research is the joys and sorrows of the redundancies in our description. As I have discussed several times, introducing these redundancies makes life much easier. But this can turn against you if you need to make approximations, which, unfortunately, is usually the case. Still, their benefits outweigh the troubles.

One of the remarkable consequences of these redundancies is that they even affect our description of the most fundamental particles in our theories. Here, I will concentrate on the gluons of the strong interaction (QCD): on the one hand because they play a central role in many phenomena, but more importantly because they are the simplest particles exhibiting the problem. This follows the old strategy of divide and conquer: solve the simplest problem first, and continue from there.

Still, even the simplest case is not easy. The reason is that the redundancies introduce auxiliary quantities, which act like imaginary particles. These phantom particles are also called ghosts because, just like ghosts, they do not really exist; they are only there in our imagination. More precisely, they are called Faddeev-Popov ghosts, after the two people who first introduced them.
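For the technically inclined, the way such ghosts appear can be sketched with the textbook Faddeev-Popov construction (this is the generic construction for some gauge condition G(A)=0, not the specific setup of my own calculations): the determinant that arises when fixing the redundancy is rewritten as an integral over a pair of auxiliary fields c and c-bar,

\[
\det M \;=\; \int \mathcal{D}\bar c\,\mathcal{D}c\;
\exp\!\Big(i\int d^4x\;\bar c^a\, M^{ab}\, c^b\Big),
\qquad
M^{ab} \;=\; \frac{\delta G^a(A^\alpha)}{\delta\alpha^b}\,,
\]

so the ghost fields are nothing but a bookkeeping device for this determinant: they enter the calculation exactly like particles, although no such particles exist.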

Thus, whenever we calculate quantities we can actually observe, we see no trace of these ghosts. But directly computing an observable quantity is often hard, especially in pencil-and-paper calculations. So we work stepwise, and in these intermediate steps the ghosts do show up. Because they only encode information differently, rather than adding any, their presence also affects the description of the 'real' particles at these intermediate stages. Only at the very end would they drop out, and only if we could do the calculations exactly.

Understanding how this turns out quantitatively is something I have been working on for almost a decade, with the previous results published almost a year ago. Now I have made a little more progress. But progress on this problem is rather tough, so there are usually no big breakthroughs. It is much like grinding in an MMO: you need to accumulate little bits of information to perhaps, eventually, understand what is going on. And this is once more the case.

I recently presented the results of the latest steps at a conference. A summary of this report is freely available in a write-up for the conference proceedings.

I found a few new bits of information. One was that we had certainly underestimated the seriousness of the problem. That is mainly because most such investigations have so far been done using numerical simulations. Even though we ultimately want to do pencil-and-paper calculations, checking that they work is easier with numerical simulations.

However, numerical simulations are expensive, and one is therefore limited in them. I have extended the effort and was able to get a glimpse of the size of the problem. I did this by simulating not only the gluons, but also the extent to which we can probe the problem. By seeing how the problem depends on how well we can probe it, I could estimate a lower bound on how big it will eventually become.
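The general idea behind such an estimate can be illustrated with a toy example (this is only a schematic sketch, not my actual analysis; the 1/L form of the corrections and all numbers here are hypothetical): measure a quantity at several finite probe sizes L, fit how it changes with L, and extrapolate to the limit of a perfect probe.

```python
# Toy illustration: extrapolate a quantity measured at finite probe
# sizes L to the L -> infinity limit, assuming (hypothetically) that
# the finite-size corrections behave like 1/L.
import numpy as np

def extrapolate_infinite_volume(L, values):
    """Fit values(L) = a + b/L and return a, the L -> infinity estimate."""
    coeffs = np.polyfit(1.0 / np.asarray(L, dtype=float),
                        np.asarray(values, dtype=float), 1)
    return coeffs[1]  # intercept at 1/L = 0

# Synthetic measurements at growing probe sizes; the underlying
# "true" infinite-volume value here is 1.0 by construction.
L = [8, 12, 16, 24, 32]
vals = [1.0 + 2.0 / l for l in L]
print(extrapolate_infinite_volume(L, vals))
```

The point of the sketch is only the logic: even when no single simulation reaches the limit one cares about, the trend across affordable simulations constrains it.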

Actually, the result was somewhat unsettling, even though it is not hopeless. One reason why it is not hopeless is how the problem is distributed. It turned out that the aforementioned ghosts carry the brunt of it. This is good, as they will cancel out in the end. Thus, even if we cannot solve the problem completely, it will not have as horrible an impact as was imaginable. We can therefore have a little more confidence that what we do actually makes sense, especially when we calculate something observable.

You may say that we could use experiments to check our approximations. It appears easier; after all, experiments are what we want to describe. Or are they? This is certainly true when we are thinking about the standard model. But fundamental physics is nowadays geared more towards the unknown, and as a theoretician I try to predict the unknown as well. If my predictions are invalidated by my approximations, what good can they be? Knowing that they are not as badly affected as they could be is therefore more than valuable; it is necessary. I can then tell the experimentalists with more confidence where they should look, with at least some justified hope that I am not leading them on a wild goose chase.
