Tuesday, August 11, 2020

Making big plans

Occasionally, you have an idea, and you can do the required research within a couple of weeks. But this is the rare exception. Most research requires months, and often years, to complete. In particle physics, with its huge experiments running for decades, people are probably even more aware of this than in many other fields. This requires plans. A very recent example of such a plan is the European Strategy for Particle Physics (Update), in which all of Europe came together to make a plan. I have contributed to this by coordinating the theory input for the national Austrian roadmap. It is a huge effort to get everyone to agree on what to do next - and what to do in the next half a century. Because this is how long you have to plan in advance for the big experiments.

Aside from these big plans, there are also smaller ones. Even for me as a theoretician. Occasionally, I have to sit down and formulate a research plan for a couple of years into the future. The reason is often that I am writing a so-called grant proposal to get a considerable amount of money to hire postdocs and PhD students. Such a large proposal requires you to formulate what you want to do with all these people, usually for about five years. Last year, we already got one, for dark matter.

This year, I am writing another one. Why again, if we just got one? Well, on the one hand, each would take up roughly half my time. So I can manage both, and thereby do more. But putting this up front is cheating. The main reason is that it is unlikely I will get it on the first attempt. As there are currently many more people wanting to do particle physics than there are resources allocated for this purpose at the national and international level, these resources need to be distributed. Thus, you write a proposal to get some of them. Then some panel judges the submitted proposals and decides who will get resources - and thus where efforts in particle physics will be concentrated. Usually, there are many more proposals the panel would like to fund than there are resources available, and so small points tip the scale in favor of one proposal or another, and the others are rejected. One can then try again. On average, less than one in five proposals is successful. Thus, you often need to try again with an optimized proposal. And so I am already submitting another one.
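To illustrate what a success rate of one in five means for repeated attempts, here is a small back-of-the-envelope sketch. The rate p = 0.2 is taken from the "less than one in five" figure above; treating attempts as independent is my own simplifying assumption, not something the funding process guarantees.

```python
# Illustrative arithmetic: repeated, independent proposal attempts,
# each succeeding with an assumed probability p = 0.2.

p = 0.2  # assumed per-proposal success rate ("less than one in five")

# Expected number of attempts until the first success
# (mean of a geometric distribution).
expected_attempts = 1 / p  # 5.0

def prob_success(n, p=p):
    """Probability of at least one success within n attempts."""
    return 1 - (1 - p) ** n

print(expected_attempts)           # 5.0
print(round(prob_success(3), 3))   # 0.488 - even three tries are no guarantee
```

So resubmitting an improved proposal after a rejection is simply the expected course of events, not a sign of failure.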

Coming back to the original topic: for such a proposal I need to make a five-year plan. Of course, it's research. Nobody can guarantee that I (or, more likely, someone else) will not discover something which requires a fundamental change of plans. This is always allowed. But you are still required to make a plan for what you want to do if nothing unexpected happens. Usually, in my personal experience, about half of what is planned will be done, and the rest of the resources is spent on unexpected stuff. Which is just as well.

Still, you need to make a plan for the case that everything happens as you would currently expect. And that is what I did.

The first thing you need to decide is which part of your research you would like to base it on. If you have been reading my blog for a while, you may have seen that I actually work on quite a lot of different topics, ranging from neutron stars to quantum gravity. But not all of this research is something I would like to extend at this level. The neutron star physics is something I currently do not work too much on. It is very interesting. But I would need to focus much more effort on it, and would have to concentrate mainly on technical details. That is not what I currently want. The quantum gravity part is very exciting, and we are quickly developing new ideas. There is much more to come. But it is currently at too exploratory a stage for me to be able to formulate a large-scale five-year program. This will have to cook for a little while longer before it warrants this kind of attention.

So, I am down to my Higgs physics and beyond-the-standard-model research. For the latter, we currently have enough people working on it. Also, it is a bit more speculative, as we have not yet seen anything new in experiments. It is thus somewhat harder to identify where to concentrate one's efforts. The combination of our current research and what the next few years of experiments, especially LHC Run 3, will bring will make this clearer.

So this time I concentrate on our attempts to find a new, subtle effect predicted by theory in experiments: that there is an additional Higgs contribution inside the proton.

So far, what we did was make a good guess and then look at whether experiment told us we were right. Iterating this would be a time-honored approach to identifying a new effect. But for this plan, I wanted to be more ambitious. I wanted an actual prediction, rather than just guessing and iterating. This is very demanding. As a suitable tool, I chose simulations. While I will not be able to really simulate an actual proton and its Higgs content, the effort made possible by such a big grant should be enough to get a decent proxy for it. Something which is close enough to the real thing that I can move from a guess to something which only requires a few more numbers, which I can get from experiments. That would be a huge success. We would then use slightly different methods to fix these numbers.

But this is not easy. Based on what we have learned so far, this is a big endeavour. At least for a theoretician. I estimated that I will need about four people with a PhD, plus myself, and five more doing a PhD to get there. Not to mention that many master's and bachelor's students will be able to work on this as well. This also means that several of the PhD students in particular will work on this project but complete their PhD on only a part of it, and be done before the whole project is. This required me to break the project down into smaller work packages, 17 in total. Each of them is a milestone in itself and provides intermediate (and eventually final) results. Each requires several of the people, and each takes at least half a year, some even a year. I needed to make a plan for how they intersect with and depend on each other. If you are interested in how such a thing looks in the end (it has a lot of tech babble in it), contact me. But it is actually not that different from any other large-scale project, even in industry, like building a house. Thus, you also need some project-management skills to do research. Even as a theoretician.
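Ordering work packages by their dependencies is a standard scheduling problem, and it can be sketched in a few lines. The package names and dependencies below are entirely made up for illustration (the real proposal has 17 packages whose contents are not spelled out here); the point is only that a dependency graph determines which packages can start when.

```python
# A toy sketch of work-package scheduling. Package names and their
# dependencies are hypothetical placeholders, not the actual proposal.
from graphlib import TopologicalSorter

# Each key lists the packages that must be finished before it can start.
packages = {
    "simulation-setup":   set(),
    "algorithm-tuning":   {"simulation-setup"},
    "proton-proxy":       {"algorithm-tuning"},
    "higgs-content":      {"proton-proxy"},
    "compare-experiment": {"higgs-content", "proton-proxy"},
}

# A valid execution order: every package appears after its prerequisites.
order = list(TopologicalSorter(packages).static_order())
print(order)
```

With real durations attached to each node, the same graph also yields the critical path, i.e. the minimum total runtime of the project.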

I am quite pleased with how it turned out in the end. It really has a good flow, and a succession of reasonable and manageable steps. In the end, it holds the promise of a guaranteed discovery - i.e. we will see a new physics effect; as long as we just keep on with the experiments, it will happen. Likely by the end of the runtime of the LHC, in about 15 years. Or, at the latest, with the next generation of machines, which are part of the Strategy mentioned at the beginning. With this, I come full circle: my small research project ties into the big ones. And together, we push the boundaries of human knowledge just a bit further.

Monday, July 27, 2020

What happens if gluons meet?

We have published a new paper on how gluons, the carrier particles of the strong force, interact. In fact, on how exactly one gluon interacts by being absorbed or emitted by another one. There can also be interactions involving more of them. These are much more complicated to determine, and so we concentrate on this simplest one.

You may ask yourself how we can not yet know this and still do things like calculate the mass of a hadron. And why we do not even bother with more than this simplest process. Because for the proton we need to know what gluons do, right? Well, not exactly. When we want to calculate the properties of a proton, we need to know only how they interact in a particular kind of average. We do not need to resolve the full details. But if we really want to understand how they interact in detail, this is not enough. And this is crucial if we want to be able to build up not only the proton, but any particle or thing we want to measure. Being able to do one particular averaging well enough is not sufficient to do all of them as well.

In fact, this way of interacting is the simplest way gluons can interact. Because of this, we already know quite a bit about it when the gluons are very energetic. But we know less about how they interact when they have little energy or travel over very long distances. And there a surprise arose some years back. It was raised in a much older work by myself and other people. It indicated that the gluons undergo a drastic change when they start to traverse distances of the order of the size of a proton or even further (inside bigger hadrons, because of confinement). It appears that at distances of the order of a proton diameter they stop interacting. But at even longer distances their interaction becomes much stronger again. This is, of course, a very interesting insight into what happens, in a sense, at the boundary of a proton.

We used simulations for this back then. But we were very limited at the time because of the available computing power. This was aggravated by the fact that I was then working as a postdoc in Brazil. Which, as a disadvantaged country, has bright minds, but far fewer resources than I have nowadays in Austria. At any rate, the result nonetheless got people excited, and there have been a lot of follow-up works since then. Still, while most results supported the indications, it has not yet been possible to give a fully satisfactory answer.

In our latest work, we picked up the idea of looking at the behavior in a world with one direction less. This saves a lot of computing time. But even there a final answer had been missing. This we have now provided. There is a clear answer, confirming the behavior described above: the interaction first gets weaker, until it vanishes at roughly a (flat) proton across, and then quickly becomes much stronger.

Still, doing the same in our world was too expensive. But we used a trick. Having the results from fewer dimensions, we knew what to anticipate. So we used this information to test our world for consistency. And this checked out surprisingly well. In fact, we could even predict how much more computing time would be needed for a final confirmation in our world as well. It could be done in the next few years. So hang around just a little longer for the final answer.

And, perhaps, we can then also do more complicated interactions. But this is a really tedious business. So you need patience and a long-term perspective.

Tuesday, April 21, 2020

A more complicated photon

The photon - the particle which makes up light - is probably one of the best-known elementary particles. Nonetheless, everything can be made more involved. Thus, we studied a more complicated version of it in our most recent paper.

"Why in the world should we do this?" is a valid question at this point. The answer developed in multiple stages. I have written quite some time ago that for a particle physicist it is baffling that chemistry works. Chemistry works, among other things, because the electric charges of nuclei and electrons are perfectly balanced. Well, as perfect as we can measure, anyhow. In the standard model of particle physics there is no reason why this should be the case. However, the standard model is mathematically consistent only if this is the case. In fact, only if the balance is really perfect. Mathematical consistency is not a sufficient argument for why a theory needs to be correct. Experiment is. So people have investigated this baffling fact for decades. In this process, the idea came up that there is a reason for it. And that reason would be that the standard model is only a facet of an underlying theory. This underlying theory enforces the equality of the electric charges by combining the weak, strong, and electromagnetic forces into one force. Such theories are called grand-unified theories, or GUTs for short.

Such GUTs use a single gauge theory to combine all these forces. This is only possible with a certain kind of gauge theory, which is fundamentally different from the one we use for electromagnetism alone. It is more similar to those of the strong and weak forces. We have investigated this type of theory for a long time. And one central insight is that in such theories none of the elementary particles can be observed individually. Only so-called bound states, which are made from two or more elementary particles, can be. That is very different from the ordinary photon of electromagnetism, which is, essentially, elementary.

The central question was therefore whether any such bound state could have the properties of the photon which we know from experiment. Otherwise a GUT would (very likely) not be a possible candidate to explain the balance of electric charges in chemistry. The photon has three important features. It carries no electric charge itself. It is massless. And it has one unit of so-called spin.

Thus we needed to build a bound state in a GUT with these properties. The spin and the absence of charge are actually quite simple, and you get them almost for free in any GUT. It is really the requirement that it should have no mass which makes it so complicated. It is even more complicated to verify that it has no mass.

We had some ideas from our previous work, using pen-and-paper calculations, of how this could work. There had also been some numerical simulations looking into similar questions in the early 1980s, though they were, given the resources back then, very exploratory. So we set up our own, modern-day numerical simulations. However, it is not yet possible to simulate a full, realistic GUT. For this, all the computing power on earth would not suffice if we want to be done within a lifetime. So we used the simplest possible theory which had all the features relevant to a true GUT. This is an often-employed trick in physics. One reduces a problem to the absolutely essential features, throwing away the rest, which has no or little impact on the particular question at hand. And thereby one gets a manageable problem.

So we did. And due to some ingenious ideas of my collaborators, especially my PhD student Vincenzo Afferrante, we were able to perform the simulations. There was a lot of frustrating work in the first few months, actually. But we persevered. And we were rewarded. In the end, we got our massless photon in exactly the way we had hoped! We thus demonstrated that such a mechanism is possible. We got a massless photon made up out of elementary particles! A huge success for the whole setup. In addition, the things which make up the photon are (partly) very massive. That a bound state can be lighter than its constituents is an amazing consequence of special relativity. For us, this is an added bonus. Because you cannot see that a particle is made up of other particles if you do not have enough energy available to create the constituents. Again, this comes from the theory of relativity. In this scenario, one of the constituents is indeed so heavy that we would not yet be able to produce it in experiments. Hence, with our current experiments, we would not yet detect that the photon is made up of other particles. And this is indeed what we observe. So everything is consistent. Very reassuring. Unfortunately, it is so heavy that none of the currently planned experiments would be able to produce it either. Hence, we will not be able to test this idea directly in experiment. We will need to resort to indirect evidence.

Of course, I gloss over a lot of details and technicalities here, which took most of our time. Describing them would fill multiple entries.

Now, the only thing we need to do is figure out whether anything we neglected could interfere. None of it will at a qualitative level. But, of course, we have very good experiments. And thus, to make the whole idea a suitable GUT to describe nature, we also need to get it quantitatively correct. But this will be a huge step. Therefore we broke it down into small steps. We will do them one by one. Our next step is to get the electron right. Let's see if this also works out.

Tuesday, February 18, 2020

What is a proton made of?

We have published a new paper, which has quite a bold topic: That a proton has a bit more structure than what you usually hear about.

Usually, you hear that a proton is made up of three quarks, two up quarks and one down quark, the so-called valence quarks. Valence particles provide the proton with its characteristic properties, like its electric charge and spin. In addition, every other particle can also appear inside the proton, as a so-called sea particle. But these are quantum fluctuations, which are only very short-lived. Their existence has been tested in experiments for gluons, strange quarks, charm quarks, and bottom quarks, as well as photons. We understand this relatively well. Their contribution gets smaller the larger their mass is. So what do we want to add?

Those who have been reading this blog for longer have already seen that one of the central topics of our research is the weak interactions and the Higgs. Especially, we figured out that this part of the standard model of particle physics is more involved than is usually assumed. Most importantly, mathematical consistency requires that most particles which we usually call elementary are actually more involved bound states, i.e. made up of multiple particles. Such bound states are very different from elementary particles. E.g., they should have a size. And, in principle, this should show up in experiments.

Of course, mathematical consistency is not sufficient for nature to behave in a certain way. Though it is nice if it does. Therefore 'should' is not sufficient. If we are right, it *must* show up in experiments. Unfortunately, as all of this is associated with the Higgs, which is very heavy, this requires a lot of energy. Since there is currently only one powerful enough experiment available, the LHC at CERN, we need to figure out how to test our ideas with this one. Which, unfortunately, is not ideally suited. But you have to make do with what you have.

Already two years back, we figured out that all of these mathematical consistency arguments had a surprising impact on our proton. You see, the proton is one of two very similar particles, the proton and the neutron, the so-called nucleons. They make up all atomic nuclei. The differences between proton and neutron are threefold. Two are their mass and electric charge. They are explained by the valence quarks. The third is the protonness and neutronness - a feature which is called flavor (or sometimes isospin). The aforementioned valence quarks cannot actually be responsible for this quantum number. The argument is very technical, and has a lot to do with gauge symmetry, and especially its more involved aspects. Those who are interested in all the technical details can find it in my review article. Ultimately, it boils down to the fact that this flavor cannot come from the valence quarks. Something else needs to provide it.

This something else should not upset those things which are explained by the valence quarks, the mass and spin. Thus, it needs to be spinless and chargeless. The Higgs is the only particle in the standard model which fits the bill. And it indeed carries something which can provide the difference between protons and neutrons. In technical terms, it is called the custodial quantum number. All that matters here is that this quantity can have two different values, and one can be associated with being a proton and the other with being a neutron, in a mathematically completely consistent way, if the Higgs is another valence particle.

As the Higgs is much heavier than the proton, the immediate question is: how can that be? But here the combination of quantum mechanics and relativity comes to the rescue. It allows a bound state to be lighter (or heavier) than the sum of the masses of its constituents. A hydrogen atom, e.g., is actually lighter than the combined mass of its constituent proton and electron. But only by an extremely small amount. In the proton, this now works in the same way, but hugely amplified. So we have examples showing that this is actually possible. So this is fine.
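How small the hydrogen effect is can be checked with standard textbook numbers (these values are not from the paper, just the usual rest energies and the 13.6 eV ground-state binding energy): the atom is lighter than its parts by roughly one part in a hundred million.

```python
# Back-of-the-envelope check with standard textbook values:
# the hydrogen atom's mass deficit relative to its constituents.

m_p = 938.272e6   # proton rest energy in eV
m_e = 0.511e6     # electron rest energy in eV
binding = 13.6    # hydrogen ground-state binding energy in eV

m_hydrogen = m_p + m_e - binding      # bound state is lighter
relative_deficit = binding / (m_p + m_e)

print(m_hydrogen < m_p + m_e)         # True
print(f"{relative_deficit:.1e}")      # about 1.4e-08
```

For a Higgs as a valence constituent of the proton, the analogous deficit would have to be enormous by comparison, which is why quantum mechanics plus relativity, rather than classical intuition, is the right framework here.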

When we now smash two protons together, as at the LHC, we actually get their constituents to interact with each other. And since we now have additional Higgs content, these Higgs can interact as well. However, this will be suppressed by the large mass of the Higgs, as in this case the interaction is as 'if it was alone'. And then it is heavy. Thus, even at the LHC this will be rare.

What we did in the paper was estimate how rare, and which processes could best be sensitive to this. We find that the LHC so far is not very sensitive to the valence Higgs beyond uncertainties, even if the effect is really there. But we figured out that with the production of top quarks at the LHC we should have a sensitive handle for looking for the valence Higgs.

This is really just the first step in hunting the valence Higgs. And it may well be that we need a more powerful experiment in the future to really see the effect. Not to mention that our estimates are very crude, and a lot of calculations still need to be done much better. But it is the first time that the effect of the valence Higgs, as required by the mathematical consistency of the standard model, is being tested experimentally. And this is a big step into a completely unknown domain. Who knows what we will find along the way.

Thursday, January 9, 2020

A personal perspective on how capitalism hurts science

In a number of my recent blog entries, and also occasionally on twitter, I have made statements about how bad our current late-stage capitalism is for science. It is time that I follow up with a more detailed blog entry on this.

Before delving into it, I should discuss the reasons why I hesitate to write on this subject. Those who have read my scientific blog entries may have noticed that I work on many ideas which are unconventional. While I do my best to back them up with many different types of calculations, I have not (yet) been able to get these issues across as important. Thus, although there have been quite a number of people in the past who worked on these subjects, and my own results are in line with theirs, there are very few contemporary people doing so. It is quite easy to be frustrated about this, especially since I think there are important things which need to be taken into consideration. Because they may change a lot of particle physics on a very fundamental level.

If you are in such a situation, it is very tempting to seek the blame for your continued failure to make your stuff popular in some external reason. Hence, I am very much second-guessing myself as to whether part of what I write here is affected by this. Probably part of it is. If I were the only one having these thoughts, it surely would be the case. However, over recent years I have seen more and more studies being published or popping up on the arXiv which agree with my own perception. Hence, I am more and more convinced that a larger issue is at work here. And whether I am affected by it or not is not easy to say. Hence, I will try to avoid making any personal connections here, and just tell how, from my perspective, I see the results of these studies realized. Most of the studies I have linked on twitter over time.

The gist of many of these studies is twofold. The way research results are published and perceived is not necessarily correlated with their relevance. In fact, there appears to be an anti-correlation between long-term relevance (measured by number of citations) and the impact factor of the journal in which the research has been published. Meaning more prestigious journals tend not to accept research where the short-term relevance is not obvious. On the other hand, also in funding there is a strong tendency that those who have get more, and that bold claims count for more than well-founded statements or even checks.

While these issues are troublesome on their own, it is the way they resemble other elements of public life which is alarming. To say the least. And which is typical for late-stage capitalism. This is the fact that those who have get more. That those who have, or have the favor of someone who has, can do essentially anything, and get rewarded. While those who do not have, have a hard time getting anything. This is amplified by gate-keeping and a lack of diversity in academia, which is far from resolved. Of course, this is also a problem appearing in society in general.

In my personal experience, this manifests itself in a very strong tendency to create hype. If the result is only promising enough, any assumption, even if it is just wishful thinking, becomes acceptable. Theoreticians seem to be much more prone to this than experimentalists. The reason is simple. As long as no one disproves your statement, you will get attention. And if somebody who has picks it up and promotes it (or is actually its origin), it will gain traction. If it fails eventually, you just cook up another thing, and so on. In particle physics, this is supported by our current lack of hard experimental evidence beyond the standard model. Thus it is easy to escape experimental falsification. Theoretical falsification is much more complicated. Because in sufficiently complicated theories, doing an exact falsification is technically hard. Even if there is a lot of evidence, it is always possible to find a loophole so as not to accept a falsification. And given the promises made, it is for most much better to just ignore any claim of invalidity. Especially since many of these assumptions simplify, or even trivialize, calculations. Hence, it is possible to get results with little effort. And since they promise so much, it is easy to publish them or get funding for them.

This even happens, in a less dramatic fashion, quite often. Even without anything going wrong, any new field at first has a lot of simple problems. They can be done with little or moderate effort. Thus, the return on investment is large. Therefore many people flock to these new fields, to have a large output compared to the work invested. Thereby, they gain resources. As soon as the inevitable complications set in, most of them leave the field and move on to the next field of the same type. However, they take the resources with them, leaving those trying to solve the hard problems with little. While resources are limited in any case and it is necessary to focus efforts, this should be decided based on the relevance of the question, rather than on how easy it is to get results.

All of this mirrors trends in society. As long as one can get much without solving an actual problem, everyone goes for it. And if you can gain an advantage by making overly strong claims, all the better. We see how this damages our society, from the climate crisis to the rise of authoritarianism. All of it follows this pattern. You claim that there is an easy solution by which you can make a profit and avoid investing in solving the causes of the climate crisis. See greenwashing. Or you claim social problems have an easy solution, because others are at fault, so you just need to get rid of them. Yielding the rise of right-wing extremism and authoritarian systems. All of this is fueled by capitalism, which puts profits before solutions.

And these effects find their mirror in science, as science is not set apart from society. Thus, capitalistic thinking - gathering resources, which in science means renown and funding - becomes more important than the actual solution of problems.

How can this be avoided? Well, probably the same way as in society at large. That which damages the scientific process needs to be gotten rid of. A scientific system which focuses on what people did instead of who did it, and a distribution of resources based on the relevance of the problem rather than on renown or promises, would probably go a long way. This has been recognized by quite a number of people. And there are tentative steps ongoing. Like banishing renown as a measure of success. Putting the actual work at the center, rather than how and where it is published. But it is a slow process, and one which can again be misused. Probably, only if we as a society change fundamentally will science get closer to its ideals.