Monday, February 11, 2008
Kilometre-long tunnels in the ground, six nuclear reactors, 200 scientists, giant acrylic vessels containing 20 tonnes of liquid scintillator. It sounds like the plot of a James Bond movie, but actually it is the Daya Bay experiment.
This experiment aims to measure the neutrino mixing angle theta_{13}. With two of the three mixing angles pretty well constrained by data on atmospheric and solar neutrino oscillations, this angle is the least well-known of all the parameters in the neutrino-mixing matrix. (Well, except for maybe one...but more on that later.) Daya Bay will either measure it, or---if it is very small---improve the present limit by a factor of 15. In order to do this one wants to optimize the distance so that electron-type neutrinos are at their "oscillation minimum", i.e. the point where the maximum number of them have disappeared. The best distance turns out to be about 2 km. (You might want to ask why you don't want to go to the "next minimum" in the oscillation pattern, the one corresponding to a phase of 3 Pi/2, rather than Pi/2.)
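For concreteness, here is the standard two-flavour survival probability that sets that baseline (a sketch: I am ignoring the small solar-driven term and plugging in textbook values, Delta m^2_{31} ~ 2.5 x 10^{-3} eV^2 and a typical reactor anti-neutrino energy E ~ 4 MeV):

  P(anti-nu_e -> anti-nu_e) ~ 1 - sin^2(2 theta_{13}) sin^2( Delta m^2_{31} L / (4E) )

Disappearance is maximal when Delta m^2_{31} L / (4E) = Pi/2, which with the numbers above gives L ~ 2 km. The "next minimum", at 3 Pi/2, would sit out near 6 km.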
Now nuclear reactors are just such a source: a typical uranium-fuelled reactor produces about 2 x 10^{20} electron anti-neutrinos per second for each GW of thermal power. So in the Daya Bay "far" detector they expect to see about 90 neutrino events per day. Which sounds like a lot until you realize that the "far detector" consists of four vessels, each containing 20 t of liquid scintillator. That scintillator ensures that when the anti-neutrino strikes a proton, converting it to a neutron and a positron, both the neutron and the positron can be detected, with a time gap between them that is an (almost) unique signature of the process.
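That per-second yield is easy to sanity-check with textbook inputs (the numbers below are my own back-of-envelope assumptions, not Daya Bay's: roughly 200 MeV released per fission and roughly six anti-neutrinos per fission):

  # Sanity check of the reactor anti-neutrino yield quoted above.
  # Assumed inputs: ~200 MeV per fission, ~6 anti-neutrinos per fission.
  E_PER_FISSION_J = 200e6 * 1.602e-19   # 200 MeV converted to joules
  NU_PER_FISSION = 6.0

  def antineutrino_rate(thermal_power_gw):
      """Anti-neutrinos emitted per second at the given thermal power."""
      fissions_per_s = thermal_power_gw * 1e9 / E_PER_FISSION_J
      return fissions_per_s * NU_PER_FISSION

  print(f"{antineutrino_rate(1.0):.1e} per second per GW(th)")      # ~1.9e20
  print(f"{antineutrino_rate(17.4):.1e} per second from 17.4 GW(th)")  # ~3.3e21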
Other experiments have tried this approach before, but one limiting factor has been knowledge of the neutrino flux from the reactors. Daya Bay will solve this problem by taking the ratio of the neutrino rate between "near" detectors and "far" detectors. That technique, combined with the large volume of the detector, and with the proximity to 17.4 GW of thermal power that is feeding China's growing energy needs, means that Daya Bay can "nail" theta_{13}. If, that is, they can ensure that what is going on at the near and far detectors is essentially identical.
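Schematically (my notation, just to show the cancellation), the rate in each detector factorizes into the reactor flux Phi, the number of target protons N, the detection efficiency eps, the distance L, and the survival probability P:

  R_near ~ Phi x N_near x eps_near x P(L_near) / L_near^2
  R_far  ~ Phi x N_far  x eps_far  x P(L_far)  / L_far^2

  R_far / R_near = (N_far eps_far / N_near eps_near) x (L_near / L_far)^2 x P(L_far) / P(L_near)

The poorly-known flux Phi drops out of the ratio. What remains are the relative target masses and efficiencies, which is exactly what the next paragraph worries about.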
So this experiment becomes all about controlling systematics. Is the scintillator in the near and far detectors exactly the same? Do the PMTs in the near and the far detectors function in the same way? (This is where regular calibration of detectors becomes key.) And it becomes about removing backgrounds, especially in the "far" detector where the count rate is a factor of ten lower. So when Dr Liu spoke on this topic last Thursday he spent some time explaining the cuts they make in their data to remove background events, e.g. cosmic-ray muons producing Lithium-9 in the scintillator, which then beta-decays to produce a coincidence signal very like that seen when one of the reactor anti-neutrinos interacts with the target. Obviously it is a very delicate business to remove _only_ those events, and none of the signal, and to do so consistently between the near and far detectors. But if those background events are not properly removed they will mess up the measurement of the ratio that is the goal of this experiment.
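To make the coincidence idea concrete, here is a toy version of a delayed-coincidence selection (my own illustration, not Daya Bay's analysis code; all the window values are placeholders I chose for the example):

  # Toy delayed-coincidence cut: pair a positron-like "prompt" hit with a
  # neutron-capture-like "delayed" hit a short time later. Window values
  # are illustrative placeholders, not the experiment's actual cuts.
  from dataclasses import dataclass

  @dataclass
  class Hit:
      t_us: float        # hit time in microseconds
      energy_mev: float  # reconstructed energy in MeV

  def ibd_candidates(hits, t_min=1.0, t_max=200.0,
                     prompt_window=(0.7, 12.0), delayed_window=(6.0, 10.0)):
      """Return (prompt, delayed) pairs consistent with inverse beta decay."""
      hits = sorted(hits, key=lambda h: h.t_us)
      pairs = []
      for i, prompt in enumerate(hits):
          if not prompt_window[0] <= prompt.energy_mev <= prompt_window[1]:
              continue
          for delayed in hits[i + 1:]:
              dt = delayed.t_us - prompt.t_us
              if dt > t_max:
                  break  # hits are time-ordered, so stop scanning
              if dt >= t_min and delayed_window[0] <= delayed.energy_mev <= delayed_window[1]:
                  pairs.append((prompt, delayed))
      return pairs

  # One genuine-looking pair plus an isolated hit:
  hits = [Hit(0.0, 4.1), Hit(28.0, 8.0), Hit(500.0, 2.2)]
  print(len(ibd_candidates(hits)))  # -> 1

The Lithium-9 background is nasty precisely because its beta-decay products can produce a prompt/delayed pair that passes a selection like this one.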
Which might prompt the question: why all this effort? The millions of dollars (or RMB) on digging tunnels, the millions of dollars more on detector development here in the US, the 200 people working hard to design the best experiment they can. Part of the answer is that theta_{13} is "the gateway to CP violation": the possibility that there is a complex phase in the neutrino-mixing matrix like the one in the quark CKM matrix. And CP violation is a hot topic right now in particle physics. But is that the reason the Chinese government is footing most of the bill for this experiment? And that the US Department of Energy is making a significant commitment to it too? While I like to think that the US and Chinese governments care about CP violation, I don't think they care that much.
Wednesday, February 6, 2008
Are we there yet?
Jefferson Lab was built largely to probe the "transition region": that domain of energies and momentum transfers where a description of nuclei in terms of protons and neutrons gives way to a description based on the fundamental QCD degrees of freedom of quarks and gluons.
At very large energies we know that the interactions of quarks and gluons can be computed in perturbation theory (pQCD: perturbative Quantum Chromodynamics). This allowed theorists to predict (already in the 1970s) how various quantities that JLab has now measured should behave in the limit of large energies/momentum transfer. The only problem is that no-one really knew how large was large. Some of these predictions have been borne out, e.g. those for the disintegration of a deuteron by a high-energy photon. But, for the most part, the "asymptotic region", where these predictions based on pQCD become correct, has proven elusive. JLab appears to be stuck in a transition region that covers more kinematic territory than many originally thought.
One observable for which pQCD makes a definite prediction is the "form factor" of a pion. This function encodes how different the pion's charge distribution is from the charge distribution of a point particle. (Yes, you guessed it, the charge distribution of a point particle is that all the charge is at a single point.) The form factor can be accessed via electron-pion scattering experiments. But it is very, very difficult to build a target out of pions. (Yes, that was me exercising my gift for understatement.) So experimentalists have cleverly figured out how to get at the pion form factor in experiments where the pion is "electro-produced" in the interaction of a beam of electrons with a proton target. It turns out that protons (and neutrons too, for that matter) fluctuate into a pion-nucleon state in ways that are governed by quantum mechanics (think Heisenberg Uncertainty Principle). So if you come in with an electron and hit the pion in that "virtual state" you can knock it away from the proton and detect it. Look at that: you scattered an electron from a pion target! Pion form factor here we come!
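To make "encodes how different the charge distribution is" concrete: in the simplest (non-relativistic) picture the form factor is just the Fourier transform of the charge distribution rho(r), normalized so the total charge comes out right:

  F_pi(Q^2) = integral d^3r exp(i q.r) rho(r),   with Q^2 = |q|^2 and F_pi(0) = 1

For a point particle rho(r) is a delta function, so F_pi(Q^2) = 1 at every Q^2. Any fall-off with Q^2 is telling you about the pion's finite size.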
But the problem is that there are a bunch of other mechanisms by which pions can be produced when the electron-proton interaction takes place. So the experimentalists, such as Dr Gaskell, who gave Monday's talk, need a model of these other processes in order to isolate the piece of the reaction they are interested in: the piece where the electron interacts with the pion in that pion-nucleon virtual state. With such a model in hand they can subtract off the other stuff and extract numbers for the pion form factor at a variety of momentum transfers where they have measured the pion electro-production process.
The results they get are intriguing, if vaguely disappointing. They show that the pion form factor is falling with the power of momentum transfer predicted by pQCD (it should go like 1/Q^2). But the pre-factor is off by a factor of a few. In astronomy a prediction of the right power law and a coefficient within an order of magnitude would be a success. However, in this case it has been the cause of some head-scratching by theorists, who have built a variety of different models to try to understand the additional processes (i.e. processes beyond those predicted by pQCD) that are taking place in the regime probed by the JLab experiments. Some of these models suggest that data on the pion form factor taken at an upgraded, 12 GeV, JLab will show the beginnings of an approach to the pQCD pion form factor. But at best it seems that 12 GeV JLab will provide only a glimpse of the promised land where the quarks and gluons inside the pion play together under the benevolent rule of pQCD. So let me ask the question: will all this effort and experimental ingenuity have been worth it if what we get out of this program are some very nice measurements of the pion's form factor in the "transition region", i.e. at Q^2's where pQCD does not apply?
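For reference, the asymptotic prediction being chased is the standard leading-order pQCD result (quoted in the convention where the pion decay constant f_pi is about 93 MeV):

  Q^2 F_pi(Q^2) -> 16 pi alpha_s(Q^2) f_pi^2   as Q^2 -> infinity

So the power law is 1/Q^2, up to the slow logarithmic running of alpha_s, and the pre-factor is completely fixed. It is that pre-factor the JLab data disagree with.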
Friday, February 1, 2008
Deep thoughts
How exactly do you measure a half life that is > 10^25 years? Very very carefully.
(a) You go deep underground, so as to stop cosmic rays masquerading as the decay products you're looking for in your detector. Try 8000' of rock overhead in South Dakota. That oughta do it.
(b) You use ultra-high-purity materials: materials with almost no radioactivity of their own, to stop their decay products from looking like the decay you're looking for. Try materials that are literally a billion times freer of Uranium and Thorium than the dust in our offices. That might be good enough.
(c) Try collecting a tonne of germanium. Then you might have enough nuclei in there that a few will decay in the way you're interested in. (A back-of-envelope count of what a tonne buys you follows this list.) And for good measure, you can use the germanium you've collected as a solid-state detector too. Fortunately germanium works well in that regard. If you can get enough. And pay enough to the Russians to purify it for you.
(d) Wait several years. At least you know that the longer you wait the better lower bound you'll set on neutrinoless double-beta decay.
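Here is that back-of-envelope count promised in (c). All the inputs are my own illustrative assumptions, not any experiment's design values; in particular the half-life is an assumed number sitting just beyond the bound mentioned above:

  # How many decays per year does a tonne of enriched germanium give you?
  import math

  N_A = 6.022e23          # Avogadro's number
  MASS_KG = 1000.0        # one tonne of germanium
  ENRICHMENT = 0.86       # assumed Ge-76 fraction after enrichment
  MOLAR_MASS_G = 76.0     # g/mol for Ge-76
  HALF_LIFE_YR = 1e26     # assumed half-life, just past current limits

  n_nuclei = MASS_KG * 1000.0 / MOLAR_MASS_G * N_A * ENRICHMENT
  decays_per_year = n_nuclei * math.log(2) / HALF_LIFE_YR
  print(f"{n_nuclei:.1e} Ge-76 nuclei -> {decays_per_year:.0f} decays/year")
  # ~6.8e27 nuclei -> ~47 decays/year, before detection efficiency and cuts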
So if you do all that you might, just might, see neutrinoless double-beta decay. (Something to think about: how does your chance of seeing NDBD depend on the different things you do in (a)-(d)?)
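If you want a head start on that question, the textbook figure of merit (my summary; eps is the detection efficiency, M the detector mass, t the running time, B the background rate per unit mass, energy, and time, and Delta E the energy resolution) goes like:

  T_{1/2} sensitivity ~ eps x M x t                       (no background)
  T_{1/2} sensitivity ~ eps x sqrt( M t / (B Delta E) )   (background-limited)

which is why (a) and (b), pushing B down, buy you the linear scaling rather than the square-root scaling from piling on more mass and time in (c) and (d).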
But neutrinoless double-beta decay---if seen---would be a revelation. It would tell us that neutrinos are Majorana particles: that they are their own anti-particles. Personally I think that'd be worth a Nobel prize.
And the rate of neutrinoless double-beta decay is proportional to the square of a linear combination of neutrino masses. So if we see it we learn something concrete regarding the absolute mass scale of the neutrinos: a quantity for which we at present only have upper bounds. (From cosmology and from studies of the end point in tritium beta decay.) Indeed, even if the next generation of neutrinoless double-beta decay experiments sees NOTHING that will still pretty much rule out the "inverted hierarchy" of neutrino masses. It's good when seeing nothing still gets you something.
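Spelled out (this is the standard factorization; G is a calculable phase-space factor and M a nuclear matrix element that theorists must compute):

  1 / T_{1/2}^{0nu} = G^{0nu} x |M^{0nu}|^2 x ( m_bb / m_e )^2,   where m_bb = | sum_i (U_{ei})^2 m_i |

That coherent sum over the mass eigenstates, weighted by squares of the mixing-matrix elements U_{ei}, is the "linear combination of neutrino masses" in question, and it is what lets even a null result exclude the inverted hierarchy.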
But to get anything in terms of physics results will take a long time. In the meantime we learn that everything is radioactive. Dust, plastic, air. Everything.
You know, one experiment of this type---one based in Europe---had a lot of background events coming from the surface of their crystals. The lines indicated Polonium-210 contamination on the surfaces. I heard today that people think this background comes about because the people handling the crystals are smokers, and cigarettes are full of polonium. So smoking really is bad for your health: you get lots of radioactivity on you as a result.
But when a neutrinoless double-beta decay guy says "lots of radioactivity" he probably means 1 decay per kilogram per 20 days or something like that. Because these guys are the ultimate purists. You have to be when you want to find something that only happens once a year in one nucleus out of all the nuclei in a couple of hundred kilograms of stuff. Like I said, delayed gratification. Big time.