Archive for the ‘high energy physics’ Category

University of California, Santa Barbara
Tenure-Track Faculty Position
Theoretical High Energy Physics
Job #JPF00230

The Physics Department of the University of California, Santa Barbara, is seeking candidates for a tenure-track faculty position at the Assistant Professor level in theoretical high energy physics, or theoretical astrophysics, with an appointment to start in Fall of 2014. We are particularly interested in candidates with interests in the phenomenological areas of particle physics and related areas of astrophysics and cosmology. The ideal candidate will benefit from interactions with the strong existing groups in high energy theory and experiment, as well as the presence of the Kavli Institute for Theoretical Physics.

Candidates are expected to have a Ph.D. in physics or a closely related field, and will teach a range of courses in the physics department. Applicants must send a statement of research interests, a curriculum vitae, and a list of publications, and should arrange for at least three letters of recommendation. All application materials should be submitted via UC Recruit: https://recruit.ap.ucsb.edu

Applications will be considered starting November 20, 2013 and will be accepted until the position is filled.

The department is especially interested in candidates who can contribute to the diversity and excellence of the academic community through research, teaching and service. The University of California is an Equal Opportunity / Affirmative Action Employer.

Read Full Post »

It’s a fine day for the Universe to die, and to be new again! Well, maybe not, but the Internet is abuzz with a reincarnation of the unstable-universe story. (You can also see it here, or here; the whole thing is trending on Google.) In other words, this is what is known as tunneling between vacua. And if you have followed the news about the Landscape of vacua in string theory, this should be old news: we may live in an unstable Universe, and we don’t know whether we do. For some reason, this wheel gets reinvented again and again under different names. All it takes is one paper, or conference, or talk to make it sound exciting, and then it’s “Coming attraction: the end of the Universe …. a couple of billion years in the future“.

The basic idea is very similar to superheated water and the formation of vapor bubbles inside it. What you have to imagine is that you are in a situation with a first order phase transition between two phases. Call them phase A and phase B for lack of better words (say superheated water and water vapor). You have to assume that the energy density in phase A is larger than the energy density in phase B, and that you happen to have a big chunk of material stuck in phase A. This can happen in some microwave ovens, and you can get water explosions if you don’t watch out.

Now let us assume that someone happens to nucleate a small (spherical) bubble of phase B inside phase A, and that you want to estimate the energy of the new configuration. For simplicity you can make the approximation that the wall separating the two phases is thin, and that there is an associated wall (surface) tension \sigma that accounts for the energy needed to interpolate between the phases. The energy difference (or difference between free energies) between the configuration with the bubble and the one without it is

\Delta E_{tot} = (\rho_B-\rho_A) V +\sigma \Sigma

where \rho_{A,B} are the energy densities of phases A and B, V is the volume of the region in phase B, and \Sigma is the surface area of the wall between the two phases.


If \Delta E_{tot}>0, the surface term stores more energy than the volume term releases. In the limit where we shrink the bubble to zero size, the energy difference vanishes. For big bubbles the volume term wins over the area term and we get a net lowering of the energy, so the system no longer has enough energy to convert the region filled with phase B back into phase A. In between there is a Goldilocks bubble that has exactly the same energy as the initial configuration.

So, looking carefully, there is an energy barrier between the configuration with no bubble and the Goldilocks bubble that is just large enough to cost no net energy. Bubbles that are too small tend to shrink, and bubbles that are big enough keep growing.
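
To make the Goldilocks statement concrete, here is a minimal worked version for a spherical bubble of radius R in the thin-wall approximation (the illustrative algebra is mine, not from any of the news stories):

\Delta E_{tot}(R) = -(\rho_A-\rho_B)\,\tfrac{4}{3}\pi R^3 + 4\pi \sigma R^2

The barrier peaks at R_* = 2\sigma/(\rho_A-\rho_B), and \Delta E_{tot} returns to zero at the Goldilocks radius R_0 = 3\sigma/(\rho_A-\rho_B). Below R_* a bubble shrinks; beyond R_* further growth keeps lowering the energy.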

There are two standard ways to get past such an energy barrier. In the first way, we use thermal fluctuations. In the second one (the more fun one, since it can happen even at zero temperature), we use quantum tunneling to get from no bubble, to bubble. Once we have the bubble it expands.

Now, you might ask, what does this have to do with the Universe dying?

Well, imagine that the whole Universe is filled with phase A, but there is a phase B lurking around with a lower energy density. If a bubble of phase B happens to nucleate, then such a bubble will expand (usually accelerating very quickly toward the maximum speed in the universe: the speed of light) and get bigger as time goes by, eating everything in its way (including us). The Universe filled with phase A gets eaten up by a universe in phase B. We call that the end of Universe A.

You need to add a little bit more information to make this story consistent with (classical) gravity, but not too much. This was done by Coleman and De Luccia back in 1980. You can find some information about this history here. Incidentally, this has been used to describe how inflating universes might be nucleated from nothing, and people who study the Landscape of string vacua have been trying to understand how such tunneling between vacua might seed the Universe we see, in some form or another, from a process where these tunneling events explore all possibilities.

You can reincarnate all of that into today’s version of “The end is near, but not too near”. We know the end is not too near, because if it were, it would most likely have happened already. I’m going to skip the statistical estimate: all you have to understand is that the expected time for such a bubble to nucleate somewhere has to be at least of order the age of the currently known universe (give or take). I think the only reason this got any traction is that the Higgs potential of just the Standard Model, with no dark matter and nothing else added, is involved in it somehow.

Next week: see baby Universe being born! Isn’t it cute? That’s the last thing you’ll ever see: Now you die!

Fine print: ab initio calculations of the “vacuum energies” and “tunneling rates” between the various phases are not model independent. It could be that the lifetime of the current Universe is in the trillions or quadrillions of years if a few details are changed. All of these details depend on physics at energy scales much larger than those of the Standard Model, about which we know very little. The main reason these numbers can change so much is that a tunneling rate is calculated by taking the exponential of a (large) negative number. Order one changes in the quantity we exponentiate lead to huge changes in the estimated lifetimes.
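
As a minimal numerical sketch of that last point (the value B = 400 and its shifts below are made up purely for illustration): if the nucleation rate scales like the exponential of minus some large number B, the expected lifetime scales like the exponential of plus B, so order-one changes in B move the answer by enormous factors.

```python
import numpy as np

# Toy illustration of the exponential sensitivity of vacuum-decay lifetimes.
# B = 400 and its shifts are made-up numbers, not a real calculation.
B_reference = 400.0
for B in (400.0, 404.0, 410.0):
    # lifetime ~ exp(+B), so only the shift relative to the reference matters
    ratio = np.exp(B - B_reference)
    print(f"B = {B:5.1f}: lifetime changes by a factor of {ratio:.3g}")
```

A shift of 4 in the exponent already changes the lifetime by a factor of about fifty; a shift of 10 changes it by a factor of more than twenty thousand.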


Read Full Post »

Right now I’m in the midst of a program I helped to organize (and am still organizing) at the KITP. The program deals with the question of how to use numerical methods from the lattice and from gravity to make inroads into interesting (and usually very hard) questions about quantum field theory (and quantum gravity) and the dynamics of the strong interactions at finite temperature (as in heavy ion collisions).


We’ve had a lot of great talks on a wide variety of topics. Personally, I really liked the talk by Philippe de Forcrand on the sign problem. The main reason I liked it is that he had really simple examples that illustrate what the sign problem is all about. You can find it here.
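
For readers who have not met the sign problem before, here is a minimal toy of the same general flavor (a standard textbook-style illustration, not necessarily one of the examples from the talk): estimate the average of an oscillating phase over a Gaussian ensemble by reweighting. The exact answer shrinks exponentially with the oscillation frequency while the Monte Carlo noise does not, so the signal drowns.

```python
import numpy as np

# Toy 'sign problem': <exp(i*lam*x)> over a Gaussian, estimated by sampling x
# and averaging the phase.  Exact answer: exp(-lam**2/2).  The statistical
# error stays O(1/sqrt(N)), so the relative error blows up at large lam.
rng = np.random.default_rng(0)
N = 100_000
x = rng.standard_normal(N)

for lam in (1.0, 3.0, 5.0):
    phases = np.cos(lam * x)                 # real part of the complex phase
    estimate = phases.mean()
    error = phases.std(ddof=1) / np.sqrt(N)
    exact = np.exp(-lam**2 / 2.0)
    print(f"lam = {lam}: {estimate:+.2e} +/- {error:.1e}   (exact {exact:.2e})")
```

Already at lam = 5 the exact answer is a few parts in a million, far below the statistical noise of a hundred thousand samples, which is the whole problem in a nutshell.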

And if you want to see what we’ve been hearing about, you can go here and see the full list of talks so far.

Read Full Post »

Not too long ago we discussed this paper in one of our informal seminars. The paper is called “The eighteen arbitrary parameters of the standard model in your everyday life”, by R. N. Cahn, and it dates back to 1996. It is an RMP Colloquium paper.

I think it still reads great, and it explains what the mysteries of particle physics are and how making small changes in the Standard Model could lead to completely different physics.

Well, in those days there were only 18 parameters in the Standard Model. Now that we have neutrino masses there are a few more. Because of this, by many standards the paper is considered prehistory. On the other hand, one can take it as a benchmark against which to calibrate all the accomplishments of particle physics experiments since then.

What I like about the paper is that, in some sense, it gives a feeling for how non-generic the parameters of the Standard Model are.


Read Full Post »

Well, the press is all fired up about a claim of faster than light neutrinos. The claim from the OPERA experiment can be found in this paper. The paper was released on September 22nd and it has already gotten 20 blog links. Not bad for a new claim.

Considering that the news organizations are happily bashing special relativity, one can always rely on XKCD to spin it correctly.

Now more to the point: the early arrival time is claimed to be 60 nanoseconds. The distance between the emitter and the observer is claimed to be known to about 20 cm, certified by various National Standards bodies. A whole bunch of different systematic errors are estimated and added in quadrature, not to mention that they need satellite relays to match various timings.

60 nanoseconds corresponds to roughly 18–20 meters (just multiply by the speed of light), and the quoted uncertainty on it is split between statistical errors and systematics. The statistical error comes from a likelihood fit. The systematic error is irreducible, and in a certain sense it is the best guess for what that number actually is. They did a blind analysis: this means the result is kept hidden until all the calibrations have been made, and only then is the measured number revealed.
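
A quick sanity check of that conversion, in a few lines of Python:

```python
c = 299_792_458.0            # speed of light in m/s
dt = 60e-9                   # the claimed 60 ns early arrival
print(f"{dt * c:.1f} m")     # about 18 m
```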

My first reaction is that it could have been worse. It is a very complicated measurement.

Notice that if we assume all the systematic errors in Table 2 are aligned, we get a systematic error that can be up to three times as big. It is dominated by what they call the BCT calibration. The errors are added in quadrature assuming that they are completely uncorrelated, but it is unclear whether that is so. The fact that one error dominates so much means that if they got that one wrong by a factor of 2 or 3 (also typical for systematic errors), the result loses a fair bit of significance.
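
To see how much the combination rule matters, here is a small sketch with purely hypothetical numbers (these are not the actual entries of Table 2; they just mimic a budget where one entry dominates the rest):

```python
import numpy as np

# Hypothetical systematic-error budget in nanoseconds.  The 5.0 ns entry
# stands in for a dominant term like the BCT calibration; the rest are filler.
errors_ns = np.array([5.0, 2.0, 1.5, 1.0, 0.5])

quadrature = np.sqrt(np.sum(errors_ns**2))   # assumes the errors are uncorrelated
aligned = np.sum(errors_ns)                  # worst case: all errors push one way

print(f"added in quadrature: {quadrature:.1f} ns")
print(f"fully aligned      : {aligned:.1f} ns")
print(f"ratio              : {aligned / quadrature:.2f}")
```

The quadrature sum is controlled almost entirely by the single dominant entry, which is why getting that one entry wrong by a modest factor moves the whole error budget.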

My best guess right now is that there is a systematic error that was not taken into account: this does not mean that the people who run the experiment are not smart, it’s just that there are too many places where a few extra nanoseconds could have sneaked in. It should take a while for this to get sorted out.

You can also check Matt Strassler’s blog and Lubos Motl’s blog for more discussion.

Needless to say, astrophysical data from SN 1987A point to neutrinos behaving just fine, and with a much longer baseline. I have heard claims that the effect must depend on the energy of the neutrinos. This can be checked directly: if I were running the experiment, I would repeat it with lower-energy neutrinos (for which we have independent data) and see whether the effect goes away.


Read Full Post »

Here is an announcement of a program I will be organizing at the KITP from January 17 through March 9, 2012. It is a program on numerical methods for gravity and QFT. The web page of the program is located here.

Here is the image I made to illustrate the program. It is generated by taking a set of modes in a box with a UV cutoff; the amplitude of each mode is then seeded with random numbers and phases, multiplied by the typical quantum uncertainty for that mode. The result is a picture like the one below.
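
Here is a rough sketch of a recipe along those lines (my guess at the procedure for a massless free field in two dimensions, not the exact script used for the image): populate the Fourier modes below a UV cutoff with random complex amplitudes weighted by the vacuum uncertainty 1/\sqrt{2\omega_k}, and transform back to position space.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 256                                    # lattice points per side of the box
k = 2.0 * np.pi * np.fft.fftfreq(N)        # mode numbers along each direction
kx, ky = np.meshgrid(k, k, indexing="ij")
k_mag = np.sqrt(kx**2 + ky**2)

k_uv = 0.5 * k_mag.max()                   # hard UV cutoff on the mode sum
omega = np.where(k_mag > 0.0, k_mag, 1.0)  # massless dispersion, dummy value at k=0

# random amplitude and phase for each mode, weighted by the quantum uncertainty
coeffs = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
coeffs *= 1.0 / np.sqrt(2.0 * omega)
coeffs[k_mag > k_uv] = 0.0                 # drop modes above the cutoff
coeffs[0, 0] = 0.0                         # drop the zero mode

field = np.fft.ifft2(coeffs).real          # one snapshot of the fluctuating field
print(field.shape, field.std())            # e.g. feed `field` to matplotlib's imshow
```

Redrawing the random amplitudes (or evolving the phases in time) and replotting is what makes the animated version.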

It is also fun to animate it.

Right now I have to start chasing people and reminding them that the (first) deadline for applications is coming soon (April 30th).

In the meantime stay tuned.

[Image: fluctuations of quantum fields]

Read Full Post »

WCLHC

This is a short announcement that the West Coast LHC meeting will be taking place in Santa Barbara on April 15, 2011. Here is our website:

WCLHC meeting 2011.

As you can imagine we have been busy at UCSB putting this together.


Read Full Post »

About one year ago I lost my umbrella. I found it today. Perhaps the most amazing thing is that it was recovered at all: the umbrella had been happily residing in the conference room of the experimental high energy physics group at UCSB, left untouched for a year. This shows how little demand there is for stray umbrellas in Santa Barbara.

The main reason I was in that conference room is that we had our grant agency, the Department of Energy (DOE), visiting us. This is a yearly event that I have always found very useful, since I get to learn a lot about the nitty-gritty details of the high energy experiments our group is involved in. It is also a lot of work. I’m very grateful for the support the DOE has given us through the years.

Read Full Post »

Some HEP news

For those of you who are following the news, here is a link to the recent HEPAP P5 meeting report recommending the extension of the Tevatron run.

Read Full Post »

As many of you probably know, for as long as I can remember the place to look for references in High Energy Physics has been the SPIRES website, hosted at SLAC. In the last few years the system has become slow and clunky, and I have heard various complaints about it. There is a mirror at Fermilab that works better, but it sometimes still freezes. The next generation of the search engine is called INSPIRE, and it is far superior to the SPIRES engine. It has been jointly developed with CERN; I’m not sure who else is involved.

This week the SPIRES website is urging visitors to move on and try INSPIRE. It is in beta testing (it has been for a while, and I have used it before), but now it seems to be working much faster.


One of the things SPIRES had trouble with was citation counts. There were double counts that appeared in some places and not in others, and the results had an inherent noise to them.

INSPIRE seems to have corrected those issues and now the counts seem to match everywhere. I have not found anything broken with the new system yet, but I have not been pushing it either.


In any case, SPIRES is obsolete (it has been for a while) and the transition is happening now. I think so far they have done a good job with the new software, and the move to the new system is worthwhile.


I think the funniest catch phrase one can write about it is the title of this post:

Move to inspire.

It sounds like a slogan for a charity. Oh well.


Read Full Post »
