Posts Tagged ‘Physics’

Many times in physics one wants to solve systems of second order ordinary differential equations (equations of motion, for example). If the dynamics comes from a Lagrangian, it is standard to put them into first order form by passing to the Hamiltonian formalism and working in “phase space”. Once you get to this stage, you can try putting the system on a computer by evolving the equations of motion discretely. Often this discretization destroys certain aspects of the dynamics. However, if you do things right, you can get some things to work better than expected.

For example, in the Hamiltonian formalism of the Kepler problem, one would have a Hamiltonian of the form

H= \frac {p_1^2 + p_2^2}{2m} - \frac{K}{r}

where

r= \sqrt{x_1^2+x_2^2}

The sign indicates that one has smaller energy where $r$ gets smaller (the potential energy is attractive).

A naive implementation of the evolution of the system is given by evolving

p_i [t+\delta t ] = p_i[t] - \partial_{x_i} V[r[t]] \delta t

and

x_i[t+\delta t]= x_i[t]+ \frac{p_i[t]}{m} \delta t

However, after staring at this for a while, one notices that the discrete dynamics is not reversible: after one step both x and p have changed, so going back by sending \delta t\to -\delta t does not get you exactly back to where you started.

There is a very nice fix to this problem: you think of momenta as being evaluated at half-integer times, and positions at full times. That is, we get

p_i[t+\delta t/2] = p_i[t-\delta t/2] - \partial_{x_i} V(r[t]) \delta t

and

x_i[t+\delta t]= x_i[t]+ \frac{p_i[t+\delta t/2]}{m} \delta t

and even though this looks almost identical to what we had before, it is now time reversible (just send $\delta t\to -\delta t$ and do appropriate shifts to check that you really get back to where you started).
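
As a sketch of this reversibility (my own illustration, not code from the post; the function names, parameters, and initial conditions are mine), here is the half-step scheme in Python for the Kepler potential above, together with a check that undoing each step with \delta t\to -\delta t returns to the starting point:

```python
import math

def force(x, K=1.0):
    # central force -dV/dx_i for the attractive potential V = -K/r
    r = math.hypot(x[0], x[1])
    return [-K * xi / r**3 for xi in x]

def kick(x, p, dt):
    # momentum update at half-integer times: p[t+dt/2] = p[t-dt/2] + F(x[t]) dt
    f = force(x)
    return [pi + fi * dt for pi, fi in zip(p, f)]

def drift(x, p, dt, m=1.0):
    # position update at integer times: x[t+dt] = x[t] + p[t+dt/2]/m dt
    return [xi + pi / m * dt for xi, pi in zip(x, p)]

dt, steps = 0.01, 1000
x, p = [1.0, 0.0], [0.0, 1.0]   # illustrative initial conditions (circular orbit)

for _ in range(steps):          # evolve forward: kick, then drift
    p = kick(x, p, dt)
    x = drift(x, p, dt)

for _ in range(steps):          # undo each step in reverse order with dt -> -dt
    x = drift(x, p, -dt)
    p = kick(x, p, -dt)

print(x, p)  # back at the starting point up to floating point rounding
```

Note that the backward pass undoes the drift before the kick; that reordering is the "appropriate shift" needed to invert each step exactly.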

This is called the leapfrog algorithm. For problems like the one above, it has rather nice properties. The most important one is that it respects Liouville’s theorem (it keeps the volume element of phase space constant).

In examples like the one above, it does something else that is quite amazing. If you remember Kepler’s second law (sweeping equal areas in equal time intervals), it is the law of angular momentum conservation. I’ll leave it to you to find a proof that the above system sweeps equal areas in equal times around the origin x_{1,2}= 0. I learned this fact recently in a conversation in my office and I was quite pleased with it, so I thought it would be nice to share it.
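
Here is a quick numerical check of that equal-areas claim (a sketch with made-up parameters, not code from the post). The discrete angular momentum x_1 p_2 - x_2 p_1 is unchanged by the kick (the force is parallel to x) and by the drift (the displacement is parallel to p), so the leapfrog scheme conserves it exactly:

```python
import math

def leapfrog_step(x, p, dt, m=1.0, K=1.0):
    # one kick-drift leapfrog step for the 2D Kepler problem with V = -K/r
    r = math.hypot(x[0], x[1])
    p = [pi - K * xi / r**3 * dt for pi, xi in zip(p, x)]  # kick
    x = [xi + pi / m * dt for xi, pi in zip(x, p)]         # drift
    return x, p

def ang_mom(x, p):
    # L = x1 p2 - x2 p1: twice the areal velocity times m (Kepler's second law)
    return x[0] * p[1] - x[1] * p[0]

x, p = [1.0, 0.0], [0.0, 0.8]   # a bound, eccentric orbit
L0 = ang_mom(x, p)
for _ in range(10000):
    x, p = leapfrog_step(x, p, 0.01)

print(abs(ang_mom(x, p) - L0))  # stays at machine precision over many orbits
```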

This algorithm also does quite well on many other systems (like the one I’m studying now for my research). If you have a system with a lot of symmetries, the leapfrog algorithm will sometimes preserve many of those symmetries along with the conserved quantities, so you can evolve the system for much larger values of \delta t without loss of information.

Coloring graphs.

Coloring scheme I.

Coloring scheme II.

At the top you can see two frames from visualizations of information at different times of a particular simulation. I’m not going to tell you the details of the simulation, nor what the graphs represent (this is still in the ‘top secret’ category: it is work in progress and a lot can change before we decide to go public with it). In the meantime, enjoy the pretty pictures. What I’m trying to figure out is which color scheme looks better. Warning: don’t expect to see graphs like these in any of my papers in the near future.

Here is the deal: coloring schemes produce emotions in the viewer. Different coloring schemes give people different feelings about the information. For example, red is usually associated with hot, while blue is associated with cool. However, a blue star is hotter than a red star. The red/blue association probably comes from fire and ice: fire tends to be reddish and ice bluish, but when we look at things according to the energy radiated at different frequencies we get a completely different picture.
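
A small illustration of that last point (the numbers are standard blackbody physics, not from this post): Wien’s displacement law gives the peak wavelength of thermal radiation, and it puts a cool “red” star’s peak at much longer wavelengths than a hot “blue” star’s:

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, in metre-kelvin

def peak_wavelength_nm(T_kelvin):
    # peak of the blackbody spectrum: lambda_max = b / T, converted to nm
    return WIEN_B / T_kelvin * 1e9

print(peak_wavelength_nm(3500.0))   # ~830 nm: a cool "red" star peaks in the infrared
print(peak_wavelength_nm(10000.0))  # ~290 nm: a hot "blue" star peaks in the ultraviolet
```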

When presenting scientific information, choices like this one often present themselves. And they make a difference in how the audience perceives the quality of the work… or even better: the coolness factor of the work.

The big questions are: what emotions do the above graphs give you? Which one do you like best? Why?

In the end they convey essentially the same raw information, but even knowing that, I feel differently about each of them. They have a different artistic feel. I just thought I’d share some of these issues and maybe even get some feedback.

The Gravity Research Foundation announced the results of the 2010 competition. Here are the results. At UCSB we discussed the prize-winning paper by Mark van Raamsdonk today. It was a very lively discussion and we thought it was a great paper to read. Mark’s paper provides some very tantalizing evidence that entanglement plays a very important role in building up geometry.

On another note, a paper by Daniel Green, Zohar Komargodski, Nathan Seiberg,
Yuji Tachikawa, and Brian Wecht
appeared today. They solve the problem of counting how many marginal deformations there are in four dimensional supersymmetric conformal field theories. As a byproduct, they also solve the problem in 3d field theories with N=2 supersymmetry. The paper is beautiful, and it is a huge improvement on the work by Leigh and Strassler on the subject many years ago. After reading it I was kicking myself because ‘I could have done it’: I was interested in the problem and I knew many of the facts; I just didn’t put them together. If I had thought hard about it I probably could have, although the paper would read rather differently. It’s not surprising that these authors at the Institute for Advanced Study found the solution, and that it is written the particular way it is, since they have been studying very carefully the superfield formulation of supersymmetric theories in four dimensions. Lubos also commented on the paper.

Clifford Johnson pointed me to his post on the quest for perfect quantum fluids. In a certain sense, we are used to thinking about fluids as low energy phenomena (relatively low temperature physics). Famous fluids are characterized by fun properties like superfluidity, and ferrofluids can be a lot of fun to play with in an exhibition. The most perfect fluids are those with little to no viscosity \eta (viscosity is sometimes related to friction, but this can be misleading).

The recent experiments at RHIC that claimed detection of the quark-gluon plasma also produce a type of liquid with very low viscosity. To compare this hot liquid with a cool one, one also needs to measure the entropy density s. The question of who is more perfect than whom depends on the ratio

\frac{\eta}{s}

Whoever gets the smallest value wins. These are difficult quantities to measure, but they can sometimes be estimated from other known data. From the point of view of theory, this figure of merit is the one that allows comparison of theories with different numbers of microscopic degrees of freedom, and it is suggested by various gravity dualities (this way of comparing fluids came from the work of Kovtun, Policastro, Son, and Starinets around 2001-2003, in various papers that have made a big splash in physics).

There is an issue of Physics Today that is dedicated to the topic of perfect fluids from various points of view. The readers of this blog might want to wander there and look at the expository articles on the subject. Room will be left open for discussion and questions, although I don’t promise that I will be able to answer them.

Charles K. Kao, Willard S. Boyle and George E. Smith share this year’s Nobel Prize in Physics for their contributions to the transmission and detection of light. Congratulations to the three of them.

Charles K. Kao did research on the materials science of glass, and argued that the losses in the glass fibers available in the 1960s were mostly due to impurities in the material. A few years later glass of sufficient purity was made by Corning, and modern fiber optic telecommunications were born. Nowadays this technology impacts us directly by making possible the infrastructure that handles the information traffic of the Internet.

Willard S. Boyle and George E. Smith created the CCD. This is one of the main technologies in modern photography. It makes the capture and reading of light fast and efficient, and it essentially made photographic film obsolete: the cost of capturing an image went down to essentially zero. It is also one of the standard technologies for astrophysics, and most importantly, it is not restricted to the visible spectrum. It can be used to read light from distant sources very fast, producing data that can be transmitted to researchers all over the world very quickly (we don’t have to wait to develop the film); being in electronic format, the data is easy to manipulate, send, and store. This is another technology with wide applications on the Internet, capturing live images that end up on YouTube and Flickr.

Lubos laments the fact that the Nobel prize went just for technology. Although I sympathize, I think this is not a bad choice at all. One can call this applied physics rather than fundamental physics, but the technological breakthroughs enabled by these inventions are truly remarkable. Nowadays we take them for granted, but they are truly marvelous things. Today I can have a video-phone conversation on the Internet with someone on the other side of the planet, for a cost that is essentially zero. This is a science fiction idea that did come true, though not necessarily the way it was originally envisioned.

You should also consider that modern telecommunications account for a big chunk of the world’s GDP, and this will surely grow in the future. The technologies that make this possible come from physics research, and it can take many years before the engineering issues are worked out and the costs lowered enough that we all benefit from them. Besides, the whole architecture of the modern Internet came out of CERN: a lot of people needed to look at large chunks of data with completely independent computer systems and operating systems, and a common message protocol for communications and standards for addressing data were born from these necessities. It would be hard to give a Nobel prize for that.

I wish the public at large were more aware that the technologies of today are the product of years of development, starting from physics discoveries and inventions, refined by engineers so that they can be mass produced with controlled quality. Without the first invention, the rest of the process doesn’t work.

Some links

While I am still snowed under, and in the process of finding out how to automate posting my Delicious links to this blog, here is a manual dump of some of the things I found interesting recently. Let me know in comments if you know how to generate weekly (not daily!) link posts like this one automatically, without having to learn Perl in the process.

Google and the future of books

Interesting article about the consequences of having an unprecedented amount of information available, and the dangers of having it all concentrated in one place.

The way of all debt

Everyone is talking about the economy these days; here is an interesting angle: a review of Margaret Atwood’s recent book on the subject.

Shall we get rid of the lawyers?

The title says it all…

Lectures on holographic methods for condensed matter physics

Excellent introduction to one of the major recent developments in string theory. Inserted in part to avoid having all of today’s links point to the New York Review of Books.

Enjoy! I’ll be back pretty soon with some real posts.

P.S.: another one, which I find really funny for some unclear reason.

P.P.S.: This is written using Ecto, from my new MacBook. Thanks to all the excellent advice I got in response to my post, the transition was really easy. It is also clear, after only a few days, that there is simply no going back…

Stereography: It’s in 3D

I went to see Coraline this weekend. It is an animated movie made with stop motion animation (supplemented by the usual CGI effects). I like the medium of stop motion very much, particularly because it is harder to achieve a good result. The most interesting thing about the movie is that it was filmed in 3D, which gives it a much more eerie feeling. The storyline was OK, and various aspects of it were very predictable. Some elements reminded me of my childhood: I used to be especially afraid of dogs and I met a lot of kids that didn’t fit in well. I also ended up eating more beets than I liked when I was growing up. Now I really love them, so I can understand a few obsessions that are portrayed in the movie. I think the thing I liked most was that they did not use the 3D to do the usual ‘trick’ of having objects point toward the audience over and over again. There was a bit of that at the very beginning, but it soon faded away. (For an example of bad, or you could also say tasteless, uses of 3D, see this blog review of Beowulf.)

Now, back to the 3D. Just to tell you a little bit about the technology involved (feel free to go to Wikipedia for more info): the basic idea is that we have stereoscopic vision. We have two eyes, which is really good for measuring distances by the angles of triangles. Ideally, each eye sees a slightly different image and our brain reconstructs a 3D map from two 2D images.
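
For a rough idea of the triangulation involved (a simplified rectified pinhole camera model of my own, not anything from the post), depth follows from the horizontal disparity of a feature between the two images:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    # rectified stereo pair: depth = f * B / d (pinhole camera model)
    # focal_px: focal length in pixels; baseline_m: separation of the two
    # viewpoints; disparity_px: horizontal shift of the feature between images
    return focal_px * baseline_m / disparity_px

# illustrative numbers only: eyes roughly 6.5 cm apart; a nearer object
# produces a larger disparity, hence a smaller computed depth
print(depth_from_disparity(800.0, 0.065, 10.0))  # ~5.2 metres away
print(depth_from_disparity(800.0, 0.065, 40.0))  # ~1.3 metres away
```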

(more…)
