Category Archives: precision

The world's time

I was asked to write a news story for the American Physical Society's Forum on International Physics newsletter. Here is my contribution.

As I type away at this text, I become aware that time just continues its quiet flow and that my keyboard clicks measure its passage. Whatever poetic, philosophical, or religious meaning one might assign to “time”, that is how time is defined: as a measurable sequence of events.

And time being measurable naturally means that physicists are in business.

Atomic clocks are arguably the most accurate devices ever built. While a typical wristwatch keeps time accurate to about a second over a week, modern atomic clocks aim at neither gaining nor losing a second over the age of the Universe. Imagine that some poor soul had built such a clock at the beginning of time, at the Big Bang, and that by some good fortune it had survived all the cosmic cataclysms: today it would be off by less than a heartbeat.

Atomic clocks are ubiquitous; one could even buy a slightly used one on the internet. Among many places, they tick away on stock exchanges, in data centers, and in the hearts of GPS satellites. However, there is a truly special collection of several hundred atomic clocks, distributed among 50 or so industrialized countries, that defines the world’s time. This timescale is known as TAI (from the French “Temps Atomique International”), the International Atomic Time.

A collection of atomic clocks at the Physikalisch-Technische Bundesanstalt (PTB), Germany. These clocks substantially contribute to the TAI timescale, the world’s time. Credit: PTB

The BIPM (Bureau International des Poids et Mesures) is at the heart of defining the world’s time. This international organization is housed in a white wooden two-story building on the forested bank of the Seine River in the Parisian suburbs. Judah Levine of NIST-Boulder explains that the BIPM was established in 1875 by the international “Treaty of the Meter”, which defined the kilogram and the meter. Later the second was added to the convention (the SI units), and the meter was redefined in terms of the fixed speed of light and the second. The modern legal definition of the second involves a certain number of beats derived from the hyperfine splitting of the cesium-133 atom: exactly 9,192,631,770 periods of the corresponding radiation.
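Counting those beats is literally how a cesium standard keeps time. A trivial sketch of the bookkeeping in Python (the frequency is the defined SI constant; the cycle count in the example is hypothetical):

```python
# The SI second: exactly 9,192,631,770 cycles of the radiation from
# the cesium-133 hyperfine transition (a defined constant).
CS133_HYPERFINE_HZ = 9_192_631_770

def elapsed_seconds(cycles_counted):
    """Convert a raw cycle count from a cesium standard into SI seconds."""
    return cycles_counted / CS133_HYPERFINE_HZ

# Hypothetical example: a perfect count over one day of operation.
print(elapsed_seconds(9_192_631_770 * 86_400))  # -> 86400.0
```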

Judah Levine has been contributing US data to TAI for nearly half a century. He explains that the BIPM collects clock data from metrology labs and averages them. The BIPM then distributes a document called “Circular T”, which reports by how much each national timescale deviated from the average about a month earlier. In turn, based on this circular, national labs steer their local timescales to compensate for the drifts from TAI. This protocol keeps the world’s time stable at the level of a nanosecond over a month.
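In spirit, the steering loop is simple enough to sketch in a few lines of Python. This is only a toy model with invented numbers: the real TAI averaging applies carefully chosen per-clock weights and noise models to several hundred clocks.

```python
# Toy model of the TAI steering loop: BIPM averages the reported
# clock offsets, publishes each lab's deviation from the average
# ("Circular T"), and each lab steers toward the ensemble.
# All offsets below are invented for illustration (nanoseconds).

lab_offsets_ns = {"NIST": 3.2, "PTB": -1.1, "NICT": 0.4}

def circular_t(offsets):
    """Deviation of each lab from the ensemble average, a month in arrears."""
    avg = sum(offsets.values()) / len(offsets)
    return {lab: off - avg for lab, off in offsets.items()}

def steer(offsets, gain=0.5):
    """Each lab cancels part of its reported deviation from the average."""
    deviations = circular_t(offsets)
    return {lab: offsets[lab] - gain * deviations[lab] for lab in offsets}

print(circular_t(lab_offsets_ns))  # what Circular T would report
print(steer(lab_offsets_ns))       # offsets after one steering cycle
```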

The most advanced metrology labs rely on so-called primary frequency standards, super-precise cesium clocks, says Peter Rosenbusch of the Laboratoire National de Métrologie et d'Essais and the Paris Observatory. The primary standards are occasionally used to calibrate the local “workhorse” continuously running atomic clocks, tying them as closely as possible to the SI definition of the second. In the US, the primary frequency standard is the cesium fountain clock at NIST-Boulder.

So if the world’s time is the time counted by atomic clocks, is it the same as cosmic time? In principle, one could measure time using pulsars, magnetized rotating neutron stars. Pulsars, however, tend to slow down over time as they radiate away rotational energy, and, moreover, Judah Levine points out that the very shape of the pulses also changes over time, making pulse counting imprecise. We joke that, perhaps, to define the Standard Galactic Time one needs to find more stable cosmic sources.

Nevertheless, space and satellite technology are anticipated to improve TAI. Christophe Salomon of the École Normale Supérieure in Paris is involved with the ACES (Atomic Clock Ensemble in Space) mission of the European Space Agency. He explains that the goal is to operate the most precise primary Cs frequency standard on board the International Space Station (ISS). The clock is expected to become operational in space in two years. The ISS will broadcast a microwave time signal down to several Earth-based stations; in the USA, the stations will be installed at JPL in Pasadena and at NIST-Boulder. Through the ACES mission, national labs around the globe will establish high-precision links to compare primary standards and thus remove some of the uncertainties in their contributions to the world’s time.

Neither time nor its definition stands still. New generations of atomic clocks based on ultracold atoms and ions already outperform the primary frequency standards. Pushing these quantum devices to their limits is a friendly competition among several labs around the world. Just over the past year, the crown of the world’s most precise clock has been shared by the USA (two teams, at JILA and NIST-Boulder), Japan, and Germany. These advances were summarized in recent talks by E. Peik (PTB, Germany) and A. Ludlow (NIST-Boulder, USA) at the International Conference on Atomic Physics, held last July at the historic Mayflower Hotel in Washington, D.C.

Considering this rapid progress in atomic horology, the international community is discussing how to redefine the second in terms of these novel classes of clocks. This would mean retiring Cs from the SI units and redefining the world’s time.

Clock-comparison technology is also improving. The European Union is building a trans-European clock network that uses existing optical-fiber communication links to compare clocks at metrology labs directly, removing the uncertainties of over-the-air and through-space comparisons. The first link, 920 km long and connecting the northern and southern parts of Germany, has already been tested.

Futuristic quantum clock network. Credit: Jun Ye group and Steve Burrows, JILA

One apparent limitation of the TAI timescale is that it is a “paper timescale”: it only tells us what the world’s time was a month ago. What if the dedicated clocks were compared and averaged continuously, or, even better, formed one single geographically distributed clock? This was envisioned recently by a group of physicists led by Mikhail Lukin at Harvard and Jun Ye at JILA in Colorado. They proposed a quantum network of atomic clocks (for example, placed on satellites orbiting the Earth) that would use quantum entanglement to create one giant distributed clock, with each nation contributing satellites to the network. Jun Ye comments, “this is definitely a futuristic proposal, and we must achieve substantial technological advances. However, all of the different building blocks for the network have in principle been demonstrated in small scales.” Maybe this is how the world’s time will be measured in the far future.
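The payoff of entanglement can be stated in one line: averaging N independent clocks improves the fractional frequency uncertainty only as 1/sqrt(N) (the standard quantum limit), while a fully entangled network can in principle reach the 1/N Heisenberg limit. A back-of-the-envelope comparison of the two scalings (prefactors ignored; the single-clock number is a hypothetical placeholder):

```python
import math

def sql_uncertainty(n_clocks, single_clock_sigma):
    # Standard quantum limit: N independent clocks average down as 1/sqrt(N).
    return single_clock_sigma / math.sqrt(n_clocks)

def heisenberg_uncertainty(n_clocks, single_clock_sigma):
    # Heisenberg limit: an entangled ensemble can in principle reach 1/N.
    return single_clock_sigma / n_clocks

sigma = 1e-18  # hypothetical single-clock fractional uncertainty
for n in (10, 100, 1000):
    print(n, sql_uncertainty(n, sigma), heisenberg_uncertainty(n, sigma))
```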

I would also like to thank Jeff Sherman of NIST-Boulder, Ekkehard Peik of PTB, and Peter Komar of Harvard for illuminating discussions.

About the author: Dr. Andrei Derevianko is a Russian-American theoretical physicist and a professor at the University of Nevada, Reno. He has contributed to the development of several novel classes of atomic clocks and precision tests of fundamental symmetries with atoms and molecules.

UPDATE: published here

Fundamental physics at the precision frontier: questions to ponder

The closing session of the Perimeter workshop on "New ideas in low-energy tests of fundamental physics" was a stimulating discussion of open questions at the intersection of precision measurements and fundamental physics. The discussion was guided by Derek Kimball's list of questions, which he kindly shares with you below. The video/audio record of the entire discussion can be found online here: part1 and part2.

Are there boring answers to exciting mysteries?

If one assumes, from the experimental perspective, the most boring solutions to mysteries: for example, a cosmological constant driving the accelerating expansion of the universe and dark matter that has no couplings to Standard Model particles, what mysteries still cannot be resolved?

Interaction of DM/DE with Standard Model particles/fields

Is non-gravitational interaction of DM/DE with Standard Model particles/fields well-motivated? (From astrophysical and cosmological measurements?)

Energy scale of new physics

Technical naturalness: for now should we not be overly concerned about this issue for experiments?

The hierarchy problem and its relation to the observed Higgs mass, the cosmological constant, BICEP-2, and the Planck scale: how does this relate to the scale of new physics, and where should we search?

New (?) idea of searching for fast-varying constants: could this be done in an astrophysical spectroscopic search?

It was noted that a phase transition process (or evolving couplings) could be introduced to “avoid” technical naturalness problems... could there be phase transitions with very small effects that occur frequently, perhaps even today? (Something for GNOME or clock networks to look for?)

Impact of BICEP-2 results

BICEP-2 results: if assumed to be correct, what do they imply about the best regimes/scenarios/experiments to search for new physics?

Does BICEP-2 imply that lots of interesting new physics stuff inflates away?

How plausible is scale evolution of physics to avoid BICEP-2 “problems” and what are experimental signatures of scale evolution?

Relation between astrophysical and laboratory searches

Ideas like chameleon fields: what kind of mechanisms exist to hide interactions in laboratory tests and allow astrophysically, or allow in laboratory tests and hide astrophysically? How plausible are these, and how seriously should constraints be taken?

What is the state of knowledge about coupling between dark matter particles? Would coupling between DM particles make some difference between laboratory vs. astrophysical bounds? What if DM is more complex (not just one species) and 5% is coupled strongly to itself: could one have, for example, axion stars, etc.? Could such objects give transient signals?

Transient and time-dependent new physics signals

What kind of new physics can GNOME, clock-network, CASPEr, or related experiments access that other laboratory experiments cannot?

Is there anything that can be said about the scale of the domains or the time between transient signals?

Higher dimension topological defects and textures were mentioned. What are these, and what are interesting signatures and characteristics?

It was noted that the photon mass could be altered inside a topological defect: could this be measured with the GNOME or the clock network experiment?

Symmetry tests of gravity

How do we test if standard gravity violates parity or time-reversal invariance?

Tests involving gravito-magnetic fields were mentioned, as was chirality in gravitational waves.

Could G measurements (torsion pendulum or atom interferometer) provide a test somehow?

Clocks are sensitive to general relativity effects: could GR effects be used to test P- and T-invariance of gravity somehow?

Antimatter

What new physics might anti-hydrogen experiments be sensitive to that experiments with ordinary matter are not?

What are other possibilities beyond Peccei-Quinn symmetry breaking (QCD axion) to explain strong-CP problem? What are signatures?

What is the present status of the connection between CPT violation and local Lorentz invariance violation?

Variation of fundamental constants and physical laws

Are there viable ways to test the variation of other constants besides $latex \alpha$ and the proton-to-electron mass ratio? For example, the Fermi constant? G (lunar laser ranging was suggested)? The strong coupling constant?

What about violation/variation of other sacred laws: angular momentum conservation? energy conservation? charge conservation?

Electric dipole moments (EDMs)

What is the impact of the ThO electron EDM constraint (also Hg and neutron EDM limits) on new physics scenarios?

Dark Energy (DE)

What is the range of viable ideas outside of the cosmological constant, and among these, which have the best motivation? What “hand-waving arguments” motivate where to search?

What is the relation of inflation to the CP problem and baryogenesis (does a CP-violating inflaton do anything, or are the Sakharov conditions not satisfied)?

What is the connection between the inflaton and dark energy?

Non-quantum fields?

It was noted that loop corrections complicate the physics of light scalar fields. Can one imagine “non-quantizable” fields (something like gravity that can’t be quantized simply)? For example, torsion or chiral gravity that is not quantized (at least in the usual way)?

Do such fields have distinct signatures compared to quantized spin-0, spin-1 quantum fields? What is plausibility, for example, of long-range torsion gravity?

Transients in astronomical spectroscopy

Could we detect new physics “passing through” the line of sight between the Earth and an astrophysical object?

Could we search for transients using a telescope? There was a suggestion to use an astro-comb…

For example, it was suggested that a quintessence field coupled electromagnetically could generate Faraday rotation: if $latex \phi$ is clumped or forms a topological defect, is this something observable?

Impact of proton radius measurements

What kind of new physics might it imply?

Hidden sector

Tests of the spin-statistics connection have been conducted and more are planned. Could these be sensitive to some of the hidden-sector physics?

What are observable signatures of hidden sector supersymmetry?

Large extra dimensions

What is the present status and is there strong motivation to go to particular length scales in tests? Is 100 microns a special length? What would show up in atoms?

Experimentally, what is the status of patch potential systematics?

Lense-Thirring effect for intrinsic spins

How does the Lense-Thirring effect for intrinsic spin compare with that for orbital angular momentum? Is there a way to test this?

When would an unanticipated “new physics” event be apparent to an unsuspecting experimentalist?

Suppose an experimentalist has a sensitive device, the conventional physics of which is well under control. Now let’s assume that once in a while the device is perturbed by some unanticipated “new physics” event, such as an interaction with a lump of dark matter. Suppose also that the device has enough sensitivity to the “new physics”. When would such an unanticipated event become apparent to an unsuspecting experimentalist?

This is quite different from particle colliders, where experimentalists do hunt for unusual events. Sometimes a specific signal or signature is anticipated (e.g., the Higgs), so let me emphasize that our unsuspecting fellow experimentalist does not specifically look for new physics.

The discovery of the cosmic microwave background could serve as a motivating example of paying attention to “misbehaving” data.

So when would “new physics” be noticed? An obvious answer would be: when the new physics perturbs the expected signal in a significant way.

Even this simple statement requires qualifiers. Suppose new physics provides a uniform background to the signal, and the signal itself cannot be computed exactly from first principles. For example, transition frequencies of many-electron atoms can be computed to only 3-4 significant figures, while experimentalists can determine some of these frequencies to 18 significant figures. Then (unless there are symmetry arguments, e.g., parity or time-reversal violation with the associated external-field reversals implemented in an experiment) there is no way to disentangle the new-physics background from the conventional one.

This leaves us with time- or space-dependent new-physics signals. That is, new physics could be noticed if it leads to some noise or drift in the signal. For example, a uniform-in-time drift in atomic frequencies could reveal a variation of fundamental constants.

What about “new physics” noise or spike-like events, such as perturbations by “lumps” of dark matter? Suppose the conventional signal is interrupted by new-physics events. We could characterize such events by how long an event lasts (short/long interaction times), how frequent the events are (rare/frequent), and whether the device is sensitive to the event. For simplicity, we assume that the events do not overlap, i.e., the average time between individual events is much longer than the event duration.

An event would be missed if the average time between consecutive events is longer than the typical time of continuous operation of the device (i.e., the events are rare). Unfortunately, even a single bona fide event could be discarded by an experimentalist as an outlier. Indeed, one can never guarantee that everything is fully under control; there may still be occasional perturbations, such as a student bumping into an optical table or a misbehaving power supply.

Essentially, rare events would be registered as such only if they are anticipated.

This argument brings us to the following conclusion: unanticipated “new physics” events would be noticed only if sizable events are frequent on the time-scale of the experiment.

There is another caveat: suppose the events are so frequent that they look like white or flicker noise in the signal. After all, it is natural to assume a Poissonian distribution of time intervals between consecutive events. Then there is a danger of “integrating out” the events.

Thereby we have to revise our statement: unanticipated “new physics” events would be noticed only if the sizable events are frequent (but not too frequent) on the time-scale of the experiment.
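To make the “frequent but not too frequent” window concrete, one can play with a toy simulation: draw Poissonian event times over a day-long run and ask whether the events both stand out individually and occur often enough not to be dismissed. All parameters below are invented for illustration:

```python
import random

def poisson_event_times(duration_s, rate_per_s):
    """Draw Poissonian event times over one experimental run."""
    times, t = [], 0.0
    while True:
        t += random.expovariate(rate_per_s)  # exponential waiting times
        if t > duration_s:
            return times
        times.append(t)

day = 86_400.0
event_width_s = 1.0  # assumed duration of a single "new physics" event
for rate in (1e-6, 1e-2, 10.0):  # per second: too rare, detectable, too frequent
    n = len(poisson_event_times(day, rate))
    resolvable = rate * event_width_s < 0.1  # events rarely overlap
    plentiful = n > 100                      # too many to dismiss as outliers
    print(f"rate={rate:g}/s: {n} events/day, "
          f"resolvable={resolvable}, plentiful={plentiful}")
```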

In practical terms, for a typical atomic physics experiment, the events should last longer than a second, and there should be hundreds of them per day of operation. Even then, the experimentalist should be gutsy enough to put his or her credibility and comfort at risk and publicly report the data as unusual. “New physics” looks for the right fellow to notice and appreciate it.

P.S. I would like to thank Dima Budker and Jeff Sherman for discussions on this topic.

Search for topological dark matter with atomic clocks

By monitoring the correlated time discrepancy between two spatially separated clocks, one could search for the passage of topological defects (TDs), such as the domain wall pictured here. The domain wall moves at galactic speeds of ~300 km/s. Here the clocks are assumed to be identical. Before the TD arrives at the first clock, the apparent time difference is zero, as the clocks are synchronized. As the TD passes the first clock, that clock runs faster (or slower, depending on the TD-SM coupling), and the clock time difference reaches its maximum value. The time difference stays at that level while the defect travels between the two clocks. Finally, as the defect sweeps through the second clock, the time difference vanishes. For an intercontinental-scale network, l ~ 10,000 km, the characteristic time is ~30 seconds.
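The template described in the caption is simple enough to write down: zero difference before the wall arrives, a plateau while it travels between the clocks, and zero again after it sweeps the second clock. A schematic sketch, idealizing the wall as thin and using an arbitrary placeholder amplitude for the unknown TD-SM coupling:

```python
def clock_time_difference(t, t_arrival, separation_km=10_000.0,
                          v_km_s=300.0, amplitude_ns=1.0):
    """Idealized clock-difference template for a domain-wall crossing.

    The amplitude is an arbitrary placeholder for the unknown coupling;
    its sign depends on whether the perturbed clock runs fast or slow.
    """
    transit_s = separation_km / v_km_s  # ~33 s for intercontinental baselines
    if t < t_arrival:
        return 0.0           # clocks still synchronized
    if t < t_arrival + transit_s:
        return amplitude_ns  # wall between the clocks: constant offset
    return 0.0               # wall has swept both clocks

# Sample the template around a wall arriving at the first clock at t = 100 s.
print([clock_time_difference(t, 100.0) for t in (50, 110, 140)])  # [0, 1, 0]
```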

Despite solid observational evidence for the existence of dark matter, its nature remains a mystery. A large and ambitious research program in particle physics assumes that dark matter is composed of heavy-particle-like matter. That community hopes to see events of dark-matter particles scattering off individual nuclei. Considering the null results of the latest particle-detector experiments (see the excellent discussion here), this assumption may not hold true, and there is significant interest in alternatives.

Now, what about atomic clocks? Atomic clocks are arguably the most accurate scientific instruments ever built. Modern clocks approach a fractional inaccuracy of $latex 10^{-18}$, which translates into astonishing timepieces guaranteed to keep time to within a second over the age of the Universe. Attaining this accuracy requires that the quantum oscillator be well protected from environmental noise and that perturbations be well controlled and characterized. This opens intriguing prospects for using clocks to study subtle effects, and it is natural to ask whether such accuracy can be harnessed for dark-matter searches.
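The “second over the age of the Universe” claim is a one-line check: the Universe is about 13.8 billion years, or roughly $latex 4.4 \times 10^{17}$ seconds, old:

```python
# Accumulated error of a clock with 1e-18 fractional inaccuracy
# running since the Big Bang (round-number inputs).
age_of_universe_s = 13.8e9 * 365.25 * 86_400  # ~4.4e17 s
print(1e-18 * age_of_universe_s)              # ~0.44 s of accumulated error
```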

Posing and answering this question is the subject of our recent paper:
Hunting for topological dark matter with atomic clocks, A. Derevianko and M. Pospelov, arXiv:1311.1244.

We consider one of the alternatives to heavy-particle dark matter and focus on so-called topological dark matter. The argument is that, depending on the initial quantum-field configuration at early cosmological times, light fields could produce dark matter via coherent oscillations around the minimum of their potential and/or form non-trivial stable field configurations in space (topological defects). The stability of this type of dark matter can be dictated by topological reasons.

I know, this sounds a little far-fetched to an atomic physicist. Well, ferromagnets can serve as a familiar analogy. There, the topological defects are domain walls separating domains of well-defined magnetization. Above the Curie point the sample is uniform, but as the temperature is lowered, the domains begin to form. One could argue that as the Universe cooled down after the Big Bang, quantum fields underwent a similar phase transition.

Generically, one can speak of 0D topological defects (monopoles), 1D defects (strings), and 2D defects (domain walls). Dark matter would form out of such defects. The light masses of the fields forming the defects could lead to a large, macroscopic size for a defect. Based on observations and simulations, astronomers have a good idea of how dark matter moves around the Solar system: the defects would fly through the Earth at galactic velocities of ~300 km/s. Now, if the defects couple (non-gravitationally) to ordinary matter, one can think of a detection scheme using sensitive listening devices, e.g., atomic clocks. In fact, one would benefit from a network of clocks, since events occurring at different locations could be cross-correlated.

Phenomenologically, the dark matter interaction with ordinary matter could be described as a transient variation of fundamental constants. The coupling would shift atomic frequencies and thus affect time readings. During the encounter with a topological defect, as it sweeps through the network, initially synchronized clocks will become desynchronized. This is illustrated in the figure.

The real advantage of clocks is that they are ubiquitous. Several networks of atomic clocks are already operational. Perhaps the best known are the Rb and Cs atomic clocks on board the satellites of the Global Positioning System (GPS) and other satellite navigation systems. Currently there are about 30 satellites in the GPS constellation, orbiting the Earth at a radius of 26,600 km with a period of half a sidereal day. As defects sweep through the GPS constellation, satellite clock readings are affected. For two diametrically opposed satellites, the maximum time delay between clock perturbations would be ~200 s, assuming a sweep at the typical speed of 300 km/s. Different types of topological defects (e.g., domain walls versus monopoles) would yield distinct cross-correlation signatures. While the GPS is affected by a multitude of systematic effects, e.g., solar flares and temperature and clock-frequency modulations as the satellites pass in and out of the Earth’s shadow, none of the conventional effects would propagate through the network at 300 km/s. Additional constraints can come from analyzing the extensive terrestrial network of atomic clocks at GPS tracking stations.
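The ~200 s figure follows directly from the geometry, as a quick check shows:

```python
# Maximum clock-perturbation delay for two diametrically opposed
# GPS satellites swept by a defect at galactic speed.
orbital_radius_km = 26_600.0          # GPS orbital radius
defect_speed_km_s = 300.0             # typical galactic velocity
baseline_km = 2 * orbital_radius_km   # diameter of the constellation
print(baseline_km / defect_speed_km_s)  # ~177 s, i.e. ~200 s as quoted
```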

The performance of GPS on-board clocks certainly lags behind state-of-the-art laboratory clocks. Focusing on laboratory clocks, one could carry out a dark-matter search employing the vast network of atomic clocks at national standards laboratories used for evaluating the TAI timescale. Moreover, several elements of high-quality optical links for clock comparisons have already been demonstrated in Europe, including a 920 km link connecting two laboratories in Germany.

Naturally, I hope that this proposal motivates dark-matter searches with atomic-physics tools, pushing our “listening capabilities” to the next level. It could provide a fundamental-physics motivation for building high-quality terrestrial and space-based networks of clocks. Since the detection schemes would benefit from improved accuracy of short-term time and frequency determination, following this path could also stimulate advances in ultra-stable atomic clocks and Heisenberg-limited timekeeping.

Why is the atomic many-body problem so difficult?

One could rightfully state that the atomic-structure problem has been around for a very long time. Yes, this is true; in fact, quantum mechanics was invented to explain atomic properties. Then why do we still struggle to solve it?

Should we be embarrassed by our inability to solve this basic problem? Sure, we can solve it approximately, but solving it accurately is another story.

So what is holding us back? It is the very same entanglement and complexity of Hilbert space (that is where wave functions live) that make quantum computing so powerful. To illustrate this enormous complexity, I'll take my favorite atom, cesium. It has 55 electrons. With 3 degrees of freedom per electron (x, y, and z), the Cs wave function depends on $latex 55 \times 3 = 165$ coordinates. As a result of the calculation, I would need to store the wave function. If I were to take a very poor grid of 10 points per coordinate, the storage would require $latex 10^{165}$ memory units.

$latex 10^{165}$ is of course an astronomically large number; in fact, it exceeds the estimated number of atoms in the Universe, $latex 10^{80}$. So even if we were able to compute the Cs wave function, there is no plausible way to store it in usable form.
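The estimate takes a few lines to reproduce (Python integers are unbounded, so the count itself is easy to compute even if the wave function is not):

```python
# Storage estimate for the cesium wave function on a crude grid.
n_electrons = 55            # cesium
n_coords = 3 * n_electrons  # x, y, z per electron -> 165 coordinates
grid_points = 10            # deliberately poor grid per coordinate
memory_units = grid_points ** n_coords
print(f"~10^{len(str(memory_units)) - 1} memory units")  # ~10^165
```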