## Posts Tagged 'Math Concepts'

### The predictability of randomness and the age of Earth

The Canyon Diablo meteorite, which was used to provide the first accurate measurement of Earth’s age. [Credit: Geoffrey Notkin]

If you want something completely predictable, pick something random. Discworld aside, if you toss a coin, it will land either on heads or tails, every time, with very close to a 50% chance of either option; if you throw a regular fair die, it will always land on one of the six sides with nearly equal likelihood.

By contrast, a process like throwing a baseball is a complex thing: the shape of the ball, the seams, the flow of air around it, the spin you give it when you throw, and so forth are all completely non-random factors in determining the ball’s trajectory. Sure, there are random factors too, but the crazy unpredictability of a knuckleball comes from the action of forces, each of which is technically deterministic in character. Of course, arguably the coin and die are governed by the same forces, but the options on how they land are restricted enough that the net result is randomness.

However, some systems are truly random, completely free of such considerations — and those are phenomenally reliable. The best example may be radioactive decay of many atomic nuclei. The random process of radioactive decay provides the best and most accurate way to measure the age of rocks, which in turn lets us measure the age of Earth.

### Think your love life is dangerous? Try dating meteorites

The April 20 episode of Cosmos dealt in part with the first reliable dating of Earth, using meteorite fragments. Most meteorites formed in the early Solar System, when the cloud of gas and dust collapsed to make the Sun and planets, so they are a remarkably pristine sample of what things were like 4.5 billion years ago. By contrast, most Earth rocks are significantly younger, since they were molten when our planet was born.

To measure the age of Earth, researchers — including the featured scientist Clair Patterson — performed two sets of measurements. They determined the relative amounts of different forms of lead and uranium in crystals known as zircons, embedded within the Canyon Diablo meteorite fragment that formed Meteor Crater. (See Maki Naro’s comics on visiting the crater for more information about the impact.) Patterson found that the amount of environmental lead from leaded gasoline was contaminating his experiment, inspiring him to build an ultra-clean lab. He also became a crusader to remove lead from gasoline, realizing rightly that the amount of lead being exhausted into the air was harmful to human health and the environment.

But back to our main topic! While the first measurement of Earth’s age involved finding precise amounts of uranium and different forms of lead, the second measurement determined the rate of decay of uranium into lead. That rate is commonly written as its half-life: the amount of time for half the uranium nuclei to decay. The half-life for uranium[1] is about 4.5 billion years, so it’s a good choice to date ancient rocks.
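To see how those two measurements combine into an age, here’s a minimal sketch in Python. It assumes a single parent-to-daughter decay and no daughter atoms present when the rock formed, which is a simplification of real uranium-lead dating (that method compares two uranium isotopes and corrects for primordial lead), but the core logic is the same: count atoms, know the half-life, solve for time.

```python
import math

def age_from_ratio(daughter_parent_ratio, half_life):
    """Age of a sample from the ratio of daughter to parent atoms.

    Simplified model: one decay chain, no daughter atoms at formation.
    N(t) = N0 * 2**(-t / half_life), so daughter/parent = 2**(t/T) - 1,
    which inverts to t = T * log2(1 + daughter/parent).
    """
    return half_life * math.log2(1 + daughter_parent_ratio)

# A rock in which half the original uranium-238 has already become
# lead (ratio = 1) is exactly one half-life old: 4.5 billion years.
print(age_from_ratio(1.0, 4.5e9))  # 4.5e9

# Three daughter atoms per parent atom means two half-lives have passed.
print(age_from_ratio(3.0, 4.5e9))  # 9.0e9
```

The takeaway: the atom counts come from the lab, the half-life comes from decay-rate measurements, and the age falls out of simple algebra.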

The uranium-lead transition isn’t the only one used. There’s also the rubidium-strontium transition (with a half-life of 49 billion years), the potassium-argon decay (1.3 billion years), and a handful of others. The key in all these cases is to have a rock — meteorite or otherwise — with enough atoms of the given types to perform measurements. One we don’t see in the list is “carbon dating”: carbon-14 has a half-life of about 5,700 years, so it’s useful for archaeology and dating the remains of animals from the relatively recent past, but utterly useless for measuring the age of Earth.
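A quick calculation shows why carbon-14 is hopeless for dating Earth. The fraction of unstable nuclei remaining after a given time is 2 raised to the power of (elapsed time divided by the half-life, negated); plugging in numbers:

```python
def fraction_remaining(age, half_life):
    """Fraction of the original unstable nuclei left after `age`,
    given the isotope's half-life (same time units for both)."""
    return 2.0 ** (-age / half_life)

# Carbon-14 (half-life ~5,700 years) after 57,000 years: ten
# half-lives, so about 0.1% remains -- near the practical limit
# of carbon dating.
print(fraction_remaining(57_000, 5_700))   # ~0.00098

# After 4.5 billion years the fraction is so tiny it underflows
# to zero in floating point: no measurable carbon-14 survives.
print(fraction_remaining(4.5e9, 5_700))    # 0.0
```

By contrast, uranium-238 over the same 4.5 billion years gives a fraction of 0.5: half the original atoms are still there to count.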

But why is a random process like radioactive decay useful as a clock?

### A random, precise clock

If you take a single uranium atom, there’s no way to predict exactly when it will decay. The best you can do is assign a probability: how likely it is to decay in a given amount of time. But here’s the deal: that probability is independent of stuff like location, temperature, pressure, and the chemical composition of materials around it.[2] That means a uranium atom on Earth’s surface, deep underground, on the Moon, or on an asteroid will have the same probability of decay.

As with coins and dice, the value of randomness comes from having many nuclei to study. Then, instead of looking at the probability of a single nucleus decaying, we can ask how many will decay on average in a given amount of time. That average changes — the more nuclei you have, the more likely it is that one or more in the sample will decay — but it does so in a very predictable way. Specifically, the rate of decay (that is, the number of nuclei decaying in a given amount of time) depends directly on the number of nuclei you have.

So, let’s see this in action. The following video is a simulation of a sample of fake nuclei in a grid. Each point in the grid starts off with 1000 nuclei, each of which has a 5% chance of decaying in one unit of time, which could be one second, one minute, one century…whatever. The color represents the ratio of the number of unstable nuclei to the nuclei they decay into, growing bluer as the unstable nuclei disappear.
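For readers who’d like to tinker, here’s a minimal sketch of one grid cell from that kind of simulation: 1000 nuclei, each with a 5% chance of decaying per time step. (The video’s actual code isn’t shown here; this is just the same idea in a few lines of Python.)

```python
import math
import random

random.seed(42)        # fixed seed so the run is repeatable

nuclei = 1000          # unstable nuclei in one grid cell
p_decay = 0.05         # chance each nucleus decays per time step
remaining = [nuclei]

# Each step, every surviving nucleus independently decays with
# probability 5%; record how many survive.
while remaining[-1] > 0:
    survivors = sum(1 for _ in range(remaining[-1])
                    if random.random() >= p_decay)
    remaining.append(survivors)

# First time step at which the population has dropped to half or less:
measured = next(t for t, n in enumerate(remaining) if n <= nuclei / 2)

# Analytic half-life for a 5% per-step decay chance:
# ln(2) / -ln(1 - 0.05), roughly 13.5 time steps.
analytic = math.log(2) / -math.log(1 - p_decay)
print(measured, round(analytic, 2))
```

Even though every individual decay is a coin flip, with 1000 nuclei the measured halving time lands within a step or two of the analytic 13.5 on essentially every run.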

However, we can’t actually see decays happening like that in the lab. Instead, we use a Geiger counter to measure any particles given off by the decay, which are typically gamma ray photons, electrons, positrons, or alpha particles (which are the same thing as helium nuclei). The graph above the grid in the video is what the Geiger counter would read, including a small amount of “noise” produced by other radioactive sources in the environment.

And here’s that Geiger counter reading again, but this time with the half-life marked out. You can see that although each nucleus decayed randomly, there’s nothing random about the outcome: the number of decays follows a clear curve with almost no deviations. This type of curve is known as an exponential decay, because it’s described mathematically using the exponential function.

Half-life calculation for a large population of atoms. The half-life is the point in the cross-hairs, which is about 13.5 in these time units. I included both the calculation with and without correcting for background radiation. In this case, the difference is small. [Credit: moi]

If you have fewer nuclei to start off with, then the effect of background radiation is more important. Here’s what that might look like:

If we fail to correct for the background radiation, we can mis-estimate the half-life. However, the half-life is the same, no matter how many nuclei we start with! It’s a very accurate clock.

Nuclear decay calculation for a smaller population of atoms, with the same half-life as before. Now we can see that if we fail to correct for background radiation, we run into some trouble: the red crosshairs show a shorter half-life than the corrected black crosshairs. [Credit: moi]
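Here’s a small sketch of why the correction matters, using made-up numbers (a true half-life of 13.5 time units, to match the figures, plus a constant background). One simple way to read a half-life off the curve is to find when the reading falls to half its starting value; how the bias runs depends on the read-off method, but with this one the uncorrected estimate comes out too long:

```python
TRUE_HALF_LIFE = 13.5   # time units (hypothetical, matching the figures)
SIGNAL0 = 1000.0        # initial decays per unit time from the sample
BACKGROUND = 200.0      # steady counts from environmental radiation

def counts(t):
    """Geiger-counter reading: sample decays plus constant background."""
    return SIGNAL0 * 2.0 ** (-t / TRUE_HALF_LIFE) + BACKGROUND

def time_to_halve(level0, reading, lo=0.0, hi=200.0):
    """Bisect for the time at which `reading(t)` falls to level0 / 2."""
    target = level0 / 2
    for _ in range(100):
        mid = (lo + hi) / 2
        if reading(mid) > target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Naive estimate: when does the raw reading halve? The background
# never decays, so this misses the true half-life.
naive = time_to_halve(counts(0), counts)

# Corrected estimate: subtract the background first, then halve.
corrected = time_to_halve(counts(0) - BACKGROUND,
                          lambda t: counts(t) - BACKGROUND)

print(round(naive, 2), round(corrected, 2))  # corrected gives 13.5
```

Subtracting the background recovers the true half-life exactly; skipping that step biases the estimate, and the fewer nuclei you have relative to the background, the worse the bias gets.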

Finally, if there are very few nuclei in our sample, the half-life estimate can be hard to perform, because the background radiation can swamp the Geiger counter readings.

That means when researchers are measuring the ages of rocks, they are rarely using Geiger counters to do so! Instead, they measure the ratios of atoms (using mass spectrometry, an interesting topic in its own right), then rely on the predictability of random decay to determine how those ratios changed over billions of years.

### Reliability, randomness, and a creationist fallacy

This method is generally known as radiometric dating, and it’s incredibly reliable: multiple measurements give the same results. In this way, we know the age of Earth to within about 20 million years, a remarkable 99.6% accuracy. (Sure, 20 million years sounds like a big number, but relative to 4.5 billion years, it’s pretty small!)

However, a few people still go against the evidence and dispute the measured age of Earth. The most vocal of these are the young-Earth creationists, who believe Earth is somewhere between 6,000 and 10,000 years old (depending on the version). For this to work, of course, radiometric dating has to be flawed in some fundamental way, or God has to be a deceiver by creating rocks to look old when they’re actually young. If the deceiver God is real, we’re all doomed: there’s no possible way to know the true age of Earth from science (and frankly I’d worry if my theology let God be that nasty).

To assert that radiometric dating is flawed, though, is almost equally problematic. Again, each unstable nucleus decays randomly, and that’s where the half-life measure comes from. For this process to give false results, you’d have to have every type of nucleus change its probability of decay in exactly the same way…which makes no sense. If the speed of light changes in time (as some have suggested), it might rescale the decay rates, but different atom types would then yield different ages for the same rocks, instead of giving consistent results.

In other words, the randomness of decay is precisely why we can trust the measurements of Earth’s age. You can count on randomness.

### Notes

1. I’m leaving out many details here. Uranium has several different isotopes (different numbers of neutrons), which are all unstable. The one of interest to us is uranium-238; another common isotope, uranium-235, has a half-life of about 700 million years. That isn’t long enough to date meteorites, though it could be used to establish the age of many younger rocks.
2. There are complications, of course. For example: a mineral that’s buried is unlikely to be bombarded by cosmic rays, which can in some cases change nuclei from one isotope to another. So, location does matter to a degree.