
The Fast Fourier Transform (FFT)

The Fast Fourier Transform (FFT) is one of the most important algorithms of all time, used in applications from watching videos to radar and sonar, 5G, and WiFi. It was discovered by scientists trying to detect covert nuclear weapons tests, and had they discovered it sooner, it might have put a stop to the nuclear arms race.

The nuclear arms race began in the aftermath of the devastating atomic bombings of Hiroshima and Nagasaki. The U.S. put forward the Baruch Plan, under which an international body would control all radioactive materials on Earth, from mining to refining to their use for peaceful purposes such as nuclear power generation. But the Soviets rejected the proposal, and the arms race was on.

The U.S. then conducted nuclear tests in remote places like the Arctic and South Pacific islands. In 1954, the U.S. tested a new thermonuclear design at Bikini Atoll, which released two and a half times more energy than expected, leading to public outcry against nuclear testing. World leaders began calling for a comprehensive test ban, and the U.S., Soviet Union, and U.K. agreed to a moratorium on nuclear testing. To show just how serious they were, they even stopped all testing during the negotiations, which is why 1959 is the only year in a long stretch when no nuclear weapons were detonated.

But there was a big problem with negotiating a comprehensive test ban: how do you know that the other side will hold up their end of the bargain? The U.S. worried that the Soviets would continue testing covertly and leapfrog them technologically, and the Soviets similarly distrusted the U.S. So to address these concerns, they convened the Conference of Experts to Study the Possibility of Detecting Violations of a Possible Agreement on Suspension of Nuclear Tests. Seriously, that was the name, I am not making this up.

Now, detecting atmospheric tests was fairly straightforward: the radioactive isotopes they produce disperse in the atmosphere and can be detected thousands of kilometers away. Underwater tests produce distinctive sounds that travel efficiently around a kilometer below the surface of the ocean, and these can be picked up by hydrophones. But underground tests are much harder to detect from afar because their radiation is mostly contained, and the Soviets refused to allow on-site compliance visits, which they regarded as espionage. This is ultimately why, when a test ban treaty was signed in 1963, it was only a partial ban. It banned testing underwater, in the atmosphere, and in space, places where compliance could be verified, but it didn't ban testing underground, for the simple reason that it was almost impossible to verify compliance.

But scientists had been trying to find a way to reliably detect underground detonations. Following the Geneva meeting, American and Soviet scientists formed a working group to discuss the issue, and their idea was to use seismometers located outside the countries where nukes were being tested to detect the faint ground vibrations caused by explosions. The problem was: how do you distinguish a nuclear test from an earthquake? Every day around the world there are close to 300 earthquakes with a magnitude of three or greater. In addition, part of the purpose of detecting your adversary's tests is to spy on them, to know how big of an explosion they can make. But the seismometer signal depends not only on the yield of the device but also on how deep it was buried; for a given yield, deeper explosions appear smaller. So scientists wanted a method to reliably determine whether a given signal was a bomb or an earthquake, and if it was a bomb, how big it was and how deep it was buried.

They knew that all this information was contained in the seismometer signal, but it couldn't just be read off by looking at the squiggles. They needed to know how much of different frequencies were present, which meant they needed to take a Fourier transform. A Fourier transform is a way of decomposing a signal into the pure sine waves, each with its own amplitude and frequency, that add to make it up. This seems a bit like magic, since the sum of a set of sine waves can look arbitrarily complicated and nothing like its component parts. There are some elegant ways to understand how Fourier transforms work, shout out to the awesome video by 3Blue1Brown, but I wanna take more of a brute force approach: if you wanna know how much of a particular sine wave is in a signal, just multiply the signal by that sine wave at each point and then add up the area under the curve.
As a simple example, say our signal is just a sine wave with a certain frequency, but pretend we don't know that and we're trying to figure out which sine waves add to make it up. If you multiply this signal by a sine wave of an arbitrary frequency, the two waves are uncorrelated: you're just as likely to find places where they have the same sign, both positive or both negative, as places where they have opposite signs. So when you multiply them together, the area above the x-axis equals the area below the x-axis, and the total comes out to zero. But if you multiply the signal by a sine wave of the matching frequency, the two waves always have the same sign, so their product is never negative, and the area adds up to a large number. That large number tells you how much of that sine wave is in the signal.

For real-world signals, the Fourier transform has to be a discrete Fourier transform, because the signals are finite and made up of individual samples, or data points. The number and size of the frequency bins is determined by the number of samples in the signal and how closely they're spaced. The shorter the signal, the lower the frequency resolution, and the wider each frequency bin. The lowest non-zero frequency is the one whose period equals the duration of the signal, and the higher frequency bins are integer multiples of this fundamental frequency. The total number of frequency bins equals the number of samples in the signal.
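The multiply-and-sum recipe above can be sketched in a few lines of Python. The 3 Hz signal, the 64-sample grid, and the 5 Hz probe frequency here are made up for illustration; the point is just that a matching probe frequency produces a large sum while a mismatched one sums to roughly zero:

```python
import math

def correlation(signal, freq, n_samples, duration):
    """Multiply the signal by a probe sine wave point by point
    and sum the products (a discrete 'area under the curve')."""
    total = 0.0
    for k in range(n_samples):
        t = k * duration / n_samples
        total += signal(t) * math.sin(2 * math.pi * freq * t)
    return total

# Hypothetical signal: a pure 3 Hz sine wave (we pretend not to know this).
sig = lambda t: math.sin(2 * math.pi * 3 * t)

# Probing with the matching frequency gives a large sum;
# an uncorrelated frequency cancels to (nearly) zero.
match = correlation(sig, 3, 64, 1.0)   # large
miss = correlation(sig, 5, 64, 1.0)    # ~0
```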

If a signal is composed of eight samples, the discrete Fourier transform has eight frequency bins, ranging from 0 to 7 times the fundamental frequency. The first bin corresponds to a frequency of 0 and essentially measures the DC offset: each data point is multiplied by 1 and all the points are added together, in this example summing to 0. The second frequency bin corresponds to the frequency that fits exactly one period in the duration of the signal, in this case 1 hertz. To calculate it, each point is multiplied by both a sine wave and a cosine wave of this frequency, and the products are summed separately. This process is then repeated for 2 hertz, 3 hertz, and so on, up to 7 hertz, giving the discrete Fourier transform of the signal.
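As a minimal sketch of this direct calculation (the eight-sample 1 Hz sine is the example signal from above, not anything from the seismometer data):

```python
import math

def naive_dft(samples):
    """Direct discrete Fourier transform: for each frequency bin,
    multiply every sample by a cosine and a sine of that frequency
    and sum the products separately (the cosine sum is the real part,
    the sine sum the negated imaginary part). Costs N^2 multiplications."""
    n = len(samples)
    bins = []
    for f in range(n):  # frequency bins 0 .. N-1 times the fundamental
        re = sum(samples[k] * math.cos(2 * math.pi * f * k / n) for k in range(n))
        im = -sum(samples[k] * math.sin(2 * math.pi * f * k / n) for k in range(n))
        bins.append(complex(re, im))
    return bins

# Eight samples of a 1 Hz sine over one second: bin 0 (the DC offset)
# is zero, and all the energy lands in bins 1 and 7 (7 mirrors -1).
samples = [math.sin(2 * math.pi * k / 8) for k in range(8)]
spectrum = naive_dft(samples)
```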

Calculating the discrete Fourier transform in this way requires $N^2$ complex multiplications, which is doable for 8 samples but would take over 3 years to complete for 1 million samples. To solve this, in 1963 John Tukey developed the Fast Fourier Transform, which requires only $N\log_{2}N$ calculations, meaning the bigger the data set, the greater the savings. To compute it, the samples are split into even- and odd-indexed points, the Fourier transform of each half is taken, and the results are combined. The process is then repeated for each half, then each quarter, and so on down to single data points.

At first this doesn't look like a saving: you still need to multiply each half by the eight different frequency waves. But look only at the even points and compare the first four frequencies to the second four. In each case, at the locations of the samples, the values of the two sine waves are identical. A similar pattern holds for the odd-indexed points, except the values of one sine wave are the negatives of the other; more generally, the two are related by a complex number. What this means is that you don't have to redo the multiplications for the second half of the frequencies: once you've calculated the odd and even sums for the lower half of the frequencies, you can reuse those values to find the upper half. So you've effectively cut the number of calculations in half, but that's only a factor of two. How do you get down to $N\log_{2}N$? You repeat the same trick: split the samples again into even and odd points, and then again, repeatedly, until you're down to single data points. At each split, you can exploit the symmetries of sinusoidal functions to cut the number of calculations in half. And that is how the Fast Fourier Transform reduces $N^2$ calculations down to $N\log_{2}N$. It's why today, whenever anyone takes a Fourier transform, it's almost always done as a Fast Fourier Transform.
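The whole split-and-reuse scheme fits in one short recursive function. This is a minimal sketch of the radix-2 approach described above (requiring the sample count to be a power of two), not an optimized implementation:

```python
import cmath

def fft(samples):
    """Radix-2 Cooley-Tukey FFT. Split the samples into even- and
    odd-indexed points, transform each half recursively, then combine:
    the 'twiddle factor' exp(-2*pi*i*f/N) is the complex number relating
    the two halves, so the upper half of the frequency bins reuses the
    sums already computed for the lower half."""
    n = len(samples)
    if n == 1:
        return list(samples)
    even = fft(samples[0::2])
    odd = fft(samples[1::2])
    out = [0j] * n
    for f in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * f / n) * odd[f]
        out[f] = even[f] + twiddle            # lower half of the bins
        out[f + n // 2] = even[f] - twiddle   # upper half, reused sums
    return out
```

Each level of recursion costs about $N$ multiplications, and there are $\log_{2}N$ levels of splitting, which is exactly where the $N\log_{2}N$ count comes from.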

The physicist Richard Garwin approached an IBM researcher, James Cooley, to program a computer to run the FFT algorithm, but he didn't tell him the real reason was to detect underground Soviet nuclear tests; he said it was to work out the spins in a crystal of helium-3. Cooley and Tukey published the algorithm in a seminal 1965 paper, and its use immediately took off, but it was too late to secure a comprehensive nuclear test ban. By that time, the U.K., France, and China had joined the Soviet Union and the U.S. as nuclear powers. And the partial test ban treaty, far from deescalating the nuclear arms race, just sent it underground.

The thinking was: if you're only allowed to test underground, then you'd better be testing extensively so as not to fall behind all the other nuclear states. So in the three decades after 1963, around 1,500 nuclear weapons were detonated, roughly one a week for 30 years. This testing facilitated the construction of an absurd number of nukes. At the peak in the mid-1980s, 70,000 nuclear warheads were in existence. Total expenditure on nukes in the 20th century is estimated at around $10 trillion each for the U.S. and the Soviet Union, adjusted for inflation.

If only scientists had been confident in their ability to remotely detect underground tests in the early 1960s, then a comprehensive test ban could have been reached, stopping the nuclear arms race before it really got going. To check how realistic this is, I asked Richard Garwin himself.

Well, it’s a good story.

It is a good story.

It would’ve helped and it might have turned the tide.

But that would’ve required an earlier discovery of the Fast Fourier Transform and as luck would have it, it was discovered earlier but then forgotten.

All the way back in 1805, the mathematician Carl Friedrich Gauss was studying the newly discovered asteroids Pallas, Ceres, and Juno. To determine the orbit of Juno, Gauss devised a novel approach to harmonic analysis and jotted it down in his notes, but he later used a different method and never thought to publish that first insight. Today we can see that Gauss had figured out the discrete Fourier transform before Joseph Fourier himself published it in 1807, and that he had developed the same Fast Fourier Transform algorithm as Cooley and Tukey a century and a half earlier.

The reason his breakthrough was not widely adopted is that it only appeared after his death, in volume three of his collected works, written in non-standard notation in a 19th-century version of Latin. If Gauss had realized the significance of his discovery and published it in a way that others could understand, the Fast Fourier Transform (FFT) might have had an even more dramatic impact on our world. The FFT is now used in a wide range of applications, from solving differential equations to radar and sonar, studying crystal structures, WiFi and 5G, and most compression algorithms, like those that allow us to watch and listen to videos.

Most people don’t think about the cumulative impact of their life’s work, but with an average career spanning 80,000 hours, it is our biggest opportunity to make a difference in the world. For over 10 years, 80,000 Hours has conducted research with academics at Oxford University on how to find a fulfilling career that makes a positive difference, and they offer tons of free resources to help: research-backed guides, a regular newsletter and podcast, and a curated job board. Their recommendations are evidence-based, specific, and actionable. If you’re interested in advice that goes beyond the cliché of “follow your passion”, then check out 80,000 Hours. Everything from their podcast to their job board is free, and signing up for the newsletter will get you a free copy of their in-depth career guide. So if you want to tackle the world’s most pressing problems, sign up now at 80000hours.org/veritasium.

I’d like to thank 80,000 Hours for sponsoring this video, and thank you for watching.