New Numbers

Algorithms become really interesting when seemingly innocuous, standard inputs produce entirely new outputs.

Algorithms seem to be a natural consequence of repetitive actions. For most humans, doing the same thing in the same way over and over gets boring. We thus wonder if there is a way to codify those repetitive actions to streamline the process. A lot of modern math seems to be a result of the codification of the processes used to manipulate numbers. When you multiply 157×2,693, you probably don’t count the individual units in each group in sequence. You likely use a calculator (programmed with algorithms) or a pencil-and-paper method that has you starting with 7×3.
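
The pencil-and-paper method is itself a small algorithm, easy enough to write down as a program. Here is a minimal sketch in Python of the grade-school procedure; the function name is ours, purely for illustration:

```python
def long_multiply(a: int, b: int) -> int:
    """Grade-school multiplication: take each digit of a, rightmost
    first (the 7 in 157), multiply it by b, shift the partial product
    left by its place value, then add the partial products together."""
    total = 0
    for place, digit in enumerate(reversed(str(a))):
        total += int(digit) * b * 10 ** place
    return total

print(long_multiply(157, 2693))  # 422801, the same answer as 157 * 2693
```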

One way to interpret the history of numbers is that certain numbers didn’t exist until they were produced by an algorithm. Think of negative numbers. They are common enough now, especially for those of us who live in cold climates, but if you think about it, they aren’t intuitive. It’s hard to imagine ancient humans looking at a bunch of mammoths and thinking there might one day be a negative amount of them. There could be ten mammoths on the plain, two mammoths, or no mammoths, but negative five mammoths? Not likely.

In the book Arithmetic, Paul Lockhart suggests that negative numbers are the result of subtraction. Imagine you are a farmer 3,000 years ago. The addition algorithm says that when you have three bags of grain and you trade for two more, you will have five bags of grain. But then you decide to give one to your poor cousin. Now you have to “un-add,” or subtract. The act of subtracting is really the acknowledgment of the negative. You have five bags of grain and negative one bag of grain. And if you are interested in the processes you’ve just exposed your numbers to, you’ve got an interesting problem.

Lockhart says, “The issue here is symmetry—or rather, the lack thereof. With addition, no matter what number I have and no matter what number I add to it, their sum is a perfectly valid entity, already extant in the realm of numbers. With the subtraction operation, however, we have an unpleasant restriction: the number we are taking away cannot exceed the number we have. There definitely already is a number that when added to three makes five, but (at the moment) none of our numbers play the role of ‘the thing that when added to five makes three.’”1 Use of subtraction quite likely prompted the idea of negative numbers, which aren’t obvious in the physical representations of amounts we would have encountered in everyday life.

Essentially, our process might have gone something like this. Let’s say our ancient subtraction algorithm for two values is something along the lines of: take as input two distinct, countable whole quantities, then remove the quantity of the second value from the quantity of the first value. So five bags of grain minus one bag of grain becomes four bags of grain. But because our algorithm didn’t say anything about the second quantity having to be smaller than the first, we could easily input six and then nine. Which leaves us with what? The non-intuitive concept of a number representing negative three.
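
To make the point concrete, here is a minimal sketch of that subtraction rule in Python. Nothing historical is implied; the function name and examples are ours:

```python
def subtract(first: int, second: int) -> int:
    """Remove the second quantity from the first. Nothing in the rule
    says the second quantity must be smaller than the first, so the
    process happily produces values no pile of grain could represent."""
    return first - second

print(subtract(5, 1))  # 4: four bags of grain, perfectly intuitive
print(subtract(6, 9))  # -3: a "quantity" no one has ever counted
```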

The same thinking gives us an understanding of how we got irrational numbers. After some playing around with numbers and their properties, a process was invented called “square root.” The square root of a number is another number that, when multiplied by itself, gives you the original number. So the square root of 9 is 3, and the square root of 64 is 8. We can have a lot of fun plugging various number inputs into the algorithm that calculates a square root. Some numbers don’t get such pretty results, and their square root is a fraction (the square root of 6.25, for example, is 2.5). But they are still rational numbers. However, plug 2 into the algorithm for calculating square root (and why wouldn’t you? It’s such an accessible little number) and you get something entirely different. The world’s first irrational number.
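
One classic procedure for computing square roots is the Babylonian method (our choice here, purely for illustration). A short sketch shows how tame inputs give tidy outputs, while 2 gives a value that never resolves into a ratio of whole numbers:

```python
def square_root(n: float, iterations: int = 25) -> float:
    """Approximate a square root by the Babylonian method: start with a
    guess and repeatedly average it with n divided by that guess."""
    guess = n
    for _ in range(iterations):
        guess = (guess + n / guess) / 2
    return guess

print(square_root(9))     # 3.0, a tidy whole number
print(square_root(6.25))  # 2.5, a fraction, but still rational
print(square_root(2))     # roughly 1.41421356..., and the true value's
                          # digits go on forever without repeating
```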

Finding quality inputs

Algorithms are developed to get a certain output. As we’ve discussed, you start with inputs, you follow a process, and you end up with expected outputs. However, sometimes it’s not obvious which inputs will result in the desired outputs. So one way to use this model is to help you determine and refine what kinds of inputs to feed into your process in the first place. You can consider it “algorithmic thinking.” You may not have the luxury of a completely closed system where you can implement complete end-to-end automation, but the lens of algorithms can show you how to organize your system to leave as little to chance as possible.

In the late 1920s, one company developed a repeatable process to try to create the world’s first broad-spectrum antibiotic. After World War I, scientists had a good understanding of bacterial infection. They were able to identify some of the primary bacteria, such as Streptococcus, that caused incurable infections. They also understood how and why bacterial infection often occurred, such as from exposure to contaminated tools and instruments. But once infection took hold in the body, there was no way to stop it. What was missing was an understanding of the bacteria themselves: how they worked and where they were vulnerable.

Bayer, a giant German pharmaceutical company whose origins lie in dye-making, decided there was money to be made if they could find a cure for bacterial infections inside the body. There was some indication that a substance with bacteria-fighting properties could be created; earlier research had produced a treatment for syphilis called Salvarsan, but nothing else had been found in the subsequent 15 years.23

In charge of pharmaceutical research for Bayer was Heinrich Hörlein*. He thought the research to find bacteria-killing drugs lacked scale and depended too much on individual scientists. So at Bayer, he created an industrial system to identify possible antibacterial compounds and hired dozens of people to put each antibiotic candidate through the same algorithm-like process.

In The Demon Under the Microscope, Thomas Hager explains Hörlein knew the search would take years but also knew that success would result in enormous profits. Thus he aimed “to expand drug research from the lab of a single scientist to an efficiently organized industrial process with carefully chosen specialists guided by a coordinated strategy.” Hörlein hired Gerhard Domagk* to run the “recipe,” putting each compound created by the chemists through an identical testing and evaluation process to see if the result would be an antibiotic that was safe for humans.24

Domagk and his team tested the chemicals given to them by Bayer’s chemists. One of the most prolific chemists was Josef Klarer. He produced hundreds of new chemicals that were systematically tested by Domagk and his assistants. Each chemical compound was tested against a panel of “the most common and deadly bacteria: tuberculosis, pneumonia, Staphylococcus, E. coli, and Streptococcus pyogenes.” After a bit of initial refining, Hörlein and Domagk created “a smooth-functioning, reliable machine for discovery.” The chemicals were tested both in test tubes and in living animals. In the animals, each chemical was “delivered three different ways (intravenously, subcutaneously, and by mouth).” Every chemical was tested the same way in mice, and meticulous records were kept of each test.25

Time went by. Thousands of mice died. But they did not give up on their process. As the years went on, “despite the repeated negative results, Domagk changed neither his methods nor his approach.”26 The team knew their recipe for testing was correct, and one day it would produce a result that would allow them to refine their inputs.

In the fall of 1932, the methodology and patience paid off. Klarer decided to attach sulfur to an azo compound. Chemical Kl-695 was put through the testing process that thousands of other chemicals had been put through in the previous years. For the first time, the process produced the desired result: mice that recovered from bacterial infection with no apparent toxicity. Domagk didn’t yet know how it worked, only that it did. “Strangely, it did not kill strep in a test tube, only in living animals. And it worked only on strep, none of the other disease-causing bacteria. But given the number and deadliness of strep diseases, it worked where it counted.” Funnily enough, Domagk was on vacation during the first round of testing of Kl-695 and so missed witnessing the initial breakthrough.27 But by then the process was so entrenched that any one of the dozens of people on the team could run it.

The discovery of chemical Kl-695 allowed the team at Bayer to refine the inputs they put into their testing algorithm. “Klarer now made variations on Kl-695, finding that as long as sulfa was attached to the azo-dye frame in the correct position, the drug worked against strep. Attaching sulfa to an azo dye—any azo dye—somehow transformed it from an erratic, ineffective chemical into an efficient anti-strep medication.”28 They kept refining their inputs, discovering more effective azo-sulfa compounds, including Kl-730.
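
The shape of that search can be sketched abstractly. The sketch below is ours, not a model of Bayer’s actual workflow; the test, the hit rate, and the way variants are generated are invented placeholders. The point is only the structure: one fixed process applied identically to every input, with each hit used to refine the next round of inputs.

```python
import random

def passes_fixed_test(candidate: str) -> bool:
    """Stand-in for the one fixed evaluation every input receives (for
    Bayer: the same bacteria panel, delivery routes, and record keeping
    for every compound). Here a hit is simply rare and random."""
    return random.random() < 0.001

def variations_of(hit: str) -> list[str]:
    """Stand-in for generating refined inputs around a known hit, as
    Klarer did with his variations on Kl-695."""
    return [f"{hit}-variant-{i}" for i in range(10)]

def screen(candidates: list[str]) -> list[str]:
    """Run every candidate through the identical process; when one
    works, feed refinements of it back into the queue."""
    hits, queue = [], list(candidates)
    while queue:
        candidate = queue.pop(0)
        if passes_fixed_test(candidate):
            hits.append(candidate)
            queue.extend(variations_of(candidate))  # refine the inputs
    return hits

print(screen([f"compound-{i}" for i in range(5000)]))
```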

What the Bayer scientists didn’t realize was that it wasn’t the azo-sulfa combination that was the key, but rather the sulfa itself. Later research demonstrated the efficacy of the sulfa in treating strep infections. Structurally, sulfa looks a lot like PABA, a key nutrient for some disease-causing bacteria like strep. The bacteria bind to sulfa, mistaking it for PABA, but cannot metabolize it, which effectively kills them. Sulfa is cheap and widely available, so once Bayer’s sulfa antibiotic was on the market, many companies began to make their own.29

Bayer’s algorithm-like approach that led to the discovery of the antibiotic properties of sulfa had wide-reaching effects. “Sulfa also changed the way drug research was done. Before sulfa, small laboratories followed investigators’ hunches, and patent-medicine makers cobbled together remedies without testing the results. After sulfa, industrial-scale chemical investigation guided by specific therapeutic goals—the system for finding new medicines pioneered by Hörlein and his Bayer team—became the standard. Successful drugmakers were those who followed the Bayer model.”30 Bayer continued to discover many useful antibiotics using a system that codified the process as much as possible.