Tides of War, Part Two

The first part of this post appeared on the blog in November 2016. The second part was supposed to come out within a week or two, as soon as I found a little more on the post-war use of analog tide-predicting machines. Unfortunately, the search for “a little more” ended up taking far longer than expected and turning up nothing within easy reach. I’d skip the apology if it weren’t for the fact that the proper references to Anna Carlsson-Hyslop’s research (discussed in part one) were buried in the source note at the end of this post. Sorry.

Tide-predicting machines, the first of which appeared in the late nineteenth century, were an elegant mechanical solution to a complex mathematical problem. Used mostly to produce information useful to commercial shipping, during the two world wars they also played an important role in the planning of amphibious operations like the Normandy landings.

That contribution is interesting enough to give them a spot in the history of war, but the basic design of the British machines – using multiple gears to create an analog approximation of a mathematical function – also has an oblique connection to one of the most important technical achievements of the war: the mechanization of cryptanalysis.

Alan Turing is justifiably famous for his role in the breaking of the German Enigma cipher, and particularly for his contribution to designing the electro-mechanical computing tools that transformed the process. (Even if popular versions do a terrible disservice to the story by erasing everyone except Turing from the picture. The Imitation Game, I’m looking at you.) Less well known are some of Turing’s pre-war flirtations with the mechanization of mathematical problem-solving. Andrew Hodges’ biography describes two projects which Turing took on, at least briefly. The first, during his time at Princeton in 1937, was to use electromagnetic relays for binary multiplication, creating an extremely large number that could be used as the key for a cipher. This was, as Hodges puts it, a “surprisingly feeble” idea for a cipher but a practical success as far as constructing the relays was concerned.

The second project was an attempt to disprove the Riemann hypothesis about the distribution of prime numbers by calculating the Riemann zeta-function (for the hypothesis and the zeta-function, read Hodges or Wikipedia; there’s no chance of me describing them properly) and showing that not all the points where the function reached zero lay on a single line, as the hypothesis stated. An Oxford mathematician had already calculated the first 104 zeroes using punched-card machines to implement one approximation of the function. Since the zeta-function was the sum of circular functions of different frequencies, just like Thomson’s harmonic analysis of the tides, Turing realized it could be calculated using the same method. Or, more precisely, the machine could rule out enough values that only a few would have to be calculated by hand.
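For the mathematically inclined: in modern terms the approximation in question is the main sum of what is now called the Riemann–Siegel formula (my gloss, not something spelled out in the passage of Hodges I’m drawing on). Ignoring its correction term, the behaviour of the zeta-function on the critical line is captured by a finite sum of cosines,

Z(t) \approx 2 \sum_{n=1}^{N} \frac{\cos\left(\theta(t) - t\log n\right)}{\sqrt{n}}, \qquad N = \left\lfloor \sqrt{t/2\pi} \right\rfloor,

where \theta(t) is a known phase function and the zeroes of Z(t) coincide with the zeroes of the zeta-function on the line. Each term is a cosine whose “frequency” is \log n, so evaluating the sum at many values of t is formally the same job as summing tidal constituents — which is why gears cut to the right ratios could do the work.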

With a grant of £40 from the Royal Society, Turing and Donald MacPhail designed a machine that, like the tide calculators, used meshed gear wheels to approximate the thirty frequencies involved. The blueprint was completed by 17 July 1939, and the grinding of the wheels was underway when the war broke out and Turing joined the Government Code and Cypher School at Bletchley Park.

Nothing in the work that Turing did at Bletchley connected directly to the zeta-function machine, but, as Hodges notes, it was unusual for a mathematician like Turing to have any interest in using machines to tackle abstract problems of this sort. Clearly, though, Turing had been mulling the question of how machines could be applied to pure mathematics long before he became involved in the specific cryptanalytic problems that were tackled at Bletchley.

Of course, the secrecy surrounding code-breaking meant that no hint of the connection, or of any of Turing’s wartime work, would have leaked out to those operating the tide-predicting machines in Liverpool or elsewhere. For the machines, the end of the war meant a return to business as usual, but their strategic importance remained.

Probably the last analog machine to be constructed was a thirty-four constituent machine built in 1952–5 for East Germany (and now in the collection of the German Maritime Museum in Bremen). The Soviet Union had ordered a Kelvin-type machine for forty constituents from Légé and Co. in 1941 that was delivered to the State Oceanographic Institute in Moscow in 1946, on the eve of the Cold War. Bernard Zetler, an oceanographer who worked on tide prediction at the Scripps Institution of Oceanography in San Diego, recalls that he was unable to visit the machine in 1971 because it or its location was classified. The Soviet tide tables certainly were.

The American Tide Predicting Machine No. 2 remained in use until 1966, but played no role in the American amphibious landing at Inchon during the Korean War. The wide tidal range at Inchon meant that the landing needed good tidal information, but rather than making new calculations the existing American and Japanese tide tables were supplemented by first-hand observation by Navy Lieutenant Eugene F. Clark, whose unit reconnoitered the area for two weeks preceding the landings.

When analog machines like Tide Predicting Machine No. 2 were retired, they were replaced by digital computers whose architecture originated in other wartime projects like the ENIAC computer, which had been built to calculate ballistics tables for US artillery. The world’s navies have not relinquished their interest in tools to predict the tides. Their use, though, has never matched the high drama of prediction during the Second World War.

Source Note: The D-Day predictions are discussed in many places on the internet, but almost all the accounts trace back to an article oceanographer Bruce Parker published in Physics Today, adapted from his 2010 book The Power of the Sea. Where Parker disagrees with the inventory of machines commissioned by the National Oceanography Centre, Liverpool (itself a descendant of the Liverpool Tidal Institute), I’ve followed Parker. Details on the work of Arthur Doodson and the Liverpool Tidal Institute come from Anna Carlsson-Hyslop’s work: the articles “Human Computing Practices and Patronage: Antiaircraft Ballistics and Tidal Calculations in First World War Britain,” Information & Culture: A Journal of History 50:1 (2015) and “Patronage and Practice in British Oceanography: The Mixed Patronage of Storm Surge Science at the Liverpool Tidal Institute, 1919–1959,” Historical Studies in the Natural Sciences 46:3 (2016), and her dissertation for the University of Manchester (accessible through the NERC Open Repository). The scientist suspected of Nazi sympathies was Harald Sverdrup, a Norwegian national who worked with Walter Munk on wave prediction methods used in several amphibious landings. Turing’s experiments with calculating the Riemann zeta-function appear in Andrew Hodges, Alan Turing: The Enigma (1983; my edition is the 2014 Vintage movie tie-in).

Tides of War, Part One

The best-known story about environmental science and D-Day has to be that of the last-minute weather forecast that let the invasion go ahead. That prediction, though, was only one of many contributions by Allied environmental scientists to the success of the invasion. Another was the secretive completion of a mundane but vital piece of preparation for the assault: calculating the tides for D-Day.

The theoretical basis for tide prediction was the work of Newton, Daniel Bernoulli, and Pierre Simon Laplace, the third of whom was the first to outline the equations that describe the rise and fall of the tides. Laplace’s equations were too complex to use in practice, but in the mid-nineteenth century the British scientist William Thomson (later ennobled as Lord Kelvin) demonstrated that, given enough tidal measurements, one could use harmonic analysis to divide the tide-generating forces for a particular shoreline into a series of waves of known frequencies and amplitudes (the tidal constituents). That same process, carried out in reverse, would let one predict the tides along that shore. Unfortunately, making those calculations by hand was time-consuming to the point of impracticality. However, Thomson also demonstrated that it was possible to construct an analog machine that would do the necessary work automatically.
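In modern notation (generic, not Thomson’s own), the prediction step is just a sum of cosines. Once harmonic analysis has assigned each constituent an amplitude H_i, an angular speed \omega_i, and a phase lag \phi_i, the predicted height of the water at time t is

h(t) = Z_0 + \sum_{i=1}^{N} H_i \cos(\omega_i t - \phi_i),

where Z_0 is the mean water level. Evaluating that sum by hand for every hour of every day of the year, for dozens of ports, is the impracticality Thomson set out to remove.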

Thomson’s machine drew a curve representing the height of the tide with a pen attached to the end of a long wire. The wire ran over a series of pulleys, which were raised and lowered by gears set to match the frequency and amplitude of the tidal constituents. As each pulley rose or fell, it changed the length of the wire’s path and thus the position of the pen, so that the pen traced the combined effect of all the constituents being simulated.
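A toy digital version of the same superposition, with constituent values invented purely for illustration (not taken from any real port), might look like this:

import math

# Each constituent: (amplitude in metres, angular speed in degrees per hour, phase lag in degrees).
# The numbers are made up for illustration; real values come from harmonic analysis of
# observations at a particular port.
constituents = [
    (2.0, 28.984, 115.0),  # stand-in for M2, the principal lunar semidiurnal constituent
    (0.7, 30.000, 140.0),  # stand-in for S2, the principal solar semidiurnal constituent
    (0.4, 15.041, 200.0),  # stand-in for K1, a diurnal constituent
]
mean_level = 5.0  # metres above chart datum, also invented

def tide_height(t_hours):
    """Sum the constituents at time t: the digital equivalent of the machine's
    pulleys and wire adding up the motion of each gear."""
    return mean_level + sum(
        amplitude * math.cos(math.radians(speed * t_hours - phase))
        for amplitude, speed, phase in constituents
    )

# A day's worth of hourly heights, as a tide table would list them.
for hour in range(25):
    print(f"{hour:02d}:00  {tide_height(hour):5.2f} m")

Each entry in the list plays the part of one gear-and-pulley assembly; the sum inside tide_height is the wire.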

Thomson’s design sketch for the third tide-predicting machine, 1879. Image courtesy Wikimedia.

The first machine, built in 1872, had gears for only ten constituents, but later machines could represent many more. Machines of Thomson’s design, many of them built in Great Britain, were also used in other countries to create the necessary tide tables for their ports. In the United States, a different mechanical approach developed by William Ferrel was used to build similar machines. Altogether, though, tide-predicting machines were specialized, expensive, and rare. According to a modern inventory, only thirty-three were ever built – twenty-five of them in London, Glasgow, or Liverpool.

During the Second World War, the Admiralty Hydrographic Office relied on two tide-predicting machines operated by Arthur Thomas Doodson at the Liverpool Tidal Institute to do all their tidal calculations. One was Thomson’s original machine, refitted to handle twenty-six constituents. The other was a machine designed by Edward Roberts in 1906 and equipped for forty constituents.

Both Doodson and the Tidal Institute had their own unique histories of military collaboration. Doodson, despite being a conscientious objector, had worked on anti-aircraft ballistics for the Ministry of Munitions during the First World War. The Institute, established in 1919 with corporate and philanthropic support, had an important connection with the Admiralty’s own Hydrographic Department. Though the Hydrographic Department did not provide any direct funding until 1923, after that it made the Institute the Admiralty’s exclusive supplier of tide calculations. At the same time, the Hydrographic Department began appointing a representative to the Institute’s governing board.

Though they were the basis for only some of the Institute’s Admiralty work during the war, the tide-predicting machines in Liverpool were busy creating tide tables for Allied ports. According to historian Anna Carlsson-Hyslop’s research, the number of tidal predictions being performed doubled from 77 for 1938, the last pre-war year, to 154 for 1945. (Carlsson-Hyslop’s research is focused on areas of the Institute’s work other than the creation of tide tables, but much of it sheds light on its relationship with the Royal Navy and state patronage.)

In 1943 the Admiralty Hydrographic Office requested calculations to create tide tables for the invasion beaches to be used on D-Day in Normandy. Since the landing zone remained top secret, Commander William Ian Farquharson was responsible for establishing the constituents and providing them (anonymized under the codename “Point Z”) to Doodson in Liverpool. Unfortunately, there were no existing calculations for the area of the beaches. Nor, because tidal constituents were sensitive to local conditions, could he simply extrapolate from the data for the ports to the east and west at Le Havre and Cherbourg. Instead, Farquharson combined fragmentary data from a few local measurement points near the beaches, clandestine on-the-spot measurements made by Allied beach reconnaissance teams, and guesswork to come up with eleven tidal constituents. Oceanographer Bruce Parker suspects that he began with the Le Havre constituents and adjusted them to approximate the data he had. The calculations, despite the roughness of the information on which they were based, proved sufficiently accurate for the invasion planners.

In the Pacific, tide tables for amphibious operations were generated by the US Coast and Geodetic Survey’s Tide Predicting Machine No. 2. In both theaters, as well as the Mediterranean, oceanographers supplemented the tide tables for beaches with wind, wave, and surf forecasts. The story of wave forecasting is, if anything, even more cloak and dagger than that of the D-Day tide forecasts, since one of the scientists involved was actively suspected (incorrectly) of being a Nazi sympathizer.

A US tide predicting machine, probably No.2. The caption from the Library of Congress attributes the machine’s construction to E. Lester Jones, Chief of the Coast and Geodetic Survey. Harris & Ewing, photographer, 1915. Retrieved from the Library of Congress, https://www.loc.gov/item/hec2008004303/

Beyond their civilian and military wartime work, tide-predicting machines had an oblique impact on Second World War cryptanalysis. Those developments would eventually put the machines out of work after the war, but not before they had their final moment of strategic significance.

Forward to Part Two, including Source Notes

How the US Navy Set the Shape of the Heezen-Tharp Ocean Map

Last night’s episode of Cosmos gave a big shout-out to the Heezen-Tharp ocean map, which sketched the contours of the Atlantic seafloor and gave a huge boost to the theory of continental drift. Behind that success is the heavy hand of the Cold War: one that gave with one hand and took away with the other.

Oceanography was one of the hottest sciences for the US military in the early Cold War, as the navy looked for ways to hunt Soviet submarines, hide its own, and navigate across the world’s oceans. Measuring distance, depth, and gravity, among other things, was vital work for naval operations. (If you want a good account of these developments up until the 1960s, read Gary Weir’s An Ocean in Common: American Naval Officers, Scientists, and the Ocean Environment.)

Enter Bruce C. Heezen and Marie Tharp. As a trio of historians of science (Ronald E. Doel, Tanya J. Levin, and Mason K. Marker) explain in “Extending modern cartography to the ocean depths: military patronage, Cold War priorities, and the Heezen-Tharp mapping project, 1952-1959” (Journal of Historical Geography, 2006), it was the military interest in depth measurements that helped generate the data Heezen and Tharp used.

But the military involvement also had an effect on how Heezen and Tharp presented their information. Because the depth data was so militarily important, the US Navy insisted that precise seafloor depths remain classified. Only by presenting the information in an oblique form, with the vertical scale of the features exaggerated 40:1, were they able to evade this restriction. That’s why the Heezen-Tharp map and its successors have such a striking visual style.

The article goes into a lot more detail, of course, and not just on the secrecy issue. It’s well worth reading if you have access.