The Great Cat and Dog Massacre

On September 3, 1939 the United Kingdom declared war on Germany. The population of Great Britain quickly prepared for the bombing raids they expected to receive. They strung up blackout curtains, built bomb shelters, and dug trenches. Many also killed or had their pets killed, with an estimated death toll in London alone of 400,000–700,000. That’s roughly 26% of the prewar population of companion animals. Those who had their pets euthanized did so against the advice of the veterinary profession, animal welfare charities, and even the government-sponsored National Air Raid Precautions Animals’ Committee (NARPAC). Though the British government had long had an ambivalent relationship with pets, even it had not foreseen mass preemptive killing before the first bombs fell. In a sense, these animals were the first casualties of Britain’s war.

It is these September killings which give Hilda Kean’s new book, The Great Cat and Dog Massacre, its title. The phrase is a play on the title of a famous essay by historian Robert Darnton on the symbolic significance of pets in early modern France and a pointer towards how Kean frames her topic. Positioning the book as a contribution to the burgeoning field of Animal Studies, Kean indicates that she is less interested in symbolism and more interested in the two-way connection between pets and their people during the war. More than that, her aim is to bring animals back from the periphery of the story, where they exist as adjuncts to how humans conceive themselves, and to put them center stage as historical actors in their own right.

For those who survived the September crisis, Kean argues, rationing, air raids, and privation brought companion animals and their masters closer together in wartime than they had been in peace – and this despite ambivalent government policy that rarely saw non-working animals as anything other than idle mouths or nuisances.

Despite Kean’s impressive research into official and unofficial sources that shed light on human-companion animal relations – government papers, personal diaries, oral histories, advertisements, and the archives of Mass Observation – it remains challenging to write an animal-centric account of these wartime moments. Still, Kean has more than enough to offer a fresh perspective on the British Home Front. That includes the lives of the official and unofficial cat inhabitants of the prime minister’s residence at 10 Downing Street. The former was “Treasury Bill,” aka “The Munich Mouser,” a rat-catcher. The latter was “Nelson,” who served at least some of the time as Winston Churchill’s foot-warmer. Their lives, at the heart of the British war machine, are good examples of how the Second World War in Britain was more than just a “People’s War.”

Trawling for Spies

A new article in Intelligence and National Security by Stephen G. Craft reveals how the Office of Naval Intelligence (ONI) ran a counterintelligence program using fishermen along the southern Atlantic seaboard during the Second World War. Expecting that the Axis would land agents on US shores (something that happened, but only rarely) and use German and Italian-American fishermen to support U-boat operations (which seems to have happened not at all), ONI created Selected Masters & Informants (SMI) sections under naval district intelligence officers to recruit fishermen as confidential informants. Their operations caught no spies but offered some comfort that subversion was never rampant along the coast.

For fifty years from 1916 to 1966, the district intelligence officers were ONI’s contribution to local counterintelligence, security, and information-gathering in US coastal areas. The official responsibilities of the district intelligence officer were numerous. According to Wyman Packard’s A Century of U.S. Naval Intelligence they included:

maintenance of press relations for district headquarters; liaison with the investigating units of federal, state, and city agencies within the naval district; liaison with public and private research agencies and with business interests having information in intelligence fields; liaison with ONI and the intelligence services of the other naval districts, and with forces afloat within the district; counterespionage, security, and investigations; collection, evaluation, and recording of information regarding persons or organizations of value (or opposed) to the Navy; preparation and maintenance of intelligence plans for war; and administrative supervision over the recruiting, training, and activities of the appropriate personnel of the Naval Reserve within the district.

The position was only eliminated in 1966, when its investigative and counterintelligence duties passed to the Naval Investigative Service, its intelligence-collection duties to local Naval Field Operational Support Groups, and its other sundry tasks to the district staff intelligence officer.

Admiral Ingersoll, Commander-in-Chief, Atlantic Fleet, and Rear Admiral James at Charleston, South Carolina during an inspection of the Sixth Naval District, 23 November 1943. Courtesy of Mrs. Arthur C. Nagle. Collection of the Naval History and Heritage Command, NH 90955.

In October 1942, the creation of SMI sections added recruiting fishermen as counterintelligence agents to the long list of tasks mentioned above. By January 1943, the SMI sections had recruited 586 agents, 200 of them in the Sixth Naval District (headquartered in Charleston, South Carolina). Ship owners were paid $50 to cover the installation of a radiotelephone; sets were provided to about 50 craft. Otherwise masters were to report by carrier pigeon (the Navy operated lofts in Mayport, Florida and St Simon’s Island, Georgia) or by collect call once ashore. Some masters also received nautical charts that were overprinted with a confidential Navy grid for reporting purposes.

Shrimp fleet in harbor, St. Augustine, St. Johns County, Florida, 1936 or 1937. Photograph by Frances Benjamin Johnston. Retrieved from the Library of Congress, https://www.loc.gov/item/csas200800452/. (Accessed April 20, 2017.)

The absence of winter shrimp fishing, a tendency to cluster in good fishing spots, and the total absence of enemy covert activity all combined to limit the program’s impact. Further south in the Seventh District (headquartered in Jacksonville and Miami, Florida), most fishing was done so close to shore that the district did not bother to implement the program. In a few cases fishing boats were attacked by U-boats, and two confidential observers were reportedly killed in those attacks. Though the program operated until V-J Day, few reports of interest were ever received. Similar operations took place elsewhere in the US, with scattered references in Packard’s Century of U.S. Naval Intelligence to fishing vessels serving as observers in other naval districts.

Shrimp boats were the basis for both overt and covert surveillance. Navy patrol craft like the YP-487 were known as “Shrimpers” because of their origins as commercial fishing boats. Collection of the Navy History and Heritage Command, NH 106994.

How successful you consider the program will depend on how plausible you consider the Navy’s fear of subversion and agent landings. However, the idea of using commercial seafarers as observers and informants clearly proved appealing enough to resurface from time to time after the war. In 1947, the Chief of Naval Operations issued a letter authorizing the placing of informants on US merchant ships to detect any crew members involved in subversive activities (this was known as the Special Observer–Merchant Marine Plan). In 1955, merchant ships and fishing vessels were included in plans to collect “merchant intelligence” (MERINT) on sightings of ships, submarines, and aircraft (both efforts referenced in Packard). Did the use of fishermen for counterintelligence continue into the 1960s or beyond? If so, there might have been agents on the boats involved in joint US–Soviet fishing enterprises of the 1980s, carefully watching the Soviets carefully watch the Americans.

Excluded Computers: Marie Hicks’s Programmed Inequality

It should be no surprise, fifty to seventy years after the fact, that the introduction of electronic computers in government and industry reflected societal prejudices about women’s employment in the workforce. Books released last year about female computers at the Jet Propulsion Laboratory and NASA Langley narrated the discrimination and exclusion of those women, whose jobs reflected the messy transition from human to automated calculation in large-scale engineering (both are jointly reviewed, along with Dava Sobel’s book on an earlier generation of female computers, in the New York Review of Books here).

The number of women involved in each of these endeavors was dwarfed, though, by the female workforce of the British civil service that’s discussed in Marie Hicks’s excellent Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing. The Civil Service was large enough to document its decisions in painstaking detail and confident enough not to mince words in its internal papers, which makes Hicks’s book a cringeworthy account of the open, blatant, self-satisfied gender discrimination that accompanied the spread of electro-mechanical and then electronic data processing in the British government.

Hicks describes how, from the late 1940s all the way to the 1970s, the civil service took a mostly female pool of machine workers and deliberately and repeatedly hemmed them into job categories where their wages could be kept low and their promotion opportunities (which would mean raises) constrained, at the same time as it relied on their technical skills, practical knowledge, and commitment to keep the government running. Separate pay scales for women, eliminated in 1955, were replaced by a series of “excluded grades,” including machine workers, where pay rates would be lowered to the old women’s rate rather than raised to the existing men’s rate. When the growth of automated data processing made the need for more senior professional and managerial positions obvious, the service recruited men for those positions – even when it meant starting them with no computer experience – rather than take the traumatic step of letting female staff from the machine operator grades manage men and be compensated at executive-level pay scales. Perhaps unsurprisingly, the government then found it hard to retain those men, with many taking their new skills into private industry or moving back out of computing to other areas in government.

As Hicks explains it, the way the civil service managed its workforce was not only immoral and inefficient but also terrible for the long-term health of the British computer industry. While segregating away the female computing workforce kept costs low, it also hamstrung modernization. By the time the government realized its need for programmers, most of the people with those skills, being women, could not actually be classed as “programmers,” since that job was conceptualized as higher-status and therefore reserved for men. That led the government to prioritize mainframe designs that could be run with a small expert staff, since retaining skilled male programmers was hard and female machine operators with no promotion opportunities were, almost by definition, an unreliable workforce. That decision, made by the leading purchaser of British computers, led the companies that built them down a blind alley in design at just the time that microelectronics were putting more computers on more desks and sparking a revolution in the American computer industry.

The blind alley. The International Computers Limited (ICL) 2966 was one of the last mainframe series to be designed in the UK. This machine is at the National Museum of Computing in Bletchley Park, though it’s so large that only about half is on display. Photograph by Steve Parker, CC-BY-2.0, from flickr as of April 4, 2017.

Tides of War, Part Two

The first part of this post appeared on the blog in November 2016. The second part was supposed to come out within a week or two, as soon as I found a little more on the post-war use of analog tide-predicting machines. Unfortunately, the search for “a little more” ended up taking way more time than expected and turning up nothing within easy reach. I’d skip the apology if it wasn’t for the fact that the proper references to Anna Carlsson-Hyslop‘s research (discussed in part one) were buried in the source note at the end of this post. Sorry.

Tide-predicting machines, the first of which appeared in the late nineteenth century, were an elegant mechanical solution to a complex mathematical problem. Used mostly to produce information useful to commercial shipping, during the two world wars they also played an important role in the planning of amphibious operations like the Normandy landings.

That contribution is interesting enough to give them a spot in the history of war, but the basic design of the British machines – using multiple gears to create an analog approximation of a mathematical function – also has an oblique connection to one of the most important technical achievements of the war: the mechanization of cryptanalysis.

Alan Turing is justifiably famous for his role in the breaking of the German Enigma cipher, and particularly for his contribution to designing electro-mechanical computing tools that transformed the process. (Even if popular versions do the story a terrible disservice by erasing everyone except Turing from the picture. Imitation Game, I’m looking at you.) Less well known are some of Turing’s pre-war flirtations with the mechanization of mathematical problem-solving. Andrew Hodges’ biography describes two projects which Turing took on, at least briefly. The first, during his time at Princeton in 1937, was to use electromagnetic relays for binary multiplication to create an extremely large number that could be used as the key for a cipher. This was, as Hodges puts it, a “surprisingly feeble” idea for a cipher but a practical success as far as constructing the relays was concerned.

The second project was an attempt to disprove the Riemann hypothesis about the distribution of prime numbers (for the hypothesis and the zeta-function, read Hodges or Wikipedia; there’s no chance of me describing them properly) by calculating the Riemann zeta-function and showing that not all of the places where it reached zero lay on a single line, as the hypothesis stated. An Oxford mathematician had already calculated the first 1,041 zeroes using punched-card machines to implement one approximation of the function. Since the zeta-function could be approximated by a sum of circular functions of different frequencies, just like Thomson’s harmonic analysis of the tides, Turing realized it could be calculated using the same method. Or, more precisely, the machine could rule out enough values that only a few would have to be calculated by hand.
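
To give a rough sense of what “the same method” means here, the standard Riemann–Siegel approximation expresses the behaviour of the zeta-function on the critical line as a finite sum of cosines of different frequencies – exactly the kind of sum a tide predictor adds up. The sketch below is only a modern illustration of that idea in Python, not a reconstruction of the historical calculations; it uses the crude main sum and an asymptotic theta function, so it locates the first zero only roughly.

```python
import math

def theta(t):
    # Asymptotic approximation to the Riemann-Siegel theta function.
    return (t / 2) * math.log(t / (2 * math.pi)) - t / 2 - math.pi / 8 + 1 / (48 * t)

def Z(t):
    # Main sum of the Riemann-Siegel formula: a finite sum of cosines of
    # different frequencies, much like summing tidal constituents.
    N = int(math.sqrt(t / (2 * math.pi)))
    return 2 * sum(math.cos(theta(t) - t * math.log(n)) / math.sqrt(n)
                   for n in range(1, N + 1))

# Zeros of Z(t) correspond to zeros of the zeta-function on the critical line.
# The sign change between t = 14 and t = 15 brackets the first such zero
# (near t = 14.13); with only the main sum the location is approximate.
print(Z(14.0), Z(15.0))
```

The gear-wheel machine described in the next paragraph was meant to perform the equivalent of this summation mechanically rather than arithmetically.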

With a grant of £40 from the Royal Society, Turing and Donald MacPhail designed a machine that, like the tide calculators, used meshed gear wheels to approximate the thirty frequencies involved. The blueprint was completed by 17 July 1939 and the grinding of the wheels was underway when the war broke out and Turing joined the Government Code and Cypher School at Bletchley Park.

Nothing in the work that Turing did at Bletchley connected directly to the zeta-function machine, but, as Hodges notes, it was unusual for a mathematician like Turing to have any interest in using machines to tackle abstract problems of this sort. Clearly, though, Turing had been mulling the question of how machines could be applied to pure mathematics long before he became involved in the specific cryptanalytic problems that were tackled at Bletchley.

Of course, the secrecy surrounding code-breaking meant that no hint of the connection, or any of Turing’s wartime work, would have leaked out to those operating the tide-predicting machines in Liverpool or elsewhere. For those machines, the end of the war meant a return to usual practice, but their strategic importance remained.

Probably the last analog machine to be constructed was a thirty-four constituent machine built in 1952–5 for East Germany (and now in the collection of the German Maritime Museum in Bremen). The Soviet Union had ordered a Kelvin-type machine for forty constituents from Légé and Co. in 1941 that was delivered to the State Oceanographic Institute in Moscow in 1946, on the eve of the Cold War. Bernard Zetler, an oceanographer who worked on tide prediction at the Scripps Institution of Oceanography in San Diego, recalls that he was unable to visit the machine in 1971 because it or its location was classified. The Soviet tide tables certainly were.

The American Tide Predicting Machine No. 2 remained in use until 1966, but played no role in the American amphibious landing at Inchon during the Korean War. The wide tidal range at Inchon meant that the landing needed good tidal information, but rather than commission new calculations, planners supplemented the existing American and Japanese tide tables with first-hand observations by Navy Lieutenant Eugene F. Clark, whose unit reconnoitered the area for two weeks preceding the landings.

When analog machines like Tide Predicting Machine No. 2 were retired, they were replaced by digital computers whose architecture originated in other wartime projects like the ENIAC computer, which had been built to calculate ballistics tables for US artillery. The world’s navies have not relinquished their interest in tools to predict the tides. Their use, though, has never matched the high drama of prediction during the Second World War.

Source Note: The D-Day predictions are discussed many places on the internet, but almost all the accounts trace back to an article oceanographer Bruce Parker published in Physics Today, adapted from his 2010 book The Power of the Sea. Where Parker disagrees with the inventory of machines commissioned by the National Oceanography Centre, Liverpool (itself a descendant of the Liverpool Tidal Institute), I’ve followed Parker. Details on the work of Arthur Doodson and the Liverpool Tidal Institute come from Anna Carlsson-Hyslop‘s work: the articles “Human Computing Practices and Patronage: Antiaircraft Ballistics and Tidal Calculations in First World War Britain,” Information & Culture: A Journal of History 50:1 (2015) and “Patronage and Practice in British Oceanography: The Mixed Patronage of Storm Surge Science at the Liverpool Tidal Institute, 1919–1959,” Historical Studies in the Natural Sciences 46:3 (2016), and her dissertation for the University of Manchester (accessible through the NERC Open Repository). The scientist suspected of Nazi sympathies was Harald Sverdrup, a Norwegian national who worked with Walter Munk on wave prediction methods used in several amphibious landings. Turing’s experiments calculating the Riemann zeta-function appear in Andrew Hodges, Alan Turing: The Enigma (1983; my edition the 2014 Vintage movie tie-in).

Aleksandr Zhitomirsky

During the Second World War, when it still seemed like the Germans might capture Moscow, propaganda minister Joseph Goebbels wrote a list of Soviet propagandists who were to be killed upon capture. Number one was the writer Ilya Ehrenburg. Number two was chief Radio Moscow announcer Iurii Levitan. Number three was Aleksandr Zhitomirsky, the designer and artist of one of the Red Army’s chief illustrated propaganda magazines.

That, at least, was the story, one which is mentioned – with appropriate skepticism – by Erika Wolf in the catalogue to a major exhibit of artist Aleksandr Zhitomirsky’s work at the Art Institute of Chicago. A talented designer and illustrator whose most striking works were the satirical, even grotesque, photomontages he created in the early years of the Cold War, Zhitomirsky pilloried capitalism and the United States, often with allusions to the Nazi threat against which he had cut his teeth propagandizing. While his main employment from 1953 to 1991 was as chief artist for Soviet Union (Sovietskii Soiuz), a glossy magazine aimed at readers in Eastern Europe and Asia, his illustrations appeared in the Literary Newspaper (Literaturnaia gazeta), official organ of the Union of Soviet Writers; Red Fleet (Krasnyi flot); Rising Generation (Smena); the satirical magazine Krokodil (Crocodile), and even occasionally in more exalted venues such as Truth (Pravda), the official newspaper of the Communist Party, and News (Izvestiia), official paper of the Soviet government. Those works attracted attention not just at home, where he was part of a major photomontage exhibit in East Berlin in 1961/2 and had his own retrospective in Moscow, but even in the US, where some of his photomontages from the Literary Gazette drew comment in the New York Times.

On balance it’s the postwar art, not just the illustrations mentioned above but also the book covers and occasional poster, that is the focus of Wolf’s Aleksandr Zhitomirsky: Photomontage as a Weapon of World War II and the Cold War (Yale University Press, 2016). For me, though, it’s Zhitomirsky’s wartime work on Front Illustrated (Frontovaia illiustratsiia) and its complementary German-language edition aimed at enemy soldiers (Front Illustrated for German Soldiers / Front-Illustrierte für den deutschen Soldaten) that’s more captivating. The postwar designs are hardly subtle. How often can one look at a monkey-like Goebbels ventriloquizing through some American symbol?

Front Illustrated for German Soldiers, which existed to sow unease and dissension in the German ranks, had to be more indirect. For his cover designs and leaflets, Zhitomirsky mixed captured German photographs and new photography (often with himself as the model) with images borrowed from his vast trove of reference photos, often airbrushed together to the point that they became impossible to distinguish. With one leaflet, Choose! Like This or Like That!, Wolf shows how what appears to be a single photograph of dead Germans lying on the ground was actually a composite of seven different photographs, layered together, photographed, then retouched to create a seamless image. With others, she shows how Zhitomirsky mixed background photography with physical objects (like reproduced letters and snapshots) in trompe-l’œil arrangements. Taking advantage of Zhitomirsky’s personal archive, Wolf demonstrates just how impressive his work was.

Tides of War, Part One

The best-known story about environmental science and D-Day has to be that of the last-minute forecast that let the invasion go ahead. That prediction, though, was only one of many contributions by Allied environmental scientists to the success of the invasion. Another was the secretive completion of a mundane but vital piece of preparation for the assault: calculating the tides for D-Day.

The theoretical basis for tide prediction was the work of Newton, Daniel Bernoulli, and Pierre Simon Laplace, the third of whom was the first to outline the equations that describe the rise and fall of the tides. Laplace’s equations were too complex to use in practice, but in the mid-nineteenth century the British scientist William Thomson (later ennobled as Lord Kelvin) demonstrated that, given enough tidal measurements, one could use harmonic analysis to divide the tide-generating forces for a particular shoreline into a series of waves of known frequencies and amplitudes (the tidal constituents). That same process, carried out in reverse, would let one predict the tides along that shore. Unfortunately, making those calculations was time-consuming to the point of impracticality. However, Thomson also demonstrated that it was possible to construct an analog machine that would do the necessary work automatically.

Thomson’s machine drew a curve representing the height of the tide with a pen attached to the end of a long wire. The wire ran over the top of a series of pulleys, which were raised and lowered by gears set to the frequency and amplitude of each tidal constituent. As each pulley rose or fell, it changed the length of the wire’s path and thus the position of the pen. Altogether, the pulleys reflected the combined effect of the tidal constituents being simulated.
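
In software, that superposition is a one-line sum. The sketch below is a minimal illustration of the principle, not a reproduction of any machine’s output: the constituent speeds are the standard astronomical ones, but the amplitudes, phases, and mean level are made-up values rather than real harmonic constants for any port, which would have to come from analysis of local tide-gauge records.

```python
import math

# Hypothetical tidal constituents: (amplitude in metres, speed in degrees/hour, phase in degrees).
# Amplitudes and phases here are illustrative only, not measured harmonic constants.
CONSTITUENTS = [
    (1.50, 28.984, 40.0),   # M2, principal lunar semidiurnal
    (0.50, 30.000, 75.0),   # S2, principal solar semidiurnal
    (0.30, 15.041, 120.0),  # K1, lunisolar diurnal
]
MEAN_SEA_LEVEL = 3.0  # metres above chart datum (also illustrative)

def tide_height(hours):
    """Predicted height: the mean level plus the sum of the constituent waves,
    which is what the machine's pulleys and wire added up mechanically."""
    return MEAN_SEA_LEVEL + sum(
        amp * math.cos(math.radians(speed * hours - phase))
        for amp, speed, phase in CONSTITUENTS
    )

# Tabulate a day of hourly predictions, the kind of entry a tide table contains.
for h in range(25):
    print(f"{h:02d}:00  {tide_height(h):5.2f} m")
```

Evaluating a sum like this by hand for every hour of a year, for dozens of ports, is exactly the drudgery that made a mechanical solution so attractive.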

Thomson’s design sketch for the third tide-predicting machine, 1879. Image courtesy Wikimedia.

The first machine, built in 1872, had gears for only ten constituents, but later machines could represent many more. Machines of his design, many of them built in Great Britain, were also used in other countries to create the necessary tide tables for their ports. In the United States, a different mechanical approach developed by William Ferrel was used to build similar machines. Altogether, though, tide-predicting machines were specialized, expensive, and rare. According to a modern inventory, only thirty-three were ever built – twenty-five of them in London, Glasgow, or Liverpool.

During the Second World War, the Admiralty Hydrographic Office relied on two tide-predicting machines operated by Arthur Thomas Doodson at the Liverpool Tidal Institute to do all their tidal calculations. One was Thomson’s original machine, refitted to handle twenty-six constituents. The other was a machine designed by Edward Roberts in 1906 and equipped for forty constituents.

Both Doodson and the Tidal Institute had their own unique histories of military collaboration. Doodson, despite being a conscientious objector, had worked on anti-aircraft ballistics for the Ministry of Munitions during the First World War. The Institute, established in 1919 with corporate and philanthropic support, had an important connection with the Admiralty’s own Hydrographic Department. Though the Hydrographic Department did not provide any direct funding until 1923, after that it made the Institute the Admiralty’s exclusive supplier of tide calculations. At the same time, the Hydrographic Department began appointing a representative to the Institute’s governing board.

Though they were the basis for only some of the Institute’s Admiralty work during the war, the tide-predicting machines in Liverpool were busy creating tide tables for Allied ports. According to historian Anna Carlsson-Hyslop’s research, the number of tidal predictions being performed doubled from 77 for 1938, the last pre-war year, to 154 for 1945. (Carlsson-Hyslop’s research is focused on areas of the Institute’s work other than the creation of tide tables, but much of it sheds light on its relationship with the Royal Navy and state patronage.)

In 1943 the Admiralty Hydrographic Office requested calculations to create tide tables for the invasion beaches to be used on D-Day in Normandy. Since the landing zone remained top secret, Commander William Ian Farquharson was responsible for establishing the constituents and providing them (anonymized under the codename “Point Z”) to Doodson in Liverpool. Unfortunately, there were no existing calculations for the area of the beaches. Nor, because tidal constituents were sensitive to local conditions, could he just extrapolate from the data for the ports to the east and west at Le Havre and Cherbourg. Instead, Farquharson combined fragmentary data from some local measurement points near the beaches, clandestine on-the-spot measurements made by Allied beach reconnaissance teams, and guesswork to come up with eleven tidal constituents. Oceanographer Bruce Parker suspects that he began with the Le Havre constituents and then adjusted them to approximate the data he had. The calculations, despite the roughness of the information on which they were based, proved sufficiently accurate for the invasion planners.

In the Pacific, tide tables for amphibious operations were generated by the US Coast and Geodetic Survey’s Tide Predicting Machine No. 2. In both theaters, as well as the Mediterranean, oceanographers supplemented the tide tables for beaches with wind, wave, and surf forecasts. The story of wave forecasting is, if anything, even more cloak and dagger than that of the D-Day tide forecasts, since one of the scientists involved was actively suspected (incorrectly) of being a Nazi sympathizer.

A US tide predicting machine, probably No.2. The caption from the Library of Congress attributes the machine’s construction to E. Lester Jones, Chief of the Coast and Geodetic Survey. Harris & Ewing, photographer, 1915. Retrieved from the Library of Congress, https://www.loc.gov/item/hec2008004303/

Beyond their civilian and military wartime work, tide-predicting machines had an oblique impact on Second World War cryptanalysis. Those developments would eventually put the machines out of work after the war, but not before the machines would have their final strategic significance.

Forward to Part Two, including Source Notes

Kiley at Nuremberg

“What I was trying to do was have a unified and orderly and dignified [courtroom] – that’s what the courtroom should be, and it should reflect the scales of justice too.”

— Dan Kiley on the courtroom at Nuremberg

Creating the physical spaces for the war crimes trials at Nuremberg was one of the last tasks performed by the OSS’s Presentation Branch before the service was dissolved and the branch transferred to the State Department. The designer was Dan Kiley, an architect who had been recruited by his friend Eero Saarinen from the Army Corps of Engineers and had replaced Saarinen as chief of design for the branch.

Kiley was something of an odd choice to do the work. Though a trained architect, he had never designed a courtroom and would never design another. After the war, he became famous as a landscape architect, often doing work for his friend Saarinen. As Kiley himself told it, the job was something of a fluke. The Presentation Branch was already responsible for similar work at the United Nations conference in San Francisco. In compensation for not getting to go to San Francisco, branch chief Hugh Barton offered Kiley the chance to go to Nuremberg instead.

On the other hand, despite the apparent mismatch – why was the Office of Strategic Services designing a courthouse? – the project was a return to the Presentation Branch’s roots. Since its inception, the branch’s mission had been to make the presentation of complex information clear, logical, and even captivating. How else would you describe the responsibilities of the international military tribunal at Nuremberg?

In fact, the branch had been created by the OSS’s founder, Bill Donovan, to build a grand automated briefing room for President Roosevelt. Though that project had foundered, it had begotten an organization with a broad range of design skills. Kiley himself showed the breadth of his talents on the Nuremberg project. He planned the renovations of the entire court building, not just the courtroom but also offices, restaurants, medical clinics, and a shop (the Army PX). His attention to detail included designing furniture for the building to be made from old plywood and putting a gray velvet panel on the chief prosecutor’s lectern so that his papers wouldn’t fall off.

His arrangement for the courtroom, Joseph Disponzio has explained, reflected a willingness to break traditional norms to achieve the necessary impact. Instead of positioning the audience of observers and journalists behind the adjudicating parties, with the judges facing both, Kiley positioned the audience perpendicular to the axis of judge–parties, giving them a far better view of the proceedings. A film screen facing the audience allowed for the projection of some of the OSS’s other work on the trials, the documentary films.

The Nuremberg courtroom as seen from the press gallery. Note the alignment of the dock and judges’ dais, with lawyers in the foreground and film screen to the back.

Kiley’s work stood in rare company alongside the Ichigaya courtroom, where the International Military Tribunal for the Far East convened, until the 1990s saw the creation of new international criminal courts, bringing their own requirements and sensibilities to the presentation of international justice.

Source Note: In the mid-1990s, Kiley gave an oral history interview to the Thomas J. Dodd Research Center at the University of Connecticut. A shortened version was published in the book Witnesses to Nuremberg, edited by Bruce M. Stave and Michele Palmer with Leslie Frank. The description of Kiley’s work at Nuremberg is mostly drawn from that printed text.