December 10, 1986 Toxic Sanctuary

Ironically, the threat posed by humans outside the exclusion zone is greater for some species than that posed by radiation, within the zone.

The accident began as a test. A carefully planned series of events intended to simulate a station blackout at the Chernobyl Nuclear Power Plant, in the Ukrainian Soviet Socialist Republic.

This most titanic of disasters began with a series of smaller mishaps. Safety systems intentionally turned off, reactor operators failing to follow checklists, inherent design flaws in the reactor itself.

Aerial view of the burning reactor core at Chernobyl

Over the night of April 25-26, 1986, a nuclear fission chain reaction expanded beyond control at reactor #4, flashing water to superheated steam and resulting in a violent explosion and open-air graphite fire. Massive amounts of nuclear material were expelled into the atmosphere during this explosive phase, equaled only by that released over the following nine days by intense updrafts created by the fire. Radioactive material rained down over large swaths of the western USSR and Europe, some 60% of it falling in the Republic of Belarus.

It was the most disastrous nuclear power plant accident in history and one of only two such accidents classified as a level 7, the maximum classification on the International Nuclear Event Scale.  The other was the 2011 tsunami and subsequent nuclear disaster at the Fukushima Daiichi reactor, in Japan.

One operator died in the steam-blast phase of the accident; a second died from a catastrophic dose of radiation.  600 Soviet helicopter pilots risked lethal radiation, dropping 5,000 metric tons of lead, sand and boric acid in the effort to seal off the spread.

Remote-controlled robot bulldozers and carts soon proved useless. Valery Legasov of the Kurchatov Institute of Atomic Energy in Moscow explained: “[W]e learned that robots are not the great remedy for everything. Where there was very high radiation, the robot ceased to be a robot—the electronics quit working.”

Hat tip, Chernobyl Museum, Kiev, Ukraine

Soldiers in heavy protective gear shoveled the most highly radioactive materials, “bio-robots” allowed to spend a one-time maximum of only forty seconds on the rooftops of surrounding buildings. Even so, some of these “Liquidators” reported having done so five or six times.

In the aftermath, 237 suffered from Acute Radiation Sickness (ARS), 31 of whom died in the following three months.  Fourteen more died of radiation-induced cancers over the following ten years.

Chernobyl “Liquidators”, permitted to spend no more than a one-time maximum of forty seconds cleaning the rooftops of surrounding structures.

The death toll could have been far higher, but for the heroism of first responders.  Anatoli Zakharov, a fireman stationed in Chernobyl since 1980, replied to remarks that firefighters believed this to be an ordinary electrical fire: “Of course we knew! If we’d followed regulations, we would never have gone near the reactor. But it was a moral obligation – our duty. We were like kamikaze.”

The concrete sarcophagus designed and built to contain the wreckage has been called the largest civil engineering project in history, involving no fewer than a quarter-million construction workers, every one of whom received a lifetime maximum dose of radiation.  By December 10 the structure was nearing completion. The #3 reactor at Chernobyl continued to produce electricity, until 2000.

Abandoned nursery
A plastic doll lies abandoned on a rusting bed, 30 years after the town was evacuated following the Chernobyl disaster. H/T Dailymail.com

Officials of the top-down Soviet state first downplayed the disaster.  Asked by one Ukrainian official, “How are the people?”, acting minister of Internal Affairs Vasyl Durdynets replied that there was nothing to be concerned about: “Some are celebrating a wedding, others are gardening, and others are fishing in the Pripyat River.”

As the scale of the disaster became apparent, civilians were at first ordered to shelter in place.  A 10-km exclusion zone was established within the first 36 hours, resulting in the hurried evacuation of some 49,000.  The exclusion zone was tripled to 30 km within a week, leading to the evacuation of 68,000 more.  Before it was over, some 350,000 were moved away, never to return.

Evacuation of Pripyat

The chaos of these evacuations can scarcely be imagined.  Confused adults.  Crying children.  Howling dogs.  Shouting soldiers, barking orders and herding the now-homeless onto waiting buses, by the tens of thousands.  Dogs and cats, beloved companion animals, were ordered left behind.  Evacuees were never told there would be no return.

Abandoned amusement park
Two bumper cars lie face to face in the rusting remains of an amusement park in the abandoned town of Pripyat near Chernobyl

There were countless and heartbreaking scenes of final abandonment, of mewling cats and whimpering dogs.  Belarusian writer Svetlana Alexievich compiled hundreds of interviews into a single monologue, an oral history of the forgotten.  The devastating Chernobyl Prayer tells the story of “dogs howling, trying to get on the buses. Mongrels, Alsatians. The soldiers were pushing them out again, kicking them. They ran after the buses for ages.” Heartbroken families pinned notes to their doors: “Don’t kill our Zhulka. She’s a good dog.”

Abandoned gym
View from an abandoned gym in the Pripyat ghost town, near Chernobyl. H/T Vintagenews.com

There would be no mercy.  Squads of soldiers were sent to shoot those animals, left behind.  Most died.  Some escaped discovery, and survived.

Today the descendants of those dogs, some 900 in number, occupy an exclusion zone of some 1,600 square miles, slightly smaller than the American state of Delaware. They are not alone.


In 1998, 31 specimens of the Przewalski Horse were released into the exclusion zone, which now serves as a de facto wildlife preserve. Not to be confused with the American mustang or the Australian brumby, the Przewalski Horse is a truly wild horse and not the feral descendant of domesticated animals.

Named for the 19th century Polish-Russian naturalist Nikołaj Przewalski, Equus ferus przewalskii split from the ancestors of the domestic Equus caballus some 38,000 to 160,000 years ago, forming a divergent species where neither taxonomic group is descended from the other. The last Przewalski stallion was observed in the wild in 1969, and the species has been considered extinct in the wild since that time.

Today approximately 100 Przewalski horses roam the Chernobyl Exclusion Zone, one of the larger populations of what may be the last truly wild horse alive today.

In 2016, US government wildlife biologist Sarah Webster was working at the University of Georgia. Webster and others used camera traps to demonstrate how wildlife had colonized the exclusion zone, even its most contaminated parts. A scientific paper on the subject is linked HERE, if you’re interested.

Ironically, the threat posed by humans outside the exclusion zone is greater for some species than that posed by radiation within the zone. Wildlife spotted within the exclusion zone includes wolves, badgers, swans, moose, elk, turtles, deer, foxes, beavers, boars, bison, mink, hares, otters, lynx, eagles, rodents, storks, bats and owls.

Not all animals thrive in this place. Invertebrates like spiders, butterflies and dragonflies are noticeably absent, likely because their eggs are laid in surface soil layers which remain contaminated. Radionuclides settled in lake sediments affect populations of fish, frogs, crustaceans and insect larvae. Birds in the exclusion zone have difficulty reproducing. Those animals that do successfully reproduce often demonstrate albinism, deformed beaks and feathers, malformed sperm cells and cataracts.

Tales abound of giant mushrooms, six-pawed rabbits and three-headed dogs. While some such stories are undoubtedly exaggerated, few such mutations survive the first few hours, and those that do are unlikely to pass on the more egregious deformities.

Far from the post-apocalyptic wasteland of imagination, the Chernobyl exclusion zone is a thriving preserve for some, but not all, wildlife. Which brings us back to the dogs. Caught in a twilight zone neither feral nor domestic, the dogs of Chernobyl are unable to compete in the wild, nor are many of them candidates for adoption, due to radiation toxicity.

Since September 2017, a partnership between SPCA International and the US-based 501(c)(3) non-profit CleanFutures.org has worked to provide for the veterinary needs of these defenseless creatures.  Over 450 animals have been tested for radiation exposure, given medical care and vaccinations, and spayed or neutered to bring populations within manageable limits.  Many have been socialized for human interaction and successfully decontaminated, and are available for adoption into homes in Ukraine and North America.

For most there is no future beyond this place, and a life expectancy unlikely to exceed five years.

Thirty-five years after the world’s most devastating nuclear disaster, a surprising number of people work in this place, on a rotating basis. Guards stationed at access points control who gets in and keep out unauthorized visitors, known as “stalkers”.

The BBC wrote in April of this year about the strange companionship that has sprung up between these guards and the dogs of Chernobyl. Jonathon Turnbull, a PhD candidate in geography at the University of Cambridge, was the first outsider to recognize the relationship and gave the guards disposable cameras with which to record the lives of these abandoned animals. The guards around this toxic sanctuary had but a single request: “please, please – bring food for the dogs”.

November 7, 1957 Nuke the Moon

Out of the mess of the Space Race emerged an idea destined to go down in the Hare-Brain Hall of Fame, if there is ever to be such a place. A show of force sufficient to boost domestic morale while showing the Russkies we mean business. It was the top-secret “Project A119”, also known as A Study of Lunar Research Flights. We were going to detonate a nuclear weapon. On the moon.

As World War II drew to a close in 1945, there arose a different sort of conflict, a contest of wills, between the two remaining Great Powers of the world. The “Cold War” pitted the free market economy and constitutional republicanism of the United States against the top-down, authoritarian governing and economic models of the Soviet Union. The stakes could not have been higher, as each side sought to demonstrate its own technological and military superiority and, by implication, the dominance of its own economic and political system.

American nuclear preeminence lasted but four short years, coming to an end with the first successful Soviet atomic weapon test, code named “First Lightning”, carried out on August 29, 1949. Mutual fear and distrust fueled the Soviet-American “arms race”, a buildup of nuclear stockpiles beyond any rational purpose. A generation grew up under the shadow of nuclear annihilation, in which a single mistake, misunderstanding or one fool in the wrong place at the wrong time could initiate a sequence bringing about the extinction of life on this planet.


The arms race acquired the dimensions of a Space Race on July 29, 1955, when the United States announced its intention to launch an artificial satellite into earth orbit. Two days later, the Soviet Union announced that it aimed to do the same.

The early Space Race period was a time of serial humiliation for the American side, as the Soviet Union launched the first Inter-Continental Ballistic Missile (ICBM) on August 21, 1957, and the first artificial satellite “Sputnik 1” on October 4.

Laika and capsule

The first living creature to orbit the earth was the dog “Laika”, launched aboard the spacecraft Sputnik 2 on November 3 and labeled by the more smartass specimens among the American commentariat as “Muttnik”.

Soviet propaganda proclaimed “the first traveler in the cosmos”, replete with heroic images printed on posters, stamps and matchbook covers. The American news media could do little but focus on the politics of the launch, as animal lovers the world over questioned the ethics of sending a dog to certain death, in space.

On the American side, the Vanguard rocket was scheduled to launch a grapefruit-sized test satellite into earth orbit that September, but the program was plagued by one delay after another. The December 6 launch was a comprehensive disaster, the rocket lifting all of four feet from the pad before crashing to the ground in a sheet of flame, the satellite rolling free where it continued to beep, only feet from the burning wreck.

The second Vanguard launch was nearly as bad, exploding in flames only seconds after launch.  Chortling Soviet leaders were beside themselves with joy, stamping the twin disasters as “Kaputnik” and “Flopnik”.

Out of this mess emerged an idea destined to go down in the Hare-Brain Hall of Fame, if there is ever to be such a place. A show of force sufficient to boost domestic morale, while showing the Russkies we mean business. It was the top-secret “Project A119”, also known as A Study of Lunar Research Flights.

We were going to detonate a nuclear weapon.  On the moon.

In 1957, newspapers reported a rumor. The Soviet Union planned a nuclear test explosion on the moon, timed to coincide with the lunar eclipse of November 7. A celebration of the anniversary of the Glorious October Revolution.

Edward Teller himself, the “Father of the H-Bomb”, is said to have proposed such an idea as early as February of that year, to test the effects of the explosion in a vacuum and in conditions of zero gravity.


Today, we take for granted the massively complex mathematics involved in hitting an object like the moon. In 1957 there was a very real possibility of missing the thing altogether, a boomerang effect returning the bomb from whence it came.

While the information is still classified, the project was revealed in 2000 by former NASA executive Leonard Reiffel, who said he was asked to “fast track” the program in 1958 by senior Air Force officials. A young Carl Sagan was all for the idea, believing at the time that living microbes might inhabit the moon, and that a nuclear explosion might help in detecting such organisms.

Reiffel commented in a Guardian newspaper interview:  “It was clear the main aim of the proposed detonation was a PR exercise and a show of one-upmanship. The Air Force wanted a mushroom cloud so large it would be visible on earth. The US was lagging behind in the space race.” The now-retired NASA executive went on to explain that “The explosion would obviously be best on the dark side of the moon and the theory was that if the bomb exploded on the edge of the moon, the mushroom cloud would be illuminated by the sun.”

The Air Force canceled the A119 program in 1959, apparently out of concern that a ‘militarization of space’ would create public backlash, and that nuclear fallout might hamper future research and even colonization efforts on the moon.


Previously secret reports revealed in 2010 that Soviet leaders had indeed contemplated such a project, part of a multi-part program code named “E”.  Project E-1 involved reaching the moon, while E-2 and E-3 focused on sending a probe around the far side of the celestial body. The final stage, project E-4, involved a nuclear strike on the moon as a “display of force”.

Construction plans for the aforementioned Hare-Brain Hall of Fame have yet to be announced, but it already appears the place may need another wing.


August 26, 1918 The Computer Wore a Skirt

“So the astronaut who became a hero, looked to this black woman in the still-segregated South at the time as one of the key parts of making sure his mission would be a success.”

In plasma physics, the Heliosphere is a vast cavity formed by the Sun, a “bubble” continuously “inflated” by plasma originating from that body known as the “solar wind”, separating our own solar system from the vastness of interstellar space. The outermost reach of the Heliosphere comprises three major sections: the Termination Shock, the Heliosheath, and the Heliopause, the last so called because it is where the solar wind and interstellar winds meet to form a zone of equilibrium.


Only five man-made objects have left or are on course to leave the heliosphere and penetrate interstellar space: Pioneer 10 and 11, launched in 1972-73, Voyager 1 and 2, launched in 1977, and New Horizons, launched in 2006. Of those five, only three remain active and continue to transmit data back to our little blue planet.

Voyager 2 Spacecraft

Spectacular images may be found on-line if you’re inclined to look them up. Images such as this jaw-dropping shot of the “Blue Planet” Neptune, taken days before the point of closest approach in August, 1989.

This picture of Neptune was taken by Voyager 2 less than five days before the probe’s closest approach of the planet on Aug. 25, 1989. The picture shows the “Great Dark Spot” – a storm in Neptune’s atmosphere – and the bright, light-blue smudge of clouds that accompanies the storm. Credit: NASA/JPL-Caltech

Or these images of the rings of Neptune, taken on this day thirty-two years ago, just before Voyager 2 left the last of the “gas giants” behind.

Voyager 2 took these two images of the rings of Neptune on Aug. 26, 1989, just after the probe’s closest approach to the planet. Neptune’s two main rings are clearly visible; two fainter rings are visible with the help of long exposure times and backlighting from the Sun.
Credit: NASA/JPL-Caltech

Few among us are equipped to understand the complexity of such flight. Precious few. One such was a little girl, an American of African ancestry born this day in 1918 in White Sulphur Springs, West Virginia. The youngest of four born to Joylette and Joshua Coleman, Creola Katherine showed unusual mathematical skills from an early age.

In the 1920s, Greenbrier County, West Virginia, didn’t offer black children education past the eighth grade. The Colemans arranged for their kids to attend high school two hours up the road in Institute, on the campus of West Virginia State College. Katherine took every math class offered by the school and graduated summa cum laude in 1937, with degrees in mathematics and French.

There were teaching jobs along the way at all-black schools and a marriage to Katherine’s first husband, James Goble. The couple would have three children together before James died of a brain tumor. Three years later she married James A. “Jim” Johnson.

With all that going on at home, Katherine found time to become one of only three black students to attend graduate school at West Virginia University, and the only woman among them, selected to integrate the school after the Supreme Court ruling in Missouri ex rel. Gaines v. Canada.

Careers in research mathematics were few and far between for black women in 1952, but talent and hard work win out where ignorance fears to tread.

So it was that Katherine Johnson joined the National Advisory Committee for Aeronautics (NACA) in 1953. Johnson worked in a pool of women who would read the data from aircraft black boxes and carry out a number of mathematical tasks. She referred to her co-workers as “computers who wore skirts”.

Flight research was a man’s world in those days but one day, Katherine and a colleague were asked to fill in, temporarily. Respect is not given, it is earned, and Katherine’s knowledge of analytic geometry made quick work of that. Male bosses and colleagues alike were impressed with her skills. When her “temporary” assignment was over, it no longer seemed all that important to send her back to the pool.

Katherine would later explain that barriers of race and sex continued, but she could hold her own. Meetings were taken where decisions were made, where no women had been before. She’d simply tell them that she did the work and this was where she belonged, and that was the end of that.

Johnson worked as a human computer through most of the 1950s, calculating in-flight problems such as gust alleviation in aircraft. Racial segregation was still in effect in those days, according to state law and federal workplace segregation rules introduced under President Woodrow Wilson some forty years earlier. The door where she worked was labeled “colored computers” but Johnson said she “didn’t feel the segregation at NASA, because everybody there was doing research. You had a mission and you worked on it, and it was important to you to do your job … and play bridge at lunch. I didn’t feel any segregation. I knew it was there, but I didn’t feel it.”

“We needed to be assertive as women in those days – assertive and aggressive – and the degree to which we had to be that way depended on where you were. I had to be. In the early days of NASA women were not allowed to put their names on the reports – no woman in my division had had her name on a report. I was working with Ted Skopinski and he wanted to leave and go to Houston … but Henry Pearson, our supervisor – he was not a fan of women – kept pushing him to finish the report we were working on. Finally, Ted told him, “Katherine should finish the report, she’s done most of the work anyway.” So Ted left Pearson with no choice; I finished the report and my name went on it, and that was the first time a woman in our division had her name on something”.

Katherine Johnson

Katherine worked as an aerospace technologist from 1958 until retirement. She calculated the trajectory for Alan Shepard’s May 1961 flight to become the first American, in space. She worked out the launch window for his 1961 Mercury mission and plotted navigational charts for backup in case of electronic failure. NASA was using electronic computers by the time of John Glenn’s first orbit around the earth but Glenn refused to fly until Katherine Johnson personally verified the computer’s calculations. Author Margot Lee Shetterly commented, “So the astronaut who became a hero, looked to this black woman in the still-segregated South at the time as one of the key parts of making sure his mission would be a success.”

Katherine Johnson retired in 1986 and lived to see six grandchildren and 11 “Greats”. Everyone should live to see their own great grandchild. Not surprisingly, Johnson encouraged hers to pursue careers in science and technology.

President Barack Obama personally awarded Johnson the Presidential Medal of Freedom in 2015 for work spanning from the Mercury program to the Space Shuttle. NASA noted her “historical role as one of the first African-American women to work as a NASA scientist.”

A delightful side dish for this story is the Silver Snoopy award NASA gives for outstanding achievement, “For professionalism, dedication and outstanding support that greatly enhanced space flight safety and mission success.”

Following the Mercury and Gemini projects, NASA was searching for a way to focus employees and contractors alike on their own personal contribution to mission success. They wanted it to be fun and interesting, like the Smokey Bear character of the United States Forest Service. Al Chop of the Manned Spacecraft Center came up with the idea.

Peanuts creator Charles Schulz, a combat veteran of WW2 and avid supporter of the space program, loved the idea. Schulz drew the character to be cast in a silver pin and worn into space by a member of the Astronaut corps. It is this astronaut who personally awards his or her Snoopy to the deserving recipient.

The award is literally once in a lifetime. Of all NASA personnel and those of its many contractors, fewer than one percent have ever received the coveted Silver Snoopy.

Astronaut and former NASA associate administrator for education Leland Melvin personally awarded Johnson her own Silver Snoopy at the naming ceremony in 2016, for the Katherine G. Johnson Computational Research Facility at NASA’s Langley Research Center in Hampton, Virginia.

Astronaut and former NASA associate administrator for education Leland Melvin presents Katherine Johnson with a Silver Snoopy award. / Credit: NASA, David C. Bowman

August 19, 1906 The Damn Thing Works!

A baby was born this day in 1906 in a small log cabin near Beaver, Utah. His name was Philo, the first born child of Louis Farnsworth and Serena Bastian. He would grow to be the most famous man, you probably never heard of.

Inventor Thomas Edison was once asked about his seeming inability to invent artificial light. “I have not failed”, he explained, “I’ve just found 10,000 ways that won’t work.”

A baby was born this day in 1906 in a small log cabin near Beaver, Utah. His name was Philo, the first born child of Louis Farnsworth and Serena Bastian. He would grow to be the most famous man, you probably never heard of.

Birthplace of Philo Taylor Farnsworth

Philo was constantly tinkering. He was the kind who could look at an object and understand how it worked and why this particular one didn’t. The family moved when he was 12 to a relative’s ranch near Rigby, Idaho. Philo was delighted to learn the place had electricity.

He found a burnt-out electric motor thrown out by a previous tenant and rewound the armature, converting his mother’s hand-cranked washing machine to electric.

It must’ve seemed like Christmas morning when he found all those old technology magazines, in the attic. He even won a $25 prize one time in a magazine contest, for inventing a magnetized car lock.

Farnsworth was fascinated with the behavior of molecules and excelled in chemistry and physics at Rigby high school. Harrowing a field one day behind a team of two horses, his mind got to working. What if I could “train“ electrons to work in lines like I’m doing here, with these horses? Electrons are so fast the human eye would never pick up the individual lines. Couldn’t I use them to “paint“ an electronic picture?

Image dissector

Philo sketched his idea of an “image dissector” for his science teacher Mr. Tolman, who encouraged him to keep working on it. Justin Tolman kept the sketch, though neither could know at the time that Farnsworth’s 1922 drawing would prove decisive one day in a court of law, over who invented all-electronic television.

From Japan to Russia, Germany and America, more than fifty inventors were working in the 1920s to invent television. History remembers the Scottish engineer John Logie Baird as the man who built and demonstrated the world’s first electromechanical television. Amazingly, it was he who invented the first color TV tube, as well.

Scotsman John Logie Baird invented the first (electromechanical) TV

It was all well and good, but Baird’s spinning electromechanical disk was as a glacier compared with the speed of the electron. Clearly, the future of television lay in the field of electronics.

The Russian engineer Vladimir K. Zworykin applied for a US patent on an electron scanning tube in 1923, while working for Westinghouse. He wouldn’t get the thing to work though, until 1934, by which time he had moved to RCA. Meanwhile, Philo Taylor Farnsworth successfully demonstrated the first television signal transmission on September 7, 1927. The excited telegram Farnsworth sent to one of his backers exclaimed, “The damn thing works!”

Farnsworth’s successful patent application in 1930 resulted in additional funding to support his work and a visit, from Vladimir Zworykin. RCA offered Farnsworth $100,000 for his invention and, when he declined their offer, took him to court over his patent.

“If it weren’t for Philo T. Farnsworth, inventor of television, we’d still be eating frozen radio dinners”.

Johnny Carson

What followed was a bruising, ten year legal battle, a David vs. Goliath contest Farnsworth would win in the end, but at enormous cost both financial, and physical.

In another version of this story, the one that never happened, Philo Farnsworth went on to great fame and fortune to enjoy the fruits of his talents, and all his hard work. Instead World War 2 happened. Farnsworth’s hard fought patent rights quietly expired while the world, was busy with something else.

Ever the tinkerer, Farnsworth went on to invent a rudimentary form of radar, black light for night vision and an infrared telescope. Despite all that his company never did run in the “black”. He sold the company in 1949, to ITT.

From the 1950s on, the man’s primary interest, was in nuclear fusion. In 1965 he patented an array of tubes he called “fusors” in which he actually started a 30-second fusion reaction.

Farnsworth never did enjoy good health. The inventor of all-electronic television died of pneumonia on March 11, 1971 with well over 300 patents, to his name. Had you bought a television that day you would have owned a device with no fewer than 100 inventions, by this one man.

Ever the idealist, Farnsworth believed television would bring about ever greater heights in human learning and achievement, and foster a shared experience bringing about international peace and understanding. Much the same as some once believed of the internet, where the sum total of human knowledge would be available within a few keystrokes, and social media would foster new worlds of harmonious relations where cheerful users discuss the collected works of Shakespeare, the Code of Hammurabi and the vicissitudes of life.

Right.

Farnsworth was dismayed by the dreck brought about by his creation. “There’s nothing on it worthwhile,” he would say, “and we’re not going to watch it in this household. I don’t want it in your intellectual diet…Television is a gift of God, and God will hold those who utilize his divine instrument accountable to him.“ – Philo Taylor Farnsworth

That all changed, if only a bit, on July 20, 1969. American astronaut Neil Armstrong stepped onto the surface of the moon and declared, “That’s one small step for man, one giant leap for mankind.” It was probably a misspeak. Most likely he intended to say “one small step for A man” but, be that as it may. The world saw it happen thanks to a miniaturized version of a device invented by Philo Farnsworth.

Farnsworth himself was watching, just like everyone else alive that day. Years later Farnsworth’s wife Elma, he called her “Pem”, would recall in an interview with the Academy of Television Arts & Sciences: “We were watching it, and, when Neil Armstrong landed on the moon, Phil turned to me and said, ‘Pem, this has made it all worthwhile.’ Before then, he wasn’t too sure”.

August 13, 1941 Beans

The car itself was destroyed long ago, the ingredients for its manufacture unrecorded, but the thing lives on in the hearts of hemp enthusiasts, everywhere.

The largest museum in the United States is located in the Detroit suburb of Dearborn: the Henry Ford Museum of American Innovation. The sprawling, 12-acre indoor-outdoor complex in the old Greenfield Village is home to JFK’s Presidential limo, the Rosa Parks bus and the Wright Brothers’ bicycle shop. There you will find Abraham Lincoln’s chair from Ford’s Theater along with Thomas Edison’s laboratory and an Oscar Mayer Wienermobile. George Washington’s camp bed is there, with Igor Sikorsky’s helicopter and an enormous collection of antique automobiles, locomotives and aircraft.

One object you will not find there is Henry Ford’s plastic car. Made from soybeans.


As a young man, Henry Ford left the family farm outside of modern-day Detroit, and never returned. Ford’s father William thought the boy would one day own the place but young Henry couldn’t stand farm work. He later wrote, “I never had any particular love for the farm—it was the mother on the farm I loved”.

Henry Ford went on to other things, but part of him never left the soil. In 1941, the now-wealthy business magnate wanted to combine industry, with agriculture. At least, that’s what the museum says.

Soybean car chassis skeleton, right rear view

Ford gave the plastic car project to yacht designer Eugene Turenne Gregorie at first, but later turned to the Greenfield Village soybean laboratory. To the guy in charge over there, a guy with some experience in tool & die making. His name was Lowell Overly.

The car was made in Dearborn with help from scientist and botanist George Washington Carver, (yeah, That George Washington Carver), a man born to slavery who rose to such prodigious levels of accomplishment that Time magazine labeled the man, the “Black Leonardo”.

George Washington Carver, at work in his library

The soybean car, introduced to the public this day in 1941, was made from fourteen quarter-inch thick plastic panels and plexiglass windows, attached to a tubular steel frame and weighing in at 1,900 pounds, about a third lighter than comparable automobiles of the era. The finished prototype was exhibited later that year at the Dearborn Days festival, and the Michigan State Fair Grounds.

The thing was built to run on fuel derived from industrial hemp, a related strain of the green leafy herb beloved of stoners, the world over.

Ford claimed he’d be able to “grow automobiles from the soil”, a hedge against the metal rationing of World War Two. He dedicated 120,000 acres of soybeans to experimentation, but to no end. The total acreage devoted to “fuel” production went, somehow, unrecorded.

Another reason for a car made from soybeans was to help American farmers. In any case, Henry Ford had a “thing” for soybeans. He was one of the first in this country to regularly drink soy milk. At the 1934 World’s Fair in Chicago, Ford invited reporters to a feast where he served soybean cheese, soybean crackers, soy bread and butter, soy milk and soy ice cream. If he wasn’t the Bubba Gump of soybeans, perhaps Bubba Gump was the Henry Ford of shrimp.

Ford’s own car was fitted with a soybean trunk and struck with an axe to demonstrate the material’s durability, though the axe was later revealed to have a rubber boot.

Henry Ford’s soybean car

Henry Ford’s experiment in making cars from soybeans never got past that first prototype and came to a halt during World War 2. The project was never revived, though several states adopted license plates stamped out of soybeans, a solution to the steel shortage that farm animals found to be quite delicious.

The car itself was destroyed long ago, the ingredients for its manufacture unrecorded, but the thing lives on in the hearts of hemp enthusiasts, everywhere.

The New York Times claimed the car body and fenders were made from soybeans, wheat and corn. Other sources opine that the car was made from Bakelite or some variant of Duroplast, a plant-based auto body substance produced in the millions for the East German Trabant.

One newspaper claimed that nothing ever came from Henry Ford’s soybean experiments, save and except for, whipped cream.

August 12, 1865 The Shoulders of Giants

Today, the idea that microorganisms such as fungi, viruses and other pathogens cause infectious disease is common knowledge, but such ideas were held in disdain among scientists and doctors, well into the 19th century.

In the 12th century, French philosopher Bernard of Chartres talked about the concept of “discovering truth by building on previous discoveries”. The idea is familiar to the reader of English as expressed by the mathematician and astronomer Isaac Newton, who observed that “If I have seen further it is by standing on the shoulders of Giants.”

Dr. Ignaz Semmelweis

Nowhere is there more truth to the old adage, than in the world of medicine. In 1841, the child who survived to celebrate a fifth birthday could look forward to a life of some 55 years. Today, a five-year-old can expect to live to eighty-two, fully half again that of the earlier date.

Yet, there are times when the giants who brought us here are unknown to us, as if they had never been. One such is Dr. Ignaz Semmelweis, one of the earliest pioneers in anti-septic medicine.

Semmelweis studied law at the University of Vienna in the fall of 1837, but switched to medicine the following year. He received his MD in 1844 and, failing to gain a clinical appointment in internal medicine, decided to specialize in obstetrics.

In the third century AD, the Greek physician Galen of Pergamon first described the “miasma” theory of illness, holding that infectious diseases such as cholera, chlamydia and the Black Death were caused by noxious clouds of “bad air”.  The theory is discredited today, but such ideas die hard.


The germ theory of disease was first proposed by Girolamo Fracastoro in 1546 and expanded by Marcus von Plenciz in 1762. Single-cell organisms – bacteria – were known to exist in human dental plaque as early as 1683, yet their functions were imperfectly understood. Today, the idea that microorganisms such as fungi, viruses and other pathogens cause infectious disease is common knowledge, but such ideas were held in disdain among scientists and doctors, well into the 19th century.


In the mid-19th century, birthing centers were set up all over Europe, for the care of poor and underprivileged mothers and their illegitimate infants. Care was provided free of charge, in exchange for which young mothers agreed to become training subjects for doctors and midwives.

In 1846, Semmelweis was appointed assistant to Professor Johann Klein in the First Obstetrical Clinic of the Vienna General Hospital, a position similar to a “chief resident,” of today.


At the time, Vienna General Hospital ran two such clinics, the 1st a “teaching hospital” for undergraduate medical students, the 2nd for student midwives.

Semmelweis quickly noticed that one in ten women, and sometimes one in five, were dying in the First Clinic of the postpartum infection known as “childbed fever”, compared with less than 4% in the Second Clinic.

The difference was well known, even outside of the hospital. Expectant mothers were admitted on alternate days into the First or Second Clinic. Desperate women begged on their knees not to be admitted into the First, some preferring even to give birth in the streets, over delivery in that place. The disparity between the two clinics “made me so miserable”, Semmelweis said, “that life seemed worthless”.

He had to know why this was happening.

Puerperal peritonitis, 1912

Childbed or “puerperal” fever was rare among these “street births”, and far more prevalent in the First Clinic, than the Second. Semmelweis carefully eliminated every difference between the two, even including religious practices. In the end, the only difference was the people who worked there.

The breakthrough came in 1847, following the death of Semmelweis’ friend and colleague, Dr. Jakob Kolletschka. Kolletschka was accidentally cut by a student’s scalpel, during a post-mortem examination. The doctor’s own autopsy showed a pathology very similar to those women, dying of childbed fever. Medical students were going from post-mortem examinations of the dead to obstetrical examinations of the living, without washing their hands.

Midwife students had no such contact with the dead. This had to be it. Some unknown “cadaverous material” had to be responsible for the difference.

Ignaz Philipp Semmelweis

Semmelweis instituted a mandatory handwashing policy, using a chlorinated lime solution between autopsies and live patient examinations.

Mortality rates in the First Clinic dropped by 90 percent, to rates comparable with the Second. In April 1847, First Clinic mortality rates were 18.3% – nearly one in five. Hand washing was instituted in mid-May, and June rates dropped to 2.2%.  July was 1.2%. For two months, the rate actually stood at zero.
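
As a quick arithmetic check against the figures just quoted (my own back-of-the-envelope calculation, not a number from the original records), the fall from the April peak to the July rate works out to

$$\frac{18.3\% - 1.2\%}{18.3\%} \approx 93\%,$$

consistent with the roughly 90 percent drop described above.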

The European medical establishment celebrated the doctor’s findings. Semmelweis was feted as the Savior of Mothers, a giant of modern medicine. 

No, just kidding.  He wasn’t.

The imbecility of the response to Semmelweis’ findings is hard to get your head around, and the doctor’s own personality didn’t help.  The medical establishment took offense at the idea that they themselves were the cause of the mortality problem, and that the answer lay in personal hygiene.

Yearly mortality rates, 1841–1846, at the two clinics

Semmelweis himself was anything but tactful, publicly berating those who disagreed with his hypothesis and gaining powerful enemies.   For many, the doctor’s ideas were extreme and offensive, ignored or rejected and even ridiculed.  Are we not Gentlemen!?  Semmelweis was fired from his hospital position and harassed by the Vienna medical establishment, finally forced to move to Budapest.

Dr. Semmelweis was outraged by the indifference of the medical community, and began to write open and increasingly angry letters to prominent European obstetricians.  He went so far as to denounce such people as “irresponsible murderers”, leading contemporaries and even his wife, to question his mental stability.

Dr. Ignaz Philipp Semmelweis was committed to an insane asylum on July 31, 1865, twenty-three years before Dr. Louis Pasteur opened his institute for the study of microbiology.

Semmelweis bust, University of Tehran

Barely two weeks later, August 12, 1865, British surgeon and scientist Dr. Joseph Lister performed the first anti-septic surgery, in medical history. Dr. Semmelweis died the following day at the age of 47, the victim of a blood infection resulting from a gangrenous wound sustained in a severe beating, by asylum guards.

August 2, 1864 Grandissima Ruina

In an age of hand-lit sputtering fuses and hand packed (to say nothing of hand-made) powder, even a millisecond difference in ignition will give one ball a head start, to be measured in feet.

In 1642, Italian gun maker Antonio Petrini conceived a double-barrel cannon with tubes joined at 45°, firing solid shot joined together by a length of chain.  This was the year of the “Great Rebellion“, the English Civil War, when King and Parliament raised armies to go to war – with each other.  Petrini’s idea must have looked good to King Charles I of England. Imagine, a weapon capable of slicing through the ranks of his enemies, like grass before a scythe.

The idea was to fire both barrels simultaneously, but there was the rub.  Wild images occur to the imagination: imperfect combustion, and a chained ball swinging around to take out its own gun crew.  The King himself was mute on the subject and went on to lose his head, in 1649.  Petrini’s manuscript resides to this day in the Tower of London.  There is no documented evidence that the weapon was ever fired, save for the designer’s own description of the ‘Grandissima Ruina’ left behind by his own splendid creation.


Two hundred years later, the former British colonies across the Atlantic were themselves embroiled in civil war.

In the early days of independence, the Confederate Congress enacted a measure allowing local cities and towns to form semi-military companies for the purpose of local defense. As the very flower of young southern manhood was called up and sent to the front, these “home guard” units were often composed of middle-aged and older gentlemen, and others who were, for various reasons, unable to leave home and hearth.

Augustus Longstreet Hull

Augustus Longstreet Hull was born in 1847 in “The Classic City” of Athens, Georgia, and enlisted in the Confederate Army on September 8, 1864.

After the war, Hull worked twenty-seven years as a banker before publishing the Annals of Athens, in 1906.  In it, Mr. Hull writes with not a little biting wit of his own hometown home guard unit, Athens’ own Mitchell Thunderbolts.

“From the name one might readily infer that it was a company made up of fierce and savage men, eager for the fray and ready at all times to ravage and slaughter; yet such was not the case, for in all their eventful career no harm was done to a human being, no property was seized and not one drop of blood stained their spotless escutcheon.”

Named for one of its own private soldiers, the Mitchell Thunderbolts were not your standard military company. These guys were “organized strictly for home defense” and absolutely refused to take orders.  From anyone. They recognized no superior officer, and the right to criticism was reserved and freely exercised by everyone from that “splendid old gentleman” Colonel John Billups, down to the lowliest private.

800px-Middleton_P._Barrow_-_Brady-Handy
Georgia Senator Middleton Pope Barrow

General Howell Cobb sent the future United States Senator Captain Middleton Pope Barrow to Athens in 1864, to inspect the Thunderbolts. Having no intention of submitting to “inspection” by any mere stripling of a Captain, Dr. Henry Hull (Augustus’ father) “politely informed him that if he wished to inspect him, he would find him on his front porch at his home every morning at 9 o’clock“.

John Gilleland, 53, was a local dentist, builder and mechanic, and private soldier in good standing, of the Mitchell Thunderbolts.  Gilleland must have liked Petrini’s idea because he took up a collection in 1862, and raised $350 to build the Confederate States of America’s own, double-barrel cannon.

Measuring 13 inches wide by 4 feet 8½ inches long and weighing in at some 1,300 pounds, this monstrosity had two barrels diverging at 3° and equipped with three touch holes, one for each barrel and a third should anyone wish to fire the two together.  It was the secret “super weapon” of the age, two cannonballs connected by a chain and designed to “mow down the enemy somewhat as a scythe cuts wheat.”

Yeah. As Mr. Petrini could have told them, the insurmountable problem remained. In an age of hand-lit sputtering fuses and hand packed (to say nothing of hand-made) powder, even a millisecond difference in ignition will give one ball a head start, to be measured in feet. How to simultaneously fire two conjoined weapons remained a problem, even for so elite an outfit, as the Mitchell Thunderbolts.

The atmosphere was festive on April 22, 1862, when a crowd gathered to watch Gilleland test the Great Yankee Killer. The gun was aimed at two poles stuck in the ground, but uneven ignition and casting imperfections sent assorted spectators scrambling for cover as the two balls spun wildly off to the side, where they “plowed up about an acre of ground, tore up a cornfield, mowed down saplings, and then the chain broke, the two balls going in different directions“.

Double Barrel Cannon model, H/T ModelExpo

On the second test, two chain-connected balls shot through the air and into a stand of trees.   According to one witness, the “thicket of young pines at which it was aimed looked as if a narrow cyclone or a giant mowing machine had passed through“.

On the third firing, the chain snapped right out of the barrel.  One ball tore into a nearby log cabin and destroyed the chimney, while the other spun off and killed a cow who wasn’t bothering anyone.

Gilleland considered all three tests successful, even though the only things truly safe that day were those two target posts.

The dentist went straight to the Confederate States’ arsenal in Augusta, where Colonel George Rains subjected his creation to extensive testing before reporting the thing too unreliable for military use. Outraged, the inventor wrote angry letters to Georgia Governor Joseph “Joe” Brown and to the Confederate government in Richmond, but to no avail.

At last, the contraption was stuck in front of the Athens town hall and used as a signal gun, to warn citizens of approaching Yankees.


There the thing remained until August 2, 1864, when the gun was hauled out to the hills west of town to meet the Federal troops of Brigadier General George Stoneman.  The double-barrel cannon was positioned on a ridge near Barber’s Creek and loaded with canister shot, along with several conventional guns.  Outnumbered home guards did little real damage but the noise was horrendous, and Stoneman’s raiders withdrew to quieter pastures.

There were other skirmishes in the area, all of them minor. In the end, Athens escaped the devastation of Sherman’s march to the sea and the Confederate superweapon was moved back to town.

Gilleland’s monstrosity was sold after the war and lost, for a time.  The thing was recovered and restored back in 1891, and returned to the Athens City Hall where it remains to this day, a contributing property of the Downtown Athens Historic District.  Come and see it if you’re ever in Athens, right there at the corner of Hancock and College Avenue.  There you will find the thing, pointing north, at all those Damned Yankees.  You know. Just in case.

Plaque at the double-barreled cannon, Athens, Georgia

July 13, 1908 Apocalypse

In 2018, the non-profit B612 Foundation dedicated to the study of near-Earth object impacts, reported that “It’s a 100 per cent certain we’ll be hit [by a devastating asteroid]”. Comfortingly, the organization’s statement concluded “we’re [just] not 100 per cent sure when.”

The first atomic bomb in the history of human conflict exploded in the skies over Japan on August 6, 1945. The bomb, code named “Little Boy”, detonated at an altitude of 1,900 feet over the city of Hiroshima at 8:15am, Japanese Standard Time.

In this “gun-triggered” fission bomb, barometric-pressure sensors initiated the explosion of four cordite charges, propelling a small “bullet” of enriched uranium the length of a fixed barrel and into a larger mass of the same material. Within picoseconds (a picosecond is one trillionth, or 0.000000000001, of a second), the collision of the two bodies initiated a fission reaction, releasing an energy yield roughly equivalent to 15,000 tons of TNT.
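
For a rough sense of scale, a back-of-the-envelope conversion (assuming the standard figure of about 4.184 gigajoules per ton of TNT, not a number from the original accounts) puts that yield at

$$E \approx 15{,}000 \ \text{tons} \times 4.184 \times 10^{9} \ \text{J/ton} \approx 6.3 \times 10^{13} \ \text{joules}.$$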

66,000 were killed outright by the effects of the blast. The shock wave spread outward at a velocity greater than the speed of sound, flattening virtually everything in its path for a mile in all directions.


Thirty-seven years before, the boreal forests of Siberia lit up with an explosion 1,000 times greater than the atomic bomb dropped over Hiroshima. At the time, no one had the foggiest notion that it was coming.

The Taiga occupies the high latitudes of the world’s northern regions, a vast international beltline of coniferous forests consisting mostly of pines, spruces and larches between the high tundra, and the temperate forest.  An enormous community of plants and animals, this trans-continental ecosystem comprises a vast biome, second only to the world’s oceans.

The Eastern Taiga is a region in the east of Siberia, an area 1.6 times the size of the continental United States.  The Stony Tunguska River wends its way along an 1,160-mile length of the region, its entire course flowing under great pebble fields with no open water.

Tunguska

On the morning of June 30, 1908, the Tunguska River lit up with a bluish-white light.  At 7:17 a.m. local time, a column of light too bright to look at with the naked eye moved across the skies above the Tunguska. Minutes later, a vast explosion knocked people off their feet, flattening buildings, crops and as many as 80 million trees over an area of some 830 square miles. A vast “thump” was heard, the shock wave equivalent to an earthquake measuring 5.0 on the Richter scale. Within minutes came a second and then a third shock wave and finally a fourth, more distant this time and described by eyewitnesses as the “sun going to sleep”.

On July 13, 1908, the Krasnoyaretz newspaper reported “At 7:43 the noise akin to a strong wind was heard. Immediately afterward a horrific thump sounded, followed by an earthquake that literally shook the buildings as if they were hit by a large log or a heavy rock”.

Fluctuations in atmospheric pressure were detectable as far away as Great Britain.  Night skies were set aglow from Asia to Europe for days on end, theorized to have been caused by light passing through high-altitude ice particles.

In the United States, lookout posts from the Smithsonian Astrophysical Observatory headquartered in Cambridge, Massachusetts, to the Mount Wilson Observatory in Los Angeles recorded a several months-long decrease in atmospheric transparency, attributed to an increase in dust, suspended in the atmosphere.

The “Tunguska Event” was the largest such impact event in recorded history, but far from the first. Or the last.  Mistastin Lake in northern Labrador was formed by an impact during the Eocene epoch, some 36 million years ago, cubic zirconia deposits suggesting an impact-zone temperature of some 4,300° Fahrenheit.

That’s halfway to the temperature of the surface of the sun.

“A bolide – a very bright meteor of an apparent magnitude of −14 or brighter” H/T Wikimedia

Some sixty-six million years ago, the “Chicxulub impactor” struck the Yucatan Peninsula of Mexico, unleashing a mega-tsunami of 330-feet in height from Texas to Florida. Superheated steam, ash and vapor towered over the impact zone, as colossal shock waves triggered global earthquakes and volcanic eruptions.   Vast clouds of dust blotted out the sun for months on end leading to mass extinction events, the world over.

The official history of the Ming Dynasty records the Ch’ing-yang event of 1490, a meteor shower in China in which “stones fell like rain”. Some 10,000 people were killed, for all intents and purposes stoned to death.

In 2013, a twenty-meter (66-foot) space rock estimated at 13,000-14,000 tons flashed across the skies of Chelyabinsk, Russia, breaking apart with a kinetic impact estimated at 26 times the nuclear blast over Hiroshima.  This superbolide (a bolide is “an extremely bright meteor, especially one that explodes in the atmosphere”) entered the earth’s atmosphere on February 15, burning exposed skin and damaging retinas for miles around.  No fatalities were reported, though 1,500 were injured seriously enough to require medical attention.

The 450-ton Chicora Meteor collided with western Pennsylvania on June 24, 1938, in a cataclysm comparable to the Halifax Explosion of 1917.  The good luck held, that time, the object making impact in a sparsely populated region.  The only reported casualty was a cow.  Investigators F.W. Preston, E.P. Henderson and James R. Randolph remarked that “If it had landed on Pittsburgh there would have been few survivors”.

In 2018, the non-profit B612 Foundation dedicated to the study of near-Earth object impacts, reported that “It’s a 100 per cent certain we’ll be hit [by a devastating asteroid]”. Comfortingly, the organization’s statement concluded “we’re [just] not 100 per cent sure when.”


It puts a lot of things into perspective.

June 21, 1633 The Last Word

Revenge, it is said, is a dish best served cold.

From the time of antiquity, science took the “geocentric” view of the solar system. Earth exists at the center of celestial movement with the sun and planetary bodies revolving around our own little sphere.

The perspective was widely held but by no means unanimous.  In the third century BC, the Greek astronomer and mathematician Aristarchus of Samos put the Sun at the center of the universe.  Later Greek astronomers Hipparchus and Ptolemy refined Aristarchus’ methods to arrive at a fairly accurate estimate for the distance to the moon, but rejected his heliocentrism, which remained the minority view.

Earth is at the center of this model of the universe created by Bartolomeu Velho, a Portuguese cartographer, in 1568. H/T: NASA/Bibliothèque Nationale, Paris

In the 16th century, Polish mathematician and astronomer Nicolaus Copernicus parted ways with the orthodoxy of his time, describing a “heliocentric” model of the universe placing the sun at the center.  The Earth and other bodies, according to this model, revolved around the sun.

Copernicus wisely refrained from publishing such ideas until the end of his life, fearing to offend the religious sensibilities of the time. Legend has it that he was presented with an advance copy of his “De revolutionibus orbium coelestium” (On the Revolutions of the Heavenly Spheres) on awakening on his death bed, from a stroke-induced coma. He took one look at his book, closed his eyes and never opened them again.

Copernicus’ ‘heliocentric’ view of the universe.

The Italian physicist, mathematician, and astronomer Galileo Galilei came along about a hundred years later. Called the “Father of Modern Observational Astronomy”, Galileo made improvements to the telescope, and his resulting astronomical observations supported the Copernican heliocentric view.

Bad news for Galileo, they also brought him to the attention of the Roman Inquisition.

Biblical references such as, “The Lord set the Earth on its Foundations; it can Never be Moved.” (Psalm 104:5) and “And the Sun Rises and Sets and Returns to its Place.” (Ecclesiastes 1:5) were taken at the time as literal and immutable fact and formed the basis for religious objection to the heliocentric model.

Galileo_facing_the_Roman_Inquisition
Galileo faces the Roman Inquisition

Galileo was brought before inquisitor Vincenzo Maculani for trial. The astronomer backpedaled before the Inquisition, but only to a point, testifying in his fourth deposition on June 21, 1633: “I do not hold this opinion of Copernicus, and I have not held it after being ordered by injunction to abandon it. For the rest, here I am in your hands; do as you please”.

There is a story about Galileo, which may or may not be true. Refusing to accept the validity of his own conviction, the astronomer muttered “Eppur si muove” — “And yet it moves”.

The Inquisition condemned the astronomer to “abjure, curse, & detest” his Copernican heliocentric views, returning him to house arrest at his villa in 1634, there to spend the rest of his life. Galileo Galilei, the Italian polymath who all but orchestrated the transition from the late Middle Ages to the scientific Renaissance, died on January 8, 1642, desiring to be buried in the main body of the Basilica of Santa Croce, next to the tombs of his father and ancestors.

His final wishes were ignored at the time, though not forever. Some ninety-five years later, Galileo was re-interred in the main body of the basilica, as he had wanted.

1200x630px-Basilica_of_Santa_Croce_Florence_1
Basilica of Santa Croce, in Florence

Often, atmospheric conditions in such burial vaults lead to natural mummification of the corpse. Sometimes the dead look almost lifelike. When it came to the saints, believers took this as proof of the incorruptibility of these individuals, and small body parts were taken as holy relics.

Such a custom seems ghoulish to us today, but the practice was quite old by the 18th century.  Galileo is not now and never was a saint of the Catholic Church, quite the opposite.  The Inquisition had judged the man an enemy of the church, a heretic.

23galileospan-cnd-articleLarge
“A bust of Galileo at the Galileo Museum in Florence, Italy. The museum is displaying recovered parts of his body”. H/T New York Times

Even so, the condition of Galileo’s body may have made him appear thus “incorruptible”.  Be that as it may, one Anton Francesco Gori removed the thumb, index and middle fingers on March 12, 1737. The digits with which Galileo wrote down his theories of the cosmos. The digits with which he adjusted his telescope.

The other two fingers and a tooth disappeared in 1905, leaving the middle finger from Galileo’s right hand on exhibit at the Museo Galileo in Florence, Italy. 

Locked in a glass case, the finger points upward, toward the sky.

23galileo2-cnd-popup
100 years later, two fingers and a tooth were purchased at auction, and have since rejoined their fellow digit at the Museo Galileo. To this day these are the only human body parts in a museum otherwise devoted to scientific instrumentation.

379 years after his death, Galileo’s extremity points upward, toward the glory of the cosmos.  Either that, or it is the most famous middle finger on earth, flipping the bird in eternal defiance of those lesser specimens who once condemned him for ideas ahead of his time.

May 24, 1883 First Across

For 11 years she studied higher mathematics, catenary curves, materials strength and the intricacies of cable construction, all while acting as the pivot point on the largest bridge construction project on the planet and nursemaid, to a desperately sick husband.

Focused as he was on surveying, the engineer should have paid more attention to his surroundings. The year was 1869. Civil engineer John Roebling had begun the site work two years earlier, almost to the day. Now, just a few more compass readings across the East River. Soon, work would begin on the longest steel suspension span in the world. A bridge connecting the New York boroughs of Brooklyn and Manhattan.

Roebling was working on the pier with his 32-year-old son Washington, also a civil engineer. As the ferry came alongside, the elder Roebling’s toes were caught and crushed so badly as to require amputation.

brooklyn-bridge-caisson-granger
“Lockjaw” is such a sterile term, it doesn’t begin to describe the condition known as tetanus. In the early stages, the anaerobic bacterium Clostridium tetani produces tetanospasmin, a neurotoxin producing mild spasms in the jaw muscles. As the disease progresses, sudden and involuntary contractions affect skeletal muscle groups, becoming so powerful that bones are literally fractured as the muscles tear themselves apart. These were the last days of John Roebling, the bridge engineer who would not live to see his most famous work.

The German-born civil engineer was the first casualty of the project.  He would not be the last.

Brooklyn Bridge Caisson Construction

Washington took over the project, beginning construction on January 3, 1870.

Enormous yellow pine boxes called “caissons” were built on the Brooklyn and New York sides of the river, descending at the rate of six inches per week in search of bedrock. Like giant diving bells, the New York side ended up at 78 feet below mean high tide, the Brooklyn side at 44 feet. Pressurized air was pumped into these caissons, keeping water and mud at bay as workers excavated the bottom.
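For a sense of the working conditions, the compressed air in each caisson had to at least balance the water pressure at its cutting edge. A minimal sketch of that hydrostatic calculation, using the depths quoted above and an assumed density of about 1,025 kg/m³ for brackish East River water:

```python
# Hydrostatic gauge pressure the compressed air had to overcome at the bottom
# of each caisson: P = rho * g * h. Depths are from the article; the water
# density is an assumption for brackish water.
RHO = 1025.0            # kg/m^3, assumed density of brackish water
G = 9.81                # m/s^2
FT_TO_M = 0.3048
PA_PER_ATM = 101_325
PA_PER_PSI = 6894.76

for side, depth_ft in [("New York", 78), ("Brooklyn", 44)]:
    gauge_pa = RHO * G * depth_ft * FT_TO_M
    print(f"{side} caisson: ~{gauge_pa / PA_PER_PSI:.0f} psi above atmospheric, "
          f"~{gauge_pa / PA_PER_ATM + 1:.1f} atmospheres absolute")
```

On those assumptions, the deeper New York caisson works out to roughly 35 psi of overpressure, around 3.4 atmospheres absolute.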

In 1872, these “sandhogs” began to experience a strange illness that came to be called “caisson disease”.

Civil War-era submarine designer Julius Hermann Kroehl may have recognized what was happening, but Kroehl was five years in his grave by this time, victim of the same “fever”.

Today we call it “the bends”. Pop the top off a soda bottle and you’ll see the principle at work. Without sufficient decompression time, dissolved gases come out of solution and the blood turns to foam. Bubbles form in or migrate to any part of the body, resulting in symptoms ranging from joint pain and skin rashes to paralysis and death.  The younger Roebling was badly injured by the bends in 1872, leaving him partially paralyzed and bedridden, incapable of supervising construction on-site.
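The soda-bottle analogy can be made a little more concrete with Henry’s law, which says the amount of gas dissolved in a liquid is roughly proportional to its partial pressure, C = k·P. A minimal sketch, assuming the roughly 3.4 atmospheres of absolute pressure suggested by the New York caisson depth above and a rough solubility constant for nitrogen:

```python
# Henry's law in its simplest form: dissolved gas concentration is proportional
# to partial pressure, C = k * P. The 3.4 atm working pressure and the nitrogen
# solubility constant are assumptions used only for illustration.
N2_FRACTION = 0.78              # nitrogen fraction of air
K_N2 = 6.1e-4                   # mol/(L*atm), rough solubility of N2 in water

def dissolved_n2(pressure_atm: float) -> float:
    """Approximate dissolved nitrogen, in mol/L, at a given absolute pressure."""
    return K_N2 * N2_FRACTION * pressure_atm

at_surface = dissolved_n2(1.0)
in_caisson = dissolved_n2(3.4)  # assumed working pressure, ~3.4 atm absolute
print(f"~{in_caisson / at_surface:.1f}x as much nitrogen in solution at depth")
# Surface too quickly and that excess has nowhere to go but out of solution,
# forming the bubbles described above.
```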

brooklyn-anchorage

Roebling moved to an apartment in Brooklyn Heights and conducted the entire project looking out the window, designing and redesigning details while his wife, Emily Warren Roebling, became the critical connection between her husband and the job site.

To aid in the work, Emily Roebling took a crash course in bridge engineering. For 11 years she studied higher mathematics, catenary curves, materials strength and the intricacies of cable construction, all while acting as the pivot point on the largest bridge construction project on the planet and nursemaid, to a desperately sick husband.

Emily-Warren-Roebling-e1389630968571
Emily Warren Roebling, the “first woman field engineer”.

Historian David McCullough wrote in his book, The Great Bridge: The Epic Story of the Building of the Brooklyn Bridge: “By and by it was common gossip that hers was the great mind behind the great work and that this, the most monumental engineering triumph of the age, was actually the doing of a woman, which as a general proposition was taken in some quarters to be both preposterous and calamitous. In truth, she had by then a thorough grasp of the engineering involved”.

Unlikely as it sounds, fires broke out at the bottom of the river on several occasions, started by workmen’s candles, fed by the oakum used for caulking and turbocharged by all that pressurized air. On at least one occasion, the caisson was filled with millions of gallons of water before the fire went out for good.

Brooklyn bridge builders

A footbridge connected the two sides in 1877, and soon the wires began to be strung. Wooden “buggies” carried men back and forth along wires suspended hundreds of feet above the water, as individual wires were woven into the four great cables that support the bridge. The work was exacting, with each wire bound together to precise specifications. Rumors of corruption and sleaze surrounded the project when J. Lloyd Haigh, the wire contractor, was discovered to be supplying inferior material. By then it was too late to remove the wire already in place, so 150 extra wires were bundled into each cable to compensate. The tactic worked.  Haigh’s shoddy wire remains there to this day.

At the time it was built, the span across the East river linking Brooklyn with Manhattan was the longest suspension bridge in the world.

Construction was completed in 1883, the bridge opening for use on May 24. Emily Roebling was the first to cross, in a carriage, carrying a rooster as a sign of victory. New York politician Abram Stevens Hewitt honored her at that day’s dedication. Today a bronze plaque bears the name of the first female field engineer.

“…an everlasting monument to the sacrificing devotion of a woman and of her capacity for that higher education from which she has been too long disbarred.”

New York politician Abram Stevens Hewitt

Six days later, a rumor started that the bridge was about to collapse.  At least 12 people were killed in the resulting stampede. A year later, a publicity stunt by P. T. Barnum helped to put people’s minds at ease when Jumbo, the circus’ prize elephant, led a parade of 20 other elephants across the bridge.

For a long time the span was called the “New York and Brooklyn Bridge” or the “East River Bridge”, officially becoming the “Brooklyn Bridge” only in 1915. At least 27 men were killed in its construction: three from the bends, several in cable-stringing accidents, and others crushed under granite blocks or killed in high falls.

Even today, popular culture abounds with stories of suckers “buying” the Brooklyn Bridge. It was the longest suspension bridge in the world in its time, and would remain so until 1903. Roebling had designed his project to be six times the strength required for the job. Even with those defective cables, the bridge is four times as strong as it needs to be. Many of the Brooklyn Bridge’s contemporary structures have long since gone.  John Augustus Roebling’s bridge carries 145,000 cars, every day.

Brooklyn Bridge