August 26, 1918 The Computer Wore a Skirt

“So the astronaut who became a hero, looked to this black woman in the still-segregated South at the time as one of the key parts of making sure his mission would be a success.”

In plasma physics, the heliosphere is a vast cavity formed by the Sun, a “bubble” continuously “inflated” by plasma streaming from that body, known as the “solar wind,” separating our own solar system from the vastness of interstellar space. The outermost reach of the heliosphere comprises three major regions: the termination shock, the heliosheath, and the heliopause, the last so called because it is where the solar wind and the interstellar wind meet in a zone of equilibrium.


Only five man-made objects are on trajectories that will carry them beyond the heliosphere and into interstellar space: Pioneer 10 and 11, launched in 1972 and 1973; Voyager 1 and 2, launched in 1977; and New Horizons, launched in 2006. Of those five, only three remain active and continue to transmit data back to our little blue planet.

Voyager 2 Spacecraft

Spectacular images may be found online, if you’re inclined to look them up. Images such as this jaw-dropping shot of the “blue planet” Neptune, taken in the days before the point of closest approach in August 1989.

This picture of Neptune was taken by Voyager 2 less than five days before the probe’s closest approach of the planet on Aug. 25, 1989. The picture shows the “Great Dark Spot” – a storm in Neptune’s atmosphere – and the bright, light-blue smudge of clouds that accompanies the storm. Credit: NASA/JPL-Caltech

Or these images of the rings of Neptune, taken on this day thirty-two years ago, before Voyager 2 left the last of the “gas giants” behind.

Voyager 2 took these two images of the rings of Neptune on Aug. 26, 1989, just after the probe’s closest approach to the planet. Neptune’s two main rings are clearly visible; two fainter rings are visible with the help of long exposure times and backlighting from the Sun.
Credit: NASA/JPL-Caltech

Few among us are equipped to understand the complexity of such flight. Precious few. One such was a little girl, an American of African ancestry, born this day in 1918 in White Sulphur Springs, West Virginia. The youngest of four children born to Joylette and Joshua Coleman, Creola Katherine showed unusual mathematical skill from an early age.

In the 1920s, Greenbrier County, West Virginia didn’t offer education past the eighth grade for black children. The Colemans arranged for their kids to attend high school two hours up the road in Institute, on the campus of West Virginia State College. Katherine took every math class the school offered and graduated summa cum laude in 1937, with degrees in mathematics and French.

There were teaching jobs along the way at all-black schools and a marriage to Katherine’s first husband, James Goble. The couple would have three children together before James died of a brain tumor. Three years later she married James A. “Jim” Johnson.

With all that going on at home, Katherine found time to become one of only three black students, and the only woman, selected to integrate the graduate school at West Virginia University after the Supreme Court ruling in Missouri ex rel. Gaines v. Canada.

Careers in research mathematics were few and far between for black women in 1952, but talent and hard work win out where ignorance fears to tread.

So it was that Katherine Johnson joined the National Advisory Committee for Aeronautics (NACA) in 1953. Johnson worked in a pool of women who read data from aircraft black boxes and carried out a range of mathematical tasks. She referred to her co-workers as “computers who wore skirts”.

Flight research was a man’s world in those days, but one day Katherine and a colleague were asked to fill in temporarily. Respect is not given, it is earned, and Katherine’s knowledge of analytic geometry made quick work of that. Male bosses and colleagues alike were impressed with her skills. When her “temporary” assignment was over, it no longer seemed all that important to send her back to the pool.

Katherine would later explain that barriers of race and sex remained, but she could hold her own. She sat in on meetings where decisions were made, where no woman had been before. She’d simply tell them that she had done the work and that this was where she belonged, and that was the end of that.

Johnson worked as a human computer through most of the 1950s, calculating in-flight problems such as gust alleviation in aircraft. Racial segregation was still in effect in those days, under state law and federal workplace segregation rules introduced under President Woodrow Wilson some forty years earlier. The door where she worked was labeled “Colored Computers,” but Johnson said she “didn’t feel the segregation at NASA, because everybody there was doing research. You had a mission and you worked on it, and it was important to you to do your job … and play bridge at lunch. I didn’t feel any segregation. I knew it was there, but I didn’t feel it.”

“We needed to be assertive as women in those days – assertive and aggressive – and the degree to which we had to be that way depended on where you were. I had to be. In the early days of NASA women were not allowed to put their names on the reports – no woman in my division had had her name on a report. I was working with Ted Skopinski and he wanted to leave and go to Houston … but Henry Pearson, our supervisor – he was not a fan of women – kept pushing him to finish the report we were working on. Finally, Ted told him, “Katherine should finish the report, she’s done most of the work anyway.” So Ted left Pearson with no choice; I finished the report and my name went on it, and that was the first time a woman in our division had her name on something”.

Katherine Johnson

Katherine worked as an aerospace technologist from 1958 until her retirement. She calculated the trajectory for Alan Shepard’s May 1961 flight, the first by an American in space. She worked out the launch window for his Mercury mission and plotted backup navigational charts in case of electronic failure. NASA was using electronic computers by the time of John Glenn’s first orbit around the Earth, but Glenn refused to fly until Katherine Johnson personally verified the computer’s calculations. Author Margot Lee Shetterly commented, “So the astronaut who became a hero, looked to this black woman in the still-segregated South at the time as one of the key parts of making sure his mission would be a success.”

Katherine Johnson retired in 1986 and lived to see six grandchildren and 11 “Greats”. Everyone should live to see their own great grandchild. Not surprisingly, Johnson encouraged hers to pursue careers in science and technology.

President Barack Obama personally awarded Johnson the Presidential Medal of Freedom in 2015, for work spanning the Mercury program to the Space Shuttle. NASA noted her “historical role as one of the first African-American women to work as a NASA scientist.”

A delightful side dish for this story is the Silver Snoopy award NASA gives for outstanding achievement, “For professionalism, dedication and outstanding support that greatly enhanced space flight safety and mission success.”

Following the Mercury and Gemini projects, NASA was searching for a way to focus employees and contractors alike on their own personal contributions to mission success. They wanted it to be fun and interesting, like the Smokey Bear character of the United States Forest Service. Al Chop of the Manned Spacecraft Center came up with the idea.

Peanuts creator Charles Schulz, a combat veteran of WW2 and an avid supporter of the space program, loved the idea. Schulz drew the character to be cast in a silver pin and worn into space by a member of the astronaut corps. It is this astronaut who personally awards his or her Snoopy to the deserving recipient.

The award is literally once in a lifetime. Of all NASA personnel and those of its many contractors, fewer than one percent have ever received the coveted Silver Snoopy.

Astronaut and former NASA associate administrator for education Leland Melvin personally awarded Johnson her own Silver Snoopy at the 2016 naming ceremony for the Katherine G. Johnson Computational Research Facility at NASA’s Langley Research Center in Hampton, Virginia.

Astronaut and former NASA associate administrator for education Leland Melvin presents Katherine Johnson with a Silver Snoopy award. / Credit: NASA, David C. Bowman

August 19, 1906 The Damn Thing Works!

A baby was born this day in 1906 in a small log cabin near Beaver, Utah. His name was Philo, the first-born child of Lewis Farnsworth and Serena Bastian. He would grow to be the most famous man you’ve probably never heard of.

Inventor Thomas Edison was once asked about his seeming inability to perfect the electric light. “I have not failed,” he explained. “I’ve just found 10,000 ways that won’t work.”

A baby was born this day in 1906 in a small log cabin near Beaver, Utah. His name was Philo, the first-born child of Lewis Farnsworth and Serena Bastian. He would grow to be the most famous man you’ve probably never heard of.

Birthplace of Philo Taylor Farnsworth

Philo was constantly tinkering. He was the kind who could look at an object and understand how it worked, and why this particular one didn’t. The family moved when he was 12 to a relative’s ranch near Rigby, Idaho. Philo was delighted to learn the place had electricity.

He found a burnt-out electric motor thrown out by a previous tenant and rewound the armature, converting his mother’s hand-cranked washing machine to electric.

It must’ve seemed like Christmas morning when he found all those old technology magazines, in the attic. He even won a $25 prize one time in a magazine contest, for inventing a magnetized car lock.

Farnsworth was fascinated with the behavior of molecules and excelled in chemistry and physics at Rigby High School. Harrowing a field one day behind a team of two horses, his mind got to working. What if I could “train“ electrons to work in lines, like I’m doing here with these horses? Electrons are so fast the human eye would never pick up the individual lines. Couldn’t I use them to “paint“ an electronic picture?

Image dissector

Philo sketched his idea for an “image dissector” for his science teacher, Justin Tolman, who encouraged him to keep working on it. Tolman kept the sketch, though neither could have known at the time what it would come to mean. Farnsworth’s 1922 drawing would one day prove decisive in a court of law, over who invented all-electronic television.

From Japan to Russia, Germany and America, more than fifty inventors were working in the 1920s to invent television. History remembers the Scottish engineer John Logie Baird as the man who built and demonstrated the world’s first electromechanical television. Amazingly, it was he who invented the first color TV tube as well.

Scotsman John Logie Baird invented the first (electromechanical) TV

It was all well and good, but Baird’s spinning electromechanical disk was glacial compared with the speed of the electron. Clearly, the future of television lay in the field of electronics.

The Russian-born engineer Vladimir K. Zworykin applied for a US patent on an electron scanning tube in 1923, while working for Westinghouse; he would later take his work to RCA. He wouldn’t get the thing to work, though, until 1934. Meanwhile, Philo Taylor Farnsworth successfully demonstrated the first television signal transmission on September 7, 1927. The excited telegram Farnsworth sent to one of his backers exclaimed, “The damn thing works!”

Farnsworth’s successful patent application in 1930 resulted in additional funding to support his work, and a visit from Vladimir Zworykin. RCA offered Farnsworth $100,000 for his invention and, when he declined their offer, took him to court over his patent.

“If it weren’t for Philo T. Farnsworth, inventor of television, we’d still be eating frozen radio dinners”.

Johnny Carson

What followed was a bruising, ten-year legal battle, a David vs. Goliath contest Farnsworth would win in the end, but at enormous cost, both financial and physical.

In another version of this story, the one that never happened, Philo Farnsworth went on to great fame and fortune, enjoying the fruits of his talents and all his hard work. Instead, World War 2 happened. Farnsworth’s hard-fought patent rights quietly expired while the world was busy with something else.

Ever the tinkerer, Farnsworth went on to invent a rudimentary form of radar, black light for night vision and an infrared telescope. Despite all that, his company never did run in the black. He sold the company to ITT in 1949.

From the 1950s on, the man’s primary interest was nuclear fusion. In 1965 he patented an array of tubes he called “fusors,” in which he actually started a 30-second fusion reaction.

Farnsworth never did enjoy good health. The inventor of all-electronic television died of pneumonia on March 11, 1971, with well over 300 patents to his name. Had you bought a television that day, you would have owned a device incorporating no fewer than 100 inventions by this one man.

Ever the idealist, Farnsworth believed television would bring about ever greater heights of human learning and achievement, a shared experience fostering international peace and understanding. Much the same as some once believed of the internet, where the sum total of human knowledge would be available within a few keystrokes, and social media would foster new worlds of harmonious relations in which cheerful users discussed the collected works of Shakespeare, the Code of Hammurabi and the vicissitudes of life.

Right.

Farnsworth was dismayed by the dreck brought about by his creation. “There’s nothing on it worthwhile,” he would say, “and we’re not going to watch it in this household. I don’t want it in your intellectual diet…Television is a gift of God, and God will hold those who utilize his divine instrument accountable to him.“ – Philo Taylor Farnsworth

That all changed, if only a bit, on July 20, 1969. American astronaut Neil Armstrong stepped onto the surface of the moon and declared, “That’s one small step for man, one giant leap for mankind.” It was probably a misspeak; most likely he intended to say “one small step for A man,” but be that as it may. The world saw it happen thanks to a miniaturized version of a device invented by Philo Farnsworth.

Farnsworth himself was watching, just like everyone else alive that day. Years later Farnsworth’s wife Elma (he called her “Pem”) would recall in an interview with the Academy of Television Arts & Sciences: “We were watching it, and, when Neil Armstrong landed on the moon, Phil turned to me and said, ‘Pem, this has made it all worthwhile.’ Before then, he wasn’t too sure”.

August 13, 1941 Beans

The car itself was destroyed long ago, the ingredients for its manufacture unrecorded, but the thing lives on in the hearts of hemp enthusiasts, everywhere.

The largest indoor-outdoor museum complex in the United States is located in the Detroit suburb of Dearborn: the Henry Ford Museum of American Innovation and the adjacent Greenfield Village. The sprawling complex, its main exhibit hall alone covering some 12 acres, is home to JFK’s presidential limo, the Rosa Parks bus and the Wright brothers’ bicycle shop. There you will find Abraham Lincoln’s chair from Ford’s Theatre along with Thomas Edison’s laboratory and an Oscar Mayer Wienermobile. George Washington’s camp bed is there, with Igor Sikorsky’s helicopter and an enormous collection of antique automobiles, locomotives and aircraft.

One object you will not find there is Henry Ford’s plastic car. Made from soybeans.


As a young man, Henry Ford left the family farm outside of modern-day Detroit and never looked back. Ford’s father, William, thought the boy would one day take over the place, but young Henry couldn’t stand farm work. He later wrote, “I never had any particular love for the farm—it was the mother on the farm I loved”.

Henry Ford went on to other things, but part of him never left the soil. In 1941, the now-wealthy business magnate wanted to combine industry with agriculture. At least, that’s what the museum says.

Soybean car chassis skeleton, right rear

Ford first gave the plastic car project to yacht designer Eugene Turenne Gregorie, but later turned it over to the Greenfield Village soybean laboratory, and to the guy in charge over there, a man with some experience in tool and die making. His name was Lowell Overly.

The car was made in Dearborn with help from scientist and botanist George Washington Carver (yes, that George Washington Carver), a man born into slavery who rose to such prodigious levels of accomplishment that Time magazine labeled him the “Black Leonardo”.

George Washington Carver, at work in his library

The soybean car, introduced to the public this day in 1941, was made from fourteen quarter-inch-thick plastic panels and plexiglass windows attached to a tubular steel frame, weighing in at 1,900 pounds, about a third lighter than comparable automobiles of the era. The finished prototype was exhibited later that year at the Dearborn Days festival and the Michigan State Fair Grounds.

The thing was built to run on fuel derived from industrial hemp, a strain related to the green leafy herb beloved of stoners the world over.

Ford claimed he’d be able to “grow automobiles from the soil”, a hedge against the metal rationing of World War Two. He dedicated 120,000 acres of soybeans to experimentation, but to no end. The total acreage devoted to “fuel” production went, somehow, unrecorded.

Another reason for a car made from soybeans was to help American farmers. In any case, Henry Ford had a “thing” for soybeans. He was one of the first in this country to regularly drink soy milk. At the 1934 World’s Fair in Chicago, Ford invited reporters to a feast where he served soybean cheese, soybean crackers, soy bread and butter, soy milk and soy ice cream. If he wasn’t the Bubba Gump of soybeans, perhaps Bubba Gump was the Henry Ford of shrimp.

Ford’s own car was fitted with a soybean trunk and struck with an axe to demonstrate the material’s durability, though the axe was later revealed to have a rubber boot.

Henry Ford’s soybean car

Henry Ford’s experiment in making cars from soybeans never got past that first prototype, and came to a halt during World War 2. The project was never revived, though several states adopted license plates stamped out of soybeans, a solution to the steel shortage that farm animals found to be quite delicious.

The car itself was destroyed long ago, the ingredients for its manufacture unrecorded, but the thing lives on in the hearts of hemp enthusiasts, everywhere.

The New York Times claimed the car body and fenders were made from soy beans, wheat and corn. Other sources opine that the car was made from Bakelite, or some variant of Duroplast, a plant-based auto body substance produced in the millions for the East German Trabant.

One newspaper claimed that nothing ever came from Henry Ford’s soybean experiments, save and except for whipped cream.

August 12, 1865 The Shoulders of Giants

Today, the idea that microorganisms such as fungi, viruses and other pathogens cause infectious disease is common knowledge, but such ideas were held in disdain among scientists and doctors, well into the 19th century.

In the 12th century, French philosopher Bernard of Chartres talked about the concept of “discovering truth by building on previous discoveries”. The idea is familiar to the reader of English as expressed by the mathematician and astronomer Isaac Newton, who observed that “If I have seen further it is by standing on the shoulders of Giants.”

Dr. Ignaz Semmelweis

Nowhere is there more truth to the old adage than in the world of medicine. In 1841, a child who survived to celebrate a fifth birthday could look forward to a life of some 55 years. Today, a five-year-old can expect to live to eighty-two, fully half again the earlier figure.

Yet, there are times when the giants who brought us here are unknown to us, as if they had never been. One such is Dr. Ignaz Semmelweis, one of the earliest pioneers of antiseptic medicine.

Semmelweis  studied law at the University of Vienna in the fall of 1837, but switched to medicine the following year. He received his MD in 1844 and, failing to gain a clinical appointment in internal medicine, decided to specialize in obstetrics.

In the second century AD, the Greek physician Galen of Pergamon described the “miasma” theory of illness, holding that infectious diseases such as cholera, chlamydia and the Black Death were caused by noxious clouds of “bad air”.  The theory is discredited today, but such ideas die hard.


The germ theory of disease was first proposed by Girolamo Fracastoro in 1546 and expanded by Marcus von Plenciz in 1762. Single-cell organisms – bacteria – were known to exist in human dental plaque as early as 1683, yet their functions were imperfectly understood. Today, the idea that microorganisms such as fungi, viruses and other pathogens cause infectious disease is common knowledge, but such ideas were held in disdain among scientists and doctors, well into the 19th century.


In the mid-19th century, birthing centers were set up all over Europe, for the care of poor and underprivileged mothers and their illegitimate infants. Care was provided free of charge, in exchange for which young mothers agreed to become training subjects for doctors and midwives.

In 1846, Semmelweis was appointed assistant to Professor Johann Klein in the First Obstetrical Clinic of the Vienna General Hospital, a position similar to that of a “chief resident” today.


At the time, Vienna General Hospital ran two such clinics, the First a “teaching hospital” for undergraduate medical students, the Second for student midwives.

Semmelweis quickly noticed that one woman in ten, and sometimes one in five, was dying in the First Clinic of the postpartum infection known as “childbed fever”, compared with fewer than 4% in the Second Clinic.

The difference was well known, even outside of the hospital. Expectant mothers were admitted on alternate days into the First or Second Clinic. Desperate women begged on their knees not to be admitted into the First, some preferring even to give birth in the streets, over delivery in that place. The disparity between the two clinics “made me so miserable”, Semmelweis said, “that life seemed worthless”.

He had to know why this was happening.

Puerperal Peritonitis 1912 MA

Childbed or “puerperal” fever was rare among these “street births”, and far more prevalent in the First Clinic than the Second. Semmelweis carefully eliminated every difference between the two, even including religious practices. In the end, the only difference was the people who worked there.

The breakthrough came in 1847, following the death of Semmelweis’ friend and colleague Dr. Jakob Kolletschka. Kolletschka was accidentally cut by a student’s scalpel during a post-mortem examination. The doctor’s own autopsy showed a pathology very similar to that of the women dying of childbed fever. Medical students were going from post-mortem examinations of the dead to obstetrical examinations of the living, without washing their hands.

Midwife students had no such contact with the dead. This had to be it. Some unknown “cadaverous material” had to be responsible for the difference.

Ignaz Philipp Semmelweis

Semmelweis instituted a mandatory handwashing policy, using a chlorinated lime solution between autopsies and live patient examinations.

Mortality rates in the First Clinic dropped by 90 percent, to rates comparable with the Second. In April 1847, First Clinic mortality rates were 18.3% – nearly one in five. Hand washing was instituted in mid-May, and June rates dropped to 2.2%.  July was 1.2%. For two months, the rate actually stood at zero.

The European medical establishment celebrated the doctor’s findings. Semmelweis was feted as the Savior of Mothers, a giant of modern medicine. 

No, just kidding.  He wasn’t.

The imbecility of the response to Semmelweis’ findings is hard to get your head around, and the doctor’s own personality didn’t help.  The medical establishment took offense at the idea that they themselves were the cause of the mortality problem, and that the answer lay in personal hygiene.

Yearly mortality rates at the two clinics, 1841–1846

Semmelweis himself was anything but tactful, publicly berating those who disagreed with his hypothesis and gaining powerful enemies. For many, the doctor’s ideas were extreme and offensive, to be ignored, rejected and even ridiculed.  Are we not Gentlemen!?  Semmelweis was fired from his hospital position and harassed by the Vienna medical establishment, finally forced to move to Budapest.

Dr. Semmelweis was outraged by the indifference of the medical community, and began to write open and increasingly angry letters to prominent European obstetricians.  He went so far as to denounce such people as “irresponsible murderers”, leading contemporaries, and even his wife, to question his mental stability.

Dr. Ignaz Philipp Semmelweis was committed to an insane asylum on July 31, 1865, twenty-three years before Dr. Louis Pasteur opened his institute for the study of microbiology.

Semmelweis bust, University of Tehran

Barely two weeks later, on August 12, 1865, British surgeon and scientist Dr. Joseph Lister performed the first antiseptic surgery in medical history. Dr. Semmelweis died the following day at the age of 47, the victim of a blood infection resulting from a gangrenous wound sustained in a severe beating by asylum guards.

August 2, 1864 Grandissima Ruina

In an age of hand-lit sputtering fuses and hand packed (to say nothing of hand-made) powder, even a millisecond difference in ignition will give one ball a head start, to be measured in feet.

In 1642, Italian gun maker Antonio Petrini conceived a double-barrel cannon, its tubes joined at 45° and firing solid shot linked together by a length of chain.  This was the year of the “Great Rebellion“, the English Civil War, when King and Parliament raised armies to go to war – with each other.  Petrini’s idea must have looked good to King Charles I of England. Imagine, a weapon capable of slicing through the ranks of his enemies like grass before a scythe.

The idea was to fire both barrels simultaneously, but there was the rub.  The imagination conjures wild images of imperfect combustion, and of a chained ball swinging around to take out its own gun crew.  The King himself was mute on the subject and went on to lose his head in 1649.  Petrini’s manuscript resides to this day in the Tower of London.  There is no documented evidence that the weapon was ever fired, save for the designer’s own description of the ‘Grandissima Ruina’ left behind by his own splendid creation.


Two hundred years later, the former British colonies across the Atlantic were themselves embroiled in civil war.

In the early days of the Confederacy, the Confederate Congress enacted a measure allowing local cities and towns to form semi-military companies for the purpose of local defense. As the very flower of young southern manhood was called up and sent to the front, these “home guard” units often comprised middle-aged and older gentlemen, and others unable, for various reasons, to leave home and hearth.

Augustus Longstreet Hull

Augustus Longstreet Hull was born in 1847 in “the Classic City” of Athens, Georgia, and enlisted in the Confederate Army on September 8, 1864.

After the war, Hull worked twenty-seven years as a banker before publishing the Annals of Athens in 1906.  In it, Mr. Hull writes, with not a little biting wit, of his own home town’s home guard unit: Athens’ own Mitchell Thunderbolts.

“From the name one might readily infer that it was a company made up of fierce and savage men, eager for the fray and ready at all times to ravage and slaughter; yet such was not the case, for in all their eventful career no harm was done to a human being, no property was seized and not one drop of blood stained their spotless escutcheon.”

Named for one of its own private soldiers, the Mitchell Thunderbolts were not your standard military company. These guys were “organized strictly for home defense” and absolutely refused to take orders.  From anyone. They recognized no superior officer, and the right to criticism was reserved and freely exercised by everyone from that “splendid old gentleman” Colonel John Billups down to the lowliest private.

Georgia Senator Middleton Pope Barrow

General Howell Cobb sent the future United States Senator Captain Middleton Pope Barrow to Athens in 1864, to inspect the Thunderbolts. Having no intention of submitting to “inspection” by any mere stripling of a Captain, Dr. Henry Hull (Augustus’ father) “politely informed him that if he wished to inspect him, he would find him on his front porch at his home every morning at 9 o’clock“.

John Gilleland, 53, was a local dentist, builder and mechanic, and a private soldier in good standing of the Mitchell Thunderbolts.  Gilleland must have liked Petrini’s idea, because he took up a collection in 1862 and raised $350 to build the Confederate States of America’s own double-barrel cannon.

Measuring 13 inches wide by 4 feet 8½ inches long and weighing in at some 1,300 pounds, this monstrosity had two barrels diverging at 3°, equipped with three touch holes: one for each barrel, and a third should anyone wish to fire the two together.  It was the secret “super weapon” of the age, two cannonballs connected by a chain and designed to “mow down the enemy somewhat as a scythe cuts wheat.”

Yeah. As Mr. Petrini could have told them, the insurmountable problem remained. In an age of hand-lit sputtering fuses and hand-packed (to say nothing of hand-made) powder, even a millisecond difference in ignition will give one ball a head start to be measured in feet. How to simultaneously fire two conjoined weapons remained a problem, even for so elite an outfit as the Mitchell Thunderbolts.
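
A rough sanity check bears the claim out. Assuming a muzzle velocity of around 1,400 feet per second, typical of smoothbore field guns of the era (the figure is not given in the original accounts, so treat this as an illustration), a one-millisecond mismatch in ignition gives one ball a head start of

$$ d = v\,\Delta t \approx 1400\ \tfrac{\text{ft}}{\text{s}} \times 0.001\ \text{s} \approx 1.4\ \text{ft}. $$

A few milliseconds of mismatch, easily possible with hand-lit fuses, puts one ball several feet ahead of its partner, more than enough to snap the chain or slew the pair wildly off course.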

The atmosphere was festive on April 22, 1862, when a crowd gathered to watch Gilleland test the Great Yankee Killer. The gun was aimed at two poles stuck in the ground, but uneven ignition and casting imperfections sent assorted spectators scrambling for cover as the two balls spun wildly off to the side, where they “plowed up about an acre of ground, tore up a cornfield, mowed down saplings, and then the chain broke, the two balls going in different directions“.

Double Barrel Cannon model, H/T ModelExpo

On the second test, two chain-connected balls shot through the air and into a stand of trees.   According to one witness, the “thicket of young pines at which it was aimed looked as if a narrow cyclone or a giant mowing machine had passed through“.

On the third firing, the chain snapped right out of the barrel.  One ball tore into a nearby log cabin and destroyed the chimney, while the other spun off and killed a cow who wasn’t bothering anyone.

Gilleland considered all three tests successful, even though the only ones truly safe that day were those two target posts.

The dentist went straight to the Confederate States’ arsenal in Augusta, where Colonel George Rains subjected his creation to extensive testing before reporting the thing too unreliable for military use. The outraged inventor wrote angry letters to Georgia Governor Joseph “Joe” Brown and to the Confederate government in Richmond, but to no avail.

At last, the contraption was stuck in front of the Athens town hall and used as a signal gun to warn citizens of approaching Yankees.


There the thing remained until August 2, 1864, when the gun was hauled out to the hills west of town to meet the Federal troops of Brigadier General George Stoneman.  The double-barrel cannon was positioned on a ridge near Barber’s Creek and loaded with canister shot, along with several conventional guns.  Outnumbered home guards did little real damage but the noise was horrendous, and Stoneman’s raiders withdrew to quieter pastures.

There were other skirmishes in the area, all of them minor. In the end, Athens escaped the devastation of Sherman’s march to the sea and the Confederate superweapon was moved back to town.

Gilleland’s monstrosity was sold after the war and lost, for a time.  The thing was recovered and restored back in 1891, and returned to the Athens City Hall where it remains to this day, a contributing property of the Downtown Athens Historic District.  Come and see it if you’re ever in Athens, right there at the corner of Hancock and College Avenue.  There you will find the thing, pointing north, at all those Damned Yankees.  You know. Just in case.


July 13, 1908 Apocalypse

In 2018, the non-profit B612 Foundation dedicated to the study of near-Earth object impacts, reported that “It’s a 100 per cent certain we’ll be hit [by a devastating asteroid]”. Comfortingly, the organization’s statement concluded “we’re [just] not 100 per cent sure when.”

The first atomic bomb in the history of human conflict exploded in the skies over Japan on August 6, 1945. The bomb, code-named “Little Boy”, detonated at an altitude of about 1,900 feet over the city of Hiroshima at 8:15 am, Japanese Standard Time.

“Little Boy” was a “gun-type” fission bomb: barometric-pressure sensors initiated the explosion of four cordite charges, propelling a small “bullet” of enriched uranium down the length of a fixed barrel and into a larger mass of the same material. In a tiny fraction of a second, the collision of the two bodies initiated a fission chain reaction, releasing an energy yield roughly equivalent to 15,000 tons of TNT.

Some 66,000 people were killed outright by the effects of the blast. The shock wave spread outward at a velocity greater than the speed of sound, flattening virtually everything in its path for a mile in all directions.


Thirty-seven years before, the boreal forests of Siberia lit up with an explosion 1,000 times greater than the atomic bomb dropped over Hiroshima. At the time, no one had the foggiest notion that it was coming.

The Taiga occupies the high latitudes of the world’s northern regions, a vast international beltline of coniferous forests consisting mostly of pines, spruces and larches between the high tundra, and the temperate forest.  An enormous community of plants and animals, this trans-continental ecosystem comprises a vast biome, second only to the world’s oceans.

The Eastern Taiga is a region in the east of Siberia, an area 1.6 times the size of the continental United States.  The Stony Tunguska River wends its way along an 1,160-mile length of the region, its entire course flowing under great pebble fields with no open water.

Tunguska

On the morning of June 30, 1908, the sky above the Stony Tunguska lit up with a bluish-white light.  At 7:17 am local time, a column of light too bright to look at with the naked eye moved across the skies above the Tunguska. Minutes later, a vast explosion knocked people off their feet, flattening buildings, crops and as many as 80 million trees over an area of some 830 square miles. A vast “thump” was heard, the shock wave equivalent to an earthquake measuring 5.0 on the Richter scale. Within minutes came a second and then a third shock wave, and finally a fourth, more distant this time and described by eyewitnesses as the “sun going to sleep”.

On July 13, 1908, the Krasnoyaretz newspaper reported “At 7:43 the noise akin to a strong wind was heard. Immediately afterward a horrific thump sounded, followed by an earthquake that literally shook the buildings as if they were hit by a large log or a heavy rock”.

Fluctuations in atmospheric pressure were detectable as far away as Great Britain.  Night skies were set aglow from Asia to Europe for days on end, theorized to have been caused by light passing through high-altitude ice particles.

In the United States, lookout posts from the Smithsonian Astrophysical Observatory headquartered in Cambridge, Massachusetts, to the Mount Wilson Observatory in Los Angeles recorded a several months-long decrease in atmospheric transparency, attributed to an increase in dust, suspended in the atmosphere.

The “Tunguska Event” was the largest such impact event in recorded history, but far from the first. Or the last.  Mistastin Lake in northern Labrador was formed by an impact during the Eocene epoch, some 36 million years ago, cubic zirconia deposits suggesting an impact-zone temperature of some 4,300° Fahrenheit.

That’s nearly halfway to the surface temperature of the sun.

“A bolide – a very bright meteor of an apparent magnitude of −14 or brighter” H/T Wikimedia

Some sixty-six million years ago, the “Chicxulub impactor” struck the Yucatan Peninsula of Mexico, unleashing a mega-tsunami 330 feet in height from Texas to Florida. Superheated steam, ash and vapor towered over the impact zone, as colossal shock waves triggered global earthquakes and volcanic eruptions.  Vast clouds of dust blotted out the sun for months on end, leading to mass extinction events the world over.

The official history of the Ming Dynasty records the Ch’ing-yang event of 1490, a meteor shower in China in which “stones fell like rain”. Some 10,000 people were killed, for all intents and purposes stoned to death.

In 2013, a twenty-meter (66-foot) space rock estimated at 13,000-14,000 tons flashed across the skies of Chelyabinsk, Russia, breaking apart with a kinetic impact estimated at 26 times the nuclear blast over Hiroshima.  This superbolide (a bolide is “an extremely bright meteor, especially one that explodes in the atmosphere”) entered the earth’s atmosphere on February 15, burning exposed skin and damaging retinas for miles around.  No fatalities were reported, though 1,500 people were injured seriously enough to require medical attention.
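
The “26 times Hiroshima” figure squares with a back-of-the-envelope kinetic energy estimate. Taking a mass of about 12,000 metric tons and an entry speed of roughly 19 kilometers per second (a commonly cited value, not stated above, so this is an illustration rather than a measurement):

$$ E = \tfrac{1}{2} m v^{2} \approx \tfrac{1}{2} \times 1.2\times 10^{7}\ \text{kg} \times \left(1.9\times 10^{4}\ \tfrac{\text{m}}{\text{s}}\right)^{2} \approx 2\times 10^{15}\ \text{J}. $$

At roughly 4.2 × 10¹² joules per kiloton of TNT, that is on the order of 500 kilotons, a few dozen Hiroshima-sized yields, the same order of magnitude as the figure quoted above.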

The 450-ton Chicora Meteor collided with western Pennsylvania on June 24, 1938, in a cataclysm comparable to the Halifax Explosion of 1917.  The good luck held that time, the object making impact in a sparsely populated region.  The only reported casualty was a cow.  Investigators F.W. Preston, E.P. Henderson and James R. Randolph remarked that “If it had landed on Pittsburgh there would have been few survivors”.

In 2018, the non-profit B612 Foundation dedicated to the study of near-Earth object impacts, reported that “It’s a 100 per cent certain we’ll be hit [by a devastating asteroid]”. Comfortingly, the organization’s statement concluded “we’re [just] not 100 per cent sure when.”


It puts a lot of things into perspective.

June 21, 1633 The Last Word

Revenge, it is said, is a dish best served cold.

From the time of antiquity, science took the “geocentric” view of the solar system. Earth exists at the center of celestial movement with the sun and planetary bodies revolving around our own little sphere.

The perspective was widely held but by no means unanimous.  In the third century BC the Greek astronomer and mathematician Aristarchus of Samos put the Sun at the center of the universe.  Later Greek astronomers Hipparchus and Ptolemy refined his observational methods, arriving at a fairly accurate estimate of the distance to the moon, but rejected the Sun-centered model, which remained a minority view.

Earth is at the center of this model of the universe created by Bartolomeu Velho, a Portuguese cartographer, in 1568. H/T: NASA/Bibliothèque Nationale, Paris

In the early 16th century, Polish mathematician and astronomer Nicolaus Copernicus parted ways with the orthodoxy of his time, describing a “heliocentric” model of the universe placing the sun at the center.  The Earth and other bodies, according to this model, revolved around the sun.

Copernicus wisely refrained from publishing such ideas until the end of his life, fearing to offend the religious sensibilities of the time. Legend has it that he was presented with an advance copy of his “De revolutionibus orbium coelestium” (On the Revolutions of the Heavenly Spheres) on awakening on his death bed, from a stroke-induced coma. He took one look at his book, closed his eyes and never opened them again.

Copernicus’ ‘heliocentric’ view of the universe.

The Italian physicist, mathematician and astronomer Galileo Galilei came along about a hundred years later. Called the “Father of Modern Observational Astronomy”, Galileo made improvements to the telescope, and his resulting astronomical observations supported the Copernican heliocentric view.

Bad news for Galileo, they also brought him to the attention of the Roman Inquisition.

Biblical references such as, “The Lord set the Earth on its Foundations; it can Never be Moved.” (Psalm 104:5) and “And the Sun Rises and Sets and Returns to its Place.” (Ecclesiastes 1:5) were taken at the time as literal and immutable fact and formed the basis for religious objection to the heliocentric model.

Galileo faces the Roman Inquisition

Galileo was brought before inquisitor Vincenzo Maculani for trial. The astronomer backpedaled before the Inquisition, but only to a point, testifying in his fourth deposition on June 21, 1633: “I do not hold this opinion of Copernicus, and I have not held it after being ordered by injunction to abandon it. For the rest, here I am in your hands; do as you please”.

There is a story about Galileo, which may or may not be true. Refusing to accept the validity of his own conviction, the astronomer muttered “Eppur si muove” — “And yet it moves”.

The Inquisition condemned the astronomer to “abjure, curse, & detest” his Copernican heliocentric views, returning him to house arrest at his villa in 1634, there to spend the rest of his life. Galileo Galilei, the Italian polymath who all but orchestrated the transition from late middle ages to  scientific Renaissance, died on January 8, 1642, desiring to be buried in the main body of the Basilica of Santa Croce, next to the tombs of his father and ancestors. 

His final wishes were ignored at the time, though not forever. Some ninety-five years later, Galileo was re-interred, according to his wishes, in the main body of the basilica.

Basilica of Santa Croce, in Florence

Often, atmospheric conditions in these burial vaults lead to natural mummification of the corpse. Sometimes, they look almost lifelike. When it came to the saints, believers took this to be proof of the incorruptibility of these individuals, and small body parts were taken as holy relics.

Such a custom seems ghoulish to us today, but the practice was quite old by the 18th century.  Galileo is not now and never was a saint of the Catholic church, quite the opposite.  The Inquisition had judged the man an enemy of the church, a heretic.

“A bust of Galileo at the Galileo Museum in Florence, Italy. The museum is displaying recovered parts of his body”. H/T New York Times

Even so, the condition of Galileo’s body may have made him appear thus “incorruptible”.  Be that as it may, one Anton Francesco Gori removed the thumb, index and middle fingers on March 12, 1737. The digits with which Galileo wrote down his theories of the cosmos. The digits with which he adjusted his telescope.

The other two fingers and a tooth disappeared in 1905, leaving the middle finger from Galileo’s right hand on exhibit at the Museo Galileo in Florence, Italy. 

Locked in a glass case, the finger points upward, toward the sky.

100 years later, two fingers and a tooth were purchased at auction, and have since rejoined their fellow digit at the Museo Galileo. To this day these are the only human body parts in a museum otherwise devoted to scientific instrumentation.

379 years after his death, Galileo’s extremity points upward, toward the glory of the cosmos.  Either that or the most famous middle finger on earth, flipping the bird in eternal defiance to those lesser specimens who once condemned him, for ideas ahead of his time.

May 24, 1883 First Across

For 11 years she studied higher mathematics, catenary curves, materials strength and the intricacies of cable construction, all while acting as the pivot point on the largest bridge construction project on the planet and nursemaid, to a desperately sick husband.

Focused as he was on surveying, the engineer should have paid more attention to his surroundings. The year was 1869. Civil engineer John Roebling had begun the site work two years earlier, almost to the day. Now just a few more compass readings, across the East River. Soon, work would begin on the longest steel suspension span in the world. A bridge connecting what are now the New York City boroughs of Brooklyn and Manhattan.

Roebling was working on the pier with his 32-year-old son Washington, also a civil engineer. As the ferry came alongside, the elder Roebling’s toes were caught and crushed so badly as to require amputation.

“Lockjaw” is such a sterile term, it doesn’t begin to describe the condition known as tetanus. In the early stages, the anaerobic bacterium Clostridium tetani produces tetanospasmin, a neurotoxin producing mild spasms in the jaw muscles. As the disease progresses, sudden and involuntary contractions affect skeletal muscle groups, becoming so powerful that bones are literally fractured as the muscles tear themselves apart. These were the last days of John Roebling, the bridge engineer who would not live to see his most famous work.

The German-born civil engineer was the first casualty of the project.  He would not be the last.

Brooklyn Bridge Caisson Construction

Washington took over the project, beginning construction on January 3, 1870.

Enormous yellow pine boxes called “caissons” were built on the Brooklyn and New York sides of the river, descending at the rate of 6 inches per week in search of bedrock. Like giant diving bells, the caissons ended up at 78 feet below mean high tide on the New York side and 44 feet on the Brooklyn side. Pressurized air was pumped into these caissons, keeping water and mud at bay as workers excavated the bottom.

In 1872, these “sandhogs” began to experience a strange illness that came to be called “caisson disease”.

Civil War era submarine designer Julius Hermann Kroehl may have recognized what was happening, but Kroehl was five years in his grave by this time, victim of the same “fever”.

Today we call it “the bends”. Pop the top off a soda bottle and you’ll see the principle at work. Without sufficient decompression time, dissolved gases come out of solution and the blood turns to foam. Bubbles form in or migrate to any part of the body, resulting in symptoms ranging from joint pain and skin rashes to paralysis and death.  The younger Roebling was badly injured by the bends in 1872, leaving him partially paralyzed and bedridden, incapable of supervising construction on-site.
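
A rough calculation shows why the deeper New York caisson was the more dangerous workplace. Taking water at about 0.43 psi per foot of depth (an illustrative figure, not from the project records), the air pressure needed to hold back the river at 78 feet below mean high tide was roughly

$$ P \approx 14.7\ \text{psi} + 78\ \text{ft} \times 0.43\ \tfrac{\text{psi}}{\text{ft}} \approx 48\ \text{psi absolute} \approx 3.3\ \text{atmospheres}. $$

By Henry’s law, blood and tissue at that pressure hold roughly three times the dissolved nitrogen they do at the surface; climb out of the airlock too quickly and the excess comes out of solution as bubbles, exactly the soda-bottle effect described above.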

The Brooklyn anchorage

Roebling moved to an apartment in Brooklyn Heights and conducted the entire project looking out the window, designing and redesigning details while his wife, Emily Warren Roebling, became the critical connection between her husband and the job site.

To aid in the work, Emily Roebling took a crash course in bridge engineering. For 11 years she studied higher mathematics, catenary curves, materials strength and the intricacies of cable construction, all while acting as the pivot point on the largest bridge construction project on the planet and nursemaid, to a desperately sick husband.

Emily Warren Roebling, the “first woman field engineer”.

Historian David McCullough wrote in his book, The Great Bridge: The Epic Story of the Building of the Brooklyn Bridge: “By and by it was common gossip that hers was the great mind behind the great work and that this, the most monumental engineering triumph of the age, was actually the doing of a woman, which as a general proposition was taken in some quarters to be both preposterous and calamitous. In truth, she had by then a thorough grasp of the engineering involved”.

Unlikely as it sounds, fires broke out at the bottom of the river on several occasions, started by workmen’s candles, fed by the oakum used for caulking and turbocharged by all that pressurized air. On at least one occasion, the caisson was filled with millions of gallons of water, before the fire went out for good.

Brooklyn bridge builders

A footbridge connected the two sides in 1877, and soon the wires began to be strung. Wooden “buggies” carried men back and forth along wires suspended hundreds of feet above the water, as individual wires were woven into the four great cables that support the bridge. The work was exacting, with each wire bound together to precise specifications. Rumors about corruption and sleaze surrounded the project when J. Lloyd Haigh, the wire contractor, was discovered to be supplying inferior material. It was way too late to do anything about it, and 150 extra wires were bundled into each cable to compensate. The tactic worked.  Haigh’s shoddy wire remains there, to this day.

At the time it was built, the span across the East River linking Brooklyn with Manhattan was the longest suspension bridge in the world.

Construction was completed in 1883, the bridge opening for use on May 24. Emily Roebling was the first to cross, in a carriage, carrying a rooster as a sign of victory. New York politician Abram Stevens Hewitt honored her at that day’s dedication. Today a bronze plaque bears the name of the first female field engineer.

“…an everlasting monument to the sacrificing devotion of a woman and of her capacity for that higher education from which she has been too long disbarred.”

New York politician Abram Stevens Hewitt

Six days later, a rumor started that the bridge was about to collapse.  At least 12 people were killed in the resulting stampede. A year later, a publicity stunt by P. T. Barnum helped to put people’s minds at ease when Jumbo, the circus’ prize elephant, led a parade of 20 other elephants across the bridge.

For a long time the span was called the “New York and Brooklyn Bridge” or the “East River Bridge”, officially becoming the “Brooklyn Bridge” only in 1915. At least 27 were killed in its construction. Three from the bends, several from cable stringing accidents and others crushed under granite blocks or killed in high falls.

Even today, popular culture abounds with stories of suckers “buying” the Brooklyn Bridge. It was the longest suspension bridge in the world for its time, and would remain so until 1903. Roebling had designed his project to be six times the strength required for the job. Even with those defective wires, the bridge is four times as strong as it needs to be. Many of the Brooklyn Bridge’s contemporary structures have long since gone.  John Augustus Roebling’s bridge carries 145,000 cars every day.

Brooklyn Bridge

May 17, 1781 Windows on their Souls

“A daguerreotype is a unique image — it isn’t a print, it isn’t a reproduction of any kind. When you have a camera set up to take a daguerreotype and the sitter is in front of you, for example, one of these old men who actually looked and knew and talked to leaders of the Revolution … the light is coming from the sun, hitting his face, and bouncing off of his face through the camera and onto that very same plate.”- Joseph Bauman

FOTR, Dr Eneas Munson

Imagine seeing the faces of the men who fought the American Revolution.  Not the paintings. There’s nothing extraordinary about that, except for the talent of the artist.  I mean their photographs – images that make it possible for you to look into their eyes. The windows, of their souls.

In a letter dated May 17, 1781 and addressed to Alexander Scammell, General George Washington outlined his intention to form a light infantry unit under Scammell’s leadership.

Dr. Eneas Munson

Composed of Continental Line units from Connecticut, Massachusetts and New Hampshire, the Milford, Massachusetts-born colonel’s unit was among the defensive forces keeping Sir Henry Clinton penned up in New York City as the Continental army made its way south to a place called Yorktown.

FOTR, Rev Levi Hayes

Among the men under Scammell’s command was Henry Dearborn, future Secretary of War under President Thomas Jefferson. A teenage medic was also present.  His name was Eneas Munson.

One day, the medic would go on to become Doctor Eneas Munson, professor of the Yale Medical School in New Haven, Connecticut, and President of the Medical Society of that same state.  And a man who would live well into the age of photography.

Reverend Levi Hayes

The American Revolution ended in 1783.  By the first full year of the Civil War, only 12 Revolutionary War veterans remained on the pension rolls of a grateful nation.

Two years later, Reverend E.B. Hillard brought two photographers through New York and New England to visit, and to photograph, what were believed to be the last six.  Each man was 100 years or older at the time of the interview.

FOTR, Peter Mackintosh

William Hutchings of York County Maine, still part of Massachusetts at the time, was captured at the siege of Castine at the age of fifteen.  British authorities said it was a shame to hold one so young a prisoner, and he was released.

Reverend Daniel Waldo of Syracuse, New York fought under General Israel Putnam, becoming a POW at Horse Neck.

Adam Link of Maryland enlisted at 16 in the frontier service.

Peter Mackintosh

Alexander Millener of Quebec was a drummer boy in George Washington’s Life Guard.

Clarendon, New York native Lemuel Cook would live to be one of the oldest surviving veterans of the Revolution, dying at the age of 107.  He and Alexander Millener witnessed the British surrender at Yorktown.

FOTR, Jonathan Smith

Samuel Downing from Newburyport, Massachusetts, enlisted at the age of 16 and served in the Mohawk Valley under General Benedict Arnold.  “Arnold was our fighting general”, he’d say, “and a bloody fellow he was. He didn’t care for nothing, he’d ride right in. It was ‘Come on, boys!’ ’twasn’t ‘Go, boys!’ He was as brave a man as ever lived…He was a stern looking man, but kind to his soldiers. They didn’t treat him right: he ought to have had Burgoyne’s sword. But he ought to have been true. We had true men then, twasn’t as it is now”.

Jonathan Smith

Hillard seems to have missed Daniel F. Bakeman, but with good reason.  Bakeman had been unable to prove his service with his New York regiment.  It wasn’t until 1867 that he finally received his veteran’s pension by special act of Congress.

FOTR, James Head

Daniel Frederick Bakeman would become the Frank Buckles of his generation, the last surviving veteran of the Revolution. The 1874 Commissioner of Pensions report said that “With the death of Daniel Bakeman…April 5, 1869, the last of the pensioned soldiers of the Revolution passed away.” He was 109.

Most historians agree on 1839 as the year in which the earliest daguerreotypes became practically accessible.

James W. Head

When Utah-based investigative reporter Joe Bauman came across Hillard’s photos in 1976, he believed there must be others.  Photography had been in existence for some 25 years by Reverend Hillard’s time.  What followed was 30 years’ work: first finding and identifying photographs of the right vintage, then digging through muster rolls, pension files, genealogical records and a score of other source documents to see whether each man had played a role in the Revolution.

FOTR, George Fishley

There were some, but it turned out to be a small group. 

Peter Mackintosh, for one, was a 16-year-old blacksmith’s apprentice from Boston.  He was working the night of December 16, 1773, when a group of men ran into the shop, scooping up ashes from the hearth and rubbing them on their faces.  Turns out they were going to a Tea Party.

George Fishley

James Head was a thirteen-year-old Continental Naval recruit from a remote part of what was then Massachusetts.  Head would be taken prisoner but later released, walking the 224 miles home from Providence to the future town of Warren, Maine.

Head was elected a Massachusetts delegate to the convention called in Boston, to ratify the Constitution.   He would die the wealthiest man in Warren, stone deaf from service in the Continental Navy.

FOTR, Simeon Hicks

George Fishley served in the Continental army and fought in the Battle of Monmouth, and in General John Sullivan’s campaign against British-allied Indians in New York and Pennsylvania.

Fishley would spend the rest of his days in Portsmouth, New Hampshire, where he was known as “the last of our cocked hats.”

Simeon Hicks

Daniel Spencer fought with the 2nd Continental Light Dragoons, an elite 120-man unit also known as Sheldon’s Horse after Colonel Elisha Sheldon.  First mustered at Wethersfield, Connecticut, the regiment consisted of four troops from Connecticut, one troop each from Massachusetts and New Jersey, and two companies of light infantry. On August 13, 1777, Sheldon’s Horse put a unit of Loyalists to flight in the little-known Battle of the Flocky, the first cavalry charge ever performed on American soil.

FOTR, Daniel Spencer

Bauman’s research uncovered another eight in addition to Hillard’s record: a shoemaker, two ministers, a tavern-keeper, a settler on the Ohio frontier, a blacksmith and the captain of a coastal vessel, along with Dr. Munson.

The experiences of these eight span the distance from the Boston Tea Party to the battles at Monmouth, Quaker Hill, Charleston and Bennington.  Their eyes looked upon the likes of George Washington, Alexander Hamilton and Henry Knox, the battles of the Revolution and the final surrender at Yorktown.

Daniel Spencer

Bauman collected the glass plate photos of eight and paper prints of another five, along with each man’s story, and published them in an ebook entitled “DON’T TREAD ON ME: Photographs and Life Stories of American Revolutionaries”.

To look into the eyes of such men is to compress time. To reach back over the generations before the age of photography, and look into eyes that saw the birth of a nation.

April 14, 1958, Pupnik

The day before the launch sequence, Vladimir Yazdovsky took the small dog home to play with his kids.  “I wanted to do something nice for her,” he explained. “She had so little time left to live.”


At the dawn of the space age, no one knew whether the human body could survive the conditions of rocket launch and space flight. The US space program experimented with a variety of primate species between 1948 and 1961, including rhesus monkeys, crab-eating macaques, squirrel monkeys, pig-tailed macaques, and chimpanzees.

“Miss Baker”

On May 28, 1959, a squirrel monkey named “Miss Baker” became the first animal in the US space program to survive the stresses of spaceflight and the related medical procedures.  A rhesus monkey called “Miss Able” survived the mission as well, but died four days later as the result of a reaction to anesthesia.

Soviet engineers experimented with dogs on a number of orbital and sub-orbital flights to determine the feasibility of human space flight.  The Soviet Union launched missions with positions for at least 57 dogs in the fifties and early sixties, though the number of individual dogs was smaller, since some flew more than once.

Laika

Most survived.  As with the early US program, those that did not often died as the result of equipment malfunction.  The first animal to be sent into orbit was a different story.

Three dogs were plucked from the streets of Moscow and trained for the purpose.  “Laika” was an 11-pound mutt, possibly a terrier-husky cross.  In Russian, the word means “Barker”.  Laika was chosen due to her small size and calm disposition.  One scientist wrote, “Laika was quiet and charming.”

First came the long periods of close confinement, meant to replicate the tiny cabin of Sputnik 2. Then came the centrifuge, the highly nutritious but thoroughly unappetizing gel she was meant to eat in space, and then the probes and electrodes that monitored her vital signs.

Sputnik 2, Pre-Launch Propaganda

The day before the launch sequence, Vladimir Yazdovsky took her home to play with his kids.  “I wanted to do something nice for her,” he explained. “She had so little time left to live.”

Laika and capsule

Laika was placed inside the capsule for three days, tightly harnessed in a way that only allowed her to stand, sit and lie down.  Finally, it was November 3, 1957.  Launch day.  One of the technicians “kissed her nose and wished her bon voyage, knowing that she would not survive the flight”.

Sensors showed her heart rate to be 103 beats per minute at the time of launch, spiking to 240 during acceleration. She ate some of her food in the early stages, but remained stressed and agitated. The thermal control system malfunctioned shortly into the flight, and the temperature inside the capsule rose to 104° Fahrenheit.  Five to seven hours into the flight, there were no further signs of life.

There were official hints about Laika parachuting safely to earth, and then tales of a painless and humane euthanasia.  Soviet propaganda portrayed “the first traveler in the cosmos” in heroic images printed on posters, stamps and matchbook covers.   Soviet authorities concealed Laika’s true cause of death, and how long it took her to die.  That information would not be divulged until 2002.


In the beginning, the US news media focused on the politics of the launch.  It was all about the “Space Race”, and the Soviet Union running up the score. First had come the unoccupied Sputnik 1; now Sputnik 2 had put the first living creature into space.  The more smartass specimens among the American media called the launch “Muttnik”.

Sputnik 2 became controversial as animal lovers began to question the ethics of sending a dog to certain death in space. In the UK, the Royal Society for the Prevention of Cruelty to Animals received protests before Radio Moscow had even finished its launch broadcast.  The National Canine Defence League called on dog owners to observe a minute’s silence.


Protesters gathered with their dogs in front of the UN building to express their outrage.  In the Soviet Union, political dissent was squelched, as always. Of all the Soviet bloc nations, it was probably Poland that went farthest out on that limb, when the scientific periodical Kto, Kiedy, Dlaczego (“Who, When, Why”) reported Laika’s death as “regrettable” and “undoubtedly a great loss for science”.

Sputnik 2 and its passenger left the vacuum of space on April 14, 1958, burning up on re-entry into the atmosphere.

It was not until 1998, after the collapse of the Soviet tower of lies, that Oleg Gazenko, one of the scientists who had trained the dog, was free to speak his mind. “Work with animals is a source of suffering to all of us,” he said. “We treat them like babies who cannot speak. The more time passes, the more I’m sorry about it.  We shouldn’t have done it…We did not learn enough from this mission to justify the death of the dog.”

AFTERWORD


As a lifelong dog lover, I feel the need to add a more upbeat postscript to this thoroughly depressing tale.

“Belka” and “Strelka” spent a day in space aboard Sputnik 5 on August 19, 1960 and returned safely to Earth, the first Earth-born creatures to go into orbit and return alive.

Charlie (l) and Pushinka (r)

Strelka later gave birth to six puppies fathered by “Pushok”, a dog who’d participated in ground-based space experiments, but never flew.  In 1961, Nikita Khrushchev gave one of them, a puppy called “Pushinka,” to President John F. Kennedy.

Pushinka and a Kennedy dog named “Charlie” conducted their own Cold War rapprochement, resulting in four puppies. JFK called them his “pupniks”. Rumor has it their descendants are still around to this day.

Pushinka and her “pupniks”, enjoying a moment on the White House lawn

Tip of the hat to the 2019 Vienna Film Award-winning “Space Dogs” for the artwork at the top of this page.