In the seventeenth century, conventional science held the “geocentric” view of the solar system: that our Earth sits at the center of celestial movement, with the sun and planetary bodies revolving around our little sphere.
The perspective was widely held but by no means unanimous. In the third century BC the Greek astronomer and mathematician Aristarchus of Samos put the Sun at the center of the universe. Later Greek astronomers Hipparchus and Ptolemy rejected his heliocentrism, even as they refined his methods to arrive at a fairly accurate estimate for the distance to the moon. Aristarchus’ view remained a minority one.
In the 16th century, the Polish mathematician and astronomer Nicolaus Copernicus parted ways with the orthodoxy of his time, describing a “heliocentric” model of the universe that placed the sun at the center. The Earth and other bodies, according to this model, revolved around the sun.
Copernicus wisely refrained from publishing such ideas until the end of his life, fearing to offend the religious sensibilities of the time. Legend has it that he was presented with an advance copy of his “De revolutionibus orbium coelestium” (On the Revolutions of the Heavenly Spheres) on his deathbed, on awakening from a stroke-induced coma. Copernicus took one look at his book, closed his eyes and never opened them again.
The Italian physicist, mathematician, and astronomer Galileo Galilei came along about a hundred years later. The “Father of Modern Observational Astronomy” made improvements to the telescope, and his resulting astronomical observations supported the Copernican heliocentric view.
Bad news for Galileo: they also brought him to the attention of the Roman Inquisition.
Biblical references such as “The Lord set the earth on its foundations; it can never be moved” (Psalm 104:5) and “And the sun rises and sets and returns to its place” (Ecclesiastes 1:5) were taken at the time as literal and immutable fact, and formed the basis for religious objection to the heliocentric model.
Galileo was brought before inquisitor Vincenzo Maculani for trial. The astronomer backpedaled before the Inquisition, but only to a point, testifying in his fourth deposition on June 21, 1633: “I do not hold this opinion of Copernicus, and I have not held it after being ordered by injunction to abandon it. For the rest, here I am in your hands; do as you please”.
There is a story about Galileo, which may or may not be true. Refusing to accept criticism of his deeply held conviction, the astronomer muttered “Eppur si muove” (“And yet it moves”).
The Inquisition condemned the astronomer to “abjure, curse, & detest” his Copernican heliocentric views, returning him to house arrest at his villa in 1634, there to spend the rest of his life. Galileo Galilei, the Italian polymath who all but orchestrated the transition from the late Middle Ages to the scientific Renaissance, died on January 8, 1642, desiring to be buried in the main body of the Basilica of Santa Croce, next to the tombs of his father and ancestors.
Galileo’s desires were ignored at the time but, 95 years later, he was re-interred in the basilica, according to his wishes.
Often, atmospheric conditions in these burial vaults lead to natural mummification of a corpse. Sometimes, the dead look almost lifelike. When it came to saints, believers took this to be proof of the incorruptibility of these individuals, and small body parts were taken as holy relics.
Such a custom seems ghoulish today, but the practice goes back to antiquity. Galileo is not now and never was a saint of the Catholic Church; quite the opposite. The Inquisition had judged the man to be an enemy of the church, a heretic.
Even so, the condition of Galileo’s body may have made him appear thus “incorruptible”. Be that as it may, one Anton Francesco Gori removed the thumb, index and middle fingers on March 12, 1737: the digits with which Galileo wrote down his theories of the cosmos, the digits with which he would have adjusted his telescope.
Two fingers and a tooth disappeared for a time, later purchased at auction and rejoining their fellow digit at the Museo Galileo. To this day these are the only human body parts in a museum otherwise devoted to scientific instrumentation.
380 years after the man’s death, Galileo’s extremity points upward toward the glory of the cosmos. It is the most famous middle finger on earth, flipping the bird in eternal defiance to those lesser specimens who once condemned a man for ideas ahead of his time.
The first night game in history occurred on September 2, 1880, when teams from the RH White and Jordan Marsh department stores played to a 16-all tie. Organized baseball would be slow to accept the arc light.
In 18th century London, going out at night was a bad idea. Not without a lantern in one hand and a club in the other.
The city introduced its first gas-lit street in 1807 on Pall Mall in central London, between St. James’s Street and Trafalgar Square. Before long, hundreds of “Lamp Lighters” could be seen with their ladders, gas lights bathing the city in a soft, green glow.
The Westminster Review opined that gas lamps had done more to eliminate immorality and criminality on the streets than any number of church sermons.
The United States followed nine years later, when the city of Baltimore lit up in 1816.
Thomas Edison patented the first carbon-thread incandescent lamp in 1879. The first baseball game played “under the lights” took place the following year near Nantasket Beach, in the ‘south shore’ town of Hull, Massachusetts.
It was September 2, 1880, when two teams sponsored by the RH White & Co. and Jordan Marsh department stores of Boston played a full nine innings to a 16-all tie. The era of the night game had arrived. The lamp lighters of London are still around to this very day, albeit fewer in number.
Except, no, it didn’t work out that way. The lamp lighter part is true enough. Today, five gas engineers keep the Victorian era alive, winding and checking the mechanisms, polishing the glass and replacing the mantles of some 2,000 gas lamps.
Across the pond though, organized baseball took another fifty years to give the arc light another try.
Evidence exists of other 19th-century night games, but these were little more than novelties. Holyoke, Massachusetts inventor George F. Cahill, creator of the pitching machine, devised a portable lighting system in 1909. With the blessing of Garry Herrmann, President of the Cincinnati Reds, Cahill staged an exhibition game on the night of June 19, between the Elk Lodges of Cincinnati and Newport, Kentucky.
The crowd of 3,000 had little trouble following the ball and Cahill was an enthusiastic salesman for his invention, but the man was doomed to frustration and disappointment. Night-time exhibition games were regularly met with great enthusiasm, yet organized baseball was slow in catching on.
The Class B New England League played a night exhibition game on June 24, 1927 before a crowd of 5,000, sponsored by the General Electric Employees’ Athletic Association. The Washington Senators were in town at that time to play the Boston Red Sox. Delegations from both clubs were on hand to watch Lynn defeat Salem in a seven-inning game, 7-2.
Washington manager Bucky Harris and Boston manager Bill Carrigan were impressed. Senators’ star outfielder Goose Goslin expressed a desire to play a night game. Claude Johnson, President of the New England League, predicted that all leagues would have night baseball within five years, including the majors.
As the Great Depression descended across the land, minor league clubs folded by the bushel basket. Small town owners were desperate to innovate. The first-ever night game in professional baseball was played on May 2, 1930, when Des Moines, Iowa hosted Wichita for a Western League game.
The game drew 12,000 spectators at a time when Des Moines was averaging just 600 per game. Soon, minor league owners were finding night games a key to staying in business.
Even then, the Poobahs of Major League Baseball were slow to catch on. Five years later, the Cincinnati Reds defeated the Philadelphia Phillies in the first-ever big league game played under the lights.
A crowd of 25,000 spectators waited on this day in 1935, as President Roosevelt symbolically turned on the lights from Washington DC. The Reds played a night game that year against every National League opponent and, despite a losing record of 68-85, enjoyed an increase of 117% in paid attendance.
Throughout the 1930s and ’40s, teams upgraded facilities to include lights and, before long, most of Major League Baseball had night games on the schedule. Wrigley Field, home of the Chicago Cubs and the second-oldest MLB stadium after Fenway Park, was the last to begin hosting night games. To this day, the Cubbies remain the only major league team to host the majority of its games during the day.
The first officially recorded night game at Wrigley Field ended in a 6-4 win over the New York Mets on August 9, 1988, the lights having debuted the night before in a game against Philadelphia that was rained out.
To look into the eyes of the men who fought the Revolution is to compress time, to reach back before the age of photography, and look into the eyes of men who saw the birth of a nation.
Imagine for a moment, being able to see the faces of the American Revolution.
Not the paintings. Those are nothing out of the ordinary, save for the talent of the artist. I mean their photographs. Images that make it possible for you to look into their eyes.
In a letter dated May 17, 1781 and addressed to Alexander Scammell, General George Washington outlined his intention to form a light infantry unit, under Scammell’s leadership.
Composed of Continental Line units from Connecticut, Massachusetts and New Hampshire, the Milford, Massachusetts-born colonel’s unit was among the defensive forces which kept Sir Henry Clinton penned up in New York City, as much of the Continental army made its way south toward a place called Yorktown.
Among the men under Scammell’s command was Henry Dearborn, future Secretary of War under President Thomas Jefferson. A teenage medic was also present. His name was Eneas Munson.
One day that medic would become Doctor Eneas Munson, professor at the Yale Medical School in New Haven, Connecticut and President of the Medical Society of that state.
And a man who would live well into the age of photography.
The American Revolution ended in 1783. By the first full year of the Civil War, only 12 veterans of the Revolution remained on the pension rolls of a grateful nation.
Two years later, Reverend E.B. Hillard brought two photographers through New York and New England to visit, and to photograph, what were believed to be the last six. Each was 100 years or older at the time of the interview.
William Hutchings of York County, Maine (still part of Massachusetts at the time) was captured at the siege of Castine at the age of fifteen. British authorities said it was a shame to hold one so young a prisoner, and he was released.
Reverend Daniel Waldo of Syracuse, New York, fought under General Israel Putnam, becoming a POW at Horse Neck.
Adam Link of Maryland enlisted at 16 in the frontier service.
Alexander Millener of Quebec was a drummer boy in George Washington’s Life Guard.
Clarendon, New York native Lemuel Cook would live to be one of the oldest veterans of the Revolution, surviving to the age of 107. He and Alexander Millener witnessed the British surrender at Yorktown.
Samuel Downing from Newburyport, Massachusetts, enlisted at the age of 16 and served in the Mohawk Valley under General Benedict Arnold. “Arnold was our fighting general”, he’d say, “and a bloody fellow he was. He didn’t care for nothing, he’d ride right in. It was ‘Come on, boys!’ ’twasn’t ‘Go, boys!’ He was as brave a man as ever lived…He was a stern looking man, but kind to his soldiers. They didn’t treat him right: he ought to have had Burgoyne’s sword. But he ought to have been true. We had true men then, twasn’t as it is now”.
Hillard seems to have missed Daniel F. Bakeman, but with good reason. Bakeman had been unable to prove his service with his New York regiment. It wasn’t until 1867 that he finally received his veteran’s pension by special act of Congress.
Daniel Frederick Bakeman would become the Frank Buckles of his generation, the last surviving veteran of the Revolution. The 1874 Commissioner of Pensions report said that “With the death of Daniel Bakeman…April 5, 1869, the last of the pensioned soldiers of the Revolution passed away.” He was 109.
Most historians agree on 1839 as the year in which the earliest daguerreotypes were practically possible.
When Utah-based investigative reporter Joe Bauman came across Hillard’s photos in 1976, he believed there must be others.
Photography had been in existence for 25 years by Reverend Hillard’s time. What followed was 30 years’ work, first finding and identifying photographs of the right vintage and then digging through muster rolls, pension files, genealogical records and a score of other source documents, to see if each man was involved in the Revolution.
There were some, but it turned out to be a small group. Peter Mackintosh, for one, was a 16-year-old blacksmith’s apprentice from Boston. He was working the night of December 16, 1773, when a group of men ran into the shop, scooping up ashes from the hearth and rubbing them on their faces. It turns out they were going to a Tea Party.
James Head was a thirteen-year-old Continental Naval recruit from a remote part of what was then Massachusetts. Head would be taken prisoner but later released. He walked 224 miles from Providence to his home in what would one day be Warren, Maine.
Head was elected a Massachusetts delegate to the convention called in Boston, to ratify the Constitution. He would die the wealthiest man in Warren, stone deaf from his service in the Continental Navy.
George Fishley served in the Continental army and fought in the Battle of Monmouth, and in General John Sullivan’s campaign against British-allied Indians in New York and Pennsylvania.
Fishley would spend the rest of his days in Portsmouth, New Hampshire, where he was known as “the last of our cocked hats.”
Daniel Spencer fought with the 2nd Continental Light Dragoons, an elite 120-man unit also known as Sheldon’s Horse after Colonel Elisha Sheldon. First mustered at Wethersfield, Connecticut, the regiment consisted of four troops from Connecticut, one troop each from Massachusetts and New Jersey, and two companies of light infantry. On August 13, 1777, Sheldon’s Horse put a unit of Loyalists to flight in the little-known Battle of the Flockey, the first cavalry charge in history performed on American soil.
Bauman’s research uncovered another eight in addition to Hillard’s record, including a shoemaker, two ministers, a tavern-keeper, a settler on the Ohio frontier, a blacksmith and the captain of a coastal vessel, in addition to Dr. Munson.
The experiences of these eight span the distance from the Boston Tea Party to the battles at Monmouth, Quaker Hill, Charleston and Bennington. Their eyes saw the likes of George Washington, Alexander Hamilton & Henry Knox, the battles of the Revolution and the final surrender, at Yorktown.
Bauman collected the glass plate photos of eight and paper prints of another five along with each man’s story, and published them in an ebook entitled “DON’T TREAD ON ME: Photographs and Life Stories of American Revolutionaries”.
To look into the eyes of such men is to compress time, to reach back before the age of photography, and look into the eyes of men who saw the birth of a nation.
With live television pictures from space impossible, TV commentators used models and illustrations to describe the unfolding drama. On board Odyssey, power was so low that voice-only transmissions became difficult. Visual communications with Mission Control were as impossible as the idea that the stranded astronauts could get out and walk home.
The seventh manned mission in the Apollo space program was scheduled to be the third moon landing, launching at 13:13 Central Standard Time from the Kennedy Space Center in Florida.
Jack Swigert was the backup pilot for the Command Module (CM), officially joining the Apollo 13 mission only 48 hours before launch, when prime crew member Ken Mattingly was grounded following exposure to German measles.
Jim Lovell was the most seasoned astronaut in the world at that time, a veteran of two Gemini missions and Apollo 8. By launch day, April 11, 1970, Lovell had racked up 572 space flight hours. For Fred Haise, backup crew member on Apollo 8 and 11, this would be his first spaceflight.
Two separate vessels were joined to form the Apollo spacecraft, separated by an airtight hatch. The crew lived in a Command/Service Module called “Odyssey”. The Lunar Module (LM), dubbed “Aquarius”, would perform the actual moon landing.
56 hours into the mission and 5½ hours from the Moon’s sphere of gravitational influence, the Apollo crew members had just finished a live TV broadcast. Haise was powering the LM down while Lovell stowed the TV camera. Mission Control asked Swigert to activate stirring fans in the Service Module’s hydrogen and oxygen tanks. Two minutes later, the astronauts heard a “loud bang”.
Manufacturing and testing of the vessel had both missed an exposed wire in an oxygen tank. Swigert had flipped the switch for a routine procedure, causing a spark to set the oxygen tank on fire. Alarm lights lit up all over Odyssey and in Mission Control. The entire spacecraft shuddered as one oxygen tank tore itself apart and damaged another. Power began to fluctuate. Attitude control thrusters fired, and communications temporarily went dark.
The crew could not have known at the time. The entire Sector 4 panel had just blown off.
The movie takes creative license with Commander James Lovell saying “Houston, we have a problem”. On board the real Apollo 13 it was Jack Swigert who spoke: “Houston, we’ve had a problem”.
205,000 miles into deep space with life support systems shutting down, the Lunar Module became the only means of survival. There was no telling if the explosion had damaged Odyssey’s heat shields. It didn’t matter. For now, the challenge was to remain alive. Haise and Lovell frantically worked to boot up Aquarius, while Swigert shut down systems aboard Odyssey. Power needed to be preserved for splashdown.
The situation had been suggested during an earlier training simulation, but considered unlikely. As it happened, the accident would have been fatal without access to the Lunar Module.
Fifteen years before Angus “Mac” MacGyver hit your television screen, mission control teams, spacecraft manufacturers and the crew itself worked around the clock to “MacGyver” life support, navigational and propulsion systems. For four days and nights, the three-man crew lived aboard the cramped, freezing Aquarius, a lunar module intended to support a crew of two for only a day and a half.
With temperatures plummeting to near freezing, food inedible, and an acute shortage of water, this tiny, claustrophobic “lifeboat” would have to do what it was never intended to do.
Atmospheric re-entry alone presented near-insurmountable challenges. The earth’s atmosphere is a dense fluid medium. If you reenter at too steep an angle, you may as well be jumping off a high bridge. As it is, the human frame can withstand deceleration forces no higher than 12 Gs, equivalent to 12 individuals identical to yourself piled on top of you. Even at that, you’re only going to survive a few minutes at best.
We all know what it is to skip a stone off the surface of a pond. If you hit the atmosphere at too shallow an angle, the result is identical to that stone. There is no coming down a second time. You get one bounce and then there is nothing but the black void of space.
The world held its breath, it seemed, for seventy-eight hours, waiting for the latest update from newspaper and television news. With live television pictures from space impossible, TV commentators used models and illustrations to describe the unfolding drama. On board Odyssey, power was so low that voice-only transmissions became difficult. Visual communications with Mission Control were as impossible as the idea that the stranded astronauts could get out and walk home.
As Odyssey neared earth, engineers and crew jury-rigged a means of jettisoning the spent Service Module, to create enough separation for safe re-entry.
One last problem to be solved was the crew’s final transfer from Lunar Module back to Command Module, prior to re-entry. With the “reaction control system” dead, University of Toronto engineers had only slide rules and six hours in which to devise a way to “blow” the LM, by pressurizing the tunnel connecting it with the CM. Too much pressure might damage the hatch and seal. Too little wouldn’t provide enough separation between the two bodies. Either failure would result in one of those “shooting stars” you see at night, as the searing heat of re-entry incinerated the Command Module and everything in it.
By this time, the Command Module had been in “cold soak” for days. No one knew for certain if the thing would come back to life.
Crashing into the atmosphere at over 24,000 mph, the capsule had 14 minutes in which to come to a full stop, splashing down in the waters of the Pacific Ocean. External temperatures on the Command Module reached 2,691° Fahrenheit, as the kinetic energy of re-entry converted to heat.
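The scale of those numbers is easy to sanity-check with a little arithmetic. A minimal back-of-the-envelope sketch, assuming a round ~5,500 kg for the Command Module’s mass (a figure not given in this account):

```python
# Back-of-the-envelope re-entry arithmetic. The ~5,500 kg Command Module
# mass is an assumed round figure, not taken from the account above.
MPH_TO_MS = 0.44704                 # miles per hour -> metres per second
G = 9.81                            # standard gravity, m/s^2

mass_kg = 5_500                     # assumed Command Module mass
v_ms = 24_000 * MPH_TO_MS           # re-entry speed from the text, ~10,700 m/s

# Kinetic energy that must be converted to heat during re-entry
kinetic_energy_j = 0.5 * mass_kg * v_ms ** 2

# Time to stop at a sustained 12 G deceleration, the stated human limit
t_stop_s = v_ms / (12 * G)

print(f"kinetic energy: ~{kinetic_energy_j / 1e9:.0f} GJ")
print(f"stop time at 12 G: ~{t_stop_s:.0f} seconds")
```

Under those assumptions the capsule sheds on the order of 300 gigajoules, roughly 75 tons of TNT equivalent, and even a continuous 12 G stop would take only about a minute and a half, comfortably inside the 14-minute window.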
The Apollo 13 mission ended safely with splashdown southeast of American Samoa on April 17, 1970, at 18:07:41 local time. Exhausted and hungry, the entire crew had lost weight. Haise had developed a kidney infection. Total duration was 142 hours, 54 minutes and 41 seconds.
Kirk was killed in 2293 on the Enterprise (B), after the ship was struck by a Nexus energy ribbon on its maiden voyage. Only he didn’t die, because Jean-Luc Picard found him alive in the timeless Nexus, negotiating hotel deals for Priceline.com. Or something like that.
On March 22, 2228, a boy will be born to George and Winona Kirk. He would go on to become the youngest captain in Starfleet history but, before he could boldly go where no man has gone before, he had to have a name.
The real-world former World War 2 bomber pilot and veteran of 89 combat missions Gene Roddenberry had 16 suggestions for a name, among these “Hannibal”, “Timber”, “Flagg”, and “Raintree”. The television screenwriter and producer decided on James T. Kirk, based on a journal entry from the 18th century British explorer Captain James Cook, who wrote “ambition leads me … farther than any other man has been before me“.
Kirk was killed in 2293 on the Enterprise (B), after the ship was struck by a Nexus energy ribbon on its maiden voyage. Only he didn’t die, because Jean-Luc Picard found him alive in the timeless Nexus, negotiating hotel deals for Priceline.com. Or something like that.
In his 1968 book “The Making of Star Trek“, Roddenberry writes that James Kirk was born in a small town in Iowa. Full-time “Trekkie” and part-time Riverside, Iowa Councilman Steve Miller thought, “Why not Riverside?” In 1985, Miller moved that Riverside declare itself the Future Birthplace of James T. Kirk. The motion passed, unanimously. Miller poked a stick in the ground behind the barber shop (good thing he owned the property), declaring that this was the place. An engraved monument was erected, and so it was.
Riverside, Iowa, population 963, became the “Future Birthplace of Captain James T. Kirk”. A bench was later added, along with a Shuttlecraft-shaped donation box.
Riverside’s official slogan was changed from “Where the best begins” to “Where the Trek begins,” and the annual “River Fest” summer festival became “Trek Fest”.
Fun fact: turning on the television today, it’s hard to remember how ground-breaking it was for a black, female character to play so prominent a role on a prime-time series as Communications Officer Nyota Uhura, played by the actress Nichelle Nichols. The real-life Nichols preferred the stage to TV and submitted her resignation to pursue a career on Broadway. Gene Roddenberry asked her to take the weekend to reconsider, which she did. That weekend, Nichols attended a banquet put on by the NAACP, where she was informed a ‘fan’ wanted to meet her. Let her tell the story from here:
I thought it was a Trekkie, and so I said, ‘Sure.’ I looked across the room and whoever the fan was had to wait because there was Dr. Martin Luther King walking towards me with this big grin on his face. He reached out to me and said, ‘Yes, Ms. Nichols, I am your greatest fan.’ He said that Star Trek was the only show that he, and his wife Coretta, would allow their three little children to stay up and watch. [She told King about her plans to leave the series because she wanted to take a role that was tied to Broadway.] I never got to tell him why, because he said, ‘you cannot, you cannot…for the first time on television, we will be seen as we should be seen every day, as intelligent, quality, beautiful people who can sing, dance, and can go to space, who are professors, lawyers. If you leave, that door can be closed, because your role is not a black role, and is not a female role; he can fill it with anybody, even an alien.’
The conversation with Reverend King was life-changing. Nichols returned to the series. When it was over she volunteered with NASA, working to promote the space agency and helping to recruit female and minority astronaut candidates between 1977 and 2015. The program recruited Dr. Sally Ride and United States Air Force Colonel Guion Bluford, respectively the first American woman and the first American astronaut of African ancestry to fly in space. The program also recruited Dr. Judith Resnik and Dr. Ronald McNair, both of whom flew successful Space Shuttle missions before their deaths in the Space Shuttle Challenger disaster of January 28, 1986.
Star Trek fans, ever-jealous protectors of series trivia, sometimes wonder why the March 22, 2228 date on the Riverside monument differs from the March 22, 2233 date usually cited as Kirk’s future birthday. The 2233 date didn’t come around until eight years after the monument, with the publication of The Star Trek Chronology: The History of the Future. 2228 or 2233, you may take your pick, but both agree on March 22, which just happens to be the real-life William Shatner’s birthday.
In case you ever wondered what the “T” stands for – it’s “Tiberius”.
The Space Foundation of Colorado Springs bills itself as “the world’s premier organization to inspire, educate, connect, and advocate on behalf of the global space community“.
In 2010, a survey conducted by the organization found that James Tiberius Kirk tied for #6 as the “most inspirational space hero of all time“, along with Russian Cosmonaut Yuri Gagarin. You can’t make this stuff up. Tied for 6th place, with the first human in space. A guy who went there, and then came back. A guy who…you know…actually…exists.
What began as a publicity stunt quickly became an overwhelming media event. 200 newspaper reporters from all over the country arrived in Dayton. Two came all the way from London. Twenty-two telegraphers sent out 165,000 words a day over thousands of miles of telegraph wires, specifically hung for the purpose.
On January 28, 1925, a measure prohibiting the teaching of evolution or denying the biblical account of the origin of man passed the Tennessee House of Representatives, 71 to 5. The Tennessee Senate passed the so-called “Butler bill”, named after Representative John Washington Butler, on March 13, and the measure was signed into law that same month by Governor Austin Peay.
It was now illegal to teach the theory of evolution in Tennessee public schools, colleges and universities.
The American Civil Liberties Union (ACLU) immediately announced an intention to sue, offering to defend anyone accused of violating the act. Local businessman George Rappleyea arranged a meeting with the county superintendent of schools and local attorney Sue Kerr Hicks, possibly the inspiration for Shel Silverstein’s “A Boy Named Sue”, which everyone remembers from the 1969 Johnny Cash song.
The three met at Robinson’s Drug Store and agreed their little town of Dayton could use the publicity. The trio summoned 24-year-old high school football coach John Scopes, asking the part-time substitute teacher to plead guilty to teaching the theory of evolution. Scopes replied he couldn’t recall if he had done so or not, but he’d be more than happy to be the defendant if anyone could prove that he had.
Scopes stepped into legal history barely two months later. According to charging documents Scopes had used the textbook “Civic Biology” to describe the theory of evolution, race and eugenics. The prosecution brought in William Jennings Bryan to try the case. The defense hired Clarence Darrow.
Two of the heaviest of jurisprudential heavy hitters of the day were now lined up in what promised to be, the “Trial of the Century”.
Bryan complained that evolution taught children that humans were no more than one among 35,000 mammals. He rejected the idea that humans were descended from apes: “Not even from American monkeys, but from old world monkeys”. The ACLU wanted to oppose the Butler Act on grounds that it violated the teacher’s individual rights and academic freedom, but it was Darrow who shaped the case, taking the position that theistic and evolutionary views were not mutually exclusive.
What began as a publicity stunt quickly became an overwhelming media event. 200 newspaper reporters from all over the country arrived in Dayton. Two came all the way from London. Twenty-two telegraphers sent out 165,000 words a day over thousands of miles of telegraph wires, specifically hung for the purpose.
Trained chimpanzees performed on the courthouse lawn. Chicago radio personality Quin Ryan broadcast the nation’s first on-the-scene coverage of a criminal trial. A specially constructed airstrip was prepared from which two movie cameramen had their newsreel footage flown out, daily.
H.L. Mencken, writing for the Baltimore Sun, mocked the prosecution and the jury as “unanimously hot for Genesis.” Mencken labeled the town’s inhabitants “yokels” and “morons”. Bryan was a “buffoon” he claimed, his speeches “theologic bilge”. It was Mencken who dubbed the proceedings, “Monkey Trial”. The defense, on the other hand, was “eloquent” and “magnificent”.
Or so he claimed. Not the least little bit of media bias, there.
After eight days of trial, the jury took only nine minutes to deliberate, finding Scopes guilty on July 21. The coach was ordered to pay a $100 fine, equivalent to something like $1,300 today. Scopes’ conviction was overturned by the Tennessee Supreme Court, on the basis that state law required fines over $50 to be decided by a jury, and not by the judge presiding.
To this day you can find American creationists who believe that media reports turned public opinion, against the religious view.
Today, the Evolution vs Creation debate has faded to the background, but never really ended. Such discussions may be reasonably expected to continue. Neither view seems supportable by anything more than the faith of its own adherents.
“The general killed the Viet Cong; I killed the general with my camera. Still photographs are the most powerful weapon in the world. People believe them, but photographs do lie, even without manipulation. They are only half-truths”. – Pulitzer Prize-winning photographer Eddie Adams
According to some studies, the average World War 2 infantry soldier saw 40 days of combat in the Pacific, over 4 years. In Vietnam, the average combat infantryman saw 240 days of combat, in a year.
By 1967, the Johnson administration was coming under increasing criticism for what many in the American public saw as an endless and pointless stalemate in Vietnam.
Opinion polls revealed an increasing percentage believed that it was a mistake to send more troops into Vietnam, their number rising from 25% in 1965, to 45% by December, 1967.
The Johnson administration responded with a “success offensive” emphasizing “kill ratios” and “body counts” of North Vietnamese and Viet Cong fighters. Vice President Hubert Humphrey stated on NBC’s Today show that November, “We are on the offensive. Territory is being gained. We are making steady progress.”
In Communist North Vietnam, the massive battlefield losses of 1966-’67, combined with the economic devastation wrought by US aerial bombing, caused moderate factions to push for peaceful coexistence with the South. More radical factions, favoring military reunification of the Indochina peninsula, needed to throw a “Hail Mary” pass. Planning for a winter/spring offensive began in early 1967. By the New Year, some 80,000 Communist fighters had quietly infiltrated the length and breadth of South Vietnam.
One of the largest military operations of the war was launched on January 30, 1968, coinciding with the Tết holiday, the Vietnamese New Year. In the first wave of attacks, North Vietnamese troops and Viet Cong guerillas struck over 100 cities and towns, including Saigon, the South Vietnamese capital.
Initially taken off-guard, US and South Vietnamese forces regrouped and beat back the attacks, inflicting heavy losses on North Vietnamese forces. The month-long battle for Huế (pronounced “Hway”) uncovered the massacre of as many as 6,000 South Vietnamese by Communist forces, 5 to 10 percent of the entire city. Fighting continued for over two months at the US combat base at Khe Sanh.
While the Tết Offensive was a military defeat for North Vietnamese forces, the political effects on the American public were profound. Support for the war effort plummeted, leading to demonstrations. Jeers could be heard in the streets. “Hey! Hey! LBJ! How many kids did you kill today?”
Lyndon Johnson’s presidency was finished. The following month, Johnson appeared before the nation in a televised address, saying “I shall not seek, and I will not accept, the nomination of my party for another term as your president.”
In the early morning darkness of February 1, 1968, Nguyễn Văn Lém led a Viet Cong sabotage unit in an assault on the armor base at Go Vap. After taking control of the camp, Nguyễn arrested Lieutenant Colonel Nguyen Tuan along with his family, demanding that the officer show his guerillas how to drive tanks. The officer refused and the Viet Cong slit his throat, along with those of his wife, his six children and his 80-year-old mother.
The only survivor was one grievously injured 10-year-old boy.
Nguyễn was captured later that morning, near the mass grave of 34 civilians. He said he was “proud” to have carried out orders to kill them.
AP photographer Eddie Adams was out on the street that day with NBC News television cameraman Võ Sửu, looking for something interesting. The pair saw a group of South Vietnamese soldiers dragging what appeared to be an ordinary man into the road, and began to photograph the event.
Adams “…followed the three of them as they walked towards us, making an occasional picture. When they were close – maybe five feet away – the soldiers stopped and backed away. I saw a man walk into my camera viewfinder from the left. He took a pistol out of his holster and raised it. I had no idea he would shoot. It was common to hold a pistol to the head of prisoners during questioning. So I prepared to make that picture – the threat, the interrogation. But it didn’t happen. The man just pulled a pistol out of his holster, raised it to the VC’s head and shot him in the temple. I made a picture at the same time.”
The man with the pistol was Nguyễn Ngọc Loan, Chief of the national police. Loan had personally witnessed the murder of one of his officers, along with the man’s wife and three small children.
Nguyễn Văn Lém had committed atrocities. He was out of uniform and not engaged in combat when he murdered the General’s subordinates and their families. The man was a war criminal and terrorist with no protections under the Geneva Conventions, legally eligible for summary execution.
And so he was. Loan drew his .38 Special Smith & Wesson Bodyguard revolver, and fired. The execution was barely a blip on the radar screen.
In February 1968, hard fighting yet remained to retake the capital. As always, Nguyễn Ngọc Loan was leading from the front when a machine gun burst tore into his leg.
Meanwhile, Adams’ “Saigon Execution” photograph and Võ’s footage made their way into countless papers and news broadcasts. Stripped of context, they made General Nguyễn out to be a “bloodthirsty sadist”, and the Viet Cong terrorist his “innocent victim”.
Adams was on his way to winning a Pulitzer Prize for that photograph. Meanwhile, an already impassioned anti-war movement lost the faculty of reason.
The political outcry reached all the way to Australia, where General Nguyễn was recuperating from an amputation. An Australian hospital refused him treatment, and so he traveled to America to recover.
“I was getting money for showing one man killing another,” Adams said at a later awards ceremony. “Two lives were destroyed, and I was getting paid for it. I was a hero.”
American politics looked inward in the years to come, as the Nixon administration sought the “Vietnamization” of the war. By January 1973, direct US involvement in the war had come to an end.
Military aid to South Vietnam was $2.8 billion in fiscal year 1973. The US Congress placed a $1 billion ceiling on that number the following year, cutting it to $300 million in 1975. The Republic of Vietnam collapsed some fifty-five days later.
General Nguyễn was forced to flee the country he had served. American immigration authorities sought deportation on his arrival, in part because of Eddie Adams’ image. The photographer was recruited to testify against the General but surprised his interrogators by speaking on his behalf.
General Nguyễn was a devoted patriot and South Vietnamese nationalist. An accomplished pilot who led an airstrike on Việt Cộng forces at Bo Duc in 1967, he was loved and admired by his soldiers. He and his wife were permitted to stay.
The couple opened a pizza shop in the Rolling Valley Mall of Virginia and called it, “Les Trois Continents”. The restaurant was a success for a time, until word got out about the owner’s identity. Knowing nothing about the man except for Adams’ photograph, locals began to make trouble. Business plummeted as the owner was assaulted in his own restaurant, his life threatened.
The photographer and the General stayed in touch after the war and even became friends. The last time Adams visited Nguyễn’s pizza parlor, some self-righteous coward had scrawled the words “We know who you are, fucker“, across the bathroom wall.
In 1991, the couple was forced to close the restaurant. Seven years later Nguyễn Ngọc Loan died of cancer.
Eddie Adams won his Pulitzer in 1969, but came to regret that he had ever taken that photograph. Years later he wrote in Time Magazine:
‘The general killed the Viet Cong; I killed the general with my camera. Still photographs are the most powerful weapon in the world. People believe them, but photographs do lie, even without manipulation. They are only half-truths. What the photograph didn’t say was, “What would you do if you were the general at that time and place on that hot day, and you caught the so-called bad guy after he blew away one, two or three American soldiers?”‘
Photography has been edited to deceive for nearly as long as there have been photographs. Consider this image of Leon Trotsky next to Vladimir Lenin. Now you see him, now you don’t. And yet, sometimes images lie without the aid, or even the intent, of dishonesty.
Before Nguyễn died, Adams apologized to the General and his family for what the image had done to his reputation. “The guy was a hero” said the photographer, after the General’s death. “America should be crying. I just hate to see him go this way, without people knowing anything about him.”
Ironically, the threat posed by humans outside the exclusion zone is greater for some species than that posed by radiation, within the zone.
The accident began as a test, a carefully planned series of events intended to simulate a station blackout at the Chernobyl Nuclear Power Plant in the Ukrainian Soviet Socialist Republic.
This most titanic of disasters began with a series of smaller mishaps. Safety systems intentionally turned off, reactor operators failing to follow checklists, inherent design flaws in the reactor itself.
Over the night of April 25-26, 1986, a nuclear fission chain reaction expanded beyond control at reactor #4, flashing water to superheated steam and triggering a violent explosion and open-air graphite fire. Massive amounts of nuclear material were expelled into the atmosphere during this explosive phase, equaled only by that released over the following nine days by intense updrafts created by the fire. Radioactive material rained down over large swaths of the western USSR and Europe, some 60% of it falling on Belarus.
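The persistence of that fallout follows simple exponential decay. As a rough illustration (the isotope and half-life below are standard reference values, not figures from this account), cesium-137, one of the major radionuclides released, decays with a half-life of about 30.17 years:

```python
# Exponential decay: fraction of a radionuclide remaining after t years.
# Cs-137's ~30.17-year half-life is a commonly cited reference value; this
# is an illustrative calculation, not site-specific contamination modeling.
HALF_LIFE_YEARS = 30.17

def fraction_remaining(years: float) -> float:
    return 0.5 ** (years / HALF_LIFE_YEARS)

for t in (10, 30, 100):
    print(f"after {t:3d} years: {fraction_remaining(t):.1%} of the Cs-137 remains")
```

Roughly half of the Cs-137 is still present three decades on, and about a tenth remains after a century, which is why parts of the exclusion zone stay contaminated long after the fire was put out.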
It was the most disastrous nuclear power plant accident in history and one of only two such accidents classified as a level 7, the maximum classification on the International Nuclear Event Scale. The other was the 2011 tsunami and subsequent nuclear disaster at the Fukushima Daiichi reactor, in Japan.
One operator died in the steam-blast phase of the accident; a second died from a catastrophic dose of radiation. Six hundred Soviet helicopter pilots risked lethal radiation, dropping 5,000 metric tons of lead, sand and boric acid in the effort to seal off the spread.
Remote-controlled robot bulldozers and carts soon proved useless. Valery Legasov of the Kurchatov Institute of Atomic Energy in Moscow explained: “[W]e learned that robots are not the great remedy for everything. Where there was very high radiation, the robot ceased to be a robot—the electronics quit working.”
Soldiers in heavy protective gear shoveled the most highly radioactive materials, “bio-robots” allowed to spend a one-time maximum of only forty seconds on the rooftops of surrounding buildings. Even so, some of these “Liquidators” report having done so five or six times.
In the aftermath, 237 people suffered from acute radiation sickness (ARS), 31 of whom died within the following three months. Fourteen more died of radiation-induced cancers over the following ten years.
The death toll could have been far higher, but for the heroism of first responders. Anatoli Zakharov, a fireman stationed in Chernobyl since 1980, replied to remarks that firefighters believed this to be an ordinary electrical fire. “Of course we knew! If we’d followed regulations, we would never have gone near the reactor. But it was a moral obligation – our duty. We were like kamikaze“.
The concrete sarcophagus designed and built to contain the wreckage has been called the largest civil engineering project in history, involving no fewer than a quarter-million construction workers, every one of whom received a lifetime maximum dose of radiation. By December 10 the structure was nearing completion. The #3 reactor at Chernobyl continued to produce electricity, until 2000.
Officials of the top-down Soviet state first downplayed the disaster. Asked by one Ukrainian official, “How are the people?“, acting minister of Internal Affairs Vasyl Durdynets replied that there was nothing to be concerned about: “Some are celebrating a wedding, others are gardening, and others are fishing in the Pripyat River.”
As the scale of the disaster became apparent, civilians were at first ordered to shelter in place. A 10-km exclusion zone was established within the first 36 hours, resulting in the hurried evacuation of some 49,000. The exclusion zone was tripled to 30 km within a week, leading to the evacuation of 68,000 more. Before it was over, some 350,000 were moved away, never to return.
The chaos of these evacuations can scarcely be imagined. Confused adults. Crying children. Howling dogs. Shouting soldiers, barking orders and herding the now-homeless onto waiting buses by the tens of thousands. Dogs and cats, beloved companion animals, were ordered left behind. Evacuees were never told that there would be no return.
There were countless and heartbreaking scenes of final abandonment, of mewling cats, and whimpering dogs. Belorussian writer Svetlana Alexievich compiled hundreds of interviews into a single monologue, an oral history of the forgotten. The devastating Chernobyl Prayer tells the story of: “dogs howling, trying to get on the buses. Mongrels, Alsatians. The soldiers were pushing them out again, kicking them. They ran after the buses for ages.” Heartbroken families pinned notes to their doors: “Don’t kill our Zhulka. She’s a good dog.”
There would be no mercy. Squads of soldiers were sent to shoot the animals left behind. Most died. Some escaped discovery, and survived.
Today the descendants of those dogs, some 900 in number, occupy an exclusion zone of some 1,600 square miles, slightly smaller than the American state of Delaware. They are not alone.
In 1998, 31 specimens of the Przewalski Horse were released into the exclusion zone, which now serves as a de facto wildlife preserve. Not to be confused with the American mustang or the Australian brumby, the Przewalski Horse is a truly wild horse and not the feral descendant of domesticated animals.
Named for the 19th-century Polish-Russian explorer Nikołaj Przewalski, Equus ferus przewalskii split from the ancestors of the domestic Equus caballus some 38,000 to 160,000 years ago, forming a divergent species in which neither taxonomic group is descended from the other. The last Przewalski stallion was observed in the wild in 1969, and the species has been considered extinct in the wild since that time.
Today approximately 100 Przewalski horses roam the Chernobyl Exclusion Zone, one of the larger populations of what may be the last of the truly wild horses alive today.
In 2016, US government wildlife biologist Sarah Webster was working at the University of Georgia. Webster and others used camera traps to demonstrate how wildlife had colonized the exclusion zone, even its most contaminated parts.
Ironically, the threat posed by humans outside the exclusion zone is greater for some species than that posed by radiation, within the zone. Wildlife spotted within the exclusion zone include wolves, badgers, swans, moose, elk, turtles, deer, foxes, beavers, boars, bison, mink, hares, otters, lynx, eagles, rodents, storks, bats and owls.
Not all animals thrive in this place. Invertebrates like spiders, butterflies and dragonflies are noticeably absent, likely because their eggs are laid in surface soil layers which remain contaminated. Radionuclides settled in lake sediments affect populations of fish, frogs, crustaceans and insect larvae. Birds in the exclusion zone have difficulty reproducing. Those animals that do successfully reproduce often demonstrate albinism, deformed beaks and feathers, malformed sperm cells and cataracts.
Tales abound of giant mushrooms, six-pawed rabbits and three-headed dogs. While some such stories are undoubtedly exaggerated, few such mutations survive the first few hours, and those that do are unlikely to pass on the more egregious deformities.
Far from the post-apocalyptic wasteland of imagination, the Chernobyl exclusion zone is a thriving preserve for some, but not all, wildlife. Which brings us back to the dogs. Caught in a twilight zone, neither feral nor domestic, the dogs of Chernobyl can neither compete in the wild, nor are many of them candidates for adoption, due to radiation toxicity.
Since September 2017, a partnership between SPCA International and the US-based 501(c)(3) non-profit CleanFutures.org has worked to provide for the veterinary needs of these defenseless creatures. Over 450 animals have been tested for radiation exposure, given medical care and vaccinations, and spayed or neutered to keep populations within manageable limits. Many have been socialized for human interaction, successfully decontaminated, and made available for adoption into homes in Ukraine and North America.
For most, though, there is no future beyond this place, and a life expectancy unlikely to exceed five years.
Thirty-five years after the world’s most devastating nuclear disaster, a surprising number of people work in this place on a rotating basis. Guards stationed at access points control who gets in, and keep out unauthorized visitors known as “stalkers”.
The BBC wrote in April of this year about the strange companionship that has sprung up between these guards and the dogs of Chernobyl. Jonathon Turnbull, a PhD candidate in geography at the University of Cambridge, was the first outsider to recognize the relationship and gave the guards disposable cameras with which to record the lives of these abandoned animals. The guards around this toxic sanctuary had but a single request: “please, please – bring food for the dogs”.
Out of the mess of the Space Race emerged an idea destined to go down in the Hare-Brain Hall of Fame, if there is ever to be such a place. A show of force sufficient to boost domestic morale while showing the Russkies we mean business. It was the top-secret “Project A119”, also known as A Study of Lunar Research Flights. We were going to detonate a nuclear weapon. On the moon.
As World War II drew to a close in 1945, there arose a different sort of conflict, a contest of wills, between the two remaining Great Powers of the world. The “Cold War” pitted the free market economy and constitutional republicanism of the United States against the top-down, authoritarian governing and economic models of the Soviet Union. The stakes could not have been higher, as each side sought to demonstrate its own technological and military superiority and, by implication, the dominance of its own economic and political system.
American nuclear preeminence lasted but four short years, coming to an end with the first successful Soviet atomic weapon test, code named “First Lightning”, carried out on August 29, 1949. Mutual fear and distrust fueled the Soviet-American “arms race”, a buildup of nuclear stockpiles beyond any rational purpose. A generation grew up under the shadow of nuclear annihilation, in which a single mistake, a misunderstanding, or one fool in the wrong place at the wrong time could initiate a sequence bringing about the extinction of life on this planet.
The arms race acquired the dimensions of a Space Race on July 29, 1956, when the United States announced its intention to launch an artificial satellite into earth orbit. Two days later, the Soviet Union announced that it aimed to do the same.
The early Space Race period was a time of serial humiliation for the American side, as the Soviet Union launched the first Inter-Continental Ballistic Missile (ICBM) on August 21, 1957, and the first artificial satellite “Sputnik 1” on October 4.
The first living creature to enter space was the dog “Laika“, launched aboard the spacecraft Sputnik 2 on November 3 and labeled by the more smartass specimens among the American commentariat as “Muttnik”.
Soviet propaganda proclaimed “the first traveler in the cosmos”, replete with heroic images printed on posters, stamps and matchbook covers. The American news media could do little but focus on the politics of the launch, as animal lovers the world over questioned the ethics of sending a dog to certain death, in space.
On the American side, the giant Vanguard rocket was scheduled to launch a grapefruit-sized test satellite into earth orbit that September, but the program was plagued by one delay after another. The December 6 launch was a comprehensive disaster, the rocket lifting all of four feet from the pad before crashing to the ground in a sheet of flame, the satellite rolling free where it continued to beep, only feet from the burning wreck.
The second Vanguard launch was nearly as bad, exploding in flames only seconds after launch. Chortling Soviet leaders were beside themselves with joy, stamping the twin disasters as “Kaputnik”, and “Flopnik”.
Out of this mess emerged an idea destined to go down in the Hare-Brain Hall of fame, if there is ever to be such a place. A show of force sufficient to boost domestic morale, while showing the Russkies, we mean business. It was the top-secret “Project A119”, also known as A Study of Lunar Research Flights.
We were going to detonate a nuclear weapon. On the moon.
In 1957, newspapers reported a rumor. The Soviet Union planned a nuclear test explosion on the moon, timed to coincide with the lunar eclipse of November 7. A celebration of the anniversary of the Glorious October Revolution.
Edward Teller himself, the “Father of the H-Bomb”, is said to have proposed such an idea as early as February, to test the effects of the explosion in a vacuum, and in conditions of zero gravity.
Today, we take for granted the massively complex mathematics involved in hitting an object like the moon. In 1957 there was a very real possibility of missing the thing altogether, or of a boomerang effect returning the bomb whence it came.
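One piece of back-of-the-envelope arithmetic makes the targeting problem concrete. The 2.5-day transit time below is an assumed figure for illustration only, not anything from Project A119 itself:

```python
# How far does the Moon move along its orbit during a lunar transit?
# The sidereal month is ~27.32 days; the 2.5-day coast time is an
# assumed, Apollo-era-like figure, purely for illustration.
SIDEREAL_MONTH_DAYS = 27.32
TRANSIT_DAYS = 2.5

deg_per_day = 360.0 / SIDEREAL_MONTH_DAYS      # Moon's angular speed
lead_angle_deg = deg_per_day * TRANSIT_DAYS    # required aim-ahead angle

print(f"Moon's angular speed: ~{deg_per_day:.1f} deg/day")
print(f"Aim-ahead over a {TRANSIT_DAYS}-day coast: ~{lead_angle_deg:.0f} deg")
```

A shot must lead the Moon by some thirty-odd degrees of its orbit; aim at where the Moon is, rather than where it will be, and the warhead sails past, free to fall back toward the planet that launched it.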
While much of the information remains classified, the project was revealed in 2000 by former NASA executive Leonard Reiffel, who said he was asked to “fast track” the program in 1958 by senior Air Force officials. A young Carl Sagan was all for the idea, believing at the time that living microbes might inhabit the moon, and that a nuclear explosion might help detect such organisms.
Reiffel commented in a Guardian newspaper interview: “It was clear the main aim of the proposed detonation was a PR exercise and a show of one-upmanship. The Air Force wanted a mushroom cloud so large it would be visible on earth. The US was lagging behind in the space race.” The now-retired NASA executive went on to explain that “The explosion would obviously be best on the dark side of the moon and the theory was that if the bomb exploded on the edge of the moon, the mushroom cloud would be illuminated by the sun.”
The Air Force canceled the A119 program in 1959, apparently out of concern that a ‘militarization of space’ would create public backlash, and that nuclear fallout might hamper future research and even colonization efforts on the moon.
Previously secret reports revealed in 2010 that Soviet leaders had indeed contemplated such a project, part of a multi-part program code named “E”. Project E-1 involved reaching the moon, while E-2 and E-3 focused on sending a probe around the far side of the celestial body. The final stage, project E-4, involved a nuclear strike on the moon as a “display of force”.
Construction plans for the aforementioned Hare-Brain Hall of Fame have yet to be announced, but it already appears the place may need another wing.
“So the astronaut who became a hero, looked to this black woman in the still-segregated South at the time as one of the key parts of making sure his mission would be a success.”
In plasma physics, the heliosphere is a vast cavity formed by the Sun, a “bubble” continuously “inflated” by plasma originating from that body, known as the “solar wind”, and separating our own solar system from the vastness of interstellar space. The outermost reach of the heliosphere comprises three major sections: the termination shock, the heliosheath, and the heliopause, so called because there the solar wind and interstellar winds meet to form a zone of equilibrium.
Only five man-made objects have been launched on trajectories carrying them through the heliosphere and into interstellar space: Pioneer 10 and 11, launched in 1972-73; Voyager 1 and 2, launched in 1977; and New Horizons, which left earth in 2006. Of those five, only three remain active and continue to transmit data back to our little blue planet.
Spectacular images may be found on-line if you’re inclined to look them up. Images such as this jaw-dropping shot of the “Blue Planet” Neptune, taken two days before the point of closest contact in August, 1989.
Or these images of the rings of Neptune, taken on this day thirty-two years ago, before Voyager 2 left the last of the “gas giants” behind.
Few among us are equipped to understand the complexity of such flight. Precious few. One such was a little girl, an American of African ancestry born this day in 1918 in White Sulphur Springs, West Virginia. The youngest of four born to Joylette and Joshua Coleman, Creola Katherine showed unusual mathematical skills from an early age.
For black children, Greenbrier County, West Virginia didn’t offer education past the eighth grade in the 1920s. The Colemans arranged for their kids to attend high school two hours up the road in Institute, on the campus of West Virginia State College. Katherine took every math class offered by the school and graduated summa cum laude in 1937, with degrees in mathematics and French.
There were teaching jobs along the way at all-black schools and a marriage to Katherine’s first husband, James Goble. The couple would have three children together before James died of a brain tumor. Three years later she married James A. “Jim” Johnson.
With all that going on at home, Katherine found time to become one of only three black students to attend graduate school at West Virginia University, and the only woman, selected to integrate the school after the Supreme Court ruling in Missouri ex rel. Gaines v. Canada.
Careers in research mathematics were few and far between for black women in 1952, but talent and hard work win out where ignorance fears to tread.
So it was that Katherine Johnson joined the National Advisory Committee for Aeronautics (NACA) in 1952. Johnson worked in a pool of women who would read the data from aircraft black boxes and carry out a number of mathematical tasks. She referred to her co-workers as “computers who wore skirts”.
Flight research was a man’s world in those days, but one day Katherine and a colleague were asked to fill in, temporarily. Respect is not given, it is earned, and Katherine’s knowledge of analytic geometry made quick work of that. Male bosses and colleagues alike were impressed with her skills. When her “temporary” assignment was over, it no longer seemed all that important to send her back to the pool.
Katherine would later explain that barriers of race and sex continued, but she could hold her own. She took meetings where decisions were made, where no woman had been before. She’d simply tell them that she did the work and this was where she belonged, and that was the end of that.
Johnson worked as a human computer through most of the 1950s, calculating in-flight problems such as gust alleviation in aircraft. Racial segregation was still in effect in those days, according to state law and federal workplace segregation rules introduced under President Woodrow Wilson some forty years earlier. The door where she worked was labeled “colored computers”, but Johnson said she “didn’t feel the segregation at NASA, because everybody there was doing research. You had a mission and you worked on it, and it was important to you to do your job … and play bridge at lunch. I didn’t feel any segregation. I knew it was there, but I didn’t feel it.”
“We needed to be assertive as women in those days – assertive and aggressive – and the degree to which we had to be that way depended on where you were. I had to be. In the early days of NASA women were not allowed to put their names on the reports – no woman in my division had had her name on a report. I was working with Ted Skopinski and he wanted to leave and go to Houston … but Henry Pearson, our supervisor – he was not a fan of women – kept pushing him to finish the report we were working on. Finally, Ted told him, “Katherine should finish the report, she’s done most of the work anyway.” So Ted left Pearson with no choice; I finished the report and my name went on it, and that was the first time a woman in our division had her name on something”.
Katherine worked as an aerospace technologist from 1958 until retirement. She calculated the trajectory for Alan Shepard’s May 1961 flight to become the first American in space, worked out the launch window for that Mercury mission, and plotted navigational charts for backup in case of electronic failure. NASA was using electronic computers by the time of John Glenn’s first orbit around the earth, but Glenn refused to fly until Katherine Johnson personally verified the computer’s calculations. Author Margot Lee Shetterly commented, “So the astronaut who became a hero, looked to this black woman in the still-segregated South at the time as one of the key parts of making sure his mission would be a success.”
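The flavor of the closed-form work done by hand in those days can be suggested with a toy vacuum-ballistics sketch. This is a flat-Earth, no-drag approximation with illustrative burnout numbers, not the actual Mercury calculations, which accounted for a rotating, spherical Earth and a great deal more:

```python
import math

# Toy flat-Earth, no-drag ballistic arc from burnout conditions.
# The burnout speed and flight-path angle below are illustrative guesses,
# not figures from any actual Mercury flight.
g = 9.81                      # m/s^2, surface gravity
v = 2300.0                    # m/s, assumed burnout speed
gamma = math.radians(40.0)    # assumed flight-path angle above horizontal

apogee_m = (v * math.sin(gamma)) ** 2 / (2 * g)   # peak altitude gained
range_m = v ** 2 * math.sin(2 * gamma) / g        # vacuum downrange distance

print(f"apogee    ~ {apogee_m / 1000:.0f} km")
print(f"downrange ~ {range_m / 1000:.0f} km")
```

Even this cartoon lands in the suborbital regime of roughly a hundred kilometers up and a few hundred downrange; the real trade involved atmospheric drag, Earth's rotation and the placement of recovery ships, all worked to far greater precision.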
Katherine Johnson retired in 1986 and lived to see six grandchildren and 11 “greats”. Everyone should live to see their own great-grandchild. Not surprisingly, Johnson encouraged hers to pursue careers in science and technology.
President Barack Obama personally awarded Johnson the Presidential Medal of Freedom in 2015 for work spanning the Mercury program to the Space Shuttle. NASA noted her “historical role as one of the first African-American women to work as a NASA scientist.”
A delightful side dish for this story is the Silver Snoopy award NASA gives for outstanding achievement, “For professionalism, dedication and outstanding support that greatly enhanced space flight safety and mission success.”
Following the Mercury and Gemini projects, NASA was searching for a way to focus employees and contractors alike on their own personal contribution to mission success. They wanted it to be fun and interesting, like the Smokey Bear character of the United States Forest Service. Al Chop of the Manned Spacecraft Center came up with the idea.
Peanuts creator Charles Schulz, a combat veteran of WW2 and avid supporter of the space program, loved the idea. Schulz drew the character to be cast in a silver pin and worn into space by a member of the astronaut corps. It is this astronaut who personally awards his or her Snoopy to the deserving recipient.
The award is literally once in a lifetime. Of all NASA personnel, and that of many contractors, fewer than one percent have ever received the coveted Silver Snoopy.
Astronaut and former NASA associate administrator for education Leland Melvin personally awarded Johnson her own Silver Snoopy at the naming ceremony in 2016, for the Katherine G. Johnson Computational Research Facility at NASA’s Langley Research Center in Hampton, Virginia.