What began as a publicity stunt quickly became an overwhelming media event. 200 newspaper reporters from all over the country arrived in Dayton. Two came all the way from London. Twenty-two telegraphers sent out 165,000 words a day over thousands of miles of telegraph wires, specifically hung for the purpose.
On January 28, 1925, a measure prohibiting the teaching of any theory of the origin of man denying the biblical account passed the Tennessee House of Representatives, 71 to 5. The Tennessee Senate passed the so-called "Butler bill," named for Representative John Washington Butler, on March 13, and Governor Austin Peay signed the measure into law that same month.
It was now illegal to teach the theory of evolution in Tennessee public schools, colleges and universities.
The American Civil Liberties Union (ACLU) immediately announced an intention to sue, offering to defend anyone accused of violating the act. Local businessman George Rappleyea arranged a meeting with the county superintendent of schools and local attorney Sue Kerr Hicks, possibly the inspiration for Shel Silverstein's "A Boy Named Sue," the song everyone remembers from Johnny Cash's 1969 recording.
The three met at Robinson's Drug Store and agreed their little town of Dayton could use the publicity. The trio summoned 24-year-old high school football coach John Scopes, asking the part-time substitute teacher to plead guilty to teaching the theory of evolution. Scopes replied he couldn't recall whether he had done so or not, but he'd be more than happy to be the defendant if anyone could prove that he had.
Scopes stepped into legal history barely two months later. According to charging documents, Scopes had used the textbook "Civic Biology" to describe the theory of evolution, race and eugenics. The prosecution brought in William Jennings Bryan to try the case. The defense hired Clarence Darrow.
Two of the heaviest jurisprudential heavy hitters of the day were now lined up in what promised to be the "Trial of the Century."
Bryan complained that evolution taught children that humans were no more than one among 35,000 mammals. He rejected the idea that humans were descended from apes: "Not even from American monkeys, but from Old World monkeys." The ACLU wanted to oppose the Butler Act on grounds that it violated the teacher's individual rights and academic freedom, but it was Darrow who shaped the case, taking the position that theistic and evolutionary views were not mutually exclusive.
What began as a publicity stunt quickly became an overwhelming media event. 200 newspaper reporters from all over the country arrived in Dayton. Two came all the way from London. Twenty-two telegraphers sent out 165,000 words a day over thousands of miles of telegraph wires, specifically hung for the purpose.
Trained chimpanzees performed on the courthouse lawn. Chicago radio personality Quin Ryan broadcast the nation's first on-the-scene coverage of a criminal trial. A specially constructed airstrip was prepared, from which two movie cameramen had their newsreel footage flown out daily.
H.L. Mencken, writing for the Baltimore Sun, mocked the prosecution and the jury as "unanimously hot for Genesis." Mencken labeled the town's inhabitants "yokels" and "morons." Bryan was a "buffoon," he claimed, his speeches "theologic bilge." It was Mencken who dubbed the proceedings the "Monkey Trial." The defense, on the other hand, was "eloquent" and "magnificent."
Or so he claimed. Not the least little bit of media bias there.
After eight days of trial, the jury took only nine minutes to deliberate, finding Scopes guilty on July 21. The gym teacher was ordered to pay a $100 fine, equivalent to something like $1,300 today. Scopes' conviction was overturned by the Tennessee Supreme Court on the basis that state law required fines over $50 to be decided by a jury, and not by the presiding judge.
To this day you can find American creationists who believe that media reports turned public opinion against the religious view.
Today, the Evolution vs. Creation debate has faded into the background, but never really ended. Such discussions may be reasonably expected to continue. Neither view seems supportable by anything more than the faith of its own adherents.
"The general killed the Viet Cong; I killed the general with my camera. Still photographs are the most powerful weapon in the world. People believe them, but photographs do lie, even without manipulation. They are only half-truths." Pulitzer Prize-winning photographer Eddie Adams
According to some studies, the average World War II infantry soldier in the Pacific saw 40 days of combat over four years. In Vietnam, the average combat infantryman saw 240 days of combat in a single year.
Gallup poll, 1965 – 1971
By 1967, the Johnson administration was coming under increasing criticism for what many in the American public saw as an endless and pointless stalemate in Vietnam.
Opinion polls revealed an increasing percentage believed it was a mistake to send more troops into Vietnam, their number rising from 25% in 1965 to 45% by December 1967.
The Johnson administration responded with a "success offensive" emphasizing "kill ratios" and "body counts" of North Vietnamese and Viet Cong fighters. Vice President Hubert Humphrey stated on NBC's Today show that November, "We are on the offensive. Territory is being gained. We are making steady progress."
In Communist North Vietnam, the massive battlefield losses of 1966-'67, combined with the economic devastation wrought by US aerial bombing, caused moderate factions to push for peaceful coexistence with the South. More radical factions, favoring military reunification of the Indochina peninsula, needed to throw a "Hail Mary" pass. Plans for a winter/spring offensive began in early 1967. By the New Year, some 80,000 Communist fighters had quietly infiltrated the length and breadth of South Vietnam.
One of the largest military operations of the war launched on January 30, 1968, coinciding with the Tết holiday, the Vietnamese New Year. In the first wave of attacks, North Vietnamese troops and Viet Cong guerrillas struck over 100 cities and towns, including Saigon, the South Vietnamese capital.
Initially taken off-guard, US and South Vietnamese forces regrouped and beat back the attacks, inflicting heavy losses on North Vietnamese forces. The month-long battle for Huế (pronounced "Hway") uncovered the massacre of as many as 6,000 South Vietnamese by Communist forces, 5 to 10 percent of the city's entire population. Fighting continued for over two months at the US combat base at Khe Sanh.
While the Tết Offensive was a military defeat for North Vietnamese forces, the political effects on the American public were profound. Support for the war effort plummeted, leading to demonstrations. Jeers could be heard in the streets. "Hey! Hey! LBJ! How many kids did you kill today?"
Lyndon Johnson's presidency was finished. The following month, Johnson appeared before the nation in a televised address, saying "I shall not seek, and I will not accept, the nomination of my party for another term as your president."
In the early morning darkness of February 1, 1968, Nguyễn Văn Lém led a Viet Cong sabotage unit in an assault on the armor base in Gò Vấp. After taking control of the camp, Nguyễn arrested Lieutenant Colonel Nguyen Tuan along with his family, demanding that the officer show his guerrillas how to drive tanks. The officer refused, and the Viet Cong slit his throat, along with those of his wife, his six children and his 80-year-old mother.
The only survivor was one grievously injured 10-year-old boy.
Nguyễn was captured later that morning, near the mass grave of 34 civilians. He said he was “proud” to have carried out orders to kill them.
AP photographer Eddie Adams was out on the street that day with NBC News television cameraman Võ Sửu, looking for something interesting. The pair saw a group of South Vietnamese soldiers dragging what appeared to be an ordinary man into the road, and began to photograph the event.
Adams “…followed the three of them as they walked towards us, making an occasional picture. When they were close – maybe five feet away – the soldiers stopped and backed away. I saw a man walk into my camera viewfinder from the left. He took a pistol out of his holster and raised it. I had no idea he would shoot. It was common to hold a pistol to the head of prisoners during questioning. So I prepared to make that picture – the threat, the interrogation. But it didn’t happen. The man just pulled a pistol out of his holster, raised it to the VC’s head and shot him in the temple. I made a picture at the same time.”
The man with the pistol was Nguyễn Ngọc Loan, chief of the National Police. Loan had personally witnessed the murder of one of his officers, along with the man's wife and three small children.
Composite sequence published by the Dolph Briscoe Center for American History of the University of Texas at Austin.
Nguyễn Văn Lém had committed atrocities. He was out of uniform and not engaged in combat when he murdered the General’s subordinates and their families. The man was a war criminal and terrorist with no protections under the Geneva Conventions, legally eligible for summary execution.
And so he was. Loan drew his .38 Special Smith & Wesson Bodyguard revolver, and fired. The execution was barely a blip on the radar screen.
In February 1968, hard fighting yet remained to retake the capital. As always, Nguyễn Ngọc Loan was leading from the front when a machine gun burst tore into his leg.
Meanwhile, Adams' "Saigon Execution" photograph and Võ's footage made their way into countless papers and news broadcasts. Stripped of context, General Nguyễn came to be seen as a "bloodthirsty sadist," the Viet Cong terrorist his "innocent victim."
Adams was on his way to winning a Pulitzer Prize for that photograph. Meanwhile, an already impassioned anti-war movement lost the faculty of reason.
Photographer Eddie Adams (right) holds up his Pulitzer Prize. H/T BBC.com
The political outcry reached all the way to Australia, where General Nguyễn was recuperating from an amputation. An Australian hospital refused him treatment, and so he traveled to America to recover.
“I was getting money for showing one man killing another,” Adams said at a later awards ceremony. “Two lives were destroyed, and I was getting paid for it. I was a hero.”
H/T BBC
American politics looked inward in the years to come, as the Nixon administration sought the "Vietnamization" of the war. By January 1973, direct US involvement in the war had come to an end.
Scenes from the final evacuation of Saigon, April 1975
Military aid to South Vietnam was $2.8 billion in fiscal year 1973. The US Congress placed a billion-dollar ceiling on that number the following year, cutting it to $300 million in 1975. The Republic of Vietnam collapsed some fifty-five days later.
General Nguyễn was forced to flee the country he had served. American immigration authorities sought deportation on his arrival, in part because of Eddie Adams' image. The photographer was recruited to testify against the General but surprised his interrogators by speaking on his behalf.
General Nguyễn was a devoted patriot and South Vietnamese nationalist. An accomplished pilot who led an airstrike on Việt Cộng forces at Bo Duc in 1967, he was loved and admired by his soldiers. He and his wife were permitted to stay.
The couple opened a pizza shop in the Rolling Valley Mall in Virginia and called it "Les Trois Continents." The restaurant was a success for a time, until word got out about the owner's identity. Knowing nothing about the man except for Adams' photograph, locals began to make trouble. Business plummeted as the owner was assaulted in his own restaurant, his life threatened.
The photographer and the General stayed in touch after the war and even became friends. The last time Adams visited Nguyễn's pizza parlor, some self-righteous coward had scrawled the words "We know who you are, fucker" across the bathroom wall.
In 1991, the couple was forced to close the restaurant. Seven years later Nguyễn Ngọc Loan died of cancer.
Eddie Adams won his Pulitzer in 1969, but came to regret that he had ever taken that photograph. Years later he wrote in Time Magazine:
‘The general killed the Viet Cong; I killed the general with my camera. Still photographs are the most powerful weapon in the world. People believe them, but photographs do lie, even without manipulation. They are only half-truths. What the photograph didn’t say was, “What would you do if you were the general at that time and place on that hot day, and you caught the so-called bad guy after he blew away one, two or three American soldiers?”‘
Photography has been edited to deceive for nearly as long as there have been photographs. Consider this image of Leon Trotsky next to Vladimir Lenin. Now you see him, now you don't. And yet sometimes images lie without the aid, or even the intent, of dishonesty.
Before Nguyễn died, Adams apologized to the General and his family for what the image had done to his reputation. “The guy was a hero” said the photographer, after the General’s death. “America should be crying. I just hate to see him go this way, without people knowing anything about him.”
Ironically, the threat posed by humans outside the exclusion zone is greater for some species than that posed by radiation within the zone.
The accident began as a test, a carefully planned series of events intended to simulate a station blackout at the Chernobyl Nuclear Power Plant in the Ukrainian Soviet Socialist Republic.
This most titanic of disasters began with a series of smaller mishaps: safety systems intentionally turned off, reactor operators failing to follow checklists, inherent design flaws in the reactor itself.
Over the night of April 25-26, 1986, a nuclear fission chain reaction expanded beyond control at reactor #4, flashing water into superheated steam and resulting in a violent explosion and open-air graphite fire. Massive amounts of nuclear material were expelled into the atmosphere during this explosive phase, equaled only by that released over the following nine days by intense updrafts created by the fire. Radioactive material rained down over large swaths of the western USSR and Europe, some 60% of it on the Republic of Belarus.
It was the most disastrous nuclear power plant accident in history and one of only two such accidents classified as a level 7, the maximum classification on the International Nuclear Event Scale. The other was the 2011 tsunami and subsequent nuclear disaster at the Fukushima Daiichi reactor in Japan.
One operator died in the steam-blast phase of the accident, a second from a catastrophic dose of radiation. 600 Soviet helicopter pilots risked lethal radiation, dropping 5,000 metric tons of lead, sand and boric acid in the effort to seal off the spread.
Remote-controlled robot bulldozers and carts soon proved useless. Valery Legasov of the Kurchatov Institute of Atomic Energy in Moscow explains: "[W]e learned that robots are not the great remedy for everything. Where there was very high radiation, the robot ceased to be a robot—the electronics quit working."
Hat tip, Chernobyl Museum, Kiev, Ukraine
Soldiers in heavy protective gear shoveled the most highly radioactive materials, "bio-robots" allowed to spend a one-time maximum of only forty seconds on the rooftops of surrounding buildings. Even so, some of these "Liquidators" report having done so five or six times.
In the aftermath, 237 suffered from Acute Radiation Sickness (ARS), 31 of whom died within three months. Fourteen more died of radiation-induced cancers over the following ten years.
Chernobyl "Liquidators," permitted to spend no more than a one-time maximum of forty seconds cleaning the rooftops of surrounding structures.
The death toll could have been far higher, but for the heroism of first responders. Anatoli Zakharov, a fireman stationed in Chernobyl since 1980, replied to remarks that firefighters believed this to be an ordinary electrical fire: "Of course we knew! If we'd followed regulations, we would never have gone near the reactor. But it was a moral obligation – our duty. We were like kamikaze."
The concrete sarcophagus designed and built to contain the wreckage has been called the largest civil engineering project in history, involving no fewer than a quarter-million construction workers, every one of whom received a lifetime maximum dose of radiation. By December 10, the structure was nearing completion. The #3 reactor at Chernobyl continued to produce electricity until 2000.
A plastic doll lies abandoned on a rusting bed, 30 years after the town was evacuated following the Chernobyl disaster. H/T Dailymail.com
Officials of the top-down Soviet state first downplayed the disaster. Asked by one Ukrainian official, "How are the people?", acting minister of Internal Affairs Vasyl Durdynets replied that there was nothing to be concerned about: "Some are celebrating a wedding, others are gardening, and others are fishing in the Pripyat River."
As the scale of the disaster became apparent, civilians were at first ordered to shelter in place. A 10 km exclusion zone was established within the first 36 hours, resulting in the hurried evacuation of some 49,000. The exclusion zone was tripled to 30 km within a week, leading to the evacuation of 68,000 more. Before it was over, some 350,000 were moved away, never to return.
Evacuation of Pripyat
The chaos of these evacuations can scarcely be imagined. Confused adults. Crying children. Howling dogs. Shouting soldiers, barking orders and herding the now-homeless onto waiting buses by the tens of thousands. Dogs and cats, beloved companion animals, were ordered left behind. Evacuees were never told. There would be no return.
Two bumper cars lie face to face in the rusting remains of an amusement park in the abandoned town of Pripyat near Chernobyl
There were countless and heartbreaking scenes of final abandonment, of mewling cats, and whimpering dogs. Belorussian writer Svetlana Alexievich compiled hundreds of interviews into a single monologue, an oral history of the forgotten. The devastating Chernobyl Prayer tells the story of: “dogs howling, trying to get on the buses. Mongrels, Alsatians. The soldiers were pushing them out again, kicking them. They ran after the buses for ages.” Heartbroken families pinned notes to their doors: “Don’t kill our Zhulka. She’s a good dog.”
View from an abandoned gym in the Pripyat ghost town of Chernobyl. H/T Vintagenews.com
There would be no mercy. Squads of soldiers were sent to shoot the animals left behind. Most died. Some escaped discovery, and survived.
Today the descendants of those dogs, some 900 in number, occupy an exclusion zone of some 1,600 square miles, slightly smaller than the American state of Delaware. They are not alone.
In 1998, 31 specimens of the Przewalski horse were released into the exclusion zone, which now serves as a de facto wildlife preserve. Not to be confused with the American mustang or the Australian brumby, the Przewalski horse is a truly wild horse and not the feral descendant of domesticated animals.
Named for the 19th-century Polish-Russian naturalist Nikołaj Przewalski, Equus ferus przewalskii split from ancestors of the domestic Equus caballus some 38,000 to 160,000 years ago, forming a divergent species in which neither taxonomic group is descended from the other. The last Przewalski stallion was observed in the wild in 1969, and the species has been considered extinct in the wild since that time.
Today approximately 100 Przewalski horses roam the Chernobyl Exclusion Zone, one of the larger populations of what may be the last truly wild horse alive today.
In 2016, US government wildlife biologist Sarah Webster worked at the University of Georgia. Webster and others used camera traps to demonstrate how wildlife had colonized the exclusion zone, even its most contaminated parts. A scientific paper on the subject is available online, if you're interested.
Ironically, the threat posed by humans outside the exclusion zone is greater for some species than that posed by radiation within the zone. Wildlife spotted within the exclusion zone include wolves, badgers, swans, moose, elk, turtles, deer, foxes, beavers, boars, bison, mink, hares, otters, lynx, eagles, rodents, storks, bats and owls.
Not all animals thrive in this place. Invertebrates like spiders, butterflies and dragonflies are noticeably absent, likely because their eggs are laid in surface soil layers which remain contaminated. Radionuclides settled in lake sediments affect populations of fish, frogs, crustaceans and insect larvae. Birds in the exclusion zone have difficulty reproducing. Those animals that do successfully reproduce often demonstrate albinism, deformed beaks and feathers, malformed sperm cells and cataracts.
Tales abound of giant mushrooms, six-pawed rabbits and three-headed dogs. While some such stories are undoubtedly exaggerated, few such mutants survive the first few hours, and those that do are unlikely to pass on the more egregious deformities.
Far from the post-apocalyptic wasteland of imagination, the Chernobyl exclusion zone is a thriving preserve for some, but not all, wildlife. Which brings us back to the dogs. Caught in a twilight zone neither feral nor domestic, the dogs of Chernobyl are unable to compete in the wild, nor are many of them candidates for adoption, due to radiation toxicity.
Since September 2017, a partnership between SPCA International and the US-based 501(c)(3) non-profit CleanFutures.org has worked to provide for the veterinary needs of these defenseless creatures. Over 450 animals have been tested for radiation exposure, given medical care and vaccinations, and spayed or neutered to bring populations within manageable limits. Many have been socialized for human interaction and successfully decontaminated, available for adoption into homes in Ukraine and North America.
For most, there is no future beyond this place, and a life expectancy unlikely to exceed five years.
Thirty-five years after the world's most devastating nuclear disaster, a surprising number of people work in this place on a rotating basis. Guards stationed at access points control who gets in and keep out unauthorized visitors, known as "stalkers."
The BBC wrote in April of this year about the strange companionship that has sprung up between these guards and the dogs of Chernobyl. Jonathon Turnbull, a PhD candidate in geography at the University of Cambridge, was the first outsider to recognize the relationship, and gave the guards disposable cameras with which to record the lives of these abandoned animals. The guards around this toxic sanctuary had but a single request: "please, please – bring food for the dogs".
Out of the mess of the Space Race emerged an idea destined to go down in the Hare-Brain Hall of Fame, if there is ever to be such a place. A show of force sufficient to boost domestic morale while showing the Russkies we mean business. It was the top-secret "Project A119," also known as A Study of Lunar Research Flights. We were going to detonate a nuclear weapon. On the moon.
As World War II drew to a close in 1945, there arose a different sort of conflict, a contest of wills, between the two remaining Great Powers of the world. The “Cold War” pitted the free market economy and constitutional republicanism of the United States against the top-down, authoritarian governing and economic models of the Soviet Union. The stakes could not have been higher, as each side sought to demonstrate its own technological and military superiority and, by implication, the dominance of its own economic and political system.
American nuclear preeminence lasted but four short years, coming to an end with the first successful Soviet atomic weapon test, code-named "First Lightning," carried out on August 29, 1949. Mutual fear and distrust fueled the Soviet-American "arms race," a buildup of nuclear stockpiles beyond any rational purpose. A generation grew up under the shadow of nuclear annihilation, in which a single mistake, misunderstanding, or one fool in the wrong place at the wrong time could initiate a sequence bringing about the extinction of life on this planet.
The arms race acquired the dimensions of a Space Race on July 29, 1956, when the United States announced its intention to launch an artificial satellite into earth orbit. Two days later, the Soviet Union announced that it aimed to do the same.
The early Space Race period was a time of serial humiliation for the American side, as the Soviet Union launched the first Inter-Continental Ballistic Missile (ICBM) on August 21, 1957, and the first artificial satellite “Sputnik 1” on October 4.
The first living creature to orbit the earth was the dog "Laika," launched aboard the spacecraft Sputnik 2 on November 3 and labeled by the more smartass specimens among the American commentariat as "Muttnik."
Soviet propaganda proclaimed “the first traveler in the cosmos”, replete with heroic images printed on posters, stamps and matchbook covers. The American news media could do little but focus on the politics of the launch, as animal lovers the world over questioned the ethics of sending a dog to certain death, in space.
On the American side, the giant Vanguard rocket was scheduled to launch a grapefruit-sized test satellite into earth orbit that September, but the program was plagued by one delay after another. The December 6 launch was a comprehensive disaster, the rocket lifting all of four feet from the pad before crashing to the ground in a sheet of flame, the satellite rolling free where it continued to beep, only feet from the burning wreck.
The second Vanguard launch was nearly as bad, exploding in flames only seconds after liftoff. Chortling Soviet leaders were beside themselves with joy, stamping the twin disasters as "Kaputnik" and "Flopnik."
Out of this mess emerged an idea destined to go down in the Hare-Brain Hall of Fame, if there is ever to be such a place. A show of force sufficient to boost domestic morale while showing the Russkies we mean business. It was the top-secret "Project A119," also known as A Study of Lunar Research Flights.
We were going to detonate a nuclear weapon. On the moon.
In 1957, newspapers reported a rumor. The Soviet Union planned a nuclear test explosion on the moon, timed to coincide with the lunar eclipse of November 7. A celebration of the anniversary of the Glorious October Revolution.
Edward Teller himself, the "Father of the H-Bomb," is said to have proposed such an idea as early as February, to test the effects of the explosion in a vacuum and in conditions of zero gravity.
Today, we take for granted the massively complex mathematics involved in hitting an object like the moon. In 1957 there was a very real possibility of missing the thing altogether, and of a boomerang effect returning the bomb from whence it came.
While the information is still classified, the project was revealed in 2000 by former NASA executive Leonard Reiffel, who said he was asked to "fast track" the program in 1958 by senior Air Force officials. A young Carl Sagan was all for the idea, believing at the time that living microbes might inhabit the moon, and that a nuclear explosion might help in detecting such organisms.
Reiffel commented in a Guardian newspaper interview: “It was clear the main aim of the proposed detonation was a PR exercise and a show of one-upmanship. The Air Force wanted a mushroom cloud so large it would be visible on earth. The US was lagging behind in the space race.” The now-retired NASA executive went on to explain that “The explosion would obviously be best on the dark side of the moon and the theory was that if the bomb exploded on the edge of the moon, the mushroom cloud would be illuminated by the sun.”
The Air Force canceled the A119 program in 1959, apparently out of concern that a 'militarization of space' would create public backlash, and that nuclear fallout might hamper future research and even colonization efforts on the moon.
Previously secret reports revealed in 2010 that Soviet leaders had indeed contemplated such a project, part of a multi-part program code named “E”. Project E-1 involved reaching the moon, while E-2 and E-3 focused on sending a probe around the far side of the celestial body. The final stage, project E-4, involved a nuclear strike on the moon as a “display of force”.
Construction plans for the aforementioned Hare-Brain Hall of Fame have yet to be announced, but it already appears the place may need another wing.
“So the astronaut who became a hero, looked to this black woman in the still-segregated South at the time as one of the key parts of making sure his mission would be a success.”
In plasma physics, the heliosphere is a vast cavity formed by the Sun, a "bubble" continuously "inflated" by plasma originating from that body, known as the "solar wind," separating our own solar system from the vastness of interstellar space. The outermost reach of the heliosphere comprises three major sections: the termination shock, the heliosheath, and the heliopause, so called because solar and interstellar winds meet there to form a zone of equilibrium.
Only five man-made objects have been set on trajectories carrying them beyond the heliosphere and into interstellar space: Pioneer 10 and 11, launched in 1972 and 1973; Voyager 1 and 2, launched in 1977; and New Horizons, which left earth in 2006. Of those five, only three remain active and continue to transmit data back to our little blue planet.
Voyager 2 Spacecraft
Spectacular images may be found online if you're inclined to look them up. Images such as this jaw-dropping shot of the "Blue Planet" Neptune, taken days before the point of closest approach in August 1989.
This picture of Neptune was taken by Voyager 2 less than five days before the probe’s closest approach of the planet on Aug. 25, 1989. The picture shows the “Great Dark Spot” – a storm in Neptune’s atmosphere – and the bright, light-blue smudge of clouds that accompanies the storm. Credit: NASA/JPL-Caltech
Or these images of the rings of Neptune, taken on this day thirty-two years ago before Voyager 2 left the last of the "gas giants" behind.
Voyager 2 took these two images of the rings of Neptune on Aug. 26, 1989, just after the probe’s closest approach to the planet. Neptune’s two main rings are clearly visible; two fainter rings are visible with the help of long exposure times and backlighting from the Sun. Credit: NASA/JPL-Caltech
Few among us are equipped to understand the complexity of such flight. Precious few. One such was a little girl, an American of African ancestry born this day in 1918 in White Sulphur Springs, West Virginia. The youngest of four born to Joylette and Joshua Coleman, Creola Katherine showed unusual mathematical skills from an early age.
In the 1920s, Greenbrier County, West Virginia didn't offer black children an education past the eighth grade. The Colemans arranged for their kids to attend high school two hours up the road in Institute, on the campus of West Virginia State College. Katherine took every math class offered by the school and graduated summa cum laude in 1937, with degrees in mathematics and French.
There were teaching jobs along the way at all-black schools and a marriage to Katherine’s first husband, James Goble. The couple would have three children together before James died of a brain tumor. Three years later she married James A. “Jim” Johnson.
With all that going on at home, Katherine found time to become one of only three black students, and the only female, selected to integrate the graduate school at West Virginia University after the Supreme Court ruling in Missouri ex rel. Gaines v. Canada.
Careers in research mathematics were few and far between for black women in 1952, but talent and hard work win out where ignorance fears to tread.
So it was that Katherine Johnson joined the National Advisory Committee for Aeronautics (NACA) in 1952. Johnson worked in a pool of women who would read the data from aircraft black boxes and carry out a number of mathematical tasks. She referred to her co-workers as "computers who wore skirts."
Flight research was a man's world in those days, but one day Katherine and a colleague were asked to fill in, temporarily. Respect is not given, it is earned, and Katherine's knowledge of analytic geometry made quick work of that. Male bosses and colleagues alike were impressed with her skills. When her "temporary" assignment was over, it no longer seemed all that important to send her back to the pool.
Katherine would later explain that barriers of race and sex continued, but she could hold her own. She took part in meetings where decisions were made, where no woman had been before. She'd simply tell them that she did the work and this was where she belonged, and that was the end of that.
Johnson worked as a human computer through most of the 1950s, calculating in-flight problems such as gust alleviation in aircraft. Racial segregation was still in effect in those days, according to state law and federal workplace segregation rules introduced under President Woodrow Wilson some forty years earlier. The door where she worked was labeled "colored computers," but Johnson said she "didn't feel the segregation at NASA, because everybody there was doing research. You had a mission and you worked on it, and it was important to you to do your job … and play bridge at lunch. I didn't feel any segregation. I knew it was there, but I didn't feel it."
“We needed to be assertive as women in those days – assertive and aggressive – and the degree to which we had to be that way depended on where you were. I had to be. In the early days of NASA women were not allowed to put their names on the reports – no woman in my division had had her name on a report. I was working with Ted Skopinski and he wanted to leave and go to Houston … but Henry Pearson, our supervisor – he was not a fan of women – kept pushing him to finish the report we were working on. Finally, Ted told him, “Katherine should finish the report, she’s done most of the work anyway.” So Ted left Pearson with no choice; I finished the report and my name went on it, and that was the first time a woman in our division had her name on something”.
Katherine Johnson
Katherine worked as an aerospace technologist from 1958 until retirement. She calculated the trajectory for Alan Shepard's May 1961 flight, the first by an American in space, working out the launch window for the Mercury mission and plotting navigational charts for backup in case of electronic failure. NASA was using electronic computers by the time of John Glenn's first orbit around the earth, but Glenn refused to fly until Katherine Johnson personally verified the computer's calculations. Author Margot Lee Shetterly commented, "So the astronaut who became a hero, looked to this black woman in the still-segregated South at the time as one of the key parts of making sure his mission would be a success."
Katherine Johnson retired in 1986 and lived to see six grandchildren and 11 "greats." Everyone should live to see their own great-grandchild. Not surprisingly, Johnson encouraged hers to pursue careers in science and technology.
President Barack Obama personally awarded Johnson the Presidential Medal of Freedom in 2015, for work spanning the Mercury program to the Space Shuttle. NASA noted her "historical role as one of the first African-American women to work as a NASA scientist."
A delightful side dish for this story is the Silver Snoopy award NASA gives for outstanding achievement, “For professionalism, dedication and outstanding support that greatly enhanced space flight safety and mission success.”
Following the Mercury and Gemini projects, NASA was searching for a way to focus employees and contractors alike on their own personal contribution to mission success. They wanted it to be fun and interesting, like the Smokey Bear character of the United States Forest Service. Al Chop of the Manned Spacecraft Center came up with the idea.
Peanuts creator Charles Schulz, a combat veteran of World War II and avid supporter of the space program, loved the idea. Schulz drew the character to be cast in a silver pin and worn into space by a member of the astronaut corps. It is this astronaut who personally awards his or her Snoopy to the deserving recipient.
The award is literally once in a lifetime. Of all NASA personnel and those of its many contractors, fewer than one percent have ever received the coveted Silver Snoopy.
Astronaut and former NASA associate administrator for education Leland Melvin personally awarded Johnson her own Silver Snoopy at the 2016 naming ceremony for the Katherine G. Johnson Computational Research Facility at NASA's Langley Research Center in Hampton, Virginia.
Astronaut and former NASA associate administrator for education Leland Melvin presents Katherine Johnson with a Silver Snoopy award. / Credit: NASA, David C. Bowman
A baby was born this day in 1906 in a small log cabin near Beaver, Utah. His name was Philo, the first-born child of Louis Farnsworth and Serena Bastian. He would grow to be the most famous man you probably never heard of.
Inventor Thomas Edison was once asked about his seeming inability to invent artificial light. "I have not failed," he explained. "I've just found 10,000 ways that won't work."
A baby was born this day in 1906 in a small log cabin near Beaver, Utah. His name was Philo, the first-born child of Louis Farnsworth and Serena Bastian. He would grow to be the most famous man you probably never heard of.
Birthplace of Philo Taylor Farnsworth
Philo was constantly tinkering. He was the kind who could look at an object and understand how it worked, and why this particular one didn't. The family moved when he was 12 to a relative's ranch near Rigby, Idaho. Philo was delighted to learn the place had electricity.
He found a burnt-out electric motor thrown out by a previous tenant and rewound the armature, converting his mother's hand-cranked washing machine to electric power.
It must've seemed like Christmas morning when he found all those old technology magazines in the attic. He even won a $25 prize one time in a magazine contest, for inventing a magnetized car lock.
Farnsworth was fascinated with the behavior of molecules and excelled in chemistry and physics at Rigby High School. Harrowing a field one day behind a team of two horses, his mind got to working. What if I could "train" electrons to work in lines like I'm doing here, with these horses? Electrons are so fast the human eye would never pick up the individual lines. Couldn't I use them to "paint" an electronic picture?
Image dissector
Philo sketched his idea of an "image dissector" for his science teacher, Mr. Tolman, who encouraged him to keep working on it. Justin Tolman kept the sketch, though neither could know it at the time: Farnsworth's 1922 drawing would one day prove decisive in a court of law over who invented all-electronic television.
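The idea Farnsworth hit on behind those horses is, at bottom, raster scanning: read a picture one horizontal line at a time, turning a two-dimensional image into a serial electronic signal, then reassemble the lines at the receiver faster than the eye can follow. A minimal sketch of the concept in Python, using an invented 4×4 toy "image" rather than anything from Farnsworth's actual design:

```python
# A toy illustration of raster scanning, the concept behind Farnsworth's
# image dissector. The 4x4 "image" below is invented for the example;
# values are pixel brightness, 0 (dark) to 255 (bright).

image = [
    [0, 255, 255, 0],
    [255, 0, 0, 255],
    [255, 0, 0, 255],
    [0, 255, 255, 0],
]
WIDTH = len(image[0])

def scan(picture):
    """Sweep the picture line by line, producing one serial signal."""
    signal = []
    for row in picture:       # each pass is one horizontal scan line
        signal.extend(row)    # brightness values, in order, become the signal
    return signal

def rebuild(signal, width):
    """Reassemble the serial signal into scan lines at the receiver."""
    return [signal[i:i + width] for i in range(0, len(signal), width)]

transmitted = scan(image)
assert rebuild(transmitted, WIDTH) == image   # the picture survives the trip
```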
From Japan to Russia, Germany and America, more than fifty inventors were working in the 1920s to invent television. History remembers the Scottish engineer John Logie Baird as the man who built and demonstrated the world's first electromechanical television. Amazingly, it was he who invented the first color TV tube as well.
Scotsman John Logie Baird invented the first (electromechanical) TV
It was all well and good, but Baird's spinning electromechanical disk was glacial compared with the speed of the electron. Clearly, the future of television lay in the field of electronics.
The Russian engineer Vladimir K. Zworykin applied for a US patent on an electron scanning tube in 1923, while working for Westinghouse. He wouldn't get the thing to work, though, until 1934. Meanwhile, Philo Taylor Farnsworth successfully demonstrated the first television signal transmission on September 7, 1927. The excited telegram Farnsworth sent to one of his backers exclaimed, "The damn thing works!"
Farnsworth's successful patent application in 1930 resulted in additional funding to support his work, and a visit from Vladimir Zworykin. RCA offered Farnsworth $100,000 for his invention and, when he declined their offer, took him to court over his patent.
“If it weren’t for Philo T. Farnsworth, inventor of television, we’d still be eating frozen radio dinners”.
Johnny Carson
What followed was a bruising, ten-year legal battle, a David vs. Goliath contest Farnsworth would win in the end, but at enormous cost, both financial and physical.
In another version of this story, the one that never happened, Philo Farnsworth went on to great fame and fortune, to enjoy the fruits of his talents and all his hard work. Instead, World War II happened. Farnsworth's hard-fought patent rights quietly expired while the world was busy with something else.
Ever the tinkerer, Farnsworth went on to invent a rudimentary form of radar, black light for night vision and an infrared telescope. Despite all that, his company never did run in the black. He sold the company to ITT in 1949.
From the 1950s on, the man's primary interest was nuclear fusion. In 1965 he patented an array of tubes he called "fusors," in which he actually started a 30-second fusion reaction.
Farnsworth never did enjoy good health. The inventor of all-electronic television died of pneumonia on March 11, 1971, with well over 300 patents to his name. Had you bought a television that day, you would have owned a device with no fewer than 100 inventions by this one man.
Ever the idealist, Farnsworth believed television would bring about ever greater heights of human learning and achievement, fostering a shared experience to bring about international peace and understanding. Much the same as some once believed of the internet, where the sum total of human knowledge would be available for a few keystrokes, and social media would foster new worlds of harmonious relations in which cheerful users discussed the collected works of Shakespeare, the Code of Hammurabi and the vicissitudes of life.
Right.
Farnsworth was dismayed by the dreck brought about by his creation. "There's nothing on it worthwhile," he would say, "and we're not going to watch it in this household. I don't want it in your intellectual diet…Television is a gift of God, and God will hold those who utilize his divine instrument accountable to him." – Philo Taylor Farnsworth
That all changed, if only a bit, on July 20, 1969. American astronaut Neil Armstrong stepped onto the surface of the moon and declared, "That's one small step for man, one giant leap for mankind." It was probably a misspeak. Most likely he intended to say "one small step for a man" but, be that as it may. The world saw it happen thanks to a miniaturized version of a device invented by Philo Farnsworth.
Farnsworth himself was watching, just like everyone else alive that day. Years later Farnsworth's wife Emma, he called her "Pem," would recall in an interview with the Academy of Television Arts & Sciences: "We were watching it, and, when Neil Armstrong landed on the moon, Phil turned to me and said, 'Pem, this has made it all worthwhile.' Before then, he wasn't too sure."
The car itself was destroyed long ago, the ingredients for its manufacture unrecorded, but the thing lives on in the hearts of hemp enthusiasts, everywhere.
The largest museum in the United States is located in the Detroit suburb of Dearborn: the Henry Ford Museum of American Innovation. The sprawling, 12-acre indoor-outdoor complex in the old Greenfield Village is home to JFK's presidential limo, the Rosa Parks bus and the Wright Brothers' bicycle shop. There you will find Abraham Lincoln's chair from Ford's Theater, along with Thomas Edison's laboratory and an Oscar Mayer Wienermobile. George Washington's camp bed is there, with Igor Sikorsky's helicopter and an enormous collection of antique automobiles, locomotives and aircraft.
One object you will not find there is Henry Ford’s plastic car. Made from soybeans.
As a young man, Henry Ford left the family farm outside of modern-day Detroit, and never returned. Ford’s father William thought the boy would one day own the place but young Henry couldn’t stand farm work. He later wrote, “I never had any particular love for the farm—it was the mother on the farm I loved”.
Henry Ford went on to other things, but part of him never left the soil. In 1941, the now-wealthy business magnate wanted to combine industry, with agriculture. At least, that’s what the museum says.
Ford gave the plastic car project to yacht designer Eugene Turenne Gregorie at first, but later turned to the Greenfield Village soybean laboratory. To the guy in charge over there, a guy with some experience in tool and die making. His name was Lowell Overly.
The car was made in Dearborn with help from scientist and botanist George Washington Carver (yes, that George Washington Carver), a man born into slavery who rose to such prodigious levels of accomplishment that Time magazine labeled him the "Black Leonardo."
George Washington Carver, at work in his library
The soybean car, introduced to the public this day in 1941, was made from fourteen quarter-inch-thick plastic panels and plexiglass windows attached to a tubular steel frame, weighing in at 1,900 pounds, about a third lighter than comparable automobiles of the era. The finished prototype was exhibited later that year at the Dearborn Days festival and the Michigan State Fair Grounds.
The thing was built to run on fuel derived from industrial hemp, a related strain of the green leafy herb beloved of stoners the world over.
Ford claimed he'd be able to "grow automobiles from the soil," a hedge against the metal rationing of World War Two. He dedicated 120,000 acres of soybeans to experimentation, but to no end. The total acreage devoted to "fuel" production went, somehow, unrecorded.
Another reason for a car made from soybeans was to help American farmers. In any case, Henry Ford had a "thing" for soybeans. He was one of the first in this country to regularly drink soy milk. At the 1934 World's Fair in Chicago, Ford invited reporters to a feast where he served soybean cheese, soybean crackers, soy bread and butter, soy milk and soy ice cream. If he wasn't the Bubba Gump of soybeans, perhaps Bubba Gump was the Henry Ford of shrimp.
Ford's own car was fitted with a soybean trunk and struck with an axe to demonstrate the material's durability, though the axe was later revealed to have had a rubber boot.
Henry Ford's experiment in making cars from soybeans never got past that first prototype, coming to a halt during World War II. The project was never revived, though several states adopted license plates stamped out of soybeans, a solution to the steel shortage that farm animals found quite delicious.
The car itself was destroyed long ago, the ingredients for its manufacture unrecorded, but the thing lives on in the hearts of hemp enthusiasts, everywhere.
The New York Times claimed the car body and fenders were made from soy beans, wheat and corn. Other sources opine that the car was made from Bakelite or some variant of Duroplast, a plant-based auto body substance produced in the millions, for the East German Trabant.
One newspaper claimed that nothing ever came from Henry Ford's soybean experiments, save and except for whipped cream.
Today, the idea that microorganisms such as fungi, viruses and other pathogens cause infectious disease is common knowledge, but such ideas were held in disdain among scientists and doctors well into the 19th century.
In the 12th century, French philosopher Bernard of Chartres talked about the concept of "discovering truth by building on previous discoveries." The idea is familiar to the English reader as expressed by the mathematician and astronomer Isaac Newton, who observed that "If I have seen further it is by standing on the shoulders of Giants."
Dr. Ignaz Semmelweis
Nowhere is there more truth to the old adage than in the world of medicine. In 1841, the child who survived to celebrate a fifth birthday could look forward to a life of some 55 years. Today, a five-year-old can expect to live to eighty-two, fully half again the figure of the earlier date.
Yet there are times when the giants who brought us here are unknown to us, as if they had never been. One such is Dr. Ignaz Semmelweis, one of the earliest pioneers of antiseptic medicine.
Semmelweis studied law at the University of Vienna in the fall of 1837, but switched to medicine the following year. He received his MD in 1844 and, failing to gain a clinical appointment in internal medicine, decided to specialize in obstetrics.
In the third century AD, the Greek physician Galen of Pergamon described the "miasma" theory of illness, holding that infectious diseases such as cholera, chlamydia and the Black Death were caused by noxious clouds of "bad air." The theory is discredited today, but such ideas die hard.
The germ theory of disease was first proposed by Girolamo Fracastoro in 1546 and expanded by Marcus von Plenciz in 1762. Single-cell organisms – bacteria – were known to exist in human dental plaque as early as 1683, yet their functions were imperfectly understood. Today, the idea that microorganisms such as fungi, viruses and other pathogens cause infectious disease is common knowledge, but such ideas were held in disdain among scientists and doctors well into the 19th century.
In the mid-19th century, birthing centers were set up all over Europe for the care of poor and underprivileged mothers and their illegitimate infants. Care was provided free of charge, in exchange for which young mothers agreed to become training subjects for doctors and midwives.
In 1846, Semmelweis was appointed assistant to Professor Johann Klein in the First Obstetrical Clinic of the Vienna General Hospital, a position similar to a "chief resident" of today.
At the time, Vienna General Hospital ran two such clinics, the First a "teaching hospital" for undergraduate medical students, the Second for student midwives.
Semmelweis quickly noticed that one in ten women, and sometimes one in five, was dying in the First Clinic of the postpartum infection known as "childbed fever," compared with fewer than 4% in the Second Clinic.
The difference was well known, even outside of the hospital. Expectant mothers were admitted on alternate days into the First or Second Clinic. Desperate women begged on their knees not to be admitted into the First, some preferring even to give birth in the streets, over delivery in that place. The disparity between the two clinics “made me so miserable”, Semmelweis said, “that life seemed worthless”.
He had to know why this was happening.
Childbed or "puerperal" fever was rare among these "street births," and far more prevalent in the First Clinic than the Second. Semmelweis carefully eliminated every difference between the two, even including religious practices. In the end, the only difference was the people who worked there.
The breakthrough came in 1847, following the death of Semmelweis' friend and colleague, Dr. Jakob Kolletschka. Kolletschka was accidentally cut by a student's scalpel during a post-mortem examination. The doctor's own autopsy showed a pathology very similar to that of the women dying of childbed fever. Medical students were going from post-mortem examinations of the dead to obstetrical examinations of the living, without washing their hands.
Midwife students had no such contact with the dead. This had to be it. Some unknown “cadaverous material” had to be responsible for the difference.
Semmelweis instituted a mandatory handwashing policy, using a chlorinated lime solution between autopsies and live patient examinations.
Mortality rates in the First Clinic dropped by 90 percent, to rates comparable with the Second. In April 1847, the First Clinic mortality rate was 18.3%, nearly one in five. Hand washing was instituted in mid-May, and the June rate dropped to 2.2%. July was 1.2%. For two months, the rate actually stood at zero.
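The cited monthly rates make that "90 percent" claim easy to check. A quick arithmetic sketch in Python, using only the figures quoted above:

```python
# Relative reduction in First Clinic mortality, from the monthly rates
# cited above (percent of admitted mothers who died).
before = 18.3          # April 1847, before mandatory handwashing
after_rates = {"June 1847": 2.2, "July 1847": 1.2}

for month, after in after_rates.items():
    drop = (before - after) / before * 100
    print(f"{month}: {before}% -> {after}%, a {drop:.0f}% relative drop")

# June works out to roughly an 88% drop and July to roughly 93%,
# consistent with the "90 percent" figure cited above.
```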
The European medical establishment celebrated the doctor’s findings. Semmelweis was feted as the Savior of Mothers, a giant of modern medicine.
No, just kidding. He wasn’t.
The imbecility of the response to Semmelweis' findings is hard to get your head around, and the doctor's own personality didn't help. The medical establishment took offense at the idea that they themselves were the cause of the mortality problem, and that the answer lay in personal hygiene.
Semmelweis himself was anything but tactful, publicly berating those who disagreed with his hypothesis and gaining powerful enemies. For many, the doctor's ideas were extreme and offensive, to be ignored, rejected, even ridiculed. Are we not Gentlemen!? Semmelweis was fired from his hospital position and harassed by the Vienna medical establishment, finally forced to move to Budapest.
Dr. Semmelweis was outraged by the indifference of the medical community, and began to write open and increasingly angry letters to prominent European obstetricians. He went so far as to denounce such people as “irresponsible murderers”, leading contemporaries and even his wife, to question his mental stability.
Dr. Ignaz Philipp Semmelweis was committed to an insane asylum on July 31, 1865, twenty-three years before Dr. Louis Pasteur opened his institute for the study of microbiology.
Semmelweis bust, University of Tehran
Barely two weeks later, on August 12, 1865, British surgeon and scientist Dr. Joseph Lister performed the first antiseptic surgery in medical history. Dr. Semmelweis died the following day at the age of 47, the victim of a blood infection resulting from a gangrenous wound sustained in a severe beating by asylum guards.
In an age of hand-lit sputtering fuses and hand-packed (to say nothing of hand-made) powder, even a millisecond difference in ignition will give one ball a head start, to be measured in feet.
In 1642, Italian gun maker Antonio Petrini conceived a double-barrel cannon with tubes joined at 45°, firing solid shot connected by a length of chain. This was the year of the "Great Rebellion," the English Civil War, when King and Parliament raised armies to go to war – with each other. Petrini's idea must have looked good to King Charles I of England. Imagine, a weapon capable of slicing through the ranks of his enemies like grass before a scythe.
The idea was to fire both barrels simultaneously, but there was the rub. The imagination conjures wild visions of imperfect combustion, and of a chained ball swinging around to take out its own gun crew. The King himself was mute on the subject and went on to lose his head in 1649. Petrini's manuscript resides to this day in the Tower of London. There is no documented evidence that the weapon was ever fired, save for the designer's own description of the 'Grandissima Ruina' left behind by his own splendid creation.
Two hundred years later, the former British colonies across the Atlantic were themselves embroiled in civil war.
In the early days of independence, the Confederate Congress enacted a measure allowing local cities and towns to form semi-military companies for the purpose of local defense. As the very flower of young southern manhood was called up and sent to the front, these "home guard" units often comprised middle-aged and older gentlemen, and others unable, for various reasons, to leave home and hearth.
Augustus Longstreet Hull was born in 1847 in "The Classic City" of Athens, Georgia, and enlisted in the Confederate Army on September 8, 1864.
After the war, Hull worked twenty-seven years as a banker before publishing the Annals of Athens in 1906. In it, Mr. Hull writes with not a little biting wit of his own hometown home guard unit, Athens' own Mitchell Thunderbolts.
"From the name one might readily infer that it was a company made up of fierce and savage men, eager for the fray and ready at all times to ravage and slaughter; yet such was not the case, for in all their eventful career no harm was done to a human being, no property was seized and not one drop of blood stained their spotless escutcheon."
Named for one of its own private soldiers, the Mitchell Thunderbolts were not your standard military company. These guys were "organized strictly for home defense" and absolutely refused to take orders. From anyone. They recognized no superior officer, and the right to criticism was reserved and freely exercised by everyone from that "splendid old gentleman" Colonel John Billups down to the lowliest private.
Georgia Senator Middleton Pope Barrow
General Howell Cobb sent Captain Middleton Pope Barrow, a future United States Senator, to Athens in 1864 to inspect the Thunderbolts. Having no intention of submitting to "inspection" by any mere stripling of a captain, Dr. Henry Hull (Augustus' father) "politely informed him that if he wished to inspect him, he would find him on his front porch at his home every morning at 9 o'clock".
John Gilleland, 53, was a local dentist, builder and mechanic, and a private soldier in good standing of the Mitchell Thunderbolts. Gilleland must have liked Petrini’s idea, because he took up a collection in 1862 and raised $350 to build the Confederate States of America’s own double-barrel cannon.
Measuring 13 inches wide by 4 feet 8½ inches long and weighing in at some 1,300 pounds, this monstrosity had two barrels diverging at 3°, equipped with three touch holes: one for each barrel and a third should anyone wish to fire the two together. It was the secret “super weapon” of the age, two cannonballs connected by a chain and designed to “mow down the enemy somewhat as a scythe cuts wheat.”
Yeah. As Mr. Petrini could have told them, the insurmountable problem remained. In an age of hand-lit sputtering fuses and hand-packed (to say nothing of hand-made) powder, even a millisecond difference in ignition will give one ball a head start, to be measured in feet. How to simultaneously fire two conjoined weapons remained a problem, even for so elite an outfit as the Mitchell Thunderbolts.
The atmosphere was festive on April 22, 1862, when a crowd gathered to watch Gilleland test the Great Yankee Killer. The gun was aimed at two poles stuck in the ground, but uneven ignition and casting imperfections sent assorted spectators scrambling for cover as the two balls spun wildly off to the side, where they “plowed up about an acre of ground, tore up a cornfield, mowed down saplings, and then the chain broke, the two balls going in different directions”.
Double Barrel Cannon model, H/T ModelExpo
On the second test, two chain-connected balls shot through the air and into a stand of trees. According to one witness, the “thicket of young pines at which it was aimed looked as if a narrow cyclone or a giant mowing machine had passed through”.
On the third firing, the chain snapped right out of the barrel. One ball tore into a nearby log cabin and destroyed the chimney, while the other spun off and killed a cow who wasn’t bothering anyone.
Gilleland considered all three tests successful, even though the only ones truly safe that day were those two target posts.
The dentist went straight to the Confederate States’ arsenal in Augusta, where Colonel George Rains subjected his creation to extensive testing before reporting the thing too unreliable for military use. The outraged inventor wrote angry letters to Georgia Governor Joseph “Joe” Brown and to the Confederate government in Richmond, but to no avail.
At last, the contraption was stuck in front of the Athens town hall and used as a signal gun, to warn citizens of approaching Yankees.
There the thing remained until August 2, 1864, when the gun was hauled out to the hills west of town to meet the Federal troops of Brigadier General George Stoneman. The double-barrel cannon was positioned on a ridge near Barber’s Creek and loaded with canister shot, along with several conventional guns. Outnumbered home guards did little real damage but the noise was horrendous, and Stoneman’s raiders withdrew to quieter pastures.
There were other skirmishes in the area, all of them minor. In the end, Athens escaped the devastation of Sherman’s march to the sea, and the Confederate superweapon was moved back to town.
Gilleland’s monstrosity was sold after the war and lost, for a time. The thing was recovered and restored back in 1891, and returned to the Athens City Hall where it remains to this day, a contributing property of the Downtown Athens Historic District. Come and see it if you’re ever in Athens, right there at the corner of Hancock and College Avenue. There you will find the thing, pointing north, at all those Damned Yankees. You know. Just in case.
The first atomic bomb in the history of human conflict exploded in the skies over Japan on August 6, 1945. The bomb, code-named “Little Boy”, detonated at an altitude of 1,900 feet over the city of Hiroshima at 8:15am, Japanese Standard Time.
Little Boy was a “gun-triggered” fission bomb: barometric-pressure sensors initiated the explosion of four cordite charges, propelling a small “bullet” of enriched uranium down the length of a fixed barrel and into a target of the same material. Within picoseconds (a picosecond is one trillionth, or 10⁻¹², of a second), the collision of the two bodies initiated a fission reaction, releasing an energy yield roughly equivalent to 15,000 tons of TNT.
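For the numerically inclined, here is a minimal back-of-the-envelope sketch of what that yield means in SI units. It assumes the standard convention of 4.184 × 10⁹ joules per ton of TNT, a convention not stated in the article:

```python
# Convert Little Boy's estimated yield to joules.
# Assumes the standard convention of ~4.184e9 J per ton of TNT.
JOULES_PER_TON_TNT = 4.184e9

yield_tons = 15_000                       # ~15 kilotons, per the article
yield_joules = yield_tons * JOULES_PER_TON_TNT
print(f"~{yield_joules:.1e} joules")      # ~6.3e13 J
```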
Some 66,000 people were killed outright by the effects of the blast. The shock wave spread outward at a velocity greater than the speed of sound, flattening virtually everything in its path for a mile in all directions.
Thirty-seven years before, the boreal forests of Siberia lit up with an explosion 1,000 times greater than the atomic bomb dropped over Hiroshima. At the time, no one had the foggiest notion that it was coming.
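Taking that multiple at face value, the arithmetic is simple enough; note the megaton figure below is an extrapolation from the article’s numbers, not a measurement:

```python
# The article's "1,000 times greater than Hiroshima," in TNT terms.
hiroshima_kilotons = 15
tunguska_kilotons = 1_000 * hiroshima_kilotons
print(f"~{tunguska_kilotons / 1_000:.0f} megatons of TNT")   # ~15 Mt
```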
The Taiga occupies the high latitudes of the world’s northern regions, a vast international beltline of coniferous forests, consisting mostly of pines, spruces and larches, between the high tundra and the temperate forest. An enormous community of plants and animals, this trans-continental ecosystem comprises a vast biome second only to the world’s oceans.
The Eastern Taiga is a region in the east of Siberia, an area 1.6 times the size of the continental United States. The Stony Tunguska River wends its way along an 1,160-mile length of the region, its entire course flowing under great pebble fields with no open water.
On the morning of June 30, 1908, the skies above the Tunguska lit up with a bluish-white light. At 7:17 a.m. local time, a column of light too bright to look at with the naked eye moved across the heavens. Minutes later, a vast explosion knocked people off their feet, flattening buildings, crops and as many as 80 million trees over an area of some 830 square miles. A vast “thump” was heard, the shock wave equivalent to an earthquake measuring 5.0 on the Richter scale. Within minutes came a second and then a third shock wave, and finally a fourth, more distant this time and described by eyewitnesses as the “sun going to sleep”.
On July 13, 1908, the Krasnoyaretz newspaper reported “At 7:43 the noise akin to a strong wind was heard. Immediately afterward a horrific thump sounded, followed by an earthquake that literally shook the buildings as if they were hit by a large log or a heavy rock”.
Fluctuations in atmospheric pressure were detectable as far away as Great Britain. Night skies were set aglow from Asia to Europe for days on end, theorized to have been caused by light passing through high-altitude ice particles.
In the United States, observation posts from the Smithsonian Astrophysical Observatory, headquartered in Cambridge, Massachusetts, to the Mount Wilson Observatory in Los Angeles recorded a months-long decrease in atmospheric transparency, attributed to an increase in dust suspended in the atmosphere.
The “Tunguska Event” was the largest such impact event in recorded history, but far from the first. Or the last. Mistastin Lake in northern Labrador was formed during the Eocene epoch, 36 million years ago; cubic zirconia deposits suggest an impact-zone temperature of some 4,300° Fahrenheit.
That’s roughly halfway to the surface temperature of the sun.
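A quick sanity check bears that out, assuming the sun’s photosphere (surface) temperature of roughly 5,772 kelvins, a standard figure not given in the article:

```python
# Compare the estimated impact-zone temperature to the sun's surface.
# Assumes a photosphere temperature of ~5,772 K (standard figure).
def fahrenheit_to_kelvin(f):
    return (f - 32) * 5 / 9 + 273.15

impact_zone_k = fahrenheit_to_kelvin(4_300)   # ~2,644 K
sun_surface_k = 5_772
print(f"{impact_zone_k / sun_surface_k:.0%} of the sun's surface temperature")
```

At about 46 per cent, “roughly halfway” holds up.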
“A bolide – a very bright meteor of an apparent magnitude of −14 or brighter” H/T Wikimedia
Some sixty-six million years ago, the “Chicxulub impactor” struck the Yucatán Peninsula of Mexico, unleashing a mega-tsunami some 330 feet in height from Texas to Florida. Superheated steam, ash and vapor towered over the impact zone, as colossal shock waves triggered global earthquakes and volcanic eruptions. Vast clouds of dust blotted out the sun for months on end, leading to mass extinction events the world over.
The official history of the Ming Dynasty records the Ch’ing-yang event of 1490, a meteor shower in China in which “stones fell like rain”. Some 10,000 people were killed, for all intents and purposes stoned to death.
In 2013, a twenty-meter (66-foot) space rock estimated at 13,000–14,000 tons entered the earth’s atmosphere over Chelyabinsk, Russia, on February 15, breaking apart with a kinetic impact estimated at 26 times the nuclear blast over Hiroshima. The superbolide (a bolide is “an extremely bright meteor, especially one that explodes in the atmosphere”) burned exposed skin and damaged retinas for miles around. No fatalities were reported, though 1,500 people were injured seriously enough to require medical attention.
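The standard kinetic-energy formula, KE = ½mv², shows where such multiples come from. In the sketch below, the ~19 km/s entry speed is a commonly cited figure rather than one from this article, and with it the multiple comes out somewhat above the 26 quoted here; estimates of this kind are very sensitive to the assumed mass and speed:

```python
# Rough kinetic energy of the Chelyabinsk object: KE = 1/2 * m * v^2.
# Mass per the article; the ~19 km/s entry speed is an assumption.
mass_kg = 13_000 * 1_000              # ~13,000 metric tons
speed_m_s = 19_000                    # ~19 km/s (assumed)

ke_joules = 0.5 * mass_kg * speed_m_s ** 2
hiroshima_joules = 15_000 * 4.184e9   # ~15 kt of TNT

print(f"~{ke_joules:.1e} J, about {ke_joules / hiroshima_joules:.0f}x Hiroshima")
```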
The 450-ton Chicora Meteor struck western Pennsylvania on June 24, 1938, in a cataclysm comparable to the Halifax Explosion of 1917. The good luck held that time, the object making impact in a sparsely populated region. The only reported casualty was a cow. Investigators F.W. Preston, E.P. Henderson and James R. Randolph remarked that “If it had landed on Pittsburgh there would have been few survivors”.
In 2018, the non-profit B612 Foundation, dedicated to the study of near-Earth object impacts, reported that “It’s a 100 per cent certain we’ll be hit [by a devastating asteroid]”. Comfortingly, the organization’s statement concluded “we’re [just] not 100 per cent sure when.”