February 13, 1961 A Prehistoric Spark Plug

Winston Churchill once quipped, “A lie gets halfway around the world before the truth has a chance to get its pants on.”  Guess he got that right.

Sometime around Easter Sunday, 1900, Captain Dimitrios Kondos set sail from the island of Symi. Kondos and a team of Greek sponge divers worked their way through the Peloponnese, across the Aegean en route to the rich fishing grounds off the coast of North Africa. The team was stopped and waiting for favorable winds off the Greek Island of Antikythera, when some of the divers thought they’d have a look around.

Elias Stadiatis descended some 150 feet, and quickly signaled that he wanted to come back up.  Stadiatis told a wild tale about a rocky bottom, strewn with the rotting corpses of men and horses. Dozens of them.

Greek sponge divers

The effects of nitrogen narcosis were well understood by this time: that lethal, narcotic state of drunkenness in which deep divers have been known to hand their regulators to fish. Captain Kondos was convinced that Stadiatis was drunk on nitrogen. He donned the canvas suit and brass helmet, and went down to look for himself.

The divers had discovered a 1st-century (BC) shipwreck, a treasure trove of statuary: four marble horses and thirty-six stone statues, including Hercules, Ulysses, Diomedes, Hermes and Apollo.

The most astonishing find from the wreck was a complex, clock-like mechanism, believed to have been built around 100–200 BC and vastly more sophisticated than anything known to have come from antiquity. In more recent years, computer X-ray tomography and high-resolution surface scanning have revealed the enormous sophistication of the “Antikythera mechanism”, an analog computer comprising some 37 exquisitely precise gear wheels, enabling the device to follow the moon and sun through the full cycle of the zodiac.

The thing can even reproduce the variable velocity of the moon, as the body speeds up through its perigee and slows through its apogee.
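The moon-tracking is usually described in reconstructions as a gear train with an effective ratio of 254:19, that is, 254 sidereal months in the 19-year Metonic cycle. A minimal Python sketch of the arithmetic (the astronomical constants are standard modern round values, not figures taken from the mechanism itself):

```python
from fractions import Fraction

# Reconstructions of the Antikythera mechanism describe a lunar gear train
# with an effective tooth-count ratio of 254:19, i.e. 254 sidereal months
# per 19-year Metonic cycle.
gear_ratio = Fraction(254, 19)

# Standard modern values: sidereal month ~27.321661 days,
# tropical year ~365.2422 days.
sidereal_months_per_year = 365.2422 / 27.321661

# The bronze gearing approximates the astronomical ratio remarkably closely.
error = abs(float(gear_ratio) - sidereal_months_per_year)
print(f"gear ratio: {float(gear_ratio):.5f}")
print(f"astronomy:  {sidereal_months_per_year:.5f}")
print(f"error:      {error:.5f}")
```

The two figures agree to within a few parts in a hundred thousand, which is the point: whoever cut those gears knew the astronomy.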

Antikythera mechanism, recreation

What those Greek sponge divers had discovered was an Out-of-Place Artifact (OOPArt), an object that calls into question our understanding of what came before.  OOPArts are artifacts of historical, archaeological, or paleontological interest, evincing either a more advanced technology than is known to have existed at the time, or a human (or at least intelligent) hand at a time and place where none is known to have existed.

OOPArts run the gamut from the genuinely surprising to risible hoaxes to the favorites of cryptozoologists, UFOlogists, paranormal enthusiasts and proponents of ancient-astronaut theories.  Some turn out to be objects of mistaken interpretation, based on little more than wishful thinking.

The Iron Pillar outside the Quwwat-ul-Islam Mosque in Delhi is believed to date from the fifth-century Gupta monarchs of India.  Standing 23 feet, 8 inches and weighing in at 13,000 pounds, the thing is almost entirely free of rust, demonstrating a level of metallurgical proficiency surprising for the time.

The Iron Pillar, Delhi

The “London Hammer” was found in June 1934 near London, Texas. It’s a common enough object, except it seems to be embedded in 400-million-year-old rock. Geologist J.R. Cole explains the conundrum:

London Hammer

“The stone is real, and it looks impressive to someone unfamiliar with geological processes. How could a modern artifact be stuck in Ordovician rock? The answer is that the concretion itself is not Ordovician. Minerals in solution can harden around an intrusive object dropped in a crack or simply left on the ground if the source rock (in this case, reportedly Ordovician) is chemically soluble”.

“Young Earth Creationist” Carl Baugh has other ideas, claiming the object to be a “monumental pre-flood discovery”. You can see the London Hammer and decide for yourself at the Creation Evidence Museum in Glen Rose, Texas.

USNS Eltanin photo, 1964

The “Eltanin Antenna” was photographed by the cargo-carrying icebreaker and oceanographic research vessel USNS Eltanin in 1964.  Located on the sea floor off the Antarctic coast, the object lies under 12,808 feet of water.

To many, the object is clearly the work of intelligent life, even extraterrestrials.  Author Brad Steiger has called it “an astonishing piece of machinery… very much like the cross between a TV antenna and a telemetry antenna”.

Other authorities have identified the object as Cladorhiza concrescens, an unusual carnivorous sponge.

Artist Karl Weingärtner created a mobile phone-style clay tablet for a museum display in 2012, complete with cuneiform script and keypad. Weingärtner posted a photo to his Facebook account to help sell his art. Some wag dubbed the thing “BabyloNokia”, and it was off to the races. The “Conspiracy Club” website ran the image with the caption: “800-Year-Old Mobile Phone Found In Austria? Check This Out.”

BabyloNokia

For the editors at UFO Sighting Daily, the BabyloNokia was proof positive that ancient astronauts had been here. Not to be outdone, the British tabloid Daily Express ran with Weingärtner’s image, claiming the object dated to the 13th century BC.

Winston Churchill once quipped, “A lie gets halfway around the world before the truth has a chance to get its pants on.”  Guess he got that right.

A large geode, lined with Amethyst crystals

Wallace Lane, Virginia Maxey and Mike Mikesell liked to prospect for geodes, near the California town of Olancha.

A geode is a hollow stone formation containing a secondary lining of crystals or other mineral matter.  Geodes form slowly, over geologic time.  There’s no way of knowing what’s inside until it’s broken or cut apart.

On this day in 1961, the trio discovered the “Coso Artifact”, a geode containing an unusual object.  A Champion spark plug.

A reader wrote to Desert Magazine, claiming a trained geologist had dated the thing at 500,000 years old.  The identity of the “trained geologist” went unsaid.

A number of pseudoscientific theories arose to explain the object:

• The spark plug was evidence of an ancient, advanced civilization, possibly proof of the long-lost city of Atlantis itself.
• Prehistoric extraterrestrial visitors came to Earth. How such creatures came to possess a “Champion” spark plug went unanswered.
• Human time-travelers from the future had left or lost the spark plug, thus proving their visit to the past.

The answer, it seems, was more prosaic.  Researchers determined, with help from the Spark Plug Collectors of America (who knew?), that this was a 1920s-era Champion spark plug, widely used in the engines of Model T and Model A Fords. The “geode” wasn’t that at all, but an accretion of iron and other minerals, produced as the object rusted in the ground.


Geologists from the University of Washington Earth and Space Science department were invited to inspect the thing again, just last year. Scientists confirmed the opinion that this was a 1920s-vintage plug but, I don’t know.

Sounds to me like someone’s still betting on the 500,000-year version of the story.

If you enjoyed this “Today in History”, please feel free to re-blog, “like” & share on social media, so that others may find and enjoy it as well. Please click the “follow” button on the right, to receive email updates on new articles.  Thank you for your interest, in the history we all share.

January 12, 1967 Cryonic Suspension

Suffering from an incurable and metastatic kidney cancer, Dr. James Hiram Bedford became the first person in history to be cryonically preserved on January 12, 1967. Frozen at the boiling point of liquid nitrogen, −321° Fahrenheit, he remains in cryonic suspension, to this day.

The human brain is an awesome thing. Weighing in at about 3 pounds, the organ comprises something like 86 billion neurons, each with a soma, or cell body; an axon to carry information away from the cell; and anywhere between a handful and a hundred thousand dendrites bringing information in. Chemical signals transmit information over minute gaps between neurons called synapses, about 1/25,000th to 1/50,000th of the thickness of a sheet of paper.
There are roughly a quadrillion such synapses, meaning that any given thought could wend its way through more pathways than there are molecules in the known universe. This is roughly the case whether you are Stephen Hawking, or Forrest Gump.
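As a sanity check on those round numbers (86 billion neurons, a quadrillion synapses, both of which are estimates rather than measurements), the implied average connectivity per neuron is easy to work out:

```python
# The article's round estimates: ~86 billion neurons, ~1 quadrillion synapses.
neurons = 86e9
synapses = 1e15

# Implied average number of synapses per neuron.
avg_per_neuron = synapses / neurons
print(f"{avg_per_neuron:,.0f} synapses per neuron")  # ≈ 11,628
```

About 11,600 synapses per neuron on average, comfortably inside the “handful to a hundred thousand dendrites” range quoted above.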
Figure: anatomy of a neuron: dendrites, cell body, nucleus, axon hillock, axon, myelin sheath, synaptic terminals and synapse; the signal travels from the presynaptic to the postsynaptic cell.

For all of this, the brain can store neither oxygen nor glucose (blood sugar), meaning there is a window of about six minutes after the heart stops, before the brain itself begins to die.
Legally, brain death occurs at “that time when a physician(s) has determined that the brain and the brain stem have irreversibly lost all neurological function”. Brain death defines the legal end of life in every state except New York and New Jersey, where the law requires that a person’s lungs and heart must also have stopped, before that person is declared legally dead.

“Information-theoretic death” is defined as death which is final and irreversible by any technology.  Clearly then, there is a gap, a small span of time, between the moment of legal death and a person’s permanent and irreversible passing.
So, what if it were possible to get down to the molecular level and repair damaged brain tissue? For that matter, when exactly does such damage become “irreversible”?
The Alcor Life Extension Foundation, the self-described “world leader in cryonics, cryonics research, and cryonics technology” explains “Cryonics is an effort to save lives by using temperatures so cold that a person beyond help by today’s medicine can be preserved for decades or centuries until a future medical technology can restore that person to full health”.
The practice is highly controversial, and not to be confused with Cryogenics, the study of extremely low temperatures, approaching the still-theoretical cessation of all molecular activity. Absolute zero.
The Cryogenic Society of America, Inc. includes this statement on its home page:
“We wish to clarify that cryogenics, which deals with extremely low temperatures, has no connection with cryonics, the belief that a person’s body or body parts can be frozen at death, stored in a cryogenic vessel, and later brought back to life. We do NOT endorse this belief, and indeed find it untenable”.
cryonic capsules
The modern era of cryonics began in 1962, when Michigan college physics professor Robert Ettinger proposed that freezing people might be a way to reach out to some future medical technology.
The Life Extension Society, founded by Evan Cooper in 1964 to promote cryonic suspension, offered to preserve one person free of charge in 1965. Dr. James Hiram Bedford was suffering from untreatable kidney cancer at that time, which had metastasized to his lungs.
Bedford became the first person to be cryonically preserved on January 12, 1967, frozen at the boiling point of liquid nitrogen, −321° Fahrenheit, and sealed up in a double-walled, vacuum cylinder called a “dewar”, named after Sir James Dewar, the 19th century Scottish chemist and physicist best known for inventing the vacuum flask, and for research into the liquefaction of gases.
Dr. James Hiram Bedford

Fifty-one years later, cryonics societies around the world celebrate January 12 as “Bedford Day”. Dr. Bedford has since received two new “suits”, and remains in cryonic suspension, to this day.

Advocates experienced a major breakthrough in the 1980s, when MIT engineer Eric Drexler began to publish on the subject of nanotechnology. Drexler’s work offered the hope that, theoretically, one day injured tissue may be repaired at the molecular level.
In 1988, television writer Dick Clair, best known for the sitcoms “It’s a Living”, “The Facts of Life”, and “Mama’s Family”, was dying of AIDS-related complications. In his successful suit against the state of California, “Roe v. Mitchell” (Dick Clair was John Roe), Judge Aurelio Munoz “upheld the constitutional right to be cryonically suspended”, winning that “right” for everyone in California.
Judge Aurelio Munoz

The decision failed to make clear who was going to pay for it.

As to cost, a Cryonics Institute (CI) video advertises a cryopreservation fee of $28,000, payable in monthly installments of $25.
Ted Williams went into cryonic preservation in 2002, despite a bitter controversy that split Williams’ first-born daughter, Bobby-Jo Williams Ferrell, from her two half-siblings, John-Henry and Claudia. The pair were adamant that the greatest hitter in baseball history wanted to be preserved, to be brought back in the future, while Ferrell pointed to his will, which specified that Williams be cremated and his ashes scattered off the Florida coast.
The court battle produced a “family pact” written on a cocktail napkin, which was ruled authentic and allowed into evidence. So it is that Ted Williams’ head went into cryonic preservation in one container, his body in another.
John-Henry, the younger Williams, died of leukemia two years later, despite a bone marrow donation from his sister. He joined his father in 2004.
Walt Disney has long been rumored to be in frozen suspension, but the story isn’t true. After his death in 1966, Walt Disney was interred at Forest Lawn Memorial Park in Glendale, California.
In April 1773, Benjamin Franklin wrote a letter to Jacques Dubourg. “I wish it were possible”, Franklin wrote, “to invent a method of embalming drowned persons, in such a manner that they might be recalled to life at any period, however distant; for having a very ardent desire to see and observe the state of America a hundred years hence, I should prefer to an ordinary death, being immersed with a few friends in a cask of Madeira, until that time, then to be recalled to life by the solar warmth of my dear country! But…in all probability, we live in a century too little advanced, and too near the infancy of science, to see such an art brought in our time to its perfection”.
Maybe so, but for the several hundred individuals who have plunked down $25,000 to upwards of $200,000 to follow Dr. Bedford into cryonic suspension, hope springs eternal.


January 11, 1693 The Wrath of God

“Then came an earthquake so horrible and ghastly that the soil undulated like the waves of a stormy sea, and the mountains danced as if drunk, and the city collapsed in one miserable moment killing more than a thousand people.” Eyewitness quoted by Stephen Tobriner: The Genesis of Noto: An Eighteenth-century Sicilian City

In his 1897 short story The Open Boat, Stephen Crane writes of the puniness of humanity when bared and exposed to the wrath of God, or of Nature, as you please.
“If I am going to be drowned — if I am going to be drowned — if I am going to be drowned, why, in the name of the seven mad gods, who rule the sea, was I allowed to come thus far and contemplate sand and trees?”

On this day in 1693, those Seven Mad Gods got together, and unleashed the wrath of the ages.

Deep in the ground beneath our feet, a rocky shell comprising an outer crust and an inner mantle closes off and contains the core of our planet. The inner core is solid; between that solid center and the rocky mantle lies an outer core of molten material, the core as a whole spanning roughly half the cross-section of planet Earth.

The air around us is a fluid, exerting a ‘weight’, or barometric pressure, of 14.696 pounds per square inch at sea level. Scientists estimate the pressure within this outer core to be approximately 3.3 million times atmospheric pressure, generating temperatures of 10,800° Fahrenheit, comparable to the surface of the sun.
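Those figures can be multiplied out for a back-of-envelope sense of scale; the sketch below uses only the article’s own numbers (14.696 psi at sea level, 3.3 million atmospheres, 10,800 °F):

```python
# Sea-level atmospheric pressure, in pounds per square inch.
ATM_PSI = 14.696

# Estimated outer-core pressure: ~3.3 million atmospheres.
core_pressure_psi = 3.3e6 * ATM_PSI

# Convert the quoted core temperature from Fahrenheit to Celsius.
core_temp_c = (10_800 - 32) * 5 / 9

print(f"core pressure ~ {core_pressure_psi:,.0f} psi")   # ≈ 48,496,800 psi
print(f"core temperature ~ {core_temp_c:,.0f} °C")       # ≈ 5,982 °C
```

Call it fifty million pounds per square inch, at a temperature within shouting distance of the Sun’s roughly 5,500 °C photosphere.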

That rocky shell closing us off from all that is actually quite elastic, broken into seven or eight major pieces (depending on how you define them) and several minor bits, called tectonic plates.

Over millions of years, these plates move apart along constructive boundaries, where oceanic plates form mid-oceanic ridges. Roughly equal and opposite to these are the Subduction Zones, where one plate moves under another and down into the mantle.

The planet is literally “eating” itself.

Sicily, the largest island in the Mediterranean and one of the twenty regions of Italy, lies on the convergent boundary of two such pieces of the planet’s outer shell, where the African plate is subducting beneath the Eurasian plate.  Over time, the forces built up along these subduction zones are nothing short of titanic.

Sicily is also home to the terrifying Mount Etna, one of the most active volcanoes in the world.

The first foretaste of what was about to happen began at 21:00 local time on January 9, 1693. The earthquake, centered on the east Sicilian coast and felt as far away as the south of Italy and the island nation of Malta, had an estimated magnitude of 6.2 on the Richter scale, and a perceived intensity on the Mercalli Intensity Scale of VIII–XI: Severe to Extreme. The Mercalli scale describes a Category XI Extreme earthquake:

“Few, if any, (masonry) structures remain standing. Bridges destroyed. Broad fissures in ground. Underground pipe lines completely out of service. Earth slumps and land slips in soft ground. Rails bent greatly”.

This thing was only stretching and yawning.  Just getting out of bed.


The main shock of January 11 lasted four minutes, with an estimated magnitude of 7.4 and a very large area that reached X on the Mercalli scale, and XI in the province of Syracuse.
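Magnitude scales are logarithmic, and radiated seismic energy grows as roughly 10^(1.5·M), so the step from the magnitude 6.2 foreshock to the 7.4 main shock is larger than the numbers suggest. A quick sketch of the standard rule of thumb:

```python
def energy_ratio(m1: float, m2: float) -> float:
    """Ratio of radiated seismic energy between two earthquake magnitudes,
    using the standard rule of thumb that energy scales as 10**(1.5 * M)."""
    return 10 ** (1.5 * (m2 - m1))

# January 9 foreshock (6.2) versus January 11 main shock (7.4):
print(round(energy_ratio(6.2, 7.4)))  # ≈ 63
```

The main shock released roughly sixty times the energy of the tremor two days earlier.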

The soil beneath our feet, ordinarily so substantial and unmoving, behaves like a liquid at times like this. Low-density, sandy soils compress in response to applied loads, while dense soils expand in volume, or dilate. Saturated soils become like quicksand, as underground liquids are driven up to form miniature volcanoes called “sand boils”, water spouting up from the ground in geysers rising 30 feet and more.

Sand boils resulting from the 2011 earthquake, in Christchurch

The catastrophic eruption of 1669 was well within living memory and reports describe minor eruptions on this day as well.  As if even a small volcanic eruption could be called “minor”.

Several large fractures opened in the earth, one 1,600 feet long and nearly seven feet wide.

Meanwhile the ocean withdrew from the coast, as the Ionian Sea gathered itself to strike. The initial withdrawal left the harbor dry at Augusta, damaging several galleys owned by the Knights of Malta.  The tsunami, when it came, was at least eight feet in height and possibly as high as 26 feet, inundating an area nearly a mile from the shore.

The final death toll of as many as 60,000 is uncertain, unsurprising in light of the fact that whole regions were blotted out. 63% of the entire population was wiped out in Catania, 51% in Ragusa. Syracuse, Noto, Augusta, Modica: all lost between one in five and one in three of their people.

Reconstruction in the wake of the catastrophe was so extensive as to spawn a new and unique form of art and architecture, known as Sicilian Baroque.

The Cathedral of Noto is one of the many buildings constructed in Sicilian Baroque style after the earthquake of 1693

Today, the colossal Mount Etna remains one of the most active volcanoes on earth.  Sensors placed along the landward and seaward flanks of the volcano have revealed the alarming discovery that the volcano itself is moving.  Mount Etna is sliding at a rate of an inch per year, and sometimes more.  One eight-day period in 2008 showed a movement of two inches, raising concerns that Mount Etna may one day collapse into itself.

On May 18, 1980, Mount St. Helens erupted after a 5.1 magnitude earthquake, resulting in 57 deaths and inflation-adjusted property damage of $3.3 billion.  The US Geological Survey called the resulting collapse of the north face of the volcano “the largest debris avalanche on earth, in recorded history”.  Should such an event strike the stratovolcano that is Mount Etna, the result would be felt from the Spanish coast to the shores of Israel, from North Africa to the French Riviera.

Given geologic time scales, such an event could happen next year, or ten thousand years from now.  No one knows.  We are so puny when compared with the Wrath of God, or of Nature, as you please.

Ruins of the Norman castle in Noto Antica

Featured image, top of page:  New life before the shattered ruins of the old city of Noto (Noto Antica), destroyed on January 11, 1693.  The new city of Noto was built eleven kilometers away.

December 11, 1970 The Man who saved a Billion People

It’s hard to get the modern head around the notion of “food insecurity”.  We’re not talking about what’s in the fridge. This is the problem of acute malnutrition, of epidemic starvation, of cyclical famine and massive increases in mortality, due to starvation and hunger-induced disease.

All too often, history is measured in terms of the monsters. The ten worst dictators of the last century and a half account for the loss of nearly 150 million lives. Most of us remember their names. At least some of them. Who remembers the name of the man who saved seven times as many lives as that whole Parade of Horribles destroyed, put together?


We live in a time and place where the National Institutes of Health (NIH) can report “The U.S. is one of the wealthiest countries in the world and accordingly has high obesity rates; one-third of the population has obesity plus another third is overweight”.

It wasn’t always so. In 1820, 94% of the world’s population lived in “absolute poverty.” American economic historian and scientist Robert Fogel, winner of the 1993 Nobel Prize in Economics, wrote that: “Individuals in the bottom 20% of the caloric distributions of France and England near the end of the eighteenth century, lacked the energy for sustained work and were effectively excluded from the labor force.”

It’s hard to get the modern head around the notion of “food insecurity”.  We’re not talking about what’s in the fridge. This is the problem of acute malnutrition, of epidemic starvation, of cyclical famine and massive increases in mortality, due to starvation and hunger-induced disease.

Nels Olson Borlaug once told his grandson Norman, “You’re wiser to fill your head now if you want to fill your belly later on.” An Iowa farm kid educated during the Great Depression, Norman Ernest Borlaug periodically put his studies on hold in order to earn money. As a Civilian Conservation Corps leader working with unemployed people on CCC projects, he saw many of his co-workers face persistent and very real hunger. Borlaug later recalled, “I saw how food changed them … All of this left scars on me”.

Borlaug earned his Bachelor of Science in Forestry in 1937. Nearing the end of his undergraduate education, he attended a lecture by Professor Elvin Charles Stakman on plant rust disease, a parasitic fungus which feeds on phytonutrients in wheat, oat, and barley crops.

Stakman was exploring special breeding methods, resulting in rust-resistant plants. The research greatly interested Borlaug, who later enrolled at the University of Minnesota, to study plant pathology under Stakman. Borlaug earned a Master of Science degree in 1940, and a Ph.D. in plant pathology and genetics, in 1942.

Borlaug attempted to enlist in the military following the attack on Pearl Harbor, but his application was rejected under wartime labor regulations. He was put to work in a lab, doing research for the United States armed forces.

Between 1939 and ’41, Mexican farmers suffered major crop failures, due to stem rust. In July 1944, Borlaug declined an offer to double his salary, traveling instead to Mexico City where he headed a new program focusing on soil development, maize and wheat production, and plant pathology.


“Pure line” (genotypically identical) plant varieties possess only one to a handful of disease-resistance genes. Random mutations of rusts and other plant diseases overcome pure line survival strategies, resulting in crop failures. “Multi-line” plant breeding involves back-crossing and hybridizing plant varieties, transferring multiple disease-resistance genes into recurrent parents. In the first ten years Borlaug worked for the Mexican agricultural program, he and his team made over 6,000 individual crossings of wheat. Mexico transformed from a net-importer of food, to a net exporter.

In the early sixties, Borlaug’s dwarf spring wheat strains went out for multi-location testing around the world, in a program administered by the US Department of Agriculture. In March 1963, Borlaug himself traveled to India with Dr. Robert Glenn Anderson, along with 220-pounds of seed from four of the most promising strains.


The Indian subcontinent experienced famine and starvation at this time, limited only by the US shipping one-fifth of its wheat production into the region in 1966–’67. Despite resistance from Indian and Pakistani bureaucracies, Borlaug imported 550 tons of seed.

American biologist Paul Ehrlich wrote in his 1968 bestselling book The Population Bomb, “The battle to feed all of humanity is over … In the 1970s and 1980s hundreds of millions of people will starve to death in spite of any crash programs embarked upon now.” Ehrlich went on: “I have yet to meet anyone familiar with the situation who thinks India will be self-sufficient in food by 1971…India couldn’t possibly feed two hundred million more people by 1980.”

Ehrlich could not have been more comprehensively wrong.


Borlaug’s initial yields were higher than those of any other crop ever harvested in South Asia. Countries from Pakistan to India to Turkey imported 80,000 tons and more of seed. By the time of Ehrlich’s book release in 1968, massive crop yields had replaced famine and starvation with a host of new problems. There were labor shortages at harvest, and insufficient numbers of bullock carts to haul the crop to the threshing floor. Jute bags were needed, along with trucks, rail cars, and grain storage facilities. Local governments even closed school buildings to use them for grain storage.


In three years, the world increase in cereal-grain production was nothing short of spectacular, dubbed a “Green Revolution”.   Borlaug won the Nobel Peace Prize in 1970, protesting that he was only “one member of a vast team made up of many organizations, officials, thousands of scientists, and millions of farmers – mostly small and humble…”

Norman Borlaug works with Chinese agricultural leaders, 1974

With mass starvation or widespread deforestation the only historic alternatives, the “Borlaug Hypothesis” introduced a third option: that of increasing yields on existing farmland.  The work, however, was not without critics. Environmentalists criticized what they saw as large-scale monoculture, in nations previously reliant on subsistence farming. Critics railed against “agribusiness” and the building of roads through what had once been wilderness.

David Seckler, Director General of the International Water Management Institute said “The environmental community in the 1980s went crazy pressuring the donor countries and the big foundations not to support ideas like inorganic fertilizers for Africa.”

Norman Borlaug, Nobel Prize Acceptance Speech, December 11, 1970

The Rockefeller and Ford foundations withdrew funding, along with the World Bank. Well-fed environmentalist types congratulated themselves on “success”, as the Ethiopian famine of 1984-’85 destroyed over a million lives. Millions more were left destitute, on the brink of starvation.

Borlaug fired back: “[S]ome of the environmental lobbyists of the Western nations are the salt of the earth, but many of them are elitists. They’ve never experienced the physical sensation of hunger. They do their lobbying from comfortable office suites in Washington or Brussels. If they lived just one month amid the misery of the developing world, as I have for fifty years, they’d be crying out for tractors and fertilizer and irrigation canals and be outraged that fashionable elitists back home were trying to deny them these things.”

Borlaug became involved in Africa at the invitation of Ryoichi Sasakawa, chairman of the Japan Shipbuilding Industry Foundation, who wondered why the methods used so successfully in Asia were not being employed there. Since that time, the Sasakawa Africa Association (SAA) has trained over 8 million farmers in SAA farming techniques. Maize yields in participating African countries have tripled, along with increased yields of wheat, sorghum, cassava, and cowpeas.


The world population when Ehrlich released his book in 1968 was about 3.53 billion. Today, that number stands at 7.7 billion and, when we hear about starvation, such events are almost exclusively man-made. The American magician and entertainer Penn Jillette once described Norman Borlaug as “The greatest human being who ever lived…and you’ve probably never heard of him.” Let that be the answer to the self-satisfied and well-fed environmentalist types.


“I now say that the world has the technology—either available or well advanced in the research pipeline—to feed on a sustainable basis a population of 10 billion people. The more pertinent question today is whether farmers and ranchers will be permitted to use this new technology? While the affluent nations can certainly afford to adopt ultra low-risk positions, and pay more for food produced by the so-called ‘organic’ methods, the one billion chronically undernourished people of the low income, food-deficit nations cannot.” – Norman Borlaug, 2000

December 9, 1952 Gasping for Air

When such a weather system occurs over areas with high levels of atmospheric contaminants, the resulting ground fog can be catastrophic. 63 people perished during a similar episode in 1930, in the Meuse River Valley area of Belgium. In 1950, 22 people were killed in Poza Rica, Mexico. In 1952, the infamous “Great Smog of London” claimed the lives of thousands, over a course of five days.

Actor Arnold Schwarzenegger delivers a speech during the opening of the COP24 UN Climate Change Conference in Katowice, Poland, December 3, 2018. Czarek Sokolowski/AP, H/T CBS News

Last week, climate activists and world leaders gathered in Poland to discuss carbon pollution resulting from the use of fossil fuels, and ways to combat what they see as a future of anthropogenic global warming.

Adherents to current climate change theories hold such ideas with a fervor bordering on the religious, while skeptics raise any number of questions, but one thing is certain. There was a time when the air and water around us were tainted with impunity, with sometimes deadly results.

In 1969, the Cuyahoga River in Cleveland, Ohio caught fire, resulting in property damage worth $100,000, equivalent to nearly $700,000 today. The fire prompted important strategies to clean up the river, but this wasn’t the first such fire. The Cuyahoga wasn’t even the first river to catch fire. There were at least thirteen such incidents on the Cuyahoga, the first occurring in 1868. The Rouge River in Michigan caught fire in the area around Detroit in 1969, and a welder’s torch lit up the Buffalo River in New York the year before. The Schuylkill River in Philadelphia caught fire from a match tossed into the water in 1892.

Fire on the Cuyahoga River
Cuyahoga River burning in 1952. H/T Getty Images

Today, the coal silts, oil and chemical contaminants at the heart of these episodes are largely under control in the developed world, but not the world over. One section of the Meiyu River in Wenzhou, Zhejiang, China burst into flame in the early morning of March 5, 2014. Toxic chemical pollution and other garbage dumped into Bellandur Lake in Bangalore, India caused part of the lake to catch fire the following year, the flames spreading to the nearby Sun City apartments.

If you happen to visit the “Iron City” of Pittsburgh, Pennsylvania, you may find photographs of streetlights turned on in the middle of the day. In November 1939, St. Louis brought new meaning to the term “Black Tuesday”, when photographs of the Federal building at Twelfth Boulevard and Market Street showed the sun as little more than a “pale lemon disk”, with streetlights on at 9:00 in the morning.

Federal Building, St. Louis
Federal Building, St. Louis

Air pollution turned deadly in the early morning hours of October 26, 1948, when an atmospheric inversion trapped fluorine gases over Donora, Pennsylvania, home of US Steel Corporation’s Donora Zinc Works and American Steel and Wire. By the 29th, the inversion had trapped so much grime that spectators gathered to watch a high school football game couldn’t see the kids on the field. The “Death Fog” hung over Donora for four days, killing 22 and putting half the town in the hospital.

mid-day
Donora Smog at Midday with streetlights on. H/T Donora Historical Society

The Donora episode was caused by an “anticyclone”, a weather event in which a large high-pressure system draws air down through its center and outward in a clockwise motion (in the Northern Hemisphere).

When such a weather system occurs over areas with high levels of atmospheric contaminants, the resulting ground fog can be catastrophic. Sixty-three people perished during a similar episode in 1930, in the Meuse River Valley of Belgium. In 1950, 22 people were killed in Poza Rica, Mexico. In 1952, the infamous “Great Smog of London” claimed the lives of thousands over the course of five days.

Nelson's_Column_during_the_Great_Smog_of_1952
Nelson’s Column during the Great Smog of 1952

On December 5, a body of cold, stagnant air descended over a near-windless London, trapped under a “lid” of warm air. London had suffered poor air quality since the 13th century, and airborne pollutants had combined to create “pea soupers” in the past, but this was unlike anything in living memory. Smoke from home and industrial chimneys, pollutants such as sulphur dioxide, and automobile exhaust combined, with nowhere to go.

Yellow-black particles of the stuff accumulated at an unprecedented rate. Visibility was down to a meter, and driving was all but impossible. Public transportation shut down, leaving those rendered sick by the fog to transport themselves to the hospital.  Outdoor sporting events were canceled, and even indoor air quality was affected.  Weather conditions held until December 9, when the fog dispersed.

hith-london-fog-2660357-AB

There was no panic; Londoners were quite accustomed to fog, but this one was different. Over the weeks that followed, public health authorities estimated that 4,000 people had died as a direct result of the smog by December 8, with another 100,000 made ill. Research pointed to another 6,000 losing their lives in the following months, as a result of the event.

More recently, research puts the death toll of the Great Smog at 12,000.

A similar event took place about ten years later, in December 1962, but without the same lethal impact. A spate of environmental legislation in the wake of the 1952 disaster had begun to remove black smoke from chimneys.  Financial incentives moved homeowners away from open coal fires toward cleaner-burning alternatives such as gas, oil, or coke.

Today, the wealthier, developed nations have made great strides toward improvement in air and water quality, though problems persist in the developing world.  In the United States, the Environmental Protection Agency (EPA) reports that:

“[B]etween 1980 and 2017, gross domestic product increased 165 percent, vehicle miles traveled increased 110 percent, energy consumption increased 25 percent, and U.S. population grew by 44 percent. During the same time period, total emissions of the six principal air pollutants dropped by 67 percent”.

The same report shows that, during the same period, CO2 emissions have increased by 12 percent.  Policy makers continue to wrangle with the long-term effects of carbon.  Now, it’s hard to separate the politics from the science.

While politicians and climate activists jet around the planet to devise trillion-dollar “solutions”, let us hope that cooler heads than that of Arnold Schwarzenegger prevail.  There is scarcely a man, woman or child among us who does not want clean air, clean water, and a beautiful natural environment around us, for ourselves and our posterity.  It’s only a matter of how we get there.

If you enjoyed this “Today in History”, please feel free to re-blog, “like” & share on social media, so that others may find and enjoy it as well. Please click the “follow” button on the right, to receive email updates on new articles.  Thank you for your interest, in the history we all share.

November 20, 1984 The Search for Extra-Terrestrial Intelligence

“The feeling is constantly growing on me, that I had been the first to hear the greeting of one planet to another.” Nikola Tesla, 1901

In the 5th century BC, the Greek philosopher Democritus taught that the world was made of atoms. Physically indestructible and always in motion, these atoms were infinite in number, differing only in shape and size. Democritus taught that everything around us is the result of physical laws without reason or purpose, the only question to be answered being, “What circumstances caused this event?”

Philosophers like Aristotle and Socrates took a less mechanistic approach, asking “What purpose did this event serve?” Plato disliked Democritus so much that he wanted to burn all his books.

The prevailing view throughout antiquity was that our planet is special, that we are alone in the cosmos. Democritus, by contrast, believed there were infinite numbers of worlds such as our own, with inhabitants like ourselves.

screen-shot-2013-08-28-at-2-03-17-pm
The 13th-century Paisley Abbey in Scotland had its deteriorating gargoyles refurbished in the 1990s. One of the stonemasons was clearly an Alien fan

In the time of Copernicus, it was widely believed that there was life on other planets. Astronomers saw several features of the moon as evidence, if not of life, then at least that intelligent life had once paid a visit.

Interest in Mars began to develop in the 1870s, when the Italian astronomer Giovanni Virginio Schiaparelli described physical features of the red planet as “canali”. The word means “channels” in Italian, but it was mistranslated as “canals”. The English-speaking world was off to the races.

Speculation and folklore about intelligent life on Mars were soon replaced by the popular near-certainty that the canals had been excavated by Martians.

The idea was near-universal by the turn of the century.  In 1900, the French Academy of Science offered a prize of 100,000 francs to the first person to make contact with an alien civilization. Provided that it was anything but Martian. That would have been too easy.

In 1901, Nikola Tesla believed he had picked up electrical disturbances “with such a clear suggestion of number and order”, they could only be signals from Mars. “The feeling is constantly growing on me,” Tesla said, “that I had been the first to hear the greeting of one planet to another. A purpose was behind these electrical signals.”

Guglielmo Marconi said essentially the same in 1919, commenting about “queer sounds and indications, which might come from somewhere outside the earth.”

mars_life

In 1924, the idea was put to the test. The American astronomer David Peck Todd believed that Martians might well attempt to communicate on the day the two planets were in closest proximity, August 21, 1924.  The date became “National Radio Silence Day”. Americans were urged to observe radio silence for the first five minutes of every hour, while a radio receiver at the U.S. Naval Observatory, two miles aloft aboard a dirigible, listened for the signal that never came.

The British author H. G. Wells wrote The War of the Worlds in 1897, telling the story of an invasion of earth by Martians fleeing the desiccation of their own planet. The story was adapted to a radio drama broadcast on the eve of Halloween, 1938, a production so realistic that many listeners sued the network for “mental anguish” and “personal injury”.

The idea of life on Mars persisted until the 1960s, when close observations of the Martian surface were made possible by the Mariner series of spacecraft.

SETI_Logotype_RGB_reduced_res

While much of “mainstream” science seems to steer clear of the subject, the SETI Institute jumped in with both feet, founded on this day in 1984 for the “sharing [of] knowledge as scientific ambassadors to the public, the press, and the government”.

The Berkeley SETI Research Center conducts a number of search operations at wavelengths from radio through the infrared to visible light, including:

SERENDIP: Search for Extraterrestrial Radio Emissions from Nearby Developed Intelligent Populations
SEVENDIP: Search for Extraterrestrial Visible Emissions from Nearby Developed Intelligent Populations
NIROSETI: Near-InfraRed Optical Search for Extraterrestrial Intelligence
Breakthrough Listen:  Launched with $100 million in funding in 2016, it is “the most comprehensive search for alien communications to date.”
SETI@home:  A “scientific experiment that uses Internet-connected computers in the Search for Extraterrestrial Intelligence.”

800px-GBT
The Robert C. Byrd Green Bank Telescope (GBT) in Green Bank, West Virginia is only one such installation put to use for project Breakthrough Listen, begun in 2016 as “the most comprehensive search for alien communications to date.”

Launched on May 17, 1999 with a worldwide objective of 50,000-100,000 home computers, SETI@home has to date seen more than 5.2 million users log over two million years of aggregate computing time. Since the introduction of the Berkeley Open Infrastructure for Network Computing, or “BOINC” (I didn’t make that up), SETI@home users can even compete with one another, to see who can process the most “work units”.
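For the curious, the “work unit” model behind SETI@home can be illustrated in a few lines of Python. This is only a toy sketch of the general volunteer-computing pattern, not the project’s actual code: a server splits a big pile of signal data into small units, volunteer machines each process one unit, and the server aggregates the results. All function names and the stand-in “analysis” (just finding the strongest sample) are invented for illustration.

```python
from queue import Queue

def make_work_units(samples, unit_size):
    """Server side: split raw signal samples into fixed-size work units."""
    return [samples[i:i + unit_size] for i in range(0, len(samples), unit_size)]

def process_unit(unit):
    """Client side: a stand-in for real signal analysis, reporting the strongest sample."""
    return max(unit)

def run_volunteers(units):
    """Simulate many clients draining the server's queue of work units."""
    queue = Queue()
    for u in units:
        queue.put(u)
    results = []
    while not queue.empty():
        # Each "volunteer" fetches one unit, crunches it, reports back
        results.append(process_unit(queue.get()))
    return results

signal = [0.1, 0.4, 0.2, 0.9, 0.3, 0.5]
units = make_work_units(signal, unit_size=2)   # three work units of two samples each
peaks = run_volunteers(units)
strongest = max(peaks)                          # server-side aggregation
```

The appeal of the design is that each unit is independent, so the work parallelizes across millions of unreliable home machines with no coordination beyond handing out units and collecting answers.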

You, too, can participate at http://setiathome.berkeley.edu/, on your Windows, Apple or Linux PC, or your Sony PlayStation 3.  Don’t try it at work though, an act known as “Borging”.   You might not be “assimilated”, but you will get fired.

Let me know if you make contact.

SETI-at-HOME-SETI@HOME

November 7, 1957 Nuke the Moon

The second Vanguard launch was nearly as bad as the first, exploding in flames only seconds after launch.  Soviet leaders were beside themselves with joy, and stamped the twin disasters “Kaputnik”.  “Flopnik”.

As World War II drew to a close in 1945, there arose a different sort of conflict, a contest of wills, between the two remaining Great Powers of the world. The “Cold War” pitted the free market economy and constitutional republicanism of the United States against the top-down, authoritarian governing and economic models of the Soviet Union. The stakes could not have been higher, as each side sought to demonstrate the superiority of its own technology, military might and, by implication, the dominance of its political and economic system.

American nuclear preeminence lasted but four short years, coming to an end with the first successful Soviet atomic weapon test, code named “First Lightning”, carried out on August 29, 1949. Mutual fear and distrust fueled the Soviet-American “arms race”, a buildup of nuclear stockpiles beyond any rational purpose. An entire generation grew up under the shadow of nuclear annihilation, in which a single mistake, a misunderstanding, or one fool in the wrong place at the wrong time could initiate a sequence bringing about the extinction of life on this planet.

nuclear

The arms race acquired the dimensions of a Space Race on July 29, 1956, when the United States announced its intention to launch an artificial satellite into earth orbit. Two days later, the Soviet Union announced its intention to do the same.

The early phase of the Space Race was a time of serial humiliation for the American side, the Soviet Union launching the first Inter-Continental Ballistic Missile (ICBM) on August 21, 1957, and the first artificial satellite “Sputnik 1” on October 4.

Laika and capsule

The first living creature to orbit the earth was the dog Laika, launched aboard the spacecraft Sputnik 2 on November 3 and labeled by the more smartass specimens among the American commentariat as “Muttnik”. Soviet propaganda proclaimed “the first traveler in the cosmos”, with heroic images printed on posters, stamps and matchbook covers. The American news media could do little but focus on the politics of the launch, as animal lovers the world over questioned the ethics of sending a dog to certain death in space.

On the American side, the Vanguard series rocket was scheduled to launch a grapefruit-sized test satellite into earth orbit in September, but the program was plagued by one delay after another.  The December 6 launch was a comprehensive disaster, the rocket lifting all of four feet off the pad before crashing to the ground in a sheet of flame, the satellite rolling free, where it continued to beep only feet from the burning wreck.

The second Vanguard launch was nearly as bad, exploding in flames only seconds after launch.  Soviet leaders were beside themselves with joy, and stamped the twin disasters “Kaputnik”.  “Flopnik”.

Out of this mess emerged an idea destined to go down in the Hare-Brain Hall of Fame, if there ever is such a place: a show of force sufficient to boost domestic morale, while showing the Soviets we meant business. It was the top-secret “Project A119”, also known as A Study of Lunar Research Flights. We would detonate a nuclear weapon, on the moon.

In 1957, newspapers reported a rumor that the Soviet Union planned a nuclear test explosion on the moon, timed to coincide with the lunar eclipse of November 7, celebrating the anniversary of the Glorious October Revolution. Edward Teller himself, the “Father of the H-Bomb”, is said to have proposed such an idea as early as February, to test the effects of the explosion in a vacuum and in conditions of low gravity.

644962_v1

Today, we take for granted the massively complex mathematics involved in hitting an object like the moon. In 1957, there was a very real possibility of missing the thing, and of the bomb returning to earth.

Though the information remains officially classified, the project was revealed in 2000 by former NASA executive Leonard Reiffel, who said he was asked by senior Air Force officials to “fast track” the program in 1958. A young Carl Sagan was all for the idea, believing at the time that living microbes might inhabit the moon, and that a nuclear explosion might help in detecting such organisms.

In an interview with the Guardian newspaper, Reiffel said “It was clear the main aim of the proposed detonation was a PR exercise and a show of one-upmanship. The Air Force wanted a mushroom cloud so large it would be visible on earth. The US was lagging behind in the space race.” The now-retired NASA executive went on to explain that “The explosion would obviously be best on the dark side of the moon and the theory was that if the bomb exploded on the edge of the moon, the mushroom cloud would be illuminated by the sun.”

The Air Force canceled the A119 program in 1959, apparently out of concern that a ‘militarization of space’ would create public backlash, and that nuclear fallout might hamper future research, and even colonization efforts, on the moon.

moon-nuke_wikimedia-commons_fb

Previously secret reports revealed in 2010 that Soviet leaders had indeed contemplated such a project, part of a multi-part program code-named “E”.  Project E-1 involved reaching the moon, while E-2 and E-3 focused on sending a probe around its far side. The final stage, Project E-4, involved a nuclear strike on the moon as a “display of force”.

Construction plans for the aforementioned Hare-Brain Hall of Fame have yet to be announced, but it appears the place may need another wing.
