March 21, 1905 Better Babies

At the height of the eugenics movement, some 30 states had passed legislation legalizing the involuntary sterilization of individuals considered “unfit” for reproduction. All told, some 60,000 individuals were forcibly sterilized in state-sanctioned procedures.

In 380 BC, Plato described a system of state-controlled human breeding in his Socratic dialogue “The Republic”, introducing a “guardian class” to watch over his ideal society.

In the 19th century, Francis Galton studied the theories of his cousin Charles Darwin on the evolution of species, applying them to a system of selective breeding intended to bring “better” human beings into the world. He called it his theory of “Eugenics”.

Eugenics gained worldwide respectability in the early 20th century, when countries from Brazil to Japan adopted policies regarding the involuntary sterilization of certain mental patients.

“Better Babies” competitions sprang up at state fairs across the United States, where babies were measured, weighed, and “judged”. Like livestock. By the 1920s, these events had evolved into “Fitter Family” competitions.

One of the leaders of the eugenics movement was the pacifist and founding president of Stanford University, David Starr Jordan. After writing several books on the subject, Jordan became a founding member of the Eugenics Committee of the American Breeders Association. The higher classes of American society were being eroded by the lower class, he argued. Careful, selective breeding would be required to preserve the nation’s “upper crust”.

Margaret Sanger

Margaret Higgins Sanger believed that birth control should be compulsory for “unfit” women, who “recklessly perpetuated their damaged genetic stock by irresponsibly breeding more children in an already overpopulated world.”

An early advocate for birth control, Sanger has her supporters to this day, including former Presidential candidate Hillary Rodham Clinton. “I admire Margaret Sanger enormously”, Clinton said.  “Her courage, her tenacity, her vision…”  Time Magazine points out that “Sanger opened the first birth-control clinic in the United States”, describing her as “An advocate for women’s reproductive rights who was also a vocal eugenics enthusiast…”

Detractors have described Sanger as a “thoroughgoing racist”, citing her own words in What Every Girl Should Know, published in 1910:  “In all fish and reptiles where there is no great brain development, there is also no conscious sexual control. The lower down in the scale of human development we go the less sexual control we find. It is said that the aboriginal Australian, the lowest known species of the human family, just a step higher than the chimpanzee in brain development, has so little sexual control that police authority alone prevents him from obtaining sexual satisfaction on the streets”.

Admire or detest the woman as you choose, Sanger’s work established organizations which later evolved into the Planned Parenthood Federation of America.

Around the world, eugenics policies took the form of involuntarily terminated pregnancies, compulsory sterilization, euthanasia, and even mass extermination.


Madison Grant, the New York lawyer best known for his work in developing the discipline of wildlife management, was a leader in the eugenics movement, once receiving an approving fan letter from none other than Adolf Hitler.

Public policy and academic types conducted three international eugenics conferences to discuss the application of programs to improve human bloodlines. The first such symposium convened in London in 1912, where papers on “racial suicide” and similar topics were discussed. Presiding over the conference was none other than Major Leonard Darwin, the son of Charles Darwin, with Harvard president emeritus Charles William Eliot serving as vice president.

The 1912 conference was followed by two more in 1921 and 1932, both held in New York City. Colleges and universities delved into eugenics as an academic discipline, with courses exploring the ethical and public policy considerations of eliminating the “degenerate” and “unfit”.

In Pennsylvania, 270 involuntary sterilizations were performed without benefit of law between 1892 and 1931. On March 21, 1905, the Pennsylvania legislature passed “An Act for the Prevention of Idiocy”, requiring that every institution in the state entrusted with the care of “idiots and imbecile children” be staffed by at least one skilled surgeon, whose duty it was to perform surgical sterilization. The bill was vetoed by then-Governor Samuel Pennypacker, only to return in 1911, ’13, ’15, ’17, ’19, and again in 1921.

At the height of the eugenics movement, some 30 states had passed legislation legalizing the involuntary sterilization of individuals considered “unfit” for reproduction. All told, some 60,000 individuals were forcibly sterilized in state-sanctioned procedures.

California forced Charlie Follett to undergo a vasectomy in 1945 at the age of 15, when Follett found himself abandoned by alcoholic parents.   He was only one of some 20,000 Californians forced to undergo such a procedure.

Roadside Marker, Raleigh, NC

Vermont passed a sterilization law in 1931, aimed at what then-University of Vermont zoology professor Henry Perkins called the “rural degeneracy problem.”  An untold number of “defectives” were forced to undergo involuntary sterilization, including Abenaki Indians and French-Canadian immigrants.

Indiana passed the first eugenic sterilization law in 1907, but the measure was legally flawed. To remedy the situation, the Eugenics Record Office (ERO), founded in 1910 by the former Harvard University zoology professor Charles Benedict Davenport, Ph.D., crafted a model statute, which the Commonwealth of Virginia adopted as state law in 1924.

That September, Dr. Albert Sidney Priddy, superintendent of the ‘Virginia State Colony for Epileptics and Feebleminded’, filed a petition to sterilize one Carrie Elizabeth Buck, an 18-year-old patient at the institution whom Priddy claimed to be “incorrigible”. A “genetic threat to society”. Buck’s 52-year-old mother had a record of prostitution and immorality, Priddy claimed, and the illegitimate child to whom Buck had recently given birth only proved the point.

Carrie Elizabeth Buck was born into poverty in Charlottesville, Virginia, the first of three children born to Emma Buck. Carrie’s father Frederick Buck abandoned the family, shortly after the marriage. Emma was committed to the “Virginia State Colony for Epileptics and Feebleminded” following accusations of immorality, prostitution, and having syphilis.

Buck’s guardian brought her case to court, arguing that compulsory sterilization violated the equal protection clause of the 14th Amendment. The case was heard first in the Amherst County Circuit Court, then appealed to the Virginia Supreme Court of Appeals and, finally, to the United States Supreme Court.

Dr. Priddy died along the way, Dr. John Hendren Bell taking his place. SCOTUS decided Buck v. Bell on May 2, 1927, ruling 8–1 that Carrie Buck, her mother, and her perfectly normal infant daughter were all “feeble-minded” and “promiscuous.”

“This photograph was taken on the eve of the initial trial of Buck v. Bell in Virginia. Mrs. Dobbs appears to be holding a coin, believed to be used as a test for alertness or mental acuity. Vivian appears to be looking elsewhere. It may have been on the strength of this test that Arthur Estabrook concluded that she ‘showed backwardness.’” H/T DNA Learning Center

In the majority ruling, Justice Oliver Wendell Holmes, Jr., did more than just greenlight the Virginia statute.  He urged the nation as a whole to get serious about eugenics, and to prevent large numbers of “unfit” from breeding:  “It is better for all the world“, Holmes wrote, “if instead of waiting to execute degenerate offspring for crime, or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind“. In writing about Carrie Buck herself, her mother and infant daughter Vivian, Holmes delivered one of the most brutal pronouncements in all American jurisprudence: “Three generations of imbeciles are enough.”

It was later revealed that Carrie Buck had been raped by a member of the Dobbs family, the foster family who had taken her in and later had her committed.  To save the family “honor”.  No matter.  Buck was compelled to undergo tubal ligation, later paroled from the institution to become a domestic worker with a family in Bland, Virginia.  Buck’s daughter Vivian was adopted by the Dobbs family.

In a later examination of the child, ERO field worker Dr. Arthur Estabrook pronounced her “feeble minded”, saying that she “showed backwardness” and lending support to the “three generations” theory later expressed in the SCOTUS opinion.

Vivian died from complications of measles in 1932, after only two years in school. Estabrook’s “backwardness” diagnosis is hard to square with her record over those two years; the little girl was listed on her school’s honor roll in April 1931.

March 10, 1876  The Speed of Sound

For all of Mark Twain’s abilities, he wasn’t much of an investor. The man turned down a ground-floor opportunity to invest in the telephone, in favor of a typesetting machine which actually made setting type more complicated than the age-old printer’s method of setting it by hand.

As the son of a speech pathologist and husband to a deaf wife, Alexander Melville Bell was always interested in sound. Since the profoundly deaf can’t hear their own pronunciation, Bell developed a system he called Visible Speech in 1864, to help the deaf learn and improve elocution.

Édouard Séguin, the Paris-born physician and educator best known for his work with the developmentally disabled and a major inspiration to Italian educator Maria Montessori, called the elder Bell’s work “…a greater invention than the telephone by his son, Alexander Graham Bell”.

As a boy, the younger Bell developed a method of carefully modulating his speech and speaking into his mother’s forehead, a method which allowed her to “hear” him, fairly clearly.  The boy followed in his father’s footsteps, mastering his elder’s work to the point of improving on it and teaching the system at the Boston School for Deaf Mutes (which operates today as the Horace Mann School for the Deaf), the American Asylum for Deaf-mutes in Hartford, Connecticut, and the Clarke School for the Deaf in Northampton, Massachusetts.

It was Alexander Graham Bell who first broke through to Helen Keller, a year before Anne Sullivan. The two developed a life-long relationship closely resembling that of father and daughter. Bell made it possible for Keller to attend Radcliffe and graduate in 1904, the first deaf-blind person ever to do so.

Keller used a braille typewriter to write her first autobiography in 1903, dedicating The Story of My Life, to her life-long friend, benefactor and mentor:  “To Alexander Graham Bell, who has taught the deaf to speak and enabled the listening ear to hear speech from the Atlantic to the Rockies.”

CREDIT: “[Alexander Graham Bell with Helen Keller and Annie Sullivan at the meeting of the American Association to Promote the Teaching of Speech to the Deaf, July 1894, in Chautauqua, N.Y.]” [1894, printed later]. Prints and Photographs Division of the Library of Congress.
A natural inventor, Bell was led to the telephone by his 1875 work on electrical telegraphy. He heard a “twang” on the line while working, leading him to investigate the possibility of using electrical wires to transmit sound.

Rival Elisha Gray was working on a similar concept, and filed a caveat (statement of concept) on February 14, 1876, mere hours after Bell applied for his patent.

Bell’s device first produced intelligible speech on March 10, that same year.  His diary entry describes the event: “I then shouted into M [the mouthpiece] the following sentence: “Mr. Watson, come here – I want to see you.” To my delight he came and declared that he had heard and understood what I said.  I asked him to repeat the words. He answered, “You said ‘Mr. Watson – come here – I want to see you.'” We then changed places and I listened at S [the speaker] while Mr. Watson read a few passages from a book into the mouthpiece M. It was certainly the case that articulate sounds proceeded from S. The effect was loud but indistinct and muffled”.


Many of Bell’s innovations came about much earlier than you might expect. One of his first inventions after the telephone was the “photophone,” a device enabling sound to be transmitted on a beam of light. Bell and his assistant, Charles Sumner Tainter, developed the photophone using a sensitive selenium crystal and a mirror which would vibrate in response to sound. In 1880, the pair successfully sent a photophone message from one building to another, a distance of over 200 yards.

Several innovations would later build on this accomplishment to produce the modern laser.

In the summer of 1881, Alexander Graham Bell hurriedly invented the first metal detector, as President James Garfield lay dying from an assassin’s bullet. The device was unsuccessful in saving the President, but was credited with saving many lives during the Boer War and WW1.

That was the year in which Bell’s infant son Edward died of respiratory problems, leading the bereaved father to design a metal vacuum jacket which would facilitate breathing. This apparatus was a forerunner of the iron lung used in the ’40s and ’50s to aid polio victims.  As many as 39 people still used an iron lung to breathe, as late as 2004.

The telephone was a commercial success, but that wasn’t a foregone conclusion. Looking for backers for his new enterprise, Bell approached Samuel Clemens in 1877. Better known as Mark Twain, the author declined the opportunity, believing the market to be confined to bridge-to-engine-room communications onboard maritime vessels.

Mark Twain

One of the towering figures of American literature, Samuel Clemens achieved considerable financial success during his lifetime but, for all his abilities, didn’t have much of an eye for opportunity. Mark Twain turned down a ground-floor invitation to invest in the telephone, choosing instead to buy into a typesetting machine which made setting type more complicated than the age-old printer’s method of doing it by hand.

Alexander Graham Bell’s creation would change the world but, to the end of his days, his work with the deaf gave him greatest satisfaction.

Bell would sell his invention, to finance his work on devices to aid the hearing-impaired.  He didn’t keep a phone on his desk, considering the thing to be an interruption and a nuisance.

Later in life, Alexander Graham Bell described his work with the deaf, as “more pleasing to me than even recognition of my work with the telephone.”



January 16, 2003 Columbia

Flight Director Jon Harpold stated the problem succinctly. “If it has been damaged it’s probably better not to know. I think the crew would rather not know. Don’t you think it would be better for them to have a happy successful flight and die unexpectedly during entry than to stay on orbit, knowing that there was nothing to be done, until the air ran out?”

Discussions of a reusable Space Transportation System (STS) began as early as the 1960s, as a way to cut down on the cost of space travel. The final design was a reusable, winged “spaceplane”, with disposable external tank and reusable solid fuel rocket boosters.

The ‘Space Truck’ program was approved in 1972, the prime contract awarded to North American Rockwell (later Rockwell International), with the first orbiter completed in 1976.

Early Approach and Landing Tests were conducted with the first prototype, dubbed “Enterprise”, in 1977. A total of 16 tests, all within the confines of the atmosphere, were conducted from February to October of that year, the lessons learned applied to the first spaceworthy vehicle in NASA’s orbital fleet.

STS-1, the first mission of the “Space Shuttle” program, launched aboard “Columbia” from the Kennedy Space Center on Merritt Island, Florida. It was April 12, 1981, the 20th anniversary of the first human spaceflight aboard the Soviet Vostok 1. This was the first and, to date, only manned maiden test flight of a new spacecraft system in the US space program.

This first flight of Columbia would be commanded by Gemini and Apollo veteran John Young, and piloted by Robert Crippen. It was the first of 135 missions in the Space Shuttle program, and one of only two to take off with the external fuel tank painted white. From STS-3 on, the external tank would be left unpainted to save weight.

There were initially four fully functional orbiters in the STS program: Columbia was joined after her first five missions by “Challenger”, “Discovery”, and finally “Atlantis”. A fifth orbiter, “Endeavour”, was built in 1991 to replace Challenger, which broke apart 73 seconds after lift-off on January 28, 1986, killing all seven of its crew.

All told, Columbia flew 28 missions with 160 crew members, traveling 125,204,911 miles in 4,808 orbits around the planet.

STS-107 launched from the Kennedy Space Center aboard the Space Shuttle Columbia on January 16, 2003. Eighty seconds after launch, a piece of insulating foam the size of a briefcase broke away from the external fuel tank, striking the leading edge of Columbia’s left wing and punching a hole in the reinforced carbon-carbon panels.

Those carbon panels are all that stand between the orbiter and the searing heat of re-entry. On the ground, mission management teams discussed the problem, without being certain of its extent. Even if there was major damage, little could be done about it. So, what to tell the crew?

Flight Director Jon Harpold stated the problem succinctly. “If it has been damaged it’s probably better not to know. I think the crew would rather not know. Don’t you think it would be better for them to have a happy successful flight and die unexpectedly during entry than to stay on orbit, knowing that there was nothing to be done, until the air ran out?”

So it was that Columbia’s 300 days, 17 hours, forty minutes and 22 seconds in space came to an end on the morning of February 1, 2003.

At 231,000 feet over the California coast, traveling 23 times the speed of sound, the orbiter was surrounded by gases as hot as 3,000°F, and that superheated air penetrated the interior of the left wing. Abnormal readings began to show up at Mission Control: first temperature readings, then tire pressures.


The first debris began falling to the ground near Lubbock, Texas, at 8:58 a.m. “Capcom”, the spacecraft communicator, called to discuss the tire pressure readings. At 8:59:32 a.m., Commander Husband called back from Columbia: “Roger,” he said, followed by another word.  It was cut off in mid-sentence.

After sixteen days in space, the STS-107 crew probably survived the initial breakup, losing consciousness in the seconds that followed: Rick Husband, commander; William McCool, pilot; Michael Anderson, payload commander; David Brown, Kalpana Chawla and Laurel Clark, mission specialists; and Ilan Ramon, payload specialist from the Israeli Space Agency.


Vehicle debris and crew remains were found in over 2,000 locations across Arkansas, Texas and Louisiana. The only survivors of the disaster were the worms in a canister, brought into space for study.

Payload Specialist Colonel Ilan Ramon, born Ilan Wolferman, was an Israeli fighter pilot, the first Israeli astronaut to join the NASA space program.

Colonel Ramon’s mother survived the Nazi death camp at Auschwitz.  His grandfather and several family members, did not.  In their memory, Ramon carried a copy of “Moon Landscape”, a drawing by 14-year-old holocaust victim Petr Ginz, depicting what he thought earth might look like, from the moon.

Today, there are close to 84,000 pieces of Columbia and assorted debris, stored in the Vehicle Assembly Building at the Kennedy Space Center. To the best of my knowledge, that drawing by a boy who never made it out of Auschwitz, was never found.

Left to right: David Brown, Rick Husband, Laurel Clark, Kalpana Chawla, Michael Anderson, William McCool and Ilan Ramon

Feature Image credit, top:  Space Shuttle Columbia Disaster, Chris Butler

January 12, 1967 Frozen

“I should prefer to an ordinary death, being immersed with a few friends in a cask of Madeira, until that time, then to be recalled to life by the solar warmth of my dear country”. – Benjamin Franklin

The human brain is an awesome thing. Weighing in at about 3 lbs, the organ comprises something like 86 billion neurons, each made up of a soma, or cell body, an axon to carry information away from the cell, and anywhere between a handful and a hundred thousand dendrites bringing information in. Chemical signals transmit information across minute gaps between neurons called synapses, each a vanishingly small fraction of the thickness of a sheet of paper.

There are roughly a quadrillion such synapses, meaning that any given thought could wend its way through more pathways than there are molecules in the known universe. This is roughly the case whether you are Stephen Hawking or Forrest Gump.

For all of this, the brain can store neither oxygen nor glucose (blood sugar), meaning there are only about six minutes after the heart stops before the brain itself begins to die.

Legally, brain death occurs at “that time when a physician(s) has determined that the brain and the brain stem have irreversibly lost all neurological function”. Brain death defines the legal end of life in every state except New York and New Jersey, where the law requires that a person’s lungs and heart must also have stopped, before that person is declared legally dead.

Clearly there is a gap, a small span of time, between the moment of legal death and a person’s permanent and irreversible passing. So, what if it were possible to get down to the molecular level and repair damaged brain tissue?  For that matter, when exactly does such damage become “irreversible”?

“Information-theoretic death” is death that is final and irreversible by any conceivable technology, as opposed to death that is merely beyond the reach of contemporary medical methodologies. For some, the gap between current legal and clinical definitions of death and the truly irretrievable is a source of hope for some future cure.

The Alcor Life Extension Foundation, the self-described “world leader in cryonics, cryonics research, and cryonics technology” explains “Cryonics is an effort to save lives by using temperatures so cold that a person beyond help by today’s medicine can be preserved for decades or centuries until a future medical technology can restore that person to full health”.

The practice is highly controversial, and not to be confused with Cryogenics, the study of extremely low temperatures, approaching the still-theoretical cessation of all molecular activity.  Absolute zero.

The Cryogenic Society of America, Inc. includes this statement on its home page: “We wish to clarify that cryogenics, which deals with extremely low temperatures, has no connection with cryonics, the belief that a person’s body or body parts can be frozen at death, stored in a cryogenic vessel, and later brought back to life. We do NOT endorse this belief, and indeed find it untenable”.

The modern era of cryonics began in 1962, when Michigan college physics professor Robert Ettinger proposed that freezing people might be a way to reach out to some future medical technology.

The Life Extension Society, founded by Evan Cooper in 1964 to promote cryonic suspension, offered to preserve one person free of charge in 1965. Dr. James Hiram Bedford was suffering from untreatable kidney cancer at that time, which had metastasized to his lungs.

Dr. James Hiram Bedford

Bedford became the first person to be cryonically preserved on January 12, 1967, frozen at the boiling point of liquid nitrogen, −321° Fahrenheit, and sealed up in a double-walled, vacuum cylinder called a “dewar”, named after Sir James Dewar, the 19th century Scottish chemist and physicist best known for inventing the vacuum flask, and for  research into the liquefaction of gases.

Fifty-one years later, cryonics societies around the world celebrate January 12 as “Bedford Day”.  Dr. Bedford has since received two new “suits”, and remains in cryonic suspension, to this day.

Advocates experienced a major breakthrough in the 1980s, when MIT engineer Eric Drexler began to publish on the subject of nanotechnology. Drexler’s work offered the hope that, theoretically, one day injured tissue may be repaired at the molecular level.

In 1988, television writer Dick Clair, best known for the sitcoms “It’s a Living”, “The Facts of Life”, and “Mama’s Family”, was dying of AIDS-related complications. In his successful suit against the state of California, “Roe v. Mitchell” (Dick Clair was John Roe), Judge Aurelio Munoz “upheld the constitutional right to be cryonically suspended”, winning that “right” for everyone in California.

The decision failed to make clear who was going to pay for it.

As to cost, the Cryonics Institute (CI) website explains, “A person who wishes to become a Lifetime CI Member can make a single membership payment of $1,250 with no further payment required. If a new member would rather pay a smaller amount up front, in exchange for funding a slightly higher cryopreservation fee later on ($35,000), he or she can join with a $75 initiation fee, and pay annual dues of only $120, which are also payable in quarterly installments of $35”.

Ted Williams went into cryonic preservation in 2002, despite the bitter controversy that split Williams’ first-born daughter, Bobby-Jo Williams Ferrell, from her two half-siblings, John-Henry and Claudia. The pair were adamant that the greatest hitter in baseball history had wanted to be preserved and brought back in the future, while Ferrell pointed to his will, which specified that Williams be cremated and his ashes scattered off the Florida coast.

The court battle produced a “family pact” written on a cocktail napkin, which was ruled authentic and allowed into evidence. So it is that Ted Williams’ head went into cryonic preservation in one container, his body in another.

John-Henry Williams died of leukemia two years later, despite a bone marrow donation from his sister, and joined his father in cryonic suspension in 2004.

Walt Disney has long been rumored to be in frozen suspension, but the story isn’t true. After his death in 1966, Walt Disney was interred at Forest Lawn Memorial Park in Glendale, California.

In April 1773, Benjamin Franklin wrote a letter to Jacques Dubourg. “I wish it were possible”, Franklin wrote, “to invent a method of embalming drowned persons, in such a manner that they might be recalled to life at any period, however distant; for having a very ardent desire to see and observe the state of America a hundred years hence, I should prefer to an ordinary death, being immersed with a few friends in a cask of Madeira, until that time, then to be recalled to life by the solar warmth of my dear country! But…in all probability, we live in a century too little advanced, and too near the infancy of science, to see such an art brought in our time to its perfection”.

Maybe so but, for the several hundred individuals who have plunked down $25,000 to upwards of $200,000 to follow Dr. Bedford into cryonic suspension, hope springs eternal.

January 5, 1709 Frost Fair

The science is politicized. Vast sums of public largesse and political capital are lavished on the climate.  We are told to expect global warming, and warned of a coming ice age. The skeptical taxpayer who has to pay for it all is forced to wade through competing narratives, in an exercise not unlike taking a sip from a fire hose. 

Over the past two weeks, temperatures have dipped near 0° Fahrenheit as far south as Alabama. The capital of Florida awoke only yesterday to snow in the palm trees, as frozen iguanas fell to the ground. Ice hangs from the Spanish moss of Savannah, as something called a “bomb cyclone” works its way toward the New England coast.


In July 1983, a temperature of -129°F was recorded at the Soviet Vostok Station in Antarctica, the coldest ever measured by ground instruments. NASA satellite data recorded a low of -135.8°F in August 2010.

Four years later, a Russian research ship full of environmental, scientific and activist types, the Akademik Shokalskiy, got stuck in Antarctic ice, as did the Chinese icebreaker Xue Long, which had come to their rescue.

Very few media outlets got around to reporting that they were there to study “global warming”.

The environmental activist types would object to my use of the term, preferring what they feel to be the more descriptive “climate change”. They’re right to prefer the term. We can all agree that climate is changing, five ice ages demonstrate that much, but it raises a question: how, exactly, will we know we’ve reached climate optimum?

In England, accounts of the River Thames freezing over date back as early as 250 AD. The river was open to wheeled traffic for 13 weeks in 923, and again in 1410. That time, the freeze lasted for 14 weeks. By the early 17th century, the Thames had become the scene of “Frost Fairs”.


The “Medieval Warm Period” lasting from 950 to 1250 was followed by the “Little Ice Age”, a 300-year period beginning in the 16th century.  King Henry VIII rode a sleigh down the Thames from London to Greenwich in 1536.  Elizabeth I was out on the ice shooting at archery targets, in 1564.

English writer John Evelyn describes the famous “Frost Fair” of the winter of 1683-’84:  “Coaches plied from Westminster to the Temple, and from several other stairs too and fro, as in the streets; sleds, sliding with skeetes, a bull-baiting, horse and coach races, puppet plays and interludes, cooks, tipling and other lewd places, so that it seemed to be a bacchanalian triumph, or carnival on the water”.

The Great Frost of the winter of 1708-09 brought the coldest weather Europe had seen in 500 years. William Derham, an English clergyman and natural philosopher best known for calculating a reasonably accurate estimate of the speed of sound, recorded a low of −12°C (10°F) on the night of January 5, 1709. It was the lowest he had measured since beginning readings in 1697, prompting the comment, “I believe the Frost was greater than any other within the Memory of Man”.

Some 24,000 Parisians died of cold in the next two weeks. Animals froze in their stalls, and crops planted the prior year failed. The resulting famine killed an estimated 600,000 in France alone while, in Italy, the lagoons and canals of Venice froze solid.

Breaks in cold weather inevitably marked the end of the frost fairs, sometimes all of a sudden. In January 1789, melting ice dragged away a ship that was tied to a riverside tavern in Rotherhithe. Five people were killed when the building was pulled down on their heads.

The last Thames River frost fair took place in 1814, the year someone led an elephant across the ice below Blackfriars Bridge. Structural changes to the river embankments and the demolition of the medieval London Bridge have since increased the flow of the Thames, making it possible that the river will never freeze over again.

Today, many blame weather extremes on “anthropogenic” (human) causes, associating what used to be called “global warming” with CO2. Others contend the reverse: that historic increases in carbon do not precede, but rather result from, climate extremes. A third group associates the sun with climate change (imagine that), linking an extended period of low solar activity called the “Maunder Minimum” with the brutal cold of 1645-1715.

The science is politicized. Vast sums of public largesse and political capital are lavished on the climate.  We are told to expect global warming, and warned of a coming ice age. The skeptical taxpayer who has to pay for it all is forced to wade through competing narratives, in an exercise not unlike taking a sip from a fire hose.

Meanwhile, the sun is going to do what the sun is going to do, which at the moment appears to be another quiet period in solar activity.  Very quiet. Before it’s over, we may find ourselves wishing for a little Global Warming.

Feature image, top:  The Battery, Charleston SC, January 2, 2018

January 3, 1870 Mr. Roebling’s Bridge

To aid in the work, Emily Roebling took a crash course in bridge engineering. For 11 years she studied higher mathematics, catenary curves, materials strength, and the intricacies of cable construction, all while acting as the pivot point on the largest bridge construction project in the world, and nursemaid to a desperately sick husband.

At the time it was built, the span across the East River linking Brooklyn with Manhattan was the longest suspension bridge in the world.

In 1869, civil engineer John Roebling had already invested two years in site work. With the ferry coming in, he should have paid more attention to his surroundings. Focused on his work on the pier, Roebling had his foot crushed as the boat docked, so badly that several toes had to be amputated.

“Lockjaw” is such a sterile term, it doesn’t begin to describe the condition known as Tetanus. In the early stages, the anaerobic bacterium Clostridium tetani produces tetanospasmin, a neurotoxin producing mild spasms in the jaw muscles. As the disease progresses, sudden and involuntary contractions affect most skeletal muscle groups, becoming so powerful that bones are fractured and the muscles tear themselves apart. These were the last days of John Roebling, the bridge engineer who would not live to see his most famous work.

Roebling was the first casualty of the project.  He would not be the last.

Roebling’s 32-year-old son Washington took over the project, beginning construction on January 3, 1870.

Enormous yellow pine boxes called “caissons” were built on the Brooklyn and New York sides of the river, descending at the rate of 6 inches per week in search of bedrock. Like giant diving bells, the caissons ended up at 78′ below mean high tide on the New York side, and 44′ on the Brooklyn side. Pressurized air was pumped into these boxes, keeping water and mud at bay as workers excavated the bottom. In 1872, these “sandhogs” began to experience a strange illness that came to be called “caisson disease”.

Civil War era submarine designer Julius Hermann Kroehl may have recognized what was happening, but Kroehl was five years in his grave by this time, victim of the same “fever”. Today we call it “the bends”. Pop the top off a soda bottle and you’ll see the principle at work. Without sufficient decompression time, dissolved gasses come out of solution and the blood turns to foam. Bubbles form in or migrate to any part of the body, resulting in symptoms ranging from joint pain and skin rashes, to paralysis and death.  The younger Roebling was badly injured as a result of the bends in 1872, leaving him partially paralyzed and bedridden, incapable of supervising construction on-site.

Roebling conducted the rest of the project from his apartment window, designing and redesigning details while his wife, Emily Warren Roebling, became the critical connection between her husband and the job site.

To aid in the work, Emily Roebling took a crash course in bridge engineering. For 11 years she studied higher mathematics, catenary curves, materials strength, and the intricacies of cable construction, all while acting as the pivot point on the largest bridge construction project in the world, and nursemaid to a desperately sick husband.

Emily Warren Roebling, the “first woman field engineer”.

Historian David McCullough wrote in his book, The Great Bridge: The Epic Story of the Building of the Brooklyn Bridge: “By and by it was common gossip that hers was the great mind behind the great work and that this, the most monumental engineering triumph of the age, was actually the doing of a woman, which as a general proposition was taken in some quarters to be both preposterous and calamitous. In truth, she had by then a thorough grasp of the engineering involved”.

Unlikely as it sounds, fires broke out at the bottom of the river on several occasions, started by workmen’s candles, fed by the oakum used for caulking, and turbocharged by all that pressurized air. On at least one occasion, the caisson had to be filled with millions of gallons of water, before the fire went out for good.

Brooklyn bridge builders

A footbridge connected the two sides in 1877, and soon the wires began to be strung. Wooden “buggies” carried men back and forth along wires suspended hundreds of feet above the water, as individual wires were woven into the four great cables that support the bridge. The work was exacting, with each wire bound together to precise specifications. Rumors about corruption and sleaze surrounded the project when J. Lloyd Haigh, the wire contractor, was discovered to be supplying inferior material. It was way too late to do anything about it, and 150 extra wires were bundled into each cable to compensate. The tactic worked.  Haigh’s shoddy wire remains there, to this day.

Construction was completed in 1883, the bridge opening for use on May 24. The first person to cross was Emily Roebling. Six days later, a rumor started that the bridge was about to collapse.  At least 12 people were killed in the resulting stampede. A year later, a publicity stunt by P. T. Barnum helped to put people’s minds at ease when Jumbo, the circus’ prize elephant, led a parade of 20 other elephants across the bridge.

Brooklyn Bridge

For a long time the span was called the “New York and Brooklyn Bridge” or the “East River Bridge”, officially becoming the “Brooklyn Bridge” only in 1915. At least 27 were killed in its construction: 3 from the bends, several from cable stringing accidents and others crushed under granite blocks or killed in high falls.

Even today, popular culture abounds with stories of suckers “buying” the Brooklyn Bridge. It was the longest suspension bridge in the world for its time, and would remain so until 1903. Roebling had designed his project to be six times the strength required for the job. Even with the defective cables installed, the bridge is four times as strong as it needs to be. Many of the Brooklyn Bridge’s contemporary structures have long since gone. Johann Augustus Röbling’s bridge carries 145,000 cars every day.


January 1, 1995 Rogue Wave

“None of the state-of-the-art weather forecasts and wave models—the information upon which all ships, oil rigs, fisheries, and passenger boats rely—had predicted these behemoths. According to all of the theoretical models at the time under this particular set of weather conditions, waves of this size should not have existed”.

From the time of Aristotle, mankind has looked to the field of scientific inquiry to explain the world around us.

From Copernicus to Charles Darwin to Stephen Hawking, the greatest of scientific minds have struggled to explain not only “that” the universe works, but also the “how” and the “why”.

We live in a near-miraculous age, when science has conquered complexities from space travel to molecular biology, medicine, climate and astrophysics.

Except sometimes, science has not the foggiest notion of why things happen.


In 1826, the French scientist and naval officer Captain Jules Dumont d’Urville described waves as high as 108.3 feet in the Indian Ocean. Despite having three colleagues as witnesses, d’Urville was publicly ridiculed by fellow scientists. “Everybody knew” at that time that no wave could exceed 30 feet. Walls of water the size of 10-story buildings simply didn’t exist.

Either that, or very few who’d ever seen such a thing lived to tell about it.

For nearly 100 years, oceanographers, meteorologists, engineers and ship designers have used a standard linear model to predict wave height.  This model suggests that there will hardly ever be a wave higher than 50 feet. One of 100 feet or larger is possible but unlikely, occurring maybe once in 10,000 years.
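
Where does that “once in 10,000 years” intuition come from? In the standard linear picture, individual wave heights scatter around the sea state’s significant wave height according to a Rayleigh distribution, and the odds of a truly tall wave fall off exponentially. The sketch below is my own illustration of that textbook formula, not anything taken from the article’s sources:

```python
# Illustrative only: the Rayleigh wave-height distribution used in standard
# linear wave theory. The chance that any single wave exceeds k times the
# significant wave height Hs is exp(-2 * k^2), so large multiples get rare, fast.
import math

def exceedance_probability(k: float) -> float:
    """P(wave height > k * Hs) under the Rayleigh model."""
    return math.exp(-2.0 * k ** 2)

for k in (1.0, 1.5, 2.0, 2.5):
    p = exceedance_probability(k)
    print(f"H > {k:.1f} x Hs: roughly 1 wave in {1 / p:,.0f}")
```

Run it and the tail collapses quickly: about one wave in seven exceeds the significant wave height, but only about one in 3,000 exceeds twice that, and one in a quarter-million exceeds two and a half times, which is why the linear model treats true monsters as once-in-an-age events.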

Serious study of the subject is younger than you might think.  The first scientific article on “freak waves” was written in 1964, by professor Lawrence Draper.   Far from ridiculing old sailors’ stories about monster waves, professor Draper posited not only that wave heights “can exceed by an appreciable amount the maximum values which have been accepted in responsible circles“, but also a terrifying phenomenon he called ‘freak wave holes’, the exact opposite of a rogue wave.  God help anyone caught at the bottom of one of those things.

Once considered mythical but for old sailors’ stories and the damage inflicted on the ships themselves, rogue waves got their first scientific measurement in the North Sea in 1984, when a 36′ wave was recorded off the Gorm oil platform in a relatively placid sea.

What really caught the attention of the science community was the “New Year’s wave” measured from the Draupner oil platform off the coast of Norway, on January 1, 1995. Laser instruments measured this thing at 84-ft. Damage to the platform above the water line confirmed the measurement.

Not to be confused with a tidal wave, or tsunami, caused by an underwater earthquake or volcanic eruption, a rogue wave is a different kind of animal. Oceanographers define a rogue wave as one measuring two or more times the significant wave height, the mean height of the highest one-third of waves in a given sea state.
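
That definition is simple enough to state in code. The sketch below is only an illustration, with an invented wave record rather than measured data: it takes significant wave height as the mean of the highest one-third of waves and flags anything at least twice that height.

```python
# Hypothetical record (metres, invented for illustration): mostly ordinary seas
# with one outsized wave. Hs is the mean of the highest one-third of waves;
# any wave of at least 2 * Hs meets the rogue-wave definition above.
def significant_wave_height(heights: list[float]) -> float:
    top_third = sorted(heights, reverse=True)[: max(1, len(heights) // 3)]
    return sum(top_third) / len(top_third)

def rogue_waves(heights: list[float]) -> list[float]:
    threshold = 2.0 * significant_wave_height(heights)
    return [h for h in heights if h >= threshold]

record_m = [2.2, 2.8, 3.4, 2.6, 3.1, 2.4] * 20 + [7.9]
print(f"Hs = {significant_wave_height(record_m):.2f} m")  # 3.37 m for this record
print(f"rogue candidates: {rogue_waves(record_m)}")       # [7.9]
```

Note that the outlier is counted when computing Hs itself, just as it is in real buoy and platform records; with a record of a few hundred waves, one giant barely moves the average.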


These literal freaks of nature are rare, unpredictable, appear and disappear without warning or trace, and are capable of sudden and catastrophic damage. Modern ships are designed to withstand a “breaking pressure” of about 15 metric tons per square meter (21 psi); a 39-ft wave in the usual linear model produces roughly 6 metric tons per square meter, just over a third of that. A rogue wave can generate breaking pressures of 100 metric tons per square meter (about 140 psi) and more.
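
As a quick sanity check on those figures, here is a small conversion sketch; the arithmetic is mine, while the tonne-per-square-metre loads are the ones quoted above:

```python
# Converting the breaking-pressure figures above from tonnes-force per square
# metre to pounds per square inch, to show the three numbers line up.
G_N_PER_KG = 9.80665      # standard gravity, N per kg
PA_PER_PSI = 6894.757     # pascals in one psi

def tonnes_per_m2_to_psi(load: float) -> float:
    return load * 1000.0 * G_N_PER_KG / PA_PER_PSI

for load in (6, 15, 100):
    print(f"{load:>3} t/m^2 is about {tonnes_per_m2_to_psi(load):6.1f} psi")
```

The output lands at roughly 8.5, 21 and 142 psi, matching the design and rogue-wave figures in the paragraph above.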

In 2000, the British oceanographic research vessel RRS Discovery measured individual waves up to 95.5-ft. Analysis of the data took years; the resulting study noted that “none of the state-of-the-art weather forecasts and wave models—the information upon which all ships, oil rigs, fisheries, and passenger boats rely—had predicted these behemoths. According to all of the theoretical models at the time under this particular set of weather conditions, waves of this size should not have existed”.


In December 1900, Thomas Marshall, James Ducat, and Donald McArthur vanished from the Flannan Isles lighthouse, in the Outer Hebrides of Scotland. No serious storm had been reported between the 12th and the 17th, and there was speculation about supernatural causes. Subsequent inspection revealed wave damage 200 ft. above sea level.

In 1909, the 500-ft. cargo liner SS Waratah disappeared off Durban, South Africa, with 211 passengers and crew.

The liner RMS Queen Mary was broadsided by a 92-ft. monster in 1942, nearly pushing the 1,019-ft vessel over on her side. The 81,961-ton liner listed all the way over to 52°, before slowly righting herself.

In 1966, heavy glass was smashed 80-ft. above the waterline of the Italian liner SS Michelangelo, killing three and tearing a hole in her superstructure.

In 1995, the 963-ft. RMS Queen Elizabeth 2 was forced to “surf” a 95-ft. behemoth to avoid being sunk. The ship’s master said this thing “came out of the darkness” and “looked like the White Cliffs of Dover.”

The 1975 sinking of the Edmund Fitzgerald is widely blamed on the 1-2-3 punch of a freak wave phenomenon peculiar to Lake Superior, known as the “three sisters”.


In 2007, NOAA compiled a catalog of over 50 historical incidents, most likely associated with rogue waves.

Serious scientific study of non-linear fluid dynamics began only 20-30 years ago. Researchers now believe that ‘super rogue waves’ of up to eight times the surrounding sea state are possible.

“The Perfect Storm”, the movie based on the Sebastian Junger book of the same title, depicts the last moments of the Gloucester fishing vessel Andrea Gail in September 1991

European Space Agency satellite radar studies have proven that waves cresting at 65 to 98-ft. occur far more regularly than previously believed. Rogue waves occur several times a day, in all of the world’s oceans. One three-week period in 2004 identified over ten individual giants measuring 82-ft. and above, in the South Atlantic, alone.

MIT researchers Themis Sapsis and Will Cousins, working with the Office of Naval Research, the Army Research Office and the American Bureau of Shipping, have combined high-resolution scanning technology with advanced algorithms to digitize and map the sea state in real time, and to predict the possible formation of rogue waves. The method gives only 2-3 minutes of warning, but that is enough. Research is ongoing. Lighthouse keepers, mariners and oil platform operators the world over anxiously await the results.
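
To make the idea concrete, here is a toy sketch, emphatically not the MIT/ONR algorithm, of what “flagging a suspicious patch of sea” might look like at its crudest: scan a surface-elevation record with a sliding window and report wave groups whose local energy stands well above the background. The window length and threshold here are arbitrary choices for illustration.

```python
# Toy illustration only; NOT the MIT/ONR prediction algorithm. It scans a
# surface-elevation record with a sliding window and flags wave groups whose
# local energy stands far above the background level.
def flag_energetic_groups(elevation_m: list[float],
                          window: int = 32,
                          ratio: float = 4.0) -> list[int]:
    """Return start indices of windows whose mean squared elevation exceeds
    `ratio` times the mean squared elevation of the whole record."""
    background = sum(e * e for e in elevation_m) / len(elevation_m)
    flagged = []
    for start in range(len(elevation_m) - window + 1):
        segment = elevation_m[start:start + window]
        local = sum(e * e for e in segment) / window
        if local > ratio * background:
            flagged.append(start)
    return flagged

if __name__ == "__main__":
    # flat background with one energetic burst in the middle (made-up trace)
    trace = [0.5, -0.5] * 100 + [4.0, -4.0] * 16 + [0.5, -0.5] * 100
    print(flag_energetic_groups(trace)[:3])
```

A real forecasting scheme works on measured wave fields and nonlinear wave equations rather than a single synthetic trace, but the basic question is the same: which groups of waves are concentrating enough energy, right now, to focus into something monstrous a few minutes from now?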