Review of “The Tangled Tree: A Radical New History of Life” by David Quammen

Think of the most famous invaders of all time: Attila, Genghis Khan, Napoleon, to name just three. Pretty important, but none nearly as significant or historically momentous as the invaders described in David Quammen’s The Tangled Tree: A Radical New History of Life, an engrossing tale of invasions over time and inside the cells of all living things. If you doubt biology can be fascinating, this book may change your mind.

David Quammen, an award-winning science writer, has written about Darwin before. Now he turns to scientific discoveries of just the past forty years that constitute a revolutionary revision of Darwin’s “tree” of life (although not, importantly, a repudiation of it). Because of both the electron microscope and the development of methods to sequence genes and compare genomes, we have become aware of aspects of life Darwin couldn’t even dream about. Much of this new knowledge is thanks in part to the seminal thinking of Dr. Carl Richard Woese, whose life and work form the scaffolding upon which Quammen unfolds the story. It was Woese who upended theories about what defines a species, what defines an individual, and whether the history of life really resembles a tree.

As most people know, Darwin postulated that evolution occurs as traits descend from parents to offspring and are very gradually modified based on mutations favorable for survival. The process resembled a tree, to Darwin’s thinking.

A page from Darwin’s Notebook B showing his sketch of the tree of life

But now we understand that, as Quammen explained on NPR, “innovation in genomes doesn’t always come gradually. Sometimes it comes suddenly, in an instant, by horizontal gene transfer. And that represents the convergence, not the divergence, of lineages.” This discovery means all the domains of life are much more interrelated than we thought.

In fact, we didn’t even know about the existence of one of the main domains of life, the archaea, until recently! (Scientists now divide all life into three domains: bacteria, archaea, and eukarya. Bacteria you are probably familiar with. Eukarya are organisms that have cells with a nucleus, and include plants and animals and human beings. Awareness of archaea, discovered by Carl Woese, is the first of the three big developments highlighted by Quammen, and will be expanded upon below.)

Woese with an RNA model at G.E. in 1961. Credit: Associated Press via NYT

The discovery of Horizontal Gene Transfer (HGT, also called Lateral Gene Transfer or LGT) as a pathway to heredity, and of its importance in the process of evolution, is an astounding development. It means cells can acquire genes from other cells around them, “horizontally,” rather than only vertically from a previous generation. In fact, gene sequencers have been astonished at just how much HGT has been going on. This does not mean gradual evolution through previous generations did not and does not occur, but rather that over time evolutionary change takes the shape of a tangled web more than a stereotypical tree.

Revision of Darwin’s tree by evolutionary biologist Carl R. Woese

More specifically, HGT has been responsible for some of the biggest developments in plants and animals. Both mitochondria and chloroplasts, those organelles helping animals and plants harness and process energy critical for cell survival, originated as bacterial cells that migrated across species to live inside primitive hosts. How do we know?

Mitochondria and chloroplasts resemble bacterial cells more than the cells of animals and plants. They even have their own separate DNA! They use that DNA, not their hosts’, to produce the proteins and enzymes they need to carry out their energy-producing functions; the host cell’s nuclear genome does not encode all of the proteins the organelles require. The organelles also replicate their own DNA, as bacteria do, and each is surrounded by a double membrane, further emphasizing their difference and separation.

Endosymbiosis from bacteria. Illustration by biology pioneer Lynn Margulis

We are all, that is to say, “composite creatures” – “mosaics” made up of all possible domains of life. When Walt Whitman said “I contain multitudes,” little did he know we in fact contain multitudes – of bacteria, archaea, and viruses that are an integral part of us. What “human” means involves different organisms that have formed symbiotic associations inside us, and can be passed on to our progeny.

[Wait, you may be thinking: to which domain do viruses belong? A tricky question! Whereas all of the three main domains of life replicate by cell division, viruses do not. Believe it or not, viruses are considered non-living, or at least, in a gray area somewhere between living and non-living, since they cannot reproduce on their own. Viruses are basically ultramicroscopic intracellular parasites. They can replicate only within other cells. Nevertheless, they play a large role in living organisms, especially through the mechanism of retroviruses, a whole area of research beyond the scope of this review. But suffice it to point out that it is thanks to retroviruses that animals have workable placentas to protect fetuses.]

Back to the living domains, the archaea are very odd but interesting. Like bacteria, they are microbial species (living things too small to see with the naked eye). But archaea and bacteria are made up of very different genetic material. Archaea tend to live in extreme environments, whether super hot, acidic, alkaline, deep in the ocean, or super cold. [Oh yes, and in the human colon, but if that’s not extreme, what is?]

Hydrothermal vents on the ocean floor, where the surrounding water can reach over 300° Celsius, are home sweet home for some archaeal species. Image adapted from: NOAA Photo Library

The reason archaea are exciting is that their ability to function in extreme environments gives us a glimpse of what the earliest life on earth was probably like, as well as what life on other planets might be like. Here’s another strange thing: archaeal DNA and RNA work much more like those of eukaryotes (i.e., us) than like those of bacteria. As Jennifer Frazer in “Scientific American” writes:

“These compelling similarities . . . between archaeal and eukaryotic cells has led some to suggest that in addition to the bacterial engulfment/symbiosis that created mitochondria and chloroplasts, some other more mysterious symbiosis or chimerism may have occurred between an ancient archaeon and bacterium to produce the first proto-eukaryotic cell. Or it may suggest that eukaryotes, in fact, evolved from archaea.”

You can read more about our possible ancestry from archaea here, in an article asking whether archaea are best viewed as our “sisters” or our “mothers.”


There is a lot more just waiting to be discovered in the field of molecular phylogenetics, which is the study of evolutionary relationships among biological entities by analyzing data at the molecular level. Quammen not only provides enough background for you to follow along (at least at the “popular science” level) but also gets you excited enough to do so.

He will have you pondering, along with scientists, how we can possibly define an “individual” given what we now know. Among other ideas, he will introduce you to “zooids,” or as writers of science fiction say, “hive beings.” Zooids are multicellular beings that exist only in reference to their group. Think of bees: they form a colony of organisms, each of which has a specialized function and none of which can survive independently. Another familiar zooid is the quaking aspen. These trees grow in clonal stands that share a single root system and thus are physiologically one single individual.

Quammen also sets you up with enough background to understand the current debate about CRISPR, a bacterial genetic mechanism that has been adapted into a tool for “editing” genetic codes. The days of heritable diseases could be ending, if the use of editing is found to be safe, effective, and ethical. (You can read about how CRISPR works, here.)

How CRISPR works, via Cambridge Univ. Press

I had only one criticism. Quammen very briefly raises the complexity-theory topic of emergent phenomena as a possible explanation for DNA, but then drops it. Although the book already covers so much, I wanted to hear more about those theories. [You can read an excellent short article on this subject in “Nature Magazine,” here. Among other things, the article explains: “Life itself is an example of an emergent property. For instance, a single-celled bacterium is alive, but if you separate the macromolecules that combined to create the bacterium, these units are not alive. Based on our knowledge of macromolecules, we would not have been able to predict that they could combine to form a living organism, nor could we have predicted all of the characteristics of the resulting bacterium.”]

Evaluation: Overall, I loved this book. Quammen is an excellent storyteller. In addition, it’s so full of exciting information that I felt compelled to share it with everyone I met while listening to it!

Rating: 4.5/5

Note: Longlisted for the National Book Award for Nonfiction and A New York Times Notable Book of 2018

Published in hardcover by Simon & Schuster, 2018

A Few Notes on the Audio Production:

This book was narrated admirably by Jacques Roy. I think it is a challenge to imbue a non-fiction science book with enthusiasm and emotional range, but he managed to do it.

Published unabridged on 11 CDs (approximately 14 listening hours) by Simon & Schuster Audio, 2018

Book Review of “How to Hide an Empire: A History of the Greater United States” by Daniel Immerwahr

The author begins this decidedly different approach to American history by pointing out that on December 7, 1941, Japanese planes attacked not just Pearl Harbor, but also the U.S. territories of the Philippines, Guam, Midway Island, and Wake Island. Yet Roosevelt chose to characterize the entire incident as an attack on Pearl Harbor. Immerwahr writes:

Roosevelt no doubt noted that the Philippines and Guam, though technically part of the United States, seemed foreign to many. Hawaii, by contrast, was more plausibly ‘American.’ Though it was a territory rather than a state, it was closer to North America and significantly whiter than the others. . . . Yet even when it came to Hawaii, Roosevelt felt a need to massage the point. Though the territory had a substantial white population, nearly three-quarters of its inhabitants were Asians or Pacific Islanders. . . . “

Thus, Roosevelt changed his announcement, and added that damage had been done to “American naval and military forces,” and “very many American lives” had been lost.

This history is illustrative of the main theme of this book, which emphasizes the deliberate invisibility of American territories outside of the mainland. Early on, the word “colonies” was determined to be anathema. “Territories” sounded better, if one had to discuss those places at all. Mostly, however, they were not and still are not discussed. Immerwahr observes, “One of the truly distinctive features of the United States’ empire is how persistently ignored it has been.”

In fact, as Immerwahr avers, the “logo map” of the United States most often shows the mainland, and more recently, includes Alaska and Hawaii. But, Immerwahr asks, “When have you ever seen a map of the United States that had Puerto Rico on it? Or American Samoa, Guam, the U.S. Virgin Islands, the Northern Marianas, or any of the other smaller islands the United States has annexed over the years?”

The logo map, the author argues, is not only misleading geographically:

It suggests that the United States is a politically uniform space: a union, voluntarily entered into, of states standing on equal footing with one another. But that’s not true, and it’s never been true. From the day the treaty securing independence from Britain was ratified, right up to the present, it’s been a collection of states and territories. It’s been a partitioned country, divided into two sections, with different laws applying in each.”

On the eve of World War II, nearly nineteen million people lived in American colonies, the great bulk of them in the Philippines. Although smaller than the British Empire, the United States empire was then the fifth largest in the world. Moreover, the racism that had pervaded the U.S. since slavery affected the territories as well, as the example of FDR’s reaction to the Japanese bombing illustrates so well. Colonial subjects were even called “niggers” to emphasize their “inferior” status, and were treated as badly as black citizens were on the mainland.

A report released during WWII noted that “Most people in this country [i.e., the U.S.], including educated people, know little or nothing about our overseas possessions.”

Of course, as we have found, even today many mainlanders do not realize Puerto Ricans are American citizens. A poll taken after Hurricane Maria found that only a slight majority of mainlanders, and only 37% of those under age 30, knew that fact. Even the U.S. President at the time seemed to be unaware of it. As one online article reported about Trump’s reaction to Hurricane Maria in 2017:

The small Caribbean island is a United States territory (technically an “unincorporated territory”) and has been since 1898, after the U.S. claimed victory in the Spanish-American War. Puerto Ricans have American citizenship and are able to travel throughout the U.S. mainland as they please – and it’s under the jurisdiction of the current occupant of the Oval Office. 

It appears that the president – who hails from New York, the state that contains the largest population of Puerto Ricans in the country – is either not entirely aware that Puerto Rico is not another country, or is refusing to acknowledge that it’s a part of the United States.” 

Immerwahr, an associate professor of history at Northwestern University, sets out to show what U.S. history would look like if it acknowledged the Greater United States rather than just the logo map version. That history has had three stages, in his view. The first was westward expansion: “the creation in the 1830s of a massive all-Indian territory [is] arguably the United States’ first colony.” The second stage moved off the continent, when the U.S. started annexing new territory overseas. The third stage involved a retreat and a ceding back of territory. He explores the reason why at some length.

Current U.S. territories

Although the United States prefers to see itself as a “republic,” the result of this self-deception has been costly for people in the colonies:

The logo map has relegated them to the shadows, which are a dangerous place to live. At various times, the inhabitants of the U.S. Empire have been shot, shelled, starved, interned, dispossessed, tortured, and experimented on. What they haven’t been, by and large, is seen.”

Immerwahr attempts, quite successfully in my view, to remedy that omission. He spends a great deal of time recounting the injustices committed in the colonies, far from the eyes of mainlanders. He notes:

The men sent to run the territories, unlike the trained administrators who oversaw European colonies, simply didn’t know much about the places to which they’d been assigned, and they cycled rapidly through their posts. Between Guam’s annexation in 1899 and World War II, it had nearly forty governors. FDR’s first governor of Puerto Rico, who served for six months, spoke no Spanish and left reporters with the distinct impression that he didn’t know where the island was. There was a period of several months when the territory of Alaska, which is half the physical size of India, didn’t have a single federal official in it.”

In the Philippines, in particular, he tells the story of how Daniel Burnham and others were sent to Manila to transform it, but for the colonizers, not the natives:

Such were the joys of empire. The colonies [in the Philippines in this instance] were, for men like [Daniel] Burnham, playgrounds, places to carry out ideas without worrying about the counterforces that encumbered action at home. Mainlanders could confiscate land, redirect taxes, and waste workers’ lives to build paradises in the mountains.”

Other stories of Colonial behavior are even worse, and hard to stomach. In perhaps the most egregious example, the author reports that the U.S. Department of Defense admitted in 2002 to having conducted chemical and biological warfare experiments on unwitting citizens in territories including Puerto Rico and the Marshall Islands. The first tests were carried out by a mainland doctor, Cornelius P. Rhoads, whose cancer research in a Puerto Rican hospital in the 1930s reportedly included injecting unknowing patients with cancer cells. Incredibly, Rhoads went on to serve in 1940 as director of Memorial Hospital for Cancer Research in New York, and then starting in 1945 was the first director of Sloan-Kettering Institute, and the first director of the combined Memorial Sloan–Kettering Cancer Center. Thanks to his unethical contributions to cancer research, Rhoads was featured on the cover of the June 27, 1949 issue of “Time Magazine” under the title “Cancer Fighter.”

The second experiments, using biological and chemical weapons, were performed by the U.S. military in the 1960s and 1970s at various locations, including Puerto Rico, Alaska, Hawaii, Florida, Canada, and the Marshall Islands. The experiments were performed outdoors, meaning civilians might also have been exposed to harmful chemical and biological agents.

All of these behaviors were enabled by three interconnected forces: racism; the imprimatur of laws passed in its service; and the invisibility that still obtains regarding the annexed areas.

The legal basis for the treatment of the colonies was established with the “Insular Cases,” a series of opinions by the U.S. Supreme Court in 1901 about whether or not people in newly acquired U.S. territories were citizens. (The term “insular” signifies that the territories were islands administered by the War Department’s Bureau of Insular Affairs.) In a number of related cases listed here, the Court held that full constitutional protection of rights did not automatically (or ex proprio vigore — i.e., of its own force) extend to all places under American control. This meant that inhabitants of unincorporated territories such as Puerto Rico—”even if they are U.S. citizens”—may lack some constitutional rights (e.g., the right to remain part of the United States in case of de-annexation).

Thus, the Insular Cases “authorized the colonial regime created by Congress, which allowed the United States to continue its administration—and exploitation—of the territories acquired from Spain after the Spanish-American War.” [See, Juan R. Torruella [First Circuit Judge], “Ruling America’s Colonies: The ‘Insular Cases,’” Yale Law & Policy Review, Vol. 32, No. 1 (fall 2013), pp. 57-95, online here.]

Today, the categorizations and implications put forth by the Insular Cases still govern the United States’ territories. The Harvard Law Review Blog writes, “Judge Torruella has become the most prominent critic of the Insular Cases, arguing forcefully that ‘the Insular Cases represent classic Plessy v. Ferguson legal doctrine and thought that should be totally eradicated from present-day constitutional reasoning.’”

Currently, Immerwahr reports, there are about four million people living in U.S. territories, in Puerto Rico, Guam, American Samoa, the U.S. Virgin Islands, and the Northern Marianas. He observes, “they’re subject to the whims of Congress and the president, but they can’t vote for either.”

The author concludes the book with a discussion of “birtherism,” discussing how the racism that has always tinged attitudes toward territories and former territories still intrudes into American politics.

Evaluation: This history reflects a great deal of research and as a bonus is written in a very accessible way, interweaving anecdotes about colonial players with facts that are horrifying and little reported. The book, dedicated “To the Uncounted,” should be a part of every U.S. history program.

Rating: 4.5/5

Published by Farrar, Straus and Giroux, 2019

October 14, 1985 – Ivory Coast Changes its Name to “Côte d’Ivoire”

Originally, Portuguese and French merchant-explorers in the 15th and 16th centuries divided the west coast of Africa, very roughly, into four “coasts” reflecting local economies. The coast that the French named the Côte d’Ivoire and the Portuguese named the Costa do Marfim (both literally mean “Coast of Ivory”) lay between what was known as “Upper Guinea” and “Lower Guinea.” There was also a “Pepper Coast,” also known as the “Grain Coast” (present-day Liberia), a “Gold Coast” (Ghana), and a “Slave Coast” (Togo, Benin, and Nigeria). Like those, the name “Ivory Coast” reflected the major trade that occurred on that particular stretch of the coast: the export of ivory.

In 1842 the French declared the area their protectorate. Formal French colonial rule was introduced in the 1880s following the scramble for Africa. In 1904, Ivory Coast became part of French West Africa until August 7, 1960 when the country regained independence from France.

Its first leader after independence was Prime Minister Félix Houphouët-Boigny, who received a letter from President Dwight D. Eisenhower on the same date recognizing the Republic of the Ivory Coast.

Ivory Coast in West Africa

By a decree dated October 14, 1985, this day in history, the Ivoirian government renamed the country “Côte d’Ivoire” and announced that it would no longer accept translations of this French name. According to Prof. Boubacar N’Diaye writing in “Not a Miracle After All… Côte d’Ivoire’s Downfall: Flawed Civil-Military Relations and Missed Opportunities,” in Scientia Militaria, South African Journal of Military Studies, Vol 33, Nr 1, 2005, the decision revealed the “special” relationship between the country’s elites and the French language.

Several coups were attempted beginning in 1999, ushering in years of rebellion and disputed elections. In 2015, Côte d’Ivoire held very successful presidential elections and President Ouattara peacefully won reelection. President Ouattara introduced a new constitution in 2016, approved in a nationwide referendum.

According to the U.S. State Department:

U.S.-Ivoirian relations have traditionally been friendly and close. . . . The U.S. Government’s overriding interests in Cote d’Ivoire have long been to help restore peace, encourage disarmament and reunification of the country, and support a democratic government whose legitimacy can be accepted by all the citizens of Cote d’Ivoire.”

In 1978, Ivory Coast overtook Ghana and other West African countries as the world’s leading producer of the cocoa beans used in the manufacture of chocolate, supplying approximately 38% of the world’s cocoa. Today the country is highly dependent on the crop, which accounts for 40% of national export income.

Ivory Coast and other West African cocoa producing nations have come under severe criticism in the west for using child slave labor to produce the cocoa purchased by Western chocolate companies. The bulk of the criticism has been directed towards practices in Ivory Coast.

A BBC article, for example, claimed that 15,000 children from Mali, some under age 11, were kidnapped and sold into slavery to work in cocoa production in Ivory Coast plantations:

In all, at least 15,000 children [from Mali] are thought to be over in the neighbouring Ivory Coast, producing cocoa which then goes towards making almost half of the world’s chocolate. Many are imprisoned on farms and beaten if they try to escape. Some are under 11 years old.”

A child rests with a machete at an Ivory Coast cocoa plantation. Children as young as 7 routinely work under dangerous conditions to harvest cocoa there, via U of Berkeley Law School

A 2018 article by journalist Oliver Balch in the online magazine Raconteur argues that the use of child slave labor has one simple cause: poverty. The author explains:

On average, cocoa-growing households earn $0.78 a day, less than one third of what the Fairtrade International defines as a living income of $2.51.”

The Washington Post adds:

The scope and scale of the problems in cocoa are staggering: An estimated 2.1 million children are engaged in hazardous work in the fields of the Ivory Coast and Ghana alone. The average cocoa farming household in the Ivory Coast earns just 37 percent of a living income. The average age of farmers in Ghana — the second-largest cocoa producer — is 52, and few young people see farming as an attractive vocation.

Still, our research showed that cocoa continues to be the best among few options for millions of small farmers. In the Ivory Coast . . . there simply aren’t alternatives that provide farmers with comparatively stable incomes and a certain level of land security.”

Fortune Magazine journalists also investigated the issue, writing:

For a decade and a half, the big chocolate makers have promised to end child labor in their industry—and have spent tens of millions of dollars in the effort. But as of the latest estimate, 2.1 million West African children still do the dangerous and physically taxing work of harvesting cocoa. What will it take to fix the problem?

Apparently greed is an insuperable barrier. The Washington Post reports that the world’s largest chocolate companies promised to eradicate the epidemic of child labor nearly 20 years ago. But they missed deadlines to uproot child labor from their cocoa supply chains in 2005, 2008 and 2010.

In February 2020, human rights advocates petitioned U.S. Customs and Border Protection to stop some of the world’s largest chocolate companies, including Nestlé; Mars; Hershey; Mondelez, the owner of the Cadbury brand; and other companies, from importing cocoa from Ivory Coast unless they could show that the chocolate was produced without forced or trafficked child labor.

A young boy from Burkina Faso, working in Ivory Coast cocoa fields, follows other children as they leave the cocoa farm where they work, via Washington Post

The U.S. State Department has indicated, however, that Ivory Coast appears ill-prepared to police child trafficking, saying the budget for the anti-trafficking program is “severely inadequate.” As State Department officials noted in a 2018 report, the primary police anti-trafficking unit is based in the nation’s economic capital, Abidjan, several hours away from the cocoa-growing areas, and its budget was about $5,000 a year.

Richard Scobey, president of the World Cocoa Foundation industry group, opposed the petition, out of concern, needless to say, for the poor farmers:

This irresponsible call for a U.S. ban on cocoa imports from Côte d’Ivoire will hurt, not help. It could push millions of poor farmers deeper into poverty, even though the vast majority of them are innocent of such practices, and threatens to damage the economy and security of a vital U.S. partner in West Africa.”

In the final analysis, chocolate companies do not want to give up their profits, and consumers do not want to give up their candy bars.

October 11, 1872 – Birth of Harlan F. Stone, 12th Chief Justice of the Supreme Court

Harlan Fiske Stone, born on this day in Chesterfield, New Hampshire, graduated from Columbia Law School in 1898. From 1899 to 1902 he taught law at Columbia Law School, becoming a professor in 1902 and the dean in 1910, a position he held until 1923. Columbia reports that as a teacher and as dean, Stone was known for taking a great interest in his students, who called themselves “Stone-Agers” in his honor. In 1946, the year of Stone’s death, the Columbia Law School faculty established the Harlan Fiske Stone Scholars. These scholarships are awarded each year in recognition of academic achievement by students in each of the three J.D. classes and in the LL.M. Program. That same year, the Harlan Fiske Stone professorship in constitutional law was established.

Harlan Fiske Stone, Chief Justice of the United States

From 1924 to 1925 he served as U.S. Attorney General under President Calvin Coolidge, with whom he had attended Amherst College.

In 1925, Coolidge nominated Stone to succeed retiring Associate Justice Joseph McKenna, and Stone won Senate confirmation with little opposition. On the Taft Court, Stone joined with Justices Holmes and Brandeis in calling for judicial restraint and deference to the legislative will. On the Hughes Court, Stone and Justices Brandeis and Cardozo formed a liberal bloc called the Three Musketeers that generally voted to uphold the constitutionality of the New Deal.

(“The Three Musketeers” were opposed by “the Four Horsemen,” consisting of Justices James Clark McReynolds, George Sutherland, Willis Van Devanter, and Pierce Butler. Chief Justice Charles Evans Hughes and Justice Owen J. Roberts controlled the balance.)

By 1941 most of the others on the court were gone, with only Stone and Roberts remaining.

Stone’s support of the New Deal was no doubt instrumental in leading to his nomination as Chief Justice by FDR in June 1941, following the retirement of Chief Justice Charles Evans Hughes. Stone was quickly confirmed by the United States Senate and sworn in on July 3. He remained in this post until his sudden death in 1946; his was one of the shortest terms of any Chief Justice. He was also the only justice to have occupied all nine seniority positions on the bench, having started out as the most junior Associate Justice, worked his way to most senior Associate Justice, and then become Chief Justice.

Stone’s tenure as Chief Justice was not without controversy. He upheld the President’s power to try Nazi saboteurs captured on American soil by military tribunals in Ex parte Quirin, 317 U.S. 1 (1942). The court’s handling of this case has been the subject of scrutiny and controversy. One scholar, for example, argues that irrespective of congressional authorization, such extrajudicial prosecution encroaches upon quintessential Article III functions and thereby violates the separation of powers.

Additionally, as Chief Justice, Stone described the Nuremberg court as “a fraud” to Germans and a “high-grade lynching party,” even though his colleague and successor as Associate Justice, Robert H. Jackson, served as the chief U.S. prosecutor.

According to William Rehnquist in a 2004 speech:

Stone’s biographer, Alpheus T. Mason, sums up Stone’s views of Jackson’s service this way: ‘For Stone, Justice Jackson’s participation in the Nuremberg Trials combined three major sources of irritation: disapproval in principle of non-judicial work, strong objection to the trials on legal and political grounds, the inconvenience and increased burden of work entailed. Even if the Chief Justice had wholly approved the trials themselves, he would have disapproved Jackson’s role in them. If he had felt differently about the task in which Jackson was engaged, he might have been somewhat less annoyed by his colleague’s absence.'”

It is worth noting, (again per Rehnquist):

One of Stone’s complaints was that he first learned of Jackson’s acceptance of the role of prosecutor when it was announced by President Truman. One would think that Jackson would have at least consulted Stone before accepting the job; not that Stone had any authority to forbid his taking it, but that advance notice would have made it more palatable to Stone even though he still disagreed.”

Just imagine if he had learned of the appointment by tweet….

Stone’s death came after he was suddenly stricken while in an open session of the Supreme Court. Justice Hugo Black called the Court into a brief recess, and physicians were summoned. Stone died of a cerebral hemorrhage on April 22, 1946 at his Washington, D.C. home. He is buried at Rock Creek Cemetery in Washington, D.C., along with three other justices (Willis Van Devanter, John Marshall Harlan, and Stephen Johnson Field).

October 9, 1823 – Birth of Mary Ann Shadd, 1st African-American Publisher in North America & 1st Woman Publisher in Canada

Mary Ann Shadd, the eldest of 13 children, was born in Delaware on this day in history to free African-Americans. Her parents were active in the Underground Railroad and their home frequently served as a refuge for fugitive slaves.

When it became illegal to educate African-American children in the state of Delaware, the Shadd family moved to Pennsylvania, where Mary Ann attended a Quaker boarding school. In 1840, after being away at school, Mary Ann returned to East Chester and established a school for black children.

In 1848, Frederick Douglass asked readers of his newspaper, “The North Star,” to offer suggestions on what could be done to improve life for African-Americans. Mary Ann, then only 25 years of age, wrote to him to say, “We should do more and talk less.” She expressed frustration that speeches and resolutions had produced few tangible results. Douglass published her letter in his paper.

Mary Ann Shadd, via Library and Archives Canada, C-029977

When the Fugitive Slave Law of 1850 in the United States threatened to return escaped slaves into bondage but also threatened free blacks, Mary Ann and her brother Isaac moved to Windsor, Ontario, across the border from Detroit. This is where Mary Ann’s efforts to create free black settlements in Canada first began.

While in Windsor, she founded a racially integrated school with the support of the American Missionary Association. Public education in Ontario was not open to black students at the time. Mary Ann offered daytime classes for children and youth, and evening classes for adults.

In 1853, Mary Ann Shadd founded an anti-slavery paper called “The Provincial Freeman.” Its slogan was “Devoted to antislavery, temperance and general literature.” Published weekly, the paper’s first issue appeared in Toronto, Ontario, on March 24, 1853. She persuaded a black abolitionist and a white clergyman to lend their names to the masthead to provide the legitimacy that a woman’s name could not.

A remembrance of Mary Ann Shadd in the New York Times points out:

Her progressive approach and unorthodox outlook alienated some people. She criticized abolitionists who did not fight for full equality and instead supported segregated schools and communities. She also denounced refugee associations that gathered funds to support fugitive slaves but turned a blind eye to free blacks who were forced to live in poverty.”

The paper ran for four years before financial challenges forced it to fold.

After the demise of the “Freeman,” Mary Ann Shadd Cary (she had married Toronto businessman Thomas F. Cary in 1856) was hired by Martin Delany as a recruiting officer for Black soldiers during the Civil War, perhaps the only woman to serve in that capacity. She then settled in Washington, D.C., founding a school for the children of freed slaves in the belief that education offered them more opportunities.

In 1869, she embarked on her second career, becoming the first woman to enter Howard University’s law school; she ultimately earned her degree in 1883. She was among the first African-American women to obtain a law degree, and among the first women in the United States to do so.

She fought alongside Susan B. Anthony and Elizabeth Cady Stanton for women’s suffrage, testifying before the Judiciary Committee of the House of Representatives in January 1874 and becoming the first African-American woman to cast a vote in a national election.

After a lifetime of achievements and firsts, Shadd Cary died on June 5, 1893. Mary Ann Shadd Cary has been designated a Person of National Historic Significance in Canada, one of her many posthumous honors.

October 7, 1996 – Beginning of Fox “News”

The Fox News Channel was created by Australian-born American media mogul Rupert Murdoch, who hired Roger Ailes as its founding CEO. The channel was launched on October 7, 1996, this day in history, to 17 million cable subscribers. By the time of the 2000 presidential election, Fox News was available in 56 million homes nationwide.

Murdoch in December 2012

As the Pew Research Center points out:

Fox News . . . holds a unique place in the American media landscape, particularly for those on the ideological right. While Democrats in the United States turn to and place their trust in a variety of media outlets for political news, no other source comes close to matching the appeal of Fox News for Republicans.”

Pew further notes, citing “Five Facts About Fox News”:

Around four-in-ten Americans trust Fox News. Nearly the same share distrust it.

Republicans trust Fox News more than any other outlet. Democrats distrust it more than any other outlet.

On an ideological scale, the average Fox News consumer is to the right of the average U.S. adult, but not as far to the right as the audiences of some other outlets.

People who cite Fox News as their main source of political news are older and more likely to be white than U.S. adults overall.

Those who name Fox News as their main source of political news stand out in their views on key issues and people, including President Donald Trump.”

How much does Fox matter to the state of the U.S.? Some contend it matters a lot. Nicole Hemmer, an assistant professor of presidential studies at the University of Virginia’s Miller Center and the author of “Messengers of the Right,” a history of the conservative media’s impact on American politics, says of Fox, “It’s the closest we’ve come to having state TV.” Hemmer argues that Fox — now the most watched cable news network — acted as a force multiplier for Trump, solidifying his hold over the Republican Party and intensifying his support. “Fox is not just taking the temperature of the base—it’s raising the temperature,” she says. “It’s a radicalization model.” For both Trump and Fox, “fear is a business strategy—it keeps people watching.”

Matt Gertz in Media Matters writes:

Fox News and its biggest stars currently enjoy an unprecedented influence over the federal government’s actions because President Donald Trump is obsessed with the network’s propagandistic programming and relies on its incendiary right-wing personalities for advice. 

Fox has effectively merged with the Trump administration, an event with no analogue in modern American history. The network’s sway over the political universe has become so great in recent years that whether you watch it or not, its coverage and commentators have had a tangible effect on your life.”

He also notes:

This Trump-Fox feedback loop impacts the president’s worldview — and thus, the government’s actions — on a scope far beyond any other news outlet in the recent past. And the network’s hold on both have only been strengthened over the course of his administration.”


Reporter Jane Mayer wrote for the New Yorker:

Most American news outlets try to adhere to facts. . . . Conservative media outlets, however, focus more intently on confirming their audience’s biases, and are much more susceptible to disinformation, propaganda, and outright falsehoods (as judged by neutral fact-checking organizations such as PolitiFact). Case studies . . . show that lies and distortions on the right spread easily from extremist Web sites to mass-media outlets such as Fox, and only occasionally get corrected.”

Why is this so frightening? According to Matt Gertz:

The president watches hours of Fox News and its sister network, Fox Business, each day and regularly tweets in response to segments that attract his attention — at times dramatically shifting the news cycle and government policy. He sent 1,146 of these Fox live tweets from September 2018 through August 2020 — 7.5% of his total tweets during that period — according to a new Media Matters report.”

For more background, you can check out this New York Times Magazine article, in which Jonathan Mahler and Jim Rutenberg trace the power of Fox News patriarch Rupert Murdoch — and how it “remade the world.”

October 5, 1829 – Birth of Chester A. Arthur, 21st U.S. President

Many Americans don’t know much about their 21st president, Chester Alan Arthur, born on October 5, 1829, this day in history. He was serving as the 20th vice president when he succeeded to the presidency upon the death of President James A. Garfield in September 1881, two months after Garfield was shot by an assassin.

Arthur had quite an interesting background, however. The son of an abolitionist preacher, he began his professional career as an attorney in New York City, making a name for himself by taking on civil rights cases.

Chester A. Arthur as a young man in New York

In 1854, for example, he successfully represented Elizabeth “Lizzie” Jennings, born free in New York City in March 1827. Lizzie’s father was the first African American to be awarded a patent, for his method of cleaning clothes. Lizzie’s mother was active in a society founded by New York’s elite black women to promote self-improvement through community activities, reading, and discussion. Thus Lizzie grew up in an atmosphere that stressed equality and activism. In adulthood, Lizzie became a schoolteacher at New York’s African Free School, as well as the organist for her church.

As a “Black History Month” post for Potus-Geeks Live Journal explains, in New York prior to the Civil War, although there was public transportation for blacks, the buses ran “infrequently, irregularly, and often not at all.” Blacks could only board omnibuses designated for whites if no passenger or driver objected, and the drivers carried whips to enforce the practice.

On Sunday, July 16, 1854, Lizzie, running late for church, boarded a streetcar reserved for whites. The conductor ordered her off, but she saw plenty of empty seats and refused. The driver, assisted by two other men, grabbed Lizzie and threw her into the street, but she picked herself up and climbed back aboard.

Five blocks later, the driver hailed a police officer, who forced Lizzie off the car; she was at last successfully ejected.

Word of Lizzie’s treatment spread throughout her neighborhood, and the incident was reported in the New York Tribune. Her story was further publicized by Frederick Douglass, and received national attention.

A meeting was held at Lizzie’s church at which attendees decided to form a committee and hire a lawyer. Her case was taken on by 24-year-old Chester A. Arthur, the future twenty-first president.

Seven months later, the case, Elizabeth Jennings v. The Third Avenue Railroad Company, was heard in court. Lizzie won, and was awarded $225 in damages (comparable to over $6,500 in today’s dollars) plus $22.50 in costs. The next day, the Third Avenue Railroad Company ordered its cars desegregated, and the “Colored People Allowed in This Car” signs came down. New York’s public transit was fully desegregated by 1861. Furthermore, the New York State Supreme Court, Brooklyn Circuit, ruled that African Americans could not be excluded from transit provided they were “sober, well behaved, and free from disease.”

In another case Arthur helped win, Lemmon v. New York, the New York Court of Appeals held in 1860 that slaves brought through New York en route to a slave state would be emancipated.

The Potus Geeks history of the incident recounts:

Lemmon’s lawyers relied on the Supreme Court’s ruling in the 1824 decision Gibbons v. Ogden to argue that states had no right to regulate interstate commerce as that power lay in the hands of the federal government. The state’s lawyers included Chester Alan Arthur, Erastus D. Culver and John Jay. They argued that the U.S. Constitution granted limited powers to the federal government, and the powers which were not granted were reserved for the state. Under the provision of the Fugitive Slave Act of 1850 that required states to return fugitive slaves, the state argued that any requirement for states to return non-fugitive slaves was excluded.

The Court of Appeals affirmed by a vote of 5-3 in March 1860, holding that the slaves were free. Lemmon appealed to the Supreme Court of the United States, but by then the Civil War was under way and the appeal was never heard.”

During the Civil War, Arthur served as quartermaster general of the New York Militia. When the war ended, he got involved in Republican politics and became a part of Senator Roscoe Conkling’s political organization. When James Garfield won the Republican nomination for president in 1880, Arthur was nominated for vice president to balance the ticket as an Eastern Stalwart. (Stalwarts were a faction of the Republican Party that favored patronage systems over merit-based appointments to the civil service.)

After Garfield was assassinated, to everyone’s surprise Arthur shifted policy and advocated and signed the Pendleton Act of 1883, which established the principle that federal jobs should be awarded on the basis of merit, determined by competitive exams, rather than through political connections.

[As a story in Politico points out, initially the act covered only about 10 percent of the federal government’s civilian employees. It included a provision, however, that allowed outgoing presidents to lock in their appointees by converting their jobs to civil service status. After a series of successive party flip-flops at the presidential level in 1884, 1888, 1892 and 1896, most federal jobs eventually came under the civil service umbrella, where they remain to this day.]

Historian Bernard A. Weisberger asserts that “Overall, Arthur conducted a responsible, if undistinguished (and unimportant), presidency.”

Notably, however, he did veto renewed attempts by Congress to restrict immigration from China, a response to racist scapegoating following the financial panic of 1873. Congress could not override his veto, but passed a new bill reducing the immigration ban to ten years. Although he still objected to this denial of citizenship to Chinese immigrants, Arthur acceded to the compromise measure, signing the Chinese Exclusion Act into law on May 6, 1882.

Arthur suffered from poor health, and made only a limited effort to get the Republican Party’s nomination in 1884. Shortly after becoming president, he was diagnosed with Bright’s disease, a kidney ailment now referred to as nephritis. But Arthur also realized that the Republican party was not prepared to support him. Thus he retired at the close of his term.

Arthur left office in 1885 and returned to his New York City home. In the fall of 1886 he became seriously ill. On November 17, he suffered a cerebral hemorrhage and never regained consciousness; he died the following day, November 18, at the age of 57. While he was considered a good president at the time, his reputation has faded over the years. Scholars rank him somewhere near the middle of all presidents serving thus far.

Review of “Stalin’s War: A New History of World War II” by Sean McMeekin

In Stalin’s War, distinguished historian Sean McMeekin has produced a decidedly revisionist history of World War II. He argues convincingly that Stalin wanted the war at least as much as Hitler did. Moreover, Stalin was far more successful in it than Hitler was; hence the title of the book.

McMeekin analyzes the war from Stalin’s perspective. The Soviet Union was the world’s first communist country, and it considered all capitalist countries enemies. A primary goal of Soviet diplomacy was to infiltrate capitalist governments, both to advance Russia’s interests and to foment animosity among capitalist states. There were literally hundreds of paid Soviet agents in the Roosevelt administration. Indeed, Harry Hopkins, FDR’s most trusted advisor, although not directly paid by the U.S.S.R., was certainly what Lenin would have called a “useful idiot.”

Importantly, before World War II, Stalin faced the same risk of a two-front war that Germany had in 1914. In 1938, not only were the Germans aggressive to his west, but Japan was busily grabbing large chunks of China to his east. In fact, the Japanese Army in Manchukuo (today’s Manchuria) was fighting several hundred thousand soldiers of the Red Army and threatening Vladivostok, Russia’s only port on the Pacific. Still, Germany posed the greater threat, being much closer to the bulk of Russia’s population.

To secure his eastern flank, Stalin executed a non-aggression pact that granted terms very favorable to Japan. In fact, he honored that agreement throughout the coming world war. One aspect of the treaty affected American airmen who had attacked Japan and had to bail out or crash-land in Russia to avoid capture by the Japanese: Russia treated them as hostile prisoners of war, even though they were fighting for a country that was supplying the Russians with vital materiel. American merchant marine seamen, on the other hand, fared much better — the Japanese navy did not attack American commercial ships bound for Vladivostok, allowing enormous amounts of war materiel safe passage to support Russia’s war with Germany.

Stalin did not feel fully prepared for war with Germany, despite the fact that the principal thrust of the Soviet Union’s Five Year Plans of the 1930s was the “mass manufacture of modern military hardware.” Consequently, he jumped at the chance of a non-aggression pact with Germany, signing the Molotov-Ribbentrop Pact in 1939. During the negotiations, Stalin suggested to Hitler the partition of Poland.

Via Wikimedia Commons

Hitler attacked Poland in September 1939 and quickly conquered the western half of the country. Stalin waited only a few days after Hitler’s invasion to launch his own, which ended with the Russians controlling more of Poland than Germany did.

Stalin’s fondest hope was that Germany would start a war with France and England, and that Russia could watch from the sidelines as the capitalists mauled each other. Russia would then find itself the dominant power in Europe without having expended blood or treasure. Unfortunately for him, Germany did start such a war, but won it so quickly and at such low cost that the Soviet Union found itself in great peril from Hitler.

After the successful attack on Poland, Hitler turned west and attacked and occupied France, Belgium, the Netherlands, and Norway. Meanwhile, the Russians quickly conquered Latvia, Lithuania, Estonia, and Moldavia. The Russians also tried to take Finland, but met effective and heroic resistance and had to settle for a small slice of the southeastern part of the country.

A Red Army tank rolls in Finland during the Winter War.

The stage was now set for Germany’s massive invasion of Russia. Here McMeekin tells a story quite different from what has come down from most western historians. The Russians may have been surprised by the timing of the attack, but they had been preparing for it for years. Contrary to popular opinion, the Germans did not have an advantage in tanks and artillery — the Russians had far more. Moreover, they greatly outnumbered the invaders.

McMeekin argues that the Russians maintained their advantage in armor and manpower throughout the war, even in the early stages when they were clearly losing. In fact, the Germans were not as thoroughly mechanized as many western historians have described; they relied on millions of horses, rather than trucks, for much of their movement of materiel. Ultimately, the Russians were able to out-maneuver and surround the Germans because they had enormous supplies of trucks and fighter planes furnished at no charge by the United States.

Another myth that McMeekin counters is that the Germans had the vast majority of their troops on the Eastern Front. In fact, once Hitler had redeployed many divisions to the west for the famed Battle of the Bulge, there were more German soldiers in France and Italy than there were on the Eastern Front.

McMeekin is highly critical of Roosevelt and, to a lesser extent, Churchill regarding their dealings with Stalin. If the point of the war was to save Poland and Eastern Europe from foreign subjugation, then the war was an abysmal failure for the West. Churchill gets particularly low marks for abandoning Mihailović and the Chetniks in Yugoslavia so that Tito’s Communists could prevail there. The end of the war found Stalin in charge of all of Eastern Europe.

Churchill, Roosevelt and Stalin at the Yalta conference, February 1945. Universal Images Group/Getty Images

McMeekin argues that “the most lasting consequence of Stalin’s victories in 1945 was the impetus they had given to Communist expansion in Asia, above all in China.” Russia did not enter the war against Japan until the final weeks, when the outcome was clear. Nevertheless, Stalin was able to position many divisions in East Asia, from which he could supply Mao’s communist forces with tanks, artillery, and other materiel. At the same time, the western powers soured on Chiang Kai-shek (the Nationalist leader of mainland China from 1928 until 1949, and afterward of Taiwan) and ceased helping him against Mao. McMeekin says, “the mystery is not that Mao won the Chinese Civil War, but that it took him three more years to do so.”

McMeekin concludes with several acerbic observations:

“By objective measures of territory conquered and war booty seized, Stalin was the victor in both Europe and Asia, and no one else came close.”

“The notion that a great American victory was achieved in 1945 is hard to square with the strategic reality of the Cold War, which required a gargantuan expenditure over decades merely to hold the line at the Fulda Gap before the USSR finally collapsed in 1991.”

“The ultimate price of victory was paid by the tens of millions of involuntary subjects of Stalin’s satellite regimes in Europe and Asia, including Maoist China, along with the millions of Soviet dissidents, returned Soviet POWs, and captured war prisoners who were herded into Gulag camps. . . . For subjects of his expanding slave empire, Stalin’s war did not end in 1945. Decades of oppression and new forms of terror were still to come.”

Evaluation: This is an unnerving book, beautifully written and forcefully argued.

Rating: 5/5

Published by Basic Books, an imprint of Perseus Books, a subsidiary of Hachette Book Group, 2021

September 28, 1868 – Massacre of Blacks in Opelousas, Louisiana

On September 28, 1868, this day in history, between 30 and 150 people died in a riot begun by white residents of Opelousas, Louisiana, over African Americans exercising their new voting rights.

After the Civil War, the Southern economy was in shambles. But as Matthew Christensen, writing in 2012 for his history dissertation “The 1868 St. Landry Massacre: Reconstruction’s Deadliest Episode of Violence,” pointed out:

Although economic hardships, disease, and natural disasters were all influential in the lives of Southern whites, the fate of the newly emancipated slaves was their foremost concern.”

Louisiana enacted “Black Codes” in 1865 and 1866 that “reflected white fears and desires regarding the newly emancipated slaves.” Basically, they wanted to keep blacks “in their place” but just without the label of “slavery” attached. When enacting laws didn’t entirely accomplish their goals, they resorted to violence.

As the Equal Justice Initiative reports, white St. Landry Parish voters in the April 1868 election supported candidates who were white supremacists. But there was a large black electorate that voted Republican. EJI recounts:

After half-hearted efforts to sway black voters to the white-controlled Democratic party failed, many white voters in St. Landry resorted to violent intimidation tactics. In response, Republicans like Emerson Bentley, white publisher of the radical St. Landry Progress newspaper, organized and encouraged black people to become politically active. Racial and political tensions continued to escalate as the 1868 presidential election neared.”

St. Landry Parish, Louisiana

Mr. Bentley, who inflamed matters by teaching at a school for black children he had established, was physically attacked by a local judge, and the rumor spread that he had been killed. Republicans in town panicked, and summoned black allies from a nearby town for help. The arrival of armed black men caused the white supremacists to assemble. It was black men who were arrested, of course, and the next night, white forces forcibly removed twenty-seven of the twenty-nine arrested black men from jail and shot them dead, with the sheriff’s full cooperation.

For the next two weeks, murderous violence swept the parish as white mobs terrorized the black community. When the attacks subsided, six white people had been killed, but many more black people were dead. Christensen explains that accurate death tolls were difficult to obtain: whites certainly were not eager to disclose how many they had killed, and blacks feared retribution unless they remained silent. Thus, Democratic testimonies put total deaths at between 23 and 75, while Republican estimates ranged from 200 to 500. Christensen notes that a spokesperson for the far right at that time stated that the Democrats were “well satisfied with the result,” leading one to suspect the higher estimates were more accurate.

EJI summarizes the matter thusly:

As a means of political and racial intimidation, the Opelousas Massacre was a great success. St. Landry was one of few Louisiana parishes not politically controlled by Republicans by late 1868. Mr. Bentley and other white Radical Republicans fled the area, leaving a solidly Democratic white electorate, while black voters had learned the consequences of opposing white political will. In the November 1868 presidential election, held just weeks after the massacre and just a few months after St. Landry’s black voters had solidly supported Republican candidates in state and local races, Republican candidate Ulysses S. Grant did not receive a single vote.”

September 26, 1874 – Birth of Lewis Hine, the Photographer Instrumental in Changing U.S. Child Labor Laws

Lewis Wickes Hine was born on this date in Oshkosh, Wisconsin. He studied sociology at the University of Chicago, Columbia University, and New York University, using a camera as both a teaching strategy and as a tool for social change and reform. As art critic Billy Anania observes, “Few American photographers have captured the misery, dignity, and occasional bursts of solidarity within US working-class life as compellingly as Lewis Hine did in the early twentieth century.”

The Washington Post reports:

In the early 1900s, Hine traveled across the United States to photograph preteen boys descending into dangerous mines, shoeless 7-year-olds selling newspapers on the street and 4-year-olds toiling on tobacco farms. Though the country had unions to protect laborers at that time — and Labor Day, a federal holiday to honor them — child labor was widespread and widely accepted. The Bureau of Labor Statistics estimates that around the turn of the century, at least 18 percent of children between the ages of 10 and 15 were employed.”

The “breaker boys” at a Pennsylvania coal mine, photographed by Hine in 1911. (Library of Congress)

In 1907, Hine became the staff photographer of the Russell Sage Foundation. He photographed life in the steel-making districts and people of Pittsburgh, Pennsylvania, for the influential sociological study called The Pittsburgh Survey.

In 1908, Hine left his teaching position to become the photographer for the National Child Labor Committee (NCLC). Over the next decade, he documented child labor, with a focus on the Carolina Piedmont. In this capacity he aided the NCLC’s lobbying efforts to enact child labor laws that better protected children.

May 9th, 1910. Newsies in St. Louis, Missouri. (Wikimedia Commons)

However, as the International Photography Hall of Fame pointed out:

Child labor was extremely profitable and many business owners were unwilling to accept or adhere to the laws.”

Thus Hine’s work taking pictures for the NCLC was often dangerous. He was frequently threatened with violence or even death by factory police and foremen. To gain entry to the mills, mines and factories, Hine pretended he was there on some official business.

He published the photos he took along with background notes, which had great potency. The Washington Post noted:

Hine’s photos were paired with captions and stories from his interviews with the children, who would tell him their ages, backgrounds and working conditions.

If they didn’t know their own age, Hine would estimate it by measuring them. As a Bible salesman or in one of his other disguises — he posed as a postcard salesman and a machinery photographer — Hine could hardly be seen whipping out a measuring tape. That’s why he wore a three-piece suit. He could measure the children against the buttons on his vest.”

Maud Daly, age 5, and Grace Daly, age 3, photographed by Hine in 1911. Hine wrote that each girl picked a pot of shrimp a day for a Mississippi oyster company. “The youngest said to be the fastest worker,” Hine noted. (Library of Congress)

The International Photography Hall of Fame recounted:

Eventually these images helped convince government officials to create and strictly enforce laws against child labor. The impact of these photographs on social reform was immediate and profound. They also inspired the concept of art photography, not because of the subject matter, but because the images showed a stark truth that dramatically differed from an emerging artistic character.”

During and after World War I, Hine photographed American Red Cross relief work in Europe. In the 1920s and early 1930s, he made a series of “work portraits,” which emphasized the human contribution to modern industry. In 1930, Hine was commissioned to document the construction of the Empire State Building. To obtain his famous pictures of the workers in precarious positions while they secured the steel framework of the structure, he took many of the same risks that the workers endured.

Icarus, Empire State Building, Lewis Hine (Metropolitan Museum of Art)

During the Great Depression, Hine documented drought relief in the American South and life in the Tennessee Mountains, and he served as chief photographer for the Works Progress Administration’s National Research Project, which studied changes in industry and their effect on employment. Hine also made a visual record of the working conditions of women during the 1920s and 1930s. Notably, he photographed housewives as well; he believed that homemakers deserved recognition as workers.

Power house mechanic working on steam pump, 1920. (Wikimedia Commons)

Anania writes:

Hine once argued that a good picture is ‘a reproduction of impressions made upon the photographer which he desires to repeat to others.’ For him, an organized workforce was the epitome of empathy and mutual benefit, which he hoped to convey to the greater American public.”

The last years of his life were marked by professional struggle as government and corporate patronage fell away. Few people were interested in his work, past or present; Hine lost his house and applied for welfare. He died on November 3, 1940, at Dobbs Ferry Hospital in Dobbs Ferry, New York, after an operation. He was 66 years old.