
Which eras have historians proposed that we are currently living in?



This is something that I've been researching for the past few weeks and have found a few answers, but I assume I'm missing a few of them.

Basically, the question is this:

In which ways have modern historians defined our current era?

Not sure how to tag this one; if mods could do an edit, that'd be great.

Edit: The proposed duplicate seems to ask only about the distinction between the classical eras historians have said we are living in, whereas I'm interested in any and all definitions historians have proposed, including ones more esoteric and obscure than the obvious 'modern' and 'information age'.


Our tags are actually a pretty good guide here.

  • modern - The period of history roughly from the 15th century to the mid-20th century.
  • contemporary-history - Contemporary history describes the timeframe that is, without any intervening time, closely connected to the present day; it is a certain perspective of modern history.

These are the categories used by historians.

Now of course people like to propose their own "Ages" to slot the contemporary period into (usually along with all or part of the modern period). The only one you are probably safe with is "Modern Age", which is essentially a synonym for "Modern Period" (usually with the Contemporary period thrown in).

Anything else you hear generally comes with a specific outlook or theory attached. The ones I have heard are:

  • Renaissance - My Art history teacher in particular insisted we are still in the Renaissance.
  • Industrial Age - The idea behind this is that we share a culture, roughly, with people born after the advent of steam power and the industrialization that came in its wake in the 1760s.
  • Information Age - This one takes the Industrial Age idea, but claims that the advent of global communications and computing since the late 20th century puts the Contemporary period itself in a whole different class. As a computer person myself, I'm kind of partial to this one.

(Discourse on the last bullet follows. Skip it if you like)

Douglas S. Robertson took the idea of the Information Age and went even further. He classifies all societies based on the amount of information, in bits, that a typical member has access to. I believe this is called "Informationalist History".

Where h is the amount of information one mind can hold, probably in the vicinity of 5 Mb (5×10^6 bits).

  • Level 0 - 10^7 bits (h) - Pre-Language
  • Level 1 - 10^9 bits - Language
  • Level 2 - 10^11 bits - Writing
  • Level 3 - 10^17 bits - Printing
  • Level 4 - 10^25(?) bits - Computers

The exponent on that number of bits is the important thing. How far one society outclasses another can be gauged by the difference in those exponents. This is why Native Americans, the most advanced of whom barely had writing, had no hope of competing with Europeans with printing presses, but under the right conditions could actually replace a society of Europeans with no printing press a few years earlier. Being a couple of orders of magnitude back can perhaps be dealt with. However, be several back and you're lucky if they bother to treat you as the same species.
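The exponent comparison described above can be sketched in a few lines. This is only an illustration of the idea, assuming the level exponents from the list; the `exponent_gap` function and its name are my own, not Robertson's notation.

```python
# Toy sketch of the "Informationalist" exponent comparison described above.
# Each society level is stored as the exponent of its accessible information
# in bits (Level 2 "writing" = 10^11 bits, so the entry is 11).
LEVELS = {
    "pre-language": 7,
    "language": 9,
    "writing": 11,
    "printing": 17,
    "computers": 25,
}

def exponent_gap(a: str, b: str) -> int:
    """Orders of magnitude by which society `a` outclasses society `b`."""
    return LEVELS[a] - LEVELS[b]

# A printing society sits 6 orders of magnitude above a writing society,
# and a computer society sits 8 above printing.
print(exponent_gap("printing", "writing"))   # 6
print(exponent_gap("computers", "printing")) # 8
```

On this rough gauge, a gap of one or two orders of magnitude might be survivable, while a gap of six or more is the kind of mismatch the answer describes.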

An Informationalist would say we are in the Computer Age, and that further human progress to any new level is going to require us to find ways around our current limitations on information access (particularly combing through massive amounts of it in new and more productive ways).


This might not answer your question, but could be useful. Back in school, our history textbook contained a list of historical eras, something like this (I give the Hungarian names I remember and the approximate English translations):

  • őskor (prehistory)
  • ókor (ancient age), -476 CE
  • középkor (middle age), 476-1640
  • újkor (modern age), 1640-1917
  • legújabb kor (newest age, most modern age), 1917-

Of course, this was a communist interpretation, as you could guess from the dates. I'm not 100% sure about 1640, but as the textbook described it as the start of the English bourgeois revolution, the revolution-obsessed communists may have chosen this date to be more in line with their theories. So the short answer is that some historians defined the (then current) age as the "most modern age". I don't know what has happened since the fall of communism; maybe we went back an age :-)


The Age of Exploration lasted from the 15th through the 17th centuries. This was the period when Europeans searched the globe for trading routes and natural resources. It resulted in the founding of numerous colonies in North America by the French, British and Spanish.


Are We Living in the Gilded Age 2.0?

In a region filled with the palatial homes of the rich and famous, one mansion stands out. Measuring an astonishing 38,000 square feet (plus 17,000 more on the exterior), it was crafted with the finest and most expensive materials. The interior boasts 12 bedrooms, 21 bathrooms and three kitchens, plus six bars and a 40-seat theater. The whole thing comes pre-loaded with extensive, curated collections of fine art, vintage wine and classic cars.

Was this “The Breakers,” built by the Vanderbilts in the 1890s in Newport, Rhode Island? Or maybe “The Biltmore” in Asheville, North Carolina? Actually, it’s a new property in Bel Air, California, fittingly called “Billionaire,” that just went on sale this summer for a cool $250 million. But beware to any billionaire buyer thinking this would put them at the top of the real-estate heap: another Bel Air mega-mansion is slated to go on sale later this summer, for twice the price.

Welcome to the Second Gilded Age, where the opulence is unapologetic and the ranks of the hoi polloi can be seen swelling outside the gates. Scores of books and articles have been published in recent years on the topic of a new Gilded Age. Many activists and politicians invoke the phrase because they see startling parallels with the first Gilded Age, the period from roughly 1870 to 1900 marked by increased poverty, rising inequality and growing concern about corporate influence in politics.

An age of enthusiasm and anxiety

What are the parallels, really? A look at the original Gilded Age reveals it as an era marked, not unlike ours, by a powerful duality. It was both the best of times and the worst of times. It was an age of both enthusiasm and anxiety.

On the enthusiasm side of the ledger, nothing loomed larger than the booming industrial economy. Between 1860 and 1900, U.S. factory output soared from $1.9 billion to $13 billion, an increase of nearly 600 percent. By 1900 the U.S. boasted the most powerful industrial economy in the world. In recent decades, America has experienced a similar economic boom, albeit one interrupted by periodic recessions. (The same was true in the Gilded Age.)

Gilded Age enthusiasm was fueled not merely by the performance of the overall economy, but also by the new technologies it produced. The signature product of the late-19th century was steel, a material that transformed American life. Steel reshaped everything from transportation (the railroad) and architecture (skyscrapers) to medicine (surgical instruments) and consumer goods (pianos). The same holds true in recent decades, only this time the key transformative product has been the silicon chip and the digital economy it powers.

Both eras have also produced a list of innovative corporate heads who have become household names. In the 1880s and ’90s, Andrew Carnegie, John D. Rockefeller and William K. Vanderbilt topped the list. In the early 21st century, Steve Jobs, Mark Zuckerberg and Elon Musk stand as titans.

But if the Gilded Age was characterized by great enthusiasm, it was likewise an age marked by intense anxiety. That’s because many believed that beneath all the gold and glitter one found disturbing economic, social and political trends. This notion explains why Mark Twain dubbed the era the “Gilded Age.” Like a piece of gilded jewelry, it looked beautiful on the outside. But beneath the thin veneer of gold lay cold black iron.

A shelter for immigrants in a Bayard Street tenement, photographed by muckraker Jacob Riis, 1888. (Credit: Bettmann Archive/Getty Images)

The flip side: poverty and inequality

Even as the nation’s aggregate wealth grew, so too did the number of people mired in poverty. In New York City, America’s largest and wealthiest city, two-thirds of its residents lived in cramped tenement apartments, many unfit for human habitation, while tens of thousands scrounged by in the streets. In 1890, muckraking social crusader Jacob A. Riis shone a light on the era’s grinding poverty with his shocking exposé, How the Other Half Lives: Studies among the Tenements of New York. It’s brimming with photos of people crammed cheek-to-jowl in dark, cluttered, airless quarters.

The unsettling implication of all this poverty? That America was losing its republican character and becoming more like a European nation with a population of haves and have-nots locked into fixed classes. Poet Walt Whitman captured the wider economic anxiety in a speech he delivered in 1879. For more than 20 years Whitman had written poems brimming with optimistic paeans to America and its people (“I hear America singing”), but now the great bard was worried. “If the United States, like the countries of the Old World, are also to grow vast crops of poor, desperate, dissatisfied, nomadic, miserably-waged populations…then our republican experiment, notwithstanding all its surface-successes, is at heart an unhealthy failure.” Note Whitman‘s reference to “surface successes.” He was urging his audience to look beneath the gilding to see the threat facing the nation.

Another anxiety-inducing threat: growing wealth inequality. Never before had so few people accumulated such vast wealth in so short a time span. Industrialists like John D. Rockefeller and Andrew Carnegie and financiers like J. P. Morgan and Jay Gould amassed stupendous fortunes. By 1890, the top 1 percent of the U.S. population owned 51 percent of all wealth. The top 12 percent owned an astounding 86 percent. The lower 44 percent of the U.S. population, almost half the country, owned just 1.2 percent.

Alva Vanderbilt costumed for the legendary fancy ball she hosted in March of 1883. (Credit: Bettmann Archive/Getty Images)

Consumption run amok

More than the mere possession of this wealth, it was the way the super rich used it that troubled many of their fellow Americans. To begin with, they spent in ways that violated long-standing republican values of modesty and virtue. Those values dictated that, unlike the aristocrats of Europe, one live well but without palatial mansions, fancy carriages or legions of servants.

All that changed in the Gilded Age as the wealthy competed with each other to see who could build the most opulent mansion, take the longest European tour and host the most expensive ball. The supreme example of the latter was the gala hosted by Alva Vanderbilt, wife of William K. Vanderbilt, in the spring of 1883 to celebrate the opening of their new French-chateau-style Fifth Avenue mansion, brimming with stained glass, wood carvings, paintings and massive tapestries shipped in from Europe. More than 1,000 of New York’s rich and famous attended the event. Their invitations were hand-delivered by servants in full livery.

It was a costume ball and, tellingly, many attendees dressed as European royalty. One partygoer sported an ensemble complete with a taxidermied cat headpiece and a skirt embellished with cattails. Alva’s sister-in-law paid tribute to Thomas Edison’s newfangled invention, the lightbulb, wearing a House of Worth gown emblazoned with lightning bolts (now in the collection of the Museum of the City of New York) and carrying a torch powered by batteries hidden in the dress. As guests arrived at the mansion, surging crowds of looky-loos had to be held back by police, like fans at a red-carpet premiere.

The cartoon ‘The Bosses of the Senate’ featured big, rich, fat men in top hats, representing various trusts and monopolies, standing behind the senators at their desks. (Credit: Corbis/Getty Images)

Moneybags lord over politics

Perhaps more disturbing than all the conspicuous consumption (a term coined in the late Gilded Age by sociologist Thorstein Veblen) was the public’s growing awareness that with great wealth came the power to bend democracy to their will. Industrialists used their influence to lobby lawmakers to adopt policies favorable to big business and hostile to organized labor. One of the most famous political cartoons of the era, “The Bosses of the Senate,” lampooned the trend. Appearing in Puck magazine in 1889, it showed U.S. senators being lorded over and intimidated by giant industrial monopolists shaped like moneybags. They’ve entered the Senate gallery through the door labeled Entrance for Monopolists, while in the background a People’s Entrance is boarded shut. The message is clear: Big business had hijacked American democracy, shutting out and defying the will of the people. Stories abounded of big business controlling the political process at both the state and federal level. In Pennsylvania, for example, the Pennsylvania Railroad enjoyed so much power and influence in the 1870s and ’80s that it had its own office in the state Capitol building. Its chief lobbyist was known as “the 51st Senator.”

And when lobbying wasn’t enough, Gilded Age industrialists turned to bribery and other forms of corruption, inspiring some of the most infamous political scandals in American history. The Crédit Mobilier scandal involved massively inflated contracts related to building the transcontinental railroad. In the Whiskey Ring scandal, politicians colluded with the liquor industry to avoid paying excise taxes. Republican Party power broker Mark Hanna, himself a millionaire, said in the 1890s: “There are two things that are important in politics. The first is money, and I can’t remember what the second one is.”

Labor and capital in conflict

As industrialists consolidated their power, labor unrest began to surge. Between 1880 and 1900, American workers staged nearly 37,000 strikes—including some of the largest and most famous in U.S. history. These include the first nationwide railroad strikes, the Great Uprising of 1877 and the Pullman Strike of 1894, both of which saw more than 100 people killed in clashes with police, state militia and federal troops. Meanwhile, thousands of local strikes protested starvation wages, long hours and unsafe conditions.

These labor actions called into question the nation’s foundational belief that in America everyone, no matter how lowly their origins, could achieve upward economic mobility. In many ways, the discontent of the American worker during the Gilded Age can be seen in the establishment of Labor Day. What started out as a small hybrid protest-celebration in New York City in 1882 quickly spread across the nation, becoming a federal holiday in 1894.

Artwork depicting a wealthy, fashionable crowd emerging from a restaurant observed by a downtrodden poor family, circa 1880. (Credit: Bettmann Archive/Getty Images)

Contemporary echoes of the Gilded Age

These Gilded Age pain points have many parallels in our time. Concern over rising wealth inequality has become a major political issue, as evidenced by the popularization of the term “the one percent” to describe the super rich. Concern is growing about the influence of corporate money in politics, especially in the wake of the 2010 Supreme Court decision Citizens United v. FEC, which struck down a federal law banning corporations and unions from spending money in federal elections. The recent wave of teachers’ strikes suggests a possible uptick in labor rumblings.

And there are additional parallels worth noting. Anti-immigrant sentiment raged in the Gilded Age. It led to the enactment of several laws to restrict immigration—or at the very least to keep out those deemed “undesirable” because they were seen as racially inferior, criminally inclined, physically or mentally deficient—or likely to end up in the poorhouse. There was even concern about terrorism in the late-19th century, a threat associated with German anarchists and Irish nationalists. We see clear evidence, both in polling data and political rhetoric, of a similar level of anti-immigration sentiment in contemporary American society.

The late 19th century also saw voter-suppression efforts waged against African Americans in the South. Terrorist organizations like the Ku Klux Klan used violence and intimidation to keep blacks away from the polls. When that effort failed to eliminate black voting, legal schemes like the poll tax and literacy tests emerged, which successfully reduced African-American voting by 90 percent in many parts of the South. In the North in the 1870s, lawmakers in New York state tried unsuccessfully to strip voting rights from poor urban whites, a majority of them Irish and Irish American. In recent years the adoption of voter ID laws, purges of voter rolls, and limitations on early voting and the number of polling sites—not to mention sophisticated gerrymandering schemes—have elicited accusations of voter suppression, some of which have been affirmed in federal court.

And then there’s political polarization. The first Gilded Age was marked by intense partisanship, gridlock and presidential elections decided by razor-thin margins. Sound familiar? Two presidential contests in the Gilded Age saw the candidate who lost the popular vote win the election by virtue of the Electoral College, just as George W. Bush and Donald Trump did in 2000 and 2016, respectively.

But the belief that we live in a Second Gilded Age raises an intriguing question. The original Gilded Age was followed by the Progressive Era (1900-1920), a period marked by a vast array of reforms that alleviated poverty, increased workplace safety, improved public health and education, restrained big business, adopted an income tax, granted women the right to vote and made the political process more democratic. Is the United States poised for a Second Progressive Era? It’s entirely possible, but as any good historian will tell you, history follows no script. Nothing is inevitable.


What Are the Four Major Eras of Earth's Geological History?

Progressing from the oldest to the current, the four major eras of Earth's geological history are Precambrian, Paleozoic, Mesozoic and Cenozoic. The lengths of these eras are often measured by the term "mya," which represents "millions of years ago." The four major eras of the geological time scale, or GTS, are also subdivided into smaller units, such as the Earth's current time scale placement within the Holocene Epoch of the Quaternary Period of the Cenozoic Era.

The current GTS era, the Cenozoic Era, began 65.5 million years ago. The current period within that era is the Quaternary Period, which began 2.588 million years ago. The Holocene Epoch, the most recent subdivision of the geological time scale, began 11,700 years ago. The Cenozoic Era represents the time during which the first recognizable humans came into existence. During the Cenozoic Era's comparatively short time span, relatively little change has occurred with regard to shifting plate tectonics affecting the distribution of the continents across the Earth's surface.

The oldest GTS era, the Precambrian Era, began with the formation of the Earth 4,600 mya, or 4.6 billion years ago. During this time, the Earth's crust began to solidify from its original molten form. The earliest-known fossils are from the Archean Eon of this era, which began 4,000 mya, or 4 billion years ago. Overall, the Precambrian Era accounts for 88 percent of Earth's history.


The Anthropocene epoch: have we entered a new phase of planetary history?

It was February 2000 and the Nobel laureate Paul Crutzen was sitting in a meeting room in Cuernavaca, Mexico, stewing quietly. Five years earlier, Crutzen and two colleagues had been awarded the Nobel prize in chemistry for proving that the ozone layer, which shields the planet from ultraviolet light, was thinning at the poles because of rising concentrations of industrial gas. Now he was attending a meeting of scientists who studied the planet’s oceans, land surfaces and atmosphere. As the scientists presented their findings, most of which described dramatic planetary changes, Crutzen shifted in his seat. “You could see he was getting agitated. He wasn’t happy,” Will Steffen, a chemist who organised the meeting, told me recently.

What finally tipped Crutzen over the edge was a presentation by a group of scientists that focused on the Holocene, the geological epoch that began around 11,700 years ago and continues to the present day. After Crutzen heard the word Holocene for the umpteenth time, he lost it. “He stopped everybody and said: ‘Stop saying the Holocene! We’re not in the Holocene any more,’” Steffen recalled. But then Crutzen stalled. The outburst had not been premeditated, but now all eyes were on him. So he blurted out a name for a new epoch. A combination of anthropos, the Greek for “human”, and “-cene”, the suffix used in names of geological epochs, “Anthropocene” at least sounded academic. Steffen made a note.

A few months after the meeting, Crutzen and an American biologist, Eugene Stoermer, expanded on the idea in an article on the “Anthropocene”. We were entering an entirely new phase of planetary history, they argued, in which human beings had become the driving force. And without a major catastrophe, such as an asteroid impact or nuclear war, humankind would remain a major geological force for many millennia. The article appeared on page 17 of the International Geosphere-Biosphere Programme’s newsletter.

At this point it did not seem likely the term would ever travel beyond the abstruse literature produced by institutions preoccupied with things like the nitrogen cycle. But the concept took flight. Environmental scientists latched on to what they saw as a useful catch-all term for the changes to the natural world – retreating sea ice, accelerating species extinction, bleached coral reefs – that they were already attributing to human activity. Academic articles began to appear with “Anthropocene” in the title, followed by entire journals dedicated to the topic. Soon the idea jumped to the humanities, then newspapers and magazines, and then to the arts, becoming a subject of photography, poetry, opera and a song by Nick Cave. “The proliferation of this concept can mainly be traced back to the fact that, under the guise of scientific neutrality, it conveys a message of almost unparalleled moral-political urgency,” wrote the German philosopher Peter Sloterdijk.

There was just one place where the Anthropocene seemed not to be catching on: among the geologists who actually define these terms. Geologists are the guardians of the Earth’s timeline. By studying the Earth’s crust, they have carved up the planet’s 4.6bn years of history into phases and placed them in chronological order on a timescale called the International Chronostratigraphic Chart. That timescale is the backbone of geology. Modifying it is a slow and tortuous process, overseen by an official body, the International Commission on Stratigraphy (ICS). You can’t just make up a new epoch and give it a convincing name; the care taken over the timescale’s construction is precisely what gives it authority.

To many geologists, accustomed to working with rocks that are hundreds of millions of years old, the notion that a species that has been around for the blink of an eye was now a genuine geological force seemed absurd. Few would deny we are in a period of climatic turmoil, but many feel that, compared with some of the truly apocalyptic events of the deep past – such as the period, 252m years ago, when temperatures rose 10C and 96% of marine species died – the change so far has not been especially severe. “Many geologists would say: it’s just a blip,” Philip Gibbard, the secretary-general of the ICS, told me.

Prof Jan Zalasiewicz. Photograph: Colin Brooks

But as the idea of the Anthropocene spread, it became harder for geologists to ignore. At a meeting of the Geological Society of London, in 2006, a stratigrapher named Jan Zalasiewicz argued that it was time to look at the concept seriously. Stratigraphy is the branch of geology that studies rock layers, or strata, and it is stratigraphers who work on the timescale directly.

To Zalasiewicz’s surprise, his colleagues agreed. In 2008, Gibbard asked if Zalasiewicz would be prepared to assemble and lead a team of experts to investigate the matter more deeply. If the group found evidence that the Anthropocene was “stratigraphically real”, they would need to submit a proposal to the ICS. If the proposal was approved, the result would be literally epoch-changing. A new chapter of Earth’s history would need to be written.

With a mounting sense of apprehension, Zalasiewicz agreed to take on the task. He knew the undertaking would not only be difficult but divisive, risking the ire of colleagues who felt that all the chatter around the Anthropocene had more to do with politics and media hype than actual science. “All the things the Anthropocene implies that are beyond geology, particularly the social-political stuff, is new terrain for many geologists,” Zalasiewicz told me. “To have this word used by climate commissions and environmental organisations is unfamiliar and may feel dangerous.”

What’s more, he had no funding, which meant he would have to find dozens of experts for the working group who would be willing to help him for free. Having spent much of his career absorbed in the classification of 400m-year-old fossils called graptolites, Zalasiewicz did not consider himself a natural people manager. “I found myself landed in this position,” he said. “My reaction was: goodness me, where do we go from here?”

Working out the age of the planet has always been a fraught business. The Bible stated that God created everything in six days, but it wasn’t until the 17th century that scholars made a concerted effort to work out precisely when that week might have been. For some time, the estimate of one scholar, an Irish archbishop named James Ussher, held sway: the world began on 23 October 4004 BC.

Then, in the late 18th century, a different theory emerged, one based on the close observation of the natural world. By studying the near-imperceptibly slow process of the weathering and forming of rocks, thinkers such as the Scottish landowner James Hutton argued that the Earth must be far, far older than previously thought.

The invention of geology would go on to transform our sense of our place in existence, a revolution in self-perception similar to the discovery that the Earth is not at the centre of the universe. Human beings were suddenly an astonishingly recent phenomenon, a “parenthesis of infinitesimal brevity”, as James Joyce once wrote. During the almost inconceivable expanse of pre-human time, successive worlds had risen and collapsed. Each world had its own peculiar history, which was written in rock and waiting to be discovered.

In the early 19th century, geologists began naming and organising different rock formations in a bid to impose some order on the endless discoveries they were making. They used clues within the rock layers, such as fossils, minerals, texture and colour, to tell when formations in different locations dated to the same time period. For instance, if two bands of limestone contained the same type of fossilised mollusc, alongside a certain quartz, it was likely they had been laid down at the same point in time, even if they were discovered miles apart.

Geologists called the spans of time that the rock formations represented “units”. On the timescale today, units vary in size, from eons, which last for billions of years, to ages, which last for mere thousands. Units nestle inside each other, like Russian dolls. Officially, we live in the Meghalayan age (which began 4,200 years ago) of the Holocene epoch. The Holocene falls in the Quaternary period (2.6m years ago) of the Cenozoic era (66m) in the Phanerozoic eon (541m). Certain units attract more fanfare than others. Most people recognise the Jurassic.
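The Russian-doll nesting quoted above can be modelled as a simple lookup. The following sketch is an illustration only, using the unit names and approximate start dates given in the paragraph (in years before present); it is not the ICS chart itself, and the structure and function are my own assumptions.

```python
# Sketch of the nested units named in the text: our current placement runs
# from the Phanerozoic eon down to the Meghalayan age. Dates are the
# approximate start points quoted above, in years before present.
CURRENT_PLACEMENT = [
    # (unit type, name, start in years before present)
    ("eon", "Phanerozoic", 541_000_000),
    ("era", "Cenozoic", 66_000_000),
    ("period", "Quaternary", 2_600_000),
    ("epoch", "Holocene", 11_700),
    ("age", "Meghalayan", 4_200),
]

def containing_units(years_ago: int) -> list[str]:
    """Names of every unit on this path whose span contains a given date."""
    return [name for _, name, start in CURRENT_PLACEMENT if start >= years_ago]

# A date 5,000 years ago falls within the Holocene epoch but predates the
# Meghalayan age, so only the four larger units contain it.
print(containing_units(5_000))
# ['Phanerozoic', 'Cenozoic', 'Quaternary', 'Holocene']
```

Because each smaller unit starts later than the one enclosing it, membership is just a comparison against each start date, which mirrors how the chart reads from eon down to age.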

The Enterprise Sand Mine on North Stradbroke Island, Australia. Photograph: Dave Hunt/AAP

As geologists began dividing deep time into units, they came up against the difficult question of boundaries – defining precisely where one phase of history transitions into the next. In the late 19th century, it was recognised that if the field was to advance, global cooperation and coordination would be necessary. The International Commission on Nomenclature, the forerunner of the present-day ICS, was established during a congress in Bologna in 1881 with the mandate of creating an international language of geology, one that was to be enshrined in the timescale.

The task of interpreting and classifying 4.6bn years of Earth history continues today. Geologists have barely begun to describe the Precambrian eon, which spans Earth’s first 4bn years. Meanwhile, well-studied units are revised as new evidence unsettles old assumptions. In 2004, the Quaternary period was unceremoniously jettisoned and the preceding period, the Neogene, extended to cover its 1.8m years. The move came as a surprise to many Quaternary geologists, who mounted an aggressive campaign to redeem their period. Eventually, in 2009, the ICS brought the Quaternary back and moved its boundary down by 800,000 years to the beginning of an ice age, a point considered more geologically significant. Having now “lost” millions of years, Neogene scientists were incandescent. “You might ask: who wasn’t upset by it?” Gibbard told me.

Modifying the geological timescale is a bit like trying to pass a constitutional amendment, with rounds of proposal and scrutiny overseen by the ICS. “We have to be relatively conservative,” said Gibbard, “because anything we do is going to have a longer-term implication in terms of the science and literature.” First, a working group drafts a proposal which is submitted to an expert subcommission for review and vote. From the subcommission, the proposal advances to the voting members of the ICS (composed of the chairs of the subcommissions, plus the chair, vice-chair and general-secretary of the ICS). Once the ICS has voted in its favour, it passes to the International Union of Geological Sciences (IUGS), geology’s highest body, to be ratified.

Whether or not a new proposal successfully passes through all these rounds comes down to the quality of evidence that the working group can amass, as well as the individual predilections of the 50-or-so seasoned geologists who constitute the senior committees.

This did not bode well for Zalasiewicz as he began to put together the Anthropocene working group. In fundamental ways, the idea of the Anthropocene is unlike anything geologists have considered before. The planet’s timekeepers have built their timescale from the physical records laid down in rocks long ago. Without due time to form, the “rocks” of the Anthropocene were little more than “two centimetres of unconsolidated organic matter”, as one geologist put it to me. “If we think about the Anthropocene in purely geological terms – and that’s the trouble, because we’re looking at it with that perspective – it’s an instant,” said Gibbard.

Zalasiewicz grew up in the foothills of the Pennines in a house that contained his parents, sister and a growing collection of rocks. When he was 12, his sister brought home a nestful of starlings, which his mother, who loved animals, nursed to health. Soon neighbours started calling round with all manner of injured birds, and for several years Zalasiewicz shared his bedroom with a little owl and a kestrel. (Kestrels, he came to know, are “rather thick creatures”.) He started volunteering at the local museum in Ludlow in the summer, where he met people who were expert in the things he cared most about, such as where to find trilobites. By his mid-teens, he told me, “geology was it”.

Now 64, Zalasiewicz is small and slight, with silver hair that sticks out like a scarecrow’s. He has worked in Leicester University’s geology department for 20 years, and presents himself as a quintessential geologist, a wearer of leather elbow patches and lover of graptolites. Yet among geologists, he is a known provocateur. His reputation stems from one of his papers, published in 2004, in which he argued that stratigraphy should throw out some of the terminology that has been in use since the discipline’s earliest days in favour of more modern terms. It was, to some, an audacious suggestion. When I emailed David Fastovsky, the former editor of the journal Geology, who had published the paper 15 years ago, he remembered it well. “The general feeling at the time,” he wrote, “was that it might be possible, but who would dare to take the first shot?”

Over the years, Zalasiewicz has indulged in thought experiments that are, among geologists, peculiar. In 1998, he wrote an article for New Scientist in which he imagined what mark humans might leave on the Earth long after we are extinct. His ideas became a book, published 10 years later, called The Earth After Us. Geologists tend to have their minds trained on the deep past, and Zalasiewicz’s forward-thinking approach marked him out. When, in 2006, Zalasiewicz broached the subject of the Anthropocene at the Geological Society meeting, Gibbard recalled thinking: “Well, these two go together very well.”

After he was appointed chair of the Anthropocene working group, Zalasiewicz needed to assemble his team. “At the time, it was simply a hypothetical and interesting question: can this thing be for real geologically?” Zalasiewicz told me when I visited him in Leicester last year. “It was arm-waving with very little specific detail. The diagrams were back-of-the-beer-mat things.”

Stratigraphic working groups are, not surprisingly, usually composed of stratigraphers. But Zalasiewicz took a different approach. Alongside traditional geologists, he brought in Earth systems scientists, who study planet-wide processes such as the carbon cycle, as well as an archaeologist and an environmental historian. Soon the group numbered 35. It was international in character, if overwhelmingly male and white, and included experts with specialisms in paleoecology, radiocarbon isotopes and the law of the sea.

If the Anthropocene was, in fact, already upon us, the group would need to prove that the Holocene – an unusually stable epoch in which temperature, sea level and carbon dioxide levels have stayed relatively constant for nearly 12 millennia – had come to an end. They began by looking at the atmosphere. During the Holocene, the amount of CO2 in the air, measured in parts per million (ppm), was between 260 and 280. Data from 2005, the most recent year recorded when the working group started out, showed levels had climbed to 379 ppm. Since then, the level has risen to 405 ppm. The group calculated that the last time there was this much CO2 in the air was during the Pliocene epoch 3m years ago. (Because the burning of fossil fuels in pursuit of the accumulation of capital in the west has been the predominant source of these emissions, some suggest “Capitalocene” is the more appropriate name.)

The Intercontinental Shanghai Wonderland hotel. Photograph: VCG via Getty Images

Next they looked at what had happened to animals and plants. Past shifts in geological time have often been accompanied by mass extinctions, as species struggle to adapt to new environments. In 2011, research by Anthony Barnosky, a member of the group, suggested something similar was underway once again. Others investigated the ways humans have scrambled the biosphere, removing species from their natural habitat and releasing them into new ones. As humans have multiplied, we have also made the natural world more homogenous. The world’s most common vertebrate, the broiler chicken, of which there are 23bn alive at any one time, was created by humans to be eaten by humans.

Then there was also the matter of all our stuff. Not only have humans modified the Earth’s surface by building mines, roads, towns and cities, we have created increasingly sophisticated materials and tools, from smartphones to ballpoint pens, fragments of which will become buried in sediment, forming part of the rocks of the future. One estimate puts the weight of everything humans have ever built and manufactured at 30tn tonnes. The working group argued that the remnants of our stuff, which they called “technofossils”, will survive in the rock record for millions of years, distinguishing our time from what came before.

By 2016, most of the group was persuaded that what they were seeing amounted to more than a simple fluctuation. “All these changes are either complete novelties or they are just off the scale when it comes to anything Holocene,” Zalasiewicz told me. That year, 24 working group members co-authored an article, published in the journal Science, announcing that the Anthropocene was “functionally and stratigraphically distinct” from the Holocene.

But the details were far from settled. The group needed to agree a start-date for the Anthropocene, yet there was nothing as clean as a colossal volcanic eruption or an asteroid strike to mark the point where it began. “From a geological point of view, that makes life very difficult,” said Gibbard, who is also a member of the working group.

The group was split into opposed camps, largely according to their academic specialisation. Initially, when he first proposed the notion of the Anthropocene, Paul Crutzen, who is an atmospheric chemist, had suggested the industrial revolution as the start-date because that was when concentrations of CO2 and methane began accumulating significantly in the air. Lately the Earth system scientists had come to prefer the start of the so-called “great acceleration”, the years following the second world war when the collective actions of humans suddenly began to put much more strain on the natural world than ever before. Most stratigraphers were now siding with them – they believe that the activity of the 1950s will leave a sharper indentation on the geological record. This concerned the archaeologists, who felt that privileging a 1950 start-date dismissed the thousands of years of human impact that they study, from our early use of fire to the emergence of agriculture. “There is a feeling among the archaeologists that because the word ‘anthropo’ is in there, their science should be central,” one geologist complained to me privately. Agreeing the start-date, Gibbard warned, could be the Anthropocene’s “stumbling block”.

At the tail end of last summer, members of the working group boarded flights to Frankfurt and then took a 45-minute train west, to Mainz. Over two days, they gathered at the Max Planck Institute for Chemistry for the group’s annual meeting. Crutzen, now in his mid-80s, spent much of his career at the institute, and he was present both as a spectator and in the form of a bronze bust in the foyer. I asked him what he made of the progress of his idea. “It started with a few people and then it exploded,” he said.

Under the glow of a projector in a darkened classroom, two dozen researchers shared their latest findings on topics such as organic isotope geochemistry and peat deposits. Things proceeded without a wrinkle until the second day, when a debate broke out about the start-date, which then turned into a debate about whether it was OK for different intellectual communities to use the term “Anthropocene” to mean different things. Someone at the back suggested adding the word “epoch” for the strictly geological definition, so “Anthropocene” by itself could be used generally.

“It’s just a personal view, but I think it would be confusing to have the same term having different meanings,” said a stratigrapher.

“I don’t think it would be that confusing,” an environmental scientist countered.

In the front row, Zalasiewicz watched with the air of an adjudicator. Eventually, he chimed in. “Certainly, in terms of our remit, we can only work from the geological term. We can’t police the word ‘Anthropocene’ beyond that,” he said. Throughout the meeting, Zalasiewicz seemed at pains to emphasise the Anthropocene’s geological legitimacy. He was aware that a number of influential geologists had taken against the idea, and he was worried about what might happen if the working group was seen to be straying too far from the discipline’s norms.

One of the loudest critics of the Anthropocene is Stanley Finney, who as the secretary-general of the IUGS, the body that ratifies changes to the timescale, is perhaps the most powerful stratigrapher in the world. During the meeting in Mainz, I was told that Finney was both a “big phallus of the discipline” and “really vehemently anti-Anthropocene”.

Zalasiewicz told me that Finney was an accomplished geologist, but one of a different temperament. “He sees me as someone who tries to bring in these crazy ideas by the backdoor,” he said. “I guess if you’re a geologist who spends your time in the past where you have these enormous vistas of time – the human-free zone, if you like – then to have something as fast, busy, crowded, as science-fiction-like, come into the steady, formalised, bureaucratised array of geological time, I can see it as something you might naturally take against.”

When Finney first came across the term “Anthropocene”, in a paper written by Zalasiewicz in 2008, he thought little of it. To him, it just seemed like a big fuss over the human junk on the surface of the planet. Finney, who is 71 and a professor of geological sciences at California State University, Long Beach, has spent much of his career trying to picture what the planet was like 450m years ago, during the Ordovician period, when the continents were bunched together in the southern hemisphere and plants first colonised land. Over the years, he has worked his way up through stratigraphy’s hierarchy. By the time Zalasiewicz was appointed chair of the working group, Finney was chair of the ICS. The two scientists knew each other professionally. Zalasiewicz’s favourite fossils, graptolites, are found in Ordovician strata.

Cape Coral, Florida, home to more canals than any other city in the world. Photograph: Planet

But for some time the pair had not seen eye to eye. When Zalasiewicz published his 2004 paper arguing that stratigraphers should cast off their long-established terminology, Finney was affronted by this lack of respect for the discipline’s traditions. In an attempt to find a middle ground, the pair worked on a “compromise paper”. As the writing got underway, things turned sour. Finney began to feel that Zalasiewicz was not treating his suggested revisions seriously. “He would take my comments and he would make tiny little changes but still keep the whole thing,” Finney told me. “When I saw the final draft that was ready to be accepted [by a journal], I said: ‘Take my name off, I’m not happy with this. Just take my name off.’” From then on, their relations assumed a cool distance.

Finney only decided to look at the Anthropocene in detail after he began getting comments from people who thought it was now an official part of the geological timescale. The more he looked, the less he liked the idea. “You can make the ‘big global changes’ issue out of it if you want, but as geologists we work with rocks, you know?” he told me. To Finney, a negligible amount of “stratigraphic content” has amassed since the 1950s. Geologists are used to working with strata several inches deep, and Finney thought it was excessively speculative to presume that humans’ impact will one day be legible in rock. As the Anthropocene working group gained momentum, he grew concerned that the ICS was being pressured into issuing a statement that at its heart had little to do with advancing stratigraphy, and more to do with politics.

Academics both inside and outside geology have noted the Anthropocene’s political implications. In After Nature, the law professor Jedediah Purdy writes that using the term “Anthropocene” to describe a wide array of human-caused geological and ecological change is “an effort to meld them into a single situation, gathered under a single name”. To Purdy, the Anthropocene is an attempt to do what the concept of “the environment” did in the 1960s and 70s. It is pragmatic, a way to name the problem – and thus begin the process of solving it.

Yet if a term becomes too broad, its meaning can become unhelpfully vague. “There is an impulse to want to put things in capital letters, in formal definitions, just to make them look like they’re nicely organised so you can put them on a shelf and they’ll behave,” said Bill Ruddiman, professor emeritus at the University of Virginia. A seasoned geologist, Ruddiman has written papers arguing against the stratigraphic definition of the Anthropocene on the grounds that any single start-date would be meaningless since humans have been gradually shaping the planet for at least 50,000 years. “What the working group is trying to say is everything pre-1950 is pre-Anthropocene, and that’s just absurd,” he told me.

Ruddiman’s arguments have found wide support, even from a handful of members of the working group. Gibbard told me he had started out “agnostic” about the Anthropocene but lately he had decided it was too soon to tell whether or not it really was a new epoch. “As geologists, we’re used to looking backwards,” he said. “Things that we’re living through at the moment – we don’t know how significant they are. [The Anthropocene] appears significant but it would be far easier if we were 200 to 300, possibly 2,000 to 3,000, years in the future and then we could look back and say: yes, that was the right thing to do.”

Yet for the majority of the working group, the stratigraphic evidence for the Anthropocene is compelling. “We realise the Anthropocene goes against the grain of geology in one sense, and other kinds of science, archaeology and anthropology, in another sense,” Zalasiewicz told me. “We try and deal honestly with their arguments. If they were to put out something that we couldn’t jump over, then we’d hold up our hands and say: OK, that’s a killer blow for the Anthropocene. But we haven’t seen one yet.”

The day after the Mainz conference came to a close, a small number of working group members met at the central station and took a train to Frankfurt airport. As the train left the city it crossed the Rhine, a wide river the colour of tepid tea. Buildings became sparse, giving way to flat fields crossed by pylons and wires.

For all the years of discussion, research and debate, after the meeting it was obvious that the Anthropocene working group was still a long way off submitting its proposal to the ICS. Zalasiewicz’s favourite joke, that geologists “work in geological time”, was starting to wear thin. Proposals to amend the timescale require evidence in the form of cores of sediment that have been extracted from the ground. Within the core there must be a clear sign of major environmental change marked by a chemical or biological trace in the strata, which acts as the physical evidence of where one unit stops and another begins. (This marker is often called the “golden spike” after the ceremonial gold spike that was used to join two railway tracks when they met in the middle of the US in 1869, forming the transcontinental railroad.)

The core extraction and analysis process takes years and costs hundreds of thousands of pounds – money that, at that point, and despite grant applications, the group did not have. They discussed the problem on the train. “Beg, borrow and steal. That is the working group motto,” Zalasiewicz said, a little bitterly.

But in the months that followed the meeting, their fortunes changed. First, they received €800,000 in funding from an unexpected source, the Haus der Kulturen der Welt, a state-funded cultural institute in Berlin that has been holding exhibitions about the Anthropocene for several years. The money would finally allow the group to begin the core-extraction work, moving the proposal beyond theoretical discussion and into a more hands-on, evidence-gathering stage.

Central-eastern Brazil. Photograph: Copernicus Sentinel-2A/ESA

Then, in late April, the group decided to hold a vote that would settle, once and for all, the matter of the start-date. Working group members had one month to cast their votes; a supermajority of at least 60% would be needed for the vote to be binding. The results, announced on 21 May, were unequivocal. Twenty-nine members of the group, representing 88%, voted for the start of the Anthropocene to be in the mid-20th century. For Zalasiewicz, it was a step forward. “What we’ll do now is the technical work. We’ve now moved beyond the general, almost existential question of ‘is the Anthropocene geological?’” he said, when I called him. The important votes at the ICS were still to come, but he felt optimistic.

After the train from Mainz pulled into the airport, the group made for the departure zone. Among the chaos of wheelie suitcases and people hurrying about, suddenly a voice cried out: “Fossils!” Zalasiewicz was off to one side, eyes fixed on the polished limestone floor. “That’s a fossil, these are fossil shells,” he said, pointing to what looked like dark scratches. One was the shape of a horseshoe, and another looked like a wishbone. Zalasiewicz identified them as rudists, a type of mollusc that had thrived during the Cretaceous, the last period of the dinosaurs. Rudists were a hardy species, the main reef-builders of their time. One rudist reef ran the length of the North American coast from Mexico to Canada.

Staring at the rudists encased in limestone slabs that had been dug out of the ground and transported many miles across land, it was strange to think of the unlikeliness of their arrival in the airport floor. The rudists beneath our feet had died out 66m years ago, in the same mass extinction event that wiped out the dinosaurs. Scientists generally believe that the impact of an asteroid in Yucatan, Mexico, plunged the planet into a new phase of climatic instability in which many species perished. Geologists can see the moment of the impact in rocks as a thin layer of iridium, a metal that occurs in very low concentrations on Earth and was likely expelled by the asteroid and dispersed across the world in a cloud of pulverised rock that blotted out the sun. To stratigraphers, the iridium forms the “golden spike” between the Cretaceous and Paleogene periods.

Now that the working group has decided roughly when the Anthropocene began, their main task is picking the golden spike of our time. They are keeping their options open, assessing candidates from microplastics and heavy metals to fly ash. Even so, a favourite has emerged. From the pragmatic stratigraphic perspective, no marker is as distinct, or more globally synchronous, than the radioactive fallout from the use of nuclear weapons that began with the US army’s Trinity test in 1945. Since the early 1950s, this memento of humankind’s darkest self-destructive impulses has settled on the Earth’s surface like icing sugar on a sponge cake. Plotted on a graph, the radioactive fallout leaps up like an explosion. Zalasiewicz has taken to calling it the “bomb spike”.


This article was amended on 30 May 2019. An earlier version incorrectly referred to the Bible as saying “God created everything in seven days”. According to the book of Genesis, God needed only six days to achieve this feat, and was able to rest on the seventh.


Alice Paul and the ERA

After almost a hundred years, the Equal Rights Amendment, originally written by Penn alumna Alice Paul and Crystal Eastman following the success of the suffrage movement, may finally be ratified as an amendment to the United States Constitution, guaranteeing equal legal rights for all American citizens regardless of sex. Virginia would be the 38th state to approve the amendment and is in position to do so, with a Democratic majority and a 1971 state constitution that prohibits discrimination on the basis of sex. What this proposed amendment means—and whether or not it can still be ratified—is up for debate.

The Equal Rights Amendment itself is simple, with the main clause stating, “Equality of rights under the law shall not be denied or abridged by the United States or by any State on account of sex.” First introduced to Congress in 1923, the ERA has a long and beleaguered history.

Although the amendment failed in the U.S. Senate in 1946, winning a 38-35 majority but falling short of the required two-thirds vote, the idea of having an equal rights amendment began to gain momentum during the progressive social movements of the 1960s, most notably the Civil Rights Act of 1964. U.S. Rep. Martha Griffiths reintroduced the proposed amendment in Congress in 1971, bringing the ERA back to the forefront. It was approved by the House of Representatives in 1971, the Senate in 1972, and 35 of the necessary 38 states by 1977.

Then support stalled, due in part to an effective anti-amendment campaign led most notably by the conservative crusader Phyllis Schlafly. Congress voted to extend the ratification deadline from 1979 to 1982. No additional states ratified the amendment during this period, while Idaho, Kentucky, Nebraska, Tennessee, and South Dakota revoked their ratifications.

Following the advent of fourth-wave feminism and the #MeToo movement, Nevada ratified the ERA in 2017, followed by Illinois in 2018. Virginia’s political leaders intend to put the ratification to vote in 2020. Three Penn experts discuss the feasibility and impact of the ratification: Kathleen M. Brown and Maria Murphy of the Alice Paul Center for Research on Gender, Sexuality, and Women, as well as legal historian Mary Frances Berry. The Center is named for Paul, who received a Ph.D. from the University of Pennsylvania in 1912 and joined the National American Woman Suffrage Association during her time here, beginning her activist work.

Why was the ERA not ratified during the original time frame?

Mary Frances Berry: While a federal constitutional amendment seemed a logical next step to Alice Paul and her party, it was not to those who wanted women to have the right to vote but did not want men and women to be treated equally in other cases. There was little discussion on the principle of the equality of rights and much on whether the ERA would violate traditional family values. The factors of time, demonstrating necessity, regional and state diversity as elements in gaining consensus in the states, the positive influence of negative Supreme Court decisions, and the expectation of disinformation spread by the opponents all helped to stall the Equal Rights Amendment.

At the same time, ERA proponents failed to convince a majority of women in enough states that the amendment was essential to their equal rights while dispelling fears that the ERA would make other changes in their lives that they did not desire. ERA supporters need an effective public relations approach, which can build a sense of necessity that the Constitution would be perfected by including the principle of equality of rights for women as an essential component of republican government in a democratic society.

Many of the reactionary social issues Schlafly raised as negative consequences of the ERA in the 1970s have already come to pass: same-sex marriage, women in the military, and all-gender restrooms. What does this say about how society influences legislation and vice versa?

Berry: Every single one of those changes noted came about because of social movement pressure, which changed the narrative as well as public perception of what women should do and men should do. Many of the traditional ideas were no longer relevant, for example when people realized that we have all-gender restrooms on airplanes. The way society has changed, women are in the workplace and many women work because they have to, including women with families. If someone like Phyllis Schlafly started to raise these issues again, they would most likely not seem as relevant as they did then.

If the ERA is ratified and goes to the Supreme Court, what will that look like?

Berry: If the ERA is ratified, and I think it will be, then there will be a legal dispute. Many of the people who would have been opposed to it have gone on to other battles. There still are some issues about women and families—homeschooling, for example—that are still there. The entire issue is not gone, but it’s not major. It may just be that the opposition will roll over and play dead, but I’m sure someone will raise the issue regarding the extension, the time passed, and whether or not the rescissions are valid. If the Court decides against the ERA’s validity, the people who want it would have to start all over again.

Maria Murphy: I imagine it's going to be a fight on the grounds of what constitutes ‘sex’ and how sex is defined. Beyond getting the ERA ratified and officially integrated as a constitutional amendment, I think the aftermath, when the amendment is tested and taken up in courts, will really determine how sex is understood and misunderstood.

How will the ERA affect trans and non-cisgendered people?

Murphy: From my understanding, the language of the ERA is rather vague, and although people generally assume that sex-based discrimination refers to women, the words ‘woman’ or ‘women’ do not appear in the principal clause of the ERA. That ‘sex’ is the operative term opens up possibilities for how the ERA might afford protections to trans and nonbinary folks and speak more generally to extending protections for people who are gender-nonconforming in any number of ways. Although I think it is difficult to predict the impact of the ERA on non-cis people at this stage, before it's been ratified, I believe its potential ratification will open doors to expanding equal rights protections in ways that Alice Paul probably did not/could not have imagined and perhaps in ways we cannot either.

What do you foresee as the biggest consequence of a ratified equal rights amendment? How would the proposed amendment affect our lives or not?

Berry: The overall consequences would not be as great as if the amendment were ratified originally. One possible change is that the Supreme Court might be more willing to have a vote based on ending gender discrimination if we had an amendment in the Constitution, which is stronger than interpreting statutes.

There might be a long debate about whether the ERA includes sexual orientation. Currently, you can have a same-sex marriage but have no right to nondiscrimination in your place of employment. When the ERA was originally before Congress, no one was discussing how sexual orientation would influence what happened. Now, it is likely to be something that is raised.

Some people have suggested that an ERA might mean we have greater access to child care, but I’m not sure that’s right. I’m not sure some automatic glow will suffuse the environment, but I think in the long term it should have some positive effects.

Kathleen M. Brown: Like all constitutional amendments, the ERA provides a firmer ground for women’s equality than Title VII of the 1964 Civil Rights Act, which prohibits discrimination against employees in workplaces with 15 or more employees, and Title IX of the Education Amendments of 1972, which prohibits discrimination within educational institutions receiving federal funding. Workplaces with fewer than 15 employees and institutions that receive no federal funding can evade the requirements of Title VII and Title IX. As articles of legislative acts, moreover, Title VII and Title IX can more easily be overturned than the provisions of a constitutional amendment.

In theory, the ERA could provide protections for transgender people’s equal access to the law, due process, and privacy in much the same way that attorneys are currently attempting to use the existing Title IX protections against sex discrimination to protect transgender people from the harms of discrimination.

How will this moment be taught in future history courses?

Murphy: Often, when students encounter societal change through a historical lens with watershed moments like this, the narrative can privilege stories that imply progress is won primarily through governmental and legal avenues. But in many ways, activists and community organizers have been chipping away at discriminatory practices in the work force, for example, and building in protections on the basis of sex in other creative ways, often outside of legal/governmental frameworks. Having the ERA ratified and integrated into the Constitution would commemorate over 100 years of activism and alternative methods of effecting change.

Brown: I was a young person when the ERA’s ratification was defeated, and I assumed that this was the end of the ERA. During the intervening decades, social justice activists and their attorneys have found strategies for working around the lack of constitutional protection for gender equality. Some of these, including the interpretations of Title IX of the Education Amendments of 1972, are still proving useful in the present day in arguments to end the discriminations suffered by transgender people. These legal strategies are important and have been effective, but they cannot take the place of a constitutional amendment.

When I teach students about the defeat of the ERA, they are often shocked. They think they are living in a world with an equal rights amendment already on the books, and they are shocked to learn that the main protection against sex and gender discrimination is an easily overturned legislative act.

Mary Frances Berry is the Geraldine R. Segal Professor of American Social Thought and Professor of History in the School of Arts and Sciences at the University of Pennsylvania.

Kathleen M. Brown is the David Boies Professor of History and the director of the Alice Paul Center for Research on Gender, Sexuality & Women in the School of Arts and Sciences at the University of Pennsylvania.

Maria Murphy is the interim associate director of the Alice Paul Center for Research on Gender, Sexuality & Women in the School of Arts and Sciences at the University of Pennsylvania.


How will historians view us?

A. Richard Allen for the Boston Globe

History is a lot like forestry. In the latter, you often can’t see the forest for the trees, and in the former you often can’t see the epoch for the incidents. Though it hardly seems as momentous as the Great Depression or the civil rights era, our current period may be one of the most significant in American history — one that may well determine what kind of country we will be for decades hence. To put our own times in focus, it helps to ask: What will historians 50 or 150 years from now think of the early 21st century?

It is an apt question, because history has a way of challenging and altering the perceptions that any time has of itself. In its own day, for example, the 1920s were a boom period that gave rise to national free-spiritedness. In the long eye of history, they were the myopic prelude to the Great Depression. In his own day, Harry Truman was an accidental president, a pipsqueak who couldn’t fill FDR’s shoes. In the long eye of history, he is regarded as one of our most successful presidents, navigating the sticky post-war period internationally, and helping propel an economic boom domestically.

Predicting the historical long view is a risky proposition, but let me hazard a guess: Historians will wonder what bizarre convulsions this nation was going through — how it seemed to lose its moral, political, and economic bearings, how the gains of social and economic equality that were a century in the making were reversed, and, above all, how the country actually became less democratic, often with the acquiescence of many ordinary Americans.

The first thing historians are likely to fasten on is the historic economic inequality in America today. As the French economist Thomas Piketty has documented in his pathbreaking book, “Capital in the 21st Century,” America, the vaunted land of opportunity, has become one of the most unequal nations in the history of the world when it comes to wealth distribution — a country in which the top 1 percent own nearly 40 percent of the nation’s wealth.

Historians will certainly also focus on the fight to disenfranchise poor and minority voters after 100 years of advancing civil rights. They will discuss how the Supreme Court and the Republican Party succeeded in rolling back many of those achievements — the court by ripping out a central provision of the Voting Rights Act, and Republican state legislatures by imposing onerous voter registration restrictions that, let’s face it, have one aim only: to suppress minority voting, which is likely to tilt Democratic.

They will cite the role of money in politics and the sudden turnabout by the Supreme Court in the Citizens United and McCutcheon decisions, which released a torrent of big money into American politics.

They will look at the nation’s increasing churlishness — its reluctance to embrace health reform that would provide insurance to those who cannot otherwise afford it, its willingness to cut benefits, like food stamps, that primarily help the young and the elderly, its grudging extension of unemployment benefits to people afflicted by the economic downturn.

And historians will say that these are not discrete things but that they coalesce to form what may be called the age of inequality. Historians are also likely to see how this age of inequality answered what has been arguably the nation’s foremost question from its founding: Is America to be an aristocracy or a democracy? Ever since Andrew Jackson, the thrust, with a few detours, has been toward democracy. Historians will show that had changed in the late 20th and early 21st century, not necessarily because most Americans wanted economic inequality, voter suppression, big money in politics, or cruelty to the poor but because the system wasn’t responsive to them. It had become oligarchic.

I suspect that historians will view this as a terribly bleak period — another Gilded Age but worse. They will observe that the ever-fragile democratic enterprise was hijacked, perhaps permanently. They will mainly blame the Republicans, though if Republicans will be accused of lacking heart and brains in promulgating these policies, Democrats will be accused of lacking guts in not fighting them more strenuously. They will show how Ronald Reagan’s seeds of economic inequality finally sprouted into our society of the super-rich and everyone else.

And they will wonder: Why was there so little resistance?

The answer is complex, but it seems to have two primary components. The first is that resistance is basically futile, and everyone knows it. The wealthy have always worked the levers of power, and though we have had periods of greater equality — the period from the end of the Great Depression to the beginning of Reagan’s presidency — America is more or less an oligarchy by design. The only difference now is that there is nothing surreptitious about it.

And that leads to the second component. As intellectuals are fond of saying, ideas have consequences. It is just that the consequences may have less to do directly with policy than with mythology. The prevailing mythology has been that the wealthy are deserving of their spoils — that they are a living example of the proposition that anyone who wants to make it in America can. Of course, people want to believe that, but it provides great cover for inequality. You almost feel un-American protesting that it isn’t remotely true.

So the country rolls on, and it rolls back. And historians will wonder how the 21st century came to resemble the end of the 19th — a terrible time when the wealthy ruled and everyone else capitulated.

Neal Gabler is author of “Walt Disney: The Triumph of the American Imagination.”


Not So Evident

Facts have a history, and we ought to admit it. In op-eds, public lectures, and social media, historians take great pains to correct falsehoods about the past and the present (especially in my field, immigration history). But the basis of much of our profession's outrage – that policy should be based on a certain kind of fact – itself has a history.

Border wall prototypes near the Otay Mesa Port of Entry in San Diego, California. U.S. Customs and Border Protection/Flickr/United States Government Work

Ultimately, that history dates most prominently to the Enlightenment. But more directly, in the history of federal power and the administrative state – in the United States, but also in Europe and Latin America – it dates to the Progressive Era's professionalization of expertise. With it came the enshrinement of objective facts to undergird and justify public policies such as economic regulation, conservation and environmental policy, and, not least, immigration.

My recent book, Inventing the Immigration Problem: The Dillingham Commission and Its Legacy (Harvard Univ. Press, 2018), explores the confluence of government social science expertise and "facts" in early 20th-century US immigration policy. From 1907 to 1911, the Dillingham Commission conducted the largest-ever study of immigrants in the United States, and it helped create the idea that immigration was a "problem" that (only) the federal government could and should "fix."

The Dillingham Commission had nine appointed members: three senators, three congressmen, and three "experts" chosen by President Theodore Roosevelt. Jeremiah Jenks, a professor of economics at Cornell University, organized much of the work and has been called by historians of social science the first "government expert." The commission and its staff visited or gathered data on all 46 states and several territories. A staff of more than 300 men and women compiled 41 volumes of reports, including a potent set of recommendations that shaped immigration policy for generations to come. The commission's agents had advanced degrees from the Ivy League and large public research institutions like Wisconsin, Michigan, Ohio State, and Berkeley. Economics degrees dominated, though others had degrees in sociology, law, medicine, political science, and anthropology (including Franz Boas, who wrote an important treatise on new immigrants' bodies and head shapes for the commission). Twenty reports on immigrants in American industries formed the bulk of the work, but other volumes considered everything from conditions on transatlantic steamships to prostitution, debt peonage, crime, schools, agriculture, philanthropic societies, other countries' immigration laws, and immigrant women's "fecundity."

The Dillingham Commission relied on a veneer of objectivity but engaged in thinking and work that was deeply flawed.

Throughout the process, the commissioners insisted that they and the social scientists they hired were objective. In 1909, Massachusetts senator and commission member Henry Cabot Lodge defended the commission member most sympathetic to immigrants, Republican Congressman William S. Bennet, who represented Jewish Harlem. Bennet "is as determined as I am to get all the facts," said Lodge. In the commission's work, he insisted, "Bennet has not tried to suppress anything." But what did objectivity mean for these men? Lodge was a true believer in social science; he earned one of Harvard's first PhDs in history and government. He was also, in the words of immigration historian John Higham, the new immigrants' "most dangerous adversary." His fellow commissioner, California businessman William R. Wheeler, insisted that they wanted to "learn the facts." The commission's final report insisted that its conclusions would be based not on race or cultural considerations but on the sound basis of economics and social science.

The Dillingham Commission is best known for recommending what would become the first restrictions on immigrants based on quantity (numbers) rather than quality (individual politics, health, class, or race status, as previous laws prescribed). It recommended a literacy test for immigrants, along with a continued ban on Asian immigrants, additional regulations and head taxes, and – for the first time – actual numerical limits on immigration, a quota. The literacy test was enacted in 1917 over two vetoes by Woodrow Wilson. And the final recommendation became, by the 1920s, the national origins quota system that openly discriminated against southern and eastern Europeans by using a national quota based on the US population in 1890 – before most of the so-called new immigrants from southern and eastern Europe had arrived.

Immigrants wait in the Great Hall at Ellis Island after finishing their first mental inspection. Edwin Levick/The New York Public Library/Public Domain

The members&rsquo backgrounds and training relied on a new social science model of &ldquoproblem&rdquo (in this case, immigrants) and &ldquosolution&rdquo (restrictive legislation). Commissioners produced a particular kind of knowledge, valued because it was quantitative and produced by experts. But the commission did not necessarily follow it to its conflicting conclusions&mdashthe commission&rsquos data and evidence, as historian Oscar Handlin long ago recognized, did not support its recommendations. But the commission believed in federal power in general, and in federal power over immigration policy specifically. So, too, did its rank-and-file employees, from the women who enjoyed rare career opportunities and personal authority to the economist technocrats who had worked in Puerto Rico and the Philippines, where federal officials experimented with new forms of governance.

We historians do our work in particular moments, and is not even our devotion to expertise and facts relative to our own moment? I began this project in the early days of the presidency of Barack Obama, whose own infatuation with experts made me a bit nervous. Although I was thrilled by his election, I was never comfortable with Obama's reliance on Ivy League-educated wonks. My research on the Dillingham Commission made me more deeply skeptical of its experts, whose conclusions had enduring and racist consequences. The commission and its staff relied on a veneer of objectivity – one they themselves carefully applied and believed in – but engaged in thinking and work that in retrospect was deeply flawed.

Historians' professional status has a history, rooted in the Progressive Era's invention of credentialed experts.

I've often told my students that you know you're doing good history when it bumps up against your own politics. But then came the election of Trump in 2016, and now my (minor, cautionary, gesturing) inveighing against experts feels quaint at best, dangerous at worst. Context is everything, and I must confess that I now see the Dillingham Commission's experts in a more sympathetic light, although I still disagree with their conclusions. The Dillingham Commission was responding to a real event – the massive influx of new immigrants to the United States from southern and eastern Europe since 1882. Their subject was real, even as their labeling it as a "problem" was deeply subjective. In contrast, some so-called immigration problems or crises don't even appear to be real – border crossings are down, and undocumented immigrants commit fewer crimes than US citizens (I could go on). And "facts" seem to have nothing to do with "problems" or the proposed or actual solutions to them. Some – like family separation – are far worse than the "problems" for which they are prescribed. Rhetoric about the border is totally unhinged from reality.

Yet the Dillingham Commission's utter wrongness – that Asians and eastern and southern Europeans would not assimilate, that they were a "problem" in the first place – ought to give us all pause, too. We ought to recognize that our own claims of truthfulness are situated in a belief system that is about values, too, not just about facts. It is telling – and salutary – that the AHA's 2013 tuning of the history discipline lists empathy as one of the essential components of historical practice. To practice empathy is to be sympathetic and mindful of the complexity of our subjects and, I would argue, the limits of our own and others' expertise. The burgeoning authority of social science and certitude in its modern facts encouraged statist solutions to social problems. In turn, it bolstered support for the very governmental overreaches in immigration policy at which President Trump lunges.

Historians should, of course, continue to call out the falsehoods and vitriol that are today presented as public discourse. But we should also recognize that our professional status has a history, rooted in the Progressive Era's invention of credentialed experts, whose own hubris became baked into the rise of the administrative state. If the administrative state is part of the immigration "problem," and it was in some sense created by our social science forebears, then we need to recognize that we are living out a paradox that no call for reason based on facts can unravel.

Katherine Benton-Cohen is associate professor of history at Georgetown University. She is the author of Borderline Americans: Racial Division and Labor War in the Arizona Borderlands (2009) and Inventing the Immigration Problem: The Dillingham Commission and Its Legacy (2018). She recently served as historical adviser for the film Bisbee '17 (2018). She tweets @GUProfBC.

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Attribution must provide author name, article title, Perspectives on History, date of publication, and a link to this page. This license applies only to the article, not to text or images used here by permission.



Using History to Understand Current Social Issues

Many current social issues have long histories, and many teens are expressing interest in understanding the historical context of contemporary politics. To become better informed, teens might want to revisit these issues as they played out in history to gain a deeper understanding of modern-day events and attitudes. As teens learn more and judge for themselves how the past compares to attitudes today, they may also come to a deeper understanding of human rights and our responsibilities in today's society.

While this author is not an expert on these topics, she hopes this post will encourage teens and teen advocates to understand the past and foster discussion of our current societal issues.

Nazi Party Rally Grounds (1934) – Wikimedia Commons

Rise of Nationalism vs. Rise of the Nazis

A number of countries have seen an emerging rise in nationalism in recent years, including the U.S. in 2016; a quick search will surface numerous news articles on the topic. In some cases, past and recent, this nationalism has resulted in revolutions and independence for countries, for example, Great Britain's "Brexit" decision to remove itself from the European Union. In the 1920s through 1930s, however, nationalism paired with discrimination and xenophobia resulted in the National Socialist German Workers' Party and the rise of the Nazis. For more understanding of German nationalism during the Nazi era and of those who sought social justice during that time, here are a few online and print resources offering a brief view into available information and viewpoints from that period.

Online Resources:

  • A resource by the Florida Center for Instructional Technology, which also links to the educational resource The History Place about Hitler's election.
  • Calvin College has also collected an online archive of examples of Nazi propaganda and speeches.
  • The United States Holocaust Memorial Museum also covers many of these topics.

YA Nonfiction:

We Will Not Be Silent: The White Rose Student Movement That Defied Adolf Hitler by Russell Freedman

Two siblings formerly part of the Hitler Youth form a secret resistance group called the White Rose and distribute anti-Nazi materials.

Beyond Courage: The Untold Story of Jewish Resistance During the Holocaust by Doreen Rappaport (YALSA’s Popular Paperbacks for Young Adults – 2015)

A detailed look at Jewish people, including some teens, who defied the climate of the time to save others and are remembered here in a variety of profiles.

Branded by the Pink Triangle by Ken Setterington (YALSA Nonfiction Award nomination 2014)

This overview documents changes in society with the rise of the Nazi Party, paying specific attention to treatment of homosexuals.

YA/Middle Grade Fiction:

A teen joins the Hitler Youth but comes to question its teachings against those of his upbringing and rebels by distributing underground news reports.

Prisoner of Night and Fog by Anne Blankman

A close look at the rise of Adolf Hitler through the eyes of his niece, who befriends a young reporter who transforms her views.

Projekt 1065 by Alan Gratz

An Irish/British spy masquerades as a member of the Hitler Youth in this high-stakes thriller.

Adult Nonfiction for further research:

Hitlerland: American Eyewitnesses to the Nazi Rise to Power by Andrew Nagorski

American journalists living in Germany gained a first-hand view of the Nazis' rise to power.

The Third Reich in History and Memory by Richard J. Evans

An overview of the Third Reich's rise to power, height of dominance, and postwar era in history and memory.

Japanese Internment vs. Anti-Islam

A number of reports have been in the news lately, in the US and other countries, about sentiment against Muslims, especially Muslim refugees. Some reports have compared this anti-Islam sentiment, and the possibility of a future Muslim registry, to attitudes toward Japanese Americans after the bombing of Pearl Harbor during World War II. At that time, the military considered West Coast Japanese Americans potential enemies, and they were sent to internment camps under an executive order issued by then President Franklin Delano Roosevelt. This period was later recognized as a human rights violation, and some reparations were made to Japanese American survivors. A few resources following Japanese Americans during this period in history are found below.

Online Resources:

  • The Gilder Lehrman Institute of American History – From Citizen to Enemy: The Tragedy of Japanese Internment
  • US National Archives – Japanese Relocation During World War II
  • An oral history archive documenting the experiences of Japanese Americans incarcerated during World War II
  • A collection of news articles and resources pertaining to Japanese Americans from San Francisco during the 1940s

YA Nonfiction:

Imprisoned: The Betrayal of Japanese Americans During World War II written by Martin W. Sandler (YALSA Nonfiction Award finalist 2014)

Sandler introduces evacuees and their families and documents their experiences, including those Japanese Americans who served in the U.S. military.

Fighting for Honor: Japanese Americans and World War II by Michael L. Cooper

A more extensive look at Japanese Americans in the military fighting during World War II.

Dear Miss Breed: True Stories of the Japanese American Incarceration During World War II and a Librarian Who Made a Difference by Joanne Oppenheim

A San Diego children’s librarian writes to her child-age and teenage patrons who were taken into Japanese American internment camps.

Weedflower by Cynthia Kadohata

A Japanese American girl is sent to an internment camp on the Mojave Indian Reservation and finds that she and a Native American boy share some things in common.

Farewell to Manzanar: A True Story of Japanese American Experience During and After the World War II Internment by Jeanne Wakatsuki Houston and James D. Houston

A brief story about a teen girl sent to Manzanar internment camp and its effect on her family.

Adult Nonfiction for further research:

Infamy: The Shocking Story of the Japanese American Internment in World War II by Richard Reeves

A journalist traces a detailed and comprehensive history of the Japanese American internment camps and the political decisions that led to them.

Silver Like Dust: One Family’s Story of America’s Japanese Internment by Kimi Cunningham Grant

The author learns and recounts her grandmother's experience in a Japanese internment camp and comes to embrace her heritage.

Latin American Politics

The recent death of Fidel Castro, Cuba's longtime leader and dictator, has spurred talk of the era of Latin American dictators, an era whose practices and policies are still ongoing: Cuba remains a one-party dictatorship under Raúl Castro, with no opposition permitted. Additionally, recent news articles have compared certain political leaders to Latin American dictators in their style of address and authority. Though less material has been published on these specific topics, especially in young adult literature, here are a few sources to explore.

Online Resources:

  • Encyclopaedia Britannica – Challenges to the Political Order and Latin America Since the Mid-Twentieth Century: an overview of developments in Latin America, from revolutions and military regimes to political changes and populism
  • A resource by the University of Oregon and Universität Münster recording the changes in the Caribbean and Central and South America from the end of the 19th century and its predominant oligarchies into the late 20th century, with notes about military regimes, juntas, and one-party states

YA Nonfiction:

Leaving Glorytown: One Boy’s Struggle under Castro by Eduardo F. Calcines (YALSA Nonfiction Award nominee 2010)

A memoir about life in Cuba at the beginning of the Communist revolution and immigrating to the United States as a teen.

Che Guevara: You Win or You Die by Stuart A. Kallen

The story of the revolutionary who befriended Castro, joined him in overthrowing Cuba's dictator, and was later executed in Bolivia.

Augusto Pinochet’s Chile by Diana Childress

Covers military leader Pinochet's rise to power in a coup, his control of Chile through a junta, and his self-appointment as president and dictator, which he justified as saving his country from Communism.

Note: Readers might find this particularly interesting, as the president Pinochet overthrew was Salvador Allende, the uncle of author Isabel Allende, whose novel appears below.

Latin American Fiction

In the Time of the Butterflies by Julia Alvarez

Three sisters are murdered and the fourth is left to tell their stories of life under a dictator's horrific rule in the Dominican Republic.

The House of the Spirits by Isabel Allende

A history of Latin America and Chile as seen through the tragic lives of the Trueba family.

The Autumn of the Patriarch by Gabriel Garcia Marquez

The investigation into the murder of a South American dictator reveals his evolution from leader to dictator.

Adult Nonfiction for further research:

Red Heat: Conspiracy, Murder, and the Cold War in the Caribbean by Alex von Tunzelmann

A history of three dictators of the Caribbean during the Cold War.

Looking for History: Dispatches from Latin America by Alma Guillermoprieto

A series of essays in which the author describes Latin American politics and society of Colombia, Cuba and Mexico as well as references to Argentina and Peru.

Gringo: Coming-of-Age in Latin America by Chesa Boudin

A man travels through Latin America, recounting his experiences with its history and local political views.

Readers might be interested to know that the educational database JSTOR publishes some so-termed "scholarly news" articles relating history to current events; however, these articles are written by a variety of authors with many points of view.

We welcome any informational contributions to these resource lists by commenting below!


World History Era 4

Beginning about 300 CE almost the entire region of Eurasia and northern Africa experienced severe disturbances. By the 7th century, however, peoples of Eurasia and Africa entered a new period of more intensive interchange and cultural creativity. Underlying these developments was the growing sophistication of systems for moving people and goods here and there throughout the hemisphere: China's canals, trans-Saharan camel caravans, high-masted ships plying the Indian Ocean. These networks tied diverse peoples together across great distances. In Eurasia and Africa a single region of intercommunication was taking shape that ran from the Mediterranean to the China seas. A widening zone of interchange also characterized Mesoamerica.

A sweeping view of world history reveals three broad patterns of change that are particularly conspicuous in this era.

Islamic Civilization: One of the most dramatic developments of this 700-year period was the rise of Islam as both a new world religion and a civilized tradition encompassing an immense part of the Eastern Hemisphere. Commanding the central region of Afro-Eurasia, the Islamic empire of the Abbasid dynasty became, in the 8th through 10th centuries, the principal intermediary for the exchange of goods, ideas, and technologies across the hemisphere.

Buddhist, Christian, and Hindu Traditions: Not only Islam but other major religions also spread widely during this 700-year era. Wherever these faiths were introduced, they carried with them a variety of cultural traditions, aesthetic ideas, and ways of organizing human endeavor. Each of them also embraced peoples of all classes and diverse languages in common worship and moral commitment. Buddhism declined in India but took root in East and Southeast Asia. Christianity became the cultural foundation of a new civilization in western Europe. Hinduism flowered in India under the Gupta Empire and also exerted growing influence in the princely courts of Southeast Asia.

New Patterns of Society in East Asia, Europe, West Africa, Oceania, and Mesoamerica: The third conspicuous pattern, continuing from the previous era, was the process of population growth, urbanization, and flowering of culture in new areas. The 4th to 6th centuries witnessed serious upheavals in Eurasia in connection with the breakup of the Roman and Han empires and the aggressive movements of pastoral peoples to the east, west, and south. By the 7th century, however, China was finding new unity and rising economic prosperity under the Tang. Japan emerged as a distinctive civilization. At the other end of the hemisphere Europe laid new foundations for political and social order. In West Africa towns flourished amid the rise of Ghana and the trans-Saharan gold trade. In both southern Africa and the Pacific basin migrant pioneers laid new foundations of agricultural societies. Finally, this era saw a remarkable growth of urban life in Mesoamerica in the age of the Maya.

Why Study This Era?

  • In these seven centuries Buddhism, Christianity, Hinduism, and Islam spread far and wide beyond their lands of origin. These religions became established in regions where today they command the faith of millions.
  • In this era the configuration of empires and kingdoms in the world changed dramatically. Why giant empires have fallen and others risen rapidly to take their place is an enduring question for all eras.
  • In the early centuries of this era Christian Europe was marginal to the dense centers of population, production, and urban life of Eurasia and northern Africa. Students should understand this perspective but at the same time investigate the developments that made possible the rise of a new civilization in Europe after 1000 CE.
  • In this era no sustained contact existed between the Eastern Hemisphere and the Americas. Peoples of the Americas did not share in the exchange and borrowing that stimulated innovations of all kinds in Eurasia and Africa. Therefore, students need to explore the conditions under which weighty urban civilizations arose in Mesoamerica in the first millennium CE.

Each standard was developed with historical thinking standards in mind. The relevant historical thinking standards are linked in the brackets, [ ], below.

STANDARD 1

Imperial crises and their aftermath, 300-700 CE.

Standard 1A

The student understands the decline of the Roman and Han empires.

GRADE LEVEL THEREFORE, THE STUDENT IS ABLE TO
5-12 Analyze various causes that historians have proposed to account for the decline of the Han and Roman empires. [Evaluate major debates among historians]
5-12 Trace the migrations and military movements of major pastoral nomadic groups into both the Roman Empire and China. [Reconstruct patterns of historical succession and duration]
7-12 Compare the consequences of these movements in China and the western part of the Roman Empire. [Analyze cause-and-effect relationships]
9-12 Analyze comparatively the collapse of the western part of the classical Roman Empire and the survival of the eastern part. [Compare and contrast differing sets of ideas]
9-12 Describe the consolidation of the Byzantine state after the breakup of the Roman Empire and assess how Byzantium transmitted ancient traditions and created a new Christian civilization. [Reconstruct patterns of historical succession and duration]

Standard 1B

The student understands the expansion of Christianity and Buddhism beyond the lands of their origin.

GRADE LEVEL THEREFORE, THE STUDENT IS ABLE TO
5-12 Assess how Christianity and Buddhism won converts among culturally diverse peoples across wide areas of Afro-Eurasia. [Demonstrate and explain the influence of ideas]
7-12 Analyze the spread of Christianity and Buddhism in the context of change and crisis in the Roman and Han empires. [Analyze cause-and-effect relationships]
7-12 Analyze the importance of monasticism in the growth of Christianity and Buddhism and the participation of both men and women in monastic life and missionary activity. [Compare and contrast differing values, behaviors, and institutions]

Standard 1C

The student understands the synthesis of Hindu civilization in India in the era of the Gupta Empire.

GRADE LEVEL THEREFORE, THE STUDENT IS ABLE TO
5-12 Describe fundamental features of the Hindu belief system as they emerged in the early first millennium CE. [Appreciate historical perspectives]
7-12 Explain the rise of the Gupta Empire and analyze factors that contributed to the empire’s stability and economic prosperity. [Analyze multiple causation]
7-12 Analyze how Hinduism responded to the challenges of Buddhism and prevailed as the dominant faith in India. [Reconstruct patterns of historical succession and duration]
7-12 Analyze the basis of social relationships in India and compare the social and legal position of women and men during the Gupta era. [Interrogate historical data]
5-12 Evaluate Gupta achievements in art, literature, and mathematics. [Appreciate historical perspective]
9-12 Analyze the Gupta decline and the importance of Hun invasions in the empire’s disintegration. [Analyze multiple causation]

Standard 1D

The student understands the expansion of Hindu and Buddhist traditions in Southeast Asia in the first millennium CE.

GRADE LEVEL THEREFORE, THE STUDENT IS ABLE TO
5-12 Assess the relationship between long-distance trade of Indian and Malay peoples and the introduction of Hindu and Buddhist traditions in Southeast Asia. [Analyze cause-and-effect relationships]
7-12 Explain the impact of Indian civilization on state-building in mainland Southeast Asia and the Indonesian archipelago. [Analyze cause-and-effect relationships]
7-12 Evaluate monumental religious architecture exemplifying the spread of Buddhist and Hindu belief and practice in Southeast Asia. [Draw upon visual sources]
9-12 Explain how aspects of Buddhism and Hinduism were combined in Southeast Asian religious life. [Interrogate historical data]

STANDARD 2

Causes and consequences of the rise of Islamic civilization in the 7th-10th centuries.

Standard 2A

The student understands the emergence of Islam and how it spread in Southwest Asia, North Africa, and Europe.

GRADE LEVEL THEREFORE, THE STUDENT IS ABLE TO
9-12 Analyze the political, social, and religious problems confronting the Byzantine and Sassanid Persian empires in the 7th century and the commercial role of Arabia in the Southwest Asian economy. [Analyze multiple causation]
5-12 Describe the life of Muhammad, the development of the early Muslim community, and the basic teachings and practices of Islam. [Assess the importance of the individual]
7-12 Explain how Muslim forces overthrew the Byzantines in Syria and Egypt and the Sassanids in Persia and Iraq. [Interrogate historical data]
5-12 Analyze how Islam spread in Southwest Asia and the Mediterranean region. [Analyze the influence of ideas]
9-12 Analyze how the Arab Caliphate became transformed into a Southwest Asian and Mediterranean empire under the Umayyad dynasty and explain how the Muslim community became divided into Sunnis and Shi’ites. [Reconstruct patterns of historical succession and duration]
7-12 Analyze Arab Muslim success in founding an empire stretching from western Europe to India and China and describe the diverse religious, cultural, and geographic factors that influenced the ability of the Muslim government to rule. [Analyze cause-and-effect relationships]

Standard 2B

The student understands the significance of the Abbasid Caliphate as a center of cultural innovation and hub of interregional trade in the 8th-10th centuries.

GRADE LEVEL THEREFORE, THE STUDENT IS ABLE TO
9-12 Compare Abbasid government and military institutions with those of Sassanid Persia and Byzantium. [Compare and contrast differing values and institutions]
7-12 Describe sources of Abbasid wealth, including taxation, and analyze the economic and political importance of domestic, military, and gang slavery. [Employ quantitative data]
7-12 Analyze why the Abbasid state became a center of Afro-Eurasian commercial and cultural exchange. [Analyze cause-and-effect relationships]
5-12 Analyze the sources and development of Islamic law and the influence of law and religious practice on such areas as family life, moral behavior, marriage, inheritance, and slavery. [Examine the influence of ideas]
7-12 Describe the emergence of a center of Islamic civilization in Iberia and evaluate its economic and cultural achievements. [Appreciate historical perspectives]
9-12 Describe the cultural and social contributions of various ethnic and religious communities, particularly the Christian and Jewish, in the Abbasid lands and Iberia. [Appreciate historical perspectives]
7-12 Evaluate Abbasid contributions to mathematics, science, medicine, literature, and the preservation of Greek learning. [Interrogate historical data]
5-12 Assess how Islam won converts among culturally diverse peoples across wide areas of Afro-Eurasia. [Analyze cause-and-effect relationships]

Standard 2C

The student understands the consolidation of the Byzantine state in the context of expanding Islamic civilization.

GRADE LEVEL THEREFORE, THE STUDENT IS ABLE TO
5-12 Explain how the Byzantine state withstood Arab Muslim attacks between the 7th and 10th centuries. [Analyze cause-and-effect relationships]
9-12 Compare Byzantium’s imperial political system with that of the Abbasid state. [Compare and contrast differing values and institutions]
7-12 Evaluate the Byzantine role in preserving and transmitting ancient Greek learning. [Reconstruct patterns of historical succession and duration]
9-12 Analyze the expansion of Greek Orthodox Christianity into the Balkans and Kievan Russia between the 9th and 11th centuries. [Analyze multiple causation]

STANDARD 3

Major developments in East Asia and Southeast Asia in the era of the Tang dynasty, 600-900 CE.

Standard 3A

The student understands China’s sustained political and cultural expansion in the Tang period.

GRADE LEVEL THEREFORE, THE STUDENT IS ABLE TO
7-12 Explain how relations between China and pastoral peoples of Inner Asia in the Tang period reflect long-term patterns of interaction along China’s grassland frontier. [Explain historical continuity and change]
9-12 Describe political centralization and economic reforms that marked China’s reunification under the Sui and Tang dynasties. [Analyze cause-and-effect relationships]
5-12 Describe Tang imperial conquests in Southeast and Central Asia. [Reconstruct patterns of historical succession and duration]
5-12 Describe the cosmopolitan diversity of peoples and religions in Chinese cities of the early- and mid-Tang period. [Appreciate historical perspectives]
7-12 Assess explanations for the spread and power of Buddhism in Tang China, Korea, and Japan. [Analyze cause-and-effect relationships]
7-12 Evaluate creative achievements in painting and poetry in relation to the values of Tang society. [Appreciate historical perspectives]

Standard 3B

The student understands developments in Japan, Korea, and Southeast Asia in an era of Chinese ascendancy.

GRADE LEVEL THEREFORE, THE STUDENT IS ABLE TO
7-12 Explain how Korea assimilated Chinese ideas and institutions yet preserved its political independence. [Compare and contrast different sets of ideas]
5-12 Describe the indigenous development of Japanese society up to the 7th century. [Interrogate historical data]
7-12 Assess the patterns of borrowing and adaptation of Chinese culture in Japanese society from the 7th to the 11th century. [Analyze the influence of ideas]
5-12 Describe the establishment of the imperial state in Japan and assess the role of the emperor in government. [Reconstruct patterns of historical succession and duration]
5-12 Assess the political, social, and cultural contributions of aristocratic women of the Japanese imperial court. [Appreciate historical perspectives]
7-12 Explain China’s colonization of Vietnam and analyze the effects of Chinese rule on Vietnamese society, including resistance to Chinese domination. [Evaluate alternative courses of action]
5-12 Explain the commercial importance of the Straits of Melaka and the significance of the empire of Srivijaya for maritime trade between China and the Indian Ocean. [Draw upon data in historical maps]

STANDARD 4

The search for political, social, and cultural redefinition in Europe, 500-1000 CE.

Standard 4A

The student understands the foundations of a new civilization in Western Christendom in the 500 years following the breakup of the western Roman Empire.

GRADE LEVEL THEREFORE, THE STUDENT IS ABLE TO
5-12 Assess the importance of monasteries, convents, the Latin Church, and missionaries from Britain and Ireland in the Christianizing of western and central Europe. [Analyze cause-and-effect relationships]
5-12 Explain the development of the Merovingian and Carolingian states and assess their success at maintaining public order and local defense in western Europe. [Reconstruct patterns of historical succession and duration]
7-12 Analyze how the preservation of Greco-Roman and early Christian learning in monasteries and convents and in Charlemagne’s royal court contributed to the emergence of European civilization. [Reconstruct patterns of historical succession and duration]
7-12 Analyze the growth of papal power and the changing political relations between the popes and the secular rulers of Europe. [Identify issues and problems of the past]
9-12 Compare the successes of the Latin and Greek churches in introducing Christianity and Christian culture to eastern Europe. [Compare and contrast differing sets of ideas]

Standard 4B

The student understands the coalescence of political and social order in Europe.

GRADE LEVEL THEREFORE, THE STUDENT IS ABLE TO
5-12 Assess the impact of Norse (Viking) and Magyar migrations and invasions, as well as internal conflicts, on the emergence of independent lords and the knightly class. [Analyze cause-and-effect relationships]
7-12 Assess changes in the legal, social, and economic status of peasants in the 9th and 10th centuries. [Interrogate historical data]
7-12 Analyze the importance of monasteries and convents as centers of political power, economic productivity, and communal life. [Examine the influence of ideas]
9-12 Explain how royal officials such as counts and dukes transformed delegated powers into hereditary, autonomous power over land and people in the 9th and 10th centuries. [Reconstruct patterns of historical succession and duration]

STANDARD 5

The development of agricultural societies and new states in tropical Africa and Oceania.

Standard 5A

The student understands state-building in Northeast and West Africa and the southward migrations of Bantu-speaking peoples.

GRADE LEVEL THEREFORE, THE STUDENT IS ABLE TO
7-12 Explain how the contrasting natural environments of West Africa defined agricultural production, and analyze the importance of the Niger River in promoting agriculture, commerce, and state-building. [Analyze cause-and-effect relationships]
7-12 Explain how Ghana became West Africa’s first large-scale empire. [Interrogate historical data]
7-12 Assess the importance of labor specialization, regional commerce, trans-Saharan camel trade, and Islam in the development of states and cities in West Africa. [Analyze multiple causation]
9-12 Infer from archaeological evidence the importance of Jenné-jeno or Kumbi-Saleh as early West African commercial cities. [Interrogate historical data]
9-12 Analyze causes and consequences of the settling of East, Central, and Southern Africa by Bantu-speaking farmers and cattle herders up to 1000 CE. [Analyze cause-and-effect relationships]

Standard 5B

The student understands the peopling of Oceania and the establishment of agricultural societies and states.

GRADE LEVEL THEREFORE, THE STUDENT IS ABLE TO
9-12 Analyze various theories drawing on linguistic, biological, and cultural evidence to explain when and how humans migrated to the Pacific Islands and New Zealand. [Evaluate major debates among historians]
5-12 Describe the routes by which migrants settled the Pacific Islands and New Zealand and the navigational techniques they used on long-distance voyages. [Draw upon data in historical maps]
7-12 Describe the plants and animals that early migrants carried with them and analyze how agricultural societies were established on the Pacific Islands and New Zealand. [Clarify information on the geographic setting]
9-12 Analyze how complex social structures, religions, and states developed in Oceania. [Analyze multiple causation]

STANDARD 6

The rise of centers of civilization in Mesoamerica and Andean South America in the first millennium CE.

Standard 6A

The student understands the origins, expansion, and achievements of Maya civilization.

GRADE LEVEL THEREFORE, THE STUDENT IS ABLE TO
5-12 Describe the natural environment of southern Mesoamerica and its relationship to the development of Maya urban society. [Analyze cause-and-effect relationships]
7-12 Analyze the Maya system of agricultural production and trade and its relationship to the rise of city-states. [Analyze cause-and-effect relationships]
9-12 Interpret the Maya cosmic world view as evidenced in art and architecture and evaluate Maya achievements in astronomy, mathematics, and the development of a calendar. [Appreciate historical perspectives]
5-12 Analyze how monumental architecture and other evidence portrays the lives of elite men and women. [Draw upon visual sources]
7-12 Assess interpretations of how and why Maya civilization declined. [Evaluate major debates among historians]

Standard 6B

The student understands the rise of the Teotihuacán, Zapotec/Mixtec, and Moche civilizations.


