The Neolithic Revolution

When and why did humans start farming crops, rather than hunting and gathering their food?

(excerpted from Chapter 2 of Pandora’s Seed ©Spencer Wells, 2010)

The burst dam

The Australian archaeologist Vere Gordon Childe led quite a fascinating life. In his youth he was a Marxist who served as private secretary to the Premier of the Australian state of New South Wales, as well as a talented linguist and inveterate traveler. Only later did he make the decision to pursue archaeology as a profession, first at the University of Edinburgh and later at the University of London. His early field work was on Skara Brae, in the Scottish Orkney Islands, but his career really took off when he turned his attention to the early agricultural communities of the eastern Mediterranean.

Childe coined the term ‘Neolithic Revolution’, and he fully meant it to be taken as a revolutionary transition. All that came before was savagery (which preceded barbarity in the linear progression of his Marxist-influenced view of cultural evolution), and the fruits of civilization only arose after this momentous event. To Childe, the dawning of the Neolithic was the defining point in our history as a species, and he popularized this notion in his books, particularly his widely read New Light on the Most Ancient East and Man Makes Himself, which influenced the general public and subsequent generations of professional archaeologists.

According to Childe, one of the key triggers in the onset of agriculture was the abrupt warming experienced in the Middle East at the end of the last ice age. He didn’t use the relatively sparse data from the rest of the world, perhaps assuming that what happened first in the Fertile Crescent later diffused to other regions as they tailored the methods of agriculture to their own crops. According to Childe, this warming trend affected the types of plant species growing there, leading some groups to begin to cultivate wheat and barley. This successful invention led to an expansion in population and the rise of urban civilization, and gradually from this Middle Eastern source Neolithic farmers spread themselves (and their advanced culture) far and wide across the rest of western Eurasia.

While generally correct, this model has since been modified with the re-assessment of what happened to the climate at the end of the last Ice Age. While the general trend over the past 15,000 years has been an increase in average global temperature, the period between 15,000 and 10,000 years ago was marked by abrupt advances and reversals in the warming trend. It was in this chaotic cauldron that agriculture developed.

Perhaps the best studied part of this 5,000-year period is a mini-Ice Age known as the Younger Dryas, which lasted over a thousand years from around 12,700 to 11,500 years ago. It is named after the genus of a small plant, Dryas octopetala, found in the tundra regions of Scandinavia, which was replaced by forest in the southern part of its range at the end of the ice age, but reappeared during the cooler conditions of the Younger Dryas. What led this small ice age plant to suddenly reappear is not completely clear, but the most likely theory is that it was caused, paradoxically, by the sudden melting of an ice dam in North America.

Now wait, you might be saying — it was affected by an ice dam halfway across the world? This was a very special dam, though. The warming temperatures at the end of the last Ice Age caused the Laurentide ice sheet we learned about in Chapter 1 to retreat from its foray into Illinois, and the pieces that remained around 13,000 years ago served to hold back Lake Agassiz, a massive body of fresh water located in what is today central Canada. Agassiz held much of the meltwater from the former ice sheet — while today’s Great Lakes are large bodies of water, this was a monster larger than all of them put together, larger even than the state of California or the Caspian Sea.

When the ice dam holding the lake back melted, it released the water, which flowed into the Saint Lawrence river basin and out into the north Atlantic. The flood of fresh water formed a kind of ‘shield’ on the surface of the ocean — it floated because its density is lower than that of salt water — which killed the Gulf Stream that brings warmer water from the tropical Gulf of Mexico into the north Atlantic. This natural flow had warmed western Eurasia like a massive radiator since the end of the ice age, and still does. It is the reason why palm trees grow in Cornwall, at the southwestern tip of Great Britain, despite the fact that its latitude is 50°, the same as Winnipeg in Canada, and nearly 30° north of the Tropic of Cancer. When the influx of fresh water killed the Gulf Stream, western Eurasia was plunged back into ice age-like conditions — the Younger Dryas.

Ecological shifts from the Last Glacial Maximum (~16 KYA) to the Holocene Optimum (~9 KYA)

While all of this was going on in the north Atlantic, the inhabitants of the Fertile Crescent had been getting used to the warmer temperatures. Between roughly 16,000 and 12,700 years ago the region was warming up and becoming wetter, which led to the expansion of plant species that had formerly been limited in their distribution to mountain valleys, where there were reliable supplies of water. The ready availability of these grasses — the ancestors of wheat, rye and barley — led some human populations to focus much of their energies on gathering them. It was a plentiful and calorie-rich food source, so it made sense. The Natufian people, who flourished in the western part of the Fertile Crescent during this period, were largely grain gatherers. And unlike almost all hunter-gatherers who came before, they were sedentary — they lived in small villages.

Since the earliest days of hominid evolution, our ancestors had been semi-nomadic. This is because of the uncertainty inherent in being a hunter-gatherer — if the food supply dwindles in one place, you pack up and move to better foraging grounds. It was our ability to do this successfully that led our ancestors out of their homeland in Africa again and again over the millions of years of hominid evolution. Homo erectus, who left around 1.8 million years ago, was simply following food, as was Homo heidelbergensis, who left around 500,000 years ago and gave rise to the Neanderthals in Europe. We are the descendants of a third wave of African hunter-gatherer migrants that left around 50,000 years ago, as I detailed in my book The Journey of Man. And throughout this long line of human evolution we had been semi-nomadic, staying in one place only as long as the pickings were good and moving on when they weren’t.

This started to change toward the end of the Paleolithic period — the period that preceded the Neolithic, when all humans were hunter-gatherers. According to recent research carried out in Israel by archaeologist Dani Nadel and his colleagues, there is evidence for grain gathering and flour-making at one of his sites near the Sea of Galilee dating back to 23,000 years ago. This would place them within the early part of the time period allocated to the Kebaran culture, which preceded the Natufian in the Levant. The Kebarans were the link with the true hunter-gatherers of Middle Eastern prehistory — highly nomadic, shifting their settlements seasonally to follow the food and water supplies during the cold, dry terminal period of the last ice age. But even these mobile hunter-gatherers seem to have recognized the advantages of collecting and grinding wheat, albeit in much smaller quantities than the Natufians.

When the last ice age ended, though, wheat expanded its range. Life became easier for our ancestors — food that had been difficult to obtain during the cold, dry conditions of the last ice age suddenly became more plentiful. This allowed them to finally settle down in an area with large quantities of the easily gathered grain. Gathering wild wheat produced more calories of food for each calorie of energy invested than early forms of agriculture did, which made it a fabulously valuable food source for the Natufians. Moreover, it was a particular type of food resource that lent itself to long-term storage — a seed that could be stored dry for years. A couple of weeks of intensive grain gathering in the fall could yield enough wheat to feed a family for a year, supplemented with nuts and game meat. Life was good, and they made the most of it by…well…doing what people do. They had babies.

The hunter-gatherer way of life had limited the number of children people had as part of a complex feedback loop with the environment. If the population grew too large it was necessary to split and form two smaller groups, one of which would typically move on to new hunting grounds. The calorie-rich environment of the Fertile Crescent wild grain fields increased the region’s carrying capacity (the number of people the land could support), and the human population responded. Natufian settlements during this period expanded into villages of 150 people or more, complete with circular houses and storage pits. It was a radical shift in our relationship with nature, and it only happened because the Natufians could rely on a steady supply of grain from the territory where they lived.

Then, suddenly, it all changed. That burst dam thousands of miles away in North America, setting in motion the Younger Dryas, brought a return of the long winter. The population of the Middle East was cast back into the Ice Age, but this time they had a strike against them: they couldn’t move on to greener pastures. They had invested too much in their villages, the collective memory of the good times was probably still fresh in their minds (leave the village to return to the hardscrabble life of a nomad? Unthinkable!), and in all likelihood there were now too many people to return to life as nomadic hunter-gatherers. The Natufians were in a bind.

Although Dryas refers to a tundra plant species, it could more aptly refer to the drying effect during these periods of global cooling, for this was the main effect in the Middle East. As the land dried out the wild grain retreated from the lowlands into the higher mountain valleys, its distribution determined by where it could get enough water. The Natufians had to travel further and further from their lowland settlements to gather enough to survive. This would have put tremendous pressure on the food supply, and probably resulted in an increased mortality rate in these people accustomed to a land of plenty. It was humanity’s first real encounter with Thomas Malthus’s conjecture that population growth will eventually produce more people than can be supported by the available food supply.

Around this time we also see evidence for the extinction of megafauna, large mammals that would have formed part of the diet of these hunter-gatherers. While such extinction events had happened before, most notably in Australia when humans first arrived there around 50,000 years ago, as well as in North America with humanity’s arrival around 15,000 years ago, the clearest evidence in Europe and the Middle East comes at the end of the last ice age. This extinction event is further evidence that climate and human population pressures were having a significant effect on food resources. A population that had been able to live sustainably during the warm period immediately following the end of the last Ice Age was now too large to be supported by the diminished resources of the Younger Dryas, and the animal species lost out.

Extinction of megafauna around the world during early human migrations

Then, sometime between 11,000 and 10,000 years ago, during the Younger Dryas, one of these stressed Natufians had a revolutionary idea. What if, instead of walking further each day to gather food, they simply planted it close to the village? It was probably a woman, since women typically do the gathering in hunter-gatherer populations and thus had access to the seeds — and an incentive to reduce her gathering commute! Her first efforts must have been rewarded with admiration from the entire village, and the idea quickly spread. Virtually overnight humans had gone from being controlled by their food supply to controlling it.

This course of events can actually be seen in the bones of the people living in the region at the time, making use of something called the strontium/calcium ratio. Strontium (Sr) is an element that accumulates in human bone, and its content is determined by the Sr level in the groundwater of the region where you live (as well as exposure to nuclear fallout, which has Sr-90 as one of its major constituents). Plants absorb Sr as they grow, and then pass it on to the animals eating them. Animals excrete Sr, so not all of the Sr eaten is passed on to carnivores. Thus, the higher the proportion of plant food in your diet, the higher your Sr levels. Natufian remains show a very high level of Sr during the intense gathering phase prior to the Younger Dryas; the level then drops significantly as the wild stands of grain shrank and the Natufians turned to hunting to survive. It rises dramatically again after the onset of domestication, as the new culture took hold and plants made up a larger proportion of the diet.
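The logic of that strontium signal can be sketched as a simple two-source mixing model. The numbers below are purely illustrative (hypothetical values, not real Natufian measurements), but they show why bone Sr tracks the plant fraction of the diet:

```python
# Illustrative sketch: bone Sr rises with the plant fraction of the diet,
# because animals excrete Sr, so meat carries a lower Sr signal than plants.
# Both endmember values below are hypothetical, in arbitrary units.

PLANT_SR = 1.0  # relative Sr signal of plant foods
MEAT_SR = 0.2   # meat is depleted: each trophic step sheds Sr

def bone_sr(plant_fraction: float) -> float:
    """Linear two-source mixing of the dietary Sr signal."""
    return plant_fraction * PLANT_SR + (1 - plant_fraction) * MEAT_SR

# The Natufian sequence described above, as guessed-at diet fractions:
phases = {
    "intense gathering (pre-Younger Dryas)": 0.8,
    "Younger Dryas (turn to hunting)": 0.4,
    "early domestication": 0.9,
}
for phase, frac in phases.items():
    print(f"{phase}: relative bone Sr ~ {bone_sr(frac):.2f}")
```

The high-gather-low-hunt-high-farm pattern in the output mirrors the rise, dip, and rebound seen in the actual Natufian bones.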

This change in our relationship with nature had an extraordinarily far-reaching impact on the future of humanity — it was about far more than just food. We’ll examine the early fallout from cultivation a bit later in this chapter, but for now we need to zoom out from our close focus on this one region. While this sequence of events was playing out in the Middle East, extraordinary things were happening in other places around the world. We’ll see that the Natufians were not alone in their early attempts to domesticate crops — cultivation seems to have been a global trend around this time. But in the days before mass media and the internet, how did the seed of this revolutionary idea get planted in places as far afield as Mesoamerica, southern China and New Guinea? And what does it reveal about the next stage in the development of agriculture — the crucial step from planting wild seeds to their domestication, coupled with selective breeding for desirable traits?

Peaks and valleys

The Soviet botanist and geneticist Nikolai Vavilov led one of those lives that deserves to be a feature film. He was born into a bourgeois merchant family in Moscow in 1887, and had a younger brother who became a famous physicist. During his youth he spent several years traveling and studying in Europe and returned home just in time for the Bolshevik Revolution. He became a prominent member of the Soviet regime, a member of the Supreme Soviet, and a recipient of the prestigious Lenin Prize. He created and headed for nearly 20 years the Institute for Plant Industry in Leningrad (now St. Petersburg), and today this institute and the Institute of General Genetics in Moscow both bear his name. Yet, due to the bizarre rise of an agronomist named Trofim Lysenko in the 1930s and his pseudo-scientific attacks on genetics and the basic rules of evolution (both championed by Vavilov), this great scientist starved to death in 1943 in a gulag, having been jailed by Stalin in 1940 for allegedly plotting to destroy Soviet agriculture.

Vavilov was a hugely influential thinker on the origins of plant domestication, and the Institute in Leningrad still houses one of the world’s largest seed banks, created in an effort to preserve and study the diversity of cultivated crops from around the world. During the 28-month Siege of Leningrad the workers managed to protect the collection from starving Leningrad residents who tried repeatedly to eat its contents, and it is still a major botanical resource to this day.

In his extraordinarily influential work on domesticated plants, Vavilov described many primary centers of plant domestication. One is the Fertile Crescent, which we’ve just learned about. Other major centers are in China, Mesoamerica, and the Andes of South America. A wide range of places, but all are united in one aspect: they are all mountainous regions. Why not coastal areas, or prairies? Primarily because mountains serve as so-called refugia of biological diversity — places where species continue to thrive when climatic shifts, which have occurred frequently throughout the past few million years, leave the surrounding plains too dry to sustain them. Mountains are capable of drawing enough rainfall that they serve as relatively safe havens in times of climatic stress, so they are the places where genetic diversity is typically the highest. And high genetic diversity allows for the development of advantageous traits that can be selected for by humans, including seed retention and other characteristics that suit their use as food crops.

Humans can’t live easily in high mountains — we tend to prefer lowlands for climatic reasons — but plants advance and retreat, ‘breathing’ in and out of the lowlands during wetter and drier phases. This gives us our first clue as to why domestication happened in all of these places at the same time.

Mesoamerica, for instance, has given us many crops that are indispensable components of the modern diet: corn, tomatoes, beans, chiles, chocolate, vanilla, squash, pineapples, avocados and pumpkins. All were domesticated in the region of present-day Oaxaca, in southern Mexico, which has a rugged, mountainous terrain that has served to fragment human populations, resulting in a tremendous amount of cultural and linguistic diversity to match its botanical horn of plenty. Corn is far and away the most important Oaxacan crop, and the evidence is that it was cultivated from around 10,000 years ago. There is some debate about corn’s botanical ancestor — its closest wild relative, teosinte (pronounced tea-o-sin-tay), is so different in morphology that many scientists find it difficult to believe that one developed from the other — but not about its geographic origin.

Domesticated corn later spread far from its Mexican homeland, reaching into North and South America over the subsequent 8,000 years, much as wheat spread far from its origin in the Fertile Crescent. The spread of corn has been well documented from human remains in North America, where the sudden transition from hunting and gathering to corn agriculture can be seen in the ‘carbon signature’ in the bones, in a similar way to that in which strontium revealed the sequence of Neolithic events in the Middle East. This is because hunter-gatherers mostly eat what are known as C3 plants, which use carbon dioxide from the atmosphere to produce 3-carbon molecules as their energy store. 95% of the world’s plants are members of this C3 group, which was the first to evolve, over 250 million years ago. A more efficient type of plant metabolism, known as C4, evolved more recently, within the past 65 million years. The C4 plants are mostly tropical grasses, including corn, millet and sugarcane, and they store their energy in 4-carbon molecules, thus the C4 epithet.

Change in C4 plant consumption in North America around the introduction of corn agriculture

There is one other difference between C3 and C4 plants, and this is how this little foray into plant physiology fits into our story. The carbon atoms that these plants are using to make sugars and starches (where the CO2 in your breath and car exhaust ultimately ends up) aren’t all identical. There are several different variants of carbon, distinguished by their atomic anatomy. Most carbon has 6 protons and 6 neutrons packed into its nucleus, giving it an atomic weight of 12 (6+6). However, there are rarer forms of carbon that have 7 or even 8 neutrons packed with their 6 protons, giving them atomic weights of 13 and 14. Carbon-14 is extremely rare, but its tendency to shed atomic baggage (an electron and an anti-neutrino, if you must know) as it decays into nitrogen-14 at a predictable rate makes it extremely useful as a way of dating once-living material. Carbon-13 doesn’t decay, but sticks around indefinitely and gives us another tool in our archaeological atomic arsenal. It turns out that C3 plants, for whatever reason, are picky and don’t like carbon-13, excluding it from their metabolic machinery. C4 plants don’t seem to care, and will use whatever is available. This means that C4 plants have higher ratios of carbon-13 to carbon-12 than do C3 plants.

So what does all of this mucking about in the world of carbon atoms mean? When people add C4 plants (like corn) to their diet, the ratio of carbon-13 to carbon-12 in their bones also increases. By carefully measuring these carbon ratios in ancient bones we can see when people started to eat C4 plants like corn. And when we do this in North America, around the time that corn started to spread into a region, we can see a dramatic increase. While the ratios aren’t necessarily indicative of the actual amounts in the diet (it’s unlikely that 75% of their calories came from corn, as the figure above might suggest), they do indicate the extraordinary shift in diet that accompanied the spread of agricultural ‘killer apps’ like corn and wheat.
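The arithmetic behind this inference can be sketched as a linear mixing model between two isotopic ‘endmembers’. The endmember values and the bone-to-diet offset below are typical textbook figures rather than measurements from any particular study, and the two bone values at the end are hypothetical:

```python
# Back-of-the-envelope sketch of the carbon-isotope logic. Delta-13C measures
# the carbon-13/carbon-12 ratio relative to a standard, in parts per thousand.
# All values below are typical approximations, not data from specific remains.

D13C_C3 = -26.5        # typical delta-13C of C3 plants (per mil)
D13C_C4 = -12.5        # typical delta-13C of C4 plants like corn (per mil)
COLLAGEN_OFFSET = 5.0  # bone collagen runs roughly 5 per mil above the diet

def c4_fraction(bone_d13c: float) -> float:
    """Invert a linear two-endmember mixing model to estimate the
    fraction of dietary carbon that came from C4 plants."""
    diet_d13c = bone_d13c - COLLAGEN_OFFSET
    return (diet_d13c - D13C_C3) / (D13C_C4 - D13C_C3)

# A pre-corn forager versus an early corn farmer (illustrative bone values):
for label, d13c in [("forager", -20.0), ("corn farmer", -10.0)]:
    print(f"{label}: ~{c4_fraction(d13c):.0%} of dietary carbon from C4 plants")
```

A shift of ten per mil in the bone value translates into a swing of most of the diet from C3 to C4 sources, which is why the arrival of corn shows up so clearly in the skeletal record.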

Similarly, rice seems to have been domesticated first in the mountains of southern China and northern India, where its wild ancestor Oryza rufipogon still grows. Through careful analyses of phytoliths, microscopic stone-like particles in plants that serve as a kind of species fingerprint and are preserved in the archaeological record, Zhijun Zhao of the Smithsonian Tropical Research Institute has found evidence that hunter-gatherers living on the Yangtze River in central China were eating rice around 13,000 years ago. With the onset of the Younger Dryas, however, the rice phytoliths disappear from the archaeological record, only reappearing around 11,000 years ago when warmer and wetter conditions returned — and based on changes in their phytoliths, these appear to have been cultivated. It seems that during the Younger Dryas the rice retreated back to a more hospitable environment, and humans — as in the Middle East and the mountain valleys of Oaxaca — were forced to start planting it to keep the grain in their diet.

Thus, in the centers of domestication for the three main grain crops around the world, we see a similar interaction between hunter-gatherers and their local grains. Intensive foraging at the end of the last ice age, coupled with warmer, wetter conditions, led to specialized gathering of particular plant species and an increase in population. The onset of the Younger Dryas created a crisis in food supply, which forced these sedentary foragers to start cultivating grains that had previously been plentiful in the wild. The combination of a demographic expansion followed by a climatic stress probably explains why we see the development of agriculture independently at the same time around the world. Cultivating food allowed them to survive the cold snap of the Younger Dryas, and when favorable conditions returned agriculture was ready to take off. All that was needed was one final step: domestication.

The importance of duplication

Today Captain Bligh is a name that’s synonymous with cruel leadership, but in fact he was quite a good naval commander. His fall into ignominy came from his tough treatment of his crew during a six-month sojourn in Tahiti, during which he tried to enforce a ban on ‘liaisons’ with the local women, although this was in large part because of his concern for the Tahitians: Bligh didn’t want his crew to spread sexually transmitted diseases on the island. If they hadn’t spent so much time in Tahiti, the Bounty would probably be a footnote in naval history textbooks and history would have a different term for a cruel authoritarian. Unfortunately, Bligh couldn’t leave Tahiti any sooner because of the difficulty of cultivating breadfruit.

Unlike most plants humans eat, breadfruit typically has no seeds (there are a few varieties that do, but they aren’t widely cultivated). The only way to propagate the plant is by air layering, a tedious process in which a small incision is made on a branch or stem, which is then wrapped with some rooting medium, like moss or soil. After a few weeks new roots have grown from the incision and the branch can be removed and planted on its own, eventually yielding a new independent plant. Two botanists from the Royal Botanic Gardens at Kew accompanied Bligh on his voyage in order to perform this time-consuming task. It was the only reason for the Bounty’s multi-year voyage — Bligh was supposed to deliver as many of the plants as possible to British colonies in the West Indies in order to feed the burgeoning slave population there. Breadfruit, despite the difficulties involved in cultivating it, is a calorie-rich and easily grown food source once the tedious business of propagation is done.

The complicated techniques that allow breadfruit to be cultivated are among the many that have been developed by humans since the dawn of agriculture. Propagation is one of the key parts of domestication, because without it you can’t make more of your food source. Doing it consistently requires a tremendous amount of knowledge about the biology of the species in question — life history, preferred growth conditions, and many other details…While we take it for granted that ranchers can breed cattle and poultry farmers can raise chickens from egg to adulthood to egg, it actually takes a great deal of effort to figure out how to do it in the first place — as it did with the propagation of breadfruit.

In perhaps the best-studied example of the development of agriculture during the Neolithic period, an epic book about the excavation of Abu Hureyra in northern Syria entitled Village on the Euphrates, author Andrew Moore and his co-writers lay out the critical role of propagation very early on: “Domestication may be defined in several ways…but the essence of it is that humans usually influence the breeding of the species concerned.” This is the key step in creating more of the species — if your animals and plants don’t produce offspring, you have to keep going back to the wild for more.

Domestication is about far more than simply making more of the species in question, though. It is also about selecting for traits that make the species in question a better source of food…Selecting for these traits is easier in some species than in others, and it turns out that our Big Three — wheat, rice and corn, which together account for around half of all calories consumed in the modern world — are particularly suitable for selective breeding. All of them are what is known as polyploid, which means their genomes have been duplicated, in some cases more than once. Many plants are polyploid, and genome duplication seems to have happened quite often during plant evolution. It’s as if you photocopied the entire genome, creating a spare. This has some pretty interesting consequences for what you can do to the plant with selective breeding.

When you have a spare copy of something, you can take more risks than if you had only one. It’s kind of like the ‘lives’ you have in a video game — when you’re down to your last one you can’t afford to make a mistake. This holds true at the genetic level as well, since having a duplicate allows you to ‘tinker’ with one of the copies while retaining an unaltered version. It gives you a backup, in other words, in case something goes wrong in your tinkering. Duplicate copies can open up new opportunities for evolutionary change without losing vital functions — and possibly lead to more rapid evolution. This idea was first championed by Susumu Ohno, a Japanese-American population geneticist, who wrote a classic book entitled Evolution by Gene Duplication in 1970. In this book Ohno presented what he believed to be one of the fundamental mechanisms of molecular evolution — duplicated genes leading to rapid evolutionary change due to the relaxed selection made possible by having a backup copy. He also coined the term “junk DNA,” referring to the large stretches of DNA in the genome with no known function. This is the ultimate fate of gene duplicates that suffer a fatal mutation and become non-functional. But the working copy of the gene keeps the organism alive. Cancerous tumors almost all duplicate their DNA, becoming polyploid as they develop. Geneticists believe this gives them more plasticity — more options to develop in ways that normal cells never would — due to the duplication of key genes.

In addition to being polyploid, wheat, rice and corn also seem to have a very high rate of mutation. Their DNA, it seems, is in a constant state of flux, duplicating and deleting parts in a molecular shuffle that produces a high level of natural variation in many traits. Some of this shuffling is caused by the presence of small DNA parasites known as transposable elements. These are like little viruses embedded in the genome, and may be the remnants of what were once active retroviruses (a family that includes HIV, the human immunodeficiency virus) that lost their ability to infect other cells but retained the retroviruses’ penchant for integrating into DNA and hopping around. Their discovery by Barbara McClintock in the 1940s and 1950s, during her efforts to understand some of the odd characteristics of corn genetics, was initially met with skepticism from fellow geneticists, but later research showed her work to have been correct and she was awarded a Nobel Prize in 1983.

Corn is a wonderful example of how careful selective breeding produced characteristics that are a far cry from its likely wild ancestor. Teosinte looks completely different from today’s cultivated corn — as you can see in the figure below — and while it is impossible to calculate the level of investment required to domesticate wheat, rice and corn during Neolithic times, it must have taken an enormous amount of effort to select for such extreme changes in morphology. In fact, recent genetic studies have suggested just how difficult it must have been.

Teosinte (left), the wild ancestor of corn (right)

Three genes, known as teosinte branched 1 (tb1), pro-lamin box binding factor (pbf) and sugary 1 (su1), are key to creating certain traits that distinguish corn from teosinte. Despite their complicated names, and even more complicated biochemical functions (tb1, for example, determines how the cobs are arranged on the corn plant, while su1 determines the mix of sugars found in the corn kernel), all seem to have been under strong selection as early as 4,400 years ago, according to a recent analysis of these genes in ancient corn remains. However, selection for these traits seems to have been ongoing nearly 2,000 years later, showing how difficult the process of selection is, particularly in a society lacking today’s scientific knowledge. That these early Mexican farmers were able to create corn from teosinte is a remarkable achievement.

The genetic plasticity of wheat, rice and corn gave them an edge over other potential food crops, and it is a large part of the reason that they are so widely cultivated today. While the Natufians at Abu Hureyra consumed around 150 different plants, gathered (along with wheat) from the rich hills of the northern Fertile Crescent, by the time domestication was complete a few thousand years later their diet had dropped to only eight species, and wheat was far and away the most important dietary component. Today, the ‘big three’ cereals account for around 90% of the grain under cultivation — they won the race to be humanity’s most important foods.

But here is the sting in the tail. When we used our amazing ingenuity to select for traits that allowed us to cultivate these incredibly successful foods, we unknowingly set in motion a strong selective regime on ourselves…and the end result of the higher (and less complex, in terms of species mix) carbohydrate levels in our diet is still playing out….