What factors account for the dramatic decrease in the Native American populations of North America?

(Inside Science) -- In 1870, there were at least 10 million bison in the southern herd on the North American plains. Fewer than 20 years later, only 500 wild animals remained. That part of the story -- the bloody removal of the animals for their hides and meat, and to devastate Native American communities -- is well-known. We have countless movies, books and ballads about the dust-strewn slaughter.

What hasn’t been so well-studied is the story of what happened next to the people involved. That’s the focus of a new report presented at the American Economic Association meeting in January. The researchers claim that the rapid destruction of the bison created an equally dramatic decline in the heights of the Native Americans who depended on them -- and a reduction in per capita income that persists today.

“You could imagine what would happen to certain sections of the American economy if oil disappeared, and people had no alternative,” said Donna Feir, an economist at the University of Victoria, Canada. Now, imagine that these people couldn’t migrate to other economic activities, and were kept in certain locations for 50 years. It would be a cultural and economic bomb that would continue for decades. 

Before the bison disappeared, the native people living on the plains were among the tallest in the world. They didn’t diversify their work beyond a single resource, the researchers said, because the bison pretty much supplied them with everything they needed. They were at least as well-off as European colonists at the time, researchers have argued. “The idea of poverty coincides with the reservation era, which happened after the slaughter of bison,” said Feir.

The slaughter changed everything. It happened in two waves. The first came slowly: European settlers brought cattle with them, and those animals competed for land with the wild bison. The second started in the 1870s, after German leather-makers created technology that allowed bison hides to be tanned more efficiently and economically. During 1871 and 1872, an average of 5,000 bison were killed every day, as thousands of hunters poured onto the plains. The slaughter continued until 1889, when only about 85 free-ranging bison remained.

In just a generation, the average height of the Native American people who depended on bison dropped by an inch or more, as measured by physical anthropologist Franz Boas, who collected data on the height, gender and age of over 15,000 Native Americans between 1889 and 1919.

Groups that experienced rapid bison extinction had an even more precipitous drop -- children born after the slaughter were up to 2 inches shorter at adulthood than those born before the slaughter, the researchers found. Population declined, as well. And what the researchers called a kind of cultural depression settled on bison-dependent groups.

Feir pointed out that the drop in height wasn’t as steep for women as it was for men -- perhaps because traditional women’s skills, like making clothing, were more adaptable to new locations and animals. While men’s traditional skills could have transferred to cattle ranching, government regulations forced people onto reservations where that was not an option. The researchers found that even today, formerly bison-dependent societies have between 20 and 40 percent less income per capita than the average Native American nation.

The economic shock of the end of bison culture -- and its particular effect on men -- didn’t surprise Dan Flores, a New Mexico-based historian who specializes in the American West. He said he was intrigued and impressed by the research’s ability to compare economic effects over several generations.

Flores said that he’d like to see archaeological evidence going back further in time -- 500 or 1,000 years. “One of the problems of extrapolating from the Boas data is that several of the classic bison Indians in history -- the Siouan speakers, the Cheyennes, the Comanches -- only arrived on the plains in the 1700s,” he said. “Siouan peoples like the Osages and Lakotas were indeed tall, but was it bison that made them so, or were they just from a gene pool of taller people? Archaeological data from these groups before and after they came to the plains would tell us whether it was a bison diet or just genes that made them taller.”

He also suggested that the decline of the bison actually began to affect tribal economies and cultures a half-century before the endgame of the 1870s and 1880s -- meaning that only a handful of groups were in places that experienced the precipitous drop. Flores also believes that the U.S. government wasn't the only factor to blame for the bison’s decline. “I argue that climate change, accidentally introduced bovine diseases, competition for grass and water from burgeoning horse populations, but most of all the effects of an unregulated market brought bison down,” he said, both in the U.S. and in Canada.

Feir pointed out that there were a lot of restrictions on Native Americans around this time. Native people didn’t gain full citizenship in the U.S. until 1924. “Society viewed them as an enemy,” she said. Taking the bison away from people who depended on the animals left them with shorter bodies and poorer lives, she said.

Today, an estimated 500,000 bison roam North America on private and public lands -- a far cry from their peak population, but far more than their 19th-century nadir.

Over the past few years, bison-dependent tribes in the U.S. and Canada have come together to sign a new Buffalo Treaty, cementing their economic and cultural relationship with the animal. At the same time, bison are slowly being reintroduced to the areas they once roamed. Feir wonders if this re-emergence of the bison in modern culture will give a boost to the people who once depended on them, adding, “It may improve economic factors.”


The thoughts and perspectives of indigenous individuals, especially those who lived during the 15th through 19th centuries, have survived in written form less often than is optimal for the historian. Because such documents are extremely rare, those interested in the Native American past also draw information from traditional arts, folk literature, folklore, archaeology, and other sources.

Native American history is made additionally complex by the diverse geographic and cultural backgrounds of the peoples involved. As one would expect, indigenous American farmers living in stratified societies, such as the Natchez, engaged with Europeans differently than did those who relied on hunting and gathering, such as the Apache. Likewise, Spanish conquistadors were engaged in a fundamentally different kind of colonial enterprise than were their counterparts from France or England.

The sections below consider broad trends in Native American history from the late 15th century to the late 20th century. More-recent events are considered in the final part of this article, Developments in the late 20th and early 21st centuries.

Scholarly estimates of the pre-Columbian population of Northern America have differed by millions of individuals: the lowest credible approximations propose that some 900,000 people lived north of the Rio Grande in 1492, and the highest posit some 18,000,000. In 1910 anthropologist James Mooney undertook the first thorough investigation of the problem. He estimated the precontact population density of each culture area based on historical accounts and carrying capacity, an estimate of the number of people who could be supported by a given form of subsistence. Mooney concluded that approximately 1,115,000 individuals lived in Northern America at the time of Columbian landfall. In 1934 A.L. Kroeber reanalyzed Mooney’s work and estimated 900,000 individuals for the same region and period. In 1966 ethnohistorian Henry Dobyns estimated that there were between 9,800,000 and 12,200,000 people north of the Rio Grande before contact; in 1983 he revised that number upward to 18,000,000 people.

Dobyns was among the first scholars to seriously consider the effects of epidemic diseases on indigenous demographic change. He noted that, during the reliably recorded epidemics of the 19th century, introduced diseases such as smallpox had combined with various secondary effects (e.g., pneumonia and famine) to create mortality rates as high as 95 percent, and he suggested that earlier epidemics were similarly devastating. He then used this and other information to calculate from early census data backward to probable founding populations.
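At its core, this kind of back-projection is simple arithmetic: divide a later, recorded population by the fraction assumed to have survived the intervening epidemics. The sketch below illustrates the logic with invented numbers; the census figure, the mortality rates, and the single-epidemic assumption are hypothetical and are not Dobyns's actual inputs.

```python
# Illustrative sketch only: the census figure, mortality rates, and epidemic
# count below are hypothetical and are not Dobyns's actual inputs.

def back_project(observed_population, mortality_rate, num_epidemics=1):
    """Estimate a founding population from a later recorded population,
    assuming each epidemic killed `mortality_rate` of the people alive."""
    survivor_fraction = 1.0 - mortality_rate
    return observed_population / (survivor_fraction ** num_epidemics)

# A recorded population of 50,000 implies a founding population of 1,000,000
# if a single epidemic killed 95 percent of the population ...
print(round(back_project(50_000, 0.95)))   # 1000000
# ... but only 250,000 if the true mortality rate was 80 percent.
print(round(back_project(50_000, 0.80)))   # 250000
```

Because the recorded figure is divided by a small survivor fraction, a modest change in the assumed mortality rate or in the number of epidemics shifts the result several-fold, which is why the assumptions matter far more than the arithmetic.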

Dobyns’s figures are among the highest proposed in the scholarly literature. Some of his critics fault Dobyns for the disjunctions between physical evidence and his results, as when the number of houses archaeologists find at a site suggests a smaller population than do his models of demographic recovery. Others, including the historian David Henige, criticize some of the assumptions Dobyns made in his analyses. For instance, many early fur traders noted the approximate number of warriors fielded by a tribe but neglected to mention the size of the general population. In such cases small changes in one’s initial presumptions—in this example, the number of women, children, and elders represented by each warrior—can, when multiplied over several generations or centuries, create enormous differences in estimates of population.
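Henige's objection is likewise arithmetic. A warrior count only becomes a population estimate once it is multiplied by an assumed number of women, children, and elders per warrior, and that figure is then compounded backward over generations. The sketch below uses entirely hypothetical values (the warrior count, the 3-versus-5 people-per-warrior multipliers, and the 30 percent decline assumed per generation are all invented for illustration) to show how a small change in that single presumption shifts the final estimate by tens of thousands.

```python
# Hypothetical numbers throughout: the warrior count, household multipliers,
# and per-generation decline rate are illustrative assumptions only.

WARRIORS_REPORTED = 2_000        # what an early fur trader might have recorded
GENERATIONS_BACK = 8             # roughly two centuries
DECLINE_PER_GENERATION = 0.30    # assumed population loss in each generation

def total_from_warriors(warriors, people_per_warrior):
    """Population implied by a warrior count and an assumed household multiplier."""
    return warriors * people_per_warrior

def founding_estimate(later_population, decline_rate, generations):
    """Back-project a founding population, assuming a fixed fractional decline
    in each of the intervening generations."""
    return later_population / ((1.0 - decline_rate) ** generations)

for multiplier in (3, 5):  # 3 vs. 5 people represented by each warrior
    observed = total_from_warriors(WARRIORS_REPORTED, multiplier)
    founding = founding_estimate(observed, DECLINE_PER_GENERATION, GENERATIONS_BACK)
    print(f"{multiplier} people per warrior -> observed {observed:,}, "
          f"founding estimate {founding:,.0f}")
```

With these invented inputs, changing the multiplier from 3 to 5 people per warrior moves the founding estimate from roughly 104,000 to roughly 173,000, a difference of almost 70,000 people produced by a single presumption compounded over eight generations.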

A third group suggests that Dobyns’s estimates may be too low because they do not account for pre-Columbian contact between Native Americans and Europeans. This group notes that severe epidemics of European diseases may have begun in North America in the late 10th or early 11th century, when the Norse briefly settled a region they called Vinland. The L’Anse aux Meadows site (on the island of Newfoundland), the archaeological remains of a small settlement, confirms the Norse presence in North America about 1000 ce. Given that sagas attest to an epidemic that struck Erik the Red’s colony in Greenland at about the same time, the possibility that native peoples suffered from introduced diseases well before Columbian landfall must be considered.

Yet another group of demographers protest that an emphasis on population loss obscures the resilience shown by indigenous peoples in the face of conquest. Most common, however, is a middle position that acknowledges that demographic models of 15th-century Native America must be treated with caution, while also accepting that the direct and indirect effects of the European conquest included extraordinary levels of indigenous mortality not only from introduced diseases but also from battles, slave raids, and—for those displaced by these events—starvation and exposure. This perspective acknowledges both the resiliency of Native American peoples and cultures and the suffering they bore.

Determining the number of ethnic and political groups in pre-Columbian Northern America is also problematic, not least because definitions of what constitutes an ethnic group or a polity vary with the questions one seeks to answer. Ethnicity is most frequently equated with some aspect of language, while social or political organization can occur on a number of scales simultaneously. Thus, a given set of people might be defined as an ethnic group through their use of a common dialect or language even as they are recognized as members of nested polities such as a clan, a village, and a confederation. Other factors, including geographic boundaries, a subsistence base that emphasized either foraging or farming, the presence or absence of a social or religious hierarchy, and the inclinations of colonial bureaucrats, among others, also affected ethnic and political classification; see Sidebar: The Difference Between a Tribe and a Band.

The cross-cutting relationships between ethnicity and political organization are complex today and were equally so in the past. Just as a contemporary speaker of a Germanic language—perhaps German or English—might self-identify as German, Austrian, English, Scottish, Irish, Australian, Canadian, American, South African, Jamaican, Indian, or any of a number of other nationalities, so might a pre-Columbian Iroquoian speaker have been a member of the Cayuga, Cherokee, Huron, Mohawk, Oneida, Onondaga, Seneca, or Tuscarora nation. And both the hypothetical Germanic speaker and the hypothetical Iroquoian speaker live or lived in nested polities or quasi-polities: families, neighbourhoods, towns, regions, and so forth, each of which has or had some level of autonomy in its dealings with the outside world. Recognizing that it is difficult to determine precisely how many ethnic or political groups or polities were present in 15th-century Northern America, most researchers favour relative rather than specific quantification of these entities.

The outstanding characteristic of North American Indian languages is their diversity—at contact Northern America was home to more than 50 language families comprising between 300 and 500 languages. At the same moment in history, western Europe had only 2 language families (Indo-European and Uralic) and between 40 and 70 languages. In other words, if one follows scholarly conventions and defines ethnicity through language, Native America was vastly more diverse than Europe.

Politically, most indigenous American groups used consensus-based forms of organization. In such systems, leaders rose in response to a particular need rather than gaining some fixed degree of power. The Southeast Indians and the Northwest Coast Indians were exceptions to this general rule, as they most frequently lived in hierarchical societies with a clear chiefly class. Regardless of the form of organization, however, indigenous American polities were quite independent when compared with European communities of similar size.

Just as Native American experiences during the early colonial period must be framed by an understanding of indigenous demography, ethnic diversity, and political organization, so must they be contextualized by the social, economic, political, and religious changes that were taking place in Europe at the time. These changes drove European expansionism and are often discussed as part of the centuries-long transition from feudalism to industrial capitalism (see Western colonialism).

Many scholars hold that the events of the early colonial period are inextricably linked to the epidemics of the Black Death, or bubonic plague, that struck Europe between 1347 and 1400. Perhaps 25 million people, about one-third of the population, died during this epidemic. The population did not return to preplague levels until the early 1500s. The intervening period was a time of severe labour shortages that enabled commoners to demand wages for their work. Standards of living increased dramatically for a few generations, and some peasants were even able to buy small farms. These were radical changes from the previous era, during which most people had been tied to the land and a lord through serfdom.

Even as the general standard of living was improving, a series of military conflicts raged, including the Hundred Years’ War, between France and England (1337–1453); the Wars of the Roses, between two English dynasties (1455–85); and the Reconquista, in which Roman Catholics fought to remove Muslims from the Iberian Peninsula (c. 718–1492). These conflicts created intense local and regional hardship, as the roving brigands that constituted the military typically commandeered whatever they wanted from the civilian population. In the theatres of war, troops were more or less free to take over private homes and to impress people into labour; famine, rape, and murder were all too prevalent in these areas. Further, tax revenues could not easily be levied on devastated regions, even though continued military expenditures had begun to drain the treasuries of western Europe.

As treasuries were depleted, overseas trade beckoned. The Ottoman Empire controlled the overland routes from Europe to South Asia, with its markets of spices and other commercially lucrative goods. Seeking to establish a sea route to the region, the Portuguese prince Henry the Navigator sponsored expeditions down the Atlantic coast of Africa. Later expeditions attempted to reach the Indian Ocean, but they were severely tested by the rough seas at the Cape of Good Hope. Christopher Columbus had been a member of several such voyages and proposed an alternative, transatlantic route; in 1484 he requested the sponsorship of John II, the king of Portugal, who refused to support an exploratory journey.

Iberia was a hotbed of activity at the time. Ferdinand II of Aragon and Isabella I of Castile had begun to unify their kingdoms through their 1469 marriage, but they were soon forced to resolve bitter challenges to their individual ascensions. Eventually quelling civil war, the devout Roman Catholic sovereigns initiated the final phase of the Reconquista, pitting their forces against the last Moorish stronghold, Granada. The city fell in January 1492, an event Columbus reportedly witnessed.


The seemingly endless military and police actions to which Ferdinand and Isabella had been party had severely depleted their financial reserves. This situation was exacerbated by the chief inquisitor of the Spanish Inquisition, Tomás de Torquemada, who persuaded the monarchs to expel any Jews who refused to be baptized. Under his authority some 160,000—and by some accounts as many as 200,000—Jews were ultimately expelled or executed for heresy, including many of Spain’s leading entrepreneurs, businessmen, and scientists. Having lost so many of its best minds, Spain faced a very slow economic recovery, if it was to recover at all. Seeking new sources of income, the royal treasurer, Luis de Santángel, urged the monarchs to accept Columbus’s proposal to explore a western route to the East. Although Columbus did not find a route with which to sidestep Ottoman trade hegemony, his journey nonetheless opened the way to overseas wealth. Spain used American resources to restore its imperiled economy, a strategy that was soon adopted by the other maritime nations of Europe as well.