IN THE PAST, famines were defined as discrete events in which a large proportion of a population dies of starvation and of disease caused by undernourishment. In recent decades, famines have increasingly been understood as more complex, open-ended processes with multiple possible outcomes. Famine can also emerge from chronic food insecurity, a state of continuously inadequate access to food.
One of the worst mortality crises in history, the Black Death of 1347–51, claimed an estimated 25 to 50 million lives in Europe, with famine and epidemic disease closely intertwined. Although estimates are often vague, evidence suggests that at the end of the 19th century somewhere between 30 and 60 million people died in famines in India and China. In the 20th century, more than 70 million people died in famines worldwide, most of them in China and the Soviet Union. Over the course of the 20th century, the locus of famine shifted from Europe and Asia to sub-Saharan Africa, where large famines occurred in the Sahel and the Horn of Africa in the mid-1970s and mid-1980s.
In the 21st century, food insecurity remains a widespread problem in the developing world. The Food and Agriculture Organization (FAO) reports that 842 million people worldwide are undernourished, the vast majority of them in developing countries. Although global levels of food insecurity have improved slightly over recent decades, large regional discrepancies persist. Countries in Asia, the Pacific, Latin America, and the Caribbean have largely managed to improve their food security, while food insecurity has been on the rise in sub-Saharan Africa, the Near East, and North Africa. The situation remains most critical in sub-Saharan Africa, where one-third of the population is chronically food insecure.
In the past, famines were attributed to natural causes, such as drought and crop failure, and, to a limited extent, to war. Up to the late 1970s, famines were treated as supply-side failures, and resolution was attempted by increasing global food supplies through Green Revolution technologies. The recurring food crises in parts of sub-Saharan Africa during the 1980s demonstrated the limitations of this supply-side focus, showing that increasing supply alone was not enough. Food security became a crucial component of development as it became clear that national food security did not translate into food security at the local level, and that food security was also determined by effective demand.
In 1981, Amartya Sen argued that food insecurity persists not because of shortfalls in production but because of a lack of effective demand. Sen introduced the concept of entitlements, the means by which people command food; famine occurs when entitlements fail and people lack the means to buy or otherwise access food. In Sen's view, access was also shaped by structural, political, institutional, and socio-economic factors. Sen's work produced a paradigm shift in the way food insecurity is conceptualized.
Neither drought nor population growth is by itself a root cause of food insecurity; rather, they exacerbate a problem that stems from political, social, economic, and environmental constraints, including armed conflict; uncontrolled population growth; low levels of literacy; poor access to water and health care; disease; poor or inappropriate agricultural practices; climate variability; and environmental degradation.
In the 1990s, the recognition that food is only one of a range of basic needs gave rise to the livelihoods concept. The livelihoods approach focuses on the assets and options people have for pursuing alternative strategies to make a living, and has become important for designing more effective interventions. From this perspective, the risk of famine, especially where it is part of the daily struggle for survival, cannot be treated separately from long-term development.
Famines are highly emotive and increasingly politicized. With humanitarian assistance having become a large industry, and with only highly publicized famines (such as Ethiopia in 2002 and Niger in 2005) achieving global attention, the term is increasingly misused, with disastrous consequences. Because different levels of food insecurity demand different responses, a precise definition of what constitutes a famine has become increasingly important.
Famine vulnerability assessments are used to identify the susceptibility of populations to famine. Traditionally, such assessments aimed to predict short- and long-term changes in natural conditions (such as drought) in order to prepare and respond better. Benchmarks are used to grade levels of food insecurity, ranging from nutritional indicators (such as wasting, stunting, and mortality) to crop yields and food prices to combined measures of famine intensity and magnitude. Benchmarks are particularly controversial in situations of chronic food insecurity, where malnutrition results not from a lack of food but from structural problems; benchmark-based approaches also tended to ignore people's own coping strategies. Increasing emphasis is now placed on non-nutritional indicators, such as political conditions, social systems, and market indicators, and, more recently, on monitoring livelihoods and understanding coping strategies.
Food aid is an important instrument for addressing food insecurity, both in meeting emergency needs after disasters and in addressing long-term vulnerability. It is, however, highly controversial and has been widely criticized for various reasons. Early food aid programs were driven largely by donor interests, chiefly the disposal of North American grain surpluses. Food aid was deployed in the hope of advancing foreign policy objectives during the Cold War era and of developing overseas markets to absorb future surpluses; humanitarian concerns and acute needs were often secondary.