Impact of World War II on the Decolonization Process in Sub-Saharan Africa

Author(s):  
S.O. Abrasheva ◽  
A.K. Babalova ◽  
P.O. Kulakova


Author(s):  
Bruce A. Forster ◽  
Jessica D. Forster

This paper provides an introduction to the concepts of governance and state weakness, fragility, or failure. Selected indices of performance are presented, with an emphasis on Sub-Saharan Africa. As the 2005 UK Commission for Africa noted, “The most extreme breakdown of governance is war.” The paper discusses the concepts and definitions of civil conflict and civil war, and the prevalence of civil war in Sub-Saharan Africa. Among the costs of civil war are the people displaced by fear for their lives amid the conflict; if displaced persons leave the country, they become refugees. The paper also traces the evolution of international humanitarian law since World War II to protect non-combatants, including refugees.


1980 ◽  
Vol 18 (2) ◽  
pp. 201-236 ◽  
Author(s):  
Philip Foster

Few would disagree with the observation that the schools and universities of sub-Saharan Africa are perhaps the most important contemporary mechanisms of stratification and redistribution on the continent. They are not simply reflections of extant patterns of social and economic differentiation, but rather powerful independent forces in the creation of new and emergent groupings based on the variable possession of power, wealth, and prestige. Moreover, in using the word ‘contemporary’ we should not overlook the fact that formal educational systems are not a recent phenomenon in Africa. Schools existed on the western littoral in the eighteenth century, and their development in many parts of Africa, though slow up to the beginning of World War II, was of great significance. However, the African ‘educational explosion’ is largely a post-war phenomenon, and as a result we can no longer regard the school as an alien and intrusive institution perched precariously atop a range of predominantly ‘traditional’ societies. In most parts of Africa, the school is now as familiar a part of the local scene as the corrugated iron roof. Virtually everywhere, a whole generation would think it inconceivable to be without schools and, what is more, though Africa still remains the least formally educated of the continents, almost everyone now has a lively sense of the individual benefits that education can bring. As in other areas of social life, Africans perceive schooling in shrewd, pragmatic, and instrumental terms.


2003 ◽  
Vol 42 (2) ◽  
pp. 167-169
Author(s):  
Samina Nazli

Raising the standards of literacy in the developing world has been a major goal of the less developed countries since most of them became independent in the process of decolonisation that followed World War II. The Human Development Report 2004, brought out by the United Nations Development Programme, lists some major improvements in the literacy levels of a number of countries between 1990 and 2002. For example, low human development countries like Togo increased their adult literacy rates from 44.2 percent in 1990 to 59.6 percent in 2002. Congo saw an increase over the same period from 67.1 percent to 82.8 percent. Uganda rose from 56.1 percent to 68.9 percent, Kenya from 70.8 percent to 84.3 percent, Yemen from 32.7 percent to 49.0 percent, and Nigeria from 48.7 percent to 68.8 percent. Broken down by region, the least developed countries as a group saw their adult literacy rates rise from 43.0 percent to 52.5 percent, the Arab states from 50.8 percent to 63.3 percent, South Asia from 47.0 percent to 57.6 percent, Sub-Saharan Africa from 50.8 percent to 63.2 percent, and East Asia and the Pacific from 79.8 percent to 90.3 percent. Viewed by development level, medium human development countries increased from 71.8 percent to 80.4 percent and low human development countries from 42.5 percent to 54.3 percent.


Author(s):  
Eve Gray

The prevailing dynamics of today’s global scholarly publishing ecosystem were largely established by UK and US publishing interests in the years immediately after the Second World War. With a central role played by publisher Robert Maxwell, the two nations that emerged victorious from the war were able to dilute the power of German-language academic publishing—dominant before the war—and bring English-language scholarship, and in particular English-language journals, to the fore. Driven by intertwined nationalist, commercial, and technological ambitions, English-language academic journals and impact metrics gained preeminence through narratives grounded in ideas of “global” reach and values of “excellence”—while “local” scholarly publishing in sub-Saharan Africa, as in much of the developing world, was marginalised. These dynamics established in the post-war era still largely hold true today, and need to be dismantled in the interests of more equitable global scholarship and socio-economic development.


Author(s):  
Ruth Ginio ◽  
Jennifer Sessions

The French presence in Africa dates to the 17th century, but the main period of colonial expansion came in the 19th century with the invasion of Ottoman Algiers in 1830, conquests in West and Equatorial Africa during the so-called scramble for Africa and the establishment of protectorates in Tunisia and Morocco in the decades before the First World War. To these were added parts of German Togo and Cameroon, assigned to France as League of Nations mandates after the war. By 1930, French colonial Africa encompassed the vast confederations of French West Africa (AOF, f. 1895) and French Equatorial Africa (AEF, f. 1905), the western Maghreb, the Indian Ocean islands of Madagascar, Réunion, and the Comoros, and Djibouti in the Horn of Africa. Within this African empire, territories in sub-Saharan Africa were treated primarily as colonies of exploitation, while a settler colonial model guided colonization efforts in the Maghreb, although only Algeria drew many European immigrants. Throughout Africa, French rule was characterized by sharp contradictions between a rhetorical commitment to the “civilization” of indigenous people through cultural, political, and economic reform, and the harsh realities of violent conquest, economic exploitation, legal inequality, and sociocultural disruption. At the same time, French domination was never as complete as the solid blue swathes on maps of “Greater France” would suggest. As in all empires, colonized people throughout French Africa developed strategies to resist or evade French authority, subvert or co-opt the so-called civilizing mission, and cope with the upheavals of occupation. After the First World War, new and more organized forms of contestation emerged, as Western-educated reformers, nationalists, and trade unions pressed by a variety of means for a more equitable distribution of political and administrative power. Frustrated in the interwar period, these demands for change spurred the process of decolonization after the Second World War. Efforts by French authorities and some African leaders to replace imperial rule with a federal organization failed, and following a 1958 constitutional referendum, almost all French territories in sub-Saharan Africa claimed their independence. In North Africa, Tunisian and Moroccan nationalists were able to force the French to negotiate independence in the 1950s, but decolonization in Algeria, with its million European settlers, came only after a protracted and brutal war (1954–1962) that left deep scars in both postcolonial states. Although formal French rule in Africa had ended by 1962, the ties it forged continue to shape relations between France and its former colonial territories throughout the continent.


1996 ◽  
Vol 20 (4) ◽  
pp. 559-591
Author(s):  
John C. Caldwell ◽  
Pat Caldwell

The outbreak of AIDS around the world in the last 15 or 20 years is usually referred to as the “AIDS epidemic,” or occasionally “pandemic” (Grmek 1990). These terms have no great analytic value. The major medical dictionaries and epidemiological textbooks define an epidemic merely as an outbreak of a disease marked by a greater number of cases than usual (see Fox et al. 1970: 246–49; Mausner and Bahn 1974: 22, 272–77; Stedman’s Medical Dictionary 1977: 470; Kelsey et al. 1986: 212; Walton et al. 1986: 351; Harvard 1987: 247). This condition is contrasted with the endemic form of a disease at “its habitual level, or what previous experience would lead one to anticipate.” The term pandemic is used to describe an epidemic widespread in the world and usually characterized by a large number of cases, for example, the fourteenth-century plague epidemic (or Black Death) and the influenza epidemic during the latter part of World War I. Some authorities stress the fact that epidemics are also characterized by a declining phase. This is true by definition, of course, for otherwise the disease could be described as shifting to a new and higher endemic level. But it is also of interest that most of these unusual outbreaks of disease are eventually limited by such mechanisms as a decrease in susceptibles as persons become immune or die; as interventions, either medical or behavioral, eliminate the source or interrupt transmission; or as the pathogen mutates and becomes less virulent.


2021 ◽  
Author(s):  
◽  
James Baigent

In the post-World War Two era, political decolonisation swept across Africa, and in its wake a wide variety of political leadership outcomes emerged. In many national contexts, indigenous political stakeholders were required to wrest political control from colonial powers. This study will compare the post-colonial political leadership experiences of Kenya and Tanzania in order to ascertain the nature of the unique pressures and constraints placed upon first-generation post-colonial political leaders. The comparison will be framed and informed by contemporary and historical theories of leadership. Developing a greater understanding of the leadership experiences of these first-generation post-decolonisation leaders will provide greater insight into the nature of post-decolonisation leadership in sub-Saharan Africa.


Author(s):  
Simon Pooley

Fires have burned in African landscapes for more than a hundred million years, long before vertebrate herbivores trod the earth and modified vegetation and fire regimes. Hominin use of lightning fires is apparent c. 1.5 million years ago, becoming deliberate and habitual from c. 400 thousand years ago (kya). The emergence of modern humans c. 195 kya was marked by widespread and deliberate use of fire, for hunting and gathering through to agricultural and pastoral use, with farming and copper and iron smelting spreading across sub-Saharan Africa with the Bantu migrations from 4–2.5 kya. Europeans provided detailed reports of Africans’ fire use from 1652 in South Africa and the 1700s in West Africa. They regarded indigenous fire use as destructive, an agent of desiccation and destruction of forests, with ecological theories cementing this in the European imagination from the 1800s. The late 1800s and early 1900s were characterized by colonial authorities’ attempts to suppress fires, informed by mistaken scientific ideas and management principles imported from temperate Europe and colonial forestry management elsewhere. These restrictions were often ignored by African and settler farmers. In the 1900s, the concerns of colonial foresters and fears about desiccation and soil erosion fueled by the American Dust Bowl experience informed anti-fire views until mid-century. However, enough time had elapsed for colonial and settler scientists and managers to have observed fires and indigenous burning practices and their effects, and to begin to question received wisdom on their destructiveness. Following World War II, during a phase of colonial cooperation and expert-led attempts to develop African landscapes, a more nuanced understanding of fire in African landscapes emerged, alongside greater pragmatism about what was achievable in managing wildfires and fire use. Although colonial restrictions on burning fueled some independence struggles, postcolonial environmental managers appear on the whole to have adopted their former oppressors’ attitudes to fire and burning. Important breakthroughs in fire ecology were made in the 1970s and 1980s, influenced by a movement away from equilibrium-based ecosystems concepts, in which fires were damaging disturbances to ecosystems, to an understanding of fires as important drivers of biodiversity integral to the functioning of many African landscapes. Notably from the 1990s, anthropologists influenced by related developments in rangeland ecology combined ecological studies with studies of indigenous land use practices to assess their impacts over time, challenging existing narratives of degradation in West African forests and East African savannas. Attempts were made to integrate communities (and, to a lesser extent, indigenous knowledge) into fire management plans and approaches. In the 2000s, anthropologists, archeologists, geographers, historians, and political ecologists have contributed studies telling more complex stories about human fire use. Together with detailed histories of landscape change offered by remote sensing and analysis of charcoal and pollen deposits, these approaches to the intertwined human and ecological dimensions of fire in African landscapes offer the prospect of integrated histories that can inform our understanding of the past and guide our policies and management in the future.


2017 ◽  
Vol 23 (2) ◽  
pp. 269-297 ◽  
Author(s):  
Kent Henderson ◽  
Kristen Shorette

Environmental sociologists highlight the exploitative nature of the global capitalist economy where resource extraction from nations in the periphery tends to disproportionately benefit those of the core. From the Brazilian Amazon to mineral-rich Sub-Saharan Africa, the practice of “unequal ecological exchange” persists. Simultaneously, a “global environmental regime” has coalesced as a prominent feature of the contemporary world system. In the post-World War II era, legitimate nation-states must take steps to protect the natural environment and prevent its degradation even at their own economic expense. Stronger national ties to global institutions, particularly international nongovernmental organizations (INGOs) consistently yield more positive environmental outcomes. However, previous work suggests that normative expectations for improved environmental practice will be weak or nonexistent in the periphery. We use the case of palm oil production and its relationship to deforestation to provide a more nuanced analysis of the relationship between material and institutional forces in the periphery. Using unbalanced panels of fifteen palm oil producing countries from 1990 to 2012, we find that stronger national ties to world society via citizen memberships in INGOs result in greater primary forest area among palm oil producers. However, this effect is strongest where production is lowest and weakens as production increases. Even in the cases of Indonesia and Malaysia, where palm oil production is substantially higher than any other producer, ties to global institutions are significantly related to reduced forest loss. These results indicate the variable importance of national embeddedness into global institutions within the periphery of the world system.

