International Affairs Forum:
Since the 1980s and 1990s we’ve witnessed a shift in donors’ agendas, from limited and tightly targeted safety nets that sought to restrict the role of the state, to concepts of ‘good governance’ and the state’s responsibility in combating food insecurity. What factors account for this rise of social protection in the development agenda?
Dr. Stephen Devereux:
The rapid rise of social protection up the donors’ policy agenda during the past decade can be attributed to several factors. The first and most significant, in my opinion, is the failure of neoliberal reform processes in the 1980s and ’90s to generate substantial poverty reduction in Africa. On the contrary, structural adjustment and agricultural liberalisation programmes undermined or removed most government pillars of support for food security – agricultural research and extension services, input and food price subsidies, parastatal marketing operations – and exposed smallholder farmers and poor consumers to weak markets or market failures.
Second is the emergence of new sources of vulnerability – notably climate change, which has made food crop harvests more erratic and unpredictable, and AIDS, which has undermined household and community capacities to cope with chronic poverty and livelihood shocks.
A third factor is the failure of old-style “social safety nets” to provide adequate protection for poor and vulnerable people, and a growing perception that innovative interventions were needed. Related to this are the Millennium Development Goals (MDGs), which focused the attention of governments and donor agencies on halving poverty and hunger by 2015, and initiated a search for new tools and mechanisms to achieve these goals. “Social protection” emerged in the late 1990s as a logical – and potentially more comprehensive – policy response to all these challenges.
Food aid used to be the preferred form of assistance to African countries, but this is no longer the case. Why?
For decades, food aid was the dominant aid modality to Africa, for at least two reasons: first, because it offered a direct response (providing food) to a visible problem (hunger); and second, because donor countries had food to give away. One pragmatic explanation for the declining popularity of food aid is that American and European surpluses started dwindling a few years ago, partly due to shifts in crop production patterns to meet the rapidly growing demand for biofuels.
At a theoretical level, food aid has always been contentious. Western countries were accused of over-subsidising food production, then “dumping” their surpluses in poor countries, and the distribution of free food in smallholder communities was blamed for undermining domestic production and markets, by creating disincentives for local farmers and traders. Most damaging of all, many evaluations concluded that food aid doesn’t work – beyond offering humanitarian relief during food crises, there is little evidence that food aid addresses the underlying causes of household or national food insecurity. In some countries, food aid became institutionalised – in Ethiopia in the 1990s, the main determinant of whether a family received food aid was whether it had received food aid the year before. By contrast, cash transfers are seen as giving recipients the means to meet a range of food and non-food needs, as well as boosting demand for local produce and stimulating markets and local economies. Vicious cycles of dependency can become virtuous cycles of poverty reduction.
In the LAC region we have seen a number of Conditional Cash Transfer programmes (CCTs) being successfully implemented to tackle education, malnutrition and hunger simultaneously. Are CCTs as widespread throughout the African continent?
Conditional cash transfer programmes (CCTs) are widespread in Latin America but rare in Africa. One reason for this is that the conditions are linked to education and health services, and these services are generally more accessible and better quality in Latin America. There is no point making a cash transfer conditional on parents sending their children to schools and clinics if there are no schools and clinics in the local community, or if the quality of education and health services is very poor.
A second reason is that many CCTs in Latin America are government-owned programmes financed out of domestic fiscal resources, and middle-class taxpayers tend to believe that poor people should do something in return for social grants – ideally, they should “graduate” out of “dependency” on “welfare handouts” by acquiring education and being healthy enough to earn their own living. The third and most interesting point is that the conditions do not necessarily make much difference. Although the uptake of education and health services does increase after CCTs are introduced, poor people spend unconditional transfers on these services as well. A recent cash transfer “experiment” in Malawi reported increased school attendance by recipients of unconditional as well as conditional cash transfers. This positive outcome could be attributed to higher post-transfer incomes in both groups – but not to the conditions attached to the CCT.
Can social protection schemes address the pressing, yet often ignored, question of seasonal hunger? If so, how?
Many problems of food insecurity that poor people face, especially in tropical countries, are seasonal in nature – but this is not always understood by policy-makers. During the “hungry season” before the main annual harvest, granaries on many smallholder farms are empty, food prices in local markets are high, and children weakened by malnutrition are more susceptible to water-borne diseases that peak during the rains. Until the 1980s, when governments in Africa and Asia intervened more heavily in their economies than they do now, many food security interventions were explicitly seasonal. Governments managed “strategic grain reserves” (buying food after harvest, storing it and releasing it at cost in the hungry season to stabilise supplies and minimise price rises), and they implemented “pan-seasonal pricing” policies (legislating a fixed consumer price for staple food throughout the year).
Under structural adjustment reforms in the 1980s these “seasonal safety nets” were abolished, leaving rural families exposed to seasonal hunger. The social protection agenda has replaced these structural interventions with targeted social assistance for vulnerable groups. But the purchasing power of cash is undermined by price inflation, so cash transfers provide the least protection precisely when seasonal hunger is worst. This problem can be addressed by “index-linking” cash transfers to food prices, or by delivering a mixed package of cash transfers and food aid – measures that have been adopted in a few cases – but this requires exceptional budgetary and administrative flexibility from governments or donors. We need to think more creatively about other ways of protecting poor rural people against seasonal food insecurity, either by tackling the cause of the problem directly or by devising effective “seasonal social insurance” mechanisms.
What lessons can we learn from the successful scaling up of pilot projects to the national level?
Actually, there are very few examples of successful scaling up of pilot projects to the national level in Africa. This is despite the fact that donors and international NGOs have invested heavily in social protection projects in many countries, to demonstrate positive impacts and encourage governments to take them over and expand their coverage. The puzzling question is why this has happened so rarely. One explanation is that a different management model is required – small-scale projects get intensive attention that national programmes do not, and pilot projects are designed to succeed, not to fail, so their demonstration effects are limited. Nationally owned programmes also require substantial and sustainable financing – greater by orders of magnitude than for community-level projects. National programmes require political commitment, but governments and donors do not always share the same priorities – donors prioritise “vulnerable groups”, but governments prioritise electoral constituencies, and these two categories do not necessarily overlap.
The most encouraging success stories in recent years have been when African governments took the initiative and launched social protection programmes at national scale from the start. Examples include the Child Support Grant in South Africa, and social pensions in Lesotho and Swaziland. These schemes display several important characteristics that pilot projects do not share: they are designed, implemented and financed almost entirely out of domestic resources; they are underpinned by legislation that makes government accountable for their delivery; and they are claims-based entitlements that are politically irreversible.
Despite calls for greater government ownership, we often see donors and bilateral organizations bypassing governments when implementing social protection schemes. Why? And more importantly, what are the risks of this line of action?
Donors and their NGO partners invariably argue that they are supporting national social protection policy processes, but in reality they often impose externally designed blueprints on African governments, by exerting their financial leverage and offering technical expertise. Social protection is still a donor-driven agenda, especially in countries where governments are resisting donor-funded cash transfer projects, arguing that they are fiscally unaffordable at national scale, create dependency on “handouts” rather than building self-reliance among the poor, and are not aligned with national priorities.
Most governments prefer to invest scarce public resources in “productive” sectors like agriculture, rather than “unproductive” sectors like social welfare. The risk of bypassing governments – or only loosely engaging government structures in the design and implementation of social protection projects – is that these projects will never become institutionalised nationally owned programmes, and cannot form part of the “social contract” between government and citizens. Instead, the fortunate participants (sometimes only a few thousand people) who happen to be included in donor or NGO projects enjoy preferential treatment – for a while – which does little to build an effective and sustainable national social protection system.
In recent years, India has witnessed the rise of a number of campaigns (such as the ‘right to work’ and ‘right to food’ campaigns) pioneered by civil society. Can we hope for a similar mobilization of civil society in African countries? Are African states any closer to recognizing the need for a minimum social package?
The “right to food” campaign in India culminated in the National Rural Employment Guarantee Act (NREGA), passed into law in 2005, which guarantees every rural household 100 days of public employment every year at the local minimum wage. (Incidentally, by converting a supply-driven “public works” programme into a demand-driven “right to work” programme, the NREGA provides a vital source of social protection against seasonal hunger.) This impressive achievement was almost entirely the product of civil society mobilisation, building on a government commitment since India achieved independence in 1947 – shortly after the Bengal Famine of 1943 – never to allow famine to occur again.
In Africa we have not seen mass mobilisation around social protection on this scale – civil society is either too weak or is actively repressed by the state. The Ethiopian government, for instance, passed the Charities and Societies Proclamation in 2009, which curtails the activism of NGOs and restricts their activities to service providers. A positive exception is the successful campaign by the Treatment Action Campaign in South Africa to secure free access to anti-retrovirals for all HIV-positive citizens. But South Africa is often regarded as exceptional in terms of the strength of its civil society – compared to, say, Ethiopia! The “minimum social package” is another interesting initiative that aims to extend basic social security instruments such as old-age pensions across the world, but this concept is derived from Western welfare states. It is not driven by local mobilisation or participatory consultation processes, so it risks being an imported blueprint that will not gain traction domestically, unless it is adjusted to reflect local realities and citizens’ priorities. Ideally, in my view, national social protection systems should emerge out of a negotiation between citizens and their governments, with external actors playing a facilitating role.
In your work, The New Famines: Why famines persist in an era of globalization (2007), you speak of a fundamental paradigm shift in the way that famines are being conceptualized. What are the main differences between ‘old’ and ‘new’ famines?
Famine theorising went through a paradigm shift in the 1980s, when Professor Amartya Sen conceived of the “entitlement” approach, which argues that famines are characterised not by an overall lack of food – or “food availability decline” – but by a fatal inability of specific groups of people to access food for a prolonged period of time. Sen’s analysis shifted the understanding of famine from a supply-side problem (no food) to a demand-side problem (no “entitlements” to food). Often, of course, the two interact – a food production shock (e.g. a drought) undermines market supplies and drives prices up to unaffordable levels for the poor.
But this does not explain why vulnerable people are not protected against the worst consequences of harvest failure and high prices – hunger, destitution, even death. The New Famines therefore argues that a second paradigm shift is needed to explain contemporary famines. With high-tech communications and a sophisticated international humanitarian industry, there is no reason why a food crisis cannot be prevented with timely and effective intervention, and in this sense every recent famine is “political”. It is never simply a “natural disaster” or a “technical failure” of markets, even if these elements play a contributory role. Recent famines (as in Ethiopia, Malawi, and Niger) must also be understood as “response failures” – why didn’t national governments and the international community intervene to prevent livelihood shocks escalating (over a period of several months) into famines?
Since 2000 the African continent has suffered four different famines. Why are we still struggling with this problem? To what extent can corruption and poor governance be blamed for food shortages and famines, such as the Malawi famine of 2002 or that suffered by Ethiopia in 1999-2000?
Governance issues, broadly defined, have a great deal to do with failures to prevent food crises in Africa. Governments must bear much of the blame, but an important common factor in all recent famines is a breakdown in trust and cooperation between national governments and the international community at critical moments. Because humanitarian relief agencies play such a dominant role in responding to crises, it is sometimes unclear whether national or international institutions hold the primary responsibility for famine prevention. I have described this as a “black hole of unaccountability”.
In Ethiopia in 1999, the donors withheld humanitarian aid for the drought-stricken southern Somali Region because they feared it would be diverted to feed soldiers fighting a war with Eritrea on Ethiopia’s northern border. In Malawi in 2001, the IMF “advised” the government to sell off the national Strategic Grain Reserve to repay its operating debts, leaving the reserve empty and the government unable to respond when a drought struck. The corrupt sale of this maize stock to benefit the ruling elite compounded the consequences of this bad advice. Another common factor linking recent African famines is that they occurred in countries that had become democracies only a few years earlier. This appears to contradict Amartya Sen’s maxim that “famines do not occur in democracies”, but the lesson from recent events in Ethiopia, Malawi and Niger is that food crises can occur when democratic institutions are weak and elected governments have limited real accountability to opposition parties, the media and their citizens. Ultimately, famines will be eradicated from Africa only once a “social contract” is established between governments and citizens in every country that makes famines morally and politically unacceptable, and holds governments accountable for failures to prevent them.
Stephen Devereux is a development economist with 20 years’ experience in food security, poverty and rural development in 12 African countries, including 3 years heading a Rural Research Programme at the University of Namibia and 1 year researching household drought responses in northern Ghana. He has been a Fellow of the Institute of Development Studies (IDS) at the University of Sussex since 1996. He has written or edited 6 books on food security, famine and social protection, and has published articles in more than 15 journals.