Remembering the Ebola Crisis: What did the world learn?

Do you remember the Ebola crisis? Probably not, until you read the title of this essay. Yet in 2014 the outbreak of the Ebola virus in West Africa sent fear reverberating across the globe, sparking worldwide concern over the severity of the disease and its potential proliferation beyond the borders of sub-Saharan Africa. Despite its discovery in 1976 and its relative dormancy as a significant national or international health concern in the decades since, the re-emergence of Ebola resulted in over 10,000 deaths (WHO, 2018) and further crippled the reputation and socio-economic environment of many African countries. Particularly devastating in Guinea, Liberia and Sierra Leone, Ebola threatened to become an epidemic of monumental proportions, and eventually surfaced in Europe and the United States. However, the widespread media attention and global apprehension towards Ebola gradually fizzled out following its containment in 2016. Little reflection has since been given to the thousands of individuals still suffering from the ruinous effects of the disease years after the outbreak, or to the societal impact Ebola has had on our world.

Firstly, the rapid escalation of the Ebola virus – especially in West Africa – demonstrates that the least economically developed countries (LEDCs) within sub-Saharan Africa are woefully ill-equipped to counteract the outbreak of potentially deadly diseases. Owing to the poor structural conditions in Guinea, Liberia and Sierra Leone, public health systems were simply inadequate for diagnosing, and ultimately addressing, Ebola from the outset. While poor political systems played a part, this was also undoubtedly influenced by the ineffective policy regimes of international lenders in sub-Saharan Africa, including the World Bank and the International Monetary Fund (IMF). ‘Structural adjustment’ has imposed strict conditionalities on governments across the continent in order to meet debt repayments and to provide additional access to loans, restricting the availability of resources for basic public services – including the provision of healthcare.

The poor institutional capacity in the worst-affected areas severely limited national responses to Ebola, with dysfunctional health systems not only providing inadequate services for communities, but also becoming sources of infection themselves. In stark contrast, Ebola was quickly eradicated in Western Europe and the United States as a result of the effective and extremely secure public health measures in place for responding to viral haemorrhagic fevers (such as Ebola). There were only seven suspected cases of Ebola in these areas, with only one fatality. This compares with over 28,000 cases of Ebola in sub-Saharan Africa between 2014 and 2016, with over 39% of cases resulting in death. Although the national response in other sub-Saharan African countries should be heralded – particularly in Nigeria, where localised interventions proved highly effective (Saxena, 2014) – Ebola had a calamitous impact on Guinea, Liberia and Sierra Leone, countries already struggling to recover from the effects of civil war.

The weaknesses in the domestic response to Ebola in sub-Saharan Africa were further exacerbated by the initial indecisiveness of the international response to the outbreak, which not only failed to grasp the extent and seriousness of the disease, but also lacked the potency to coordinate an effective, collaborative response before its rapid escalation. The World Health Organisation (WHO) admitted that budget restrictions, political appointments and competing health crises had left the African Regional Office the weakest of the WHO regional offices at a technical level, with several aid workers and experts blaming the organisation for failing to prevent the outbreak from becoming an epidemic (Martin-Moreno, 2014). Moreover, although Ebola had potentially dangerous consequences for European countries, the EU was also sluggish in mobilising human, financial, medical, and military resources in the regions worst affected (Quaglio et al., 2016).

With that said, NGOs – especially Médecins Sans Frontières (MSF) – played an incredibly important role in managing logistics on the ground during the initial phases of the outbreak, as well as campaigning for greater involvement by international actors as the Ebola crisis intensified. Their work not only supported local efforts to reduce the spread of the disease, but also raised global awareness and forced international actors into action in sub-Saharan Africa. This eventually resulted in the United Nations, the African Development Bank, the African Union and the World Bank, as well as a whole host of other international organisations, governments and individuals, providing millions of dollars in financial aid and other forms of support. Furthermore, the extensive media attention following the outbreak of Ebola also increased public consciousness, which further helped to galvanise international attention and cooperation between external organisations and affected countries (Mira et al., 2015). However, it may also have strengthened the less-than-desirable perception of Africa as a haven for ‘exotic’ diseases, and entrenched negative images of the continent as destitute.

Today, survivors of Ebola continue to suffer from a variety of health issues despite the end of the epidemic over a year ago, including visual problems, abdominal and epigastric pain, insomnia and headaches (Quaglio et al., 2016). For the individuals and communities who perished, and for those who continue to live with the destructive effects of Ebola, it is important that the world takes a step back to reflect on the outbreak and remind ourselves of the humanity of the inhabitants of the poorest areas of the world. In the future, the weak institutional capacity across poorer sub-Saharan African countries must be improved and supported by both national and multilateral actors and organisations in an attempt to redress the many failures of the Ebola crisis, and to build upon cross-border partnerships in anticipation of future public health crises. In a world where information, technology and societal values are becoming more widely dispersed and inclusive, it is time we consolidated and fortified our global approach to local problems.

If Haiti and Africa are considered “sh**holes,” then the United States must bear responsibility

During a heated meeting with lawmakers concerning immigration to the United States, President Donald Trump was reported to have made derogatory comments about Haiti and several African countries. Trump is accused of arguing against people from ‘sh**hole countries’ coming to the United States, preferring instead to accept immigrants from countries such as Norway. Although Trump has since denied making the remarks about Haiti and Africa (despite corroboration from officials present at the meeting), the US President has been widely criticised for his comments and has received a barrage of complaints from policy-makers and political commentators both within the United States and across the world. Trump’s recent comments only fuel public accusations that he is a racist, and lend further weight to the notion that he is overwhelmingly unfit for office. However, despite the intolerant and abhorrent sentiments behind Trump’s scolding of Haiti and Africa, is there an ounce of truth in the US President’s remarks?

In many senses, the answer is yes. Both Haiti and several African countries have been economically deprived for decades, marred by poverty and political instability. Haiti is undoubtedly the poorest country in the Americas and one of the poorest in the world, with a GDP per capita of US$846 in 2014. The World Bank stated that six million of Haiti’s 10.4 million inhabitants live under the national poverty line of US$2.41 per day, with a third of those living in extreme poverty. Haiti’s economy is also heavily dependent on external finance – particularly from Venezuela – and has a growing fiscal deficit, driven by the poor management of public expenditure and the country’s vulnerability to natural disasters such as Hurricane Matthew. Moreover, Haiti has been plagued by political instability and corruption, and has struggled to embed democracy successfully, with coups d’état an ever-present reality throughout the country’s history. These issues have been influential in the dramatic rise of Haitian-born immigrants in the United States over the last 25 years, despite the US government’s extraordinary efforts to block illegal immigration from the country, as noted by the Migration Policy Institute.

Similarly, many African countries have also suffered from extreme poverty and political and economic turbulence, despite the rich abundance of resources across the region. Despite growth over the last decade, partly driven by an increase in domestic demand and foreign investment, Africa remains the poorest continent in the world. Several African countries have had particular trouble managing fluctuating commodity prices on global markets, largely due to their dependency on a single agricultural commodity as a source of revenue. In addition, Africa has suffered from a plethora of natural disasters and rampant outbreaks of disease – particularly HIV/AIDS and tuberculosis – and has had an unwavering reliance on foreign aid. This is without mentioning the emergence of brutal dictatorships throughout Africa during the 1960s and 1970s, and the ever-present reality of fierce tribal differences that have often been the precursor to civil war in many countries. Evidence from the Pew Research Centre shows that although Africans make up a small share of the United States’ immigrant population (4.8% in 2015), immigration from Africa has approximately doubled in every decade since 1970, demonstrating the extent of the development crisis in Africa.

However, Trump’s comments – and indeed my previous assessments of Haiti and Africa – ignore the significant role that the United States has played in creating the conditions that have predisposed the two regions to such negative attitudes. Firstly, the historical relationship between the United States and Haiti has been one mostly of oppression by the former against the latter. Due to its geographical location, the United States has had a persistent interest and influence in the political and economic conditions of Haiti over the last 200 years. Although it maintained its trade relationship with Haiti following the latter’s independence from France in 1804, the United States failed to recognise Haiti’s freedom until 1862 (incidentally, not until the American Civil War), fearing that recognition could spark a slave uprising in Southern states. For much of the 19th century, the United States monitored Haiti with great suspicion and perceived the free state as a threat to its future ambitions.

The United States’ lack of political support for Haiti was most evident in its sporadic military interventions in the country throughout its early history, culminating in the occupation of Haiti from 1915 to 1934, which added fuel to the deep political tensions that had existed in the country since its birth. Since then, the United States has played a dominant role in damaging the Haitian economy, largely through its support of the brutal ‘anti-communist’ regimes of François “Papa Doc” Duvalier and Jean-Claude “Baby Doc” Duvalier – which led to widespread corruption and tens of thousands of deaths – and through its influence in driving ill-suited political and economic reform policies, particularly during the 1990s. As of 2016, Haiti’s GDP was roughly US$8 billion – approximately 0.04% of the United States’ GDP in the same year – demonstrating the gulf in economic performance between the two countries.

The United States’ relationship with Africa is of course well-documented, and the lasting impact of slavery on the continent cannot be overstated. However, more pertinent to the floundering foundations of modern-day Africa are the United States’ calculated economic and political incentives in the region since the 1960s. In order to secure its global dominance following the Second World War and delegitimise communism as a political ideology, the United States endeavoured to remove several left-leaning or “anti-American” regimes across the region. This led to political instability and civil unrest across the continent, with the United States intervening in several countries irrespective of their accordance with ‘good governance’ and democracy. Examples include the removal of Patrice Lumumba from Congo in 1960 and the CIA-led coup in Chad in 1980.

The United States’ political involvement in Africa is closely associated with its economic agenda throughout the continent, prescribed through the ‘Washington Consensus’ and espoused by the International Monetary Fund (IMF) and World Bank. This agenda has forced African countries to liberalise their economies to stimulate trade and foreign investment, decentralise state systems, float their fragile currencies and privatise key industries, all underpinned by an apparatus that ‘encouraged’ heavy borrowing across the continent to finance its economic transformation. Both the political and economic incentives of the United States have decimated many African countries, which have since had to contend with crippling debts and poorly managed infrastructure and resources. Although several African countries appear to be on the road to recovery – boasting levels of growth similar to those of East Asia during the 1990s – the legacy of US involvement in the region remains.

Of course, it would be wrong to ignore the impact of other internal and external factors in affecting the development of Haiti and Africa. However, Trump’s derogatory statements about the two regions suggest a lack of knowledge of, and indifference to, the extent to which the United States has contributed to their lacklustre growth. In fact, rather than follow the cynical narrative that has now become synonymous with Haiti and Africa, the world should reflect on the great strides both areas have made in overcoming the imperialist claw of the United States. Haiti and Africa are not “sh**hole” countries – and if Trump thinks they are, maybe his country should look in the mirror first.

The World Bank and HIV/AIDS Governance in Sub-Saharan Africa: Perpetrators or Defenders against the Epidemic?

Since the early 1980s, the HIV/AIDS epidemic has devastated the social, economic and political landscape of sub-Saharan Africa, and continues to pose a significant challenge to the region’s development. Indeed, the widespread proliferation of the disease has led to the immense regression of many sub-Saharan African countries. According to UNAIDS, in 2013 there were an estimated 24.7 million people living with HIV in sub-Saharan Africa – nearly 71% of the global total – with 1.1 million people dying from AIDS-related illnesses.

In the fight against HIV/AIDS, organisations such as the Global Fund to Fight AIDS, Tuberculosis and Malaria (GFATM) and the World Health Organisation (WHO) have often been regarded as the international forerunners of HIV/AIDS governance in sub-Saharan Africa. However, the role and impact of the World Bank on HIV/AIDS in the region has largely been ignored within the public health and public policy literature. In fact, the World Bank has been pivotal to the proficiency of the sub-Saharan African response to HIV/AIDS at all levels of governance, playing a decisive role in both exacerbating and reducing the effects of the disease throughout the region.

*

The World Bank’s role in HIV/AIDS governance originates from the organisation’s recovery programme for sub-Saharan African economies during the 1980s and 1990s, principally through Structural Adjustment Programmes (SAPs). The World Bank’s market-oriented lending policy in sub-Saharan Africa was crucial in weakening the functionality of sub-Saharan African governments prior to the outbreak of the epidemic. The strict conditionalities that formed part of SAPs, intended to encourage political and economic reform, had extensive destabilising effects on sub-Saharan African societies, particularly on the delivery of basic services such as healthcare, education and welfare. Under the austerity measures imposed by SAPs, per-capita social spending in sub-Saharan Africa had declined to 76 per cent of its 1981 level by the 2000s, impoverishing several countries throughout the region and proving exceptionally damaging for the poorest individuals. This is despite the fact that there is little conclusive evidence that SAPs had a positive impact on economic growth in sub-Saharan Africa.

These depleted basic services weakened the governmental response to HIV/AIDS and shaped the conditions for the proliferation of the disease. Inflated food prices lessened employment and trade opportunities, particularly in rural areas – increasing the threat of the disease by encouraging migration from rural to urban areas. Moreover, the introduction of user fees in health and educational services decreased the accessibility of treatment and HIV/AIDS awareness. In fact, basic services themselves – such as health centres – became sources of HIV infection. Cuts to civil services also restricted the capacity of several governments, deteriorating the administrative ability of sub-Saharan African countries to address the disease.

The World Bank’s response to HIV/AIDS in sub-Saharan African countries during the adjustment period was also limited. World Bank funding for HIV/AIDS projects began in 1986, but the organisation’s support remained modest over the following decade. Prior to 2000, the World Bank funded a number of generic projects with HIV/AIDS components throughout Africa, none of which exceeded US$10 million. Three projects in Zimbabwe, Uganda and Kenya did receive more substantial funding from the Bank in the early 1990s, although these focused more broadly on sexually transmitted infections (STIs). While the World Bank’s relative inactivity during this period echoed the dormancy of other multilateral organisations, and the general misunderstanding and stigma surrounding HIV/AIDS at the time, as the academic Chris Simms argued, the World Bank’s significant influence and leadership in the region should have warranted a greater response against the disease. Undoubtedly, the World Bank’s inactivity and its agenda for reform through SAPs contributed to the catastrophic impact HIV/AIDS would have on sub-Saharan Africa. In fact, between the early 1980s and the 2000s, the number of people living with HIV in sub-Saharan Africa increased from less than one million to 22 million.


However, the growing severity of HIV/AIDS for sub-Saharan African development, and its threat to human and national security in the absence of an effective cure, would eventually prompt the World Bank to take emergency action against the disease. Thus, in 2000 the Multi-Country HIV/AIDS Programme (MAP) was launched. Under the MAP, the World Bank would pledge over $1 billion to the fight against HIV/AIDS, drastically expanding the Bank’s financial commitments against the disease. The programme was designed to scale up state-led responses to the epidemic, underpinned by a multi-sectoral framework that encouraged the participation of the public and private sectors, NGOs and community organisations in anti-HIV/AIDS projects in the region. The MAP elevated HIV/AIDS governance to the pinnacle of the organisation’s development agenda in sub-Saharan Africa.

The MAP, and the extensive impact of the disease on development in the region, provoked a shift in the way the World Bank financed governments in sub-Saharan Africa. In contrast to the stringent conditionalities under the SAPs, MAP funding was issued without a specific link to macroeconomic performance. This allowed countries to scale up their response to HIV/AIDS without the explicit strain of further economic liberalisation. However, although the MAP appeared to represent a ‘softer’ financial relationship between the World Bank and sub-Saharan Africa, the organisation’s wider policy of ‘good governance’ and its neoliberal dogma remained intact. MAP funding may not have imposed strict economic conditionalities on sub-Saharan Africa, but it still required states to meet specific eligibility criteria that encouraged the decentralisation of HIV/AIDS governance.

It is true that the MAP facilitated a number of breakthroughs in the governance of HIV/AIDS in sub-Saharan Africa. The MAP increased political commitment towards HIV/AIDS at the highest levels of government, particularly through the creation of national AIDS councils. The programme also succeeded in promoting a multi-sectoral, decentralised response to HIV/AIDS by allowing national, regional and local actors to play an active role against the disease. Furthermore, the MAP helped to mobilise larger donor initiatives such as the GFATM and the President’s Emergency Plan For AIDS Relief (PEPFAR), strengthening the response to the disease. As of 2008, development partner funding had increased by 2,240%, and 502,958 people infected or affected by the disease were receiving support.

However, the MAP was not without its flaws; the programme had a persistent problem with the inaccessibility of funding for projects, which was particularly damaging for the most vulnerable in the region. This stemmed from the MAP’s inability to resolve the weak institutional capacity within sub-Saharan African governments, which harmed service delivery and the coordination of donor efforts across public and private actors and communities. The MAP also failed to support effective monitoring and evaluation systems, which affected the veracity of HIV/AIDS outcome indicators, though efforts were made to address this in phase 2 of the MAP in 2006. The clearest indication of the MAP’s shortcomings is that HIV prevalence and infection rates have remained relatively unchanged in sub-Saharan Africa. Thus, while the MAP revolutionised sub-Saharan African HIV/AIDS governance and produced a number of positive outcomes, its overall success is questionable.

*

The World Bank’s restructuring of sub-Saharan African governance through the SAPs significantly undermined the capacity of the region’s governments against HIV/AIDS, with the adjustment programmes enervating vital preventative functions against the disease and, as a result, propagating its escalation. However, by the new millennium the growing impact of HIV/AIDS on the development of sub-Saharan Africa would inspire the World Bank to redefine its approach to HIV/AIDS, and its governance reform measures in the region overall. This led to the formulation of the MAP, placing the fight against HIV/AIDS at the centre of the World Bank’s relationship with sub-Saharan Africa and drastically increasing the organisation’s commitment to reducing the effects of the epidemic. However, the MAP has failed to convincingly tackle HIV/AIDS, and has once again demonstrated the difficulty of implementing the World Bank’s philosophy on governance in sub-Saharan Africa. As such, it remains unclear whether the World Bank should be considered a perpetrator of, or a defender against, HIV/AIDS in sub-Saharan Africa. Then again, could the World Bank – or indeed any other organisation – have done more against the disease?

The impact of neoliberalism on post-colonial Africa in the 20th century

The decolonisation of Africa in the 1950s and 1960s was seen as the great opportunity for the continent to finally realise its potential independently. Spurred by the sustained demands for self-determination by leading nationalists such as Jomo Kenyatta and Kwame Nkrumah, many countries throughout Africa took back their sovereignty from their European possessors. For the first time in more than a century, Africans had once again acquired control of their resource-rich continent, and could now build upon their individual liberation movements and fulfil their ambitions not only to compete with, but to overtake, their former oppressors. However, merely twenty years after independence, several African countries encountered severe economic complications. Thus, as Africa’s principal development partner, the World Bank, alongside its partner organisation the International Monetary Fund (IMF), stepped in by supplying loans to cash-strapped African economies through Structural Adjustment Programmes (SAPs). However, the SAPs – and their specific emphasis on neoliberal reform – would further erode Africa’s economic capabilities, and entrench poverty throughout the region by the end of the 20th century.

African economies had originally performed relatively well after independence. As Nana K. Poku stated, during the 1960s gross domestic product (GDP) and exports grew at rates comparable to other main developing regions at the time, and generally better than South Asian countries. Economic growth in the continent was stimulated by an emphasis on industrialisation to reduce dependency on manufactured imports (with domestic agriculture being undervalued in the process), and a significant increase in government-led initiatives to strengthen the public sector. Motivated by the popular socialist economic principles of the Cold War era, and by large sums of donor support from international financial institutions (IFIs) and developed countries, government expenditure accelerated throughout Africa. By contrast, the private sector remained relatively dormant during this period. African economic policy during the 1960s led to major investments in infrastructure, and created an expansion in public health and primary education.

Unfortunately, by the 1970s African economies began to take a dramatic turn for the worse, negating the positive gains made a decade earlier. Several factors perpetuated the economic decline in Africa. Internally, in 1972-73 the continent contended with a severe drought that affected large parts of the region, shattering the already under-utilised agricultural sectors. This caused food and livestock shortages across many countries, triggering forced migration and inflows of refugees. Moreover, growing political instability and corruption throughout Africa resulted in the mismanagement of government enterprises, and led to the poor maintenance of highly expensive machinery and of domestic labour forces.

There were also external shocks that threatened Africa’s economic sustainability. The decision by the Organisation of Petroleum-Exporting Countries (OPEC) to dramatically increase oil prices in 1973 devastated the economies of most African countries, which now had to deplete their foreign exchange reserves and acquire foreign loans to import oil. In fact, oil imports as a percentage of export earnings rose from 4.4% in 1970 to 23.2% by 1980. However, the decision was incredibly beneficial for oil-exporting states, including African countries such as Nigeria and Gabon, which were able to maximise their profits at the expense of their continental counterparts.

There was also a fall in the price of African commodities, with some commodity prices dropping below the cost of producing the commodity itself. This was propagated by low consumer demand in the West emanating from the economic recession of the late 1970s. The recession intensified Western protectionism, reducing the commitment of Western governments to IFIs while simultaneously increasing the interest rates on loans obtained by African countries. This coincided with the gradual move away from Keynesian economics in favour of neoliberalism – spearheaded by the ‘New Right’ and manifested in the policies of Margaret Thatcher and Ronald Reagan. As a result, the attempts by the Organisation of African Unity (OAU) to quell the economic disparities through the state-driven Lagos Plan of Action were largely disapproved of by IFIs and Western governments in favour of economic reform based on free market capitalism.

Despite the internal and external shocks, the World Bank and the IMF argued that the financial difficulties in Africa were largely attributable to the defective economic policies of domestic governments, particularly excessive state intervention and gross resource mismanagement. Thus, in order to stimulate positive growth in Africa, the World Bank and IMF proposed under the principles of the ‘Washington Consensus’ that African governments should refrain from intervening in their economies and social services, liberate market forces by increasing privatisation and deregulation, reduce trade barriers and price controls, and impose strict fiscal and credit policies to control government expenditure.

Under the Structural Adjustment Programmes the World Bank attached the aforementioned neoliberal reforms as conditionalities to its lending policy, with the IMF assisting with the macroeconomic development of the region. This subsequently forced indebted countries across the continent to adopt the adjustment policies prescribed by the World Bank in order to acquire the vitally needed loans to alleviate the existing financial strains on their economies. Though the SAPs were expected to encourage long-term growth, throughout the 1980s and 1990s their implementation was extremely damaging to social, economic and political conditions in Africa.

SAPs had an incredibly detrimental impact on the delivery of basic services in sub-Saharan Africa, particularly the provision of healthcare, education and welfare. The economic reform packages severely curtailed government expenditure commitments, with the introduction of user fees diminishing access to social services, particularly for the most vulnerable in African society. During the two “lost decades” of the 1980s and 1990s, per-capita social spending in sub-Saharan Africa declined to 76 per cent of its 1981 level. The reductions to civil services also diminished government capacity, deteriorating the administrative ability of African countries to address public concerns. Furthermore, liberalisation policies inflated the price of food and lessened employment and trade opportunities, particularly in rural areas. This caused widespread poverty and led to the proliferation of disease, especially HIV/AIDS and tuberculosis. The anticipated transformation in private enterprise through the SAPs failed to stimulate growth, creating deeper holes in the finances of already-depleted services and escalating social inequality.

The negative effects of the SAPs were compounded by the fact that they damaged the ability of African governments to service their foreign debts. In 1980, at the onset of the World Bank and IMF’s intervention in Africa, the ratios of debt to gross domestic product (GDP) and to exports of goods and services were 23.4% and 65.2% respectively. By 1990, they had deteriorated to 63.0% and 210.0% respectively. African governments can only pay off debts from earnings in foreign currency (through exports, aid or additional loans), making it incredibly difficult for countries to manage their debts.

The scale of the problem is overwhelmingly illustrated by the fact that between 1980 and 2000, sub-Saharan African countries paid more than $240 billion in debt service – roughly four times the amount of their debt in 1980. Moreover, by the end of the 20th century, every African country with the exception of South Africa was spending more on debt repayments than on domestic healthcare. The World Bank and IMF did introduce the Heavily Indebted Poor Countries (HIPC) initiative in 1996, providing debt relief and low-interest loans to the world’s poorest countries. However, eligibility for the initiative hinged on the successful implementation of SAPs, which, as stated above, often led to the decline of African countries in any case.
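As a back-of-envelope check, those two figures together imply a 1980 debt stock of roughly $60 billion; the sketch below shows the arithmetic, using only the numbers quoted above (the implied debt stock is a derived estimate, not a figure from the sources).

```python
# Rough arithmetic behind the debt-service claim, using only the figures quoted above.
debt_service_paid = 240e9    # total debt service paid by sub-Saharan Africa, 1980-2000 (USD)
multiple_of_1980_debt = 4    # "roughly four times the amount of their debt in 1980"

implied_debt_1980 = debt_service_paid / multiple_of_1980_debt
print(f"Implied 1980 debt stock: ${implied_debt_1980 / 1e9:.0f} billion")  # ~$60 billion
```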

The inception of the HIPC initiative coincided with the introduction of Poverty Reduction Strategy Papers (PRSPs), which eventually replaced SAPs in 2002 with a poverty-focused approach to economic development. Indeed, the World Bank would ultimately acknowledge in several of its evaluation reports on sub-Saharan Africa that the SAPs had very little impact on economic growth. With that said, rather than place the blame on the destructive nature of its neoliberal economic agenda, the World Bank argued that African governments had once again failed to manage their economies effectively, and that they had implemented SAPs poorly.

The PRSPs have placed a greater emphasis on reducing poverty in Africa, and have led to a more country-driven direction for development, unlike the “one size fits all” approach under the SAPs. Despite this, PRSPs have continued the World Bank and IMF’s insistence on neoliberal reform, with both organisations maintaining the imposition of conditionalities on their lending. The positive impact of PRSPs remains questionable, and they have ultimately failed to dispel the extensive criticism and negative perceptions aimed at the World Bank and IMF for causing needless hardship in Africa that continues to this day.

Should the United Kingdom leave the European Union?: Challenging Eurosceptics on the British economy

On 23rd June 2016, the United Kingdom will vote to decide whether the country is to remain in or leave the European Union. For the first time in a generation, the general public will have the opportunity to formally declare their position on the U.K.’s membership of the EU, a declaration that could potentially transform not only the country’s political, social and economic landscape, but also that of its European and international neighbours. In the lead-up to the EU referendum, the main political parties have announced their allegiances to the opposing campaigns, which either defend the U.K.’s commitment to the EU or argue for the country’s abandonment of the European politico-economic project. Within the Conservative Party, the EU referendum has reignited historic divisions concerning the U.K.’s membership, creating a schism among the Tory leadership.

It is true that both the ‘In’ and ‘Out’ campaign groups have displayed instances of unsavoury scaremongering to encourage support for their respective causes. However, the fear-inciting tactics used in defence of Brexit are, in my view, particularly concerning. Proponents of the Leave campaign, such as Nigel Farage, Michael Gove and Boris Johnson, have used the EU referendum to propagate misconceptions regarding the ailing state of the British economy, and to promote an aura of hate towards immigrants across the country. The vile effect of the claims made by the ‘Out’ campaign has been exemplified by the brutal murder of Jo Cox MP by far-right activist Thomas Mair, who had previous links to neo-Nazi and pro-apartheid organisations. But are ‘Brexiters’ correct in suggesting that the U.K. will be able to improve the economy and control immigration if the country leaves the EU?

One of the central arguments advanced by the ‘Out’ campaign is that a Brexit would allow the U.K. to negotiate its own trade deals with other countries independent of existing EU trade agreements. There are currently no tariff barriers to trade for countries within the EU, creating a single market of goods and services (though non-tariff barriers such as product standards produce limitations on trade). However, EU member states (including the U.K.) are also subject to the trade agreements made with non-EU countries by the European Commission.

EU trade agreements prevent the U.K. from negotiating separate trade deals with the rest of the world – which may appear suffocating given the rise in economic growth in the developing world and the ever-increasing U.K. trade surplus with non-EU countries. By comparison, the U.K. has a trade deficit of £61 billion with the EU, with domestic consumer demand outweighing the value of our exports to the continent. In fact, U.K. exports have made a gradual shift from the EU to non-EU countries, with the EU’s share of total exports declining from 54.8 per cent to 44.6 per cent between 1999 and 2014. For Brexit campaigners, this illustrates the potential for the U.K.’s economic growth in the rest of the world, which could be maximised as a result of leaving the EU.

While it may be advantageous for the ‘Out’ campaign to postulate positive economic outcomes if the U.K. were to abandon the EU, we are simply unable to predict what kind of trading relationship the U.K. would be able to negotiate with the EU, or with the rest of the world. Eurosceptics have argued that the U.K. could negotiate a trade deal with the EU similar to that of Norway or Switzerland. Neither country is a member of the EU, but both have access to the single market and can negotiate trade deals with non-EU countries independently. However, Norway and Switzerland are subject to EU rules and policies with limited or no influence over them, and make budget contributions to the union. Moreover, both countries must abide by the free movement of goods, services and, most importantly, people.

On the other hand, Canada has also negotiated tariff-free access to the EU without making budgetary contributions or accepting the free movement of people. However, that trade agreement took an incredible five years to complete, and is difficult to compare to a potential U.K. model, as the EU’s share of Canada’s total exports was less than 10% – significantly less than the U.K.’s. In the worst-case scenario, where the U.K. is unable to reach any sort of trade agreement with the EU, the country would fall back on World Trade Organisation (WTO) rules. This would mean no free trade agreement for goods or services with the EU: the U.K. could face EU and non-EU external tariffs, with no access to the single market. This uncertainty over what kind of trade deal can be arranged would heavily impact the productivity of the U.K. economy in the interim, as the EU is the country’s main trading partner, making Brexit a risky, uncalculated gamble.

Another important argument put forward by Eurosceptics regarding the economy is the view that the U.K. could save an estimated “£350 million a week” by leaving the EU – as quoted by Boris Johnson in a recent television debate – since the country would no longer have to make a financial contribution to the union. It has been suggested that the £18 billion saved could instead be invested into public services, such as education or health care. However, as the Institute for Fiscal Studies noted, the figure of “£350 million a week” is absurd, as it ignores the rebate the U.K. receives from the European Union. If the rebate is included, the figure actually stands at £14.4 billion, or 0.8% of GDP in 2014 (around £275 million a week).
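The weekly figures follow directly from the annual sums quoted above; here is the arithmetic as a minimal sketch (assuming a 52-week year):

```python
# Back-of-envelope check on the weekly contribution figures quoted above.
gross_annual = 18e9        # headline gross contribution, before the rebate (GBP/year)
rebated_annual = 14.4e9    # contribution after the U.K. rebate (GBP/year)

WEEKS_PER_YEAR = 52
print(f"Gross:        £{gross_annual / WEEKS_PER_YEAR / 1e6:.0f}m a week")    # ~£346m -> the "£350m a week" claim
print(f"After rebate: £{rebated_annual / WEEKS_PER_YEAR / 1e6:.0f}m a week")  # ~£277m -> "around £275m a week"
```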

The EU also spends £4.5 billion on the U.K. in areas such as agriculture and research, so the U.K. only provides approximately £8 billion a year to the EU budget on a net basis. Moreover, though the U.K. receives the lowest per-capita spend from the EU of any member state, EU spending is only around 1 per cent of our GDP – lower than the EU average – and is a relatively small amount in comparison to the £169 billion spent on health and education alone by the government. It is also important to note that the membership fee brings with it increased trade, investment and jobs, which the U.K. would lose if it were unable to negotiate a trade deal with the EU. In fact, many economists have argued that lower economic growth as a result of leaving the EU would negate any savings made from EU contributions anyway.

Arguably the most prominent argument of the ‘Out’ campaign concerns the impact of EU immigration on the U.K. economy. UKIP leader Nigel Farage in particular has been defiant in his attempts to reduce immigration from the EU into the U.K., most demonstrably through the party’s recent “breaking point” campaign poster. However, there is no conclusive evidence to suggest that EU immigrants place a strain on the British economy, with most studies suggesting that their impact is relatively small, costing or contributing less than 1% of the U.K.’s GDP. Rather, a recent UCL report stated that since 2000, immigrants have made a more positive fiscal contribution than longer-established migrants – £5 billion from immigrants who entered the U.K. from countries that joined the EU in 2004. There is also evidence to suggest that EU immigrants pay more in tax than they receive in welfare or public services, and that they have a higher net fiscal impact than non-EU migrants.

In any case, leaving the EU may not lead to a significant reduction in immigration. As alluded to above, the kind of trade deal the U.K. is able to negotiate with the EU after “Brexit” is pivotal to controlling immigration, as access to the EU single market is likely to require the free movement of people. It may therefore require the U.K. to leave the single market altogether in order to effectively control immigration, which could nevertheless have a negative impact on the economy. In my view, the argument over the impact of immigrants is largely cultural; as Nigel Farage proclaimed, “the country has become unrecognisable.” However, while the cultural impact of EU immigration is debatable, there is enough evidence to suggest that the economic value of EU immigration is likely to be beneficial to the U.K., countering any superficial arguments that a Brexit would conclusively have a positive impact on the U.K. economy.

Finally, let us not forget the overwhelming consensus among economic institutions, researchers and businesses (according to the British Chambers of Commerce) that the U.K. must remain in the EU in order to preserve economic stability and continued growth. There is one notable exception to this trend – the suitably named Economists for Brexit. Supported by Patrick Minford and Roger Bootle, Economists for Brexit have predicted a net benefit to the U.K. economy. However, even the most comprehensive analysis of Brexit, by the independent think tank Open Europe, has predicted that if the U.K. is able to negotiate a trade deal, GDP could increase by just 1.6% by 2030 – and if a trade deal is not negotiated, GDP could be 2.2% lower over the same period.

The argument against the Eurosceptics is clear. I plead with the people of this great land: do not put our country at economic risk.

Reimagining race: Should the colour of your skin still matter?

Since the dawn of mankind, human beings have used various forms of identification to distinguish themselves from their counterparts. In the contemporary world, social identity has been shaped by notions such as a person’s religious beliefs, cultural attitudes, sexual orientation, or their disposition towards a particular gender group. However, it is the categorisation of ‘race’ that serves as the fundamental component of our identity today, structuring civilisation into a complex intermixture based on the colour of our skin. This structure has repeatedly produced adverse implications for the cohesion of humanity, both on a local and a global scale.

Unlike other forms of identification, the modern international system of capitalism was built upon the conception of ‘race,’ emanating from the exploits of slavery and colonialism. Indeed, in order to rationalise the seizure of human beings, territory and resources, European explorers categorised people across the world according to their pigmentation, building a meticulous framework that not only emphasised their own superiority, but demonised and subordinated the ‘alien’ populations they encountered. Supported by the scientific revolution of the early modern period, and the bastardisation of Christian beliefs by proponents of the church, Europeans were able to justify the fabricated damnation of people considered ‘black,’ and the hierarchical aggregation of other ‘coloured’ peoples.

Despite the abolition of slavery, the decolonisation of large parts of the world, and the global fight for civil rights against racial segregation and discrimination, the differences in our skin colour remain an impediment to social justice, and contribute to the divisions already present in the world today. ‘Race’ continues to hold a transcendent influence on the provision of wealth and economic opportunity, health, education, friendly and intimate relationships, civil obedience, and other areas of the domestic and international arena, often to the detriment of those formerly subjugated by the racial categories built by imperialists centuries ago. Moreover, while this article refuses to delve into the common stereotypes associated with different ‘racial’ groups, we all recognise (whether consciously or subconsciously) that they linger in the minds of every thinking individual on this planet, ultimately contributing to the maintenance of the unjust racial status quo.

But why must humanity remain confined to such illusory ‘racial’ conceptions? The globalised nature of the 21st century has only reinforced the fallacy of the long-standing ‘racial truths’. Thus we, as human beings, must now avoid confining our identities to such preposterous beliefs, which were originally created purposely to propagate our separation. ‘Race’ is a social construct, a construct developed by the holders of power to prevent the societal conditions necessary to produce equality. As such, ‘race’ should be considered as trivial as the colour of our eyes, the hairs on our heads, or the lines on our hands, in order to end the dictatorial influence of tendentious racial ideas.

How can social cohesion ever be achieved if I preserve my identity as a ‘black’ man, reigniting the suffering of peoples who share the same shade of skin, while another preserves their ‘whiteness’ and their ‘ideological superiority’ over the rest of the world? Humanity must agree that our race has no colour, and is not a stimulus for societal domination or subjection. With the elimination of ‘race’, we naturally eliminate the ancestral relationship between the oppressed and the oppressor, the slave and the slave owner, the conquered and the conqueror.

Social differences exist, and they always have, but ‘race’ is not a condition that should lead to our disunity. Humanity is built upon the physical differences between man and woman, the ideological and theological dissimilarities regarding the proof of our existence, and the cultural and linguistic distinctions that stem from our individual geographical origins. However, the importance of our skin colour should only have a bearing on our identity if we consider the hierarchy of different racial groups to be intrinsic to our nature – which, in my view, is completely baseless.

On reflection, it is time the world re-evaluated the role of ‘race’ in our societies, and its significance in the wider context of our post-imperialist global environment. If my view appears utopian, then unfortunately you are controlled by the deep-rooted psychological manipulation of European expansionism. Wake up.

Food for thought.

Is Western foreign policy to blame for the ‘European migrant crisis’?

Over the last few weeks, the British public have been well informed of the harrowing experiences faced by migrants travelling to Europe in an attempt to escape the harsh realities of their homelands. The media frenzy over the ‘European migrant crisis’ has not only raised fears across the continent over the safety and wellbeing of the incoming migrants, but has also led to an increase in xenophobia towards the thousands of escapees in search of a better life. In Britain there has been growing apprehension and antipathy aimed in the direction of migrants, swayed by a refusal to accept their newfound habitation in the European Union. This view is somewhat epitomised by David Cameron’s use of the word ‘swarm’ to describe the numbers of people crossing the Mediterranean Sea in search of security in Europe.

But should Britain and its Western compatriots shoulder some of the blame for the recent influx of migrants into the European Union? In answer to my own question: yes. Of course the West should hold a significant amount of responsibility for the crisis.

First and foremost, while the crisis is commonly being described as an issue concerning ‘migrants,’ in reality a huge percentage of those arriving in Europe are in actual fact refugees seeking asylum. This would suggest that the people arriving in the EU are generally not in search of the financial benefits that the region has to offer – as the term ‘migrant’ has often implied. Rather, they have fled from the horrors of violent conflicts, or from political and social persecution in their countries of origin.

Most importantly, however, the majority of the people claiming refuge have been directly (or indirectly) affected by the often self-seeking foreign policies of the West outside of the EU. The largest number of migrants travelling to Europe come from Syria. Since the turn of the year, over 100,000 people have fled from the destructive Syrian Civil War, which has seen the country torn between the regime of Bashar al-Assad and armed militias, including the new proponent of Islamic radicalism, the Islamic State of Iraq and Syria (ISIS). Although the West has stopped short of armed intervention in the civil war, the United States and Britain have consistently bolstered the Syrian opposition by supplying intelligence, training and ammunition to groups fighting against the government. This has only helped to exacerbate the severity of the conflict, and further tarnish the lives of civilians in Syria.

Afghans also constitute a significant number of the refugees arriving in the EU, largely due to the fallout from the US- and NATO-led war in Afghanistan. The conflict – which was motivated by the September 11 attacks – has caused widespread devastation across the country. Fourteen years later the war continues to plague the citizens of Afghanistan, with the somewhat unsuccessful removal of the Taliban proving in hindsight to be a costly and ruinous venture for the West. According to the United Nations High Commissioner for Refugees (UNHCR), over 2 million people have been classified as refugees since hostilities began in Afghanistan.

As in Syria and Afghanistan, recent Western involvement in Libya and Iraq has extended the destruction of civil society across the Middle East and North Africa, and as a result has increased the number of refugees entering the EU. In 2011, NATO intervened against the Libyan dictator Muammar Gaddafi following his apparent refusal to cease actions against civilians that the West considered ‘crimes against humanity.’ Three years later, with the help of his Western allies, Barack Obama would initiate another offensive in Iraq against the growing influence of ISIS in the north of the country.

In both conflicts, the West has to some extent been successful in achieving its military objectives. With that said, the West has also deepened the humanitarian crises in Iraq and Libya, and has played a decisive role in worsening the living conditions of civilians. In Iraq alone, the UN Office for the Coordination of Humanitarian Affairs estimates that roughly 5.2 million people now need humanitarian assistance, including food, shelter, clean water, sanitation services, and education support. Furthermore, the demise of national order and security within Libya is embodied by the fact that migrants are repeatedly using its northern border as an escape route to Europe. Incredibly, in August, 4,000 people travelling from Libya were saved from boats off the coast of the country in what has been described as one of the largest rescue missions of the ‘crisis.’

Away from the Middle East and Libya, a growing number of refugees continue to emigrate from Eritrea, which the UN recognises as having one of the worst human rights records in the world. In recent months, thousands of Eritreans have escaped to the shores of Italy in response to the oppressive, single-party state under Isaias Afwerki. However, despite the West’s insistence on protecting the liberties of people across the world – as demonstrated by its exploits in the Middle East and Libya – the plight of the Eritrean people has largely been ignored by Western policymakers and the mainstream media. This is typified by the fact that reports on the European migrant ‘crisis’ have scarcely mentioned the number of Eritreans entering the EU or the reasoning behind their escape. Unlike the Middle East and Libya, Eritrea does not present itself as a ‘goldmine’ of natural resources, and as a consequence falls outside of the West’s economic and political interests.

As the members of the EU scramble to find a solution to the migrant ‘crisis,’ a multitude of people will continue to risk their lives to leave behind the trials and tribulations they face in their homelands. Europe has now become the ‘promised land’ for many of the migrants escaping their mother countries, with approximately 340,000 men, women and children having already journeyed through cruel and unsavoury terrain towards the EU border. Many have perished in the most inhumane of circumstances on the road to a safer environment that they could one day call home. But will there be a robust resolution to the European migrant ‘crisis’? It is unclear. What is certain is that the West must accept considerable accountability for the plight of migrants and take the necessary steps to ensure that they are given the protection they deserve in Europe. Going forward, it is vital that the West re-evaluates its foreign policy and ensures that lessons are learned, to prevent a migrant crisis on this scale from occurring in the future.

Should we question the integrity of the Conservatives’ victory in the 2015 General Election?

On Friday 8th May 2015, David Cameron stood victorious in front of the British media after witnessing the Conservative Party defy the polls and sweep to an unexpected triumph in the general election. In what was one of the most anticipated elections in recent history, the Tories defeated their opponents across the country and collected 331 seats. This was ultimately enough to form an outright majority and eliminate the prospect of a second successive coalition government. No longer bound by the chains of the Liberal Democrats – soundly beaten during the general election – and with Labour and UKIP also left with some ‘soul-searching’ to do, Cameron gleefully returns to 10 Downing Street with the future of the United Kingdom in his hands.

Despite David Cameron’s now famous victory, a burning question remains of great interest to political commentators and the public as a whole: did the Conservative Party actually deserve to win the general election? Yes – if the result of the general election is taken at face value. The Tories won five more than the 326 seats required to form a slender majority administration, with the party seizing 24 seats from its political rivals and surprisingly increasing its percentage of the vote.

However, if we are to assess the Conservative Party’s share of the vote – and, more importantly, the often-disputed ‘First-Past-The-Post’ (FPTP) voting system – there is an argument that challenges the Tory supremacy over the House of Commons. The Conservatives did muster an impressive 36.8% of the public vote, but if seats were allocated under a system of proportional representation, the Tories would not have reached the number of seats they tallied at the end of the election.

Arguably as telling was the detrimental effect of ‘First-Past-The-Post’ on the other political parties. Although Labour’s share of seats would have seen only a minimal decrease had it reflected the public vote, UKIP and the Liberal Democrats suffered immensely as a result of FPTP. Despite gaining 12.6% and 7.9% of the vote respectively, UKIP and the Liberal Democrats won only nine seats between them. Conversely, the SNP – widely applauded for their success in the general election – gained 56 seats despite obtaining only 4.7% of the vote. Whilst the Conservatives would have remained the largest party under a proportionally representative voting system, they would certainly have been forced to form another coalition government and to contend with a greater number of diversely affiliated MPs in Parliament. Unfortunately, however, the Conservatives can expect a somewhat ‘muzzled’ resistance to their governance over the next five years as a result of the flawed FPTP system.
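To make the distortion concrete, here is a rough sketch of how those vote shares would have translated into seats under a purely proportional allocation of the 650 Commons seats. This is an illustrative simplification – real PR systems use thresholds and regional lists – and the one-seat/eight-seat split of the nine seats UKIP and the Liberal Democrats won between them is taken from the actual result.

```python
# Illustrative comparison of actual 2015 seats (FPTP) with a naive proportional
# allocation of the 650 Commons seats, using the vote shares quoted above.
TOTAL_SEATS = 650

# party: (vote share %, seats actually won under FPTP)
parties = {
    "Conservative":     (36.8, 331),
    "UKIP":             (12.6, 1),
    "Liberal Democrat": (7.9, 8),
    "SNP":              (4.7, 56),
}

for name, (share, actual_seats) in parties.items():
    pr_seats = round(share / 100 * TOTAL_SEATS)
    print(f"{name:<17} FPTP: {actual_seats:>3}   pure PR: ~{pr_seats:>3}")

# Conservative      FPTP: 331   pure PR: ~239  (short of the 326 needed for a majority)
# UKIP              FPTP:   1   pure PR: ~ 82
# Liberal Democrat  FPTP:   8   pure PR: ~ 51
# SNP               FPTP:  56   pure PR: ~ 31
```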

Another argument that questions the integrity of the Conservative victory concerns the indecisive nature of the British electorate and the increased use of ‘scare tactics’ within right-wing politics. The opinion polls right up to the general election suggested that the final result would be too close to call, with Labour fighting tooth and nail with the Conservatives for the right to be in power. The exit poll, however, painted an entirely different picture, teasing a Tory victory by implying that they were inches away from the majority required to govern alone.

The ‘Shy Tory Factor’ has been used to theorise the disparity between the opinion polls and the outcome of the general election. As in the 1992 general election, a large portion of the British public apparently ‘disguised’ their support for the Conservatives until polling day out of ‘shame’ at their inclination towards the blue corner of British politics. However, whilst this theory may be true for a portion of the ‘silent’ Tory vote, it appears that many voters were simply undecided and chose to vote Conservative without having a clear and definitive allegiance to the party on (or in the days before) polling day.

The decision-making of the ‘undecided voter’ during the election reflected how the right had instilled fear into the electorate. The Conservatives – and even more markedly UKIP – flooded the British public with misconceptions about Labour’s supposed destruction of the economy and about immigration, whilst also igniting the subject of Britain’s withdrawal from the European Union. This proved a successful tool in dramatically increasing the right’s share of the vote and consequently destroyed the possibility of the Labour Party gaining a majority in Parliament. Roy Greenslade expressed the view that in the five years leading up to the election, the right-wing press continually propagated the views of the Conservatives and UKIP and gave ‘disproportionately favourable coverage to Nigel Farage and his party.’

Building on this, the role of the media should not be underestimated in swaying public opinion and, fundamentally, in providing the Conservatives with the platform to win the general election. The Conservatives were backed by six major newspapers – The Sun, The Telegraph, the Financial Times, the Daily Mail, The Independent and The Times – all of which bombarded their readers with Tory propaganda and anti-Labour hyperbole in an attempt to maximise the Conservative vote. The Sun’s persistent character assassination of Ed Miliband was particularly noteworthy in the lead-up to the election, damaging the reputation of the Labour leader and his credentials as a future Prime Minister. The newspaper’s ‘Save Our Bacon’ headline a day before the election arguably sealed the fate of Miliband’s campaign and was the pinnacle of The Sun’s attempts to persuade its five-million-plus readership to support David Cameron.

Broadcasters also appeared to be pro-Conservative. Sky News – another media organisation in which Rupert Murdoch has a vested interest – unsurprisingly reproduced the favouritism it showed towards the Tories in the 2010 general election. This was evident during the first televised election debates, where Ed Miliband faced a far more stringent assault on his character from Jeremy Paxman and Kay Burley than his Conservative adversary did, including personal jibes about his relationship with his brother.

Contrary to popular belief, the BBC have also been noted for their bias towards the Tories. As Phil Harrison eloquently put it in his article ‘Keeping An Eye On Auntie: The BBC & Pro-Tory Bias,’ the BBC leant increasingly towards the right in order to appease the ‘status quo’ of capitalist realism promoted by the Conservatives. The lack of coverage across all of the major broadcasting organisations of the protests against the government only days after the election merely perpetuates this overwhelming media disposition towards the Tories.

Now that the dust has settled on an extraordinary general election, the Conservatives will go down in folklore as the ‘conquerors of coalition politics’ despite the near-universal expectation of a hung parliament. However, as this article indicates, the Tory victory reeks of unworthiness, not least because of a widely criticised voting system, an electorate with ambiguous political affiliations and a media environment that showed considerable support for David Cameron and his party – and for the right in general. Of course, it would be wrong to deny the deficiencies in Labour’s election campaign, though there is sufficient evidence to suggest that the Conservatives were given more than a helping hand towards their achievement. Nevertheless, be sure to expect the Conservatives’ austerity programme to go into full swing, and the continual vilification of Labour’s ability to govern by a pro-Conservative press. Oh, the despair of another five years with Mr Cameron at the helm of this wonderful country.

The Île-de-France attacks and the proliferation of Western ‘Islamophobia’

Before I begin, I must stress that I in no way, shape or form condone the actions of the gunmen who shot dead twelve people after storming the offices of the satirical newspaper Charlie Hebdo, or the subsequent acts of terror that took place in the Île-de-France region thereafter. Nor do I sympathise with the aims and objectives of al-Qaeda, Islamic State, Boko Haram, or any other group associated with Islamic radicalism. However, I have certainly struggled to comprehend the growing antipathy towards Muslim communities – and towards the integrity of Islam – that has followed the fatal shootings in the French capital. In the last few weeks, ‘Islamophobia’ has intensified across the Western world, perpetuated by the reactions of politicians, various media outlets and other commentators, and leading to a profusion of verbal and physical attacks against Muslims, most notably epitomised by the Chapel Hill shootings.

On 7 January 2015, Saïd and Chérif Kouachi initiated a chain of attacks that would stun the world, generating immediate attention worldwide and causing shockwaves across social media. In total, 17 people were killed and many others wounded, in what was France’s deadliest act of terrorism since the Vitry-Le-François bombing of 1961. The events received widespread condemnation from across the world, most vocally from Britain, the United States and Israel, who reinforced their aims to tackle Islamic radicalism. The French people rallied against the actions of the terrorists, with the rest of the Western world quickly following suit under the banner of “Je Suis Charlie.”

However, despite public knowledge that the terror attacks had been motivated by Charlie Hebdo’s controversial lampooning of the Prophet Muhammad in previous publications, the West comprehensively defended the newspaper’s depictions of the ‘founding father of Islam’ in support of the liberal concept of ‘Freedom of Speech.’ This directly challenged a key Islamic principle forbidding images of Muhammad and was, in turn, a demonstration of the West’s attempt to sow dissension between Islam and Western ideology after the Île-de-France attacks. This was exemplified further by Barack Obama and David Cameron’s statement that they would “continue to stand together against those who threatened their (the West’s) values and their way of life”; along with France, they made it clear to those “who think they can muzzle freedom of speech and expression with violence that their voices will only grow louder.” There was a determination to remove all accountability from the publications of Charlie Hebdo, and an unwavering refusal by the West to acknowledge the undeniable disrespect shown by the newspaper towards a faith followed by millions of people across the world.

Not only did the Western world largely disregard the argument that the portrayals of Muhammad were the catalyst for the Île-de-France attacks, but Charlie Hebdo also received unprecedented support for its first edition after the events in Paris – an edition that once again depicted the Prophet Muhammad satirically. The edition of 14 January 2015 sold over seven million copies, in contrast to its standard run of 30,000. This exposed the scale of the support for ‘Freedom of Speech,’ as well as the extent of the West’s insolence towards Islam and its indifference to what the religion’s believers consider blasphemy. The wide-scale publication of the edition also served to disillusion a large number of Muslims across Europe and the United States, and was yet another example of the West’s naivety about how the publication’s success could further fuel Islamic radicalism.

‘Islamophobia’ was not confined to the events surrounding Paris but manifested itself in a number of other ways. In the week after the Île-de-France attacks, 128 ‘anti-Muslim’ incidents were registered with the French police, compared with 133 in the whole of 2014. Many of these incidents included shootings, attacks against mosques and threats or insults, and most received minimal or no media attention. This illustrates the considerable disregard for the welfare of Muslims in France and is evidence of the contrasting approach to acts of terror when Muslim communities are on the receiving end. Furthermore, there were many reports of Islamophobia in Germany, the United States and other parts of the Western world.

As predicted, the Île-de-France attacks invigorated the far-right movement, with the likes of Marine Le Pen and Geert Wilders using the events to rationalise their anti-Muslim agendas. There were also other high-profile episodes of ‘Islamophobia’ on social media – for example, Rupert Murdoch’s comments on Twitter suggesting that Muslims must be held responsible for the acts of terror. This is unsurprising given the racist nature of some of Mr Murdoch’s past statements, and the fact that Twitter and Facebook have been fitting avenues for the flourishing of ‘Islamophobia,’ as an investigation by The Independent found. The views of terrorism ‘expert’ Steven Emerson, who falsely claimed that Birmingham was a completely ‘Muslim city’ and that in areas of London (and I quote) “there are actually Muslim religious police that actually beat and actually wound seriously anyone who doesn’t dress according to religious Muslim attire,” only reinforce my argument that Western ‘Islamophobia’ is on the rise.

Arguably the most identifiable instance of ‘Islamophobia’ took place on 10 February 2015, in what is commonly known as the Chapel Hill shootings. Merely a month after the Île-de-France attacks, three innocent Muslims were shot dead by Craig Hicks in another act of terror. Although the killings have largely been reduced to an ‘isolated dispute over parking,’ there is sufficient evidence to suggest that the murders were motivated by hate and by Hicks’s aversion to the religion of his victims. Unlike the Île-de-France attacks, the Chapel Hill shootings were given minimal attention by the Western media and met with a deafening silence from the very politicians who had argued so valiantly against the atrocities in France. One American news company even went as far as interviewing Hicks’s wife, who was given the opportunity to defend her husband’s actions and proclaim his efforts in ‘championing the rights of others,’ despite considerable reports of his ‘gun-happy’ and extreme atheist tendencies. It would be hard to imagine the Western media ever interviewing the wife of a Muslim man who had carried out a potential hate crime, particularly in a world where Islam is so often feared and repudiated.

As demonstrated, the weeks since the Île-de-France attacks have seen a dramatic rise in ‘Islamophobia’ that has once again served to stigmatise and vilify Islam and its worshippers. A concept that solidified in Western culture after the events of September 11, ‘Islamophobia’ has resurfaced to target Muslim communities and reinforce the historical divisions between Islamic and Western Judeo-Christian civilisations. Can we foresee an end to the cultural persecution of Islam? Maybe – but if it remains the West’s objective to antagonise an entire religion for the actions of an isolated few, the world may not see a cessation of ‘Islamophobia’ for a considerable time yet.

The regeneration of Hackney – the saviour of a borough in despair

Historically, the London Borough of Hackney was notorious as one of the poorest areas in Britain, plagued by widespread poverty and unmanageable social tensions and tormented by endemic criminal activity. From the derelict housing estates to the considerable deficiencies within the borough’s schools and hospitals, residents from all walks of life encountered scenes of marked deprivation. Since the turn of the century, however, Hackney has emerged as a vibrant centre of affluence, propelled by the booming property market in the area and a growing drive by local government to address wealth inequalities – supported by the advent of the 2012 Olympics. An influx of middle-class “voyagers” has settled in Hackney, coinciding with the borough’s flourishing atmosphere and mixing with a working-class population well aware of the relative indigence of yesteryear. Ultimately, it would be difficult to deny that the regeneration and gentrification of Hackney has not only revitalised the borough but also lifted it from its ruinous past.

Over the last two decades, Hackney has made a considerable effort to purge its economic and social shortcomings. Beginning in the late 1980s, the Council set out to rid the borough of its “sink estates,” resulting in the demolition of Trowbridge, Clapton Park, Nightingale, Holly Street (where 80% of residents had applied for a transfer) and the Kingshold Estate. The Woodberry Down, Haggerston, Kings Crescent and Pembury Estates are also currently facing reconstruction. In their place, more traditional low-rise housing has appeared, along with a plethora of privately owned developments. Moreover, since 2006 the Council have invested over £184 million in renovating thousands of existing homes under the Decent Homes programme.

The closure of Hackney Downs School, Kingsland School and Homerton College of Technology, due to the below-par performance of their students and recurrent behavioural issues, stimulated the emergence of innovative academies, beginning with Mossbourne Academy in 2004. This, along with the Learning Trust’s oversight of education in Hackney, has led to the rebuilding of all secondary schools and constructional improvements to primary schools across the borough, decisively improving education: from 2006 to 2013, GCSE results (5 A*–C) increased from 50.9% to a staggering 79.6%. Combined with advancements in healthcare and transport – particularly the expansion of the London Overground following the completion of the East London Line – the fortunes of the borough have taken a considerable turn for the better. There have also been increased attempts by the Metropolitan Police (and more specifically Operation Trident) to tackle the prominence of gangs in Hackney after the riots of 2011.

Undoubtedly, recent trends in Hackney have not only statistically reduced crime and poverty rates across the borough, but, more importantly, have been effective in dissolving its entrenched adverse reputation. The Metro newspaper recently ranked Hackney as the second-best borough in London – remarkable considering that Hackney was once perceived as the worst place to live in Britain. Furthermore, the number of benefit claimants has fallen by 6% since 2006, employment has steadily risen over the same period to 63.7%, and deprivation across Hackney has fallen sharply, particularly in the Haggerston, Clissold and Lordship wards. These factors have intertwined with the area’s demographic shift from a low-income, impoverished community to a prosperous and blossoming place to reside and visit.

Whilst it would be difficult to suggest that the regeneration of Hackney can completely disguise the remnants of poverty in the borough, or prevent an ‘indigenous’ population from harbouring antagonism towards the appearance of wealthier newcomers, it has generally had a positive impact in rescuing the borough from further decline. How long will the social prosperity last? It is difficult to estimate; what is certain is that there is evidence to suggest that gentrification has preserved my place of origin from a return to the social horrors it was once accustomed to.