Having given up Facebook just before Christmas, I’ve found myself scrolling through Twitter more than usual lately. Yesterday evening was no exception. As I scanned through my newsfeed, a story by Katie Hopkins from the Mail Online caught my eye. The headline read: “‘No Trump, no KKK, no Fascist USA’. I’d Wager all 3 Unlikely to Listen to People in Need of a Job, a Shower or Both.” I could see that it related to the protests organised by Owen Jones on Monday evening, and, having attended the protest in Westminster myself, I was almost ready to use my 140 characters immediately. However, recognising this as an emotional rather than a rational response, I convinced myself that it would be better to read the full article first. I didn’t want to write anything that might later prove false. The opening sentences of the story dragged me straight down into the depths of “Katie’s world”: a world where post-truth is the real truth and her followers neither know, nor care, if she knew the real truth in the first place.
After reading the entire story, I still felt angry. Without giving it too much thought, I sent the following tweet:
My error wasn’t to have tweeted abuse at Katie Hopkins – lots of people do that. My mistake was to be a teacher who had been to a protest with a banner that was made by some students. Was my tweet unprofessional? Possibly. Is it offensive? Only directly to Katie, a woman who herself never hesitates to cause offence. Does it suggest that I was brainwashing children and forcing them to manufacture propaganda to impose my views upon other people? Absolutely not in the real world. But then I wasn’t in the real world. As soon as Katie reposted my tweet, claiming that ‘[she feared] for young minds, brainwashed by liberals pushing their agenda [aged] 8?’ my Twitter account went crazy. The notifications poured in. 50 notifications. 112 notifications. The number of notifications was increasing by the minute, and all of the posts appeared to be filled with hate. I was “Jackie the brainwasher Nazi” and accused of grooming my own “Hitler Youth”. Apparently I had never been “out of an educational institution & [I was] unable to see a distinction between teacher/student”.
It was really quite overwhelming and I genuinely started to panic. But, I reasoned, the worst case scenario would see me unemployed by Monday – something that the people tweeting at or about me were not only calling for, but also facilitating, by providing links directing people to exactly where they should report me: the Department for Education, Ofsted, and Justine Greening on Twitter. The abuse continued to escalate. I was informed that I was in “breach of Section 407 of the Education Act”. I had “broken the law.” It also turned nastier and more threatening. I was “another idiot teacher polluting the minds of the young”. This, I was told in the same tweet, “[was] abuse as serious as sexual or physical abuse.” One man posted that he would “come and kick [me] out the classroom if [I] taught his kids” and another agreed with him. A former teacher, who identifies herself in her Twitter biography as an “Islamophobe” who believes “Islam is evil”, told me that teachers like me were the reason the real teachers – like her – had retired. I was also surprised to learn that I am the reason a lady called Ana, whom I’ve never met, home-schools her child.
In addition to the abuse flung my way, the Twitter community began to flesh out the details of my indoctrination programme. It transpired that there was a lot more to the sentence I had tweeted than met the eye. A young lady stated that I had “abandoned the curriculum” and “brought in my own materials”. I had then used these materials to get the children to make protest banners for me. Yes, they were “Anti-Trump” posters, another tweet confirmed. That was instantly retweeted. In Katie’s world it became a “fact”. Katie’s followers kept retweeting these “facts”. Before long, UKIP party members started to retweet the material, Suzanne Evans among them. It was a textual version of Chinese whispers, in which the “facts” now circulating about my Year 8 class and me had no basis in reality.
Twitter is not a suitable platform to engage in a meaningful debate, so I’d like to take this opportunity to reflect on the best way to respond to some of the charges levelled at me. Not the personal attacks, which don’t warrant a response, but, rather, the ones in which non-teachers thought that they had the right to tell a teacher what – and how – it was acceptable to teach. Every one of the tweets that suggested that children should not be discussing President Trump’s actions could be discounted by referring to the Department for Education guidance booklet Promoting Fundamental British Values as part of SMSC in Schools. Schools, we are told, should develop in students “an acceptance that other people having different faiths or beliefs to oneself (or having none) should be accepted and tolerated, and should not be the cause of prejudicial or discriminatory behaviour.” Banning people because of their nationality or religion runs completely counter to this. Should a parent ever question why their child is learning about a particular topic, then a teacher can accurately state that this is what the government prescribes.
Other tweets seemed to suggest that it was beyond the remit of a teacher to teach about topics that were either deemed to be political in nature, or might be considered as current affairs. To put it bluntly, if we were to remove topics that were political in nature, then not much of the National Curriculum for History would remain. The introduction of the National Curriculum in 1991 marked a defining moment in education across England and Wales. Each revision to the history curriculum reduced the amount of prescribed content – the ‘what to teach’ element. In its current iteration, the only mandatory topic for study in the key stage three history curriculum is the Holocaust. This means that teachers have a greater level of autonomy in selecting what content to teach their classes across the rest of the key stage. Teachers will make decisions concerning what to teach based on a variety of factors, including, but not limited to, the resources they have available, the strength of their subject knowledge and passion for individual topics, what they have taught previously and the ability of their classes.
Students typically begin their study of history at secondary school, learning about the Middle Ages. The first time that they will really encounter political ideas is when they learn about the barons’ confrontation with King John at Runnymede in 1215. They’ll start to develop their substantive knowledge so that when they later learn about other examples of protest they will have some knowledge to ‘think with’, which they can then transfer and apply to different contexts. Students – with guidance from their teacher – are thus equipped to draw out the similarities and differences between political struggles in different eras. For example, one of my Year 8 classes has recently looked at the question of when Britain became a democracy, and while the enquiry is mainly based on events in the nineteenth century, they were also encouraged, and most were able, to make comparisons with the sixteenth century. Given that students are able to recognise these similarities across periods and to understand political problems in different eras, I would suggest that those who posted tweets saying that politics should not be taught in the classroom show a comprehensive failure of understanding in this respect. Is there, for example, a qualitative difference in the nature of the learning taking place when students design campaign propaganda for a contender to the throne of England after the passing of Edward the Confessor, compared with when they complete the same activity to suggest who would make the best leader of the Conservative Party in 1979? I would suggest not, but then I doubt that anyone has ever accused a teacher of trying to brainwash a class into electing Harald Hardrada instead of William of Normandy.
We would also do well to remember that school students are perfectly capable of forming and expressing their own political viewpoints, and indeed protesting government policy. The 1985 School Student Strike, which saw students take to the streets to voice their opposition to the Conservative government’s threat to make the exploitative Youth Training Scheme compulsory, is just one historical example of valid and effective student protest.
Finally, the question of making use of current news stories in the classroom can also be justified on the following grounds: one of the most effective ways to get students engaged in a historical topic is to make its contemporary relevance explicit to them. Sometimes the best place to start a history lesson is in the present – with a current news story – and then ask how we got here. A few of my Year 8 students went on the women’s march on 21st January. When they came into class the next week and we were learning about the Suffragettes, it all made far more sense to them. I will make no apologies for adopting exactly the same approach over the coming weeks as I teach about the Civil Rights Movement and we ask why protests demanding that we Stop Racism and affirming that Black Lives do Matter are still going on today. Is this brainwashing or political indoctrination? No, it’s just one way of introducing a topic to students.
Jackie Teale is a secondary level history teacher and doctoral student at Royal Holloway, University of London. Her thesis is supervised by Professor Dan Stone and focuses on the ways in which press photography has shaped public responses to genocide.
There has been some considerable debate in recent days about the public purse bearing the cost of refurbishing Buckingham Palace to the tune of £369 million while the rest of the country endures austerity, the NHS budget is trimmed and the number of homeless people rises. Unsurprisingly, an exploration of the historical relationships between Crown, finance and the public purse provides us with some valuable context for this discussion.
In the past, kings and queens exercised their majesty through a conspicuous display of wealth and power. The right to rule was represented and reinforced by the sheer opulence of the monarch’s dwelling places and possessions. Magnificent courts, sumptuous homes, golden carriages, the largest jewels, the finest horses, the most splendid paintings, were not just the trappings but the foundations of regal power. Wealth was the cause rather than a symptom of power. Historically there were intellectual dimensions to this material majesty. Kings were thought to be appointed by divine right, the keystone in a natural hierarchy celebrated in a culture of deference. Simply put, the king stood just beneath God in the natural order, and this exalted position was reflected by his extravagant wealth.
The same question is more difficult to address with clarity today as debates about heritage and national identity get mixed up with constitutional issues. Whether the current monarchy is a greater asset for the nation as part of the tourist industry, or simply because defending it seems to save us the time of thinking of an alternative, is still a debatable point. For some, even raising the question implies sedition, subversion and immorality. Certainly beyond those die-hard loyalists, besotted with the ineffable mystery of the crown, and firmly wedded to principles of deference reinforced by royal hierarchy, it is difficult to contrive a robust philosophical defence of the institution today. If we pose the same question in the historical sense – what were monarchs for? – it is perhaps easier to arrive at plausible and persuasive answers.
In the past kings and queens were warriors, symbols and enactors of military might, dispensers of justice, makers of law, and, very commonly, representatives of God on earth. From its foundations in the act of William I’s conquest, through to the Imperial majesty of Victoria, the English monarchy has acted as if it were the centre of political power. Competing with the Papacy and later the Church of England, the monarchy erected a powerful jurisdictional claim to be not only the source of morality but also the arbiter of true religion. Despite two revolutions in the seventeenth century (one, in 1649, which saw the most radical act of anti-monarchical reform in the decapitation of Charles I; the other, in 1689, a more decorous affair, but nevertheless clear evidence that kings and queens were reliant upon a broader political constituency than simply God), claims to divine right legitimacy have still not been discarded by ardent monarchists, even though the political constitution finally abolished the notion in 1701. It is quite clear that one cannot engage with the English past without considering the nature and power of the monarchy. The powerful material remnants of the institution lie all across the land: castles, forests, parks, and ancient and relatively modern place names. The Royal imprimatur can also be found on everything from caviar to toilet paper. Almost silently these monuments, buildings and spaces plead a royalist cause.
In the past apologists for monarchy adopted a number of justifications. Many of these were based on appropriating the most effective political document of the pre-modern world – the Bible – to the case for the defence. Arguments in favour of legitimacy included radical claims for political dominion based on conquest, the Biblical figure of Nimrod being a particular favourite. Others claimed, even as late as the seventeenth century, that since God had given all dominion over the world to Adam, and all kings were direct descendants of the first father, so they had supreme power. Although kings might be morally bound to govern in the reasonable interests of the community, the subject had no claim against arbitrary behaviour. The difference between absolute authority and arbitrary power was quite subtle. Despite a conceptual distinction between ‘tyranny’ and monarchy, many defenders of kingship furiously underscored the principles of both passive obedience (put up with whatever happens without complaint) and non-resistance (never, even in the most extreme circumstances, even imagine raising a finger against the king). This world was shattered in 1649 with as much cultural trauma as the attack upon the Twin Towers in New York. Killing the King was understood by contemporaries as a blasphemy equivalent to the sacrifice of Christ. English republicans have struggled with this legacy ever since.
The strongest claims for the monarchy appear to be those that invoke tradition, historical continuity and the sanctity of the ancient constitution. A thousand years of regal majesty, evident in the still robust and bewitching spectacle of the Royal funerals of Lady Diana and the Queen Mother, seems an almost unanswerable argument. We should remember that, despite invoking the sonorous authority of tradition, the past is as much a projection of present-centred aspirations, an invented tradition, as it is a persisting truth. Put another way, celebrating the past does not necessarily mean living in it. Turning to the past may however give us something with which to compare current institutions. By asking historical questions – What did kings and queens do in the past, and what was their function in the polity? How did subjects understand their duties and obligations to regal figures? Where did their authority to rule come from? – it might be possible to raise legitimate questions about the nature and function of the modern institution. However, the mere act of broaching these issues has commonly been dismissed as insolent and inappropriate mischief.
It has never been fashionable to be a republican in England, even in the heady days of the 1650s when the country was ruled by a Lord Protector in the name of the sovereignty of the people. The history of English republicanism, despite the persistent charges of conspiracy levelled against successive figures such as Oliver Cromwell, Thomas Paine, the Chartists and Willie Hamilton, has not been a lineage of subversive king-killers. In fact, English republicans have traditionally been more interested in making good citizens than in neutering extravagant monarchs. The persisting cultural memory of 1649, mixed in with nightmarish images of French sans-culottes, the guillotine and 1789, as well as twentieth-century Russian and Spanish revolutionary traditions, has always successfully tainted republicanism with regicide. This political stereotyping has its origins in an eighteenth-century tabloid creation known as the ‘Calves Head Club’ – a clandestine fraternity that gathered each January 30th to celebrate, in a drunken and impious manner, the execution of Charles the Martyr. The Calves Head men, almost certainly an invention of the fevered imaginations of loyal clergymen, became a powerful way of neutering any public political discussion of the rights, prerogatives and power of the monarchy. Any such discussion, it was implied, would lead inevitably to regicidal action. The same logic still applies today in many quarters. That it does provides simple proof of the enduring centrality of the institution of monarchy in England.
English republican thinking, even in its most radical years, was very rarely regicidal. Admittedly the great apologist of the English Republic, John Milton, defended the execution of Charles I with robust and comprehensive arguments. But the intellectual origins of the majority of these arguments were not necessarily ‘republican’. A defence of popular sovereignty (that the safety and common good of the community is more important than that of the monarch), and a description of both the rights and duties of resistance to illegal government, were central components of a ‘democratic’ theory of government to which we all subscribe today. Very few people realise that the trial of Charles I was as much a religious matter as a political one. Indeed, the Old Testament was as important a document in his condemnation as the writings of republicans.
Although kings were subject to restraint, it should be noted that one of the major targets of this republicanism was ‘tyranny’: corrupt courtiers, corrupt government, and self-interested and immoral ministers were the objects of hostility. Bad government, irrespective of institutional form, was tyrannous. In other words, one did not have to live under a monarchy to experience tyranny. The same is true today, when any of us can have republican values and aspirations without starting with the immediate issue of kingship. Republicanism in the British Isles since the 1650s has very often been more concerned with making good citizens than with punishing wicked kings. Indeed, for many thinkers and writers, the issue of monarchy after 1689 was a side show to the bigger business of eradicating inequality, deference and oppression. The distemper of deference, and the extravagance of the costs of supporting a monarchy, were sometimes perceived as contributing to a more general political oppression, but the focus of ambition was turned (and continues to turn) on providing the political institutions that cultivate an active, virtuous, tolerant and just community. The problem with monarchy was not personal, or even financial; rather, it was resisted and criticised because it was a symptom of an ancient constitution riddled with anachronistic prerogatives and privilege. Given the extraordinary year that is now slowly drawing to a close, we may do well to remember that, in this view, the Houses of Parliament are just as corrupted and ‘monarchical’ as the Royal Family.
The First World War is not quite ancient history, but it is very much part of the past. All those who fought in the war have now died and even those with hazy childhood memories of the conflict are very few in number. So the war is no longer part of living memory. Yet in the summer of 2014, the outbreak of this seminal clash of empires was commemorated, with varying degrees of enthusiasm, across Europe and the wider world. Since then, the centenary anniversaries of many of the major milestones of the war have been marked by national governments and local communities and received extraordinary levels of popular, political and media attention. In Britain, the well-established rituals of Remembrance Sunday and the 1st of July have been imbued with yet greater symbolic weight and the Imperial War Museum has been completely reinvigorated, its lavishly overhauled galleries now offering a flawed, but highly inventive and visceral impression of the British experience of the war. We’ve also seen the staging of some truly imaginative and moving commemorative projects, including the ‘Poppies in the Tower’ installation, which has become a traveling exhibition, and the extraordinary ‘we’re here because we’re here’ living memorial ‘unveiled’ to mark the centenary of the first day of the Battle of the Somme in July. Alongside these, there have been a myriad of smaller but no less affecting community projects, which have seen schoolchildren born in the new century remember the dead alongside senior citizens whose fathers fought in the war.
As we continue to journey through this period of intense commemorative activity, it’s worth emphasising just how unprecedented all of this is. In the history of the commemoration of conflict, nothing on the scale of what we have witnessed over the past three years has ever occurred before. The very early date at which the British government announced its intention to commemorate the centenaries and the sheer amount of government money being spent on commemoration were both quite new. As early as October 2012, almost two years before the centenaries formally began, David Cameron pledged to devote at least £50 million to commemorative projects, a substantial amount of money at a time when many communities in the UK were experiencing severe economic hardship. The very fact that the outbreak of the war was commemorated by several national governments in August 2014 was a very new departure indeed. Traditionally, of course, the start of the war has not been commemorated so this was a new, and not uncontroversial, move.
But why has the British government been so determined to demonstrate its commitment to commemorating an extraordinarily violent war that took place a hundred years ago? A key reason for the political interest in centenary commemoration is the fact that remembering the First World War remains central to British identity and popular culture. Indeed, arguably no other historical event retains the same emotional resonance for the British as the so-called ‘war to end all wars’, and the commemoration of the conflict is something many British people take very seriously. This emotional attachment to the war ensures that even those who know little about what occurred between 1914 and ’18 tend to have strong feelings about how the war should be interpreted and commemorated. We see evidence of this every year in the weeks before Remembrance Sunday, when there is always a certain amount of discussion about commemoration in the British media. There is nothing new in this, but the centenaries have really thrown both the positive and negative aspects of British commemorative culture into sharp relief.
In the year or so before the centenary commemorations ‘broke out’ in the summer of 2014, there was a particularly interesting and often heated public debate in Britain about the real meaning of the war and how it should be appropriately commemorated. The debate intensified with the publication of an article by the then British Secretary of State for Education, Michael Gove, in the Daily Mail on January 2nd that year. This all seems very distant now that we remember Gove for other things, but two years ago he was a rather controversial Secretary of State for Education seeking to promote a narrowly Anglo-centric version of British history. In the article, he was scathing about what he regarded as the apparently dominant left-wing version of the war which, in his view, portrayed the conflict as a ‘misbegotten shambles’ and thus denigrated the ‘patriotism, honour and courage’ of those who lost their lives in the conflict. His comments quickly met with robust and highly critical responses from his Labour counterpart, Tristram Hunt, and a range of other prominent commentators, including the historians Richard J. Evans and Antony Beevor. The Stop the War Coalition was also particularly critical of his views on the war.
The German press also weighed in, with Die Welt printing an article under the headline ‘Britischer Minister gibt Deutschen die Kriegsschuld’ (‘British minister blames the Germans for the war’) on 9 January. But Gove was also publicly defended by the then mayor of London, Boris Johnson, who went so far as to demand Hunt’s resignation and, in an article in The Telegraph, insisted that ‘Germany started the Great War, but the left can’t bear to say so.’ In brief, the Daily Mail article was a charmless combination of ignorance and self-righteousness, but it did revive a debate that reveals a great deal about British understandings of the First World War.
In very broad terms, the debate, which continues, divides those who, like Gove, regard the war as a bloody but necessary conflict in which British servicemen heroically achieved a great victory, and those for whom the war was little more than a futile exercise in mass slaughter. Most British people probably stand somewhere between these opposing views or, importantly, have no view. Yet the image of the First World War as the ultimate example of the futility of war, which was reinforced during the 50th anniversary of the conflict in the 1960s, remains very persistent in the UK. One striking feature of this emotive public discourse is that while there is a great deal of political, academic and popular disagreement on the historical interpretation of the war, everyone seems to agree that those who lost their lives between 1914 and ’18 should be remembered with reverence and respect. So, unlike in other states that experienced the conflict, notably Germany, in Britain literally no one publicly opposes the custom of remembering the war and those who died in it. Even groups that are very critical of official, government-driven acts of commemoration, such as the Stop the War Coalition and the No Glory in War Campaign, have never, to my knowledge, gone so far as to question the wisdom or morality of commemorating dead soldiers.
As an act of community remembrance, or a simple expression of solidarity with our ancestors, the commemoration of war is not necessarily political. The millions of British people who wear poppies every year in the weeks before Remembrance Sunday are not making political statements by so doing. Nor are they retrospectively endorsing or honouring the First World War, or any war since. What they are doing – at least on the face of it – is honouring the dead. And yet the intense and generally exclusive focus on the dead is perhaps the most obviously problematic aspect of British commemorative culture. An undivided emphasis on the sacrifice of those who died as a result of military service made sense in the 1920s and ’30s when millions of people across the globe were still suffering intense bereavement. We should also remember that, whatever we feel about the cause for which they were fighting, the dead gave everything they had; they made the ‘ultimate sacrifice’ in the parlance of the times, and this alone makes them worthy of our interest. The relative youth of those who died makes their stories all the more poignant, and we are understandably moved – and disturbed – by the power of industrialised warfare to cut short so many lives. Indeed, reflecting on the fate of the dead helps us to appreciate the catastrophe of inter-state conflict.
And yet a commemorative culture that focuses exclusively on the dead arguably overlooks the vast majority of people who were affected by the war. Approximately 750,000 British and Irish servicemen died as a result of service in the First World War. But many more, in both countries, technically survived the conflict but were physically disabled or psychologically traumatised by their experiences. As soon as the wounded began returning from the fighting fronts in 1914, and for decades after the war, severely disabled veterans were a common sight on the streets of European cities. Men who were psychologically traumatised were perhaps less visible but no less numerous than the physically disabled. Allied and German soldiers were remarkably resilient and cases of shell-shock, although common, were not as widespread as we might think. But tens of thousands of soldiers from across the British Empire were traumatised to the point that they simply couldn’t function in civil society and most veterans suffered some form of psychological distress. The pain of witnessing this mental torment – and the various, sometimes violent ways in which it manifested itself – must have weighed very heavily indeed on the families of those who returned from the front, physically intact but visibly altered by what they had seen and done.
Which brings us to those who played no direct role in the fighting but were nonetheless deeply affected by the violence that raged around them. Millions of families across Europe and the wider world suffered from profound and enduring grief when their sons, husbands, brothers and friends were killed or mortally wounded in the theatres of war. Death, even the sudden death of young people, is by no means unique to war but during the First World War the pain of bereavement was often particularly traumatic. Many of those in mourning knew little of the circumstances in which their loved ones had died, and were denied the consolations of a funeral or a grave on which they could focus their feelings of loss. In a move that was unique to the British Empire, the grieving relatives of the dead that lay in identified graves were granted the right to pay for a customised epitaph. These personal inscriptions give us an extraordinary insight into the mentalities of those who neither fought nor died but on whom the violence of the war had a very direct and lasting impact. Indeed, the often moving personal messages from grief-stricken relatives remind us that every headstone we see today in a Commonwealth cemetery marks the grave of a dead serviceman, but it also represents a lost son or husband or brother and a family in mourning.
Many of us feel – on some level – that the bereaved, the disabled and the traumatised are central to the story of the Great War, and are as deserving of our attention as ‘the fallen’. And yet the popular and official language of 21st century commemoration rarely alludes to the plight of those who were left behind and there is little room for them in our official commemorative ceremonies. To its great credit, the Royal British Legion began incorporating disabled servicemen into its ad campaigns in 2013 when Lance-Corporal Cassidy Little, a Royal Marine who lost his leg while serving in Afghanistan, featured in the poppy appeal campaign that year. The sight of a soldier who had very clearly lost a limb on posters throughout the UK reminded us that war and those mutilated in combat remain a part of British life in the 21st century. Other organisations and individuals have also drawn our attention to veterans who have suffered life-changing injuries, including Prince Harry, who helped establish the Invictus Games in 2014. And yet no disabled servicemen were formally incorporated into any of the major centenary commemorative events that have thus far taken place. In a more historical but equally glaring omission, the hundreds of thousands of disabled soldiers who were demobilised during and after the First World War are simply never mentioned in the national discussion on commemoration.
Added to this rather one-dimensional retrospective understanding of wartime sacrifice is a superficial valorisation of all those who served. While it would be wrong to suggest that British commemorative culture glorifies war, it does arguably venerate dying in war to the point of glorification. The focal point of the official Remembrance Sunday ceremonies each year is Edwin Lutyens’ great Cenotaph in Whitehall, on the side of which are inscribed the words ‘The Glorious Dead’. During the war and in the decades that followed the Armistice, it was important for those in mourning to believe in the righteousness of the Allied cause. The belief that their son or husband or brother hadn’t died in vain, that the cause for which he had given his life was glorious, clearly gave comfort to many of those in mourning across the Empire. Indeed, our ancestors’ interpretation of the war as a just, necessary and even glorious conflict is quite understandable. The war was an extremely personal business for those who experienced it, either as civilians or servicemen, and they were very emotionally invested in it. That same interpretation should, however, be a lot less understandable today.
Expressing solidarity with our ancestors and empathising with their pain is perfectly human, and even healthy. For cultural historians who explore the mentalities and emotions of past generations it’s also a professional skill. But a completely uncritical retrospective understanding of soldiers who fought in this extraordinarily bloody conflict is problematic because it tends to suggest that all the morality during the First World War was on one side. This widespread view was reflected in media commentary on the war throughout 2014, in which historians and others highlighted the German violation of Belgian neutrality and the mass killing of unarmed civilians by members of the German armed forces. Such mass killings of civilians definitely occurred during the invasion of Belgium and northern France in 1914 and in the U-boat and airship raids that began in 1915 and continued for much of the war. As uncomfortable as such historical incidents may be politically for German leaders, and indeed for ordinary German people, it is important that we become familiar with them if we want to understand the ‘total’ nature of the war. That the war was sold to and understood by the people of these islands as a righteous endeavour is key to understanding the British and Irish experience of the conflict. And German atrocities – both those that were fabricated and those that were all too real – were central to the process of cultural mobilisation across the United Kingdom in 1914 and ’15.
It is important, however, that we acknowledge that the armed forces of the Allied states were also directly involved in the taking of civilian lives between 1914 and ’18. The most obvious example of this was the Allied naval blockade of the German coast, which was orchestrated by the Royal Navy and thus very much a British enterprise. The North Sea was formally declared a ‘British military area’ in November 1914 and the blockade essentially lasted from then until German representatives signed the Treaty of Versailles in June 1919 (fully 7 months after the Armistice). By mid-1915, German imports had fallen by 55 per cent from pre-war levels, leading to major fuel shortages but also to serious and increasing scarcity of vital foodstuffs. By 1917, disorders related to malnutrition, including scurvy, tuberculosis and dysentery, were common across Germany. Official statistics gave a figure of approximately 760,000 for those who died of malnutrition as a result of the blockade.
The blockade was certainly a factor in German capitulation in 1918 and the outcome of the First World War. It also significantly influenced the experience of the war for ordinary Germans and evidently led to the deaths of many thousands of civilian men, women and children. Yet in all the commentary on the war that has pervaded the British media since the beginning of 2014, I do not recall a single reference to the Royal Navy’s role in the taking of German lives. Nor has there been any mention of the hundreds of unarmed civilians who were killed by British servicemen – invariably veterans of the Great War – in the immediate aftermath of the war in India, Mesopotamia and Ireland.
I do not raise this issue of British complicity in the killing of civilians to in any way denigrate the conduct of British soldiers on the Western Front or at Gallipoli or elsewhere during the First World War. Many British officers and men, and indeed soldiers from all sides, served in a plainly self-sacrificing and often heroic fashion, and we are understandably impressed by their stories. But there is a persistent popular belief in the UK that the British soldiers of the Great War were victims rather than perpetrators of violence – and not both – and that they are morally beyond reproach. If we genuinely want the centenaries to become a moment in which we improve our understanding of the ‘war to end all wars’, this should ideally change.
Finally, there is a striking and increasingly jarring contrast between the great dignity and reverence with which most British people remember the war dead and the tone of the national conversation about commemoration. Over the past number of years, press commentary during ‘remembrance season’ has become ever more antagonistic and we’ve seen the emergence of the particularly unpleasant practice of calling television personalities and other public figures to account for not wearing the poppy. Jon Snow is a well-known example, but there have been others. This year, the Daily Mail has led a campaign to force the FA to go against FIFA and allow English and Scottish footballers to wear poppies emblazoned on their shirts when they play against each other on Armistice Day. ‘Poppy War’ screams the front-page headline. At the risk of stating the obvious, this press-manufactured spat over football jerseys is not a war. It is nothing remotely like a war. And using shrill, deliberately exaggerated language of this kind arguably does a great disservice to the memory of the men who fought and died on the Western Front and elsewhere. Historians still disagree about the motivations that led so many men to enlist or seek commissions in the British Army in the first year or so of the war. We can be quite sure, however, that whatever they thought they were fighting for, it wasn’t self-righteousness, invented indignation or gutter jingoism.
The British have every reason to be proud of their highly distinctive culture of commemoration. The sincerity and dignity with which most British people – irrespective of class, or status, or race – remember the dead each year is truly impressive and admirable (particularly to a foreigner). But if we genuinely believe that the dead of the First World War are worth commemorating, we should seek not simply to remember them, but to understand them. We should thus take time to reflect upon the totality of their experiences, to think of those they left behind, and to appreciate the remarkable colour and complexity of the world in which they lived and died.
Edward Madigan is Lecturer in Public History and First World War Studies at Royal Holloway, University of London
The autumn is upon us. And Poldark is back! The images of the beautiful Cornish coast around Treen, Porthcurno, and St Michael’s Mount are a welcome visitor to the screen as the grimy dark nights draw in. The television series, reborn from the novels of Winston Graham and the earlier screen adaptations of the mid-1970s, continues to attract considerable attention from the general public and historians alike.
Recent posts on this blog from Sarah Crook and Graham Smith have raised some very interesting questions about gendered perceptions of public history, both in popular books and on television. Following on from this commentary, I’d like to consider the ways in which historical writing and research have inflected the production and reception of BBC’s very successful fictional history of eighteenth century Cornwall.
Hannah Greig, historical advisor on Poldark, and Greg Jenner, of Horrible Histories fame, have already offered some very insightful views regarding the role of the historical advisor in contributing to the ‘accuracy’ of fictional representations of the past. For both Hannah and Greg, the priority of a television drama is precisely that – dramatic structures must take priority: ‘drama is there to entertain us. Dramatists are there to spellbind us, to make us laugh and cry and fear for our favourite characters’. Hannah confirms and explains: ‘the most important thing is to have great story. That has to be the priority. A historical adviser can help to drive that story forward, informed by what we know about the past’. Historians are thus not invited to ‘determine what that story is’, but to inform the ‘look’ and conduct of the action. And yet I feel we should at least consider the idea that complementing the programmes with further historical context might make the drama all the more compelling and resonant.
The historical preparation for the portrayal of mid- to late-eighteenth century rural, coastal and town life on Poldark has been meticulous: the details of dress, commerce, urban sociability and gentry etiquette have all been scrutinised by the learned and the expert. Whether a pasty contained rabbit or deer might depend on the local ecology or the skill of the poachers, and the details of costume, deportment or the pistols and rigging are useful markers of historical ‘accuracy’, if not necessarily carriers of truth. In strictly dramatic terms, the narratives in Poldark are compelling, blending the personal, the emotional and the political in a very challenging and provocative way. Over the course of successive Sunday evenings the heroes and villains, the scoundrels and the indigent, encounter each other in a variety of social, institutional, cultural and legal settings. These encounters also expose the deeper seams of eighteenth century life: the rule and administration of law by local elites, the impact of commerce on the routines of customary economic practices and the complexity of popular and parliamentary politics. Yet while much has been made of the visual reconstruction and the marvellous acting, the more profound themes of gender inequality, class war, the ‘old corruption’ of public politics before the days of the secret ballot, and the abject poverty of rural labour have not been teased out in the reviews, although they are, in effect, the sinews of the power of the narrative which keep us engaged.
As those familiar with the early-modern social history of ideas and crime may have realised, the narrative of Poldark conveys very powerfully one of the key insights of Edward Thompson’s work: that although the rule of law in the eighteenth century was contrived to protect property, it was also bound by its own authority. The role of the jury in freeing Ross Poldark from the noose, for example, represents the significance of the tradition of trial by jury in the administration of justice enshrined in the birthrights of freeborn men and women. The freedoms of the radical John Wilkes were preserved by this same process during his defence of liberty in the 1760s, when Middlesex juries repeatedly protected him from conviction. The current episodes of Poldark engage with the histories of the complex processes of social mobility which drove, and were driven by, the marriage market; the crises of familial relationships that shaped reputation and authority; the dangers of gambling and the financial markets, and the hard grind of the everyday lives of ordinary people.
Although the original novels were written in the immediate post-war contexts of the mid and later 1940s, they have been made more directly historically interesting by the growth of social history in the 1970s. The ages of Walpole, and then the Pitts, elder and younger, were not simply made up of stories of meticulously landscaped county houses, glittering society balls and the routines of polite culture: they were times of revolution and turbulent class struggle. The American wars of independence saw a great diffusion of radical commonwealth ideas across the Atlantic. At home the popular resistance manifest in the campaigns of John Wilkes for the liberties of the freeborn English, and later in the French revolution, offered radical opportunities for protest and freedom in Europe, including in Ireland during the bloody United Irish rebellion of 1798, and for the ‘Black Jacobins’ of Haiti (see C. L. R. James’ powerful study of Toussaint L’Ouverture). In Britain, ‘riots’ prompted by political ideology or economic desperation reflected the increasing dominance of ‘King Property’, and the progressively rapid destruction of what the great historian Edward Thompson called ‘customs in common’. Labourers, artisans and skilled workers – both men and women – saw the traditional means of regulating their working hours, and providing for their families, constrained and disrupted by the demands of the market and the ever more powerful coercive legal code which Douglas Hay referred to as ‘Albion’s fatal tree’. Smuggling, poaching, and wrecking were all subjected to criminal codes of brutal savagery.
Poldark addresses many of these themes in the social history of crime and society explored in the great and formative works of historians like Edward Thompson (Whigs and Hunters, 1975), and the collection of essays exploring the lives and deaths of labourers and city workers (Albion’s Fatal Tree, 1975). Marcus Rediker wrote a wonderful book on the Atlantic world of pirates and seamen some thirty years ago (Between the Devil and the Deep Blue Sea, 1987), while Peter Linebaugh’s The London Hanged (1991) explores the lives of those who were victims of Tyburn in the struggle between rich and poor. The criminalisation of what were regarded as customary rights, in the name of defending property and order, is the backcloth to the struggle of Poldark and his friends. Commerce and maritime innovation may have brought new commodities to the banqueting tables of the gentry, but they also destroyed the system of regular employment which enabled the poor and labouring to survive by helping themselves to reasonable benefits of their labour (known then as perquisites, or in our modern world ‘perks’). Exploring these histories will make the viewing of the series even more exciting.
For those interested in the histories of smuggling, poaching and the highwayman there is an alternative fictional series which seems to have been forgotten. The ‘Dr Syn’ novels of Russell Thorndike, written during and after the First World War, and set in the smuggling culture of Romney Marsh in Kent and Sussex, combined smuggling, piracy and politics. Thorndike’s novels travel widely, involving Caribbean characters, pirates and American revolutionaries: this might provide a much more diverse palette for the modern viewer. These novels have been serialised on the radio (read by none other than Rufus Sewell, who recently starred in Victoria), and indeed were turned into a series of graphic novels and reasonably gentle Disney films. There was a ‘Carry On’ version in the 1970s, the great Led Zeppelin recorded a song, ‘No Quarter’, drawn from the stories, while ‘The Day of Syn’ is a festival held in the town of Dymchurch to fund-raise for local community activity. A modern script-writer might work with the novels, but also explore them alongside the new Atlantic history inspired by the landmark histories of Peter Linebaugh and Marcus Rediker, whose work, The Many-Headed Hydra: The Hidden History of the Revolutionary Atlantic (2000), explores the communities of slaves, commoners and sailors who resisted economic and social oppression from elites and mercantile interests from the West Indies, Africa and North America. Weaving those ‘real’ narratives into a fictional narrative would be a great challenge, but it would also produce a very attractive and diverse series that allowed different voices and characters to perform in mainstream viewing. Perhaps an enterprising commissioner at the BBC or Channel 4 will explore the possibilities of creating a further series? Let’s hope so.
Sexed-up television histories, it seems, are just for girls. Histories, that is, that embellish and dwell on human relationships, that exalt the tactile thrill of the inadvertent touch, that are attentive to the colour a frisson of desire can add to the way we tell stories. Or so a recent article in The Spectator by James Delingpole would have us believe. The article, which was subsequently amended to remove some of its more outlandish claims, put forward some quite brazen generalisations about how men and women approach the past. ‘Boys’, the author postulates, ‘being of a more trainspotterish disposition’ would be more critical of the recent ITV series Victoria, for they are ‘more jealous of their facts and period detail’.
The critical response to the unamended article from historians on Twitter was swift and brutal. This was followed by a more measured line of argument: Delingpole is a provocateur, it was claimed, and the academic community should not engage with his trolling. But as other historians pointed out, Delingpole’s view that women’s interests are less intellectually rigorous and factually oriented than men’s is less unusual than we might hope. My own view, and that of others, is that we have a responsibility to attack sexism as and where we find it. Beyond this, the article raises two issues for historians interested in the public representation of the past. First, and perhaps less controversially, that the purpose of popular portrayals of prominent figures is to inform as much as to intrigue and entertain. Second, that women are driving the ‘MillsandBoonification of history’ while men are the dispassionate stewards of historical fact.
It is impossible to lay the sexism of the article to one side. The dichotomy it draws between men and women spills over from their representation of history into the representation of its characters. Rufus Sewell ‘smoulders so tastefully’ as Lord Melbourne (he acts) while Jenna Coleman ‘looks gorgeous’ as Victoria (she exists). Moreover, a collective sigh surely arose at the declaration that the author ‘blamed the ongoing feminisation of culture’ for the direction of the series. This feminisation, the author suggests, drives an ‘irresponsible’ history. But if we scale out from the article to examine the landscape of historical dramas more broadly, can we really say that this attentiveness to desire and romance is a peculiarly feminine trait? Are women responsible for driving men off the sofa on a Sunday night? The author makes it clear that they are – and not to bed with a serious historical tome, either – rather, he argues, men are driven off the sofa by romantic dramas to ‘cavort with rent boys’.
The briefest of journeys through films situated in the (sometimes mythical) past suggests that men happily sex up their representations: Troy (dir. Wolfgang Petersen); Pearl Harbor (dir. Michael Bay); Braveheart (dir. Mel Gibson). Male directors and writers evidently find it just as easy to elaborate, extemporise and appeal to emotion. When men take the reins they are inclined to emphasise the human relationships that underpin events regardless of whether they envisage a male or female audience. In dramas emotions often do the work of explaining complex historical convergences. Does it matter if these desires are fictitious, that it is unlikely that Queen Victoria and Lord Melbourne ever exchanged looks charged with sexual intent? While this might grate for historians (we are, after all, ‘for History’), we must first be wary of doing a disservice to viewers by infantilising them as an uncritical public.
My own view is that the best historical dramas do inform as they entertain. As historians are only too aware, and at the risk of being platitudinous, the truth of history is often more scandalous and more intriguing than dramas allow. It is a shame that Victoria marginalises the genuine political and social tensions in favour of a fabricated romance. But laying the blame for this at women’s feet is simply laughable to today’s historians. Rather, Delingpole’s piece brings to mind the anxiety raised by novel reading in the eighteenth century or the consternation over the proliferation of girls’ magazines in early twentieth century Britain. A culture that indulges fictive representations of lust is often considered risky and threatening. As for his claim that ‘mostly men … value history’? Just imagine how he’d react if he found out that some women not only value it, we also teach it.
Sarah Crook is the Cox Fellow in History at New College, Oxford. She completed her PhD on mothers and depression in post-war Britain at Queen Mary, University of London in September 2016.
Just over a week ago The Guardian published an interview with Rebecca Rideal, whose narrative history 1666: Plague, War and Hellfire has just been published. The interview provoked a number of historians to tweet criticism of Rideal, a PhD student and former TV producer who founded The History Vault. Her assertion that ‘The time of the grand histories that are all about male figures is coming to an end’ seems to have touched a particularly raw nerve. The common complaint was that Rideal had failed to acknowledge that the fight against great men histories had been waged for over three decades.
I have some sympathy with these grumblings. Back in 1982, I returned from completing an MA in Social History at Essex to my first university armed with a poster for Leonore Davidoff’s course. I was just pinning it to a noticeboard when the department’s senior professor of economic history spotted me and declared, ‘Women in History, Graham? Whatever next?’
However, as others have pointed out, the fact that the struggle to go beyond hegemonic discourses continues suggests that winning once is not enough. My belief is that evidence of a new generation reinventing ways of taking up that fight should be a cause for celebration rather than condemnation. As tends to happen on Twitter, battle-lines were drawn, allies and enemies were quickly made and exchanges sharpened after those initial criticisms of Rideal. On one side were historians who clearly identified with Rideal, especially those aiming to make a living from producing popular histories. On the other, for the most part, were historians working in universities, some of whom began to question whether Rideal was even qualified to write early modern history.
Spiralling sub-fights, with supporters weighing in from various camps, fed a debate that became increasingly acrimonious. There was also the usual Twitter-induced comic confusion – it is not always clear who is responding to what strand as arguments fork, overlap, separate and loop. Nevertheless, things were very evidently turning nasty.
Responses – public history
A common response during the course of the spat and afterwards has been to present the ‘history profession’ as broad enough to encompass both those working inside and outside of universities. Such claims were underpinned in most cases with the argument that Rideal is engaged in ‘public history’. Leaving aside the rather odd formulation of the ‘history profession’ with its Rankean pretensions, intellectual insecurities and constant discipline making, patching things up with another poorly conceived label seems like an inadequate way to proceed. Instead, moving the debate forward will require genuine reflection on the nature of ‘history’ as a profession; otherwise we will continue to periodically descend into bickering and trading insults.
One difficulty amongst historians in Britain is that public history is not as well developed or understood here as it is elsewhere, especially in North America and Australasia. We tend to talk about public history as history that is produced outside of university departments; an activity, such as a television history. Or sometimes we stretch this base definition to include public history as impact, especially the influence of historical policy research translated for the consumption of publics or politicians. But the roots of public history are older and the acrimony of the recent Twitter battle reminds me of a wider war.
In those very early and heady years of the 1980s, I had left Stirling University to learn about oral history at Essex. At the time, oral history was despised by ‘professional historians’, rather than generally misunderstood or dismissed as is the case now. The economic historian I referred to above taught me in my final undergraduate year and, on being approached for a reference, recommended I should continue to study with him. By so doing, he insisted, I would be able to take, ‘A panoramic view of the past, rather than going down in the dirt with the yokels’. My response to such unashamed elitism was to attend his final seminars dressed in a top hat and frock coat bought from the local Oxfam.
These days, the battles within ‘the profession’ are mainly over resources and too often fuelled by egotism. With its proponents organised into warring tribes according to the periods and places they study, or corralled into sub-disciplinary groupings, History is fractious even within the academy. In all of this sound and fury, and despite constant internal sniping, the discipline has been traditionally slow to innovate and much of the sparring is about maintaining rather than extending boundaries. It is worth noting, for example, that those pioneering courses in women’s history and oral history at Essex were taught in the Sociology Department. While members of other disciplines frequently offer support for new ideas, historians – too often operating as lone scholars – revel in knocking lumps out of one another, reserving particular spite for those who try to innovate. The result is that in open competition for resources, most obviously for research grant income or in the formation of mutually beneficial research partnerships, historians do not achieve the same results as, say, political scientists or human geographers. Nor are we as prepared to look after our researchers or early career colleagues as would be the case in economics or sociology.
So what can public history offer? In answering that question I’m alluding to historians who actively research and publish as reflective public historians and are not only making up numbers in the history commentariat. Drawing on the work of early oral history, at least some public historians have developed a greater sense of working in partnership, and have come to genuinely appreciate the notion of ‘shared authority’. This at its most basic is the recognition of different forms of expertise and was developed in response to the simple question of who were the authors of oral history interviews. Was it the (oral) historians conducting the interview? Or the interviewees who were specialists in their own lives and always much more (otherwise why bother interviewing them)? Or both?
It cannot be beyond the capability of historians, irrespective of where they work or what they work on, to collaborate on projects in a spirit of shared authority. While rigour in handling evidence, of broader interpretation and writing should be upheld, there is much to be gained by recognising that we may all be engaged in a common project that goes beyond individual conceptualisations or where we work. Just recognising that connecting with members of the public involves a different skill set, and that the ways in which we communicate should become the subject of historical study, would be a major step forward. Even more pressing is the need for greater recognition that large numbers of people, especially in Britain, are often deeply invested, passionate and knowledgeable about history. The notion that ‘we’, whether ‘we’ are in community or academy settings, are the arbiters or the sole traders of the past is pure delusion. The idea that there is still a great deal to do within our imagined profession, even after a peace treaty is declared, should keep us all busy and out of Twitter trouble.
Political scientists are already mining Twitter for research, most notably on its use in revolutionary situations. One recent study has pointed to the significance of Twitter as a means of ‘collective sense making’ during times of instability. It will be interesting to see what historians make of Twitter in the future. As an echo-chamber for congratulatory thought collectives or as a means to conduct acrimonious debate, the 140-character-a-time medium will offer rich evidence of the historiographical making and unmaking of ‘us’ and ‘other’.
Oh, O., C. Eom and H. R. Rao, ‘Role of Social Media in Social Change: An Analysis of Collective Sense Making During the 2011 Egypt Revolution’, Information Systems Research, vol. 26, no. 1, pp. 210–223, 2015.
The recent photos in the media showing armed police apparently forcing a Muslim woman wearing a burkini on a French beach to remove it, or alternatively some of her outer clothing, in public, and then seemingly fining her, highlight beautifully the challenges facing historians in a post-modern historical world. What the ‘facts’ of the matter really are is no longer relevant. It is what we believe to be happening that counts, and so it is our interpretation of those facts that matters. Whether or not it was really a dreaded burkini (an outfit “not respecting good morals and secularism”) – at best, unhygienic, or at worst, to quote the French Prime Minister, part of the “enslavement” of (Muslim) women, this episode underlines yet again how central Muslim women’s bodies are to wider questions of identity, community and ‘modernity’. For the last couple of centuries Muslim women have been under close scrutiny in terms of what they wear, or do not wear. Their sartorial choices have not been individual choices. Rather they are so often the litmus test for ideas about progress versus non-progress, however these two terms might be understood. Interestingly, back in the 1850s, when the US activist Amelia Jenks Bloomer pioneered the wearing of the loose pantaloons that came to bear her name, western women followed the example of so-called eastern (Muslim) women and adopted ‘the Turkish dress’ in order to liberate themselves from the restrictive clothing – complete with bone-crunching corsets – that dominated at that time. Of course, bloomers in due course retreated to the private world of Western women’s underwear. Burkinis, like other supposedly threatening forms of covering worn by twenty-first century Muslim women, have to be seen, like the women who wear them, in public. Surely their public presence is a good thing?
See here for an informed discussion of when and why items of clothing have caused political storms.
Sarah Ansari is Professor of South Asian History and head of the Department of History at Royal Holloway, University of London.
In the spring of 1848 Europeans rose up on the streets of dozens of continental cities from Budapest to Paris and Berlin to Milan. Their demands were disparate and sometimes contradictory: free assembly, representative government, national self-determination, economic reform and much else besides. After initial successes, many of these revolutions followed a common pattern: reform, violence, division, repression and ultimately failure. By the following year constitutions had been torn up, newspapers banned and activists exiled.
After Brexit it is not the revolutions of 1848 that should capture our attention but rather what came next. As Chris Clark has shown, the 1850s witnessed a collective European experiment in which ruling elites mixed politics in new combinations in an attempt to respond to and address the instability that had produced revolutionary situations across Europe. The regimes that took over after 1848 did not simply carry out a conservative ‘reaction’, shutting down elections and newspapers. Innovative and expansive, they sought to provide the sort of order that would reassure threatened elites while also alleviating the social tensions that had made politics so dangerously volatile.
Importantly, governments in the affected countries began adopting a newly assertive role in economic and public life. The Spanish government laid telegraph cables to the Balearics, Central European states built railways and drove tunnels through the Alps. Many regimes also experimented with novel techniques for managing public opinion through the press and measuring social problems by gathering statistics.
France’s Emperor Napoleon III was emblematic of this trend: first elected as President Louis-Napoleon under universal suffrage in the wake of 1848, his longer-term survival relied on his ability to build a coalition of interests that represented apparently contradictory drives. His governments appeased big financiers and banks while granting workers the right to strike, renovated Parisian slums while expelling many of their residents, and ruled by plebiscites held under universal suffrage while all but outlawing conventional party politics.
We may not have seen a revolution in the past two months, but we are certainly living through another age of trans-European political disillusionment and ideological remixing. Brexit is but one symptom of the contradictions that are rapidly dismantling assumptions about the politics-as-usual status quo. Political forces across Europe and beyond are engaged in a struggle to fuse the popular appeal of protectionism and nativism with the interconnection and cosmopolitanism they see as intrinsic to modern societies and essential to their economies. At the same time, they seek to reconcile these divergent impulses with the distinctive features of their national political cultures – secularism and the republic in France, the union in Britain, and so on.
Yet just because these forces look irreconcilable does not mean that the Louis-Napoleons and Bismarcks of our age will not find ways to fuse them together, however delicately.
Faced with these challenges and a hostile public in much of England and Wales, pro-European Brits might be tempted to throw in the political towel altogether. But as writers such as Flaubert and Marx recognised from very different political perspectives back in the mid-nineteenth century, there is no ‘elsewhere’ outside history to which one can flee. Equally, while Britain’s role in the specific political institution of the European Union is now coming to an end, our implication in common European historical processes is, if anything, becoming even clearer.
On 26th June I tweeted:
No government, economic stagnation, anti-immigrant populism, political fragmentation: my fellow Brits, today at last we are true Europeans!
I was only half-joking. We may soon cease operating within some of the legal, administrative and economic channels with which we have become familiar, but there is no escaping the broader structural and cultural bonds between Britain and the rest of the continent. Only through sharing ideas with our neighbours will we be able to develop responses to the centrifugal forces of our age that can challenge the populist and quasi-democratic solutions on offer to European publics.
In recent decades Britain’s pro-Europeans did not always bang the drum loudly enough about the benefits of certain forms of collaboration, integration and exchange with our neighbours. We are now entering a new era, and we need a new drum.
The Brexit referendum was about something far bigger than Britain’s political and economic relationship with the rest of Europe. School pupils and university students spontaneously broke into tears on the morning the results came in, seeing their future life prospects destroyed. At the same time, people who look or sound different were told by triumphant leavers to ‘pack your bags and go back where you came from’, across the country and without any apparent coordination or official political backing. Such happenings are ominous, and they become more ominous still if serious incidents such as the murder of an MP as a perceived ‘traitor’ to the nation are factored in. Many of us historians have developed a special sense for such moments because we are trained to connect the dots intuitively and imaginatively. We have seen similar outbursts of collective emotion in the past and know what they can harbour – situations like 1789 in France, 1947 in India, 1990 in Yugoslavia.
There has been a widespread sense of disquiet about the state of the world for some years now. The ability to visualize a better future has never in living memory seemed so remote. It feels as if Francis Fukuyama’s much maligned ‘end of history’ has been stripped of its messianic optimism and then never gone away. A dark cloud of ‘there is no alternative’ has been hanging over us. There have been global pandemics like swine flu or Ebola, environmental disasters and unusual natural events, and the diffuse threat of Islamic terrorism has combined with a sense of economic crisis to produce a generalized climate of fear and foreboding. Yet until Brexit struck, this sense of impending doom still seemed to be somewhat intangible; perceptible below the surface but not powerful enough to disrupt the order of everyday life. Now, it feels as if things are at last kicking off for real. One historian on Twitter began to wonder half-jokingly whether people a century hence would speak of the ‘generalized crisis of the early 21st century’, others whether the year 2016 would be remembered as the date when the dissolution of the world as we know it began in earnest.
In times like these, history can be a great consoler. By standing back and contemplating larger connections and storylines, the febrile mind can at last find a grip, a resting place that offers some sense of ‘taking back control’. But such consolation should come with a health warning. Getting a grip is not the same as optimism, let alone offering a workable vision for political action. Some of the best long term analyses have been driven by the experience of defeat. Think of Fernand Braudel, who discovered the agency of geographic features over the longue durée when incarcerated in a German prisoner of war camp. Or of Antonio Gramsci who wrote his exceptionally perceptive interpretations of history from a fascist prison cell. It is Gramsci’s ‘pessimism of the intellect’ rather than his ‘optimism of the will’ that colours the way we see the world today. Making sense of Brexit within a larger historical framework is like staring into the abyss in order to make one’s fears more manageable, an exorcism by anticipation, perhaps.
The books that I felt most compelled to revisit in response to the Brexit crisis all deal with the historical sociology of capitalism. I had come across some of this material as a student in the early and mid-1990s but had, until recently, lost sight of it as I explored other intellectual territories. Most of it is of Marxist provenance broadly construed – more precisely of North American Marxist provenance, where big-picture analyses of the global economic system have received particularly careful attention. Relevant names include (among others) Immanuel Wallerstein, Saskia Sassen, David Harvey, Royal Holloway’s own Sandra Halperin, and, somewhat peripheral to this tradition, the German historian of capitalist crises, Robert Kurz.
There are several reasons why this body of literature seemed to be particularly appealing when trying to make sense of Brexit. In the first instance, the referendum has been accompanied by the ongoing self-destruction of the British Labour Party, and a general reassessment of left theory and practice going back to first base seemed appropriate and pressing for the moment. Before we can even argue about what kind of politics we now need, we need to know where we stand in terms of big-picture stuff. In addition, the emotional flavour of this historical sociology chimed with the post-Brexit blues. The authors involved were all more or less shaped by the experience of belonging to an intellectual tradition – Marxism – which had not the slightest chance of wider political relevance where they lived or worked, the USA. But they carried on writing regardless, and with a heightened sense that at least intellectually they could defeat an otherwise overwhelming system. And, perhaps most importantly, there seemed to be an immediate fit with the empirical evidence. The map of how Britain voted over Brexit – with ‘remain’ areas coloured yellow and ‘leave’ areas coloured blue – was an almost perfect illustration of some key arguments that had been made in this body of scholarly writing.
The historical sociology of global capitalism, then, is the grand narrative which can help us situate Brexit, its causes and consequences. It allows us to read the referendum result as an outward sign – local and specific to the UK – of a much wider structural contradiction which is currently transforming the world as we know it. What is at stake is enormous, and holds the frightening but real possibility that our future may be a good deal less democratic than our present.
Let us begin with a key contention: over its long historical formation, capitalism has only occasionally and in very particular locations marched in step with the two other trademark institutions of modernity, the nation state and democracy. For most of the modern period, capitalism has produced economic and political geographies that cut across the nation state in various ways, and relied on means of political organization that involved some degree of authoritarianism and coercion.
This is immediately evident when we take a global view of capitalism in its formative period from the 17th to the 19th centuries. This was never just a story of an industrial (or as some would have it, an ‘industrious’) revolution in one particular territory such as Britain, operating in tandem with the creation of new citizens equal before the law, and their gradual incorporation in political decision making. There was always another side to it: first, what Marx originally called ‘primitive accumulation’, the forcible appropriation of peoples, places and goods in the first rush for capital accumulation. It was exemplified by the enclosure of common lands, the infamous highland clearances or the robber-baron colonialism of the East India Company. Then came the no less brutal but more systematic disempowerment of the majority of the world’s population on racial grounds under European empires. In some places, and for three centuries or more, this included capitalist slavery, the most coercive system of labour management imaginable. The link between authoritarianism and capitalism did not end there. The twentieth century brought forth capitalist dictatorships around the world, including communist dictatorships. Against other sections of the Left, much of the literature under review argues that communists were politically successful in many parts of the global South and underdeveloped East not because they were anti-capitalist, but, on the contrary, because they offered a turbo-charged version of capitalist development directed and monopolized by the state. This is what Lenin’s famous celebration of ‘electrification of the whole country’ and Mao’s ‘great leap forward’ were all about. The untold horrors committed in the name of communist development are as much part of Robert Kurz’s ‘black book’ of global capitalism as modern slavery.
Whatever may have changed over these centuries, capitalist development was rarely confined within national boundaries (the communist path to development is a possible exception). More often than not, the global capitalist elite was transnational in orientation. The members of this elite shared a common culture built around such things as opera, a love for renaissance art and classical education. They intermarried across borders. They owned assets in several countries. Friedrich Engels, a German industrialist with strong British connections, was by no means unique; even a quintessentially German company like Siemens had British as well as German family branches before the First World War. Capitalism constituted, as Immanuel Wallerstein once famously called it, an emergent ‘world system’ linking metropolitan core areas in Europe with colonial and semi-colonial peripheries around the globe. Colonial Guyana in South America provides a perfect example: labour came from Africa (initially as slaves), from India (as part of the indentured servitude system) and from China; these labourers worked plantations owned by international shareholders to produce cash crops like sugar which had to be shipped across the Atlantic to be sold to mostly working-class consumers in Europe’s industrial heartlands. The profits went into anything from railway companies to city banking houses and village church renovations. It should be noted that this self-fuelling and highly exploitative system continued, and became more efficient, in the 150 years after slavery was formally abolished in the region. At their most productive, around the mid-twentieth century, the sugar plantations of British Guiana were unique in that they produced not one but two annual crops.
There was only one relatively brief period when the common-sense picture of capitalism applied, when a flourishing ‘national’ economy coincided closely with the borders marked on a political map, and when economic reproduction went hand in hand with democratic governance. This was the time between the end of the Second World War and the emergence of a neo-liberal economic system in the late 1970s. Over these three decades – but even then not everywhere – an economic logic of industrial manufacture, mass consumption based on rising incomes across the working and the middle class, of state planning, and a more or less consensual form of politics prevailed. This is capitalism as it is most familiar to us: of workers manufacturing goods like cars or TV sets and earning enough to then buy these same items back for their personal enjoyment. This system still relied on the basic Marxist category of exploitation, but nevertheless functioned for some time as a self-sustaining engine of prosperity creation for the many.
All this began to change in the aftermath of the 1973 oil crisis, which ushered in a global recession. Capitalism survived and reinvented itself, but with a new modus operandi that had become firmly entrenched by the 1990s, and is ultimately responsible for the dislocations that Brexit brought to the fore. The main method of surplus creation shifted from labour exploitation to financial speculation and ‘securitisation’, a new form of primitive accumulation by stealth, as Saskia Sassen describes it. This new system no longer requires people to be turned into capitalist labourers or consumers, for it can create wealth without people. For the first time, this means that capitalism is no longer expanding – seeking to bring more people and territories under its control as it had done for centuries. Instead it grows richer by ‘expulsions’ (again Sassen’s term), by getting rid of people in order to speculate with what they leave behind. Sub-prime mortgages foreclosed in dying American cities, landscapes ransacked by fracking or mining, bodies plundered for organ donations and patented for medical copyright, whole populations killed or displaced while the international arms trade makes a fortune.
Even though the wealth concentrated at the top has increased enormously since the days of the welfare state, this is not a self-sustaining system. As some historians of capitalism have pointed out, capitalism may well have entered a final crisis mode. The current slowing of growth around the world may be the first concrete evidence of this disturbing trend. While the rich economies of the North are in or close to recession, countries that have previously been held up as hopes for a globalised future are in deep trouble, too. China sits on a mountain of real-estate debts while its economy slows, and India has had to falsify official data through its central bank to maintain any semblance of economic growth outperforming population growth.
The geographic shape of the new system once again transcends national boundaries. There are new and reinforced global links cutting across the old division between a rich North and a poor South, exemplified by new areas of global ‘outsourcing’ and glittering global cities on all inhabited continents. The new structures also create growing regional disparities within national economies, between the nodal points of a still thriving global network and areas of ‘expulsion’ for which the system no longer has any use. This brings us directly back to the referendum map mentioned above. The yellow areas voting for ‘remain’ in England and Wales coincided almost perfectly with areas that still have a stake in the global economy: London and its wealthy hinterlands, the M4/M40 corridor of Berkshire and Oxfordshire, Britain’s knowledge cities like Manchester, Cambridge, Leeds or Aberystwyth. Those areas that voted ‘leave’ by the largest margins, in contrast, were the old industrial heartlands and rural areas now left behind.
This explanation of the geographic shape of the Brexit vote is powerful but not in itself particularly original. Many have made this point without recourse to Marxist meta-history, not least ex-Prime Minister Gordon Brown, who demanded that globalization work for everyone, not just the few. Other commentators believe that Brexit will usher in a wider people’s revolt against the excesses of neo-liberalism. It is here that the perusal of the historical sociology literature offers a starkly different perspective. If the likes of Harvey, Sassen and Kurz are right with their grand narratives of capitalist development then it is unlikely that a political upset like Brexit can alone reverse deep structural developments. Unless it is forced into an as yet completely unknown new modus operandi – for which there is little evidence – global capitalism will continue to rely on ‘expulsion’ as its main method of surplus extraction. That it has also entered crisis mode can only mean that regional disparities between core areas and left-behind areas will grow further still. Tax and spend, or a politics of redistribution, will no longer work as a remedy. Insofar as Brexit was a vote to take us back to the days of a national economy it cannot fulfil its core promise. (It is worth noting that there was also a leave argument at play that argued for more rather than less globalization – David Owen’s new ‘blue water diplomacy’, for instance, or Andrea Leadsom’s new trade deals – but this theme was quickly overshadowed by a rhetoric of ‘taking back control’ over national borders.)
This is not simply speculation. In countries with longstanding regional disparities across Europe – Italy with its industrial North and mezzogiorno South, say, or East and West Germany – the divide has become sharper over recent decades, despite long-standing and hugely expensive ‘development efforts’ by the countries themselves or by the EU. Meanwhile, across Europe, the gap between the networked core – located mostly in the North – and the expulsions areas in the South has also intensified. It is happening elsewhere, too, from Nigeria’s increasingly unbridgeable split between North and South to India’s great divergence between the Western and Southern coastal regions and the Northern ‘cow-belt’. It is very unlikely that the people of Sunderland or the Welsh Valleys will be any luckier than those in Greifswald in East Germany, Trapani in Sicily, or indeed Patna in India’s Bihar, when it comes to their state’s ability to overcome inequality through redistribution or protectionism.
This brings us to the crux of this exercise in historical analysis: what will be the political effects of these structural contradictions? A steady growth in the number of people who are of no use to the system, not even good enough to be exploited or to buy useless commodities, and a simultaneous crisis of the system as a whole will produce a colossal amount of discontent. Those who still have a stake in global capitalism, meanwhile, will seek to protect their life chances tooth and nail – as the emotional reaction of so many ‘remainers’ to Brexit demonstrated beyond doubt. The interests of those living in the still thriving network core and those in left-behind areas have become irreconcilable. One’s dream has become the other’s nightmare, while no alternative political economy that could overcome such divisions is yet in sight. How is this conflict going to be managed through democratic institutions? How are system compliance and consent going to be generated within a geographic framework – the nation state – that no longer fits the shape of the political economy?
One can think of several possibilities here. Discounting the unlikely event that the people of Britain collectively decide to leave the capitalist order altogether and try out some other system on their own, two alternatives stand out. First, areas that are small enough, well-connected enough and have the kind of identity politics in place to sustain such a move, may seek to become small independent nation states that play the new global system for what it is worth. Catalonia in Spain is a good example. Scotland is clearly weighing up its options to follow this path.
Where such secessionist moves are less feasible – as in England and Wales – some kind of artificial consent will have to be manufactured by means of an authoritarian political order. The Chinese Communist leadership is perfectly open about the need to manage regional disparities in their own country through repression, and justifies it with reference to a Confucian political culture. Elsewhere – as developments in Hungary and Poland, in Erdogan’s Turkey, Modi’s India and Putin’s Russia suggest – there is a trend towards majoritarian pseudo-democracy. A climate of radical nationalism prevails, post-truth politics becomes the norm, universities and the media are purged of opposition, perceived minorities are used as scapegoats, an artificial and hollow ideology of consumption and development spectacle papers over deprivation. But people are still able to vote, in fact, are even invited to vote to periodically consecrate the holy union of popular will and populist leadership.
It is not even necessary for a rabidly authoritarian party with genocidal urges, such as Narendra Modi’s BJP, to gain power for such a system to work. It is sufficient if such a party is strong enough to compel everybody else to rally behind a pro-establishment alternative that governs solely on the promise of keeping the barbarians at bay. Such a perennial party of power would have a free hand to resort to authoritarian measures as long as they remain marginally less off-putting than those demanded by the other side. Discontent will remain high, but will have nowhere to go except to the radical nationalists who stand in perpetual opposition, thereby only reinforcing the dominance of the ‘centre’.
Such a situation is by no means inconceivable in post-Brexit Britain. In fact, one can already see the contours of it taking shape: witness the implosion of the Labour Party in line with what is happening to other social democratic parties around the world, the emergence of the Tories as the only ‘centrist’ alternative in a world where the ‘centre’ has moved very considerably to the right, and an entrenchment of UKIP and assorted right-wing extremists as the attack dogs that prop up the system from the outside.
You have been warned – historical analysis in times like these is likely to yield depressing results. The only consolation is that even the best structural analysis does not fully accommodate human agency which will always remain unpredictable. The best historians can hope for at this juncture is that they are wrong.
I’m not British but I did get to vote in the referendum on British membership of the European Union. Irish people resident in the United Kingdom were one of just three non-Commonwealth immigrant groups that were afforded this privilege, which I felt gave me a real stake in the future of the country that is now very much my home.
Personally, while I can certainly understand popular disaffection with the EU, I was never in the slightest doubt about voting against Brexit. On every conceivable level – economic, social, political and cultural – remaining within the Union seemed to make such obvious sense. The blatantly xenophobic aspects of the leave campaign just made the decision to vote in favour of remaining all the easier; a vote for staying in the EU wasn’t simply an informed political choice, it was a vote against Boris Johnson, Michael Gove, Nigel Farage, and every mean and petty thing for which they stand. So the result of the referendum came as a shock, not because I hadn’t realised that an ‘out’ vote was a real possibility, but because, in the blinking of an eye, a truly great country seemed to have become palpably smaller and colder.
But the outcome of the referendum is an inescapable reality and, with it, a page has turned in British history. Historians had a lamentably limited impact on the debate about Brexit, despite the best efforts of at least some of them. There is still hope, however, that they may be able to influence the choices people make in the coming months.
The pitfalls of Brexit seem almost too numerous to contemplate. They also seem to increase daily as we descend yet further into political chaos. Indeed, the atmosphere of national jeopardy fuelled by the hour-by-hour machinations of the political elites would be pretty exciting if it wasn’t all so serious. So many things about the result of the referendum give cause for concern that it’s hard to focus on one especially unsettling outcome. Yet whatever else may worry us, the degree to which the UK’s impending exit from the EU has undermined Britain’s relationship with Ireland, and the integrity of the peace process in Northern Ireland, should give everyone on these islands pause for thought.
Given the shadow of uncertainty the result of the referendum has cast over the ongoing peace process, it seems like a good time to reflect on the role history played in respectively fuelling violence and helping people move beyond violence in Britain and Ireland within living memory. The story of the Northern Irish conflict is one of cruelly unexpected death, widespread bereavement, and lives blighted by fear, anger and bitterness. But it is also the story of a remarkably resilient people whose desire for peace led them ultimately to reconsider their attachment to the past and embrace compromise and reconciliation.
For this was a war – and a peace – that was all about the memory and interpretation of history.
* * *
Northern Ireland is the only part of the UK that shares a land border with another EU state. From the early 1970s until the late 1990s that border was heavily militarised and dotted with army checkpoints and watchtowers. The peace process, which, crucially, was aided a great deal by the EU context in which it evolved, has meant that the border has essentially ceased to exist physically over the past fifteen years. The region is unlikely to be re-militarised, but when the UK leaves the European Union there will have to be a functioning border between Northern Ireland and the Republic of Ireland, which, if the Leave campaign’s promises of controlled immigration are to be delivered upon, will presumably have to be policed as such. These altered dynamics directly threaten the close political, social and cultural relations between North and South that have been painstakingly fostered since the emergence of the peace process in the mid-1990s. As with so much about the referendum, no plans seem to have been put in place to address this potentially very dangerous outcome.
One of the more compelling arguments David Cameron put forward as he led the ill-fated campaign to keep the UK in the EU was that steadily increasing levels of inter-state communication and collaboration have helped preserve peace in Western Europe since the end of the Second World War. It is certainly true that those who envisioned a more unified Europe during the darkest days of the Nazi terror hoped that greater economic and political integration would ensure that European states would become so interconnected that it simply wouldn’t be possible for them to go to war against each other. It is also quite obviously the case that no inter-state conflict has occurred in Western Europe since 1945. Yet while no two western states have gone to war over the past 70 years, there has been a great deal of political violence in the region, most notably within the United Kingdom, which was the scene of consistent and often intense violence throughout the 1970s, 80s and 90s.
The peace process has trundled on since the signing of the Good Friday Agreement in 1998, and in the past decade or so Islamic fundamentalism and right-wing extremism have generally been regarded as greater threats to British security than Irish republicanism. It has thus been quite easy in recent years to forget just how devastating the conflict we still euphemistically refer to as ‘the Troubles’ actually was. In strictly military terms, the war in Northern Ireland could accurately be regarded as a ‘low-intensity’ conflict, a case of asymmetric warfare that required a military commitment but never the full deployment of the armed forces. And yet between 1969 and the Provisional IRA ceasefire of 1994, over 3,500 people lost their lives as a direct result of violence in Northern Ireland or emanating from the region. This includes approximately 1,000 members of the British security forces, over 720 of whom were British soldiers, and about 500 Republican and Loyalist paramilitaries. As is usually the case with urban guerrilla warfare or terrorism, however, most of those who died were unarmed civilians; no fewer than 1,800 British and Irish civilians were killed over the course of the conflict, often in extremely violent circumstances.
Quite apart from those killed, about 50,000 people – again, mostly civilians – were injured during the Troubles, many of them to the point of permanent disability. These figures, of course, don’t take into account people who escaped injury but were psychologically traumatised by their experiences and those who suffered intense bereavement as a result of the killing (and mental health problems remain a major issue in Northern Ireland). Nor was the conflict contained within the relatively small area of the six counties, a region not much bigger than Yorkshire; violence consistently bled across the border to the Republic of Ireland and to Britain, where London, Birmingham, Brighton, and Manchester were all bombed with significant loss of civilian life. The sheer number of British soldiers stationed in the North – some 22,000 at the height of the Troubles in the mid-1970s – also meant that families in Britain who had no other connection to Ireland were touched by the conflict in a very real way. In diplomatic terms, the war put a continuous strain on relations between the UK and Ireland, with the North being a constant bone of contention between the British Foreign Office and the Irish Department of Foreign Affairs. Atrocities committed by the British security forces also occasionally stoked popular Anglophobia across the island. At a time when Anglo-Irish relations are warmer than at any other point in history, it is sobering to remember that in the aftermath of Bloody Sunday in 1972, an angry mob burned the British embassy in Dublin to the ground.
* * *
Rigid, exclusive and often highly territorial understandings of the past directly fuelled the violence that erupted so catastrophically in 1969 and the polarisation and cultural entrenchment that would mark the next few decades. On the one hand, nationalists across the island, and Republicans in the North in particular, regarded themselves as heirs to a rich and ancient Gaelic culture, whose ancestors had been systematically dispossessed, marginalised, exploited, and murdered by colonists from the neighbouring island. On the other, many Ulster unionists were proud of a history of colonial settlement dating back to the early 17th century, in which industrious, God-fearing Scottish and English Protestants carved out a niche of British civilisation in an otherwise wild and inhospitable corner of Ireland. Importantly, the memory of moments of suffering or victimhood experienced by the tribes that clung to these narratives helped sustain them. For Unionists, there were, and remain, the 1641 Rebellion, the Siege of Derry and the Battle of the Boyne. Nationalist identity, by contrast, was informed by memories of the Cromwellian conquest, the 1798 Rebellion, the Great Famine of the 1840s, and a hundred other moments of calamity and betrayal.
Yet the modern historical episode that would have by far the greatest influence on the perpetuation of divided identities in Northern Ireland, and across these islands more generally, was the First World War. Well over 200,000 Irishmen, from both political traditions and all walks of life, fought in the war, and somewhere between 35,000 and 50,000 of them died as a result of military service. They served in every branch of the British armed forces, often with great distinction. As military conscription was never enforced in Ireland, moreover, most of the Irishmen who fought in the conflict were wartime volunteers. Their motivations for volunteering were often quite complex, but one major reason that so many Irishmen joined up is that they were strongly encouraged to do so by their political and spiritual leaders, and by the British government. The war was consistently sold to the Irish people as a conflict in which Irish interests were very much at stake, and in which Ireland was a quasi-independent and willing participant. The conflict was also widely interpreted by Irish political leaders, both Nationalist and Unionist, and indeed by the Catholic and Protestant clergy, as a morally righteous endeavour; as a just war. Irrespective of their religious or political backgrounds, many Irishmen who joined the armed forces, at least during the first two years of the war, thus believed they were fighting for Ireland and were regarded as patriots.
And while Unionist and Nationalist soldiers rarely served together, they shared similar experiences of violence, loss and deprivation on the Western Front and elsewhere. Yet the Easter Rising of April 1916, and the social and cultural forces it unleashed, would fundamentally transform the country to which many Irish veterans returned in 1919. Ultimately, the Rising, the subsequent War of Independence, and the partition of the island would ensure that the ways in which the Unionist community in the North and the Nationalist community across the country engaged with the memory of the First World War were very different indeed.
For the men and women of the Unionist community in Ulster, the memory of the Great War in general and the Battle of the Somme in particular took on an almost sacred significance over the course of the 20th century. The blood sacrifice of the men of the 36th (Ulster) Division, who sustained such terrible losses on the first day of the Battle of the Somme, was regarded as having purchased the right of the six counties to remain within the United Kingdom after the Irish War of Independence. Commemoration of the war is thus not simply an element of Unionist culture; it is absolutely central to the way Unionists understand themselves and their place in the world.
In independent Ireland, and among nationalists in Northern Ireland, commemoration of the war was much more complex and usually more muted. In the 1920s and 1930s, major Armistice Day ceremonies were held in Dublin, Cork and Limerick, and poppies were quite commonly worn in the Free State between the wars. Nationalist politicians, including Eamon de Valera, also expressed a certain amount of reserved sympathy for Irishmen who had died while serving in the British Army. Indeed, in terms of housing, employment and pensions, veterans of the Great War were often treated reasonably well by the Irish Free State. And yet there can be no doubt that at a popular and official level, there was much more commemorative emphasis on the rebels of the Easter Rising and the men who served in the IRA during the War of Independence than on the Irishmen who served on the Western Front or at Gallipoli. There was also something of a popular notion that the Irishmen who fought the British Empire at home were more patriotic, and indeed heroic, than those who fought the German or Turkish empires. As the century wore on, the memory of the Great War faded across much of the island, and, outside the Unionist community, service in the British Army was rarely recalled with pride or recognised with esteem.
This division in memory between Unionists and Nationalists was very clearly revealed in 1966, the year in which the fiftieth anniversaries of the Easter Rising and the Battle of the Somme occurred. Commemorations of these events were highly divisive and fed into the cultural polarisation in Northern Ireland, which directly fuelled the violence that erupted in 1969 and would continue until the mid-1990s. The public Nationalist celebration of the men and women of the Easter Rising in parades and ceremonies across the region was regarded with great suspicion, and indeed contempt, by many Unionists, who focused exclusively on the anniversary of the Somme offensive later in the year. The intense focus on the past in 1966 further polarised communities across Ulster and contributed to the rise in prominence of Ian Paisley, a firebrand preacher whose intransigent anti-papist rhetoric was taken straight from the 17th century. In the Republic, virtually all of the commemorative emphasis that year was on the Easter Rising. The mid-sixties thus marked the emergence of the simplistic and misleading idea that, during the First World War and in its immediate aftermath, Irishmen either fought for the British Empire or they fought against it.
By the 1980s, nationalist memory of the period of the Easter Rising and the War of Independence was not necessarily triumphalist, but it was exclusive and territorial in the sense that there was little room in the popular or official imagination for anyone who did anything other than fight against the British in 1916 or in the years afterwards. The 200,000 Irishmen who fought in the Great War and, importantly, those who had been against all forms of violence, were thus largely forgotten in the Republic. Commemoration of the First World War was also generally regarded as an exclusively British or Irish Unionist tradition. The Armistice Day or Remembrance Sunday ceremonies that did occur in the Republic during the 1970s and ’80s took place behind the closed doors of Protestant churches or schools. In a key indicator of division, the public wearing of poppies, a custom staunchly adhered to by northern Unionists, was virtually unheard of among the rest of the population during this period.
The sense on the part of many nationalists throughout the 1960s and the following decades that commemoration of the First World War was, and should be, the preserve of Unionists and Brits was generally a function of ignorance or indifference rather than antipathy towards those who had fought in the conflict. Any cultural association with the British armed forces was anathema to extremist republicans, however, and in November 1987 the Provisional IRA expressed its contempt for Unionist commemoration with one of the worst atrocities of the Troubles. The bombing of the Remembrance Sunday service at the cenotaph in Enniskillen was not simply an attack on unarmed civilians, but a sectarian assault on Unionist culture and the public remembrance of the dead of the two world wars. The bombers took the lives of eleven people, all Protestant and most of them elderly, and the incident was widely condemned as an indefensible massacre. Expressions of sympathy for the victims poured in from across Britain, Ireland and the wider world, and many within the republican movement began to question their support, whether tacit or active, for the IRA’s armed campaign.
The Remembrance Sunday bombing was a particularly dark episode in the history of the Troubles, but it also arguably marked a turning point in the way Irish people engage with the memory of the First World War. Over the following years, the Farset Youth Project, an initiative that was already bringing disadvantaged teenagers from both sides of the divide in Belfast and from Dublin together to explore early Christian history, began to focus on the Irish experience of the Battle of the Somme. These efforts led to a well-attended cross-community event at the Ulster Tower at Thiepval in 1989 and, in 1990, to the formation of the Somme Association, an organisation committed to honouring the ‘sacrifices of all those from Ireland who served in the War’. Attempts to raise awareness about the cross-community experience of the war in the North coincided with a resurgence of interest in the First World War in the Republic and gathered pace in the aftermath of the Provisional IRA ceasefire in 1994.
The Good Friday Agreement, a historic British-Irish treaty that was years in the making and enshrined some fairly major concessions on both sides, was signed and then ratified by voters across Ireland in 1998. In November that year, President Mary McAleese and Queen Elizabeth II came together to open the Island of Ireland Peace Park at Messines in West Flanders. The men of both the mostly Nationalist 16th (Irish) Division and the mostly Unionist 36th (Ulster) Division had fought at the Battle of Messines in June 1917, and the location was deemed appropriate for a manifestly all-Ireland site of memory, mourning, and commemoration. The park features a modern reconstruction of an ancient Irish round tower, which really stands out in the Belgian countryside, along with several stone tablets inscribed with the words of Irish soldiers who served on the Western Front. Importantly, the park also contains a memorial plaque that expresses unreserved regret, on behalf of both communities, for the years of violence in Northern Ireland.
The creation of the Island of Ireland Peace Park, conceived of by the Unionist activist Glenn Barr and the Fine Gael politician Paddy Harte, was a ground-breaking moment in the history of commemoration on these islands. It is notable, however, that the memorial was established in neither Britain nor Ireland but on the ‘neutral’ territory of a former war zone in Belgium. Since 1998, instances of cross-community or Anglo-Irish remembrance of the First World War have become more common in the UK and Ireland and still have the power to impress. The first British state visit to Ireland, which took place in May 2011, was such a success partly because Queen Elizabeth and President McAleese directly and publicly confronted the historically troubled relationship between the islands. When the Queen bowed her head at the Republican memorial in the Garden of Remembrance on the second day of her visit, with the same reverence she shows every November at the Cenotaph in Whitehall, even the most cynical among us were won over.
More recently, in July 2014, a project jointly supported by the Commonwealth War Graves Commission, the British Government and the Glasnevin Trust culminated in the dedication of a Cross of Sacrifice at Glasnevin Cemetery in Dublin. In the 1920s, these stone crosses, inlaid with a bronze sword, were erected in cemeteries across the globe containing the graves of more than forty British or British Imperial dead of the Great War. The one place in which this tradition was not observed was the Irish Free State, where the political climate was such that iconography associated with the British Empire was unwelcome. The centenary of the outbreak of the First World War felt like an appropriate moment to rectify this cultural anomaly and, importantly, to organize an event that would bring Irish and British representatives together to express solidarity with the suffering experienced by their ancestors. One of the most symbolic and moving features of the ceremony was the presence of two colour parties composed respectively of soldiers of the Irish Defence Forces and the Royal Irish Rifles, a British regiment composed largely of recruits from Northern Ireland. The latter were the first British soldiers to be seen in Dublin since the early ’20s, and the sight of them greeting their counterparts in the Irish Army with broad smiles and handshakes made the event seem all the more powerful and momentous. The choice of Glasnevin for the unveiling of a monument to the Irish dead of the First World War was both deliberate and highly significant. The cemetery is also the final resting place of hundreds of men and women who participated in the Irish struggle for independence, and the unveiling of the Cross alongside more manifestly nationalist memorials complicates our understanding of the period of the First World War and the Irish Revolution.
The message this juxtaposition of monuments sends is that there was a remarkable degree of overlap between the Irishmen who fought imperial tyranny on the continent and those who fought it at home, and one group does not have to be remembered at the expense of the other.
These events, and dozens of other less official but no less meaningful projects, reflect the emergence of a new, more positive and conciliatory commemorative culture on these islands over the past two decades. Politicians, diplomats, community leaders and ‘ordinary’ men and women from very disparate backgrounds now regularly come together to remember their dead in a way that would have been unthinkable just fifteen years ago. There is a distinct irony in promoting a shared memory of the bloodiest war in British and Irish history to help people come to terms with, and move away from, the violence of the much more recent past. But it’s an irony that anyone with an interest in lasting peace should be prepared to embrace.
* * *
Through its border and its shared history with Ireland, Britain is even more connected to Europe than it sometimes remembers. That same shared, complex, troubled history should remind us of other things. It is not so long since conflict tore lives apart within the borders of this state. Cooperation between neighbours helped resolve it. The European Union helped resolve it. Finally, a recognition that Britain’s history is inextricably intertwined with those of its neighbours helped resolve it.
As we enter an undeniably new era in the history of North/South and Anglo-Irish relations in the aftermath of the referendum, we should remember that the road to peace in Northern Ireland and positive relations between the UK and the Republic was long and arduous. The relative stability that prevails in the North now simply could not have been achieved without years of effort on the part of political leaders and diplomats and, crucially, the goodwill of ordinary Irish men and women from both of the ancient traditions and both sides of the border. The process of using more complex historical narratives to help people move away from a conflict that was shaped by understandings of the past must not be jeopardised by the UK’s break with the European Union. Peace in Northern Ireland, and thus within the United Kingdom, is dependent, above all, on people’s desire for peace. That desire remains strong, but the climate of uncertainty that now pervades these islands should remind us that lives are potentially at stake and that lasting peace should never be taken for granted.
Edward Madigan is Lecturer in Public History and First World War Studies at Royal Holloway, University of London and co-editor of the Historians for History blog.