When I was admitted as a member of Lincoln’s Inn in 2003, it didn’t occur to me that I could have been refused entry merely because I am a woman. Yet this is what happened to any woman who applied to join the Inns of Court or the Law Society before 23 December 1919, when the Sex Disqualification (Removal) Act received Royal Assent. This ground-breaking piece of legislation removed the legal barriers that had excluded women, including married women, from working as lawyers on the grounds of their sex.
The approaching centenary of the passing of the 1919 Act is an opportune moment to look back and consider how much has been achieved by women in the legal profession over the past one hundred years. I am therefore delighted to announce a new exhibition, Celebrating the Centenary of Women Lawyers, which will be on show at The Honourable Society of Lincoln’s Inn in collaboration with the First 100 Years project and Royal Holloway, University of London. The exhibition will place the emergence of Britain’s first female barristers and solicitors in the broader context of the women’s movement and the opening of higher education to women in the nineteenth and early-twentieth centuries. Rather fittingly, Royal Holloway was established following the merger, in 1985, of two pioneering women’s colleges: Bedford College and Royal Holloway College. When it opened in 1849, Bedford College was the first institution in Great Britain to offer higher education to women, and Royal Holloway has its own proud history of producing pioneering female practitioners across a range of professions. Come along to the exhibition and find out who among the first women lawyers had links to Royal Holloway and the University of London. Among the women who will be profiled are Bertha Cave who, when her application to Gray’s Inn was refused, sought (unsuccessfully) to appeal that decision; and Gwyneth Bebb, whose application to be admitted to the Law Society ended up in the Court of Appeal. ‘In point of intelligence and education and competency’, the Court of Appeal acknowledged that Miss Bebb was ‘probably, far better than’ many male candidates but, because she was a woman, in 1913 she could not be admitted to the Law Society.
Today, one third of all practising barristers and approximately half of all practising solicitors are women. More than half of British judges aged under 40 are female and over the course of the last five years more women than men have been admitted to the profession. This represents a remarkable female presence in the legal field, considering that 100 years ago women were barred from the profession altogether. Inequalities, of course, remain but by taking the opportunity afforded by the forthcoming centenary to consider what has been achieved in the last 100 years, we can hope to look forward to greater equality in the century to come.
The exhibition will be launched on Wednesday 19 July 2017 and all are welcome to join us for a drinks reception in the Old Hall Crypt at Lincoln’s Inn anytime from 6.00 to 8.00pm. There will be informal talks at 6.30 and 7.30pm.
Tickets are free, but to join us for what promises to be an interesting and engaging evening celebrating the history of women in law, please register at:
From 20 July 2017, this free exhibition will be on display on the east side of the hoardings around the Great Hall, Lincoln’s Inn, and you are invited to view it and explore this hidden heart of legal London.
Katie Broomfield is a postgraduate student on the MA in Public History at Royal Holloway, University of London. This exhibition will form the final project for her MA. You can contact Katie for further information via @KRBroomfield on Twitter.
First 100 Years is a ground-breaking history project, supported by the Law Society and the Bar Council, charting the journey of women in law since 1919. Work is currently underway to produce a digital museum made up of 100 video stories that tell the story of women in law. To find out more and to donate to the project please click here.
History matters. Our histories warn us, inform us, and inspire us. More than that, they help us know ourselves, and shape what we believe we know about each other. As I was getting ready to go to the first protest outside the newly unveiled Ripper Museum in Shadwell, I started looking for a quote which could express what I was struggling to say. The quote I found was from Chimamanda Ngozi Adichie’s excellent TED Talk about the ‘single story’ of Africa:
The single story creates stereotypes, and the problem with stereotypes is not that they are untrue, but that they are incomplete. They make one story become the only story.
The infamous Whitechapel Murders have long overshadowed the many stories we could tell about east London, and the obsession with ‘Jack’ means that the story most people associate with the area is one of violence against women and failed justice. Yet if our single story is about the brutal, unsolved murders of five working class women, how does that shape the way people see us? How does it shape the way we see ourselves?
The Ripper Museum has offered plenty of reasons to be angry: not least the mythologising of a misogynist serial killer, and the insult of swapping a museum presented to the local community and council as being about women’s lives for a tourist attraction about their violent deaths – complete with a mannequin of one of his victims’ corpses, ‘Ripper’ cupcakes, and an audio loop of women’s screams.
The museum is, however, just the latest and most egregious example of London’s Ripper tourist trade (although the London Dungeon seems to be challenging them for the crown). And yet contrary to the narrative presented by many of the institutions and individuals involved in this ‘trade’, violence against women hasn’t gone the way of gaslights and top hats. It is incredibly common and frequently lethal, and Ripper tourism helps to trivialise it. The story is told again and again with no reference to the wider context of violence against women – especially violence against women sex workers – and usually in an insensitive, sensational, titillating way.
The protest at the Ripper Museum in 2015 wasn’t the first such protest by a long stretch. One of the local historians who has inspired me most, my co-author Rosemary Taylor, recalled a women’s march that took place 25-odd years ago in protest at the Ten Bells pub, which had essentially reinvented itself as a Ripper theme pub, with t-shirts for sale behind the bar. More recently, the Women’s Library (when it was based in Aldgate) and the LIFT campaign both ran Alternative Ripper Tours which told the stories of the women who were murdered – their lives, their communities – and put up temporary plaques to honour them. Protests have also been staged online, including a 2013 campaign by the Everyday Whorephobia blogging collective condemning Ripper tourism.
Feminism is cool now?
One of the things that has set the ‘museum’ on Cable Street apart, attracting criticism from so many sources, is their baffling attempt to pass off the attraction as a genuine celebration of women’s history – even after the logo of a top-hatted man standing in a pool of blood was revealed, and the contents of the museum exposed. It was a bizarre strategy, which they thankfully now seem to have abandoned. It was particularly eerie for us to see the language we were using to describe our fledgling East End Women’s Museum co-opted for their press releases. Indeed, it’s interesting that the Ripper Museum’s owners felt that a museum of women’s history would be more likely to receive interest and support than a museum about Jack the Ripper, a longstanding staple of London tourism. On the one hand this is a testament to the strength of the current resurgence of feminist activism, and to decades of work by pioneering women’s historians. On the other, it reveals the extent to which a particular aspect of feminism has become depoliticised and absorbed into the mainstream. Perhaps the Ripper Museum is an extreme example of ‘femvertising’. Much has been written about the oil industry’s support for museums, and the ‘halo effect’ they hope to glean from sponsoring exhibitions – could it be that the Ripper tourist industry was seeking out the same respectability?
At primary and secondary school level the history curriculum is not particularly concerned with women’s experiences. A recent survey by Girlguiding UK revealed that over half of girls aged 11-21 say that the role women have played in history is not represented as much as the role of men. In higher education women’s history is typically something to be sought out proactively, as an ‘added extra’ or specialism. Yet the problem is by no means confined to our classrooms; women are underrepresented on a local and national level in public history, in museum collections, archives, and academia. Just 2.7% of UK public statues feature historical women who weren’t royalty, with only one statue of a named black woman in the entire country. Just 13% of English Heritage blue plaques in London honour women and only four of the 50 bestselling history books in 2015 were written by women. Unsurprisingly, where women do appear they tend to be those with the most privilege, with women at the intersections of oppression rendered almost invisible. The histories of women of colour, women with disabilities, lesbian and bi women, trans women, and working class women have not only been pushed to the margins but right off the page.
History for resistance
Why does this matter? Unsurprisingly, Chimamanda Ngozi Adichie said it better than I can, in the same TED Talk:
Stories have been used to dispossess and to malign, but stories can also be used to empower and to humanize. Stories can break the dignity of a people, but stories can also repair that broken dignity.
Marginalised histories can be powerful tools to dismantle stereotypes and counter myths, to challenge assertions that ‘this is how it’s always been’. Sometimes a story can get through where an argument can’t. Uncovering hidden histories can also play a part in consciousness-raising. Recognising shared experiences across decades, even centuries, helps to make the deep roots of inequality and structures of power visible. It means something to discover that your struggle is not only individual but shared, not accidental but systemic. That’s not to suggest that there is a single shared female experience or history, but simply that there are many common threads. Besides, examining the differences between women’s experiences is as illuminating as looking for similarities. We mustn’t simply replace Top Ten Kings and Generals with the Ten Best Ladies, but rather widen the lens, enlarge the story, and examine the power structures which cut across women’s history too.
Something else women’s history can offer us today is inspiration. Studies suggest that women and girls respond better to role models who are also women and girls, and there is something especially magical about a local hero. (I’m writing this in a cafe in Stratford less than 100m from a statue of my most local hero, Joan Littlewood.)
The East End Women’s Museum
While we hope that the lessons we learn through the East End Women’s Museum may be useful to women’s history projects elsewhere, our focus is firmly on east London, although we use a deliberately loose and ahistorical definition of the area, rather like John Strype in 1720 when he described the district as “that part beyond the Tower”.
We’ve been very lucky to have had such a lot of goodwill and enthusiasm for our project, but there are some challenges ahead. One of the most pressing is lack of funds. We’ve reached a point where we can’t expand the project until we have funding to cover things like volunteer expenses, travel, and printing costs. Aside from practical challenges, there are other issues to contend with: for example, as our profile grows we are encountering more criticism and hostility. We’re more often reminded that some people feel very threatened by the idea of throwing a spotlight on women’s history, as if by including more stories about women everyone will suddenly forget about Henry VIII, Newton, and Brunel. While it’s frustrating and sometimes unpleasant, the backlash tells us that we’re on the right track.
What we hope to achieve and how
Our goal is to research, record, and represent women’s histories from across east London, and in doing so celebrate a shared local history, challenge gender stereotypes, and offer inspiration. We want to create opportunities for women and girls to gain the skills and confidence to tell their own stories.
Our hope is that we can build a long lasting resource for historians, schools, curators, and community groups. We know that many museums have slender resources and little support to diversify their collections, especially as past decisions about what is ‘important’ influence what is available to us today. We want to partner with more fantastic archives, collections, and community heritage projects and work together to get the girls to the front. We’re drawing on approaches including oral history, family history, social history and narrative history. Our ultimate aim is to co-create the content of the museum with groups from across east London, and to make it as accessible as possible, collecting and sharing stories in public spaces – parks, streets, schools, pubs, places of worship – as well as in our own museum space and online.
We’ve already started doing a lot of this, thanks to the support of some fantastic partners and volunteers. Over the last year we’ve helped to develop two joint exhibitions – with Eastside Community Heritage and the East End Women’s Collective – and begun working on a third with Hackney Museum. We’ve also staged a sell-out history event for local feminist activists, organised a pilot schools workshop with 70 Year 7 students in Hackney, and launched a research project that explores the history of women and East End markets with University College London and King’s College London. This year our main focus is putting some firm foundations in place while we continue to listen and learn about what would make the best possible museum for the women and girls we aim to serve.
An East End mystery
Returning to the idea of the ‘single story’ of east London, I often hear that what makes the myth of Jack the Ripper so irresistible is the element of mystery. I don’t doubt that’s the case. But here’s another mystery for you: why aren’t the other stories better known? Who does it serve to sideline women’s voices and experiences? Or to present working class people as powerless, to suggest trans identities are just a recent ‘trend’, to portray sex workers only as helpless victims, or to paint a picture of London in which every face is white?
Having given up Facebook just before Christmas, I’ve found myself scrolling through Twitter more than usual lately. Yesterday evening was no exception. As I scanned through my newsfeed, a story by Katie Hopkins from the Mail Online caught my eye. The headline read: “‘No Trump, no KKK, no Fascist USA’. I’d Wager all 3 Unlikely to Listen to People in Need of a Job, a Shower or Both.” I could see that it related to the protests organised by Owen Jones on Monday evening, and, having attended the protest in Westminster myself, I was almost ready to use my 140 characters immediately. However, recognising this as an emotional rather than a rational response, I convinced myself that it would be better to read the full article first. I didn’t want to write anything that might later prove false. The opening sentences of the story dragged me straight down into the depths of “Katie’s world”: a world where post-truth is the real truth and her followers neither know nor care whether she knew the real truth in the first place.
Having read the entire story, I still felt angry. Without giving it too much thought, I sent the following tweet:
My error wasn’t to have tweeted abuse at Katie Hopkins – lots of people do that. My mistake was to be a teacher who had been to a protest with a banner that was made by some students. Was my tweet unprofessional? Possibly. Was it offensive? Only directly to Katie, a woman who herself never hesitates to cause offence. Does it suggest that I was brainwashing children and forcing them to manufacture propaganda to impose my views upon other people? Absolutely not in the real world. But then I wasn’t in the real world. As soon as Katie reposted my tweet, claiming that ‘[she feared] for young minds, brainwashed by liberals pushing their agenda [aged] 8?’, my Twitter account went crazy. The notifications poured in. 50 notifications. 112 notifications. The number of notifications was increasing by the minute, and all of the posts appeared to be filled with hate. I was “Jackie the brainwasher Nazi” and accused of grooming my own “Hitler Youth”. Apparently I had never been “out of an educational institution & [I was] unable to see a distinction between teacher/student”.
It was really quite overwhelming and I genuinely started to panic. But, I reasoned, the worst-case scenario would see me unemployed by Monday – something that the people tweeting at or about me were not only calling for, but also helpfully facilitating by posting links to direct people to exactly where they should report me: the Department for Education; Ofsted; and Justine Greening on Twitter. The abuse continued to escalate. I was informed that I was in “breach of Section 407 of the Education Act”. I had “broken the law.” It also turned nastier and more threatening. I was “another idiot teacher polluting the minds of the young”. This, I was told in the same tweet, “[was] abuse as serious as sexual or physical abuse.” One man posted that he would “come and kick [me] out the classroom if [I] taught his kids” and another agreed with him. A former teacher, who identifies herself in her Twitter biography as an “Islamophobe” who believes “Islam is evil”, told me that teachers like me were the reason the real teachers – like her – had retired. I was also surprised to learn that I am the reason a lady called Ana, whom I’ve never met, home-schools her child.
In addition to the abuse flung my way, the Twitter community began to flesh out the details of my indoctrination programme. It transpired that there was a lot more to the sentence I had tweeted than met the eye. A young lady stated that I had “abandoned the curriculum” and “brought in my own materials”. I had then used these materials to get the children to make protest banners for me. Yes, they were “Anti-Trump” posters, another tweet confirmed. That was instantly retweeted. In Katie’s world it became a “fact”. Katie’s followers kept retweeting these “facts”. Before long, UKIP party members started to retweet the material, Suzanne Evans among them. It was a textual version of Chinese whispers, in which the “facts” now circulating about my Year 8 class and me have no basis in reality.
Twitter is not a suitable platform for meaningful debate, so I’d like to take this opportunity to reflect on the best way to respond to some of the charges levelled at me. Not the personal attacks, which don’t warrant a response, but, rather, the ones in which non-teachers thought that they had the right to tell a teacher what – and how – it was acceptable to teach. Every one of the tweets that suggested that children should not be discussing President Trump’s actions could be discounted by referring to the Department for Education guidance booklet Promoting Fundamental British Values as part of SMSC in Schools. Schools, we are told, should develop in students “an acceptance that other people having different faiths or beliefs to oneself (or having none) should be accepted and tolerated, and should not be the cause of prejudicial or discriminatory behaviour.” Banning people because of their nationality or religion runs completely counter to this. Should a parent ever question why their child is learning about a particular topic, then a teacher can accurately state that this is what the government prescribes.
Other tweets seemed to suggest that it was beyond the remit of a teacher to teach about topics that were either deemed to be political in nature or might be considered current affairs. To put it bluntly, if we were to remove topics that were political in nature, then not much of the National Curriculum for History would remain. The introduction of the National Curriculum for history in 1991 marked a defining moment in education across England and Wales. Each revision to the history curriculum reduced the amount of prescribed content – the ‘what to teach’ element. In its current iteration, the only mandatory topic for study in the Key Stage 3 history curriculum is the Holocaust. This means that teachers have a greater level of autonomy in selecting what content to teach their classes across the rest of the key stage. Teachers will make decisions concerning what to teach based on a variety of factors, including, but not limited to, the resources they have available, the strength of their subject knowledge and passion for individual topics, what they have taught previously and the ability of their classes.
Students typically begin their study of history at secondary school, learning about the Middle Ages. The first time that they will really encounter political ideas is when they learn about the barons’ confrontation with King John at Runnymede in 1215. They’ll start to develop their substantive knowledge so that when they later learn about other examples of protest they will have some knowledge to ‘think with’, which they can then transfer and apply to different contexts. Students – with guidance from their teacher – are thus equipped to draw out the similarities and differences between political struggles in different eras. For example, one of my Year 8 classes has recently looked at the question of when Britain became a democracy, and while the enquiry is mainly based on events in the nineteenth century, they were also encouraged, and most were able, to make comparisons with the sixteenth century. While students are able to recognise similarities across periods and to understand political problems in different eras, those who posted tweets to say that politics should not be taught in the classroom show a comprehensive failure in this respect. Is there, for example, a qualitative difference in the nature of the learning taking place when students design campaign propaganda for a contender to the throne of England after the death of Edward the Confessor and when they complete the same activity to suggest who would make the best leader of the Conservative Party in 1979? I would suggest not, but then I doubt that anyone has ever accused a teacher of trying to brainwash a class into electing Harald Hardrada instead of William of Normandy.
We would also do well to remember that school students are perfectly capable of forming and expressing their own political viewpoints, and indeed protesting government policy. The 1985 School Student Strike, which saw students take to the streets to voice their opposition to the Conservative government’s threat to make the exploitative Youth Training Scheme compulsory, is just one historical example of valid and effective student protest.
Finally, the question of making use of current news stories in the classroom can also be justified on the following grounds: one of the most effective ways to get students engaged in a historical topic is to make its contemporary relevance explicit to them. Sometimes the best place to start a history lesson is in the present – with a current news story – and then ask how we got here. A few of my Year 8 students went on the women’s march on 21st January. When they came into class the next week and we were learning about the Suffragettes, it all made far more sense to them. I will make no apologies for adopting exactly the same approach over the coming weeks as I teach about the Civil Rights Movement and we ask why there are still protests going on demanding that we Stop Racism and affirming that Black Lives do Matter. Is this brainwashing or political indoctrination? No, it’s just one way of introducing a topic to students.
Jackie Teale is a secondary level history teacher and doctoral student at Royal Holloway, University of London. Her thesis is supervised by Professor Dan Stone and focuses on the ways in which press photography has shaped public responses to genocide.
There has been considerable debate in recent days about the public purse bearing the cost of refurbishing Buckingham Palace to the tune of £369 million while the rest of the country endures austerity, the NHS budget is trimmed and the number of homeless people rises. Unsurprisingly, an exploration of the historical relationships between Crown, finance and the public purse provides us with some valuable context for this discussion.
In the past, kings and queens exercised their majesty through a conspicuous display of wealth and power. The right to rule was represented and reinforced by the sheer opulence of the monarch’s dwelling places and possessions. Magnificent courts, sumptuous homes, golden carriages, the largest jewels, the finest horses, the most splendid paintings were not just the trappings but the foundations of regal power. Wealth was the cause rather than a symptom of power. Historically there were intellectual dimensions to this material majesty. Kings were thought to be appointed by divine right, the keystone in a natural hierarchy celebrated in a culture of deference. Simply put, the king stood just beneath God in the natural order, and this exalted position was reflected by his extravagant wealth.
The question of what monarchs are for is more difficult to address with clarity today, as debates about heritage and national identity get mixed up with constitutional issues. Whether the current monarchy is a greater asset to the nation as part of the tourist industry, or simply because defending it seems to save us the trouble of thinking of an alternative, is still a debatable point. For some, even raising the question implies sedition, subversion and immorality. Certainly, beyond those die-hard loyalists, besotted with the ineffable mystery of the crown and firmly wedded to principles of deference reinforced by royal hierarchy, it is difficult to contrive a robust philosophical defence of the institution today. If we pose the same question in the historical sense – what were monarchs for? – it is perhaps easier to arrive at plausible and persuasive answers.
In the past kings and queens were warriors, symbols and enactors of military might, dispensers of justice, makers of law, and, very commonly, representatives of God on earth. From its foundations in the act of William I’s conquest through to the Imperial majesty of Victoria, the English monarchy has acted as if it were the centre of political power. Competing with the Papacy and later the Church of England, the monarchy erected a powerful jurisdictional claim to be not only the source of morality but also the arbiter of true religion. Despite two revolutions in the seventeenth century (one, in 1649, which saw the most radical act of anti-monarchical reform in the decapitation of Charles I; and the other in 1689, a more decorous affair, but nevertheless clear evidence that kings and queens were reliant upon a broader political constituency than simply God), claims to divine right legitimacy have still not been discarded by ardent monarchists, even though the political constitution finally abolished the notion in 1701. It is quite clear that one cannot engage with the English past without considering the nature and power of the monarchy. The powerful material remnants of the institution lie all across the land: castles, forests, parks, and ancient and relatively modern placenames. The Royal imprimatur can also be found on everything from caviar to toilet paper. Almost silently these monuments, buildings and spaces plead a royalist cause.
In the past, apologists for monarchy adopted a number of justifications. Many of these were based on appropriating the most effective political document of the pre-modern world – the Bible – to the case for the defence. Arguments in favour of legitimacy included radical claims for political dominion based on conquest, the Biblical figure of Nimrod being a particular favourite. Others claimed, even as late as the seventeenth century, that since God had given all dominion over the world to Adam, and all kings were direct descendants of the first father, kings therefore had supreme power. Although kings might be morally bound to govern in the reasonable interests of the community, the subject had no claim against arbitrary behaviour. The difference between absolute authority and arbitrary power was quite subtle. Despite a conceptual distinction between ‘tyranny’ and monarchy, many defenders of kingship furiously underscored the principles of both passive obedience (put up with whatever happens without complaint) and non-resistance (never, even in the most extreme circumstances, even imagine raising a finger against the king). This world was shattered in 1649 with as much cultural trauma as the attack upon the Twin Towers in New York. Killing the King was understood by contemporaries as a blasphemy equivalent to the sacrifice of Christ. English republicans have struggled with this legacy ever since.
The strongest claims for the monarchy appear to be those that invoke tradition, historical continuity and the sanctity of the ancient constitution. A thousand years of regal majesty, evident in the still robust and bewitching spectacle of the Royal funerals of Lady Diana and the Queen Mother, seems an almost unanswerable argument. We should remember, though, that despite invoking the sonorous authority of tradition, the past is as much a projection of present-centred aspirations, an invented tradition, as it is a persisting truth. Put another way, celebrating the past does not necessarily mean living in it. Turning to the past may, however, give us something with which to compare current institutions. By asking historical questions – What did kings and queens do in the past? What was their function in the polity? How did subjects understand their duties and obligations to regal figures? Where did their authority to rule come from? – it might be possible to raise legitimate questions about the nature and function of the modern institution. However, the mere act of broaching these issues has commonly been dismissed as insolent and inappropriate mischief.
It has never been fashionable to be a republican in England, even in the heady days of the 1650s when the country was ruled by a Lord Protector in the name of the sovereignty of the people. The history of English republicanism, despite the persistent charges of conspiracy levelled against successive figures such as Oliver Cromwell, Thomas Paine, the Chartists and Willie Hamilton, has not been a lineage of subversive king-killers. In fact, English republicans have traditionally been more interested in making good citizens than in neutering extravagant monarchs. The persisting cultural memory of 1649, mixed in with nightmarish images of French sans-culottes, the guillotine and 1789, as well as the twentieth-century Russian and Spanish revolutionary traditions, has always successfully tainted republicanism with regicide. This political stereotyping has its origins in an eighteenth-century tabloid creation known as the ‘Calves Head Club’ – a clandestine fraternity that gathered each 30th of January to celebrate the execution of Charles the Martyr in a drunken and impious manner. The Calves Head men, almost certainly an invention of the fevered imaginations of loyal clergymen, became a powerful way of neutering any public political discussion of the rights, prerogatives and power of the monarchy: any such discussion, it was implied, would lead inevitably to regicidal action. The same logic still applies today in many quarters. That it does provides simple proof of the enduring centrality of the institution of monarchy in England.
English republican thinking, even in its most radical years, was very rarely regicidal. Admittedly the great apologist of the English Republic, John Milton, defended the execution of Charles I with robust and comprehensive arguments. But the intellectual origins of the majority of these arguments were not necessarily ‘republican’. A defence of popular sovereignty (the principle that the safety and common good of the community are more important than those of the monarch) and a description of both the rights and duties of resistance to illegal government were central components of a ‘democratic’ theory of government to which we all subscribe today. Very few people realise that the trial of Charles I was as much a religious matter as a political one. Indeed, the Old Testament was as important a document in his condemnation as the writings of republicans.
Although kings were subject to restraint, it should be noted that one of the major targets of this republicanism was ‘tyranny’: corrupt courtiers, corrupt government, and self-interested and immoral ministers were the objects of hostility. Bad government, irrespective of institutional form, was tyrannous. In other words, one did not have to live under a monarchy to experience tyranny. The same is true today, when any of us can have republican values and aspirations without starting with the immediate issue of kingship. Republicanism in the British Isles since the 1650s has very often been more concerned with making good citizens than with punishing wicked kings. Indeed, for many thinkers and writers, the issue of monarchy after 1689 was a side show to the bigger business of eradicating inequality, deference and oppression. The distemper of deference, and the extravagance of the costs of supporting a monarchy, were sometimes perceived as contributing to a more general political oppression, but the focus of ambition was turned (and continues to turn) to providing the political institutions that cultivate an active, virtuous, tolerant and just community. The problem with monarchy was not personal, or even financial; rather, it was resisted and criticised because it was a symptom of an ancient constitution riddled with anachronistic prerogatives and privilege. Given the extraordinary year that is now slowly drawing to a close, we may do well to remember that, in this view, the Houses of Parliament are just as corrupted and ‘monarchical’ as the Royal Family.
The First World War is not quite ancient history, but it is very much part of the past. All those who fought in the war have now died and even those with hazy childhood memories of the conflict are very few in number. So the ‘war to end all wars’ is no longer part of living memory. Yet in the summer of 2014, the outbreak of this seminal clash of empires was commemorated, with varying degrees of enthusiasm, across Europe and the wider world. Since then, the centenary anniversaries of many of the major milestones of the war have been marked by national governments and local communities and received extraordinary levels of popular, political and media attention. In Britain, the well-established rituals of Remembrance Sunday and the 1st of July have been imbued with yet greater symbolic weight and the Imperial War Museum has been completely reinvigorated, its lavishly overhauled galleries now offering a flawed but highly inventive and visceral impression of the British experience of the war. We’ve also seen the staging of some truly imaginative and moving commemorative projects, including the ‘Poppies in the Tower’ installation, which has become a traveling exhibition, and the remarkable ‘we’re here because we’re here’ living memorial ‘unveiled’ to mark the centenary of the first day of the Battle of the Somme in July. Alongside these, there have been a myriad of smaller but no less affecting community projects, which have seen schoolchildren born in the new century remember the dead in the company of senior citizens whose fathers fought in the war.
As we continue to journey through this period of intense commemorative activity, it’s worth emphasising just how unprecedented all of this is. In the history of the commemoration of conflict, nothing on the scale of what we have witnessed over the past three years has ever occurred before. The very early date at which the British government announced its intention to commemorate the centenaries and the sheer amount of government money being spent on commemoration were both quite new. As early as October 2012, almost two years before the centenaries formally began, David Cameron pledged to devote at least £50 million to commemorative projects, a substantial amount of money at a time when many communities in the UK were experiencing severe economic hardship. And the very fact that the outbreak of the war was commemorated by several national governments in August 2014 was a new departure indeed; traditionally, of course, the start of the war had not been commemorated, so this was a novel, and not uncontroversial, move.
But why has the British government been so determined to demonstrate its commitment to commemorating an extraordinarily violent war that took place a hundred years ago? A key reason for the political interest in centenary commemoration is the fact that remembering the First World War remains central to British identity and popular culture. Indeed, arguably no other historical event retains the same emotional resonance for the British as the so-called ‘war to end all wars’, and the commemoration of the conflict is something many British people take very seriously. This emotional attachment to the war ensures that even those who know little about what occurred between 1914 and ‘18 tend to have strong feelings about how the war should be interpreted and commemorated. We see evidence of this every year in the weeks before Remembrance Sunday, when there is always a certain amount of discussion about commemoration in the British media. There is nothing new in this, but the centenaries have really thrown both the positive and negative aspects of British commemorative culture into sharp relief.
In the year or so before the centenary commemorations ‘broke out’ in the summer of 2014, there was a particularly interesting and often heated public debate in Britain about the real meaning of the war and how it should be appropriately commemorated. The debate intensified with the publication of an article by the then British Secretary of State for Education, Michael Gove, in the Daily Mail on 2 January that year. This all seems very distant now that we remember Gove for other things, but two years ago he was a rather controversial Secretary of State for Education seeking to promote a narrowly Anglo-centric version of British history. In the article, he was scathing about what he regarded as the apparently dominant left-wing version of the war which, in his view, portrays the conflict as a ‘misbegotten shambles’ and thus denigrates the ‘patriotism, honour and courage’ of those who lost their lives in the conflict. His comments quickly met with robust and highly critical responses from his Labour counterpart, Tristram Hunt, and a range of other prominent commentators, including the historians Richard J. Evans and Antony Beevor. A group known as the Stop the War Coalition was particularly critical of his views on the war.
The German press also weighed in, with Die Welt printing an article on 9 January under the headline ‘Britischer Minister gibt Deutschen die Kriegsschuld’ (‘British minister blames the Germans for the war’). But Gove was also publicly defended by the then mayor of London, Boris Johnson, who went so far as to demand Hunt’s resignation and, in an article in The Telegraph, insisted that ‘Germany started the Great War, but the left can’t bear to say so’. So although Gove’s Daily Mail diatribe was a charmless combination of ignorance and self-righteousness, it revived a debate that reveals a great deal about British understandings of the First World War.
In very broad terms, the debate, which continues, divides those who, like Gove, regard the war as a bloody but necessary conflict in which British servicemen heroically achieved a great victory, and those for whom the war was little more than a futile exercise in mass slaughter. Most British people probably stand somewhere between these opposing views or, importantly, have no view. Yet the image of the First World War as the ultimate example of the futility of war, which was reinforced during the 50th anniversary of the conflict in the 1960s, remains very persistent in the UK. One striking feature of this emotive public discourse is that while there is a great deal of political, academic and popular disagreement on the historical interpretation of the war, everyone seems to agree that those who lost their lives between 1914 and ‘18 should be remembered with reverence and respect. So, unlike in other states that experienced the conflict, notably Germany, in Britain literally no one publicly opposes the custom of remembering the war and those who died in it. Even groups that are very critical of official, government-driven acts of commemoration, such as the Stop the War Coalition and the No Glory in War Campaign, have never, to my knowledge, gone so far as to question the wisdom or morality of commemorating dead soldiers.
As an act of community remembrance, or a simple expression of solidarity with our ancestors, the commemoration of war is not necessarily political. The millions of British people who wear poppies every year in the weeks before Remembrance Sunday are not making political statements by so doing. Nor are they retrospectively endorsing or honouring the First World War, or any war since. What they are doing – at least on the face of it – is honouring the dead. And yet the intense and generally exclusive focus on the dead is perhaps the most obviously problematic aspect of British commemorative culture. An undivided emphasis on the sacrifice of those who died as a result of military service made sense in the 1920s and ‘30s, when millions of people across the globe were still suffering intense bereavement. We should also remember that, whatever we feel about the cause for which they were fighting, the dead gave everything they had; they made the ‘ultimate sacrifice’ in the parlance of the times, and this alone makes them worthy of our interest. The relative youth of those who died makes their stories all the more poignant, and we are understandably moved – and disturbed – by the power of industrialised warfare to cut short so many lives. Indeed, reflecting on the fate of the dead helps us to appreciate the catastrophe of inter-state conflict.
And yet a commemorative culture that focuses exclusively on the dead arguably overlooks the vast majority of people who were affected by the war. Approximately 750,000 British and Irish servicemen died as a result of service in the First World War. But many more, in both countries, technically survived the conflict but were physically disabled or psychologically traumatised by their experiences. As soon as the wounded began returning from the fighting fronts in 1914, and for decades after the war, severely disabled veterans were a common sight on the streets of European cities. Men who were psychologically traumatised were perhaps less visible but no less numerous than the physically disabled. Allied and German soldiers were remarkably resilient and cases of shell-shock, although common, were not as widespread as we might think. But tens of thousands of soldiers from across the British Empire were traumatised to the point that they simply couldn’t function in civil society and most veterans suffered some form of psychological distress. The pain of witnessing this mental torment – and the various, sometimes violent ways in which it manifested itself – must have weighed very heavily indeed on the families of those who returned from the front, physically intact but visibly altered by what they had seen and done.
Which brings us to those who played no direct role in the fighting but were nonetheless deeply affected by the violence that raged around them. Millions of families across Europe and the wider world suffered from profound and enduring grief when their sons, husbands, brothers and friends were killed or mortally wounded in the theatres of war. Death, even the sudden death of young people, is by no means unique to war but during the First World War the pain of bereavement was often particularly traumatic. Many of those in mourning knew little of the circumstances in which their loved ones had died, and were denied the consolations of a funeral or a grave on which they could focus their feelings of loss. In a move that was unique to the British Empire, the grieving relatives of the dead that lay in identified graves were granted the right to pay for a customised epitaph. These personal inscriptions give us an extraordinary insight into the mentalities of those who neither fought nor died but on whom the violence of the war had a very direct and lasting impact. Indeed, the often moving personal messages from grief-stricken relatives remind us that every headstone we see today in a Commonwealth cemetery marks the grave of a dead serviceman, but it also represents a lost son or husband or brother and a family in mourning.
Many of us feel – on some level – that the bereaved, the disabled and the traumatised are central to the story of the Great War, and are as deserving of our attention as ‘the fallen’. And yet the popular and official language of 21st-century commemoration rarely alludes to the plight of those who were left behind, and there is little room for them in our official commemorative ceremonies. To its great credit, the Royal British Legion began incorporating disabled servicemen into its ad campaigns in 2013, when Lance-Corporal Cassidy Little, a Royal Marine who lost his leg while serving in Afghanistan, featured in the poppy appeal campaign. The sight of a soldier who had very clearly lost a limb on posters throughout the UK reminded us that war and those mutilated in combat remain a part of British life in the 21st century. Other organisations and individuals have also drawn our attention to veterans who have suffered life-changing injuries, including Prince Harry, who helped establish the Invictus Games in 2014. And yet no disabled servicemen were formally incorporated into any of the major centenary commemorative events that have thus far taken place. In a more historical but equally glaring omission, the hundreds of thousands of disabled soldiers who were demobilised during and after the First World War are simply never mentioned in the national discussion on commemoration.
Added to this rather one-dimensional retrospective understanding of wartime sacrifice is a superficial valorisation of all those who served. While it would be wrong to suggest that British commemorative culture glorifies war, it does arguably venerate dying in war to the point of glorification. The focal point of the official Remembrance Sunday ceremonies each year is Edwin Lutyens’ great Cenotaph in Whitehall, on the side of which are inscribed the words ‘The Glorious Dead’. During the war and in the decades that followed the Armistice, it was important for those in mourning to believe in the righteousness of the Allied cause. The belief that their son or husband or brother hadn’t died in vain, that the cause for which he had given his life was glorious, clearly gave comfort to many of those in mourning across the Empire. Indeed, our ancestors’ interpretation of the war as a just, necessary and even glorious conflict is quite understandable. The war was an extremely personal business for those who experienced it, either as civilians or servicemen, and they were very emotionally invested in it. That same interpretation should, however, be a lot less understandable today.
Expressing solidarity with our ancestors and empathising with their pain is perfectly human, and even healthy. For cultural historians who explore the mentalities and emotions of past generations it’s also a professional skill. But a completely uncritical retrospective understanding of soldiers who fought in this extraordinarily bloody conflict is problematic because it tends to suggest that all the morality during the First World War was on one side. This widespread view was reflected in media commentary on the war throughout 2014, in which historians and others highlighted the German violation of Belgian neutrality and the mass killing of unarmed civilians by members of the German armed forces. Such mass killings of civilians definitely occurred during the invasion of Belgium and northern France in 1914 and in the U-boat and airship raids that began in 1915 and continued for much of the war. As uncomfortable as such historical incidents may be politically for German leaders, and indeed for ordinary German people, it is important that we become familiar with them if we want to understand the ‘total’ nature of the war. That the war was sold to and understood by the people of these islands as a righteous endeavour is key to understanding the British and Irish experience of the conflict. And German atrocities – both those that were fabricated and those that were all too real – were central to the process of cultural mobilisation across the United Kingdom in 1914 and ’15.
It is important, however, that we acknowledge that the armed forces of the Allied states were also directly involved in the taking of civilian lives between 1914 and ’18. The most obvious example of this was the Allied naval blockade of the German coast, which was orchestrated by the Royal Navy and thus very much a British enterprise. The North Sea was formally declared a ‘British military area’ in November 1914 and the blockade essentially lasted from then until German representatives signed the Treaty of Versailles in June 1919 (fully 7 months after the Armistice). By mid-1915, German imports had fallen by 55 per cent from pre-war levels, leading to major fuel shortages but also to serious and increasing scarcity of vital foodstuffs. By 1917, disorders related to malnutrition, including scurvy, tuberculosis and dysentery, were common across Germany. Official statistics gave a figure of approximately 760,000 for those who died of malnutrition as a result of the blockade.
The blockade was certainly a factor in German capitulation in 1918 and the outcome of the First World War. It also significantly influenced the experience of the war for ordinary Germans and evidently led to the deaths of many thousands of civilian men, women and children. Yet in all the commentary on the war that has pervaded the British media since the beginning of 2014, I do not recall a single reference to the Royal Navy’s role in the taking of German lives. Nor has there been any mention of the hundreds of unarmed civilians who were killed by British servicemen – invariably veterans of the Great War – in the immediate aftermath of the war in India, Mesopotamia and Ireland.
I do not raise this issue of British complicity in the killing of civilians to denigrate in any way the conduct of British soldiers on the Western Front or at Gallipoli or elsewhere during the First World War. Many British officers and men, and indeed soldiers from all sides, served in a plainly self-sacrificing and often heroic fashion, and we are understandably impressed by their stories. But there is a persistent popular belief in the UK that the British soldiers of the Great War were victims of violence rather than perpetrators – not both – and that they are morally beyond reproach. If we genuinely want the centenaries to become a moment in which we improve our understanding of the ‘war to end all wars’, this should ideally change.
Finally, there is a striking and increasingly jarring contrast between the great dignity and reverence with which most British people remember the war dead and the tone of the national conversation about commemoration. Over the past few years, press commentary during ‘remembrance season’ has become ever more antagonistic, and we’ve seen the emergence of the particularly unpleasant practice of calling television personalities and other public figures to account for not wearing the poppy. Jon Snow is a well-known example, but there have been others. This year, the Daily Mail has led a campaign to force the FA to go against FIFA and allow English and Scottish footballers to wear poppies emblazoned on their shirts when they play against each other on Armistice Day. ‘Poppy War’ screams the front-page headline. At the risk of stating the obvious, this press-manufactured spat over football jerseys is not a war. It is nothing remotely like a war. And using shrill, deliberately exaggerated language of this kind arguably does a great disservice to the memory of the men who fought and died on the Western Front and elsewhere. Historians still disagree about the motivations that led so many men to enlist or seek commissions in the British Army in the first year or so of the war. We can be quite sure, however, that whatever they thought they were fighting for, it wasn’t self-righteousness, invented indignation or gutter jingoism.
The British have every reason to be proud of their highly distinctive culture of commemoration. The sincerity and dignity with which most British people – irrespective of class, or status, or race – remember the dead each year is truly impressive and admirable (particularly to a foreigner). But if we genuinely believe that the dead of the First World War are worth commemorating, we should seek not simply to remember them, but to understand them. We should thus take time to reflect upon the totality of their experiences, to think of those they left behind, and to appreciate the remarkable colour and complexity of the world in which they lived and died.
Edward Madigan is Lecturer in Public History and First World War Studies at Royal Holloway, University of London.
The autumn is upon us. And Poldark is back! The images of the beautiful Cornish coast around Treen, Porthcurno, and St Michael’s Mount are welcome visitors to the screen as the grim dark nights draw in. The television series, reborn from the novels of Winston Graham and the earlier screen adaptations of the mid-1970s, continues to attract considerable attention from the general public and historians alike.
Recent posts on this blog from Sarah Crook and Graham Smith have raised some very interesting questions about gendered perceptions of public history, both in popular books and on television. Following on from this commentary, I’d like to consider the ways in which historical writing and research have inflected the production and reception of the BBC’s very successful fictional history of eighteenth-century Cornwall.
Hannah Greig, historical advisor on Poldark, and Greg Jenner, of Horrible Histories fame, have already offered some very insightful views regarding the role of the historical advisor in contributing to the ‘accuracy’ of fictional representations of the past. For both Hannah and Greg, the priority of a television drama is precisely that – the drama must come first: ‘drama is there to entertain us. Dramatists are there to spellbind us, to make us laugh and cry and fear for our favourite characters’. Hannah confirms and explains: ‘the most important thing is to have great story. That has to be the priority. A historical adviser can help to drive that story forward, informed by what we know about the past’. Historians are thus not invited to ‘determine what that story is’, but to inform the ‘look’ and conduct of the action. And yet I feel we should at least consider the idea that complementing the programmes with further historical context might make the drama all the more compelling and resonant.
The historical preparation for the portrayal of mid- to late-eighteenth-century rural, coastal and town life in Poldark has been meticulous: the details of dress, commerce, urban sociability and gentry etiquette have all been scrutinised by the learned and the expert. Whether a pasty contained rabbit or deer might depend on the local ecology or the skill of the poachers, and the details of costume, deportment or the pistols and rigging are useful markers of historical ‘accuracy’, if not necessarily carriers of truth. In strictly dramatic terms, the narratives in Poldark are compelling, blending the personal, the emotional and the political in a very challenging and provocative way. Over the course of successive Sunday evenings the heroes and villains, the scoundrels and the indigent, encounter each other in a variety of social, institutional, cultural and legal settings. These encounters also expose the deeper seams of eighteenth-century life: the rule and administration of law by local elites, the impact of commerce on the routines of customary economic practices, and the complexity of popular and parliamentary politics. Yet while much has been made of the visual reconstruction and the marvellous acting, the more profound themes of gender inequality, class war, the ‘old corruption’ of public politics before the days of the secret ballot, and the abject poverty of rural labour have not been teased out in the reviews, although they are, in effect, the sinews of the narrative’s power which keep us engaged.
As those familiar with the early-modern social history of ideas and crime may have realised, the narrative of Poldark conveys very powerfully one of the key insights of Edward Thompson’s work: that although the rule of law in the eighteenth century was contrived to protect property, it was also bound by its own authority. The role of the jury in freeing Ross Poldark from the noose, for example, represents the significance of the tradition of trial by jury in the administration of justice, enshrined in the birthrights of freeborn men and women. The freedoms of the radical John Wilkes were preserved by this process in his defence of liberty in the 1760s, when Middlesex juries repeatedly protected him from conviction. The current episodes of Poldark engage with the histories of the complex processes of social mobility which drove, and were driven by, the marriage market; the crises of familial relationships that shaped reputation and authority; the dangers of gambling and the financial markets; and the hard grind of the everyday lives of ordinary people.
Although the original novels were written in the immediate post-war contexts of the mid and later 1940s, they have been made more directly historically interesting by the growth of social history in the 1970s. The ages of Walpole, and then the Pitts, elder and younger, were not simply made up of stories of meticulously landscaped country houses, glittering society balls and the routines of polite culture: they were times of revolution and turbulent class struggle. The American War of Independence saw a great diffusion of radical commonwealth ideas across the Atlantic. At home the popular resistance manifest in the campaigns of John Wilkes for the liberties of the freeborn English, and later in the French Revolution, offered radical opportunities for protest and freedom in Europe, including in Ireland during the bloody United Irish rebellion of 1798, and for the ‘Black Jacobins’ of Haiti (see C.L.R. James’ powerful study of Toussaint L’Ouverture). In Britain, ‘riots’ prompted by political ideology or economic desperation reflected the increasing dominance of ‘King Property’, and the progressively rapid destruction of what the great historian Edward Thompson called ‘customs in common’. Labourers, artisans and skilled workers – both men and women – saw the traditional means of regulating their working hours, and providing for their families, constrained and disrupted by the demands of the market and the ever-powerful coercive legal code which Douglas Hay referred to as ‘Albion’s fatal tree’. Smuggling, poaching and wrecking were all subjected to criminal codes of brutal savagery.
Poldark addresses many of these themes in the social history of crime and society explored in the great and formative works of historians like Edward Thompson (Whigs and Hunters, 1975), and the collection of essays exploring the lives and deaths of labourers and city workers (Albion’s Fatal Tree, 1975). Marcus Rediker wrote a wonderful book on the Atlantic world of pirates and seamen some thirty years ago (Between the Devil and the Deep Blue Sea, 1987), while Peter Linebaugh’s The London Hanged (1991) explores the lives of those who were victims of Tyburn in the struggle between rich and poor. The criminalisation of what were regarded as customary rights, in the name of defending property and order, is the backcloth to the struggle of Poldark and his friends. Commerce and maritime innovation may have brought new commodities to the banqueting tables of the gentry, but they also destroyed the system of regular employment which enabled the poor and labouring to survive by helping themselves to the reasonable benefits of their labour (known then as perquisites, or in our modern world ‘perks’). Exploring these histories will make the viewing of the series even more exciting.
For those interested in the histories of smuggling, poaching and the highwayman there is an alternative fictional series which seems to have been forgotten. The ‘Dr Syn’ novels of Russell Thorndike, written during and after the First World War, and set in the smuggling culture of Romney Marsh in Kent and Sussex, combined smuggling, piracy and politics. Thorndike’s novels travel widely, involving Caribbean characters, pirates and American revolutionaries: this might provide a much more diverse palette for the modern viewer. These novels have been serialised on the radio (read by none other than Rufus Sewell, who recently starred in Victoria), and indeed were turned into a series of graphic novels and reasonably gentle Disney films. There was a ‘Carry On’ version in the 1970s, the great Led Zeppelin recorded a song, ‘No Quarter’, drawn from the stories, while ‘The Day of Syn’ is a festival held in the town of Dymchurch to fund-raise for local community activity. A modern script-writer might work with the novels, but also explore them alongside the new Atlantic history inspired by the landmark histories of Peter Linebaugh and Marcus Rediker, whose work The Many-Headed Hydra: The Hidden History of the Revolutionary Atlantic (2000) explores the communities of slaves, commoners and sailors who resisted economic and social oppression from elites and mercantile interests across the West Indies, Africa and North America. Weaving those ‘real’ histories into a fictional narrative would be a great challenge, but it would also produce a very attractive and diverse series that allowed different voices and characters to perform in mainstream viewing. Perhaps an enterprising commissioner at the BBC or Channel 4 will explore the possibilities of creating a further series? Let’s hope so.
Sexed-up television histories, it seems, are just for girls. Histories, that is, that embellish and dwell on human relationships, that exalt the tactile thrill of the inadvertent touch, that are attentive to the colour a frisson of desire can add to the way we tell stories. Or so a recent article in The Spectator by James Delingpole would have us believe. The article, which was subsequently amended to remove some of its more outlandish claims, put forward some quite brazen generalisations about how men and women approach the past. ‘Boys’, the author postulates, ‘being of a more trainspotterish disposition’ would be more critical of the recent ITV series Victoria, for they are ‘more jealous of their facts and period detail’.
The critical response to the unamended article from historians on Twitter was swift and brutal. This was followed by a more circumspect line: Delingpole is a provocateur, it was claimed, and the academic community should not engage with his trolling. But as other historians pointed out, Delingpole’s view that women’s interests are less intellectually rigorous and factually oriented than men’s is less unusual than we might hope. My own view, and that of others, is that we have a responsibility to attack sexism as and where we find it. Beyond this, the article raises two issues for historians interested in the public representation of the past. First, and perhaps less controversially, whether the purpose of popular portrayals of prominent figures is to inform as much as to intrigue and entertain. Second, whether women are driving the ‘MillsandBoonification of history’ while men are the dispassionate stewards of historical fact.
It is impossible to lay the sexism of the article to one side. The dichotomy it draws between men and women trespasses from its representation of history into its representation of characters. Rufus Sewell ‘smoulders so tastefully’ as Lord Melbourne (he acts) while Jenna Coleman ‘looks gorgeous’ as Victoria (she exists). Moreover, a collective sigh surely arose at the declaration that the author ‘blamed the ongoing feminisation of culture’ for the direction of the series. This feminisation, the author suggests, drives an ‘irresponsible’ history. But if we scale out from the article to examine the landscape of historical dramas more broadly, can we really say that this attentiveness to desire and romance is a peculiarly feminine trait? Are women responsible for driving men off the sofa on a Sunday night? The author makes it clear that they are – and not to bed with a serious historical tome, either – rather, he argues, men are driven off the sofa by romantic dramas to ‘cavort with rent boys’.
The briefest of journeys through films situated in the (sometimes mythical) past suggests that men happily sex up their representations: Troy (dir. Wolfgang Petersen); Pearl Harbor (dir. Michael Bay); Braveheart (dir. Mel Gibson). Male directors and writers evidently find it just as easy to elaborate, extemporise and appeal to emotion. When men take the reins they are inclined to emphasise the human relationships that underpin events regardless of whether they envisage a male or female audience. In dramas emotions often do the work of explaining complex historical convergences. Does it matter if these desires are fictitious, that it is unlikely that Queen Victoria and Lord Melbourne ever exchanged looks charged with sexual intent? While this might grate for historians (we are, after all, ‘for History’), we must first be wary of doing a disservice to viewers by infantilising them as an uncritical public.
My own view is that the best historical dramas do inform as they entertain. As historians are only too aware, and at the risk of being platitudinous, the truth of history is often more scandalous and more intriguing than dramas allow. It is a shame that Victoria marginalises the genuine political and social tensions in favour of a fabricated romance. But laying the blame for this at women’s feet is simply laughable to today’s historians. Rather, Delingpole’s piece brings to mind the anxiety raised by novel-reading in the eighteenth century or the consternation over the proliferation of girls’ magazines in early-twentieth-century Britain. A culture that indulges fictive representations of lust is often considered risky and threatening. As for his claim that ‘mostly men … value history’? Just imagine how he’d react if he found out that some women not only value it, we also teach it.
Sarah Crook is the Cox Fellow in History at New College, Oxford. She completed her PhD on mothers and depression in post-war Britain at Queen Mary, University of London in September 2016.
Just over a week ago The Guardian published an interview with Rebecca Rideal, whose narrative history 1666: Plague, War and Hellfire has just appeared. The interview provoked a number of historians to tweet criticism of Rideal, a PhD student and former TV producer who founded The History Vault. Her assertion that ‘The time of the grand histories that are all about male figures is coming to an end’ seems to have touched a particularly raw nerve. The common complaint was that Rideal had failed to acknowledge that the fight against ‘great men’ histories had been waged for over three decades.
I have some sympathy with these grumblings. Back in 1982, I returned from completing an MA in Social History at Essex to my first university armed with a poster for Leonore Davidoff’s course. I was just pinning it to a noticeboard when the department’s senior professor of economic history spotted me and declared, ‘Women in History, Graham? Whatever next?’
However, as others have pointed out, the fact that the struggle to go beyond hegemonic discourses continues suggests that winning once is not enough. My belief is that evidence of a new generation reinventing ways of taking up that fight should be a cause for celebration rather than condemnation. As tends to happen on Twitter, battle-lines were drawn, allies and enemies were quickly made and exchanges sharpened after those initial criticisms of Rideal. On one side were historians who clearly identified with Rideal, especially those aiming to make a living from producing popular histories. On the other, for the most part, were historians working in universities, some of whom began to question whether Rideal was even qualified to write early modern history.
Spiralling sub-fights, with supporters weighing in from various camps, fed a debate that became increasingly acrimonious. There was also the usual Twitter-induced comic confusion – it is not always clear who is responding to which strand as arguments fork, overlap, separate and loop. Nevertheless, things were very evidently turning nasty.
Responses – public history
A common response during the course of the spat and afterwards has been to present the ‘history profession’ as broad enough to encompass both those working inside and outside of universities. Such claims were underpinned in most cases by the argument that Rideal is engaged in ‘public history’. Leaving aside the rather odd formulation of the ‘history profession’, with its Rankean pretensions, intellectual insecurities and constant discipline-making, patching things up with another poorly conceived label seems an inadequate way to proceed. Instead, moving the debate forward will require genuine reflection on the nature of ‘history’ as a profession; otherwise we will continue to descend periodically into bickering and trading insults.
One difficulty amongst historians in Britain is that public history is not as well developed or understood here as it is elsewhere, especially in North America and Australasia. We tend to talk about public history as history that is produced outside of university departments – an activity, such as a television history. Or sometimes we stretch this base definition to include public history as impact, especially the influence of historical policy research translated for the consumption of publics or politicians. But the roots of public history are older, and the acrimony of the recent Twitter battle reminds me of a wider war.
In those very early and heady years of the 1980s, I had left Stirling University to learn about oral history at Essex. At the time, oral history was despised by ‘professional historians’, rather than generally misunderstood or dismissed as is the case now. The economic historian I referred to above taught me in my final undergraduate year and, on being approached for a reference, recommended that I should continue to study with him. By so doing, he insisted, I would be able to take ‘a panoramic view of the past, rather than going down in the dirt with the yokels’. My response to such unashamed elitism was to attend his final seminars dressed in a top hat and frock coat bought from the local Oxfam.
These days, the battles within ‘the profession’ are mainly over resources and too often fuelled by egotism. With its proponents organised into warring tribes according to the periods and places they study, or corralled into sub-disciplinary groupings, History is fractious even within the academy. In all of this sound and fury, and despite constant internal sniping, the discipline has been traditionally slow to innovate and much of the sparring is about maintaining rather than extending boundaries. It is worth noting, for example, that those pioneering courses in women’s history and oral history at Essex were taught in the Sociology Department. While members of other disciplines frequently offer support for new ideas, historians – too often operating as lone scholars – revel in knocking lumps out of one another, reserving particular spite for those who try to innovate. The result is that in open competition for resources, most obviously for research grant income or in the formation of mutually beneficial research partnerships, historians do not achieve the same results as, say, political scientists or human geographers. Nor are we as prepared to look after our researchers or early career colleagues as would be the case in economics or sociology.
So what can public history offer? In answering that question I’m alluding to historians who actively research and publish as reflective public historians and are not simply making up the numbers in the history commentariat. Drawing on the work of early oral history, at least some public historians have developed a greater sense of working in partnership, and have come to genuinely appreciate the notion of ‘shared authority’. This, at its most basic, is the recognition of different forms of expertise, and was developed in response to the simple question of who the authors of oral history interviews were. Was it the (oral) historians conducting the interview? Or the interviewees, who were specialists in their own lives and always much more (otherwise why bother interviewing them)? Or both?
It cannot be beyond the capability of historians, irrespective of where they work or what they work on, to collaborate on projects in a spirit of shared authority. While rigour in handling evidence, in broader interpretation and in writing should be upheld, there is much to be gained by recognising that we may all be engaged in a common project that goes beyond individual conceptualisations or where we work. Just recognising that connecting with members of the public involves a different skill set, and that the ways in which we communicate should become the subject of historical study, would be a major step forward. Even more pressing is the need for greater recognition that large numbers of people, especially in Britain, are often deeply invested, passionate and knowledgeable about history. The notion that ‘we’, whether ‘we’ are in community or academy settings, are the arbiters or the sole traders of the past is pure delusion. The idea that there is still a great deal to do within our imagined profession, even after a peace treaty is declared, should keep us all busy and out of Twitter trouble.
Political scientists are already mining Twitter for research, most notably on its use in revolutionary situations. One recent study has pointed to the significance of Twitter as a means of ‘collective sense making’ during times of instability. It will be interesting to see what historians make of Twitter in the future. As an echo-chamber for congratulatory thought collectives or as a means to conduct acrimonious debate, the 140-character-a-time medium will offer rich evidence of the historiographical making and unmaking of ‘us’ and ‘other’.
Oh, O., C. Eom and H. R. Rao, ‘Role of Social Media in Social Change: An Analysis of Collective Sense Making During the 2011 Egypt Revolution’, Information Systems Research, vol. 26, no. 1 (2015), pp. 210–223.
The recent photos in the media showing armed police apparently forcing a Muslim woman wearing a burkini on a French beach to remove it, or alternatively some of her outer clothing, in public, and then seemingly fining her, highlight beautifully the challenges facing historians in a post-modern historical world. What the ‘facts’ of the matter really are is no longer relevant. It is what we believe to be happening that counts, and so it is our interpretation of those facts that matters. Whether or not it was really a dreaded burkini – an outfit ‘not respecting good morals and secularism’; at best unhygienic, or at worst, to quote the French Prime Minister, part of the ‘enslavement’ of (Muslim) women – this episode underlines yet again how central Muslim women’s bodies are to wider questions of identity, community and ‘modernity’. For the last couple of centuries Muslim women have been under close scrutiny in terms of what they wear, or do not wear. Their sartorial choices have not been individual choices. Rather, they have so often been the litmus test for ideas about progress versus non-progress, however these two terms might be understood.

Interestingly, back in the 1850s, when the US activist Amelia Jenks Bloomer pioneered the wearing of the loose pantaloons that came to bear her name, western women followed the example of so-called eastern (Muslim) women and adopted ‘the Turkish dress’ in order to liberate themselves from the restrictive clothing – complete with bone-crunching corsets – that dominated at that time. Of course, bloomers in due course retreated to the private world of Western women’s underwear. Burkinis, like other supposedly threatening forms of covering worn by twenty-first-century Muslim women, have to be seen, like the women who wear them, in public. Surely their public presence is a good thing?
See here for an informed discussion of when and why items of clothing have caused political storms.
Sarah Ansari is Professor of South Asian History and head of the Department of History at Royal Holloway, University of London.
The Brexit referendum was about something far bigger than Britain’s political and economic relationship with the rest of Europe. School pupils and university students spontaneously broke into tears on the morning the results came in, seeing their future life prospects destroyed. At the same time, people who look or sound different were told by triumphant leavers to ‘pack your bags and go back where you came from’, across the country and without any apparent coordination or official political backing. Such happenings are ominous, and they become more ominous still if serious incidents such as the murder of an MP as a perceived ‘traitor’ to the nation are factored in. Many of us historians have developed a special sense for such moments because we are trained to connect the dots intuitively and imaginatively. We have seen similar outbursts of collective emotion in the past and know what they can harbour – situations like 1789 in France, 1947 in India, 1990 in Yugoslavia.
There has been a widespread sense of disquiet about the state of the world for some years now. The ability to visualize a better future has never in living memory seemed so remote. It feels as if Francis Fukuyama’s much maligned ‘end of history’ has been stripped of its messianic optimism and yet never gone away. A dark cloud of ‘there is no alternative’ has been hanging over us. There have been global pandemics like swine flu or Ebola, environmental disasters and unusual natural events, and the diffuse threat of Islamic terrorism has combined with a sense of economic crisis to produce a generalized climate of fear and foreboding. Yet until Brexit struck, this sense of impending doom still seemed somewhat intangible: perceptible below the surface but not powerful enough to disrupt the order of everyday life. Now, it feels as if things are at last kicking off for real. One historian on Twitter began to wonder half-jokingly whether people a century hence will speak of the ‘generalized crisis of the early 21st century’; others, whether the year 2016 would be remembered as the date when the dissolution of the world as we know it began in earnest.
In times like these, history can be a great consoler. By standing back and contemplating larger connections and storylines, the febrile mind can at last find a grip, a resting place that offers some sense of ‘taking back control’. But such consolation should come with a health warning. Getting a grip is not the same as optimism, let alone offering a workable vision for political action. Some of the best long term analyses have been driven by the experience of defeat. Think of Fernand Braudel, who discovered the agency of geographic features over the longue durée when incarcerated in a German prisoner of war camp. Or of Antonio Gramsci who wrote his exceptionally perceptive interpretations of history from a fascist prison cell. It is Gramsci’s ‘pessimism of the intellect’ rather than his ‘optimism of the will’ that colours the way we see the world today. Making sense of Brexit within a larger historical framework is like staring into the abyss in order to make one’s fears more manageable, an exorcism by anticipation, perhaps.
The books that I felt most compelled to revisit in response to the Brexit crisis all deal with the historical sociology of capitalism. I had come across some of this material as a student in the early and mid 1990s, but had until recently lost sight of it as I explored other intellectual territories. Most of it is of Marxist provenance, broadly construed – more precisely of North American Marxist provenance, where big-picture analyses of the global economic system have received particularly careful attention. Relevant names include (among others) Immanuel Wallerstein, Saskia Sassen, David Harvey, Royal Holloway’s own Sandra Halperin, and, somewhat peripheral to this tradition, the German historian of capitalist crises, Robert Kurz.
There are several reasons why this body of literature seemed particularly appealing when trying to make sense of Brexit. In the first instance, the referendum has been accompanied by the ongoing self-destruction of the British Labour Party, and a general reassessment of left theory and practice, going back to first base, seemed appropriate and pressing for the moment. Before we can even argue about what kind of politics we now need, we need to know where we stand in terms of the big picture. In addition, the emotional flavour of this historical sociology chimed with the post-Brexit blues. The authors involved were all more or less shaped by the experience of belonging to an intellectual tradition – Marxism – which had not the slightest chance of wider political relevance where they lived and worked, the USA. But they carried on writing regardless, and with a heightened sense that at least intellectually they could defeat an otherwise overwhelming system. And, perhaps most importantly, there seemed to be an immediate fit with the empirical evidence. The map of how Britain voted over Brexit – with ‘remain’ areas coloured yellow and ‘leave’ areas coloured blue – was an almost perfect illustration of some key arguments that had been made in this body of scholarly writing.
The historical sociology of global capitalism, then, is the grand narrative which can help us situate Brexit, its causes and consequences. It allows us to read the referendum result as an outward sign – local and specific to the UK – of a much wider structural contradiction which is currently transforming the world as we know it. What is at stake is enormous, and holds the frightening but real possibility that our future may be a good deal less democratic than our present.
Let us begin with a key contention: over its long historical formation, capitalism has only occasionally and in very particular locations marched in step with the two other trademark institutions of modernity, the nation state and democracy. For most of the modern period, capitalism has produced economic and political geographies that cut across the nation state in various ways, and relied on means of political organization that involved some degree of authoritarianism and coercion.
This is immediately evident when we take a global view of capitalism in its formative period from the 17th to the 19th centuries. This was never just a story of an industrial (or as some would have it, an ‘industrious’) revolution in one particular territory such as Britain, operating in tandem with the creation of new citizens equal before the law, and their gradual incorporation into political decision-making. There was always another side to it: first, what Marx originally called ‘primitive accumulation’, the forcible appropriation of peoples, places and goods in the first rush for capital accumulation. It was exemplified by the enclosure of common lands, the infamous Highland Clearances or the robber-baron colonialism of the East India Company. Then came the no less brutal but more systematic disempowerment of the majority of the world’s population on racial grounds under European empires. In some places, and for three centuries or more, this included capitalist slavery, the most coercive system of labour management imaginable. The link between authoritarianism and capitalism did not end there. The twentieth century brought forth capitalist dictatorships around the world, including communist dictatorships. Against other sections of the Left, much of the literature under review argues that communists were politically successful in many parts of the global South and underdeveloped East not because they were anti-capitalist, but, on the contrary, because they offered a turbo-charged version of capitalist development directed and monopolized by the state. This is what Lenin’s famous celebration of ‘electrification of the whole country’ and Mao’s ‘great leap forward’ were all about. The untold horrors committed in the name of communist development are as much part of Robert Kurz’s ‘black book’ of global capitalism as modern slavery.
Whatever may have changed over these centuries, capitalist development was rarely confined within national boundaries (the communist path to development is a possible exception). More often than not, the global capitalist elite was transnational in orientation. The members of this elite shared a common culture built around such things as opera, a love of renaissance art and a classical education. They intermarried across borders. They owned assets in several countries. Friedrich Engels, a German industrialist with strong British connections, was by no means unique; even a quintessentially German company like Siemens had British as well as German family branches before the First World War. Capitalism constituted, as Immanuel Wallerstein once famously called it, an emergent ‘world system’ linking metropolitan core areas in Europe with colonial and semi-colonial peripheries around the globe. Colonial Guyana in South America provides a perfect example: labourers came from Africa (initially as slaves), from India (as part of the indentured servitude system) and from China; they worked plantations owned by international shareholders to produce cash crops like sugar, which had to be shipped across the Atlantic to be sold to mostly working-class consumers in Europe’s industrial heartlands. The profits went into anything from railway companies to city banking houses and village church renovations. It should be noted that this self-fuelling and highly exploitative system continued, and became more efficient, in the 150 years after slavery was formally abolished in the region. At their most productive, around the mid-twentieth century, the sugar plantations of British Guiana were unique in that they produced not one but two annual crops.
There was only one relatively brief period when the common-sense picture of capitalism applied, when a flourishing ‘national’ economy coincided closely with the borders marked on a political map, and when economic reproduction went hand in hand with democratic governance. This was the time between the end of the Second World War and the emergence of a neo-liberal economic system in the late 1970s. Over these three decades – but even then not everywhere – an economic logic of industrial manufacture and mass consumption based on rising incomes across the working and middle classes, of state planning, and of a more or less consensual form of politics prevailed. This is capitalism as it is most familiar to us: of workers manufacturing goods like cars or TV sets and earning enough to then buy these same items back for their personal enjoyment. This system still relied on the basic Marxist category of exploitation, but it nevertheless functioned for some time as a self-sustaining engine of prosperity creation for the many.
All this began to change in the aftermath of the 1973 oil crisis, which ushered in a global recession. Capitalism survived and reinvented itself, but with a new modus operandi that had become firmly entrenched by the 1990s, and that is ultimately responsible for the dislocations that Brexit brought to the fore. The main method of surplus creation shifted from labour exploitation to financial speculation and ‘securitisation’, a new form of primitive accumulation by stealth, as Saskia Sassen describes it. This new system no longer requires people to be turned into capitalist labourers or consumers, for it can create wealth without people. For the first time, this means that capitalism is no longer expanding – no longer seeking to bring more people and territories under its control, as it had done for centuries. Instead it grows richer by ‘expulsions’ (again Sassen’s term), by getting rid of people in order to speculate with what they leave behind: sub-prime mortgages foreclosed in dying American cities, landscapes ransacked by fracking or mining, bodies plundered for organ donations and patented for medical copyright, whole populations killed or displaced while the international arms trade makes a fortune.
Even though the wealth concentrated at the top has increased enormously since the days of the welfare state, this is not a self-sustaining system. As some historians of capitalism have pointed out, capitalism may well have entered a final crisis mode. The current slowing of growth around the world may be the first concrete evidence of this disturbing trend. While the rich economies of the North are in or close to recession, countries that have previously been held up as hopes for a globalised future are in deep trouble, too. China sits on a mountain of real-estate debt while its economy slows, and India has had to falsify official data from its central bank to maintain any semblance of economic growth outperforming population growth.
The geographic shape of the new system once again transcends national boundaries. There are new and reinforced global links cutting across the old division between a rich North and a poor South, exemplified by new areas of global ‘outsourcing’ and glittering global cities on all inhabited continents. The new structures also create growing regional disparities within national economies, between the nodal points of a still thriving global network and areas of ‘expulsion’ for which the system no longer has any use. This brings us directly back to the referendum map mentioned above. The yellow areas voting for ‘remain’ in England and Wales coincided almost perfectly with areas that still have a stake in the global economy: London and its wealthy hinterlands, the M4/M40 corridor of Berkshire and Oxfordshire, Britain’s knowledge cities like Manchester, Cambridge, Leeds or Aberystwyth. Those areas that voted ‘leave’ by the largest margins, in contrast, were the old industrial heartlands and rural areas now left behind.
This explanation of the geographic shape of the Brexit vote is powerful but not in itself particularly original. Many have made this point without recourse to Marxist meta-history, not least ex-Prime Minister Gordon Brown, who demanded that globalization had to work for everyone, not just the few. Other commentators believe that Brexit will usher in a wider people’s revolt against the excesses of neo-liberalism. It is here that the perusal of the historical sociology literature offers a starkly different perspective. If the likes of Harvey, Sassen and Kurz are right in their grand narratives of capitalist development, then it is unlikely that a political upset like Brexit can alone reverse deep structural developments. Unless it is forced into an as yet completely unknown new modus operandi – for which there is little evidence – global capitalism will continue to rely on ‘expulsion’ as its main method of surplus extraction. That it has also entered crisis mode can only mean that regional disparities between core areas and left-behind areas will grow further still. Tax and spend, or a politics of redistribution, will no longer work as a remedy. Insofar as Brexit was a vote to take us back to the days of a national economy, it cannot fulfil its core promise. (It is worth noting that there was also a leave argument at play that argued for more rather than less globalization – David Owen’s new ‘blue water diplomacy’, for instance, or Andrea Leadsom’s new trade deals – but this theme was quickly overshadowed by a rhetoric of ‘taking back control’ over national borders.)
This is not simply speculation. In countries with longstanding regional disparities across Europe – Italy with its industrial North and Mezzogiorno South, say, or East and West Germany – the divide has become sharper over recent decades, despite long-standing and hugely expensive ‘development efforts’ by the countries themselves or by the EU. Meanwhile, across Europe, the gap between the networked core – located mostly in the North – and the expulsion areas in the South has also intensified. It is happening elsewhere, too, from Nigeria’s increasingly unbridgeable split between North and South to India’s great divergence between the Western and Southern coastal regions and the Northern ‘cow-belt’. It is very unlikely that the people of Sunderland or the Welsh Valleys will be any luckier than those in Greifswald in East Germany, Trapani in Sicily, or indeed Patna in India’s Bihar, when it comes to their state’s ability to overcome inequality through redistribution or protectionism.
This brings us to the crux of this exercise in historical analysis: what will be the political effects of these structural contradictions? A steady growth in the number of people who are of no use to the system, not even good enough to be exploited or to buy useless commodities, combined with a simultaneous crisis of the system as a whole, will produce a colossal amount of discontent. Those who still have a stake in global capitalism, meanwhile, will seek to protect their life chances tooth and nail – as the emotional reaction of so many ‘remainers’ to Brexit demonstrated beyond doubt. The interests of those living in the still-thriving network core and those in left-behind areas have become irreconcilable. One’s dream has become the other’s nightmare, while there is still no alternative political economy in sight that could overcome such divisions. How is this conflict going to be managed through democratic institutions? How are system compliance and consent going to be generated within a geographic framework – the nation state – that no longer fits the shape of the political economy?
One can think of several possibilities here. Discounting the unlikely event that the people of Britain collectively decide to leave the capitalist order altogether and try out some other system on their own, two alternatives stand out. First, areas that are small enough, well-connected enough and have the kind of identity politics in place to sustain such a move, may seek to become small independent nation states that play the new global system for what it is worth. Catalonia in Spain is a good example. Scotland is clearly weighing up its options to follow this path.
Where such secessionist moves are less feasible – as in England and Wales – some kind of artificial consent will have to be manufactured by means of an authoritarian political order. The Chinese Communist leadership is perfectly open about the need to manage regional disparities in its own country through repression, and justifies this with reference to a Confucian political culture. Elsewhere – as developments in Hungary and Poland, in Erdogan’s Turkey, Modi’s India and Putin’s Russia suggest – there is a trend towards majoritarian pseudo-democracy. A climate of radical nationalism prevails, post-truth politics becomes the norm, universities and the media are purged of opposition, perceived minorities are used as scapegoats, and an artificial and hollow ideology of consumption and development spectacle papers over deprivation. But people are still able to vote; indeed, they are invited to vote periodically to consecrate the holy union of popular will and populist leadership.
It is not even necessary for a rabidly authoritarian party with genocidal urges, such as Narendra Modi’s BJP, to gain power for such a system to work. It is sufficient if such a party is strong enough to compel everybody else to rally behind a pro-establishment alternative that governs solely on the promise of keeping the barbarians at bay. Such a perennial party of power would have a free hand to resort to authoritarian measures as long as they remained marginally less off-putting than those demanded by the other side. Discontent will remain high, but will have nowhere to go except to the radical nationalists who stand in perpetual opposition, thereby only reinforcing the dominance of the ‘centre’.
Such a situation is by no means inconceivable in post-Brexit Britain. In fact, one can already see the contours of it taking shape: witness the implosion of the Labour Party in line with what is happening to other social democratic parties around the world, the emergence of the Tories as the only ‘centrist’ alternative in a world where the ‘centre’ has moved very considerably to the right, and an entrenchment of UKIP and assorted right-wing extremists as the attack dogs that prop up the system from the outside.
You have been warned – historical analysis in times like these is likely to yield depressing results. The only consolation is that even the best structural analysis does not fully accommodate human agency, which will always remain unpredictable. The best historians can hope for at this juncture is that they are wrong.