The Climate Emergency and the Inadequacy of the Historical Nation State by Markus Daechsel

Flooding in Sindh, Pakistan, 2022 (Ali Hyder Junejo)

The recent COP27 conference was dominated by discussions about the creation of a ‘loss and damage’ fund financed by the world’s richer nations to help poorer countries cope with the devastating effects of the climate emergency. The global South has been suffering disproportionately from rising sea levels, extreme weather events, crop failure and potentially deadly rises in temperature, despite having contributed far less to carbon emissions over time than the prosperous industrialised North. In Pakistan – a country with a relatively small carbon footprint – this summer brought the most devastating monsoon floods in historical memory, inundating as much as one third of the country’s populated area. Much of this year’s harvest has been destroyed, and millions of Pakistanis remain stuck in temporary camps. In the words of Sherry Rehman, the country’s new climate change minister and veteran ‘liberal’ voice, Pakistan has become the ‘ground zero’ of the climate catastrophe. While the international fossil fuel industry was raking in record profits and deceiving the public with ‘greenwashing’ spin, her country had the right to ask for ‘reparations’ to make up for the loss to infrastructure and economic productivity caused by others. This fiery rhetoric was subsequently dialled down by Rehman’s own government, but a tentative step towards a ‘loss and damage’ fund turned out to be the only tangible outcome of COP27. While the question of how to quantify and apportion responsibility to individual countries remained hotly contested, the assembled world leaders seemed to find it easier to negotiate ways to pay for the effects of greenhouse gas emissions than to make any meaningful commitments to limiting emissions themselves.

It is clear that the global North has to accept responsibility for global warming and owes a moral debt to the people of the global South. But whether government-to-government transfers akin to ‘reparations’ – if indeed they ever materialise in substantial amounts – should stand at the heart of a solution is a more complicated question. While the COP format by necessity regards negotiations between nation states as the most meaningful way to address global problems, historians should be much more sceptical about approaching global heating in this manner. The nation state may still be the most readily accepted building block of global politics, but it is worth remembering that it started its career as a legal construct designed to bring an end to the devastating religious wars of early modern Europe. When it comes to understanding large-scale processes of economic and social change such as the development of fossil fuel capitalism, a ‘methodological nationalism’ – thinking in terms of national units – can actually be less than helpful. And this is before even asking difficult questions about when and where the nation state concept ever adequately reflected real structures of power. It certainly only applied to a small proportion of humanity during the age of the modern European Empires from the 18th to the 20th centuries, and its conceptual validity remains at best limited in our own age of multinational corporations, international tax havens, transnational elites and global production chains.

Woman working in a Cotton Mill, Witney, Oxfordshire, 1898 (Historic England)

Histories of global capitalism such as the world-systems theory of Immanuel Wallerstein have alerted us to the need to understand global capitalism as a single integrated whole, where development in one place always has immediate and causal connections with ‘underdevelopment’ in another. While the debate about the local roots of Britain’s original industrial revolution rumbles on, a more globalist vantage point suggests that industrialisation has rarely exclusively or even predominantly been a ‘national’ story. The flourishing of the Lancashire cotton industry, for example, was dependent on a supply of cheap cotton produced on the slave plantations of the American South. The workers toiling in Britain’s factories could not be fed, moreover, without cheap imported food from the sugar plantations of the Caribbean, or a little later, the beef herds of Argentina and Uruguay. And British textiles would not have been so profitable if British colonialism had not first ruined highly accomplished artisan weavers in India, and then ensured tariff-free access to the Indian mass market for British goods. Industrial capitalism is not an ‘advanced’ form of production, while plantation slavery, cattle-ranching or small-holding agriculture are ‘backward’. They are all equally ‘up to date’ in their time, and integral to the same stage of global development.

Indian Cotton Weavers, gouache drawing, c. 1840 (Wellcome Collection)

What is more, the winners and losers of the global capitalist system were never neatly divided between North and South, even if on the final balance sheet the system was decisively rigged in favour of Northern interests. It is easy to identify the biggest beneficiaries (European and North American capitalists) and the most exploited (racialised slave labour). But there were a large number of social groups that fell somewhere in between, and whose balance of benefit and loss is not always straightforward to calculate or compare. Who profited more from industrial capitalism – the increasingly comfortable but still exploited working classes of Great Britain, or the newly prosperous import-export trader conducting his business under British overlordship in colonial Bombay? In the area of British India that later became Pakistan, farmer-landowners producing rice or wheat for export could only become as politically and economically powerful as they did because British colonial policies restricted the ability of local bankers and financiers to become successful industrialists and create a new capitalist ruling class.

In recent decades, the growing interconnectedness of the global economy and the large-scale outsourcing of industrial production to countries with cheaper labour have made calculations of who benefits and who is responsible for harmful externalities like carbon emissions even more complicated. What about a chip manufacturer in Taiwan or an iPhone factory in China? Or, to return to flood-stricken Pakistan, a garment factory in the country’s own ‘Manchester’ – Faisalabad – or a sports goods manufacturer in Sialkot, where most of the world’s footballs are stitched? In a remarkable irony of history, many international fashion brands are now producing their wares in unsafe and overcrowded factories in Bangladesh, only a few hundred miles to the east of where British colonialism had driven the world’s most accomplished makers of luxury cotton fabrics into ruin about 200 years earlier. Most of these outsourced products end up under global brand names in the shops, where the greatest share of profit accrues. But the governments of Pakistan, Bangladesh and many other countries, whose efforts to achieve their own ‘national’ versions of industrialisation had often spectacularly failed by the 1970s, have nevertheless courted these multinationals with soft-touch labour and environmental laws to maintain economic growth and support growing middle-class aspirations.

The outsourcing of labour to South Asia and other parts of the world went hand in hand with the outsourcing of greenhouse gas emissions, environmental degradation and urban overcrowding. Sherry Rehman could increase the moral pressure for climate ‘reparations’ by pointing out that some proportion of Pakistan’s modest greenhouse gas emissions are not in fact Pakistan’s own, but emissions produced ‘on behalf’ of Europeans or North Americans. But such an argument shifts the overall thrust of policy making in the wrong direction. The interconnectedness of global capitalism as a single system, which easily goes as far back as the time of the first coal-fired steam engines, means that emissions must be cut everywhere rather than being outsourced with a conscience-calming ‘loss and damage’ payment thrown in as compensation. A fund to help the poorest nations still has an important role to play, but the most meaningful way the beneficiaries of centuries of capitalist development could address their historical debt to the losers is to genuinely eliminate carbon-burning from global supply chains.

Markus Daechsel is Reader in the History of Modern Islamic Societies at Royal Holloway, University of London. He has published widely on the history, politics and social dynamics of South Asia.

‘Rather Hoped I’d Get Through the Whole Show’: In Defence of Blackadder Goes Forth by Edward Madigan

       It was good to see a clip of the final scene of Blackadder Goes Forth doing the rounds on Twitter last week. If the thousands of likes and positive comments are anything to go by, the series clearly still resonates warmly with the public over 30 years after it was first broadcast in 1989. But not everyone’s a fan. Indeed, the show is something of a perennial bugbear for historians and other commentators who feel British people fundamentally misunderstand the First World War. Gary Sheffield’s hugely insightful 2001 book, Forgotten Victory, opens with a discussion of the series in which the author rightly notes that Blackadder ‘reflected and reinforced the majority of the public’s views and emotions about the Great War’. Prof. Sheffield is a sort of revisionist’s revisionist and there’s a great deal of truth in his critique. The adventures of Capt. Edmund Blackadder and his comrades were a comedic and thus very potent expression of the ‘lions led by donkeys’ myth, which has been around in one form or another since the war itself but really crystallised during the 1960s. In this vision of the conflict, the British campaign on the Western Front was conducted with supreme incompetence by Douglas Haig and the other generals, who were brutally callous with the lives of their men. The staff officers in Blackadder are certainly portrayed as incompetent and callous, and, as Sheffield points out, the series’ portrayal of British strategy and tactics ‘is funny because everybody “knows” that British generals were incompetent and their battles were invariably bloody failures’. And he is by no means the only military historian to take issue with the series. Max Hastings has dismissed the popular perception of the war as ‘the Blackadder take on history’, and the late Richard Holmes decried the extent to which ‘Blackadder’s aphorisms have become fact’.

       Blackadder Goes Forth has also been invoked in political commentary on public perceptions of the past. In an article anticipating the centenaries of the war that was published in the Daily Mail in January 2014, Michael Gove was scathing about what he regarded as the apparently dominant ‘Blackadder’ version of the war, which portrays the conflict as a ‘misbegotten shambles’ and thus denigrates the ‘patriotism, honour and courage’ of the fallen. Gove was Secretary of State for Education at the time and his article was essentially an attack on the way history is taught in British secondary schools. His comments quickly met with robust responses from his Labour counterpart, Tristram Hunt, and a range of other prominent commentators, including historians Richard Evans and Antony Beevor. Tony Robinson, who played Pte. Baldrick in the series, also weighed in, passionately defending British teachers. This particular quarrel has now been forgotten, but it reveals the degree to which ‘Blackadder’ has become short-hand in certain quarters for a crude misunderstanding of the ‘war to end all wars’. In very broad terms, the debate divides those who regard the war as a bloody but necessary conflict in which British servicemen heroically achieved a great victory and those who see the war as little more than a futile exercise in mass slaughter.

       And yet despite often heated disagreements about the meaning of the 1914–1918 conflict, and the persistence of the view that the war was futile, there is remarkable consensus in Britain that the war dead should be remembered with reverence and respect. Even groups that have been very critical of the official, government-driven culture of commemoration, such as the Stop the War Coalition and the No Glory in War Campaign, have stopped short of suggesting that the dead should not be honoured. And this is where Blackadder Goes Forth offers some valuable – and usually overlooked – insights into the British relationship with the First World War. The tension in the British memory of the war is quite unmistakable in the shift in tone we see in the final scene of the six-part series. For five and a half episodes, the series brilliantly lampooned the British war effort, and then, in the last ten minutes of the final episode, it paid a remarkably sincere and poignant tribute to the men who lost their lives on the Western Front. As Dan Todman notes in his own revisionist classic, The Great War: Myth and Memory, the closing scene rather undermines the more irreverent and cynical mood of the rest of the series. It is as though the writers, Ben Elton and Richard Curtis, felt that they couldn’t in conscience make a television show about the war, no matter how satirical, without ultimately honouring the dead.

The closing scene of the final episode of Blackadder Goes Forth (Goodbyeee), first aired November 1989 (BBC)

        And the final scene is indeed moving. All of the main characters, with the exception of General Melchett, clamber over the parapet and advance heroically into German machine-gun fire. A slow, haunting piano plays the theme music in lieu of the brassy marching band of previous episodes, and, in the end, the image of the fallen heroes in 1917 fades to a tranquil scene of the Western Front in ‘our own time’, accompanied by the sound of birdsong. It’s an undeniably well-crafted piece of television. Yet, in much of the commentary on the scene, one of its most poignant and revealing elements is usually overlooked. The clip currently circulating on Twitter begins as Blackadder and Baldrick and the rest of the characters step into the trench and prepare to go ‘over the top’. In the originally broadcast version, however, this moment is preceded by an unusual exchange in the dug-out between Blackadder and Capt. Darling:

Capt. Blackadder: How are you feeling, Darling?

Capt. Darling: Erm, not all that good, Blackadder – rather hoped I’d get through the whole show; go back to work at Pratt & Sons; keep wicket for the Croydon gentlemen; marry Doris … Made a note in my diary on my way here; simply says, ‘Bugger’.

Capt. Blackadder: Well, quite.

I honestly think you’d struggle to find a more moving passage in the whole canon of post-1960s First World War fiction. The scene is played with real pathos and feeling by Tim McInnerney and Rowan Atkinson and its power is heightened by the fact that their characters have been arch-enemies for the rest of the series. In the other episodes, Capt. Darling is a toadying junior staff-officer, skulking about in the regimental HQ, a foil for Blackadder’s cutting wit; now he finds himself very much at the sharp end. His wistful lament for the life he thought he’d live after the war is all the more poignant because this formerly unappealing character now has his back against the wall. His words also seem to capture what must have been a common state of mind for the men who fought and died during the war. ‘Rather hoped I’d get through the whole show’ would be a fitting epitaph for any number of the dead.

British soldiers eating on the Somme front, October 1916 (IWM Q1580)

         When I discuss this scene with my public history students each year, I also encourage them to consider Darling’s fiancée, Doris. She’s a fictional character, mentioned in passing in a sitcom, but she arguably represents all those left behind when their loved ones failed to return from the front. And her predicament is more relatable to most of us today than that of the men who served on the Western Front and in the other theatres of war. None of us will have to fight in an industrialised war, and few of us will experience military service. Yet all of us will suffer loss and know bereavement. The pain of grief is a fairly universal experience, and in it we can empathise with our ancestors more than we sometimes appreciate. Blackadder Goes Forth does not of course tell us much about the military dynamics of the First World War. Yet in reflecting the tension in Britain’s relationship with the conflict, and dramatically evoking the fate of the men who fought and died, the series offers us some genuinely valuable insights.

Dr Edward Madigan is Senior Lecturer in Public History at Royal Holloway, University of London and co-editor of Historians for History.

So You Want to be a Public Historian? Forging Careers in the History and Heritage Sectors by Edward Madigan

       I don’t suppose it’s ever been easy to earn a living in public history, but the last few years have been particularly brutal. Museums, archives and heritage sites across the UK and the wider world struggled to survive throughout 2020 and ’21 as the pandemic led to extended closures, plummeting visitor numbers and extreme uncertainty in funding. We seem to be slowly emerging from the plague woods now, but tourist numbers still haven’t recovered, and the global cost of living crisis means it may be some time before they get back to pre-pandemic levels. All of this has been compounded by ongoing cuts in government funding for anything to do with history or heritage, and a growing right-wing hostility to public history that departs from patriotic bedtime stories.

        And yet, despite all of these challenges, this is a fascinating time to be a public historian. The re-emergence of the Black Lives Matter movement in the weeks after the murder of George Floyd in May 2020 brought the questions of white supremacy, slavery, colonial conquest and imperialism very much to the fore in heritage and public history. The ensuing wave of protests in the United States and elsewhere forced many institutions and individuals who create narratives of the past for public consumption to pause and sincerely consider those narratives in a different light. In Britain, the dumping of the statue of the 17th century slave-trader, Edward Colston, into Bristol harbour reflected a pressing and popular demand for a more open, inclusive and honest public confrontation with the past. Over that tumultuous summer, the History Matters project documented evidence of a groundswell of popular support for a reimagination of the way the past is represented in public in light of these developments. The work of those who were already countering dominant historical narratives, such as the Museum Detox network, was given fresh urgency and new projects designed to historicise imperialism and the experiences of minoritised peoples were launched with real enthusiasm. Two years on, the public conversation about Britain’s past seems genuinely more critical and more dynamic than ever before. This makes for a very stimulating and intellectually vibrant atmosphere for anyone who hopes to create public narratives of the past, and the field is increasingly driven by younger historians who wear the ‘public’ prefix with pride. And despite the generalised doom and gloom, jobs in history and heritage are still routinely advertised in the UK, as the University of Leicester museum jobs web-page demonstrates.

Becky Tabrar speaking at the LCPH Public History Early Career Workshop, 9 September 2022

       The prospect of establishing a career in this burgeoning field nonetheless remains daunting and budding public historians need all the help they can get. With this in mind, the London Centre for Public History recently held an early career workshop at Stewart House in Bloomsbury. Over the course of an afternoon, three established public historians spoke about their experiences of forging careers to a highly engaged group of people working, or aspiring to work, in the history and heritage sectors. The first speaker, Becky Tabrar, is a graduate of Royal Holloway’s pioneering MA in Public History programme and has been a curator and public engagement officer at the Windsor & Royal Borough Museum since 2017. In a fascinating presentation, Becky really emphasised the value of starting out in a relatively small local museum, where there’s likely to be much more scope for gaining experience and making an impact than in a larger institution. She also spoke of the need for self-care and balance in a career in which a junior curator, new to the team and eager to show willing, can easily spend too much of their own personal time working on museum projects.

Becky was followed by Dr Ayshah Johnston, the Learning and Engagement Officer at the Black Cultural Archives in Brixton. The BCA is the only national institution dedicated to preserving and disseminating Black British history and Ayshah has played a pivotal role in its outreach and engagement work over the past number of years. In a very rich and varied talk, Ayshah reminded us that the public historian has the power to inspire groups and individuals as well as to educate them. This is perhaps especially true for the historian who engages with traditionally marginalised communities. With their distinctive experience and expertise, Becky and Ayshah really complemented each other in the ensuing discussion, chaired by Dr Matthew Smith, which was full of insights from both the speakers and the floor. One of the themes that emerged was that the public themselves can sometimes be challenging, particularly when you’re threatening established narratives that they hold dear and they respond accordingly. The public historian is therefore often required to be confident and authoritative on the one hand and tactful and diplomatic on the other.

Dr Ayshah Johnston, Becky Tabrar and Dr Matthew Smith in discussion at the workshop.

The second session was led by Hannah Greig, who, over the course of a career that no doubt evoked a certain amount of envy in the room, has worked as the main historical consultant on a whole series of major TV and film productions, including The Duchess, Poldark, Bridgerton, and the Academy Award-winning The Favourite. In a hugely enlightening presentation, Dr Greig stressed how much she enjoys working in film and television, an industry which has become notably diverse in recent years, especially by comparison with academia. She also emphasised the valuable insights she has gained into the history and culture of early-modern England through working with directors, script-writers, set-designers and actors whose questions have inspired research that she probably wouldn’t have otherwise undertaken. This struck me as a particularly useful point for early career public historians, who often understand their role as a case of bringing their expertise to the uninformed man or woman in the street. It is of course partly that, but at its most meaningful the process of creating public narratives of the past involves a mutual exchange of insight and understanding.

            We’ve only very recently begun to use the term ‘public history’ in the UK and Ireland, so there’s no tried and tested blueprint for this sort of a career. Yet something that came across quite strongly in each of the presentations, and in the discussions that followed them, was the importance of adopting what Hannah referred to as a ‘portfolio approach’ to career progression and development. For the up-and-coming public historian, this means having an open mind about your ultimate destination and keeping as many irons in the fire as possible. And for better or for worse, networking is crucial. But what does this mysterious term mean in practice for the newly-minted public historian? Well, making yourself and your work known to more established practitioners in the field is of course helpful. Influential movers and shakers can certainly provide opportunities for more junior colleagues. But a willingness to form relationships and collaborate with your peers is just as important. A circle of friends and colleagues who work in public history can provide valuable support, advice and commiseration. And as they progress in their careers, they may be in a position to offer employment and funding opportunities. Perhaps more than anything, though, your peers invariably share your passion for the public interpretation of the past, so they can act as a healthy source of inspiration and collaboration as you make your way in this often daunting but always exciting field.

Dr Edward Madigan is Senior Lecturer in Public History at Royal Holloway, University of London and Co-Editor of Historians for History.

Sweet Memory: Extracting Oral Histories from Unwritten African American Recipes by Andre Taylor

You can only preserve the elements of cultural heritage that you share. One way of hiding cultural heritage and natural history in plain sight is through foodways and recipes. Since long before their arrival in the Americas by way of the Trans-Atlantic Slave Trade, Black people have transmitted histories and other pertinent information orally. This tradition was passed on by generations of enslaved Blacks and, ultimately, morphed into something more than even the ancestors imagined. The emergence of the internet and cell phones has made most information readily available to anyone with a handheld device; however, the oral tradition of sharing in close quarters remains closely guarded. Not every family member gets grandmom’s recipes. It’s only the individuals who have proven they have a passion for cooking and can flat out burn (slang for cook).

The kitchen, cookhouse, mess, or galley are more than just areas in which we create delectable meals; they’re places of sharing. Stories pour out of the kitchen, especially around the holidays, and into the minds of generations of family within the space. However, these intimate stories mask the real histories that are hidden in plain sight. With every whisk, churn, flip, and fry in the kitchen comes an oral history of untold family knowledge. These stories, hidden in the unwritten recipes passed down orally within families, provide understanding of the very people we call family.

An oral history project I began in May seeks to capture the stories of African American families, as well as the unwritten recipes circulating within them. Titled ‘Black folk and our food: Extracting traumatic memory hidden in unwritten family recipes through oral histories’, the project is a deliberate mission to preserve recipes, cooking methods and histories. With oral history collection as my primary methodology, the challenge is extracting the stories trapped between a pinch of salt and a handful of flour. The goal of this project is to use orally transmitted recipes as the vehicle to narrate often taboo family stories and so better understand the functionality and adaptability of African American families. The project will also develop a narrative of the contemporary southern African American family.

Dr. James Avery, a Portsmouth, Virginia resident, had a very special relationship with his grandmother, Ollie Pearl Hassell. Faith, family, and food were at the heart of their relationship. As a child in the late 1990s and early 2000s, Dr. Avery was his grandmother’s sous chef, studying her ways of preparing foods: deviled eggs, baked chicken wings and her cabbage. It was the cabbage he prepared during his oral history interview for the project. Seasoned meat, diced onions, salt, pepper, ground mustard, black pepper, garlic powder, Adobo seasoning (an addition made by his aunt), and pepper flakes (his own addition to the recipe) all ended up in a pot atop his stove, and all were added without a single measurement. While this may be considered an unorthodox way of cooking by trained culinary experts, in African American culture it is a direct connection to ancestors and heritage. For Dr. Avery, sharing these moments in that space with his grandmother was a training ground and a listening session.

Dr. James Avery prepares cabbage next to a picture of his late grandparents Clyde William and Ollie Pearl Hassell (Andre Taylor).

One of ten grandchildren to Ollie Pearl, Dr. Avery learned about the lives of people in the church, the neighborhood, and sometimes about his grandmother’s sister, whom she considered mean; all the while, she was passing down family recipes to her chosen grandchild. She never discussed what it was like growing up in Greensboro, North Carolina, and there was a lot to unpack regarding her city. Greensboro, the epicenter of the sit-ins in the United States, has always grappled with its dark, racist past, which existed long before Ollie Pearl’s birth on June 12, 1924. In Greensboro, the color line was simple: whites to the west of Elm Street and Blacks to the east. On February 1, 1960, on Elm Street, an F.W. Woolworth lunch counter became the center of attention as four male freshmen from North Carolina Agricultural and Technical State University marched from campus to the store and staged the first sit-in to desegregate the lunch counter. They succeeded in desegregating the counter in July of that year, and their actions spawned a wave of similar acts around the country. By the time the sit-ins had begun in Greensboro, Ollie Pearl had already left the city and was residing in Virginia.

Avery removes smoked neckbones from a pot following a recipe he received from his late grandmother, Ollie Pearl Hassell (Andre Taylor)

Ollie Pearl never spoke of who taught her to cook, but she basked in her culinary talents, preparing food to take to church and having a meal prepared for anyone who stopped by the house. As she aged and her health began to deteriorate, and as Dr. Avery matured, Ollie Pearl would instruct him from the living room in preparing either breakfast or lunch. She could tell if he was using a pot that was too small, if water was boiling too fast or if the pan was too hot, all by sound. From the living room, she would tell him to turn the burner down, and he’d follow every instruction. The reward was more than the appreciation of knowing he had prepared a meal for his grandmother; it was the time spent and the lessons learned. Those deeply personal meetings came to an end in February 2008, when Ollie Pearl died. She was 83.

The memory of Ollie Pearl lives on in the recipes she left behind that yielded awe-inspiring stories. In particular, the cabbage recipe tells a love story. Before he passed, Clyde William Hassell, Ollie Pearl’s husband, grew vegetables in their backyard. Green cabbage was one of the more easily-cultivated crops and there was plenty of it. Even after his death, Ollie Pearl still prepared cabbage, though she had to buy it from the grocery store. But although his grandmother continued to prepare the dish, Dr. Avery still wasn’t a fan. It wasn’t that he didn’t like her cooking, he revered it; he just wasn’t big on cabbage. That feeling, too, would change.

The aromas of cabbage cooking conjure memories of Ollie Pearl. The windows in his home don’t sweat like they did in his grandmother’s kitchen, but the smells of her recipes being prepared in his spacious kitchen, with air vents, prompt memories of conversations with her. The smell of boiling neckbones brings forth stories of Ollie Pearl talking about her uncle, who made moonshine in the woods of North Carolina, and her message: don’t let people see what goes into your food for seasoning. The kitchen is not just a sacred place where meals are prepared; it is a repository for the stories, lessons, and memories of Ollie Pearl.

Dr. Avery was not forced to be in the kitchen with Ollie Pearl; he pushed his way into that space. He wanted to be there to learn and to spend time with his grandmother, and he has by default become the “neo-griot” of his family, equipped with the stories that exist in the unwritten recipes she passed on to him. By participating in this project, Dr. Avery not only preserved this recipe for future generations of his family, but also offered critical insight into the functionality and hierarchy of African American families, in which individuals have specific roles. Though she shared many stories with him, there were stories he wishes she had told, about her relationship with her parents and other relatives.

At its best, comfort food can deliver us from the ills of the world and bring forth a euphoric state in which we can escape and live in a blissful, stress-free world. This concept of comfort food is critical to this project, in that the very vessels for the stories I am searching for rest in the recipes and dishes of the participants. Several other narrators have, like Dr. Avery, shared elements of their lives and those of their ancestors through recipes. These recipes are far from average fare. They are glimpses into a time when Black people were subjected to segregation, marginalization and, in some cases, enslavement. We cannot simply look at recipes as resources that satisfy our hunger; we must look at them for what they are: cultural heritage, natural history, and oral history. Without her recipes, we cannot fully contextualize Ollie Pearl. Within her recipes lie the pragmatic origins of a family: a story of triumph, love, and faith. At their core, these are the makings of African American families, and they offer an extraordinary window into our past and present. The “Black Folk and our Food” project will continue to bring forth the stories of people like Ollie Pearl as examples of the courage and strength of Blacks in America represented through our food.

Further Reading

Smith, Graham, ‘The Making of Oral History: Sections 1–2’, University of London

Twitty, Michael, The Cooking Gene: A Journey Through African American Culinary History in the Old South (HarperCollins, 2017)

Andre L. Taylor is an oral historian and Ph.D. student in American Studies at the College of William & Mary in Williamsburg, Virginia.

‘This is all that Australia has left of my people’: The Trailblazing Aboriginal Activist of 1920s London by Ruby van Leer

‘Against the solid stone of Australia House stands … a black man, hatless and with a grey beard, a mere handful of a man, with the fine bones of an Australian Aborigine. [On his] great coat is pinned from top to bottom … scores of those little white penny skeletons that the street vendors sell to children… Good Lord – the man is a walking graveyard! Yet his eyes are on fire. He points to the penny skeletons and shouts as the people pass: “This is all that Australia has left of my people.”’

So Dharug man Anthony Martin Fernando is said to have cried out to passers-by during his visceral ‘penny skeleton’ protest, intended to draw attention to the lethal consequences of British settlement for Aboriginal Australians. Such a protest would not appear out of place in Australian popular historical and geographical memory, side-by-side with Charles Perkins’ Freedom Rides of 1965 through northern New South Wales, or the pitching of the Aboriginal Tent Embassy outside Parliament House in Canberra in 1972. However, the location is London, the corner of the Strand and Aldwych, outside the Edwardian façade of Australia House; and the year is 1928. While more than 120,000 Australians live in the UK today, Fernando blazed the expatriate trail at a time when few from his nation would have been traversing the globe – particularly Indigenous Australians, who were increasingly confined to reserves and missions on the Australian mainland and were hence largely invisible, historically and politically. Fernando’s extraordinary story of global travel and solitary protest is therefore an important statement of survival and resistance that confronts imperial imaginings of movement and activism in the early twentieth century; ideas that have very much (mis)shaped our understanding of the scope and geography of Aboriginal activism to the present day.

Indigenous Rights Activist Bob Maza addresses the Crowd, Canberra, 1972.

Much of Fernando’s life can only be reconstructed from fragmentary evidence: three small notebooks, some surviving letters and government petitions, an interview with a Swiss newspaper, court reports from his brushes with the law, and the recollections of people such as Indigenous rights activist Mary Bennett, who witnessed his London protests. This perhaps goes some way towards explaining why his story, while remarkable, has remained relatively unknown. According to his own account, he was born in 1864 in Woolloomooloo, Sydney, to an Aboriginal woman named Sarah. He recalled his “bitter education in white brutality” through separation from his mother at a young age, an unfortunately common experience for Indigenous children well into the twentieth century. Letters reveal he spent some time in Western Australia, observing the profound cruelties of the mission system, before leaving for Europe in the late nineteenth century. He attributed his departure to being refused the right to give evidence against a white man accused of murdering several Aboriginal people, due to his own indigeneity.

Australia House, The Strand, London, c. 1920 (London Metropolitan Archives)

Fernando was already blazing a trail of activism long before he arrived in England. After a period living in Austria, where he was interned as an enemy ‘alien’ during the First World War, he moved to Geneva, where he hoped to petition the newly formed League of Nations to assist in the establishment of an autonomous Indigenous state in the north of Australia. Although barred from speaking at the General Assembly, Fernando did secure an interview with the Swiss newspaper Der Bund, in which he countered popular assumptions that indigenous populations were ‘primitive’ or ‘less than human’, emphasised the intelligence and intellect of Aboriginal Australians, and promoted his request for the League of Nations to intervene in securing an autonomous state for Indigenous Australians. He next emerged in Italy, where he attempted to petition the Pope to support his cause, but was turned away. He was then arrested in Italy in 1923 for handing out pamphlets accusing the British of the extermination of Indigenous Australians, and was subsequently deported to Britain, where his most memorable form of protest was to begin.

Fernando was in his sixties by the time he began picketing at Australia House, a pertinent location as the London headquarters of the Australian government. European perceptions of Indigenous people at this time had been largely formed by the collection and display of Indigenous bodies as exhibits, either living or dead, on the peripheral stage of the museum. Notably, the population of London in the inter-war years was also overwhelmingly white. Yet here Fernando stood as a ‘living ghost’ of the colonial enterprise, at the precise location in the capital where the metropolis met the colonial frontier. By this point he had taken up one of his previous occupations, toy-making, and was selling the toy skeletons that he attached to his coat as part of his ghostly protest. On a bleak London street in mid-winter, Fernando must have cut a haunting and striking figure as he implored would-be customers, gesturing to the skeletons: “This is all that Australia has left of my people”.

Imagined portrait of Anthony Martin Fernando by Raj Nagi (ABC News Australia).

Fernando’s death scene protest shrank the distance between the suffering of Aboriginal Australians and the heart of the empire, forcing accusations of genocidal activity to the forefront of imperial consciousness. He garnered enough attention for the embarrassed employees of Australia House to have him arrested on multiple occasions, and even to attempt to have him certified insane, a common tactic of political silencing. Doctors refused, however, stating his views were a sign “not of insanity but of an unusually strong mind”. His diaries describe the racial abuse he received daily as a street vendor selling his toys, which led to successive court appearances from 1929 to 1939 as he fought back, once pulling a handgun on a fellow street vendor who taunted the colour of his skin. During this time, a burgeoning Aboriginal rights movement in Australia, headed by activists such as Jack Patten, William Ferguson, William Cooper and Pearl Gibbs, began conducting protests including the first Day of Mourning in 1938. Indeed, Pearl Gibbs even saved newspaper clippings of Fernando’s court testimonies reported in the Australian press. However, Fernando was never to return home to join their ranks; he remained in England, and by 1948 he had been admitted to Claybury hospital, suffering from senile dementia. It was here that he passed away in 1949, aged 84.

Startling in its trailblazing, audacious nature, Fernando’s remarkable story predates by several decades, and extends across half the globe, our commonly held understandings of Aboriginal activism. He challenges the images of passivity and victimhood that tend to arise from this period; just as the Aboriginal Land Rights movement coined the phrase “always was, always will be Aboriginal land”, Fernando reminds us that there have always been defiant acts of Indigenous self-representation and activism, even in the most unlikely of settings. Sadly, his activism is no less relevant today than it was in 1928, as Aboriginal Australians continue to advocate for adequate cultural and historical recognition of the violence and displacement that accompanied British invasion and settlement, not least through formal acknowledgement in Australia’s Constitution – a victory that is yet to come.

Further Reading

Browning, Daniel, ‘Fernando’s Ghost’, Awaye!, ABC Radio National, May 2010

Paisley, Fiona, ‘Death Scene Protester: An Aboriginal Rights Activist in 1920s London’, The South Atlantic Quarterly, 110.4 (2011), 867–83

Ruby van Leer is a student on the MA in Public History programme at Royal Holloway, University of London 

‘The Shadow of What Was About to Come’: Watching YouTube and Building Historical Context by Sheridan Sylvester

Old-fashioned film footage. Shaky. Poor lighting. Silent. The person behind the camera starts inside, then moves outdoors. Suddenly, an unknown boy pops his head into the frame, clearly excited about the camera. As the camera captures footage of a town’s buildings and streets, it records unknown people chatting and smiling children attempting to stay in its view. Most of the adults, completely unaccustomed to being filmed, just stare at the camera.

Someone watching this film footage, not knowing anything about where or when it was created, would likely be intrigued by the clothing, so different from our current trends, or by how the people in the film interact, which seems much more familiar; but they could not learn much more from the video alone. When considered in context, however, the film offers very real and meaningful insights.

Short film about Marcy Rosen’s recognition of her grandfather in the original Kurtz footage
(US Holocaust Memorial Museum)

This briefly described footage was captured in 1938 by a couple from New York named David and Liza Kurtz, who were spending their summer traveling through Europe with a few friends. They brought a camera along to film their trip, including a stop in Nasielsk, a small Polish town where David was born and the setting of this video. A year later, Poland would be invaded by German forces, marking the outbreak of the Second World War, and over the next six years the Holocaust would change Nasielsk forever.

Seventy-one years after David and Liza visited Poland, Glenn Kurtz, the couple’s grandson, rediscovered this home movie in a closet in their house in Florida, and realized the historical significance of this glimpse into everyday life in pre-war Poland. Thus began a journey for Kurtz—a journey very different from the one taken by his grandparents in 1938. He spent the next few years learning as much as he could about Nasielsk in the years before and during the Holocaust, and he published his discoveries and story in his book, Three Minutes in Poland: Discovering a Lost World in a 1938 Family Film.

When Kurtz started studying the film, he asked his family what they knew and concluded that the town in the video was Berezne, where his grandmother was born. Kurtz showed the video to a distant relative and Holocaust survivor from that town, who insisted that the town shown in the film was not Berezne. Kurtz then concluded that the town must be Nasielsk, his grandfather’s birthplace, which was confirmed as he did more research. He learned that before the war, Nasielsk had about 3,000 Jewish residents, nearly half of the town’s population. The German army entered Nasielsk shortly after the invasion of Poland in September 1939. In December, the Jewish residents were forced into cattle cars and sent to two ghettos, which were emptied in 1942 when Nazi soldiers sent the residents to a concentration camp. By the end of the war, only eighty Jews from Nasielsk were still alive.

Civilians Taken Prisoner during the Nazi Invasion of Poland, September 1939 (Library of Congress)

Kurtz was able to learn some information about Nasielsk and its residents, but was not able to identify any of the people in the film, as he had hoped. That changed, however, in 2011. After finding the video, Kurtz had donated the footage to the United States Holocaust Memorial Museum, which later posted it on its website. In 2011, a family saw the video on the museum’s site and immediately recognized their grandfather, Maurice Chandler, as one of the boys whose face briefly appears in the film. Kurtz was able to meet with Chandler, who shed more light on the town, the film, and his own experiences during the war.

The stories Kurtz learned about Nasielsk and its people cannot be fully outlined here; however, the brief information shared above starts to give more context to the film. With more detail, the film becomes part of a story, and asking questions can lead to more learning. The United States Holocaust Memorial Museum provides a lesson plan based on the Kurtz family film, in which it lists questions to help viewers engage more critically with the video. For example, the museum asks viewers to consider why the film was created and what is not shown in the footage.

Context, questions, and critical thinking are important whenever we watch historical film footage, whether we watch the footage on a museum’s website or stumble across it on YouTube. However, many videos offer no explanation. Without context, we may enjoy the video and feel a sense of connection with the past, which are both positive outcomes, but we might not actually learn anything about history.

Glenn Kurtz in 2015 (Vera de Kok)

A person can spend hours on YouTube watching random historical footage from around the world. They can watch footage of everyday life in England in 1901, Amsterdam in the 1920s, or early twentieth-century Beijing. My personal favourite is a minute-long clip of a French snowball fight in the 1890s – it’s thrilling! Many of these videos have been colorized, which is not necessarily accurate but can help the past feel more real and immediate, and certainly feels authentic. Reading the video comments, it is clear that many viewers have a romanticized view of the past, and the random footage on YouTube can help confirm that view, even if it is inaccurate. So how do we counter this?

To learn something meaningful about the past and really connect with history, we have to follow Kurtz’s example: learn the context and ask some questions. What was happening in Amsterdam in the early 1920s? Who is represented in this footage? Who and what is left out? Why was it created? How is it similar to or different from today?

It is also important to remember L. P. Hartley’s maxim that the past is a foreign country. We can view overwhelming amounts of material that has survived through history – film, photographs, writings, artwork, and objects – all quickly accessible on our devices. Despite this unprecedented access, we can still never truly know the past nor understand how it really was. Our knowledge of history is limited to surviving sources and shaped by popular memory and present-day understandings.

We may not be able to identify anyone in a given film, or learn as much as Kurtz did, or gain a perfect knowledge of history, but by doing a bit of research, asking questions, and actively engaging with the footage, rather than passively watching, we can gain a slightly clearer understanding of the past. This step from passive consumption to critical thinking and active learning can give more meaning to the videos we watch and, ultimately, help us feel more connected with the past.

Further Reading

Kurtz, Glenn, Three Minutes in Poland: Discovering a Lost World in a 1938 Family Film (New York: Farrar, Straus and Giroux, 2014)

Sheridan Sylvester is a student on the MA in Public History programme at Royal Holloway, University of London.

The Maligned Queen: The Life and Legacy of Katherine Howard by Grace Beattie

It was a cold, clear February morning on which a queen came to die. The king’s privy council and other dignitaries had assembled in the shadow of the White Tower to bear witness. The wind stirred, rustling the cloaks of the waiting crowd. In the doorway of the queen’s apartment, a slight figure appeared. The pale, frightened girl who had become Queen of England only sixteen months earlier stepped into the sunlight. Assisted slowly up the steps towards the scaffold, the young queen looked out to the crowd. She gave a short customary speech, acknowledging her sins and praising the goodness of the king. The girl who had won the heart of an aging king then took one last look at the dazzling winter sky before kneeling. With shaking hands, the young girl laid her head upon the block. One swift stroke of the axe and the deed was done. Katherine Howard, Queen of England and fifth wife of King Henry VIII, breathed her last.

Miniature portrait of a Lady, believed to be Catherine Howard,
Hans Holbein the Younger, c. 1540.

Born between 1519 and 1525, Katherine was the daughter of Edmund Howard and Jocasta Culpepper. After her mother passed away, the adolescent Katherine went to live with her grandmother, the Dowager Duchess of Norfolk. The duchess provided lodging and education for countless noble girls. The teaching these women were offered was sparse, however, and with little supervision and teenage hormones at their peak, sexual exploration became commonplace.

In 1540, Katherine’s uncle secured her a position as Lady-in-Waiting to Queen Anne of Cleves. Katherine’s appearance at the time she arrived at court is a matter of some debate among historians. No portrait depicting the young queen has ever been verified, but it is believed that she was petite and had light auburn hair. It wasn’t long before the young teenager began attracting the attention of the aging king. Time had not been kind to Henry VIII. When he ascended the throne in 1509, everyone praised him for his vigour, intelligence, and attractiveness. Yet now, nearing 50, the king was incredibly obese and had an ulcerous leg. Not the ideal suitor for a vivacious teenager.

Within six months, Henry’s marriage to Anne was annulled, and he married Katherine in August 1540. We can never know what Katherine felt as she walked down the aisle to her towering, obese, middle-aged bridegroom. Today, the image is uncomfortable, to say the least: a dazzling beauty sacrificed to the whims of a monstrous king. Yet Henry was infatuated with Katherine, and it was reported that he “caresses her more than he did the others”.[i]

‘Death of Catherine Howard’ engraving, 1864.

Katherine’s first months as queen were highly successful. She fit the standards of the time for a queen, and her kindness and generosity were widely discussed by her contemporaries. Katherine would intercede on behalf of prisoners (a trademark act of queens at the time) and sent clothes to prisoners in the Tower of London. Most importantly for Henry, she was obedient and agreeable, taking on the motto “No other will but his” after their marriage. Her early success as queen tends to be overshadowed, however, by her dramatic downfall.

Katherine’s downfall came as rapidly as her rise. A single rumour from a single source proved to be her undoing. On 1 November 1541, Henry VIII found a letter in his privy chapel claiming that before their marriage, his young wife had sexual encounters with two men. Katherine’s music teacher, Henry Manox, had boasted about seeing a secret mark on her body. A nobleman, Francis Dereham, was accused of having consummated his relationship with Katherine. Henry, horrified, quickly left Hampton Court, and Katherine would never see him again.

After intense torture, Dereham confessed to having carnal knowledge of the Queen, while also naming another of Katherine’s supposed lovers, Thomas Culpepper. More damning still, the alleged liaison with Culpepper occurred during her marriage to the king. Culpepper and Katherine denied consummating their affair, but their fates were sealed. They had dared to cuckold a king, and they had to pay. Dereham and Culpepper were executed on 10 December 1541. Two months later, Katherine followed them to the scaffold. She may not even have been 21.

Katherine Howard is perhaps the most maligned and misunderstood of Henry VIII’s six wives. Since the Victorian era, there has been a predominantly unsympathetic view of Henry’s youngest wife. She has been labeled by historians such as Alison Weir, David Starkey, and Alison Plowden, among others, as a “frivolous, empty-headed young girl”, “a good-time girl” and “a natural-born tart”. In the era of the #MeToo Movement, she tends to be represented as a victim of sexual violence. Both perspectives – whether whore or victim – reduce the young queen to nothing more than a sexual object. In reality, she was an altogether more complex young woman.

Most of what we know about Katherine Howard comes from court documents compiled during her downfall. At that time, her story was told by men with every reason to paint her as the author of her own demise. Henry’s other wives have had their reputations rehabilitated in recent years, while Katherine’s has remained essentially unchanged. Anne Boleyn is praised as a feminist icon, a woman who knew her own mind and developed her own sexual ethic. Yet Katherine is condemned for the same. And while the innocence of Anne Boleyn is persistently discussed and debated, the condemnation of Katherine seems to be considered unworthy of discussion.

Tamzin Merchant as Catherine Howard in The Tudors TV series (Showtime)

In her fictional portrayals, Katherine is either relegated to the role of a dim-witted tart warranting judgment or cast as a naive victim deserving of pity. In neither case is she given agency or sympathy for her real-life choices. One of the most recent and best-known portrayals of Katherine Howard was in the successful TV series The Tudors (2007-2010), in which she was depicted as an over-sexualized, frivolous, uneducated, foolish girl. Though she appears in only six episodes, Katherine is shown naked on screen more than any other character. In the series, her affair with Culpepper begins out of boredom and a lack of sexual satisfaction from Henry, leaving viewers with little sympathy for her downfall. Her depiction in this widely watched series is just the latest in a long line of fictional portrayals that reinforce her stereotype as a sexual deviant who brings about her own destruction.

Most of Henry VIII’s wives have been constantly reinvented. But not Katherine Howard. The legend and the myth often outweigh the story of the actual woman: a multifaceted figure who was both a victim and an agent of her own fate. She was a real teenage girl living in the moment, tragically unaware of the significant consequences lurking in the shadows. Katherine deserves a new analysis in an age in which a woman’s sexual choices are no longer held against her – one that acknowledges her imperfections but does not condemn her for them.

Grace Beattie is a student on the MA in Public History programme at Royal Holloway, University of London.

Further Reading

Fraser, Antonia, The Six Wives of Henry VIII (Oxford, England: Weidenfeld & Nicolson, 2002).

Russell, Gareth, Young and Damned and Fair: The Life and Tragedy of Catherine Howard at the Court of Henry VIII (London, England: William Collins, 2018).

Schutte, Valerie, “The Fictional Queen Katherine Howard,” Early Modern Women, 12.2 (2018), 146–50

[i] Correspondance Politique, ed. Kaulek, p. 218.

The Eternal Fascination of the Slightly Morbid: H&M, that Tweet, and the Victorian Mourning Dress by Dilara Scholz

When it comes to popular history, the past two years have shown that there are two things that grab people’s attention just that bit more than other topics: fashion history (see, for example, the near-hysteria over the costuming in Bridgerton) and the morbid and slightly obscure. An interesting case that seemed to combine the two recently washed up on my timeline in the form of a simple tweet that happened to be directly related to my research. The photo in question showed a rack of black, frilly and lacy dresses from H&M’s autumn 2020 collection, with the caption ‘seems like H&M is expecting a rush on Victorian funerals’.

This was surprising in two ways. Firstly, as a researcher of Victorian mourning culture, I would not have expected this rather niche topic to be so present in the public mind that a display at a popular fashion store would evoke it so readily. And yet the tweet, with its tongue-in-cheek reference to Victorian mourning, really did seem to resonate with the public in Britain and beyond. Indeed, as of 31 August, the original post had garnered an incredible 22,400 retweets and 297,800 likes, and it can only be assumed that the majority of people who liked it are not enthusiasts for this relatively obscure topic. As with any specialist subject, sudden popularity invariably draws comment from hobbyists and hobby historians, many of whom shared their views on the post, judging the quality of the dresses and attempting to clear up popular misconceptions about Victorian mourning dress and what it actually entailed. Other ‘onlookers’ expressed their fascination with a topic they had never seen or heard of, while also expressing the desire to learn more.

A surprising encounter with the culture of the past

Aside from the general fascination with Victorian fashion or the slight morbidity of the topic of funeral dress, the season we are about to enter may also have played a role here. Some Twitter comments mentioned ‘coven’ and other witchy themes that match the end of summer, the beginning of ‘Pumpkin Spice Latte’ season and the coming of Halloween. Every year, the Halloween season seems to re-invigorate people’s interest in magic and everything mystical and witchy, even in Britain, where this originally Irish and now strongly American ‘feast of the dead’ has no long-standing tradition. Our popular visions of historic funerary culture seem to be closely tied to this, and to All Hallows’ Eve, when the veil between the worlds of the living and the dead is said to be lifted. Victorian mourning culture certainly fits into this theme: it is high season for all things gothic, a time for subcultures such as Goth, Lolita and Steampunk to stand centre-stage rather than on the periphery of fashion culture. Another factor that might explain this apparently sudden fascination with Victorian mourning dress is that society currently finds itself in a state of collective grief as a result of the pandemic, which may make some of us more conscious of death and of the customs linked to death and bereavement.

John Everett Millais, ‘A Widow’s Mite’ (1870) (Birmingham Museum & Gallery)

It is also important to note our 21st-century alienation from mourning rituals and grief. In a departure from centuries of tradition, for example, black is less and less often the colour of choice at funerals and is no longer directly associated with mourning and death. In fact, even before the pandemic, mourning and death had become much less visible in everyday life, at least in most Western societies, than they had been for our ancestors. A certain fascination with the material expression of a phenomenon that is now so much less visible to us is not surprising.

Having observed popular fashion history for a while now, I had always regarded Victorian ‘mourning dress’ inspired garments as being on the periphery of the ever-popular corset tops and Bridgerton-inspired empire dresses that have become popular since the launch of the show in the lockdown winter of 2020. The episode has shown how fashion history provides an unusual case of a historical theme in which academics and fans come together as one, vividly exchanging opinions and ideas and learning from one another. Suddenly, a topic comes alive through entertainment, drawing in more and more people, as revealed by popular YouTube channels such as those run by Mina Le and Karolina Zebrowska, who are followed by academics and fashion enthusiasts alike. As fashion historian Lou Taylor, author of the most exhaustive monograph on mourning dress to date, writes: ‘The study of dress (…) is a key which opens the door to a deeper understanding of the developments that take place in society and its social ambitions and aspirations’. This should not be underestimated, especially when it comes to dress connected to rituals of the past, as material culture can help us to understand the emotional language of the past, reflected in fabric, colour and design, and shaped by a number of other factors, including the intricate gender dynamics at play.

Victorian Mourning Culture

Mourning and half-mourning dresses as depicted in Myras’s Journal of Fashion and Dress, December 1875 (D. Scholz)

It is no coincidence that the creator of the original tweet nailed the period when writing ‘Victorian funerals’ without (presumably) being a historian. Victorian mourning culture in general, and the Victorian funeral in particular, have become somewhat ‘iconic’ among writers and others interested in such things, and are often viewed as a ‘celebration of mourning’ because grief was anything but muted. Over the course of the second half of the 19th century, there were major changes not only in rituals of mourning but also in burials and in the design and function of cemeteries; cremation became ‘a thing’ for the first time; and the cultural impact of all this was enormous. No other period in British history is known for its funeral culture in quite the same way as the Victorian era. The Victorian funeral was all about material ‘expression’, but this certainly does not mean that it was a superficial business. A number of historians, as well as some of those who witnessed these funerals, would argue that all of the ritualistic pageantry was ‘show’, but in my own extensive research on the topic I have noticed patterns: certain rituals not only seem to have been highly practical, but may also have served as a comfort to the mourner, with mourning dress working as a signifier to let society know that this person needed a different type of attention.

The emergence of this more demonstrative culture makes sense: industrialisation made the mass production of goods (even ephemeral ones) possible for the first time, partly enabling the rise of the so-called mourning craze. Suddenly, all manner of goods could be purchased with a black border, and all types of clothing (even children's clothes and underwear) were available with a black ribbon. James Stevens Curl has emphasised that these mourning accessories were naturally signifiers not only of grief but also of social position and status, and the rapidly expanding middle class was keen to do what the gentry did: display prosperity and gentility in life as well as in death. Of course, the extent to which mourning was 'performed' also came to represent how much the deceased had been loved, so mourners tried to outdo each other in the funeral, burial, and memorial arrangements they organised for their departed loved ones.

Mourning Dress

The description actually used here, 'funeral dress', is incorrect, as the dresses shown and discussed in the context and aftermath of the post were not funeral garments but mourning dresses.

As rightly said, not every black dress seen in the context of Victorian fashion was a mourning dress: the 'mourning craze' and the peak of the materialisation of mourning culture almost coincided with a rise in the popularity of the black dress as such around the 1880s. This owed much to developments in dye technology: black dyes remained expensive, and although the quality of synthetic dyes had much improved, it was still not perfect. Some black dyes tended to go 'rusty', which is why we often find 'brown' mourning dresses that were, of course, never meant to be brown.

The number of mourning garments and related items (dresses, jewellery, stationery) that have survived is also striking. Curl notes that this is simply due to the vast quantity of mourning items that were produced, and this survivorship bias certainly shapes our perception of what we think of as 'Victorian'. The iconic Victorian funeral culture, heavily influenced by Queen Victoria – the Widow of Windsor – has in turn shaped our idea of what the Victorians were about. This especially makes sense if we look at how much fashion and music subcultures such as Goth draw on 'Victorian' style. While fashion and style were rather diverse and differed from decade to decade in the nineteenth century, mourning dress was special, and the 'mourning craze' even led to the establishment of specialised stores – mourning warehouses – that sold 'all things mourning' in the different shades of grief.

Queen Victoria in ‘widow’s weeds’ by André-Adolphe-Eugène Disdéri (1870)

Importantly, mourning for the Victorians was more than mere sentiment and had to be carefully performed in stages, starting with 'first mourning' and moving on to 'half-' or 'second mourning'. Progress through the stages was reflected in the colours as well as the cuts and fabrics that were permitted. A female mourner would go from the deepest black to lighter shades of purple, ultimately reaching a stage at which she could dispense with mourning dress altogether, thereby declaring her full re-entry into society. Wearing mourning dress was not really a matter of choice but part of the prevailing social etiquette, and most fabrics and cuts were restrictive, intended to reflect how the mourner felt. Mourning for men was more minimalist, often expressed simply through a dark suit and a hat-band or armband. While it was never really en vogue to keep mourning short, mourners could choose to stay in mourning indefinitely – just as Queen Victoria did – as a signal to society that the process had not yet concluded.

While this is just a snapshot of the complex culture of Victorian mourning, it may help to put our own ways of dealing with grief into perspective. And that amusing tweet, initially shared for entertainment, has certainly led to a rich discussion about separation, loss, and bereavement, at a time when these things are very much in the public mind. 

Further Reading:

Pat Jalland: Death in the Victorian Family (1996)

Lou Taylor: Mourning Dress – A Costume and Social History (1983)

James Stevens Curl: The Victorian Celebration of Death (2000)

John Morley: Death, Heaven and the Victorians (1971)

Dilara Scholz is a public historian and PhD student at Royal Holloway, University of London. Her doctoral project focuses on gender, emotion and mourning in 19th century England.

The Evolution of a Banksy and Graffiti as Public History by Catriona Cooper

In May 2020, mid-pandemic, street artist Banksy paid a visit to Southampton General Hospital and installed a new artwork. The piece, entitled 'Game Changer', was auctioned off in March to raise money for NHS charities, and sold for a record £14.4m. But this was not the first Banksy to appear in Southampton. In 2010, I was a student in this vibrant coastal city, and the news went round that a Banksy had been spotted near a local nightclub. As students, we thought this was the height of cool and many of us went to view the piece. However, within 24 hours it had been scrubbed over. Returning to the city ten years later, I was walking the dog past the same nightclub when I noticed the graffiti had been repainted and altered to reflect the ongoing pandemic.

Original Banksy ‘No Future’ piece, Mount Pleasant Rd., Southampton, 2010 (Metro)

Banksy’s distinctive graffiti has become a well-recognised form of social commentary and aesthetic protest, from paintings on the side of sexual health clinics to images of war. And while his work may divide opinion, it’s become an important part of urban material history. Access to these very public pieces of art is not restricted, meaning that they can be amended over time, as in the case of the piece I saw as a student in Southampton. It’s a transitory art form; there are communities of practice that develop street art, and the areas where it is produced regularly change. This piece is one that has been adapted over the eleven years it has been in place to reflect changing social feeling. Indeed, a look through historic Google Street View imagery shows how the social commentary the artist is offering has changed as the area around it has changed.

Updated Banksy piece, Southampton, 2020 (Southern Daily Echo)

It is placed on a wall behind a nightclub in an area of Southampton where the gentrified student zone borders one of the poorest districts of the city. The image is of a child sitting on a pavement holding a balloon, with the words 'No Future' in red letters above. The text was thought to be a statement about environmental issues, but could equally be a commentary on poverty within the city. Moving forwards in time, we see butterflies added to the piece as the child is scrubbed out, and more tags appear along with other statements, such as 'Expand your horizons'. The child can still be seen, but she is increasingly overshadowed by other motifs and images.

Banksy piece updated with coronavirus germ, March 2021

In 2020, the wall appears to have been repainted white and the same image of the child added, but instead of a balloon forming the 'o' in 'No Future', the string leads to a balloon with the words 'Our Future' above. The logo of the environmentalist protest group Extinction Rebellion sits within the 'O', suggesting this is another comment on the climate crisis. As with the previous piece, within months it had been altered: a virus was added at the end of the balloon string, the words 'Our Future' were painted over, and a speech bubble proclaiming 'Fuck Banksy' was added.

Etching of a ship on wall of St. Thomas the Martyr Church, Winchelsea, East Sussex

While Banksy’s pieces are largely celebrated, they sit within a modern form of graffiti that has often been condemned as vandalism. However, we shouldn’t separate the study and consumption of Banksy’s work from generally less-admired pieces of historic graffiti. Both the mostly overlooked art form of modern tagging and culturally celebrated street art are the results of people from a variety of walks of life making a comment about the world around them. And people have been marking their environments in this way for much longer than we tend to assume. Historic sites of incarceration are often adorned with inscriptions, from the smallest initials to much more detailed engravings that tell stories of the lives of the imprisoned. The brickwork of Southampton likewise became a drawing board for soldiers awaiting deployment for the D-Day landings, and their initials are now being studied for clues about who they were. The study of medieval graffiti has also become the subject of significant research over the last decade (with Champion, Cohen and Wright leading the field). In almost every conceivable historic space, we find people etching little parts of their lives into their built environment. An inscription of a ship in a church might be a sailor praying for safe passage, or an outline of a shoe (Figure 5) a nod to the ritual practice of concealing shoes within walls. In these and other intriguing examples, we see the lived experiences of past peoples playing out through their inscriptions, or, as Wright puts it, 'capturing their hopes and fears' using a visual language.

Engraved shoe, Wressle Castle, East Yorkshire

Historians, archaeologists and others increasingly view graffiti as pieces of our collective past, and efforts are being made to record and study them. However, graffiti have often gone through a process of change and removal. Indeed, medieval graffiti usually have to be studied with the help of raking light sources and close observation. The ship and shoe etchings were originally carved into painted walls, making them stand out, but during the Reformation the iconoclastic destruction of religious art in British churches stripped away the paintwork, and the etchings went from being highly visible to faint lines in the stonework.

I find the ‘girl with the balloon’ piece originally produced by Banksy fascinating because the quick alterations to both the original and the new piece reflect local discontent and highlight fraught social issues within this area of Southampton. While the original might have been a statement on what in 2010 were termed ‘environmental issues’, I feel the choice of location is important. The siting of the piece, between the run of student bars on Bevois Valley and the deprived area of Northam, means that we are seeing social tensions play out through the addition and removal of layers of paint on a wall. At the same time, the city of Southampton is gentrifying, and residents are being pushed out of areas close to both the University of Southampton and Southampton Solent University. Are we seeing here the same inscribing of fears, whether from an artist drawing attention to the climate crisis or from residents of the area being forced to relocate to other districts of the city? Does this quickly changing piece of street art, subject as it is to political iconoclasm, present the changing values of the local population and a social commentary on an area in flux within a city bidding to become the UK’s City of Culture 2025?

Catriona Cooper is Senior Fellow in History, Heritage and Media in the Department of History at Royal Holloway, University of London.

From Enslaved Princess to Royal Protégée: Queen Victoria’s Forgotten Black Goddaughter by Emily Murrell

Anyone who visited Osborne House, Queen Victoria’s holiday home, in October 2020 could not have missed Hannah Uzor’s elegant portrait of Sarah Forbes Bonetta, the Queen’s forgotten goddaughter. The painting, based on the portrait above by Camille Silvy, was commissioned by English Heritage to commemorate Black History Month and is the first instalment of a developing scheme to reintroduce historical Black figures into its heritage sites.

Starting the campaign with Bonetta is no coincidence, considering the remarkable life she led as a Black woman in Britain. Within a matter of months, her life as an orphaned slave was completely transformed as she became goddaughter and protégée to the most powerful ruler in the world. Her life was full of both tragedy and affluence, but, more significantly, she became a symbol of racial attitudes in the Victorian era and deserves greater recognition within Victorian history today.

Drawing of a very young Sarah Forbes Bonetta in F.E. Forbes, Dahomey and the Dahomans, 1851.

Bonetta is believed to have been born in the Yoruba Empire – now Nigeria and the adjacent territories – in around 1843. Her misfortunes began at the age of five, when she was orphaned during the Okeadon War and taken as a slave by King Ghezo of Dahomey. Inter-tribal warfare was rife between West African kingdoms fighting for land and a monopoly over the Atlantic slave trade. The slave trade was an indication of power and the source of wealth for many rulers in the region, particularly Ghezo, who claimed that he would do anything for the British aside from giving up his trading networks.

After she had been held as a slave for two years, her life in Britain began in irony, as a ‘diplomatic gift’ to the Queen of a country that had recently abolished slavery. In 1850, Queen Victoria ordered Captain Frederick E. Forbes to travel to Dahomey and negotiate a deal with King Ghezo that would end his involvement in the Atlantic slave trade. The details of Forbes’ journey to Africa and these unsuccessful negotiations are outlined in his published journals, Dahomey and the Dahomans. It is here, listed among other gifts of ‘rum’ and ‘cloth’, that we first learn of the ‘captive girl’ who would be returning to Britain with him as a gift for the Crown. Forbes later notes that she quickly became a favourite among the crew and emphasises his moral efforts to ‘save’ her from a worse fate at the hands of the Dahomean king. The speculation regarding her ‘noble’ origins also started in Forbes’ writings, as he suggested that she must have been of ‘good’ blood to justify the king keeping her as a slave. Before her journey to Britain, the girl was baptised under a new name, Sarah Forbes Bonetta, after the captain and his ship, HMS Bonetta.

Upon arriving in Britain, Forbes arranged an audience with the Queen, who quickly took a liking to Bonetta and, as the captain had expected, offered to support her education and middle-class upbringing. In her diary, Queen Victoria describes her as ‘an intelligent little thing’, an opinion expressed with a surprise many shared, owing to the common belief that Black people were incapable of being educated to a high standard. Bonetta’s life thus became something of a social experiment to prove or disprove the theories of the scientists who created these misconceptions. She became a hot topic in phrenology circles, whose members had long used skull measurements as supposed evidence of intellectual inferiority in Black people. The Brighton Herald even described her skull formation as ‘almost Caucasian in its regularity’ when discussing her life on the eve of her wedding, reflecting long-standing racial biases and an inability to move past stereotypical assumptions.

Portrait photograph of Sarah Forbes Bonetta by William Bambridge, 1856 (Royal Collection Trust)

Much of her early education took place at an all-girls missionary school run by the Church Mission Society in Sierra Leone. She was moved there after it was suggested that the British climate was damaging her health, yet another common racial stereotype. Sierra Leone already had an established history of Black British settlement after the ‘Province of Freedom’ mission sought to resettle London’s ‘Black Poor’ there in 1787, a scheme introduced to eliminate the new financial burdens that emerged after these men and women were freed from slavery. This first colony was doomed by disease, but its successor founded Freetown, where Bonetta would later be sent. Her missionary education became a tool for the British to exaggerate their new moral mission of liberating Black populations. Without any say in the matter, Sarah became the model of the ideal Black Victorian woman and the image of radical change the British wanted to display. After returning to England, Bonetta lived with a Christian family in Kent, who spoke fondly of her ‘lively disposition’, and she later moved to Brighton to live with Miss Sophia Welsh, who oversaw her final introduction into society.

Now a ‘civilised’ member of British society, Sarah easily made a name for herself in Brighton’s social circles with her lively character and musical talents. Her popularity drew the attention of James Davies, who requested to meet her with a view to proposing marriage. As her guardian, it fell to Queen Victoria to find her a suitable husband, and the 33-year-old wealthy merchant from Lagos fitted the role perfectly. The wedding took place in August 1862 and was seen as one of the most diverse events of its time. The Brighton Guardian reported the wedding in a short article which was subsequently reprinted nationwide. Most notably, the report dedicated a chunk of its brief account to the ‘absence of that abruptness’ in Sarah’s features, and the attention to her appearance continued as she was described as lacking the characteristic ‘ferocity’ of the stereotyped African woman. These sentences reveal the media’s efforts to justify Bonetta’s status by using racial stereotypes to highlight her separation from white society, but also to portray her as unusual within the Black community. Described as a ‘pleasant confusion’, the way she was presented and perceived by British society was an indication of the deeply ingrained racism that continued to thrive during her lifetime. In signing the marriage certificate, Sarah wrote her full name as ‘Ina Sarah Forbes Bonetta’, the Yoruban name a nod to her African heritage and the culture she had lost along her journey to status in England. After the wedding, the renowned photographer Camille Silvy captured several portraits of Bonetta and her new husband, permanently confirming her status in Britain and placing her as a symbol against the racial stereotypes of the time.

Bonetta with her husband, James Davies, 1862 by Camille Silvy (National Portrait Gallery)

Over the next decade, Sarah and James had four children. Their firstborn was named Victoria after the Queen, and she too became goddaughter to her namesake. The final decades of Bonetta’s short life were much less remarkable than her early ones. Her husband ran into frequent financial difficulties, ultimately losing much of the couple’s business and estate to bankruptcy. Despite moving to Madeira with her three youngest children to escape the stress, the burden followed her. This, combined with incurable tuberculosis, led to her death in 1880.

Like many historical Black figures, Bonetta has been overshadowed by her white counterparts, and even as we reintroduce her to the narrative we remain unaware of her private thoughts. What seems most striking is that her education and upbringing, as a member of the Victorian middle class, were not unique at all. The position she assumed in Victorian high society as a Black woman is therefore pivotal to understanding how race relations were changing in the Victorian era. In Black and British: A Forgotten History, David Olusoga describes the ‘great Victorian moral mission’ that came from the inflation of the British ego in the wake of the abolition of slavery. Bonetta’s life developed as a test of whether the British were capable of acting as the world’s moral leaders, and she accordingly became the poster child for this changing racial conversation in Britain.

At the cost of her past and her cultural identity, Sarah Forbes Bonetta became a token representative of the Victorian efforts to lead the change in social and racial liberalism, though the reality of her treatment was far from enlightened. She’s an important figure representing the historic diversity of British culture and heritage, yet her story, along with those of many other Black men and women, has been largely ignored. The inclusion of her portrait as part of English Heritage’s Black History Month tributes is a step toward filling this void, but we must also be aware of its temporary nature and continue to work for more permanent incorporations of Black history within British history.

Emily Murrell is a student on the MA in Public History programme at Royal Holloway, University of London.

Further Reading:

  • Caroline Bressey, ‘Of Africa’s Brightest Ornaments: A Short Biography of Sarah Forbes Bonetta’ Social & Cultural Geography, 6:2 (2005) 253-266.
  • Joan Anim-Addo, ‘Bonetta [married name Davies], (Ina) Sarah Forbes [Sally]’, Oxford Dictionary of National Biography (17 September 2015).
  • Sarah Young, ‘English Heritage Unveils Portrait of Queen Victoria’s African Goddaughter to Mark Black History Month’, The Independent, 7 October 2020.