The Eternal Fascination of the Slightly Morbid: H&M, that Tweet, and the Victorian Mourning Dress by Dilara Scholz

When it comes to popular history, the past two years have shown that there are two things that grab people’s attention just that bit more than other topics: fashion history (see, for example, the near-hysteria over the costuming in Bridgerton) and the morbid and slightly obscure. An interesting case that seemed to combine the two recently washed up on my timeline in the form of a simple tweet that happened to be directly related to my research. The photo in question showed a rack of black, frilly and lacy dresses from H&M’s autumn 2020 collection, with the caption ‘seems like H&M is expecting a rush on Victorian funerals’.

This was surprising in two ways. Firstly, as a researcher of Victorian mourning culture, I would not have expected this rather niche topic to be so present in the public mind that a display at a popular fashion store would evoke it so readily. And the tweet, with its tongue-in-cheek reference to Victorian mourning, really did seem to resonate with the public in Britain and beyond. Indeed, as of 31 August, the original post had garnered an incredible 22,400 retweets and 297,800 likes, and it can only be assumed that the majority of people who liked it are not enthusiasts for this relatively obscure topic. As with any specialist subject, sudden popularity invariably leads to demands for comment, and many hobbyists and hobby historians shared their views on the post, not only judging the quality of the dresses but also attempting to clear up popular misconceptions about Victorian mourning dress and what it actually entailed. Other ‘onlookers’ expressed their fascination with a topic they had never seen or heard of before, along with a desire to learn more.

A surprising encounter with the culture of the past

Aside from the general fascination with Victorian fashion or the slight morbidity of the topic of funeral dress, the season we’re about to enter may also have played a role here. Some Twitter comments mentioned ‘coven’ and other witchy themes that match the end of summer, the beginning of ‘Pumpkin Spice Latte’ season and the coming of Halloween. Every year, the Halloween season seems to reinvigorate people’s interest in magic and everything mystical and witchy, even in Britain, where this originally Irish and now strongly American ‘feast of the dead’ has no long-standing tradition. Our popular visions of historic funerary culture seem to be closely tied to this, and to All Hallows’ Eve, when the veil between the worlds of the living and the dead is said to be lifted. Victorian mourning culture certainly fits this theme: it is high season for all things gothic, and a time for subcultures such as Goth, Lolita and Steampunk to stand centre-stage rather than on the periphery of fashion culture. Another factor that might explain this apparently sudden fascination with Victorian mourning dress is that society currently finds itself in a state of collective grief as a result of the pandemic, which may make some of us more conscious of death and of the customs linked to death and bereavement.

John Everett Millais, ‘A Widow’s Mite’ (1870) (Birmingham Museum & Art Gallery)

It is also important to note our 21st-century alienation from mourning rituals and grief. In a departure from centuries of tradition, for example, black is less and less often the colour of choice at funerals and is no longer directly associated with mourning and death. In fact, even before the pandemic, mourning and death had become much less visible in everyday life, at least in most Western societies, than they had been for our ancestors. A certain fascination with the material expression of a phenomenon that is now so much less visible to us is therefore not surprising.

Having observed popular fashion history for a while now, I had always regarded Victorian mourning-dress-inspired garments as being on the periphery of the ever-popular corset tops and Bridgerton-inspired empire dresses that have proliferated since the launch of the show in the lockdown winter of 2020. This episode has shown how fashion history provides an unusual case of a historical theme in which academics and fans come together as one, vividly exchanging opinions and ideas and learning from one another. Suddenly, a topic comes alive through entertainment, drawing in more and more people, as revealed by popular YouTube channels such as those run by Mina Le and Karolina Zebrowska, who are followed by academics and fashion enthusiasts alike. As the fashion historian Lou Taylor, author of the most exhaustive monograph on mourning dress to date, writes: ‘The study of dress (…) is a key which opens the door to a deeper understanding of the developments that take place in society and its social ambitions and aspirations’. This should not be underestimated, especially when it comes to dress connected to rituals of the past, as material culture can help us to understand the emotional language of the past, reflected in fabric, colour and design, and shaped by a number of other factors, including the intricate gender dynamics at play.

Victorian Mourning Culture

Mourning and half-mourning dresses as depicted in Myras’s Journal of Fashion and Dress, December 1875 (D. Scholz)

It is no coincidence that the creator of the original tweet nailed the period when writing ‘Victorian funerals’ without (presumably) being a historian. Victorian mourning culture in general, and the Victorian funeral in particular, have become somewhat ‘iconic’ among writers and others interested in such things, and are often viewed as a ‘celebration of mourning’ because grief was anything but muted. Over the course of the second half of the 19th century, there were major changes not only in rituals of mourning but also in burials and in the design and function of cemeteries; cremation became ‘a thing’ for the first time; and the cultural impact of all this was enormous. No other period in British history is known for its funeral culture in quite the same way as the Victorian era. The Victorian funeral was all about material ‘expression’, but this certainly does not mean that it was a superficial business. A number of historians, as well as some of those who witnessed these funerals, would argue that all of the ritualistic pageantry was ‘show’, but in my research so far I have noticed patterns that suggest otherwise: certain rituals not only seem to have been highly practical but may also have served as a comfort to the mourner, with mourning dress working as a signifier to let society know that this person needed a different type of attention.

The emergence of this more demonstrative culture makes sense: industrialisation finally made the mass production of goods (even ephemeral ones) possible, partly enabling the rise of the so-called mourning craze. Suddenly, all manner of goods could be purchased with a black border, and all types of clothing (even children’s clothes and underwear) were available with a black ribbon. James Stevens Curl has emphasised that these mourning accessories were naturally not only signifiers of grief but also of social position and status, and the rapidly expanding middle class was keen to do what the gentry did: display prosperity and gentility in life as well as in death. Of course, the extent to which mourning was ‘performed’ also represented, to a degree, how much the person was loved, so mourners tried to outdo each other in the funeral, burial and memorial arrangements they organised for their departed loved ones.

Mourning Dress

Strictly speaking, the description used in the tweet, ‘funeral dress’, is incorrect, as the dresses shown and discussed in the aftermath of the post were not funeral dresses at all: mourning dress was worn throughout the mourning period, not only at the funeral itself.

As some commentators rightly pointed out, not every black dress seen in the context of Victorian fashion was a mourning dress, as the ‘mourning craze’ and the peak of the materialisation of mourning culture almost coincided with an increase in the popularity of the black dress as such around the 1880s. This was due to developments in dye quality: while black dyes were still expensive, synthetic dyes had much improved, though they were still not perfect. Some black dyes also tended to go ‘rusty’, which is why we can often find ‘brown’ mourning dresses that were, of course, never meant to be brown.

The number of mourning garments and related items (dresses, jewellery, stationery) that have survived is also curious. Curl notes that this is simply due to the vast quantity of mourning items that were produced, so this survivorship bias certainly shapes our perception of what we think of as ‘Victorian’. The iconic Victorian funeral culture, heavily influenced by Queen Victoria – the Widow of Windsor – has likewise shaped our idea of what the Victorians were about. This especially makes sense if we look at how much fashion and music subcultures such as Goth are inspired by ‘Victorian’ style items. While fashion and style were rather diverse and differed from decade to decade in the nineteenth century, mourning dress was special, and the ‘mourning craze’ even led to the establishment of specialised stores – mourning warehouses – that sold ‘all things mourning’ in the different shades of grief.

Queen Victoria in ‘widow’s weeds’ by André-Adolphe-Eugène Disdéri (1870)

Importantly, mourning for the Victorians was more than mere sentiment and had to be carefully performed in stages, starting with ‘first mourning’ and then moving on to ‘half-’ or ‘second mourning’. Progress through the different stages was reflected in the colours, cuts and fabrics that were permitted. A female mourner would go from the deepest black through lighter shades of purple, ultimately reaching a stage at which she could dispense with mourning dress altogether, thereby declaring her full re-entrance into society. Wearing mourning dress was not really a matter of choice but part of the prevailing social etiquette, with the restrictive fabrics and cuts intended to reflect how the mourner felt. Mourning for men was more minimalist, often expressed through nothing more than a dark suit and a hatband or armband. While it was never really en vogue to keep mourning short, mourners could choose to stay in mourning indefinitely – just as Queen Victoria did – as a signal to society that the process had not yet been concluded.

While this is just a snapshot of the complex culture of Victorian mourning, it may help to put our own ways of dealing with grief into perspective. And that amusing tweet, initially shared for entertainment, has certainly led to a rich discussion about separation, loss, and bereavement, at a time when these things are very much in the public mind. 

Further Reading:

Pat Jalland: Death in the Victorian Family (1996)

Lou Taylor: Mourning Dress – A Costume and Social History (1983)

James Stevens Curl: The Victorian Celebration of Death (2000)

John Morley: Death, Heaven and the Victorians (1971)

Dilara Scholz is a public historian and PhD student at Royal Holloway, University of London. Her doctoral project focuses on gender, emotion and mourning in 19th century England.

The Evolution of a Banksy and Graffiti as Public History by Catriona Cooper

In May 2020, mid-pandemic, the street artist Banksy paid a visit to Southampton General Hospital and installed a new artwork. The piece, entitled ‘Game Changer’, was auctioned off in March 2021 to raise money for NHS charities, and sold for a record £14.4m. But this was not the first Banksy to appear in Southampton. In 2010, I was a student in this vibrant coastal city and the news went round that a Banksy had been spotted near a local nightclub. As students, we thought this was the height of cool and many of us went to view the piece. However, within 24 hours it had been scrubbed over. Returning to the city ten years later, I was walking the dog past the same nightclub and noticed that the graffiti had been repainted and altered to reflect the ongoing pandemic.

Original Banksy ‘No Future’ piece, Mount Pleasant Rd., Southampton, 2010 (Metro)

Banksy’s distinctive graffiti has become a well-recognised form of social commentary and aesthetic protest, from paintings on the side of sexual health clinics to images of war. And while his work may divide opinion, it’s become an important part of urban material history. Access to these very public pieces of art is not restricted, meaning that they can be amended over time, as in the case of the piece I saw as a student in Southampton. It’s a transitory art form; there are communities of practice that develop street art, and the areas where it is produced regularly change. This piece is one that has been adapted over the eleven years it has been in place to reflect changing social feeling. Indeed, a look through historic Google Street View imagery shows how the social commentary the artist is offering has changed as the area around it has changed.

Updated Banksy piece, Southampton, 2020 (Southern Daily Echo)

It is placed on a wall behind a nightclub in an area of Southampton where the gentrified student zone borders one of the poorest districts of the city. The image is of a child sat on a pavement holding a balloon, with the words ‘No Future’ in red letters above. The text was thought to be a statement about environmental issues, but could equally be a commentary on poverty within the city. Moving forward in time, we see butterflies added to the piece as the child is scrubbed out, and more tags appear along with other statements, such as ‘Expand your horizons’. The child can still be seen, but she is increasingly overshadowed by other motifs and images.

Banksy piece updated with coronavirus germ, March, 2021

In 2020, we can see that the wall appears to have been repainted white; the same image of the child is added but instead of a balloon forming the ‘o’ in ‘No Future’, the string leads to a balloon with the words ‘Our Future’ above. The logo of the environmentalist protest group Extinction Rebellion sits within the O, suggesting this is another comment on the climate crisis. As with the previous piece, within months it had been adulterated with a virus being added at the end of the balloon string, the words ‘Our Future’ being painted over and a speech bubble claiming ‘Fuck Banksy’ added.

Etching of a ship on wall of St. Thomas the Martyr Church, Winchelsea, East Sussex

While Banksy’s pieces are largely celebrated, they sit within a modern form of graffiti that has often been condemned as vandalism. However, we shouldn’t separate the study and consumption of Banksy’s work from generally less-admired pieces of historic graffiti. Both the mostly overlooked art form of modern tagging and culturally celebrated street art are the results of people from a variety of walks of life making a comment about the world around them. And people have been marking their environments in this way for much longer than we tend to assume. Historic sites of incarceration are often adorned with inscriptions, from the smallest initials to much more detailed engravings that tell stories of the lives of the imprisoned. The brickwork of Southampton likewise became a drawing board for soldiers awaiting deployment as part of the D-Day landings, and their initials are now being studied for clues about who they were. The study of medieval graffiti has also become the subject of significant research over the last decade (with Champion, Cohen and Wright leading the field). In almost every conceivable historic space, we find people etching little parts of their lives into their built environment. An inscription of a ship in a church might be a sailor praying for safe passage, or an outline of a shoe (Figure 5) a nod to the ritual practice of concealing shoes within walls. In these and other intriguing examples, we see the lived experiences of past peoples playing out through their inscriptions, or, as Wright puts it, ‘capturing their hopes and fears’ in a visual language.

Engraved shoe, Wressle Castle, East Yorkshire

Historians, archaeologists and others increasingly view graffiti as pieces of our collective past, and efforts are being made to record and study them. However, such marks have often gone through a process of change and removal. Indeed, medieval graffiti usually has to be studied with the help of raking light sources and close observation. The ship and shoe etchings were originally carved into painted walls, making them stand out, but during the Reformation the iconoclastic destruction of religious art in British churches saw the removal of paintwork, and the etchings went from being highly visible to faint lines in the stonework.

I find the ‘girl with the balloon’ piece originally produced by Banksy fascinating, as the quick adulterations to both the original and the new piece reflect local discontent and highlight fraught social issues within this area of Southampton. While the original might have been a statement on what in 2010 were termed ‘environmental issues’, I feel the choice of location is important. The siting of the piece, between the student run of bars on Bevois Valley and the deprived area of Northam, has meant that we are seeing social tensions play out through the addition and removal of layers of paint on a wall. At the same time, the city of Southampton is gentrifying, and residents are being pushed out from areas close to both the University of Southampton and Southampton Solent University. Are we seeing here the same inscribing of fears, whether from an artist drawing attention to the climate crisis or from residents being forced to relocate to other districts of the city? And does the quickly changing nature of this piece of street art, subjected as it is to political iconoclasm, present the changing values of the local population and a social commentary on an area in flux, within a city bidding to become the UK’s City of Culture 2025?

Catriona Cooper is Senior Fellow in History, Heritage and Media in the Department of History at Royal Holloway, University of London.

From Enslaved Princess to Royal Protégée: Queen Victoria’s Forgotten Black Goddaughter by Emily Murrell

Anyone who visited Osborne House, Queen Victoria’s holiday home, in October 2020 wouldn’t have been able to miss Hannah Uzor’s elegant portrait of Sarah Forbes Bonetta, the Queen’s forgotten goddaughter. The painting, based on a portrait by Camille Silvy, was commissioned by English Heritage to commemorate Black History Month and is the first instalment of their developing scheme to reintroduce historical Black figures into their heritage sites.

Starting their campaign with Bonetta is no coincidence, considering the remarkable life she led as a Black woman in Britain. Within a matter of months, her life as an orphaned slave was completely transformed as she became goddaughter and protégée to the most powerful ruler in the world. Her life was full of both tragedy and affluence, but, more significantly, she became a symbol in Victorian debates about race, and she deserves greater recognition within Victorian history today.

Drawing of a very young Sarah Forbes Bonetta in F.E. Forbes, Dahomey and the Dahomans, 1851.

Bonetta is believed to have been born in the Yoruba Empire – now Nigeria and the adjacent territories – in around 1843. Her misfortunes began at the age of five, when she was orphaned during the Okeadon War and taken as a slave by King Ghezo of Dahomey. Inter-tribal warfare was rife among West African kingdoms, which fought for land and for a monopoly over the Atlantic slave trade. The slave trade was an indication of power and the source of wealth for many rulers in the region, particularly Ghezo, who claimed that he would do anything for the British aside from giving up his trading networks.

After she had been held as a slave for two years, her life in Britain began with an irony: she arrived as a ‘diplomatic gift’ to the Queen of a country that had recently abolished slavery. In 1850, Queen Victoria ordered Captain Frederick E. Forbes to travel to Dahomey and negotiate a deal with King Ghezo that would end his involvement in the Atlantic slave trade. The details of Forbes’ journey to Africa and these unsuccessful negotiations are outlined in his published journals, Dahomey and the Dahomans. It is here, listed among other gifts of ‘rum’ and ‘cloth’, that we first learn of the ‘captive girl’ who would be returning to Britain with him as a gift for the crown. Forbes later notes that she quickly became a favourite among the crew and emphasises his moral efforts to ‘save’ her from a worse fate at the hands of the Dahomean king. The speculation regarding her ‘noble’ origins also started in Forbes’ writings, as he suggested that she must have been of ‘good’ blood to justify her having been kept as a slave to the king. Before her journey to Britain, the girl was baptised under a new name, Sarah Forbes Bonetta, after the captain and his ship, HMS Bonetta.

Upon arriving in Britain, Forbes arranged an audience with the Queen, who quickly took a liking to Bonetta and, as the captain had expected, offered to support her education and middle-class upbringing. In her diary, Queen Victoria describes her as ‘an intelligent little thing’, an opinion expressed with a surprise that many shared, owing to the common belief that Black people were incapable of being educated to a high standard. Bonetta’s life thus became something of a social experiment to prove or disprove the scientists who propagated these misconceptions. She became a hot topic in phrenology circles, whose members had long used skull measurements as supposed evidence of intellectual inferiority in Black people. The Brighton Herald even noted her skull formation as ‘almost Caucasian in its regularity’ when discussing her life on the eve of her wedding, reflecting long-standing racial biases and an inability to move past stereotypical assumptions.

Portrait photograph of Sarah Forbes Bonetta by William Bambridge, 1856 (Royal Collection Trust)

Much of her early education took place at an all-girls missionary school run by the Church Missionary Society in Sierra Leone. She was moved there after it was suggested that the British climate was damaging her health – yet another common racial stereotype. Sierra Leone already had an established history of Black British settlement after the ‘Province of Freedom’ mission sought to resettle London’s ‘Black Poor’ there in 1787, a scheme introduced to relieve the financial burdens that emerged as formerly enslaved people were freed. This first colony was doomed by disease, but its successor founded Freetown, where Bonetta would later be sent. Her missionary education became a tool for the British to advertise their new moral mission of liberating Black populations. Without any choice in the matter, Sarah became the model of the ideal Black Victorian woman and the image of radical change the British wanted to display. After returning to England, Bonetta lived with a Christian family in Kent, who spoke fondly of her ‘lively disposition’, and she later moved to Brighton to live with Miss Sophia Welsh, who oversaw her final introductions into society.

Now a ‘civilised’ member of British society, Sarah easily made a name for herself in Brighton’s social circles with her lively character and musical talents. Her popularity drew the attention of James Davies, who requested to meet her with a view to proposing marriage. As her guardian, it was Queen Victoria’s final task to find her a suitable husband, and the 33-year-old wealthy merchant from Lagos fitted the bill perfectly. The wedding took place in August 1862 and was seen as one of the most diverse events of its time. The Brighton Guardian reported the wedding in a short article that was subsequently reprinted nationwide. Most notably, the report devoted a sizeable portion of its brief account to commenting on the ‘absence of that abruptness’ in Sarah’s features. The attention to her appearance continued as she was described as lacking the characteristic ‘ferocity’ of the stereotypical African woman. Such sentences reveal the media’s efforts to justify Bonetta’s status by using racial stereotypes to highlight her separation from white society, but also to portray her as unusual within the Black community. Described as a ‘pleasant confusion’, the way she was presented and perceived by British society was an indication of the deeply ingrained racism that continued to thrive during her lifetime. In signing the marriage certificate, Sarah wrote her full name as ‘Ina Sarah Forbes Bonetta’, the Yoruban name a nod to her African heritage and the culture she had lost along her journey to status in England. After the wedding, the renowned photographer Camille Silvy captured several portraits of Bonetta and her new husband, permanently confirming her status in Britain and positioning her as a symbol against the racial stereotypes of the time.

Bonetta with her husband, James Davies, 1862 by Camille Silvy (National Portrait Gallery)

Over the next decade, Sarah and James had four children. Their firstborn was named Victoria after the Queen, and she too became the Queen’s goddaughter. The final decades of Bonetta’s short life were much less remarkable than her early ones. Her husband ran into frequent financial difficulties, ultimately losing much of their business and estate to bankruptcy. Despite moving to Madeira with her three youngest children to escape the stress, she could not outrun her burdens, and these, combined with incurable tuberculosis, led to her death in 1880.

Like many historical Black figures, Bonetta has been overshadowed by her white counterparts, and even as we reintroduce her to the narrative we remain unaware of her private thoughts. What seems most striking is that, as a member of the Victorian middle class, her education and upbringing were not unique at all. The position she assumed in Victorian high society as a Black woman is therefore pivotal to understanding how race relations were changing in the Victorian era. In Black and British: A Forgotten History, David Olusoga describes the ‘great Victorian moral mission’ that came from the inflation of the British ego in the wake of the abolition of slavery. Bonetta’s life accordingly developed as a test of whether the British were capable of acting as the world’s moral leaders, and she became the poster child for this changing racial conversation in Britain.

At the cost of her past and her cultural identity, Sarah Forbes Bonetta became a token representative of the Victorian efforts to lead the change in social and racial liberalism, though the reality of her treatment was far from enlightened. She’s an important figure representing the historic diversity of British culture and heritage, yet her story, along with those of many other Black men and women, has been largely ignored. The inclusion of her portrait as part of English Heritage’s Black History Month tributes is a step toward filling this void, but we must also be aware of its temporary nature and continue to work for more permanent incorporations of Black history within British history.

Emily Murrell is a student on the MA in Public History programme at Royal Holloway, University of London.

Further Reading:

  • Caroline Bressey, ‘Of Africa’s Brightest Ornaments: A Short Biography of Sarah Forbes Bonetta’ Social & Cultural Geography, 6:2 (2005) 253-266.
  • Joan Anim-Addo, ‘Bonetta [married name Davies], (Ina) Sarah Forbes [Sally]’, Oxford Dictionary of National Biography (17 September 2015).
  • Sarah Young, ‘English Heritage Unveils Portrait of Queen Victoria’s African Goddaughter to Mark Black History Month’, The Independent, 7 October 2020.

From Vanity to Insanity? The perilous battle for female beauty in Georgian Britain by Iona Bentley

“The woman to whom nature has denied beauty, in vain endeavours to make it by art”

The World, 2 January 1755

Stepping through the doors of time, and into the shoes of the Georgian lady, one may expect to be adorned with elegant dresses, glittering jewels, and striking accoutrements. The walls of the National Portrait Gallery and the cabinets of the V&A show us exactly that. For the Beau Monde – a particularly fashionable faction of eighteenth-century elite society – this was their reality. But, if by chance you did happen to find yourself fastened into the buckled straps of the eighteenth-century silk heel, you would most likely also find yourself smeared in grease, painted with poison, and coated in a crust of perfumed powders. Your eyes may itch. Your skin may burn. Your teeth may ache from rot. Ultimately, the battle for beauty in Georgian Britain was fraught with danger, sacrifice, and pain.

‘Six Stages of Mending a Face’, Thomas Rowlandson, 1792 [British Museum]

The women of the rarefied British elite during this period had access to a variety of cosmetics from across the globe. Cosmetics for personal hygiene were, however, used by both men and women, of the upper and middling ranks alike. Requiring time, resources, and an absence of physical labour unattainable to the lower sorts, cleanliness became a sign of social distinction. During the eighteenth century, cosmetics were often used in lieu of regular bathing, as hot water was believed to disturb the humours and foster disease within the body. Instead, Georgians applied substances such as oil of turpentine (now used in industrial solvents), beeswax, and bear fat to clean themselves. Powders were then required to soak up the excess residue produced both by the natural body and by the cosmetic products. It is not hard to imagine (and yet difficult to bring yourself to imagine) how cloying the thick layers of oily product and scented powders would feel against the skin. Heavily perfumed products were necessary to conceal unpleasant odours – which would undoubtedly be present in abundance with such a poor hygiene regime. For the Georgians, however, this was particularly crucial, as bad air (miasma) was believed to cause disease. Some practices of cleanliness would be very recognisable in our own time, including tooth scraping, ear syringing, nail trimming, and plucking, although these were performed alongside other cosmetic techniques that would be unlikely to appear in the twenty-first-century salon, such as bloodletting – popularly used to detoxify the blood in both cosmetic and medical procedures.

After cleansing the body, the second half of the Georgian lady’s cosmetic regime was to paint her face. The early modern obsession with the white English self – as a representation of civility, virtue, and purity – heavily influenced notions of beauty and practices of cosmetics. Throughout the eighteenth century, these ideals of whiteness endured, whilst tanned skin and freckles continued to be negatively associated with the ‘lower’, labouring ranks. As the complete antithesis of the white English self, skin coloured by the sun was commonly thought to mark the unrefined, uneducated, and unfortunate, who could only support themselves by toiling outside. To possess porcelain-white skin was to exude social and ethnic superiority. The most popular means of whitening the skin was the highly toxic ceruse. Commonly associated with the sixteenth-century monarch Elizabeth I, this paste of white lead and vinegar not only acted as a whitening ‘foundation’, but also helped conceal smallpox scars, acne, and freckles. It was the perfect solution for achieving a smooth, porcelain-white complexion; or at least perfect enough when overlooking the severe medical ramifications of coating oneself in lead. Rouge was then added to the cheeks and lips to accentuate the whitened face. Made of cochineal (crushed beetles), fucus (seaweed), and red mercury, the rouge was just as poisonous as the ceruse. Final cosmetic embellishments included darkening the eyebrows and lashes with lead shavings; inserting porcelain veneers over rotting teeth; stuffing balls of cork into the mouth to plump up sagging cheeks; and covering scars with beauty spots made of silk or velvet. Ranging from the mildly uncomfortable to the categorically dangerous, these beautifying techniques no doubt had a daily impact on the lives of the women who used them. Peering into the portraits of eighteenth-century women, it is hard to deny their striking beauty. And yet we can only imagine the anguish and pain that this beauty must have caused them, lying just beneath their painted façade of serenity.

Matthew Darly, ‘The City Rout’, 1776 [British Museum]
Satire of two aged, unattractive women as ‘mutton dressed lamb’.

Surprisingly, the dangers of lead and mercury were recognised even before the eighteenth century. As early as 1598, Doctor Fierovant claimed that women who applied these poisonous cosmetics “to grow more beautiful, become disfigured, hastening olde age before the time” with “black teeth … an offensive breath, with a face half scorched, and an uncleane complexion”. The horrific realities of lead and mercury poisoning are now ever clearer to modern medicine: neurological failure, blindness, breathing difficulties, mood disorders, miscarriages and hallucinations, to name but a few. Georgian women were permanently disfigured, or ultimately perished, in their battle for beauty. Kitty Fisher, a courtesan widely celebrated for her beauty, is one of many who reportedly died as a result of the toxic cosmetics she often donned. Yet, despite the notorious dangers, elite women continued to be drawn in by the addictive allure of cosmetics. Horace Walpole, like many of his contemporaries, feared such women’s deadly fascination with “white lead, of which nothing could break her”. Nothing, that is, but death. Many also decried the immoral implications of artificial beauty. Face painting was linked to sexual promiscuity, self-indulgence, vanity, and deceit. In turn, ridiculing satires of ‘mutton dressed lamb’, ‘the world turn’d upside down’, and the absurdities of cosmetics lined the pages of periodicals and plastered the windows of print shops.

Kitty Fisher, oil on canvas, Nathaniel Hone, 1765
[National Portrait Gallery, London]

So, what drove elite Georgian women to ignore the social admonishments and lethal perils of cosmetics, and ultimately sacrifice their health and life for beauty? Despite the common misconception (or possibly even blatant condescension) of eighteenth-century critics and modern historians alike, this endeavour for beauty was clearly not attempted in vain – or in fact, solely for vanity’s sake. Appearance defined one’s identity: communicating wealth, rank, morality, marital status (or sexual availability), and temperament. As an “invisible standard” of pedigree and civility, to don the correct exterior was to ultimately claim and maintain social status. For elite women, beauty and power became inextricable. In a world of political patriarchy, physical beauty became a formidable weapon, and cosmetics acted as its ammunition. For Georgiana Cavendish, Duchess of Devonshire, beauty was instrumental in her political authority. For Kitty Fisher and Maria Gunning (later Countess of Coventry), beauty brought them wealth, fame, and the independence that followed – but it also brought them death. Historians suggest that the fatal illnesses of both Kitty Fisher and Maria Gunning bore a striking resemblance to the symptoms of lead poisoning, and when considering the prominent use of lead-based ceruse within their respective roles as courtesan and actress, it is highly likely that these toxic cosmetics were the cause of their deaths. Women sought beauty not only for beauty’s sake, but for the social inclusion and influence it afforded them. Yet, just as surely as beauty brought popularity, prestige, and power, pain, sickness, and disfigurement invariably followed. This was evidently a sacrifice many eighteenth-century ladies were willing to make. Ultimately, to argue that insanity or simple vanity drove women to such lethal ends of beauty would be to entirely disregard the agency, admiration, and clout they fought so hard to attain.
So, if you did happen to find yourself in the shoes of the Georgian lady, would that be a sacrifice you’d be willing to make?

Iona Bentley is a student on the MA in Public History programme at Royal Holloway, University of London.

Further Reading:

Marcereau DeGalan, Aimée, ‘Lead White or Dead White? Dangerous Beauty Practices of Eighteenth-Century England’, Bulletin of the Detroit Institute of Arts, vol. 76, no. 1/2 (2002) pp. 38-49.

Phillippy, Patricia, Painting Women: cosmetics, canvases, and early modern culture (Baltimore: Johns Hopkins University Press, 2006).

Schoel, Josie, ‘Cosmetics, Whiteness, and Fashioning Early Modern Englishness’, SEL Studies in English Literature 1500-1900, vol. 60 no. 1 (2020) pp. 1-23.

What Do We Do with the Inglorious Dead? Remembrance and the British Dead of the Irish War of Independence by Edward Madigan

This month sees the centenary of the interment of the Unknown Warrior at Westminster Abbey, which took place on the second-ever Armistice Day, and was, by all accounts, one of the great public spectacles in modern British history. Vast crowds of people, numbering in their hundreds of thousands, lined the route and stood in silence as the gun carriage carrying the unidentified serviceman wound its way through the ancient London streets. Many of those watching the procession had lost loved ones during the Great War and for them, and millions of others across Britain and the wider empire, the Warrior represented lost sons, brothers, husbands and comrades. Shortly before arriving at the Abbey, the cortège paused in Whitehall so that George V could unveil the new permanent Cenotaph, a simple but towering memorial to the absent dead designed by Sir Edwin Lutyens. The laying to rest of one of the fallen among kings and princes in the great parish church of the empire seemed to confer profound meaning on the war. It also supported the consoling idea that the dead had given their lives in defence of honour, freedom and civilisation. For the millions still in mourning across the United Kingdom, it was reassuring to believe that the war had been worth fighting and worth winning and that their lost loved ones had not died in vain. This point was very clearly reinforced by the blunt, three-word inscription on the side of the Cenotaph, ‘The Glorious Dead’. The dead had been honourable, self-sacrificing and chivalrous, while their main battlefield adversaries – the Germans – had been ruthless, malevolent and wantonly destructive.

The Tomb of the Unknown Warrior, Westminster Abbey, 2017

Popular attitudes regarding the meaning of the conflict have shifted a great deal since the 1920s, but our perception of those who died hasn’t really changed much at all and remains quite simplistic. In common with a lot of historians of the Great War, I had hoped that the recent centenaries might lead to a popular re-imagining and re-appraisal of the conflict, but I don’t think in Britain we arrived at any particularly fresh critical understanding. One key reason for this was the persistent focus on the dead at the expense of other groups who were affected by the terrible violence of the war. Servicemen who survived combat but were psychologically traumatised or physically disabled, or both, received very little attention in commemorative ceremonies, centenary-related projects, or media representations of the war. Perhaps most glaringly, the bereaved themselves, with some notable exceptions, were written out of the commemorative narrative during the centenaries. On a more positive note, there was a good deal of new focus on the racial, ethnic and religious diversity of the men who served in the British forces during the war. David Olusoga’s fascinating book, The World’s War, published just as the centenaries began in August 2014, highlighted the degree to which the engine of war drew in millions of people from across Africa and Asia. South Asian soldiers received particular attention, with no fewer than four books on the Indian experience of the war published in 2018 alone. And just last year, Labour MP David Lammy presented a moving Channel Four documentary that exposed the way in which black soldiers and auxiliaries who died while serving with the British forces in Africa were consciously treated differently to their white ‘comrades’. These projects have introduced new characters to the public story of the ‘war to end all wars’ and greatly enriched our understanding of this devastating clash of empires. 
And yet these formerly forgotten groups of soldiers are now remembered and represented in much the same way as the servicemen of the Great War have always been remembered: either as victims to be pitied or heroes to be admired for their endurance and self-sacrifice. This rather two-dimensional ‘heroic victim’ view of the dead is complicated by the stories of the British veterans of the Great War who fought and died in Ireland in 1920 and 1921. These men continue to be almost entirely overlooked in British popular memory, but their experiences challenge the conventional narrative of the war dead and shed light on the human complexity of those honoured in Britain each November.

The cycle of violence known as the Irish War of Independence began on 21 January 1919 when Sinn Féin’s secessionist parliament met in Dublin for the first time and republican volunteers killed two members of the Royal Irish Constabulary (RIC) in Tipperary. The conflict escalated into a full-scale guerrilla war in the spring of the following year when paramilitary forces were deployed from Britain to reinforce the now seriously demoralised RIC and respond to the Irish Republican Army’s increasingly aggressive military campaign. These irregulars initially wore a mix of dark green Irish constabulary and khaki army uniforms and became known as the Black and Tans. About 10,000 of them were recruited between January of 1920 and the end of the war the following summer. A further escalation occurred in July 1920 with the deployment of a force known as the Auxiliary Division, which ultimately numbered some 5,000 men. The Auxiliaries were exclusively composed of ex-officers who operated in highly-mobile and heavily-armed units and wore distinctive Tam O’Shanter caps in the style of Scottish regiments of the British Army. In addition to these two paramilitary groups, the British campaign drew on units of the regular army, which were increasingly used in operations (and targeted by the IRA) in the last year of the conflict. The violence of the war intensified after the introduction of the Restoration of Order in Ireland Act in August 1920; by the beginning of the following month, dozens of combatants and civilians were being killed each week as both the IRA and Crown forces stepped up their operations across the island.

Prime Minister, David Lloyd George, and Chief Secretary, Thomas Hamar Greenwood, inspect temporary cadets of the Auxiliary Division in the grounds of the Foreign Office, London, 5 November 1920 (Irish Times)

Although there were Irishmen in both the Black and Tans and the Auxiliary Division, the great majority of those who served in these paramilitary formations were British, hailing mostly from England and Scotland. Importantly, in the context of British commemorative culture, virtually all of them were veterans of the Great War and many had served with distinction on the Western Front and elsewhere. Most of the regular army personnel stationed in Ireland during the war were also British, often serving in distinctively English or Scottish regiments, and many of these regulars had seen extensive active service between 1914 and 1918. Just under 200 British army officers and other ranks were killed by the IRA in 1920 and 1921. A further 180 British-born members of the paramilitary police forces were killed by republicans during the same period. Virtually all of the latter were veterans of the world war and they often had distinguished records of service. The men who served in the Auxiliary Division alone had received over 633 awards for gallantry during the Great War, including three Victoria Crosses. Many of them, then, would have easily conformed to the popular vision of the war hero by the standards of the time, and indeed by the standards of our own time.

The men killed in a particularly violent episode that took place a century ago this month highlight the direct connection between the Great War and post-war paramilitary violence in Ireland. On the morning of Sunday 21 November 1920, IRA gunmen shot dead 14 officers and ex-officers, and one civilian landlord, in boarding houses and hotels in different parts of central Dublin. Most of the dead men had been serving in some capacity with the Crown forces, but although they had been targeted as spies, only about half were working for British intelligence. All but one of the thirteen former or serving officers killed on the day had been deployed overseas during the world war and several bore obvious scars or disabilities sustained while on active service. One of them, Capt. Leonard Price, the London-born son of a stockbroker, was awarded the Military Cross for gallantry under fire while serving on the Western Front before being posted to Dublin for intelligence work in June 1920. In response to the killings of Price and his comrades, members of the Auxiliary Division and the RIC descended on a Gaelic football match being played at Croke Park in north Dublin that afternoon. They began firing on the crowd, ultimately shooting about sixty unarmed civilians, fourteen of whom were killed or mortally wounded, including Mick Hogan, a Tipperary player after whom the Hogan Stand is named. At some stage later in the day, two senior IRA officers and a civilian were summarily executed while in custody at Dublin Castle. The killings at Croke Park naturally evoked outrage among the nationalist population in Ireland while the assassinations of serving officers, some of whom were still wearing their pyjamas when they were shot, caused an outcry across the water in Britain. On 26 November, just two weeks after the Unknown Warrior had been solemnly laid to rest, there was another major military funeral procession in London for nine of the officers shot in Dublin. 
Six of the dead were carried to Westminster Abbey, while the other three, being Catholic, were taken a short distance further to Westminster Cathedral. Price and his dead comrades were lauded in the press for their gallantry and distinguished service while their killers in the IRA were denounced as the ‘Murder Gang’.

Tribute to officers killed on Bloody Sunday in the Illustrated London News (ILN Archive)

And yet although the killings of the officers on the morning of Bloody Sunday aroused widespread sympathy, much of the British commentary on the conduct of the forces of the Crown from the autumn of 1920 onward was deeply critical. A good deal of this criticism, which was mostly aired in the press and in the course of parliamentary debates, focused on the Crown Forces’ now open policy of ‘reprisals’ for IRA operations. Two months before Bloody Sunday, on the night of 21 September 1920, for example, a group of Auxiliaries and Black and Tans killed two unarmed civilians, set fire to numerous homes and wrecked a local creamery and hosiery factory at Balbriggan in north county Dublin. The ‘sack of Balbriggan’ was by no means the first or the worst incident of British paramilitaries killing unarmed civilians during the war, but it evoked a particularly negative popular reaction in Britain. Prominent journalists, clergymen and politicians, representing a curiously diverse range of political opinion, condemned the events at Balbriggan and the men who had carried them out in the strongest terms, and condemnation of British military policy continued over the following months. Importantly, both The Times and The Manchester Guardian made direct comparisons between the actions of the Crown forces in Ireland and those of German soldiers in Belgium and France in 1914. Former prime minister and Liberal MP Herbert Asquith echoed this view in the Commons, as did the Labour MP and former party leader, Arthur Henderson. The Archbishop of Canterbury, Randall Davidson, a pillar of the British establishment and a major figure in the House of Lords, was also notably outspoken in his condemnation of the policy of reprisals.

Mixed group of Black and Tans and Auxiliaries outside the London and North Western Hotel in Dublin after an unsuccessful IRA attack, April 1921 (National Library of Ireland)

The Dead of the Irish Revolution, a ground-breaking new book by Eunan O’Halpin and Daithí Ó Corráin, provides details on over 2,800 combatant and civilian deaths related to the struggle for independence in Ireland between the Easter Rising in April 1916 and the truce that ended hostilities in July 1921. The painstaking research the authors have carried out sheds new and often sobering light on the nature and extent of the violence that took place in Ireland both during and after the world war. Importantly, it highlights not only the intensity of republican violence, about which a great deal is already known, but also the sheer extent of the killing and destruction of property perpetrated by the forces of the British state. It seems clear, as O’Halpin observes in the introduction, that from early 1920 ‘the British government embarked on a policy of indiscriminate brutalisation of nationalist Ireland’. That process of brutalisation was spear-headed by British veterans of the Great War who had often served gallantly on the Western Front and elsewhere.

That such violence and intimidation – unleashed within the United Kingdom and in the name of the Crown – caused alarm in Britain should not surprise us. The criticism of the conduct of the Crown forces voiced by British commentators in 1920 and 1921 was directly informed by the strong sense that the recently-ended world war had been a righteous conflict in which the British had engaged in a moral crusade against the German enemy. The killing of thousands of unarmed civilians by invading German soldiers in Belgium and northern France in 1914 greatly enhanced the British case for war in the minds of popular commentators and ordinary civilians. When British civilians fell victim to German military aggression in the U-boat campaigns and airship raids that began in 1915 and continued for the rest of the war, this crusading fervour was further reinforced. Ultimately, over three-quarters of a million British and Irish servicemen lost their lives in the conflict, but the bereaved found comfort in the notion that these extraordinary sacrifices had been redeemed by victory in a just war. The violence wreaked on the civilian population by British paramilitaries and regulars in Ireland in the years after the Armistice caused such discomfort in Britain partly because it undermined that consoling narrative. In an impassioned intervention during a debate on reprisals in the Commons in October 1920, Joseph Kenworthy, naval officer, war veteran and Liberal MP, had the following to say:

In Germany the excesses in Belgium were excused in the Reichstag by stories of the Belgians firing from their houses on the brave German troops … The same defence is being made by the Government today for this system of burnings in Ireland. If we do not condemn it, we shall be as guilty as the German people, and worse. This House may not condemn it, but I hope the people outside will. If not, then Germany will have won the War. The Prussian spirit will have entered into us. The Prussian spirit at last will be triumphant, and the 800,000, the flower of our race, who lie buried in a score of battle-fronts will really have died in vain … and Germany has won and we have lost. That is the tragic, wicked part of it.

Hansard, Commons Debates, 20 October 1920, col. 962.

Those who fell in the Ypres Salient, the Somme valley and the other hallowed battlegrounds of the Great War are revered in British culture. Their stories are still told, their graves still visited, and the conflict in which they were killed remains the focus of intense popular interest. The Irish War of Independence, by contrast, has no place in British popular culture and its dead are not remembered. In a sense, this is perfectly understandable; the vast multitude who died between 1914 and 1918 are certainly more sympathetic than the few hundred who were later killed in a sordid fight to suppress a popular independence movement. Yet these men’s experiences in Ireland and the campaign they conducted there remind us of the valuable truths that our ancestors were rarely two-dimensional heroes or villains and that soldiers are invariably both perpetrators and victims of violence. Traditional commemorative culture doesn’t allow us to acknowledge such complexities, but considering them may help us better understand the generation of the First World War.

Edward Madigan is Senior Lecturer in Public History at Royal Holloway University of London and co-editor of Historians for History.

‘Black Lives’ under the Raj by Sarah Ansari

The 2020 Black Lives Matter movement is rightly reinforcing the urgency of acknowledging in concrete and accurate ways the historical voices and experiences of Black people. That it is taking place against the backdrop of an effervescence in identity politics more generally also underlines the complex ways in which people see and label themselves in contemporary Britain. One of the current criticisms of the umbrella terms ‘BAME’/’BME’, for instance, is that these do not convey sufficiently clearly the spectrum of ethnic and other identities supposedly encompassed by them. What follows here, therefore, is not intended to suggest that people of African and South Asian heritage have had identical past experiences, but simply to remind ourselves that, in the days of the British Raj, Indians could often be marginalised and oppressed as ‘black’. On the one hand, as historians we know that awareness of diversity is always important; on the other, so is the need to challenge ‘divide and rule’ narratives.

The whole business of ‘divide and rule’ (following hot on the heels of its close relation ‘divide and conquer’) is, of course, closely associated with the British Empire, and methods deployed there to limit and deflect the resistance of people subjected to imperial control. Such divisive tactics can be blamed for generating political responses that led – in due course, though never inevitably – to the creation of a separate state for Indian Muslims when South Asia secured its freedom in 1947. By officially sanctioning what were packaged as the ‘separate’ needs of Indian Muslims, British policy directly encouraged political separatism. In this case, the divisions nurtured and exacerbated by imperial policy-making were premised on religious difference, and they eventually helped produce Partition in August 1947: an event marked by enormous human suffering, around a million deaths, and something like 14-16 million displaced people moving between the two new states of India and Pakistan.

A tea party in Calcutta, 1890
Colonists and servants at a tea party in Calcutta, 1890 (British Library)

But back to ‘blackness’ under the Raj. Racial identity (albeit infused with the added complication of class) was always a sticky issue for imperialists, with ‘whiteness’ intimately associated with the ‘running of empire’. Contact between the rulers and the ruled was kept as minimal as possible: for instance, clubs – the social hubs of empire – with few exceptions held Indians at bay for as long as they could. Anyone who has ever read a nostalgic book or watched a nostalgia-tinged film about the Raj, for instance, is likely to have come across the institution of the ‘cantonment’. Originally associated with the military, these garrisons often mutated into the ‘separate’ locality within a town or city where the British resided, worked and played, usually with as much physical and psychological distance as possible put between themselves and Indians (other than the legions of local servants they employed). This was not just about creating hypothetical ‘safe cultural spaces’ for themselves. In British-controlled South Asia skin colour was always highly politicised, with the ‘blackness’ of local people contrasted against the ‘whiteness’ of the British and their world.

Map of Madras c. 1710
Map of Madras c. 1710, featuring ‘White Town’ and ‘Black Town’

Look at Presidency port-cities such as Madras (Chennai) and Calcutta (Kolkata), two of the earliest commercial bridgeheads through which British interests established themselves in the subcontinent from the seventeenth century onwards. Both for many years contained so-called ‘black towns’, areas in which local people lived and worked, and where British people rarely lingered for long. In the case of Madras, as a British Library webpage tells us:

‘Black Town was originally the old native quarter and grew up outside the walls of Fort St George to the north on the seafront. … As Madras grew, Black Town became the commercial centre of the city and developed a very high population density. … Its name was officially changed to George Town after a visit by the Prince of Wales in 1906.’

Calcutta, too, in its early colonial heyday was divided into two main districts – White Town, which was where the British resided and conducted their business, and Black Town, where local Bengalis were to be found. Kolkata today offers a ‘Blacktown walk’, which takes visitors on a guided tour of the old-world ‘heritage’ dwellings that make up this historic part of the city. In a similar fashion, Karachi, with its increasingly cosmopolitan population, came in the second half of the nineteenth century to be ‘demarcated’ along colour lines too, with Saddar Bazaar and Empress Market frequented by the ‘white’ population, and the Serai Quarter serving the needs of the ‘black’ town.

‘Blackness’ was thus intrinsic to how the British and other Europeans viewed Indians, but it was also part and parcel of the derogatory ways in which they frequently described ‘the natives’. People with a mixed European-Indian ancestry (nowadays called ‘Anglo-Indians’) were also troubling, since their very existence embodied ‘racial’ mixing, and, as in the United States and South Africa, those among them who could pass as ‘white’ frequently did their best to do so.

More controversial still for us today was the use of the ‘N-word’ to describe Indians under the Raj. As Sam Fortescue has highlighted, in his exploration of material written by British men and women during or soon after the so-called Mutiny of 1857-8, this deeply racist term was bandied about by contemporaries. Take William Russell, the London Times special correspondent, who was sent out to India in early 1858 to report on the Uprising, and who provided many vignettes of the British whom he encountered, satirical no doubt but all the same indicative of perceived realities:

“By Jove! sir,” exclaims the Major, who has by this time got to the walnut stage of the argument, to which he has arrived by gradations of sherry, port, ale, and Madeira, – “By Jove,” he exclaims, thickly and fiercely, with every vein in his forehead swol’n like whipcord, “these n*****s are such a confounded sensual lazy set, cramming themselves with ghee and sweetmeats, and smoking their cursed chillumjees all day and all night, that you might as well think to train pigs. Ho, you! Punkah chordo, or I’ll knock – Suppose we go up and have a cigar!”

Moreover, as Fortescue goes on to explain:

Atkinson’s famous satire of ‘Our Station’ hinted at a more serious, endemic double standard in the British. Take, for instance, his sketch of ‘Our Judge’: ‘There you see him in his court – n*****s – ten thousand pardons! no, not n*****s, I mean natives – sons of the soil – Orientals – Asiatics, are his source of happiness.’ The implication is, that in spite of British evangelism and Utilitarian rhetoric, and notwithstanding Government’s heavy reliance on servants, native soldiery and pundits, officials tended to feel that at heart, they ruled a land of ‘n*****s.’

British rule in South Asia, as elsewhere in the world, thus always hinged on the deployment of crude stereotypes, ethnic or otherwise, from the ‘salt-of-the-earth’ ‘martial races’ associated with the plains and mountains of the North-west, to the ‘too-clever-by-half’, inherently untrustworthy Bengali ‘baboo’. Indeed, according to 1890s Daily Mail journalist G. W. Steevens, whereas an Englishman possessed a straight leg, the Bengali’s was unquestionably that of a slave: “Except by grace of his natural masters”, Steevens duly asserted, “a slave he has always been and always must be”.

So whether Indians were “half-naked fakirs” (Winston Churchill’s insulting description of MK Gandhi in 1931) or loyal soldiers volunteering in enormous numbers on behalf of British interests in both world wars, the fact remains that – under the Raj – Indians could very often end up being labelled as ‘black’, with all the exceedingly negative connotations that such categorisations implied at that time.


Sarah Ansari is Professor of History at Royal Holloway, University of London. She has published widely on the history of South Asia and is currently writing a history of Pakistan for Cambridge University Press.

Facing Abe Lincoln: On Black History and Public History by Daisha Brabham

When I was about 10 or 11, I was fascinated with the American Civil War. Indeed, I once begged my mother to buy me a fifteen-hour documentary about the battles and generals. She was convinced that I would not watch the entire thing but nonetheless relented. And I did. History had been a passion of mine from an early age, but I can attribute this indelible fascination with the Civil War to my history teacher, Mr. T. He performed in military re-enactments and had even at one time re-enacted the Battle of Gettysburg. The battles were a consistent theme in the classroom as we examined the various uniforms, the generals, and political leaders. I particularly liked learning about Abraham Lincoln to the point that I memorized the ‘Gettysburg Address’ and watched countless documentaries on him. The Great Emancipator who had defied the odds and saved the nation represented a thread between myself and my country. In a childlike fashion, I liked Lincoln, because he would have ‘liked me’, which made him stand out among the figures we studied. In Lincoln’s eyes, I was an American; the war he had led to free my ancestors proved that for me. So, in his eyes, I was whole.

My love for Abraham Lincoln resurfaced as I happened upon some of my old DVDs at 22. I began to research the Lincoln I had met in my seventh-grade classroom using the skills that I had since acquired as a historian. To my horror, though, I discovered the real Lincoln; the Lincoln who believed in separatism; the Lincoln who would have preferred that my ancestors return to Africa, leaving the country to mend in our absence. Ultimately, Lincoln began to resemble the rest of the characters in the story of the Civil War. In the words of Frederick Douglass, ‘In his interests, in his associations, in his habits of thought, and in his prejudices, he was a white man’. It was only then that I began to revisit my childhood memories and realize a terrifying fact: one aspect was missing from Mr. T’s and my own vision of the American past – slaves. I still cannot remember a single lesson on slavery. It seems we both ignored that part, but for distinct reasons. I had overlooked this distortion of the past in the same way I had ignored the fact that Mr. T had once told me I was ‘one of the good ones’ compared to my black and brown classmates. It was easier to disregard those moments than to confront them. I simply cannot describe the gut-wrenching feeling as that childlike thread that had once sewn me securely to the fabric of my country was severed. I cried like a child for the loss of belonging that had grounded me.

Douglass-Lincoln Mural (1943)
Mural on Recorder of Deeds Building, Washington DC, depicting Frederick Douglass meeting Abraham Lincoln, painted by William Edouard Scott, 1943

My first reaction to this revelation was to shed all materiality of an American heritage to which I was no longer tethered. I began to identify as an African, abandoning that hyphen that never truly fit. I at once felt whole embracing a new, more authentic history. I turned my focus to the Black Diaspora and its unique past and wrote and produced an entire play to immerse myself and others in the stories that I had never learned about in school. In many ways, my experience of dealing with the past mirrors the major shifts now occurring in America and across the Western world. The resurfacing debate about race has resulted in commercial brands, sports teams, and even States trying to distance themselves from their sordid history. While some have dubbed these shifts corporate activism, the sentiment is a common reaction to dealing with a problematic history. We believe that if we can all at once shed the physical ties of the history in which our narrative has been unsettled, we can rectify a past that has left so many broken. This process of unlearning or confronting a wretched past is by no means unique to the United States, as European countries try to burn some of the uncomfortable iconography of their own histories. As statues across the West are defaced and hurled into the sea, there seems to be a widespread desire to divest ourselves from what were once considered the good old days.

This process of severance worked for me until I arrived in the UK in 2019 to pursue an MA in public history. In this new world, I was American, and my history was a valued asset in discussing race. During my time in England, I was struck by how much African American history was referenced. Our history was discussed in my courses through the slave narratives and the role of slavery in shaping American culture. Our figures were highlighted in the curriculum for secondary school students. About mid-way through my time in England, I began recognising a similar method of elision to the one to which I had been exposed at school, and I realised that outside of America, our history has often been used to discuss race from a comfortable distance. It is far easier to discuss the KKK South or the white backlash against the African American Civil Rights movement than to address the financial investments of the slave traders or the ribbon-tied racism that is still prevalent in Europe. My narrative supplies a comfortable distance to discuss race without confronting the pervasive structures that continue to marginalize the black community in the UK. In conversations with my black and brown British friends and through my work with the Black Cultural Archives in Brixton, the results of this method of dealing with the past seem very similar to those I had experienced. For the first time in over five years, my closest friend Alex and I had a conversation about how dominant my African American narrative had been for him as a London boy, a dominance that meant he could never truly feel whole and a part of his country. His narrative could never be properly explored if ‘Black history’ was so inextricably linked to ‘African American history’. His past was in a sense erased by mine. History has a cost.

Daisha Brabham at the National Museum of African American History and Culture, Washington DC

But what does this journey through various approaches to the past provide in the way of instruction for the public historian? Can we genuinely help in the current process of dealing with the past? Or are we solely fire starters who cause havoc and leave our public to grapple with flames? If the traditional approaches to uncomfortable pasts that I and others encountered at school have no substance, then what innovative approaches should we employ? To these questions, I can supply no concrete answers concerning the way forward. Many of the issues I am raising are not new or even unique to the profession. It often seems, however, that many of my colleagues feel no particular duty to contemplate them. What I do know is that none of us are free from or above the debate. There is no history that has no power. There is no aspect of our field that should remain untouched by this conversation.

So, where do we go from here? In the words of Edward Baptist, the first place to start will be in asking the right questions and imagining a new way of doing and telling history. It is my belief that much of this work should focus on the classroom, an area often neglected by historians. If Mr. T had been armed with the proper educational tools, would we have covered slavery and not just the uniforms? If Alex could have learned about OWAAD in his secondary school, would he feel whole? Decolonisation must be a verb. It is a process of tearing down, but also a process of repositioning what we once thought was fixed and settled.

The murder of George Floyd was familiar. I opened the video reluctantly because I had seen it before. I had seen it when Tamir Rice was gunned down for having a toy gun. I had seen it when Freddie Gray was killed. I had felt it when 7-year-old Aiyana Jones was shot for sleeping on her grandmother’s couch. I needed no reminders of what it means to be black in America, especially amid a pandemic in which my demographic had been disproportionately affected. I needed no rendition of the cost of my skin. But I still cried. I still thought of the reality of what it feels like. This was the America from which Mr. T and I were running; the result of a tumultuous past that left victims in its wake.

Daisha Brabham contemplates the Lincoln Memorial, Washington DC

When I think back on my revelation about Lincoln, it was at that moment that I finally reached adulthood, at least in a historical sense. I had finally faced the reality of my past. Against the backdrop of a nation on fire, it is still unclear whether my country has reached, much less acted on, that revelation. But this work is inevitable, whether it is carried out peacefully or in turmoil. A couple of months after my discovery of the real Lincoln, I went down to Washington DC to take in the new National Museum of African American History and Culture, constructed to remind the public of the impact of the African American past. The museum is massive, with over four centuries’ worth of history and culture within its walls. A visit can easily last a day, if not more. It stands as a physical reminder that African Americans have always been tethered to the fabric of the country. We have always been present. It is the legacy my ancestors fought for. But it also offers a method through which we can refocus our lens to see a different narrative. A different America. The retelling of the past.

After touring through the galleries, I made my way down to the Lincoln Memorial. I stood under the great statue of the man, rather than the myth of my childhood. I stared at him for a long while, taking in the weight of his existence and the hundreds of people who had come to visit him that day. The Great Emancipator. Eventually, I smiled and whispered, “It is great to finally meet you, Mr. Lincoln.”


Daisha Brabham is a former teacher from New Haven, Connecticut, currently enrolled on the MA in Public History programme at Royal Holloway, University of London.

Justin Champion, Historian (1960 – 2020) by Graham Smith

What we need is radical social change that redistributes the wealth of the very few to the many, and that will change life. We are inheriting, at the moment, the consequences of a long form of decolonisation. The British Isles, throughout the 18th and 19th, and most of the 20th century, colonised the world, raped it in one sense, and we are now confronted with the consequences of that. (Justin Champion, ‘The Big Questions’, BBC1, 12 June 2016; 22:16-22:40).

Justin Champion was a consummate scholar, producing works that offered new ways of seeing the early modern world that were based on novel approaches to reading the past. In his writings, many of which he made open access, he broke new ground in the study of early modern radical thought. His debt to Christopher Hill was obvious and acknowledged. In his 2018 memorial lecture he restated the long tradition of civil rights struggle in England, countering revisionist criticisms of Hill by pointing out that he had pioneered a new way of looking at neglected materials that, in turn, had opened new possibilities for further research into radical discourse and action. Indeed, in Justin’s work, Hill’s project is followed so that space can be found for the voices of the marginalised and for their language of dissent to be heard. This was a part of the golden thread of liberty that Justin grew ever fonder of talking about. It was also why he became a valued ally of oral history and a proponent of public history.

There are multiple versions of the Hill lecture. There is a published version, ‘Heaven Taken By Storm’, which is a fantastic read, but it is the earlier spoken presentation that I like best. Justin’s lecturing style, with its autobiographical asides and occasional self-deprecation, points to his belief in communicating complex histories with a sense of humility and fun. It is also testament to his conviction that the historian’s life experience influences their methods and their understanding of the past. Conversely, it was people’s lives in the past that sparked his curiosity. His radio programmes included biographies of well-known historical figures such as Francis Bacon, but also less recognised individuals, such as Jacques Francis. It was this interest in Francis and Southampton, where he went to school, that led to his interest in black history.

There will be still other varieties of the Hill lecture recalled by colleagues and friends, as he regularly tried out how he might express his thoughts by casually dropping nuggets of weighty historical analysis into everyday conversations. Indeed, there must be hundreds of people who have fragments of Justin’s developing radio, television, and lecture ideas lodged in their memories. His radical networks were not only social networks that existed in the history he studied, but also were part of his everyday social relationships.

The public reception of the past was of key importance to Justin. He proselytised against academic insularity and for university engagement as a public utility. As far as he was concerned, people’s deep passion for the past was an opportunity for historians to help enhance and improve contemporary debate. He often found ways to gently chide those historians who thought that the past was too foreign a land for the masses. This was the view of a man whose father had found his way to study at Cambridge through the Workers’ Educational Association. Good history, for Justin, explained prejudice and inequality; it was critical and provoked debate. This passion for public history led him to fight for the founding of a pioneering MA in Public History at Royal Holloway, University of London, the institution at which he was based for most of his career. The MA was ultimately launched in 2009 and is currently the most well-established programme of its kind in the UK. Justin’s collegiality and the warm friendship he enjoyed with the editors of the site also led him to make numerous highly engaging contributions to Historians for History.

Justin Champion discussing Black History Month, Founders Buildings, Royal Holloway, 2017

His search for historical truth was an inherently aesthetic and ethical project, and Justin’s public history was as much a moral as an intellectual venture. History mattered, he told his peers in 2007, because it has the capacity to produce better human beings. History could help the public deal with the difficult and the unknown; quoting Hugh Trevor-Roper from five decades earlier, he insisted that ‘History that isn’t controversial is dead history’. His art of history was not only to be found in uncovering past lives but in conveying their historical significance. He rejected empiricism and objectivity, yet he loved his numbers, not only in his earlier work on ‘epidemics and the built environment in 1665’, but in later years, when he could quote visitor numbers to heritage sites in support of his argument for a citizens’ history.

He could also note in passing in an article about history on television that in A History of Britain (BBC, 2000), Simon Schama had spoken 300 times directly to camera. Justin’s ability as a public intellectual was based on studying the limitations and opportunities of different media and how others presented historical concepts through them. He was interviewed for many programmes, and appeared in at least seven television mini-series and documentaries from Fire, Plague, War and Treason (2001) to Charles I: Downfall of a King, which aired just last year. He loved making television programmes and would enthuse on the craft of production and revel in reaching new audiences. He enthused about television as a ‘platform for communicating its learning and moral integrity in an energized and enthusiastic public sphere’.

He was engaging on television, but he truly excelled on radio in explorations of complex historical ideas. A regular on In Our Time, the BBC Radio 4 flagship programme presented by Melvyn Bragg, he contributed to episodes on miracles, Calvinism, the trial of Charles I, the apocalypse, divine right, Bedlam, The Putney Debates, and the Quakers. There were other programmes on duelling and toleration, and a programme on the history of reading, based on marginalia and the defacing of books. He reviewed for Nightwaves and commented on various historical issues for the Today Programme and Radio 5 Live.

He also took his own responsibilities as a citizen historian seriously. As the President of the Historical Association from 2014 to 2017, he used his position to highlight the lack of black historians working in schools and universities. In response to being awarded the Association’s Medlicott Medal in 2018, he offered a hugely insightful lecture on ‘Defacing the Past or Resisting Oppression?’ in which he described the alterations and removals of statues and public art depicting controversial historical figures and actions.

He spent his final years dealing with cancer: ‘It is a bit annoying to say the least’, he noted in a memorable interview for Royal Holloway’s college radio, explaining, ‘that is the reason I’ve taken early retirement, and that’s a big loss for me. Not so much the routines of academic life, but the opportunity to distort and corrupt young minds is one, you know, you could never get tired of that’, he said, making one of his favourite jokes. ‘And the moments of revelation when someone gets it and goes on to have their own career is just fantastic’. He had an infectious love of teaching and a deep respect for students’ perspectives. While understanding the validation we receive from our teaching, he asked his colleagues to think beyond themselves. Passing a room where he was holding a class lightened the darkest of moods, such was the noise of laughter and discussion. He would frequently regale colleagues with insightful points students had made in class or tell of how they had responded to one of his teaching exercises. He wanted to learn from his students and was genuinely interested in their thinking, and the response to Justin’s death on social media has demonstrated the affection in which he was held by former students as well as colleagues.

Listening to his voice again has made me deeply sad at his loss, but personally and profoundly grateful that we spent time together in long discussions that ranged from memories of music and politics, to food and sport, to College governance and union business. Listening to his voice reminded me that he excelled as a storyteller. He was so much more than a partial obituary can cover. The love for his daughter Alice and his partner Sylvia. The pride he took from managing Abbey Rangers Football Club Ladies team, a resilient and robust football team that was a personal passion for him. The loyalty to his friends, even when they did something stupid. If Justin had been asked if he had a role model, he might have said Joe Strummer or Stormzy, rather than an eminent historian, and we are greatly diminished by the loss of his cool rage against the ‘filth’, as he described it, of racism, inequality and inhumanity.

Graham Smith is Professor of Oral History at the University of Newcastle and co-editor of the Historians for History site.

‘No Votes, Thank You!’ The women who campaigned against their own suffrage by Chloe Binderup

“We’d like to invite you to sign our petition regarding the question of women’s suffrage. Nearly three hundred thousand others have already signed their support; we do hope you’ll join us.”

It’s 1909 and three well-dressed women approach after your latest London lecture. As you’re handed a page already filled with scrawled signatures, your eye catches on the headline at the top of the petition, printed in bold block letters: WOMEN’S NATIONAL ANTI-SUFFRAGE LEAGUE.

Hang on a second – women’s anti-suffrage?

No, you haven’t stumbled into an alternate history or a wacky Doctor Who episode. Indeed, by 1910, this British women’s group would collect over 337,000 signatures not in support of their right to vote, but against it. Four years later, the Anti-Suffrage Review announced that membership of the British National League for Opposing Woman Suffrage (or Women’s National Anti-Suffrage League, WNAL) stood at 42,000 people, and five out of six of those were women.

Sticker and pin-badge of the British Women’s National Anti-Suffrage League, c. 1910 – 1912

Around the turn of the 20th century, as the campaign for women’s right to vote gained momentum, large numbers of women in both Britain and the United States wrote, argued, and campaigned actively against suffrage. And this was no weak or meek opposition either, but consisted of well-organised groups and included many determined, educated and passionate middle and upper-class activists. These ladies were serious.

From the vantage point of the 21st century, just after the hundred-year anniversaries of the UK’s Representation of the People Act and United States’ 19th Amendment, it can be difficult to understand these women. Why would they work, often passionately, against their own self-interest? Wouldn’t women invariably want to support women’s rights?

In their minds, they did: those who argued most stridently against female suffrage actually presented themselves as defenders and supporters of their fellow women. Writer Violet Markham worked for social reform and societal improvement, and saw her campaign against the vote as part of this mission. In 1912 she wrote of her fellow ‘Antis’: “We do not depreciate by one jot or tittle women’s work.” It was rather a matter of encouraging “proper channels of expression for that work”.

Interestingly, similar arguments emerged on both sides of the Atlantic as the United Kingdom and the United States grappled with increasing public debate on female suffrage. Just as suffragists from either country visited one another to educate, fundraise, and support, so too did anti-suffragists gain inspiration from visiting sisters across the sea. Numerous letters from future leading Antis show that American anti-suffragism was an important influence on the founders of the WNAL. It is perhaps unsurprising, then, that British and American anti-suffragist women shared the key arguments against their vote:

Women are Not Men

“We believe that men and women are different – not similar – beings, with talents that are complementary, not identical, and that they therefore ought to have different shares in the management of the State”
WNAL Manifesto

Postcard depicting a very feminine personification of womanhood politely declining the vote. Behind her, a much less demure figure storms by with hammer in hand (Bird, Harold and J. Miles and Co., 1912)

To understand anti-suffragists, it’s essential to understand the rigidity of their society’s perceptions of gender, and how important they thought it was for men and women to stay in their respective lanes. As they saw it, men were in one sphere of society doing their manly things (government, industry, military, religion), and women were in another doing theirs (raising families, managing households). “Because the spheres of men and women, owing to natural causes, are essentially different”, argued the WNAL, their relationship to government should be different.

Anti-suffragist women did not necessarily believe that men were better or smarter – they just weren’t women, and this meant they naturally ought to do different things. So, if men could influence the state by voting, women could contribute in the domestic sphere.

Women are Better Than That

“It would be a bigger feather in a woman’s cap – a brighter jewel in her crown – to be the mother of George Washington than to be a member of Congress.”
Jeannette Gilder, Why I am Opposed to Women’s Suffrage

British writer Ethel Harrison viewed herself as a strong supporter of her own sex, and saw no irony in titling her 1908 book The Freedom of Women whilst arguing passionately in its pages against their vote. In her mind, women had much more important tasks than dirtying their hands with politics: they were the moral guides of the family, and so had a special role to play in guiding society and the nation. “It is in no sense because we undervalue the importance of women’s contribution to public life that we depreciate and deplore the agitation for the vote,” she wrote. Rather: “We think women can do better for themselves and the world.”

As this argument went, women’s moral and emotional strengths gave them better and stronger political influence by not voting: instead, they could effectively and appropriately influence politics as really excellent mothers who raised and married responsible voters and good politicians.

Danger! Danger!

“The admission to full political power of a number of voters debarred by nature and circumstances from the average political knowledge and experience open to men, would weaken the central governing forces of the State, and be fraught with peril to the country.”
WNAL Manifesto
Finally, it would be outright dangerous for women to have a say in state affairs. Women, the argument went, did not and should not take part in the business of government. Without experience in financial, diplomatic, industrial, or military careers, how could they tend to affairs of state? It would be dangerous for their votes to affect arenas about which they knew nothing, and make laws they could not enforce.

In hindsight, of course, votes for women seem inevitable, and historical narratives mostly focus on the heroic struggles of the triumphant suffragists against misogynist male efforts to block equality. But history continually refuses to obey the simple categorisations and reductive assumptions it is so easy to make.

‘Women Voters Recording their Votes for the First Time’ by former war-artist, Fortunino Matania, December 1918

The journey towards female suffrage in the US and the UK involved substantial debate, discourse, and disagreement – even violence – but was not a clear-cut battle between the sexes. Suffragists could be either men or women; so too could the anti-suffragists who opposed what they viewed as a radical and problematic idea. Certainly, they grew up and lived in a world reinforced with fixed views and beliefs, and it would be easy to dismiss these women as brainwashed pawns in an unequal, unenlightened era. But to do so belittles a huge number of passionate and intelligent women who truly believed they were supportive of, and working for, their fellow women.

Chloe Binderup is a student on the MA in Public History programme at Royal Holloway, University of London.

Further Reading:
Incredible Pro-and Anti- Suffrage Memes in The Atlantic
The British Library’s Votes for Women Archive
Women Against the Vote: Female Anti-Suffragism in Britain by Julia Bush

Mickey Mouse History? Disney’s America: The Theme Park that was Never Built by Tom Farrell

With twelve parks worldwide, Disney have an empire on which the sun never sets, and the fun never stops – until now. Back in March, all Disney Parks worldwide were simultaneously closed for the first time, due to the ongoing COVID-19 crisis that has impacted just about every aspect of our lives. In time, of course, their gates will reopen, and Mickey will welcome us back with open arms (or perhaps just a friendly wave if social distancing is still in force). But for one park, its gates will remain firmly closed regardless of the coronavirus pandemic, because they were never opened in the first place.

Imagine this – a Civil War re-enactment pitting brother against brother; a Native American village, soon to be razed by the encroachment of the federal government; a nineteenth-century town where slaves are auctioned off like animals amid the reverberations of the crack of the whip. Welcome to Disney’s America, the theme park that was never built!

In 1993 Disney announced plans to build a park based entirely on American history – Disney’s America. It was due to be built near the town of Haymarket, Virginia, twenty miles from Washington D.C., and was expected to draw 30,000 visitors daily who would be able to enjoy various themed areas or ‘playlands’, including a Civil War fort, an authentic family farm, and a reconstruction of Ellis Island. However, just one year later the project was dead in the water. Why did the park never materialise, and how are the central issues it highlighted relevant to public historians today?

Logo for proposed Disney’s America park

Sacred Soil?
Almost immediately, Disney’s America came under fire from historians who opposed the project on the basis of its proposed location. The park would have been situated just four miles from the Civil War battlefield of Manassas, prompting many scholars to mobilise against the project and form the Protect Historic America (PHA) organisation. Drawing on Lincoln at Gettysburg, they argued that the Virginia Piedmont region was hallowed ground, and that Disney would not add, but would certainly detract. Historian David McCullough’s forecast was almost apocalyptic: ‘Once this commercial blitzkrieg comes, it will never be the same again.’ However, it seems this “Sacred Soil” argument is something of a red herring. Had this been the only reason for opposition, surely another site could have been found; America is not short of vast open spaces ripe for development. Indeed, focus on the preservationist movement detracts from another, potentially more important reason the project never came to fruition – the question of who owns history.

The proposed site was just four miles from the Manassas National Battlefield Park

Another source of hostility was a certain intellectual arrogance that questioned Disney’s right to teach history at all, bearing in mind that the company has a long history of providing the public with romanticised representations of the past that have featured prominently in their motion pictures and theme parks. Influenced by this utopian track record, some historians were concerned that Disney would be disseminating “Mickey Mouse” history and therefore wished to gag the world’s most famous rodent before he gave any history lessons. Linda Shopes was one such censor, arguing ‘I simply do not believe such ventures can do good history.’ In fairness, there was some cause for concern. Firstly, the name Disney’s America implied the park would be a Disneyesque view of American history. Then there were the park’s nine “playlands”, suggesting that history is a toy, designed for entertainment and distraction, not a serious intellectual discipline. Fears were further fuelled when Peter Rummell, president of the Disney Design and Development Company, said Disney would not apologise for its belief that ‘the American story is profoundly positive and uplifting’.

Disney tried to address these concerns but did not help themselves. Rummell promised ‘the park will not whitewash history or ignore the blemishes’, but was soon undermined by Bob Weiss, Disney Senior Vice President, who dropped this clanger: ‘We want to make you a Civil War soldier. We want to make you feel what it was like to be a slave …’ In his attempt to appease critics, he only managed to trivialise issues of great historical significance. One African-American journalist was moved to report that if he ever visited, ‘my main concern there will be keeping the kids away from the slave show’.

Yet, despite all these blunders, were scholars right to appoint themselves as the gatekeepers of history? As the great proto-public historian G. M. Trevelyan said, ‘if historians neglect to educate the public, if they fail to interest it intelligently in the past, then all their historical learning is valueless except insofar as it educates themselves.’ Whilst opponents were right to worry about historical accuracy, Disney’s America undeniably had great potential to get people interested in the past.

Walt Disney with plans for the original Disneyland, Los Angeles, 1954

“The Greatest Pedagogue Of All”
In 1965 the American educationalist, Max Rafferty, described Walt Disney as ‘the greatest pedagogue of all’, and he had a point; Disney’s educational potential is immense. Over one billion people have passed through the gates of Disney theme parks since 1955, and historian Mike Wallace has suggested that ‘it’s possible that Walt Disney has taught people more history, in a more memorable way, than they ever learned in school’. Bearing this in mind, perhaps if historians truly want to educate the public, they should help plan Disney’s history lessons. There is some precedent for such a collaborative relationship. In 1993 Columbia history professor Eric Foner worked with Disney on The Hall of Presidents renovation, helping to transform the show from a glorification of the presidents and the Constitution, to a story of America’s constant struggle to extend democratic rights, which is still an unfinished agenda. Such partnership could also have helped ensure Disney’s America was a responsible historical medium and it was even something Disney appeared to desire. After retreating from Virginia, project chairman, John Cooke, claimed the company would be ‘reaching out to historians who have opposed us to make sure our portrayal of the American experience is responsible’.

“Join the Jamboree”
All in all, it is a shame that Disney’s America never materialised. Although we can speculate about what it would have been like as an outlet for public history, we cannot actually experience it and say for certain whether it would have succeeded as historical education, since the concept was so unusual. Whilst living history attractions such as Colonial Williamsburg have demonstrated that education and entertainment are not mutually exclusive and that history can be “brought to life”, other institutions do not have the might of Disney at their disposal. With Mickey’s piggybank and the combined creative genius and technological wizardry of all the Disney imagineers (those who design and build Disney attractions), a completed Disney’s America would have been in a category of its own. Indeed, “Mickey Mouse history” or not, there is no denying the great potential the project had as an innovative approach to teaching history. Walt Disney once said that he wanted Disneyland to be a place ‘for teachers and pupils to discover greater ways of understanding and education’, and it appears Disney’s America was envisioned in the same vein. Crucially, for historians, joining the jamboree does not have to mean chanting M-I-C-K-E-Y M-O-U-S-E. Perhaps through collaboration the chant can change to R-E-S-P-O-N-S-I-B-L-E H-I-S-T-O-R-Y. Indeed, if we move beyond academic cynicism and achieve positive co-operation between scholars and creative partners, this attempt to bring the past into the present could become the future.

Tom Farrell is a history graduate of the University of York currently enrolled on the MA in Public History programme at Royal Holloway.