About Julia Wood

Julia Wood (M.A., University of Warwick) is an author, Oscar Wilde scholar and personality. She has received extensive press and television coverage for her distinctive Edwardian lifestyle and designs all her own clothes. Julia is currently working on a novel – a ghost story – set in the Edwardian era. Visit www.julia-wood.com. Follow Julia on Twitter @edwardianspice

Is It Time to Reconnect?

“It’s imperative that we bring children into close contact with the miracle of evolution…and by so doing underline man’s consciousness of being responsible to a unit much greater and more valuable than himself, of which he is part”. (Konrad Lorenz, The Waning of Humaneness)

Has the Western world lost its way? It is a question that has been posed with disconcerting regularity by economic theorists since the onset of the latest in a series of catastrophic economic recessions.

But one could also pose the same question in a different sense: has the Western world lost its way historically, culturally and on an individual level?

We are, it seems, engaged in a frantic search to reconnect with the past, as well as with our roots and origins, whether culturally, through the fixation with retro-referentiality, or personally, through the fascination with tracing our ancestry.

On programmes such as Who Do You Think You Are?, celebrities are followed as they trace their ancestry, often with distressing or intensely joyous consequences. Likewise, on Long Lost Family, members of the public engage in an often emotive search for missing relations. The journey to rediscover, and to reconnect with, hidden aspects of our ancestral past has become a source of fascination.

We are becoming, it seems, a society that hankers after some mythical ‘lost’ part of ourselves, some missing part of our identity, in order to feel whole again.

There are numerous reasons for this, operating on several levels. On the cultural level, a kind of spiritual ‘homelessness’ is part of the condition of modernity identified by the philosopher Martin Heidegger, who wrote of the ‘homelessness’ of modern man and described how nihilism and the rise of technology have precipitated a rift with our roots and with the essence of our selves, leading to a kind of oblivion of being: a disconnection not simply from the past, but from the higher values once imbibed from religion. Indeed, of the many etymological derivations proposed for the word religion, the mythologist Joseph Campbell favoured the Latin religare, meaning to bind back, to reconnect – hence the title of this piece.

Coupled with the rise of the alienating force of technology, this has led to a kind of existential rootlessness.

Since Heidegger’s time (he died in 1976, but published Being and Time, his best-known work, in 1927) we have witnessed the gradual decline of the extended family and the increasing isolation of many people’s lives. People, more geographically dispersed owing to job changes and improved travel, are often lonely and cut off from the networks that once enriched their lives. This hankering after a connection with the past, then, is partly due to a very literal sense of disconnection from the present.

Such feelings may lead us to begin the search for our roots, not simply because we want to feel connected to our past, but because these ancestral searches often lead us to family members in the present with whom we may hope to establish friendships and connections, rekindling our spiritual sense of family while also quelling a more tangible loneliness.

Julia Wood - Features Writer

In many ways this search for a forsaken inner wholeness can be an inner journey, a voyage of self-discovery and self-understanding. Knowing where we come from can provide us with a sense of certainty and a degree of emotional security. There is consolation in feeling that we know where we belong, which can help us to feel more grounded. It can reassure us, especially in these uncertain times, helping us feel less cast adrift by the shifting waves of social and economic change. In Heidegger’s words, less homeless.

But what does it mean on a cultural level, this search for our ancestry and origins, this need to be in touch with our history; the yearning to return ‘home’?

The state of homelessness leads to a collective introspection, redolent of a culture that has become more introverted and inward-looking. This phenomenon – more notable during times of economic recession – is indicative of a fear of the future and of what it holds. We don’t like what we think we see ahead, so we look away; we turn within and become obsessed with the past.

Of course it is the expansion of global networks and communications that has facilitated these introspective leanings, providing us with access to ever-greater banks of information. The rise of Google and Facebook means that we can conduct searches for people with whom we wish to reconnect, and websites such as Friends Reunited and Find Your Ancestry make it especially easy for us to engage with this introspective culture.

Yet, perhaps ironically, it may be the speed with which technology has progressed in the last hundred years that has also become the catalyst for this need to reconnect with our roots. The rise of the Green movement and the striving to implement ecologically aware ideals in our lives, through recycling and grow-your-own produce, reflect an impetus to return to nature. The rise of the machine has in many ways impinged upon our humanity, moving us from a world of animate nature to the dehumanising world of the inanimate machine.

In the twentieth and twenty-first centuries machines have continued to replace humans: answering machines delivering endless options except that of speaking to another human being; self-service tills; paying-in machines. These are all devices that interfere with day-to-day human interaction, creating a fissure between ourselves and the world we inhabit, dehumanising our world through the depersonalisation of our daily interactions and discourses.

We have brought ourselves to the condition of self-imposed exile and alienation from our human origins and only we can extricate ourselves from it, before it is too late. As Heidegger might say, perhaps it is time for us to make our way home.

Is There an Afterlife?

Laurence Olivier as Hamlet

“That undiscovered country, from whose bourne no traveller returns, puzzles the will…”
(Hamlet, William Shakespeare)

Is there an afterlife? Where do we go when we die, or do we just vanish back into the nothingness from whence we purportedly came?

Philosophers for centuries have been exploring these questions, and religions of the world have attempted to offer answers. Yet we are no closer now to an understanding of this complex issue than our distant, and not so distant, ancestors.

Our relationship to the afterlife, whether we believe in one or not, is precisely that: a belief. It is not, nor can it ever be, knowledge, in spite of the best efforts of science to prove or disprove its veracity.

Wanting answers to the questions of death and the afterlife, people who would not ordinarily describe themselves as religious may turn to religion, simply because religious discourse is the only type of discourse that explores this phenomenon with any seriousness.

In our modern world the Ten Commandments still hold true, yet they denote a social rather than a religious issue: crime rather than sin. Refraining from stealing or murdering is viewed by most people as integral to being a fully socialised, well-adjusted human being. Breaking the Ten Commandments is anti-social rather than sinful, and we do not need the Bible to tell us that cheating on a spouse or loved one, or lying to someone, is socially unacceptable.

People do these things, yet still feel guilty about them because there is still an instinctual sense of what is right and wrong, in spite of the complexities of modern life.

Yet when it comes to death and the question of an afterlife, we are lost. There are no sources to turn to, no manuals with instructions, no right or wrong answers.

There are of course, plenty of self-help books for those suffering from a bereavement, but nothing that can explain to us what happens when it is our turn: what happens when we die?

Through death and death alone we stand in stark, absolute relation to the very essence of our humanity: our vulnerability, our frailty and our fallibility. Even love can be explained away by pheromones, according to science.

Yet it is in death that we encounter the essence of our imperfection and our powerlessness. No one is immune. No one is safe. None of us are protected.

It is quite conceivable that without death – if by some fluke or evolutionary hiccup we had managed to overcome our mortality and defy death – we would no longer have need of religion, or indeed faith of any kind. Faith would be superseded by knowledge, and we would live in a world of absolutes.

A strange thought indeed: living in a world entirely defined by knowledge, in which blind faith has no place. It is a world that science is perpetually trying to bring about.

Yet if our existence were defined purely by what we knew as opposed to what we believed, what effect would this have upon our humanity?

We would doubtless become less human; perhaps even a weaker and more cowardly species, since faith of any kind requires courage, a leap into some kind of unknown.

It also requires us to search ourselves for an understanding that science cannot provide, and although such intellectual meanderings cannot bring us knowledge of the world, they can bring us a measure of self-knowledge.

Without death, and the challenge it presents to our humanity, we would become less humble; more insufferably arrogant than one might think possible. As the dominant species we have no predators. Our only predator is death, our only Achilles heel is our mortality.

Yet would we want to live in a world entirely determined by reason? Of course we would not want to live in a world devoid of reason, where, as for the ancients, we and the crops we grow are at the mercy of the seasons and causality can be explained only by mysterious and often vengeful forces.

Of course a world without reason is a world largely without form, in which nothing is clear or defined, in which things only make sense with recourse to superstition and ritual.

Yet a world without faith is a world without content, in which there is only the bare causality of events as explained by science, in which the hidden mysteries and the unseen things are banished.

But would we really want the answers, if we were offered the opportunity to find out, if the mystery of death could be revealed to us once and for all, indisputably, by a reliable source?  Knowledge is, after all, a responsibility and like all responsibilities, a burden.

Faith – belief – hovers just outside the perimeter of knowledge, and when in our intellectual meanderings we traverse that perimeter we are on our own, with only our imaginations for company, and maybe the Bible or other religious tracts if we are so inclined. On the subject of death we are reconnected with our ancestors, as clueless and confused as we have ever been.

We can play out the fictions we create for ourselves, suspending reality to talk about long tunnels and white light, even as we might query the legitimacy of our thoughts; like having a naked lunch moment but deciding to carry on eating anyway.

Our afterlife contemplations are not in response to an extrinsic image of a god, or an afterlife, but rather are reiterations of iconic images occurring within our collective imaginations, in our culture. Such ideas are to be found all over the world, in cultures past and present, so the anthropologists tell us.

The afterlife and our ideas of it, is also where Nietzsche’s Will to Power comes into force. We may employ faith to embrace images of angels and a white-bearded god, whilst also being aware that it is we who are doing the believing.

Yet how many of us can help thinking that our deceased family members are watching us? How many of us have even talked to them, whilst also maybe feeling a tad silly and wondering if we’re heading for the funny farm?

However, the alternative is to embrace oblivion and accept that after death there is nothing. As Nietzsche might have said we are not quite brave enough to believe that.

Images reproduced from surmise-en-scene.tumblr.com, videojug.com, rosenblumtv.com and hothdwallpapers.com

The Resurrection of Oscar Wilde

Wilde’s persecution and exile have been regarded by some as a “crucifixion”. There has been a crucifixion; it follows, then, that there must be a resurrection. Such is the power of the narrative, of the myth-making machinery that operates in our culture in the creation of icons.

Oscar Wilde

Oscar Wilde has a remarkable capacity to touch the lives of twenty-first-century readers, to make people feel as though he is someone with whom they are so familiar that it is as if he were their personal friend.

In fact, at times, he seems so contemporary and like “one of us” that we could be forgiven for thinking he is alive and well and living in the twenty-first century. The word that always springs to mind when considering such notions is “spirit”.

Indeed, over the past century there have been numerous reports from people claiming that Wilde has “appeared” to them, or has been “spotted”. In one sighting, in 1934, a student at Magdalen College claimed to have seen him drifting across the college quad in his graduation gown.

John Stokes, in his book Myths, Miracles and Imitations, writes of Wilde having been seen in New York in 1905 and again in 1912 by his own nephew, Arthur Cravan. In the latter instance, on a dark and rainy night, the apparition of Wilde appeared to Cravan in his flat. According to Stokes, Cravan turned upon Wilde and abused him but, suddenly overcome by pity, ran after him, calling his name; when he realised Wilde had gone forever, he returned a desolate man.

Such a story seems to me to provide the perfect metaphor for the history of Wilde’s cultural reputation.  It is almost a story in miniature of his fall and subsequent rise to glory.  He was abused, he fled this world and now that world is sorry and wants to call him back, so much so that wishful thinking sometimes overflows into belief that he has been “seen”, spotted somewhere.

Elvis Presley also has the capacity to generate such rumours.  Elvis has been “seen” in some fairly surreal situations: pushing a trolley full of fish fingers outside a supermarket in L.A., where his Cadillac was parked in the disabled spot; eating a meal in Burger King wearing a white robe (what else do ghosts wear?); washing his smalls in a laundrette in West London, telling the attendant “you ain’t nothing but a hound dog” when they ran out of soap flakes.

There have been various attempts to make “contact” with Wilde. Perhaps the most amusing case is a recording made by the late Leslie Flint, a famous medium, back in the 1960s.

When asked to speak, the voice purporting to be Wilde replied, “I have never been known to say nothing”, and told the medium that he was still writing and having his plays performed, saying that “more money has been made out of my reputation since my death than I was ever able to make out of my plays, which goes to show that sin is very successful.”

The point of these ramblings about sightings and so forth is that such cases are illustrative of the power of personality – the power of the spirit. Aided and abetted by the advent of the media and its rapid expansion since Wilde’s time, some of that power is accrued through the reproduction of images, the Oscar Wilde industry, as it has become known.

Although dead for over a century, Oscar is very much alive to us, not simply in the sense of being immortalised through his works, as many authors are, but because he was – is – larger than life: more than a writer, he was a celebrity. As with many celebrities, such as Elvis Presley, it is hard for people to believe that he is dead.

He is so much a part of modern life that it is hard to believe he died all those years ago.  Wilde, in the manner of a spirit, retains a “presence” – one might be tempted to say “omnipresence” – within popular culture.

Of particular note is Wilde’s appeal to the teenage demographic, an appeal which is at least in part due to the fact that he speaks to the outsider in people.

It is no wonder, then, that the author Michael Bracewell had, he confessed, two posters on his bedroom wall when he was growing up: one of David Bowie and one of Oscar. “Bowie came down after a while”, he noted. “But Oscar stayed.” Stephen Fry, too, has noted that teenagers “trembling on the brink of bourgeoisification” look to Oscar as an inspiration. Indeed, there is a certain fragility about his position in Victorian society, his status as a wit and an artist, that seems to provide the perfect metaphor for the struggle against conformity endured by young people, especially teenagers, for whom individuality (that is, identity) is vitally important, but who are all too keenly aware of their vulnerability to the ravages of social pressure.

Wilde’s brief career is very “teen”, in the sense that it represented a brief oasis of self-expression, flourishing in a desert of conformity.  It was all too quickly quelled, he was packed off to prison where he was stripped of his individuality, had to wear a uniform and to do what he was told.  He can be regarded as a metaphor for those with artistic aspirations who maybe cultivated an interesting style for just a few brief years of their lives, before they have to forsake themselves and end up working in an environment that does not make room for individual expression.  Like the 1890s itself, his was a flame that burned too brightly and was all too soon snuffed out.

Of course, Wilde does not merely appeal to teenagers. He appeals to people from all walks of life. He has, albeit posthumously, become a figurehead for a whole range of communities that have gathered around him, fought over his legacy and claimed him as their own. The main one of these is, of course, the gay community. Over the years much has been written about Oscar’s gay identity, and about whether or not he would have cared to be seen as a “gay” author.

On the centenary of Wilde’s death, thousands of people came from all over the world to pay their respects, leaving flowers and messages at his graveside, such as, “love you always” and “I will keep you forever in my heart”. One message, written in French, said, “For Oscar Wilde the outraged martyr, who died in the name of love”.

The notion of Wilde as a modern celebrity is frequently discussed. In the twentieth and twenty-first centuries, with artists and celebrities becoming, for many people, like gods or guru figures, Wilde and his reputation fit with ease into this cultural template.

Wilde himself would perhaps not be surprised, living as he did in a culture where this had already started to happen: the spiritualist Madame Blavatsky was looking to nominate a guru to popularise spiritualism, and actresses like Sarah Bernhardt were commanding the kind of adulation now given to stars like Madonna and Kylie. Indeed, Wilde himself commanded such adulation on his American tour, accompanied by his manager and publicist – the Victorian equivalent of Max Clifford.

Wilde epitomises the spirit of our time and that is why it feels as if he is alive and well and living in Chelsea, or Paris, or wherever one might picture him to be.

As his biographer Richard Ellmann so aptly pointed out, “he belongs to our world more than to Victoria’s”. After a decade of celebrations Wilde’s “resurrection” is finally complete, and he is restored to us in all his resplendent glory.

Julia Wood is the author of The Resurrection of Oscar Wilde: A Cultural Afterlife. (Cambridge: The Lutterworth Press, 2007) £15.00 pbk 164pp ISBN 978-0-7188-3071-7

A Victorian Christmas – Part 1

How did the Victorians celebrate Christmas? Julia Wood – City Connect’s Features Writer on Art & Culture - looks at the Victorian Christmas and finds out the truth behind the nostalgia and tradition…

Christmas is a time when not merely individuals but culture itself turns reflective and introspective. It is a sentimental free-for-all, in which nostalgia reigns supreme and the hankering after Christmases past, in the times before cars and computers, is too great to resist. At the centre of such hankering is the iconic Victorian Christmas, with its several feet of snow and its people drifting around in fur-trimmed red velvet capes to the sound of Christmas carols.

In reality, the Victorians’ weather was similar to ours: mild and rather disappointingly unseasonal. The association of snow with Christmas was in fact borrowed from the eighteenth century, when Britain was in the grip of the ‘Little Ice Age’ and the Thames froze over.

Yet our fascination with this Victorian ideal of Christmas, with its frosted landscapes and crackling fires continues and is responsible for the plethora of period dramas gracing our screens at this time of year. I think, in particular, of adaptations of Charles Dickens, whether repeats or new dramas, such as the BBC’s Great Expectations, starring Gillian Anderson as the cobwebbed and querulous Miss Havisham.

Indeed, one cannot imagine Christmas at all without Charles Dickens. The novels of Dickens, a key Victorian figure, epitomise what we consider to be the quintessential Christmas. This is particularly true of A Christmas Carol, which helped popularise the tradition of Christmas and its associated festivities.

The Victorians invented much of the iconography we now associate with Christmas. It was Queen Victoria and Prince Albert who popularised the Christmas tree in England, and in 1848 the Illustrated London News produced a picture of the royal family gathered around a lavishly decorated tree. The public followed in their wake, and soon Christmas trees across the land were adorned with candles, sweets, fruit and gifts.

Christmas Card, designed by J.C. Horsley for Sir Henry Cole, 1843

Back in 1843, Henry Cole commissioned the first Christmas card, which depicted a family gathered around a table and contained a Christmas message. The prohibitive cost of these cards meant that they did not catch on straight away, and children were instead encouraged to make their own. However, the rapidly advancing industrial age made it possible to use colour printing to produce Christmas cards at a faster rate, which reduced the price, and by the 1880s the sending of Christmas cards was a highly popular tradition. In 1880 alone, 11.5 million cards were produced, and we see the first intimations of the commercial machine Christmas has now become.

Another Victorian invention is the Christmas cracker. In 1848, the British confectioner Tom Smith discovered a new way of marketing sweets. He took the idea from a visit to Paris, where he had seen bonbons wrapped in paper twisted at either end. Smith produced packages filled with sweets which snapped when pulled apart; by the late Victorian period these had evolved into parcels containing small gifts and paper hats.

As with all things Victorian, the house at Christmas was lavishly decorated, and much time and effort was spent decking it out with elaborately woven evergreens around fireplaces and doors. As Christmas became more popular, these decorations assumed an ever more significant position in the house.

Read Part 2 of Julia Wood’s “A Victorian Christmas” on 30 December 2012 exclusively on City Connect.

Images reproduced from telegraph.co.uk and vam.ac.uk

Titanic: The Myth Lives on – Part 2

The sinking of the Titanic was one of the greatest tragedies of the twentieth century, and many people connected to it – survivors and those who lost loved ones – were haunted by it: the glory and the hope, so soon to become a broken wreckage lying at the bottom of the sea.

What followed the tragedy was a public mourning – characterised by disbelief and shock, the shattering of those naïve dreams and aspirations towards splendour, opulence and invincibility. 

Titanic’s untimely demise represented a kind of fall from grace for humanity: a reminder of our weakness in the face of nature, of our humble place in the scheme of things and of our failure to comprehend that weakness.

The sea cared nothing for who was rich and who was poor. As the poet Thomas Hardy wrote, “Over the mirrors meant / To glass the opulent / The sea-worm crawls – grotesque, slimed, dumb, indifferent.”

Yet perhaps the best expression of public feeling in the wake of the tragedy comes from Titanic survivor Lawrence Beesley, who perfectly captures the tremendous excitement and anticipation, so quickly followed by dashed hopes and loss:
The history of the R.M.S. Titanic of the White Star Line is one of the most tragically short it is possible to conceive. The world had waited expectantly for its launching and again for its sailing; had read accounts of its tremendous size and its unexampled completeness and luxury; had felt it a matter of the greatest satisfaction that such a comfortable and above all such a safe boat had been designed and built – the “unsinkable lifeboat” – and then in a moment to hear that it had gone to the bottom as if it had been the veriest tramp steamer of a few hundred tons; and with it fifteen hundred passengers, some of them known all the world over! The improbability of such a thing ever happening was what staggered humanity.

So what of Titanic’s legacy? In the years immediately following her sinking, the town of Southampton was a community in mourning. Of her eight hundred and eighty crew, more than six hundred were from Southampton, and only fifty-one of those survived. It was a loss keenly felt.

 

Julia Wood - Features Writer

As I pointed out in my book, The Resurrection of Oscar Wilde: A Cultural Afterlife (Lutterworth Press, 2007), with regard to the tragedy of Wilde’s downfall, mourning on a collective scale induces a multitude of conflicting emotions: denial, shock, anger and disbelief. The usual questions are asked: Why? How? And, most pertinently, who?

Who was to blame for this terrible tragedy? The grieving public, the local community and those who had lost loved ones wanted someone to blame: a scapegoat upon whom to vent their anger and grief.

The search for a scapegoat led to the media persecution of J. Bruce Ismay, chairman of the White Star Line, whose good name was destroyed by the disaster and under whose shadow he lived for the rest of his reclusive life.

Also severely vilified was Stanley Lord, captain of the Californian, the ship that failed to respond to Titanic’s distress signals because Lord did not believe she was in real trouble.

The real truth was, of course, far harder to accept. There were only twenty lifeboats for the 2,223 people aboard, and Titanic, on the instructions of Mr. Ismay, had been sailing too fast for a ship of her size, leaving her too little time to turn before hitting the iceberg.

The real culprits of the Titanic disaster were human error, over-zealous optimism, and naïve exuberance and wonderment at an incredible human accomplishment of which its designers and builders were so proud. Like any tragedy, it leaves us with a vast list of might-have-beens and if-onlys.

In the words of Captain Arthur Rostron, commander of the Carpathia, the ship that went to Titanic’s rescue:
I still think about the ‘might have beens’ about the Titanic; that’s what stirs me more than anything else. Things that happened that wouldn’t have happened if only one thing had gone better for her. If only, so many if onlys. If only she had enough lifeboats. If only the watertight compartments had been higher. If only she had paid attention to the ice that night. If only the Californian did come. The ‘if only’ kept coming up again and again, and that makes the ship more than the experience of studying a disaster. It becomes a haunting experience to me; it’s the haunting experience of ‘if only’.

But the legacy of Titanic extends far beyond the immediate aftermath of the disaster. The story of the great ship’s fateful maiden voyage is mythic in a way that continues to engage us today. We are enthralled by the band, who continued to play as the ship was sinking, and by the courage of those who remained on board to try to save her. We are moved by the desperate efforts of her senior wireless operator, Jack Phillips, who signalled distress with the Marconi apparatus until the very end – “Come at once, we’ve struck a berg”; “Come quick as possible. Engine room’s filling up to the boilers.” And, of course, his last incomplete message, sent at 2.17am: “C.Q. –” And then silence as the ship sank.

What is more, the questions identified by Len Goodman in the BBC1 documentary Titanic with Len Goodman still resonate with us today because they are universally valid: Who would I be? What would I do?

These fundamental questions challenge our humanity, making us think about the kind of people we are, or the kind of people we would like to be. Watching dramatised versions of the story from the cosiness of our armchairs, we can ponder these questions relatively safe in the knowledge that our lives and our humanity will never be held to ransom in this way.

Would we have given up a place on the lifeboat? Or begged to be saved, regardless of who we left behind?

That we can engage with the story in this way has helped the myth to flourish and garner public interest, generating as it does questions still pertinent to our understanding of what it means to be a human being in the twenty-first century. Our modern, egalitarian sensibilities are affronted by the privileging of the upper classes, by the ruthless way in which the lower classes were kept locked in the lower decks of the sinking ship so that those in first class could be saved first.

But who can say how we would behave in such circumstances? The tragedy of Titanic stirs us to question what we value and how important or ethical those values are, performing a valuable function as a myth for our time. It is a myth that continues to accrue interest and generate discussion.

Margaret Thatcher: There’s No Such Thing as Society

Whether one loved or hated her, Margaret Thatcher has undoubtedly left her stamp upon British politics and her death last week has confirmed her place in history as one of the most memorable and controversial political icons.

Born in Grantham, Lincolnshire, on October 13th 1925, Margaret Hilda Roberts, as she then was, spent her childhood there; her father, Alfred Roberts, owned two grocery shops, and Thatcher, along with her sister, Muriel (1921–2004), lived in a flat above one of them.

Brought up as a Methodist, Margaret was a pupil at Huntingtower Road Primary School. From there she won a scholarship to Kesteven and Grantham Girls’ School, where from 1942-43 she was head girl.

After leaving school she went to Oxford to study Chemistry, graduating in 1947. She worked as a research chemist until the advent of her political career, which did not begin to take shape until the early 1950s.

She came to power in 1979, to a storm of controversy, as the first female Prime Minister and resigned in 1990, after falling out of favour with both her cabinet and the people.

But what will Margaret Thatcher be remembered for? Indeed, can it be right to celebrate a figure who, in her eleven-year term in office, caused such bad feeling amongst the British people?

Thatcher’s lavish funeral, to be held on Wednesday at St Paul’s Cathedral, is expected to cost around ten million pounds and, unsurprisingly, has attracted criticism from those who consider it too high-profile for such a controversial figure.

There has even been talk of the police arresting people before her funeral to stop them rioting, so strong is the feeling against her. Granted, some of these people were not even born when she was in office, and are simply using the event as an excuse to cause trouble.

Yet, for those who do remember her more stringent policies, Thatcher is still resented for the damage wrought upon the communities she tore apart. Most notable is her demolition of the unions during the Miners’ Strike (1984-85), which ripped the heart out of the mining communities and saw thousands of men out of work and families ruined.

Fuelled by bitterness at the humiliation of her predecessor, Edward Heath, in the 1970s, Thatcher had a score to settle with the National Union of Mineworkers. She settled it by introducing numerous clauses and legal measures to prevent strike action. Combined with the decline of the manufacturing industries, which had been the beating heart of working class identity, these factors impacted powerfully upon communities, especially in the north of England, where unemployment was highest.

Indeed, one of Thatcher’s legacies to us is the underclass – generations of unemployed for whom work, not merely a source of income but also a source of pride and identity, has lost its meaning. Thatcher – if one is to be visceral – ripped the heart out of the working class and stamped on it.

In many respects, the traditional working class, formed around the mining and the steel industries, was eroded, if not destroyed by Thatcher. In its place we have a lost and alienated class with a collective identity crisis.

Negative equity is another legacy of the Thatcher era. The Government’s Housing Act of 1980 allowed council tenants to buy their own homes, a policy intended to create ‘a nation of homeowners,’ but one which led to the culture of debt we now inherit. Predicated upon the assumption that the value of property would continue to rise, the housing boom of the subsequent decades created a Britain mortgaged to the hilt and drowning in debt when the inevitable happened and house prices fell.

Then there was Thatcher’s introduction, in 1987, of the contentious Clause 28, a neo-Victorian attempt to censor teaching pertaining to homosexuality in an age already blighted by the AIDS epidemic.

The act, which stated that authorities and schools must not, under any circumstances, intentionally ‘promote homosexuality,’ led to the closure of many lesbian and gay societies and clubs in schools, for fear that they were in breach of it. We can never know what harm Clause 28 did to those affected by its introduction – vulnerable young people in need of support who were rendered more marginalised and isolated by the repressive forces of central and local government.

In summary, it is a great pity that Thatcher used her formidable will and substantial energies to wound and destroy the spirit of the nation. Exercised in a different way, her powers could have achieved her goal, to ‘make Britain Great again.’

Rightly or wrongly, Thatcher defined the 1980s. She created the materialistic, self-serving ‘me’ generation who believed they could have it all without paying the price; a generation for whom opportunities were created, not awaited, and for whom there was “no such thing as society, only individuals.”

Who Wants to Live Forever? – Part 1: The Origins of Celebrity

If one had to name or describe the dominant theme of our cultural epoch, it would almost certainly be celebrity, and the public’s obsession with fame; what we have come to call the cult of the personality.

Never before has a culture been quite so fixated with fame: the quest for it; the acquiring of it, and the dealing with its pressures and demands. We are talking not simply about fame as a by-product of talent, but as a thing in itself; for its own sake. We are talking about the search for cultural icons and the making of them.

In the first part of this article I am concerned with tracing the history of celebrity because, as with most contemporary themes and concerns, there is a history behind it. Nothing is ever quite new, but merely an old idea cloaked in a different form.

The second part of this article will be concerned with the contemporary phenomenon of celebrity and the cult of the personality.

While it is tempting to see our cultural epoch as entirely unique, the phenomenon of celebrity and the cult of the personality can be traced back at least to the late nineteenth century, to the figure of Oscar Wilde, whose dictum, ‘there is only one thing worse than being talked about and that is not being talked about,’ can be taken as a sort of mantra for contemporary publicists and the celebrities whom they represent.

Oscar Wilde

Oscar Wilde was famous before he had published his best work, largely due to his flamboyant dress style, but also because of his razor-wit and his ability to, as he put it, ‘sum up all existence in a phrase.’ Wilde was the master of the one-liner, the sound-bite, of what was then referred to as the epigram.

Oscar Wilde made himself known by going to the right parties and impressing the right people, and he was lampooned in Punch magazine for his extravagant nature, his distinctive style and his conspicuous association with the newly popular Aesthetic Movement. Contrary to popular belief, Wilde did not ‘invent’ the Aesthetic Movement. Yet he has not only come to be thought of as synonymous with it: he has come to be thought of as its leader.

Wilde’s big break came in 1881, when Gilbert and Sullivan produced Patience, an opera which satirised the Aesthetic Movement. The main character, Bunthorne, was said to be modelled on Wilde, with his long hair, flowing cravats and velvet knee-breeches, yet he could just as easily have been modelled on the painter James Whistler, another conspicuous ‘fan’ of the Aesthetic Movement.

The interesting point here is that the ‘aesthetic’ image was already an iconic one, which the savvy Oscar Wilde adopted for himself and inhabited, for the sole purpose of securing himself some useful publicity.

In support of Gilbert and Sullivan’s opera, Wilde was invited to embark on an American lecture tour, an affair which bore many similarities to the world tours of today’s stadium rock bands, both in the press attention it received, and the public hysteria surrounding his arrival in the States. Wilde’s lectures were packed to the rafters with fans and detractors and Wilde writes of being requested to send locks of his hair to adoring fans, confessing that his manager, to whom he wisely delegated this unfortunate task, was now, ‘quite bald.’

In the manner of today’s celebrities, Wilde knew how to make controversy work for him. At one lecture, Wilde was confronted with a row of undergraduate hecklers, who had decided to mock him by turning up dressed in knee breeches and cravats. Wilde was, however, tipped off in advance and arrived for the lecture attired in a conventional suit.

Wilde worked and thrived within a society where the cult of the personality was at its zenith, yet when we examine the socio-cultural conditions of his time, we find that, running parallel to the rise of celebrity culture, is the decline of religious faith, the demise of organised religion and the rise of the self-nominated guru figure.

By the 1880s, Madame Blavatsky, founder of the Theosophical Society, was spreading the popularity of Eastern religions. She had made several trips to India, from which she brought back many ideas involving mysticism and the occult, including the use of various forms of divination such as Tarot cards and magic. The movement quickly grew in popularity and its members included the author Arthur Conan Doyle and Oscar Wilde’s wife, Constance.

Integral to Theosophy was the idea that there is no single god, but many gods; that spirituality is accessible to everyone regardless of religious creed, and that, in the manner of the ancient Gnostics, true wisdom is to be found within, by tuning in to the inner self, rather than looking without, to a preacher, or to God.

In this sense, what we have here, essentially, are the seeds of the modern New Age Movement. But what we also have is the foundation for a spiritually egalitarian culture, in which the role of the guru is open to anyone with something insightful or controversial to say.

The significance of this for the cult of celebrity is that the self-nominated guru may not necessarily be a religious figure but simply one who espouses a creed and accrues followers, and what is more, who does so through the machinery of the media, through publicity.

Thus we are beginning to see the rise of popular culture, replacing and usurping organised religion, or at least, offering an alternative. The transition from a monotheistic culture into a polytheistic one paves the way for the rise of the artist-as-guru, ‘preaching’ his creed to an adoring or outraged public.

According to many cultural historians, including Joseph Campbell, art is the spirituality of our time. It facilitates the exploration of our inner selves and can act as a moral compass; expressing the ‘soul’ or spirit of the age.

But it can also be the means for the artist to achieve celebrity and, more importantly, immortality. With the prospect of ‘heaven’ and the hereafter looking less believable to many, celebrity may be the only way we feel we can cheat death.

For us in the contemporary world, however, the phenomenon of celebrity has evolved into one in which the artist, or other person of talent, is often removed from the equation, thus making celebrity possible for anyone who knows how to manipulate the media. This will be the subject of Part Two.

Titanic: The Myth Lives On – Part 1

“As the smart ship grew/In stature, grace and hue/In shadowy silent distance grew the iceberg too…”
(Thomas Hardy, The Convergence of the Twain. Lines on the Loss of the Titanic)

At the approach of the one hundredth anniversary of its demise, Titanic mania seems to be gripping the nation. Dramas and documentaries are in abundance, including Julian Fellowes’ five-part series Titanic (ITV1), Titanic with Len Goodman (BBC1) and another factual documentary on Sunday 15th exploring the testimonies of the ship’s survivors. In Belfast, where the ship was built, there is a series of commemorative events, including a tribute concert with special guest stars.

Titanic – both the vessel itself and the tragedy which befell her – have passed into legend. The tragedy has survived the ages, delivering to us its tale of human fallibility and vanity, ambition and hubris. As many have pointed out over the last few weeks, it is a story that still means so much, a tale that continues to capture the public imagination.

The story, Len Goodman noted in the conclusion to the second part of the Sunday night documentary Titanic with Len Goodman, leaves us asking ourselves two questions: who would I have been, and what would I have done? But there is another question that has to be on people’s lips: how could it have happened?


Titanic with Len Goodman on BBC1

How could a ship, built with such faith and hope and ambition, have encountered such a tragic end?

Titanic was built by the famous ship-builders Harland and Wolff, but right from the start there was a touch of doom about her. Eight men died in industrial accidents during her construction, which took place in a dockyard built for the specific purpose of accommodating her immense size. (Titanic was eight hundred and eighty feet long and weighed forty-six thousand tons.) She took fifteen hundred men and three years to build and was launched in May 1911. The crew that queued up to work on her numbered in the thousands. Titanic was more than a ship. She was a phenomenon, an event.

But to us, Titanic has become even more than that. She is a symbol, a figurative illustration of Edwardian optimism and invincibility. In James Cameron’s 1997 film, Titanic, when Mr. Ismay, chairman of the White Star Line, which owned the ship, is informed of her plight, he responds with, “Titanic can’t sink”. “She’s made of iron, sir,” replies Titanic’s head designer and naval architect, Mr. Andrews. “I assure you, she can.”

It was precisely this kind of hubris – which in many ways typified the naïve optimism of pre-war England – which led to the disaster. Titanic had had a lot invested in her: in time, in money, in energy and in faith, to say nothing of the lives she claimed before she had even embarked upon her maiden voyage. But it was the fallacy of her invincibility that was her undoing. In the words of Philip Franklin, White Star Line’s Vice President, ‘I thought her unsinkable and I based my opinion on the best expert advice.’

Behind Mr. Ismay’s push for Titanic to sail faster and arrive in America early, so as to secure a good headline in the American press, and behind the insufficient number of lifeboats for the passengers on board – also the result of Ismay’s poor judgement – lay the belief that Titanic was unsinkable.

Underlying that belief is the ideology of ambition, the indomitable spirit of Empire. Although built in Ireland by Irish hands, Titanic carried with her something of the spirit of that Empire: the conqueror’s sensibility that nothing could possibly destroy her.

Thus, when Titanic sank, she took with her more than her passengers and crew. She took much of that Edwardian naivety so characteristic of the English nation in the time before the First World War, which would see things tipped upside down forever.

Titanic sank in the early hours of April 15th 1912. She was like a portent of doom, prefiguring the disintegration of Edwardian society by a mere two years and – at the risk of invoking a dreadful cliché – she prefigured a society that would be irrevocably afflicted by the all-too-powerful waves of change that came with the First World War.

The ship, sailing blithely on the waves that would destroy her, was a cross-section of our class-ridden society: the social system in miniature, upended, turned upside down and then sunk by hubris, optimism and the quest for glory. Indeed, one might just as easily be describing the outcome of the catastrophic First World War and its effect upon England, an event a mere two years away from the famous maritime disaster.

In Part Two, I will explore the legacy of the tragedy, articulating it in terms of a public mourning and its accompanying emotions: denial, guilt, anger and loss.

Image reproduced from art.com, redeyechicago.com and cartridgesave.co.uk

Sporting in the Olympic Theatre

Olympic fever appears to have taken over the country, with Britain hosting, and performing astoundingly well, particularly in the cycling events, perhaps even putting cycling on the map as a national sport.

In such economically dark times, a bit of flag-waving patriotism may be just the tonic the country needs: the morale-booster that helps people forget their troubles.

Yet the Games isn’t just sport: it is an Event-with-a-capital-‘E’. It is sport invested with all the pomp and pageantry of which humans are capable, with its opening and closing ceremonies, its media hype and its crowd-pleasing winners. Athletes competing in the Games are not merely sportspeople but performers, playing to a crowd, putting on a show.

It seems that, in spite of the increasing de-formalisation of society (even David Cameron’s wife opted not to wear a hat to Kate and Wills’ wedding at Westminster Abbey), we still love the thrill and the glory of a formal event properly staged: the build-up, the presentation, the sheer performativeness of the display. The Olympics, purely and simply, is a three-week-long piece of marvellous theatre, with sports commentators continually referring to an athlete’s ‘performance’ in their chosen event. Winning a race is deemed to be ‘a fine performance.’

In terms of its history, the Games is nothing new. It originated in Greece, taking its name from Olympia, the place where it was held. As early as 776 B.C. the games took place every four years, although it is conjectured that they had already been established many centuries earlier. Events in these early Olympics were confined to running.

Other events were subsequently assimilated into the Games, such as wrestling and the pentathlon; an event which, in Ancient Greece, consisted of a day’s worth of contests, and included long jump, javelin, discus, a short ‘foot race’ and wrestling.

However, in A.D. 394, the Roman Emperor Theodosius I, in some kind of personal crusade against the nature religions, abolished the Games on the grounds that they were too pagan.

The first ‘modern’ Olympic Games was held in 1896, the International Olympic Committee having been founded two years earlier, in 1894. It was held in Athens, the home of the original Games, and featured fourteen countries, with a total of two hundred and forty-five athletes competing in forty-three events.

Of course, there was no television coverage of this first modern Games, only the live experience of seeing the event for oneself. Nevertheless, the Victorians doubtless enjoyed the spectacle in much the same way as we do today, marking it with both an opening and closing ceremony.

The 1896 Games did indeed begin with a grand opening ceremony, held on April 6th and the Panathinaiko Stadium was thronging with around 80,000 spectators, who listened in anticipation as Crown Prince Constantine declared the inaugural Games officially open.

1896 Olympic opening ceremony in Panathinaiko Stadium

Of the fourteen nations that competed in the Games, ten earned medals, the U.S.A. being the nation to earn the most gold medals, while Greece, the host country, won the greatest number of medals overall.

Of course, all the competitors were men, since the founder of the I.O.C., Baron Pierre de Coubertin, declared that to include women would be, ‘impractical, uninteresting, unaesthetic and incorrect.’

In hearty defiance of this stipulation, one woman, Stamata Revithi, did run the marathon course on April 11th, the day after the official race had been run. Revithi finished in around five hours and thirty minutes, and managed to achieve verification for her running time by persuading some witnesses to sign their names as proof of her achievement.

As to the sports themselves, the 1896 Games did not differ much from our modern Games in terms of the events that were staged. There was Athletics, including a marathon and track running, in which the American Thomas Burke won the hundred-metre race with a time of twelve seconds. Burke also won the four-hundred-metre race, finishing in just over fifty-four seconds. No world records were broken, perhaps because not many top athletes had opted to compete.

Interestingly enough, Thomas Burke was one of the first ‘modern’ athletes to crouch down at the start of the race instead of starting from an upright position, a move which confused the jury, who, perplexed, allowed him to start in this way.

Thomas Burke (2nd lane from the left)

In addition to the Athletics, there were Gymnastics, Fencing, Shooting, Weightlifting, Tennis, Swimming, Wrestling and, of course, Cycling, the track events of which took place in the then newly constructed Neo Phaliron Velodrome, a building probably not too dissimilar from the London Velodrome, in which our modern cyclists are competing in 2012.

All cycling competitions employed rules created by the International Cycling Association, and there was only one road event: a race of eighty-seven kilometres from Athens to Marathon.

The Frenchman Paul Masson won the track cycling, achieving victory in the one-lap time trial, the sprint and the 10,000 metres, while Adolf Schmal, an Austrian, won the marathon, which only two cyclists managed to complete.

Paul Masson

The Olympic Games has been held every four years ever since, with the 1900 Games taking place in Paris. The 1948 Games was the first to receive television coverage.

So what is it about sporting events of this magnitude that gets us all fired up? Is it the sport itself, or the media hype surrounding it?

My guess is that, in a world where political correctness has deemed competitiveness a negative thing, people enjoy watching sportsmen and women competing in a friendly, good-natured way, sympathising with the losers and celebrating the winners. At the risk of sounding dreadfully cheesy, one might almost say the Games is like life, but with all the boring bits edited out. In other words, it is like a piece of art.

Interest in the Games shows that we still love a good show. The age of pomp and ceremony is not quite dead and the Games is feeding upon our human love of theatre and display, all in the name of some fine performances from our home-grown athletes.

Image reproduced from en.wikipedia.org and listverse.com

Workhouse or Workfare? Attitudes Haven’t Changed – Part 2

Julia Wood, author and scholar, continues her discussion of “the undeserving poor”, workhouses and today’s attitudes to the unemployed.

dole street

Throughout the nineteenth century, workhouses became places of refuge for those who were vulnerable, whether because they were ill – mentally or physically – or because they were disabled. These people were made to work to earn their provisions, which were negligible and sparse: a type of watery gruel served with bread was the staple diet, though sometimes meat and potatoes were provided, and supper would usually consist of bread and cheese and, if they were lucky, some kind of broth.

Workhouse conditions were extremely harsh, and sanitation often negligible, a far cry, one might think, from conditions for the poor in England today.

Yet it is in their attitudes to the poor that the Victorians bear a striking similarity to today’s politicians and tabloid press. Just as today’s benefit claimants are characterised by the media and the Government as work-shy, good-for-nothing scroungers, so in the nineteenth century the Victorian underclass – ‘the undeserving poor’ – comprised those generally regarded as ‘beggars and cheats.’

Indeed, a pamphlet published in 1862 by Henry Mayhew describes itself as A Victorian Guide to Those That Will Not Work and talks about a class of, ‘beggars, thieves, drunkards, gamblers and prostitutes’ not dissimilar to the underclass identified today.

The Victorians divided the poor into two categories: deserving and undeserving. The former term included anyone hard-working and diligent who was a victim of circumstance and who, through illness or loss of earnings, was forced to throw themselves onto the mercy of the state.

The latter – the undeserving poor – was the term used for those regarded as work-shy and idle, those who did not want to work, or who refused to do so, preferring instead to live off the public purse (then, as now, poor relief was subsidised by taxpayers).

It is this latter category which describes today’s attitudes to those on state benefits. The media continues to perpetuate histrionic propaganda about ‘scroungers’ who live like royalty at the taxpayers’ expense; stories abound about families – often immigrants – with multitudes of children who reside in luxurious houses, complete with a detailed breakdown of what this costs the taxpayer per annum.

People claiming sickness benefit are depicted as work-shy and ‘faking it’, and an exception – perhaps a small handful of people conning the system – is taken as the norm and used as an excuse to denigrate the vulnerable poor.

The new and controversial Workfare scheme, in which benefit claimants are forced to work for their benefits or risk losing their entitlement, is similar in principle to the Victorian workhouses and presents the same opportunities for the labour exploitation of the vulnerable, the disabled and the mentally ill.

In conclusion, social conditions may have improved a bit since the Victorian era but attitudes have not. At a time of economic recession, when an angry nation is hungry for a scapegoat for its financial woes, the benefit-claiming poor are as vulnerable to public opprobrium and scorn as they were in the nineteenth century.

And although we are a secular culture, the Christian Work Ethic of humility, frugality and diligence still has the power to shape public attitudes to the unemployed.

Image reproduced from thelunaticarms.wordpress.com

Nostalgia – Part 2

“The past is not dead; it is living in us and will be alive in the future, which we are now helping to make.” – William Morris

In Part One of my nostalgia-themed article, I pointed out that the cult of nostalgia is not new, discussing the Victorian predilection for an idyllic past as exemplified in the illustrations of Kate Greenaway.

This week, in Part Two, I will be exploring the contemporary predilection for all things historical. This is a phenomenon we find expressed in the popularity of vintage clothing, and ‘retro’ styles of music, but which is perhaps most interestingly expressed in the popularity of the ITV1 series, Downton Abbey.

Cast of Downton Abbey

Fashion and popular music have always inclined towards the referential, invoking familiar sounds and images that hark back to different eras. Treating the past as a vast shopping centre from which one can annex any style one chooses and reproduce it as a consciously ironic reference lies at the heart of post-modernity, to cite an over-used phrase. It is what one might call conspicuous irony.

Madonna has achieved precisely this type of relentless referentiality throughout her musical career. Her video for Vogue, released in 1990, was a pastiche of Marlene Dietrich and the film noir genre, and her 1989 video for Express Yourself was a direct reference to the Fritz Lang film Metropolis.

Marlene Dietrich and Madonna

There is, however, a fundamental difference between this type of post-modern referentiality – self-conscious and ironic in essence – and the simple nostalgic yearning to return to an idyllic past. The post-modern sensibility is one in which there is a disenchantment with meaning.

The current leaning towards nostalgia is closer to what can be described as pre-modern, since it expresses a simple desire to revisit a time, or times, that have passed; not in order to repudiate or subvert meaning, but to recover it.

The pre-modern sensibility has more to do with taking a stand against the modern world, or expressing disenchantment with it, and it is my belief that we are evolving from a post-modern culture into one that is pre-modern. That is, we have become disenchanted with disenchantment. The endless reproduction of images dissociated from their original meaning, so beloved of the ironic post-modern sensibility, is ultimately unnerving and disorientating, because as human beings we have a basic need to discern meaning in our lives.

The success of ITV1’s Downton Abbey, for example, has nothing to do with irony and everything to do with the type of pre-modern nostalgia that yearns to return to a time when things seemed to make more sense.

Of course, the catalyst for this current nostalgic mood is the recession. Times of economic uncertainty cause us to become introspective and nervous about the future, so we retreat instead to the relative comfort of the past, which seems rosier and cosier than our bleak present.

Economic uncertainty can also make us feel cast adrift, unsure of our place in the great scheme of things and as we struggle to comprehend how the Western World got itself into such a mess, we may wonder if the phenomenon of the economic boom is becoming a thing of the past.

The past, of course, is a non-threatening place. We already know what the past is, or we think we do. The future, on the other hand is, as Shakespeare would say, “that undiscovered country,” an unknown place that, in times of recession, can become a terrifying one.

It is of particular note that the equally economically depressing 1970s – also a time of recession – saw the power shortages and the oil crisis of 1973, when the Conservative Prime Minister Edward Heath imposed a three-day week as an emergency measure. During this time there was a plethora of period dramas, such as Poldark, The Onedin Line, Flambards and, of course, Upstairs, Downstairs.

Cast members from Upstairs Downstairs

The fashion for period dramas says much about our collective fears, our cultural aspirations and perceptions. The Edwardian era, in which Downton Abbey is set, was a time before the world changed irrevocably, before the carnage of the First World War, when the collective consciousness of the nation was still relatively naïve and idealistic.

In Edwardian times, the climate of optimism and innovation so characteristic of the Victorian era was still very much in evidence, as the Edwardians lived through a host of new-fangled innovations: the automobile, electricity, the telephone, central heating.

Indeed, there are some amusing scenes in Downton Abbey where characters struggle to cope with new inventions. Carson, the butler, has no idea how to use the newly installed telephone and holds it the wrong way round, then jumps out of his skin when the operator comes onto the line.

Downton Abbey's Carson played by Jim Carter

It is easy to see why, as well as envying what we perceive as their comparative complacency, we might identify with the Edwardians. They struggled to adjust to their new world much as we have struggled to adjust to ours – in our case, grappling with the complexity of computers amid the often too-rapid advancement of technology, and coping with the consciousness shifts brought about by global capitalism.

Like us, the Edwardians lived and worked in an uncertain, ever-changing world, in which the everyday lives of ordinary men and women were being revolutionised, both economically and culturally. The Edwardian world – contrary to the popular myth of the Long Summer – was not stable and secure, but fraught with protests and strikes. It saw the rise of the working man, through the founding of the Labour Party and the trade unions, as well as the long and often violent struggles of the Suffragette movement, which by the end of the First World War had secured votes for women.

Yet somehow our lives seem so much more troubled and uncertain than those of our Edwardian predecessors. It is not with irony that we follow the story of Downton Abbey’s cast of characters, but with a sentimental and perhaps rather self-indulgent fondness for times past, for ‘better’ times.

Until the recession ends, if it ever does, our days of conspicuous irony are over.

Images reproduced from madblackcat.com, thepixeljunkie.blogspot.com, guardian.co.uk and itv.com

Workhouse or Workfare? Attitudes Haven’t Changed – Part 1

workhouse

The Workhouse

Few people would dispute that social conditions and standards of living for the poor have improved since Victorian times. Yet, the Government’s draconian measures against benefit claimants suggest that conditions may have improved but attitudes have not really changed.

The poor, especially those unable – some would argue unwilling – to work, are regarded with a vitriolic contempt similar to that of the nineteenth century. The Victorians called those deemed unwilling to work ‘the undeserving poor’ – a group referred to these days as the underclass.

The relationship between the poor and work has always been complex, having its roots in the Christian belief in the redemptive power of work. The Protestant Work Ethic, as it has been described by social historians, was founded upon the belief that humility, frugality and good old fashioned hard work were steps along the road to salvation.

The Victorians strongly believed in the notion of work as a means of keeping the poor out of trouble and keeping them humble, so that they could be utilised as cheap labour – about which, the Christian Work Ethic was adamant, they should not complain, since they would gain their reward in heaven.

During the Victorian period there was, of course, no benefits system, and the only means of support for the very poor was to enter the workhouse, where they would have to endure hours of tedious, menial work for little pay and negligible nutrition.

Workhouse inmates were put to work at jobs such as picking oakum with a spike (workhouses were colloquially known as ‘spikes’, perhaps because of this), breaking stones, and crushing bones for use in fertiliser. Some were so hungry and malnourished that they would suck the marrow from the bones before crushing them.

Of course the Victorians did not invent the workhouse, though the inextricable link between the workhouse and the nineteenth century is due in part to Charles Dickens’ Oliver Twist, in which the workhouses are depicted as corrupt and filthy and their inmates as malnourished, starving and desperate.

The workhouse has roots as far back as the fourteenth century, in the Poor Law Act of 1388, when the labour shortages caused by the Black Death meant that the movement of labourers needed to be restricted, and the poor became the responsibility of the state.

At the end of the Napoleonic Wars in 1815 there was mass unemployment, a problem worsened by new agricultural technology that made many farm labourers redundant and therefore reliant upon state support. Poor relief was becoming difficult to sustain.

Thus, in 1834, a new Poor Law Act was introduced that attempted to tackle the problem of an ever-burgeoning, state-dependent poor. This act has been heavily criticised for its harshness, since its chief objective was to discourage or refuse poor relief to those who would not enter the workhouse, thus forcing many people to do so against their will.

Many authorities saw the opportunities for cheap labour and exploited them to the hilt. As we know from Charles Dickens, whose early life was shaped by harsh experiences of poverty, the workhouses were deliberately made unbearable in order to ensure that the able-bodied did not enter them, and that only the really desperate applied.

Join us next Thursday when Julia continues her discussion of the attitudes to the poor in Part 2 of Workhouse or Workfare?

Image reproduced from tumblr.com

Ghost Stories – Part 2

In Part One, I looked at the history of the ghost story, utilising Freud’s essay The Uncanny to argue that ghosts are representations of collective fears and prejudices. I argued that, in the Victorian era – when there were stronger boundaries and taboos – ghosts were represented in fiction as threatening, mysterious and sinister, consistent with the prejudices and fears of which they were an expression.

The Victorian ghost is, on the whole, a portent of doom and calamity whose true essence can never be known, since the ghost is not of this world but a figure less than human; a dehumanised entity whose purpose is often, but not always, a subversive one.

In Part Two, I will be exploring the modern ghost story and looking at how the representation of ghosts has changed.

Like most other things in contemporary culture, the ghost has, in fact, been claimed by post-modernism, given a reworking on the post-modern principles of collapsing polarity and unsettling pairs of opposites. In the case of the ghost story, this polarity is the life/death polarity.

To clarify this, two examples of ghost stories that have adopted these principles are The Others, starring Nicole Kidman, and The Sixth Sense, starring Bruce Willis.

In both of these films, the viewer is unaware at the start of the story that the main protagonists are themselves ghosts, viewing life from an outsider’s perspective. Thus the ghost – instead of being cast as the outsider looking in, the figure on the margins – is centralised as the main point-of-view character. Viewers – unless they notice the signs at the beginning of the story – are unaware that they are seeing the world through the eyes of the ghost.

In The Others, Nicole Kidman’s character, Grace, lives in a large deserted house in Jersey after the Second World War, with her photosensitive children. She is waiting for her husband to come home from the war, though unbeknown to her, he is dead. Everything changes for her when some servants turn up at the house asking for work, even though the advert Grace had placed had not yet been published in the newspaper.


Nicole Kidman in The Others

There is something disturbing about the servants – a housekeeper, a gardener and a mute girl – and it transpires that they are ghosts. Gradually it becomes apparent that Grace had a breakdown and killed her children and herself, and that they have yet to accept that they too are dead.

In this moment the revelation for viewers is that they too must accept that they have been seeing the world through the eyes of a ghost. Such a revelation has the very post-modern effect of unsettling the neat, established polarity between the living and the dead; of challenging and overthrowing the distinction between self and other.

In The Sixth Sense, something similar happens with Bruce Willis’s character, the child psychologist Malcolm Crowe. Crowe becomes involved with a boy who ‘sees’ dead people, not realising that the boy only sees him because Crowe himself is dead.

Throughout the film there are tiny clues to this effect, such as when Crowe is dining with his wife and she does not actually acknowledge his presence, though he interprets her gestures and reactions as a response to him.

Once again the ghost is cast as a central, point-of-view character and the viewer is tricked into seeing the world through the eyes of the ghost until the revelation comes, and the line between life and death is challenged and, temporarily, overthrown.

Such examples, I would argue, illustrate a culture tolerant of difference; a society that seeks to embrace and celebrate the marginalised point of view, collapsing or overturning the often oppressive polarities that have governed the way we think – especially where those polarities have been transcribed into a governing ideology in which the first of a pair of opposites is the dominant one, e.g. male/female, white/black, straight/gay.

Thus, the post-modern ghost reflects the cultural need to dismantle the oppressive polarisation of opposites being challenged elsewhere in our language.

Contemporary representations of the ghost symbolise this trend towards inclusion and the embracing of difference, reflecting society’s move away from the marginalisation of minorities and the suppression of alternative voices and points of view. In short, in societies where there are fewer taboos, the figure of the ghost is friendlier and less threatening.

Of course, not all modern ghost stories conform to this pattern. The recent adaptation of Susan Hill’s The Woman in Black, starring Daniel Radcliffe, reverts to a more traditional ghost story. Indeed, Hill’s tale could easily have been written in the late nineteenth or early twentieth century, drawing as it does upon the classic tradition of the malevolent figure with malign intentions, the sighting of which is a portent of doom and disaster.

Thus, it is safe to say that, in this era of technological excess and the relentless march of scientific progress, the ghost story is experiencing something of a revival. This should perhaps not come as a surprise: a fascination with spiritual matters abounded in the late nineteenth century, when society was teetering on the brink of a new world and experiencing the often negative effects of the Industrial Revolution – effects which, for many, included the dehumanisation and mechanisation of life through mass production and the factory system. In such conditions, the need for a re-spiritualisation of life, whereby we are reminded of our humanity, was ever-present.

In our time, the increasing pace of technological progress, the advances made by science, as well as the gradual decline of orthodox religion have each helped create a spiritual void.

Human beings have a fascination with mystery, with things that defy logic and explanation; with things which, as it were, go bump in the night. The popularity of programmes such as Most Haunted testifies to this.

In spite of how far science has come, or maybe because of it, we still like to think that the world has retained an element of mystery; that there is something out there which is beyond the scope of the relentless rationality and analytic scrutiny that has become the governing sensibility of our time.

Perhaps too much knowledge is a burden and we feel safer with the idea that some things can never be known. Or perhaps we hanker after the lost innocence of a less rational age, where myth could fill in the gaps left by science and our imaginations could flourish without censure or fear of ridicule.

Whatever the reason, ghost stories still grip the imagination with as much fervour as ever, reminding us of that primal fear of the dark; even in our over-lit, clinical age.

Image reproduced from thefilmpilgrim.com

Nostalgia – Part 1

We live, it would seem, in nostalgic times. Clothing now hailed as the height of fashion by critics and fashionistas is, more often than not, derivative of earlier times – usually the 1960s and 1970s, sometimes earlier. Cultural critics often regard this as referentiality: a self-conscious and ironic invocation of the past through the replication of familiar images, known as post-modernism, in which nothing is produced, merely reproduced. Yet behind such post-modern referentiality is a longing for better times that is anything but “ironic”. In Part Two of this article I will be exploring this contemporary tendency towards nostalgia in the light of the success of period dramas such as Downton Abbey.

Nostalgia, however, is nothing new. From the mid-nineteenth century the Industrial Revolution, which urbanised the English landscape through expanding cities and the building of new factories, gave rise to a growing nostalgia for times past and for a forsaken rural idyll.

No artist epitomises this reflective mood better than the children’s illustrator and writer Kate Greenaway, who harked back to the eighteenth century for the inspiration for her characters’ clothing, drawing her ideas from the empire-line dresses and pantaloons fashionable during that period.

Kate Greenaway

Greenaway’s idyllic childhood paved the way for the idealised portrayals of childhood in her paintings and illustrations. To this day, her romantic rural images of immaculately attired children playing in lush gardens on perfect summer days symbolise a yearning for a lost innocence which, while seeming a little sentimental in our cynical times, nevertheless still speaks volumes about the English attitude to landscape – essentially one of melancholia and loss.

Born in London in 1846 to an artist father and a mother who ran a gift shop, Greenaway, along with the painter Helen Allingham, was one of the most successful female painters of her day. Greenaway studied at the Slade School of Art, after which she began producing illustrations, and had her first exhibition in 1868, at the tender age of twenty-two; it included a watercolour and a series of illustrations for fairy stories. Following this, interest in her work was such that she received a commission from the editor of the People’s Magazine, which led to her being asked to illustrate Christmas and Valentine cards for a company called Marcus Ward. These designs secured her further commissions and she began to achieve a modicum of success as a freelance illustrator: by 1871 her annual income amounted to just over seventy pounds, and by 1877 this had reached around three hundred pounds. In addition she held exhibitions at the Royal Academy, as well as taking on regular commissions from the famous Illustrated London News.

Illustration from The Pied Piper of Hamlin

Her partnership in 1878 with Edmund Evans, reputedly the finest engraver in London, led to the production of her first children’s book, Under the Window. This secured her position as the most famous illustrator of the Victorian age, and by 1881 her annual income was in the region of £1,500 – a modest-sounding sum by modern standards, but a sizeable amount in late Victorian times.

A study of her work reveals a surprisingly broad range of influences. While at first glance we discern a traditional and rather sentimental Victorian fussiness, upon closer inspection her use of detail and sense of design owe much to the Pre-Raphaelites and the Aesthetic movement, both popular artistic fashions of the time. Her passion, and the prevailing subject of her paintings, was nature and the study of the natural world, a factor which aligned her with artists such as Rossetti and Lord Leighton, figures associated with the Pre-Raphaelite and Aesthetic movements.

Greenaway led a relatively sheltered life and did not travel much, a factor which is reflected in her quintessentially English drawings, though she counted among her friends some of the greatest creative figures of her day, including the poets Browning and Tennyson, as well as the cultural critic John Ruskin, whose ideas would later influence Oscar Wilde.

One of Greenaway’s most interesting legacies is her influence upon children’s fashions. The high-waisted empire gowns depicted in her illustrations and paintings became the fashion for those who liked to dress their children in historic clothing. This paved the way for a romanticised ideal of childhood, in which freedom and play, expressed through the physical freedom of the loose flowing gowns, became central to the notion of a lost innocence, of a forsaken childhood.

What is noteworthy here is the link between the “innocent” childhood and the “historical” style of the clothing in which the children are depicted. From an adult perspective and in the popular culture of the day, childhood is mapped as the lost idyllic past, a factor that is borne out by images of clothing that refers back to an earlier time.

May Day

When something – such as childhood, or landscape – is perceived as irretrievably lost it becomes idealised, like the Eden myth, and we are barred from returning to it by the proverbial flaming sword. It gives rise to a yearning – to what I have elsewhere described as a “wound of lost community” (The Resurrection of Oscar Wilde, A Cultural Afterlife, Lutterworth Press, 2007) – a longing to be elsewhere; to be in a better place, somewhere other than here and now. Because we human beings are quixotic creatures we buy into the myth of the perfect past, of the lost Eden.

These days, Kate Greenaway’s paintings, along with those of her contemporary, Helen Allingham, are often to be found hanging in pubs and hotels, teasing us with the promise of Eden with their images of happy children playing in meadows in flowing gowns. In our cynical times the child in the sunny meadow is still a powerful image, one that resonates powerfully with those who lament the erosion of the natural environment and the ever expanding metropolis.

A Victorian Christmas – Part 2

In the second part of her look at the Victorian Christmas, Julia Wood examines the customs and traditions the Victorians started which we continue today. Click here to read part one of this article…

Until the Victorians, the giving of presents had been a New Year tradition, but it was moved to Christmas to reflect the significance of the Christmas festival. Because they were small, modest and relatively light, these early gifts were usually hung on the Christmas tree. However, gift-giving quickly became more central to Christmas: the gifts grew bigger, and people bought rather than made them. Their increased size and weight made it impractical to hang them on the tree, so they were placed underneath it.

Although the notion of the Christmas feast has its origins in the mediaeval period, it was during the Victorian era that the meal we have come to associate with Christmas first began to emerge. Mince pies were originally made from savoury meat and not from fruit, but in the Victorian era recipes without meat began to gain popularity, giving us the mince pies we know today.

Meats such as roast beef and goose had been in vogue until the Victorians added turkey to this repertoire, at least in the wealthier echelons of society; by the early twentieth century turkey had become the main Christmas dish.

Although the first collection of carols was published in 1833, four years before Victoria came to the throne, the Victorians also revived and popularised carols, setting old words to new tunes.

What is more, the Victorians have bequeathed to us the notion of Christmas as a family time, with all the festivities, such as eating and parlour games, centred upon the family.

However, Victorian society was only at the dawn of consumer culture, and people did not have the profusion of presents they have today. Even the children of wealthy families would have been unlikely to receive more than one present, and many gifts were hand-made rather than shop-bought. Popular amongst wealthy children were Dutch dolls, a doll’s house or the newly emerging teddy bear.

Without television or computer games people played socially interactive games – parlour games like Charades and Blind Man’s Buff and of course most houses would have had a piano around which the family would gather to sing popular songs.

In our socially isolated times, many people live alone, and the insular nature of television and computers makes social interaction less likely, even for those with families. There is, it seems, a warmth and conviviality about the Victorian Christmas – or at least our perception of it – which we find irresistible. The past, and especially Christmases past, exude a luxurious and sumptuous comfort. Perhaps we feel people were happier then, in spite of consumption and the workhouses and the crippling poverty in the slums, where people often slept six to a bed.

An internet search for anything about Victorian Christmas will turn up a wealth of sites advertising Victorian ‘Fayres’ and Victorian themed Christmas events. Perhaps this is simply because we have inherited our notions of Christmas from the Victorians, who did, after all, invent it. Or perhaps Christmas simply brings out the sentimental traditionalist in everyone.

Image reproduced from michellehenry.fr

Ghost Stories: Part 1 – A Potted History

Ghost stories, it would seem, retain a timeless appeal. In the first of another two-part article, I will be looking at the history of the ghost story.

In the second, I will be exploring why ghost stories are still capable of captivating us today, how the Victorian model of the ghost has shifted from dehumanised to humanised spectre and why. I will be examining the treatment of the subject in modern films and television, arguing that contemporary writers and film-makers treat the world of the afterlife and the idea of the ghost quite differently from our Victorian predecessors.

This cultural shift, I will argue, is one that reflects our greater tolerance for those once confined to the margins of society.

People throughout the ages have enjoyed stories about ‘spooks’ and spirits. Shakespeare’s Hamlet and Macbeth are gripping ghost stories that still appeal to the contemporary imagination.

Hamlet is plagued by the ghost of his father, urging him to avenge his death by murdering his uncle, Claudius; while Macbeth is troubled by disturbing visions of Banquo’s ghost after sending two assassins to murder him whilst he is out riding.

However, the era most notable for its ghost stories was the Victorian period, when the gothic novel was at the height of popularity.

The Victorians loved a good ghost story and literature of the time abounds with tales of hauntings and ghostly happenings. Writers in the Victorian gothic genre included Henry James, Sheridan Le Fanu, M.R. James, Oscar Wilde and Charles Dickens, to name but a few.

Perhaps the most famous ghost story of all is Dickens’ A Christmas Carol, published in 1843, which tells the story of Ebenezer Scrooge, who is visited by the spirit of his business partner, Jacob Marley, described by Dickens as having, ‘a dismal light about it, like a bad lobster in a dark cellar.’ Scrooge is subsequently visited by Ghosts of Christmas Past, Present and Yet to Come.

In traditional ghost stories the apparition – the visiting spirit – wants something: often, but not always, revenge. An apparition is usually a portent of doom, a sign that bad things are going to happen to the person to whom it pays its ‘visit’, and the experience of an other-worldly visitation is, more often than not, an unpleasant one.

The etymological root of ‘ghost’ is, ironically, quite hard to pin down, but it is most commonly traced to the Old English ‘gast’, cognate with the German ‘Geist’, from a root meaning to be excited, amazed or frightened – and this is invariably the effect upon the character in the story who sees a ghost.

Scrooge is all of these in turn, since the three ghosts of Christmas appear as catalysts for his emotional and psychological transformation, and ultimately produce a positive effect upon him. Yet they are also sinister, because what Scrooge sees frightens him – and they frighten us too, because, as readers, we do not initially know what their purpose is.

It is this sense of the unknown which is a key factor in creating the dramatic tension in any ghost story. Yet this unknown is a paradox, because it is also the known: the familiar fear, the primal terror of something beyond our understanding which has its origins somewhere deep within the human psyche.

Sigmund Freud describes this in his essay The Uncanny (1919) as das Unheimliche – ostensibly ‘the opposite of the familiar’, but actually something foreign and familiar at the same time – ‘heimlich’ and ‘unheimlich’, sometimes translated as homely and unhomely.

The Uncanny presents a highly persuasive explanation of the cultural role of ghosts in our folklore and in our literature, and one which serves us well in understanding how the role of the ghost has shifted in modern times, moving from dehumanised to humanised as our cultural values have simultaneously shifted from prohibitive to permissive.

Culturally, Freud argues, taboo subjects and ideas with which we are uncomfortable – often of a sexual origin – become repressed and projected onto objects or figures, which are thereby imbued with the very fears and anxieties we carry within us and, as a result, deemed frightening or, in Freud’s word, unheimlich: uncanny, unfamiliar.

Traditionally, ghosts represent our repressed fears and anxieties; cases where the familiar has become strange, sinister, threatening – beyond the realms of the human, no longer living and yet not dead. They invoke feelings of distaste and fear because we cannot identify them.

We are unable to pinpoint their origins or work out where they belong within the polarity that is life/death. They belong in the shadows, on the margins, outside the boundaries of society. Of course, the Victorians had far more clearly defined boundaries than we do, coupled with a greater sense of censoriousness defining their notions of the acceptable and the unacceptable.

To name a couple of Victorian taboos: homosexuality was still punishable by law, and having a child out of wedlock would condemn the mother to the life of a pariah.

Thus, if, as Freud argues, ghosts are an expression of repressed fears and anxieties – the familiar turned frightening – then it is no surprise that the traditional Victorian ghost (perhaps with the exception of the affable Sir Simon de Canterville in Wilde’s The Canterville Ghost) should be unidentifiable and sinister.

The Victorians had stricter moral and social codes than we do; stronger boundaries for what was within, and what was outside, the realms of society and the norm; stronger prejudices and beliefs, both secular and religious.

It is no wonder then, that the classic Victorian literary ghost should be a sinister figure hovering on the margins of life and death, a figure mostly devoid of humanity who cannot be brought into the realms of the human but remains mysterious, elusive and unsettling.

In Part Two, I will be discussing the post-modern ghost: a new, humanised figure who is a reflection of our culture’s absorption of previously marginalised identities and value systems, as well as our need to overturn the polarities that many minorities have found oppressive.

Image reproduced from allposters.com

Who Wants to Live Forever? – Part 2: The Evolution of Celebrity

“Where has God gone?… We have killed him – you and I. We are his murderers… God is dead. God remains dead. And we have killed him.” (Nietzsche, The Gay Science)

From Oscar Wilde to Jordan

How did we get from Oscar Wilde to Jordan? This may sound like the opening line of a cheesy joke, but in actual fact the cult of celebrity has a lineage, an evolution which, like any cultural phenomenon, can be historically mapped. When we engage with this mapping process we discern a move away from the artist to the personality; away from the notion of fame as a by-product of talent and towards the phenomenon of fame for its own sake.

Even as recently as the 1980s this idea was unheard of. Those in the public eye had to have a talent – usually, but not necessarily, an artistic ability – in order to ascend to the status of celebrity.

Yet the seeds of this evolution were sown by Oscar Wilde, who once said, “I have put my genius into my life, only my talent into my works.” Indeed, the public’s fascination with Wilde’s life has often overshadowed the immense talent he possessed, with filmmakers and biographers concentrating upon the tragic story of his downfall rather than his works.

But with Wilde there is still a body of work, a legacy by which to remember him, now that all those who knew him are dead. With modern celebrities there is often no such thing. So hungry are we to adopt new idols that we have dispensed with the years of hard work devoted to developing a talent, in order to facilitate the quick, easy acquisition of fame. Reality shows such as Channel Five’s Big Brother, along with internet sites like Star Now, mean that almost anyone can pursue and achieve celebrity.

Channel 5's Big Brother

It is plausible that the current obsession with ‘instant fame’ and the cult of the personality may come to be regarded by future historians as symptomatic of cultural indolence; a reflection of the trashy, disposable society in which we presently live.

Indeed, along with flat pack furniture, high rise tenements and ready meals for one, it would be easy to regard the celebrity craze as something that shows our culture as just that: a disposable, empty and meaningless sham.

After all, remove the concept of talent from the equation and we soon have a free-for-all, in which the only criterion necessary to achieve celebrity is desperation: the desperation to be on camera and to be immortal.

But where has it come from, this need to turn our lives into a perpetual performance, to play out our lives in front of the cameras, to film, scrutinise and record every aspect of the human experience, from the day to day goings on of the contestants in Big Brother, to the D.I.Y. projects of ordinary people?

Has this impetus to observe ourselves and be observed always been there? Is it something wired into the human psyche, or is it a superficial thing, a result of the rapid expansion of media culture?

Certainly the need for idols and heroes stretches back to Hercules and beyond and is an innate human one, but the criteria defining such figures – bravery, struggle, intelligence, strength – were narrow enough to exclude most of the population. Now, what we are seeing is an expansion of the criteria defining the hero, so that to be famous is to be a hero, regardless of whether one possesses the requisite qualities of courage, vision and strength.

Hercules - the original hero?

The psychologist Rollo May was quick to point this out, arguing in The Cry for Myth (Doubleday, 1991) that one of the problems of our time is that “we have confused celebrities with heroes.” May is undoubtedly correct: we have confused celebrities with heroes.

But why? And why the restless, almost frantic search for fame on the part of those who want to be celebrities and, I would argue, the equally frenetic search for idols on the part of the public? It is suggestive of a culture that has lost its way, one that has lost the ability to navigate through the labyrinthine paths of existence towards some kind of meaning.

As I discussed in my last article, the waning influence of religion since the late nineteenth century has contributed to the search for guru figures. The gradual erosion of monotheistic culture in favour of a polytheistic one has paved the way for the egalitarian celebrity culture in which we now live. As Andy Warhol observed, “in the future, everyone will be world-famous for fifteen minutes.”

In the last few decades, as the search for idols and the quest for fame have gathered momentum, the criteria defining the guru figure have expanded, making it possible for almost anyone to become famous.

Andy Warhol

To desire fame is to desire immortality. That much is obvious. But it is also to desire that one’s every thought, movement, action and response be observed. Why? Because human beings are innately theatrical; because human life is inherently a performance, in which we need to be noticed and acknowledged in order for our lives to have meaning. If God can no longer fulfil the function of continual audience, if the hereafter can no longer be the route to immortality, then something else must fulfil those functions – otherwise, what is the point?

This is where the camera comes in. If Heaven and Hell are just concepts invented by humans and not places we go to when we die, then immortality must be rethought; the afterlife re-invented to mean being captured for posterity on camera – or at least for fifteen minutes.

In addition, the ever-burgeoning human population and the rise of technology renders us more faceless and impersonal than ever. Tired of being lost in the crowd, we want to feel valued again; we want to feel as if we matter as individuals.

In the faceless bureaucratisation of our society and the equally faceless life of the cities in which most people live, people are looking to find a way back to an essential humanity. For some, the way to achieve this is by being known to everyone. Of course this is inherently narcissistic, since it is not a two-way process. Celebrities do not care to “know” their public, simply to be known by them.

Yet what the modern cult of celebrity shows is an essential discontentment with everyday, contemporary life. It is a reflection of the need to re-inject meaning into our existence, to replace what has been lost through the slow decline of religion – meaning, certainty and value – and to replace it with something more radical, egalitarian and liberating. However, whether this is a positive thing is perhaps a question only future historians will be able to answer.

Images reproduced from blog.paperblanks.com, au.thehype.yahoo.com, channel5.com, totalfilm.com and neverwoodhigh.com