Tuesday, 31 October 2017

How the politics of the Left lost its way

Geoffrey M. Hodgson, University of Hertfordshire
One hundred years ago, the Bolsheviks seized power in Russia and set up the first long-lasting Marxist government. The Russian Revolution’s impact was wide-ranging. One important – and overlooked – effect was how it changed the meaning of the term “Left” in political terminology. Following the Bolshevik takeover, the term Left became more strongly associated with collectivism and public ownership.
But originally the term Left meant something quite different. Indeed, collectivism and public ownership are not exclusive to the Left. The word fascism derives from the fasces symbol of Ancient Rome, a bundle of rods containing an axe, which signified collective strength.
Another effect of 1917 was to undermine further the democratic credentials of the Left. These had already been undermined by early socialists such as Robert Owen, who had been opposed to democracy. After Soviet Russia and Mao’s China, part of the Left was linked to totalitarian regimes with human rights abuses, execution without trial, little freedom of expression and arbitrary confiscation of property.

The British Union of Fascists originally used the fasces on its flag. TenaciousFox via Wikimedia Commons, CC BY-SA

The origins of Left and Right

The political terms Left and Right originated in the French Revolution. In 1789, in the National Constituent Assembly, deputies most critical of the monarchy began to gather on the seats to the left of the president’s chair. Conservative supporters of the aristocracy and the monarchy congregated on the right side.
Those on the right wished to maintain the authority of the crown by means of a royal veto, to preserve some rights of the aristocracy, to have an unelected upper house, and to maintain major property and tax qualifications for voting.
Those on the left wished to limit the powers of the monarchy and to create a democratic republic. They demanded an end to aristocratic privileges and limitations to the powers of the church and the state.
Hence Left originally meant liberty, human rights, and equality under the law. It meant opposition to monarchy, aristocracy, theocracy, state monopolies, and other institutionalised privileges. The original Left opposed justifications of authority derived from religion or from noble birth. It supported democracy and private enterprise.

France’s Estates General, the precursor to the National Constituent Assembly. Auguste Couder via Wikimedia Commons

Ostensibly, the Left has always stood for equality. But what does this mean? Does it mean equality under the law? Such equality was explicitly denied by Karl Marx and his followers, who argued that after the revolution the bourgeois class should be denied legal rights. This was put into practice after the revolution in Russia in 1917.
The pursuit of equality is not confined to socialists. Liberals such as Thomas Paine and John Stuart Mill promoted the more equal distribution of income and wealth, as well as equality under the law. Liberals favour markets and private property, partly because they help protect individual autonomy. So can liberals be described as Left? Today’s Left has become so widely associated with public ownership that it would not include radical liberals in its broad movement.
The term Right has also shifted in meaning, from nationalist and traditionalist apologies for the privileges of aristocracy, to greater advocacy of free markets and private ownership, which ironically had been the territory of the original Left of 1789.

Wrong turnings?

The Marxist government in Russia quickly evolved into a one-party state. A regime of purges and terror ensued. I argue in my book Wrong Turnings: How the Left Got Lost that a slide towards totalitarianism is inevitable within Marxism. This is because the Marxist concept of class struggle and its proposal for a proletarian government undermines the notion of universal human rights, developed in the Enlightenment and proclaimed in the French Revolution.

Communism has co-opted the Left. Gustavs Klucis

By the 1970s some on the Left went further, to oppose any export of Western ideas, and to reject any notion that poorer countries deserved to enjoy the same human rights that were promoted in Europe and North America. Proposals to extend these rights or values were seen as apologies for “Western imperialism”. And, in their enthusiasm for “anti-imperialist struggles” many on the Left supported terrorists and religious extremists, including the IRA, Hamas, Hezbollah and the regime in Iran. This is far from the views of the original Left.
Of course, people who consider themselves Left-leaning are not obliged to follow the ideas of the original Left. But it is important to understand how strains of Left thinking have twisted and turned from their original source. And recognise that alternatives are possible – particularly when the language of politics today is so broken. George Orwell wrote in 1946:
One … ought to recognise that the present political chaos is connected with the decay of language, and that one can probably bring about some improvement by starting at the verbal end.
The term Left has gone through major changes of meaning in the last two centuries. With this decay there has been a large degree of chaos. Meanwhile parties on the Left around the world are in crisis as a result of ideological fragmentation. If we are to have progressive change in society we need to first reconfigure the political map and no longer be restricted by what has come to define Left and Right.
Geoffrey M. Hodgson, Research Professor, Hertfordshire Business School, University of Hertfordshire
This article was originally published on The Conversation. Read the original article.

Thor: Ragnarok – the end of the world (but not as we know it)

Sylvie Magerstaedt, University of Hertfordshire
Even before its release in the US market, Thor: Ragnarok, the final instalment in the Marvel trilogy featuring the Norse God of Thunder, topped the international box office with US$107.6m and earned more money in Britain than any previous October release other than a James Bond film. A thunderous launch, then, if you are given to clunky puns.
But beyond being a dose of action-packed entertainment, the film raises a number of questions on how ancient myths are incorporated into contemporary superhero franchises. Thor: Ragnarok’s mix of Nordic myth, apocalyptic visions and popular science fiction fantasies makes it hugely entertaining but also problematic where the use of myth is concerned.
Thor is of course not the only superhero blending ancient myth and comic book characters. Earlier this year, the new Wonder Woman movie showed us how Greek myth can be adapted into the comic universe. As a demigoddess drawn from a range of classical sources, Wonder Woman highlighted the “pick-n-mix” mentality with which myth is being used in superhero films (and their comic book sources).
It is undeniable that modern superheroes and the gods and heroes of mythology have a lot in common. Apart from superhuman powers, they also live by their own moral codes – often outside of human society. When adapted into a comic universe, new rules apply. The focus shifts from largely self-interested desires towards the protection of mankind as a central aim.
Thor, however, is a particularly interesting case as he is drawn from Norse rather than Greek mythology. Unlike Greek mythology, whose key players and stories are widely familiar to audiences not least from numerous small and large screen versions, Norse myths are generally much less well-known. This might make their appropriation easier as fewer people are likely to bemoan potential inaccuracies – but also means that you cannot draw on the audience’s background knowledge in the same way.
In this latest instalment, Thor faces his evil sister Hela (a variation on the goddess Hel) played by Cate Blanchett, who has taken over Asgard – the mythical home of the Norse gods (at least in this simplified version). He fears that she will bring about Ragnarok, the “Twilight of the Gods”. Yet, in the end, it is he who causes rather than prevents it, in order to destroy the evil villainess and protect his people.

Pick'n'mix mythology

The mixing up of different mythic realms is many centuries old – the 13th-century Icelandic historian Snorri Sturluson, the source of much of our knowledge of Norse mythology, conflated Asgard with the mythic city of Troy. Although he later revised this, in Sturluson’s earlier Prose Edda, Thor is the son of Tróán, daughter of the Trojan king Priam. This might go some way towards explaining his reincarnation as a gladiator halfway through the film, his new look more akin to Spartacus, Leonidas and the like than his previous longhaired, Viking-inspired appearance.

Sturluson also wrote at a time when Christianity was gradually taking over as the dominant religion and, as such, his recounting of the old Norse myths is infused with Christian undertones. Interestingly, hints at this replacement of the old myths by a new, Christian world order can also be found in Thor: Ragnarok. When Hela first returns to Asgard, she destroys a ceiling painting that depicts the Norse gods in a distinctly Christian manner, complete with golden halos around their heads. Hela blasts the ceiling away, revealing the images of a much darker past, where Odin is shown as brutal conqueror rather than benevolent ruler.
However, despite the epic title, very little of the film actually deals with this ultimate battle of the gods. While in the myth the whole world goes up in flames, the film confines the destruction to the floating island realm of Asgard. In contradiction of the myth, first Odin and then Thor claim that “Asgard is not a place, it’s a people”. These 21st-century sentiments reframe Ragnarok as a refugee crisis, in which the people of Asgard become a group of migrants in search of a new home.

Tongue-in-cheek Thor?

While myths have always been adapted to make sense of contemporary issues, limiting these to a particular place and a small group of people somewhat jeopardises their universality. Moreover, the modern superhero genre generates mythic heroes that are no longer part of a specific mythic realm. The films are, of course, always linked to other parts of the Marvel Superheroes franchise and the Avenger films provide a chance to bring them all together.

Villainess: Cate Blanchett as Hela. Marvel Studios 2017

But first and foremost, superheroes and heroines are just individuals that battle their own issues – and mythology is used merely as a back story rather than a guiding principle for the film’s narrative. In the end, everything is smothered in irony, robbing the mythical elements of their meaning – myth is reused, recycled, and ultimately reduced to superficial entertainment.
Don’t get me wrong, it’s funny and nostalgic, in particular the nods to 1980s and 1990s film and television. The tongue-in-cheek approach and the New Zealand sets all reminded me of Hercules: The Legendary Journeys (1995-1999), which incidentally also dealt with Ragnarok in its fifth season. The difference is that for Hercules, human beings were always central, and he was happy to let the gods destroy each other if needed.
While Thor also ultimately saves the people of Asgard, the main focus is on the battle between him and Hela, the super-villainess. In these battles between superhuman beings, ordinary people often end up as collateral damage – which rather challenges the notion of superheroes as the protectors of humankind.
Another Marvel film, Captain America: Civil War, raised this central ethical question as part of its story: is it ever justified to sacrifice human beings for the greater good? I’m not sure I can find similar significant issues being explored in Thor: Ragnarok. Or maybe the filmmakers have simply shied away from giving us a clear moral message in the way Hercules used to do. Decide for yourself and enjoy the ride.
Sylvie Magerstaedt, Principal Lecturer in Media Cultures, University of Hertfordshire
This article was originally published on The Conversation. Read the original article.

How long have we believed in vampires?

Sam George, University of Hertfordshire
Vampires have a contested history. Some claim that the creatures are “as old as the world”. But more recent arguments suggest that our belief in vampires and the undead was born in the 18th century, when the first European accounts appear.
We do know that 1732 was the vampire’s annus mirabilis. There were 12 books and four dissertations on the subject published over that year, as well as the term’s first appearance in the English language, according to gothic expert Roger Luckhurst. But archaeological discoveries of deviant burials in Europe in the last few years have unearthed a belief in vampirism and revenants before 1500, much earlier than was previously understood by literary scholars.
The body of a 500-year-old “vampire”, for example, is currently on display in an ancient cemetery in the town of Kamien Pomorski, Poland. The vampire corpse, discovered two years ago, has been reported on widely in the world’s press. Archaeologists have confirmed that it has a stake through its leg (presumably to prevent it from leaving its coffin) and a rock in its mouth (to stop any unfinished blood sucking). Even older deviant burials have been discovered in villages in Bulgaria.

An 800-year-old skeleton found in Bulgaria stabbed through the chest with iron rod. Bin im Garten, CC BY-SA

Meanwhile, the medieval remains of the first English “vampires” have reputedly been found in the Yorkshire village of Wharram Percy. The inhabitants, who fled the village around 1500, showed widespread belief in the undead returning as revenants or reanimated corpses. They fought back against the risk of vampire attacks, in a kind of medieval English zombie apocalypse – an episode that would not be out of place in a scene from The Walking Dead.
So some form of vampire was evidently believed in throughout much of Europe from the medieval period. But the seductive Romantic vampire does not leave his calling card in polite society in London until 1819, when the first fictional vampire, the satanic Lord Ruthven is born in a story by John Polidori. So how did our understanding of vampires transition from dishevelled peasants into alluring Byronic aristocrats? We must return the creature to its beginnings in early folk belief to fully understand its history.

Vampire, vrykolakoi, velku

In the first written accounts of European vampires, the creatures are understood as revenants or returners, often taking the form of a diseased family member who reappears in the unfortunate guise of a vampire. In such tales, “unfinished business”, even something as trivial as the want of clothing or shoes, is enough to make the dead return to the world of the living.
The number of words for “vampire” can frustrate scholars: Krvoijac, vukodlak, wilkolak, varcolac, vurvolak, liderc nadaly, liougat, kullkutha, moroii, strigoi, murony, streghoi, vrykolakoi, upir, dschuma, velku, dlaka, nachzehrer, zaloznye, nosferatu … the list goes on.
The Oxford English Dictionary takes seven pages to define a vampire, but the earliest entry, of 1734, is of most interest here:
These Vampyres are supposed to be the Bodies of deceased Persons, animated by evil Spirits, which come out of the Graves in the Night-time, suck the Blood of many of the Living and thereby destroy them.

Le Vampire, lithograph by R. de Moraine, Les Tribunaux secrets (1864) Wikimedia Commons

There is evidently little appeal or attraction felt for these early revenant figures. Unlike the English aristocratic vampire, modelled on Lord Byron, these early folkloric vampires are peasants and tend to appear en masse, like modern-day zombies.
Agnes Murgoci explored this folk belief further. She argued in 1926 that the journey from death to the afterlife was perilous – in Romanian belief it took 40 days for the soul of the deceased to enter paradise. In some cases, it was thought to linger for years, and during this time there were myriad ways that deceased family members could succumb to vampirism.
It was thought that dying unmarried, unforgiven by one’s parents, through suicide or being murdered could all lead to a person returning as a vampire. Events after death could also have the same effect – beware breezes blowing across corpses before burial, dogs or cats walking over coffins, or leaving a mirror (a soul trap) not turned to the wall at this precarious time.

Entering literary spheres

It was a treatise written in 1746 by the French monk Antoine Augustin Calmet that famously gave British writers access to a number of encounters with vampires. Calmet took inspiration from Joseph Pitton de Tournefort, a botanising man of science, who had earlier claimed to have come face to face with a plague of bloodsucking vampires in Mykonos in 1702. His account was still being read in 1741.
Three decades after Tournefort’s encounter, the London Journal of 1732 reported some enquiries into “vampyres” at Madreyga in Hungary (a story later referred to by John Polidori). Greece and Hungary feature prominently in these early accounts – and this is mirrored in Romantic literature: Lord Byron for example makes Greece the setting of his unfinished vampire story A Fragment (1819).
But it was Polidori who was responsible for the vampire’s English pedigree and its elevation of social rank. There seems never to have been an urban, nor an educated bourgeois bloodsucker prior to The Vampyre (1819). A predatory sexuality is also introduced. We see for the first time the vampire as rake or libertine, a real “lady killer” – a trend that metamorphosed into Bram Stoker’s Dracula and anticipated the arrival of vampire romance in the beautiful undead form of Twilight’s Edward Cullen.

As this all reveals, the history of vampires is a disputed and uncertain one whatever your perspective, scientific or literary. But the “vampire” burials discovered by archaeologists of late do cohere with practices that are known to suggest a belief in vampirism (such as piercing the corpse, nailing down the tongue, putting a needle in the heart and placing small stones and incense in the mouth and under the finger nails to stop blood sucking and clawing). These “vampire” corpses do therefore go some way towards finding out how old our belief in vampires actually is.
But the history of vampires is still impossible to chart with any certainty, and we should probably take heed from British vampirologist Montague Summers (1880-1948) in our search for the lair of the original fiend. He referred to vampires as “citizens of the world”: to him, they existed beyond temporal or geographical boundaries.
Sam George, Senior Lecturer in Literature, University of Hertfordshire
This article was originally published on The Conversation. Read the original article.

Friday, 27 October 2017

Stephen Hawking's PhD thesis crashed its host website – here's what it says in simple terms

James Geach, University of Hertfordshire
The PhD thesis of perhaps the world’s most famous living scientist, Professor Stephen Hawking, was recently made publicly available online. It has proved so popular that the demand to read it reportedly crashed its host website when it was initially uploaded.
But given the complexity of the topic – “Properties of Expanding Universes” – and the fact that Hawking’s book A Brief History of Time is also known as the most unread book of all time, you might benefit from a summary of its main result.
The thesis covers several topics, including recently discovered gravitational radiation, but the final chapter is the part that many physicists consider the most significant. It deals with the birth of the universe itself, and is simply titled “Singularities”.

Creation theories

The major achievement of Hawking’s thesis was to effectively show that the Big Bang theory of how the universe began from a single point was physically possible. It wasn’t just a mathematical nuisance that sprang out of the equations physicists had developed to describe the possible evolution of the cosmos.

Gateway to the stars. Stephen Hawking/University of Cambridge

The concept that the universe started a finite time ago in a Big Bang is now an accepted scientific fact, and yet it remains an astounding idea. Imagine: all the matter in your body was once – in one form or another – compressed into the same tiny volume as the most distant galaxy and everything in between. About 14 billion years ago, this point rapidly expanded to create space and time. It continues to expand today.
At the time of Hawking’s PhD in the 1960s, scientists were still arguing over the idea. A popular alternative to the Big Bang was the Steady State model. Proponents of the Steady State model were uncomfortable with a universe of finite age that began in this way. In fact, the moniker “Big Bang” was coined as a derisive term by Steady State champion Fred Hoyle. To understand how Hawking showed it really was possible, we need some background physics.

Spacetime and singularities

In the early 20th century, Albert Einstein revolutionised our understanding of gravity through his general theory of relativity. Einstein showed that we could think of gravity as the curvature of spacetime, caused by the presence of mass or energy.
Spacetime is a way of thinking about the framework of the universe that combines three-dimensional space and one-dimensional time. All objects exist and all events happen somewhere in spacetime. But it’s hard for most people to imagine because, although we can move freely in three-dimensional space, we cannot travel where we like through time. It’s a bit like being an insect trapped on the surface of a pond. It can only move in two dimensions, despite there being another spatial dimension to explore.
General relativity expresses how space and time are linked. In his theory, Einstein elegantly described how the curvature of spacetime is related to the density of mass and energy in his “field equations”.
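In a standard textbook form (with the cosmological constant term included for completeness), the field equations relate the curvature of spacetime on the left to its mass-energy content on the right:

```latex
% Einstein's field equations: spacetime curvature (left-hand side)
% is sourced by mass-energy (right-hand side).
% G is Newton's constant, c the speed of light, \Lambda the
% cosmological constant.
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```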

The curvature of spacetime due to the presence of objects with mass. ESA–C.Carreau

After these equations were published, other scientists used them to explore what happens to spacetime in different physical situations. In the case of objects where all the matter is concentrated into a single point, the field equations predict something unusual: the curvature of spacetime becomes so extreme that even light cannot escape. Today we know these objects actually exist as black holes, and we’ve since found evidence for them in space.
These situations where the solutions to the equations become infinite are called “singularities”. Hawking’s final thesis chapter explored this idea of singularities, not for the spacetime around black holes, but for the entire universe.

From black holes to the Big Bang

In cosmology, a central tenet is that space must, on average, be homogeneous and isotropic. In other words, over a large scale, the contents of the universe must be pretty evenly distributed and look the same in every direction.
The simplest solution to Einstein’s field equations that satisfies these conditions is called the “Robertson-Walker metric”, named after the scientists involved in its development. The metric is simply the term we use for describing the interval between two events in spacetime.
Importantly, the Robertson-Walker solution allows the spatial part of the metric to change with time. That means it can describe a universe in which space itself is expanding. Edwin Hubble found evidence that the universe really is expanding in the 1920s by showing that other galaxies are moving away from us.
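In the form usually quoted, the Robertson-Walker line element separates time from a spatial part multiplied by a time-dependent scale factor a(t):

```latex
% The Robertson-Walker metric. k = +1, 0, -1 corresponds to closed,
% flat and open spatial geometry; a(t) is the scale factor, which
% grows as the universe expands.
ds^{2} = -c^{2}\,dt^{2} + a(t)^{2}\left[\frac{dr^{2}}{1-kr^{2}}
       + r^{2}\left(d\theta^{2} + \sin^{2}\theta\,d\phi^{2}\right)\right]
```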

Expanding universe. NASA

The Robertson-Walker metric and field equations allow us to describe this expansion in terms of what cosmologists call the “scale factor”, describing how much space has expanded or contracted between a particular point in time and the present day.
If the universe is expanding, it should have been smaller and denser in the past. Run the clock back far enough and the scale factor should go to zero. All the matter and energy in the universe must have been contained in a single point with infinite density: a cosmological singularity. This is the basis of the Big Bang model, a bit like a black hole in reverse.
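Substituting the Robertson-Walker metric into the field equations gives the Friedmann equation, which makes the singularity explicit: as the scale factor a goes to zero, the density ρ diverges.

```latex
% The Friedmann equation: the expansion rate (the Hubble parameter
% \dot{a}/a) in terms of the mean density \rho and spatial curvature k.
% As a \to 0, \rho \to \infty: the cosmological singularity.
\left(\frac{\dot{a}}{a}\right)^{2} = \frac{8\pi G}{3}\,\rho - \frac{kc^{2}}{a^{2}}
```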

Scrapping the Steady State

The Steady State model tried to eliminate the cosmological singularity, which many argued wasn’t plausible. Singularities were seen as shortcomings of the predictions of general relativity and not in line with the known laws of physics.
In the Steady State model, the universe is eternal and doesn’t have a beginning at all. Its apparent expansion can be explained by adding a “creation field” or C-field to Einstein’s equations that would mean matter is continuously created in the space between galaxies as they move apart.
But in the final chapter of his PhD thesis, Hawking argued that the idea of a C-field came with its own set of problems and that the right model involved the Robertson-Walker solution describing an initial singularity.
What he did next was what many consider groundbreaking. Building on the work of fellow British physicist Roger Penrose, Hawking mathematically proved that singularities were not a shortcoming of theory but expected features of nature. He effectively demonstrated that general relativity allowed for a universe that began in a singularity.
Half a century later, the observational evidence for the Big Bang creation scenario is overwhelming and the Steady State model has long been abandoned. Hawking has gone on to make further monumental contributions to cosmology and theoretical physics. Reading Hawking’s thesis is an insight into an exceptionally creative mind – and the first steps of discovery in what has been a remarkable scientific journey.
James Geach, Royal Society University Research Fellow, University of Hertfordshire
This article was originally published on The Conversation. Read the original article.

Tuesday, 24 October 2017

'Dig for Brexit' comments reveal the UK government digging a food security hole for itself

‘Dig For Victory’, first time around on an allotment in London’s Kensington Gardens. Imperial War Museum
David Barling, University of Hertfordshire
According to UK transport secretary Chris Grayling, in the event the UK is unable to agree a trade deal with the European Union it can make up for the loss of tariff-free movement of food across the single European market by importing more food from beyond Europe and by British farmers simply “growing more food”. He makes it sound so easy.
More than 40 years of food supply chain integration has left Britain importing almost 60% of its food from the EU, particularly fruit and vegetables. But Europe is also a key market for Britain’s food exports, notably 96% of British beef and lamb, and crops such as wheat and barley. After Brexit, tariffs will significantly increase the cost of these exports. Will UK products be priced out of these markets? Or will the UK government offset the price increase by compensating producers?
Under the terms of World Trade Organisation agreements, the amounts required to compensate farmers for the tariff levels likely to be applied to meat exports could be met under existing WTO rules for agricultural support. But negotiating a trade deal with the EU would be a much better outcome.
Scenarios developed by the Agriculture and Horticulture Development Board, which provides research, development and advice to the farming sector, indicate that only the highest-performing 25% of farms will remain profitable. If the government doesn’t step in with support at levels similar to those offered by the EU’s Common Agricultural Policy, the impact of no trade deal will drive smaller farmers out of business. Similarly, faced with the loss of the immigrant labour from the continent required to harvest and process produce, the fruit and vegetable sector would be devastated.

The devil is in the (lack of) detail

The current secretary of state for the Department for Environment, Food and Rural Affairs (Defra), Michael Gove, has made more emollient statements regarding post-Brexit support for British farming, including maintaining incentives for farmers to improve their ecosystems while seeking to maintain high quality produce and improve productivity. However, the lack of detail means that it’s impossible to know the focus and extent of such support to British farmers and producers.
The UK measures its ability to home-grow its food through the food production to supply ratio, a measure of food’s traded value calculated as the farm-gate value of raw food production divided by the value of raw food for human consumption. Defra’s statisticians calculate that this is the closest way to match food production to what consumers actually eat, much of which is manufactured food composed of different raw ingredients combined and sold at a higher price.
By this value-based measure, 61% of the food eaten in the UK in 2015 was produced in the UK, rising to 76% among indigenous foodstuffs. The rest was imported. To increase the amount of UK-produced food to imported food would require government support, not just in compensation to farmers but in terms of a clear and detailed food policy strategy.
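Expressed symbolically (this simply restates the verbal definition above; the precise statistical definition is Defra’s), the ratio is:

```latex
% Food production to supply ratio (value-based). The 2015 figure
% quoted above corresponds to a ratio of about 61%.
\text{ratio} = \frac{\text{farm-gate value of raw food production}}
                    {\text{value of raw food for human consumption}}
               \times 100\%
```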

British propaganda from World War II. UK Government/andymag, CC BY

Food supply is complex, partly due to much of what we eat being assemblages of different food ingredients sourced from different locations across the world. These might be processed and manufactured foods with numerous ingredients, or mass-produced meats such as chicken and pork that rely on imported feed, and there is no guarantee that these will be easily or cheaply available in the future.
Outside the EU, international market competition from big players such as China will continue to affect the UK’s ability to easily and cheaply acquire other key industrial farming necessities, such as soya-based animal feed, phosphorus fertilisers or plant oils. Privately, large food manufacturers are considering these risks and to what extent they can substitute key ingredients if necessary.
These uncertainties and vulnerabilities point to the desperate need for a clear vision of how to provide food for Britain after Brexit. To what extent should the country be able to feed itself, and how can this be measured?
Measuring food by its economic value ignores the current problems of calories and nutrients, and the expanding problem of obesity and its costs to society and the health service. The FAO defines “food security” not just as access to sufficient food, but as access to “safe and nutritious food that meets their dietary needs and food preferences for an active and healthy life”.
Using the economic value of food as a basis for measuring food security ignores the fact that we eat calories and nutrients, not economic value. Using a dietary-based approach to measure the UK’s ability to feed itself would provide a far more useful measure of its food security and supply.
A broader vision for a UK food policy is needed that embraces healthy production and consumption, while protecting the natural resources and heritage upon which farming depends. So far it seems the government has little grasp of the important matters on which more detail is needed if the UK’s food supply chains are to feed a nation.
David Barling, Professor of Food Policy and Security, Director of the Centre for Agriculture, Food and Environmental Management, University of Hertfordshire
This article was originally published on The Conversation. Read the original article.

Tuesday, 10 October 2017

The Invisible Problem: mental health and how to help yours


I was in my second year when it started. My friends were visiting from my hometown, and they and my housemates were getting along famously! We had plans for the whole weekend: going out drinking, trying the new local breakfast place and marathoning as many movies as possible. Sounds exciting, right? It should have been. Instead, the thought of even going outside, let alone doing any of those things, filled me with dread. I had been nagging my friends to visit me for a whole year, and when they finally arrived I just wanted them to leave. I wanted to sit alone in my room and not see or speak to anyone, ever again. Instead I pushed these feelings deep down, did all those things with my friends, had a horrible time and steadily continued to ignore the way I was feeling for a further three months. It wasn't until I lost all my friends, rejected my family and had real problems at university that I realised something needed to be done. Nobody should ever have to get to that point.

Starting university can be daunting, with a new place, new classmates and new teachers.
Mental health issues are extremely common in students, with anxiety, depression, bipolar disorder, OCD, eating disorders and stress-related problems being the most commonly reported.
Rethink Mental Illness, a mental health charity, states that "one in ten young people will experience a mental health problem." So it's really important to talk to someone as soon as possible if you are worried about your own mental health. The main thing to remember is that you are not alone, and there is help available. The sooner you get support, the better; things will only get worse if you bottle them up by yourself. Also, look after your friends. If they don't seem themselves and you think they may be struggling, ask them how they are and let them know you are there for them and will listen without judgement.

The stigma surrounding mental health issues is not helped by the different myths that people believe. Knowing some facts about mental health is a step forward in challenging negative and discriminative thoughts. 
Myth: Mental health problems are very rare.
Fact: 1 in 4 people will experience a mental health problem in any given year.
Myth: People with mental illness aren't able to work.
Fact: We probably all work with someone experiencing a mental health problem.
Myth: People with mental illness are usually violent and unpredictable.
Fact: People with a mental illness are more likely to be a victim of violence.
Myth: It's easy for young people to talk to friends about their feelings.
Fact: Nearly three in four young people fear the reactions of friends when they talk about their mental health problems. 
Statistics show that over 50% of students don't feel comfortable admitting to someone else that they're not coping. Rethink shared some advice on what first year students can do if they feel their mental health is suffering: talk to a trusted friend or relative, or to your GP. Many universities now have mental health support, so ask what's available.
At the University of Hertfordshire, we have a high quality Wellbeing Service offering sessions with qualified and experienced counsellors who are accustomed to helping young people. The service is confidential. To access it, email studentwellbeing@herts.ac.uk for one of their four daily appointments or drop by to make an appointment. They are based in the Hutton Hub on the College Lane campus, and you can go there to speak to somebody if you need advice or guidance.

They can also provide a range of tried and tested self-help programmes and activities to improve your wellbeing. If your condition is long-term (over a year) and you have the backing of your doctor or another health professional, they can provide a Study Needs Agreement to ensure you get study support geared to your individual condition. Don't feel that going there is a sign of weakness. Knowing you need help and getting it is a sign of strength, and the first step to getting better.
Talking about mental illness is one of the best ways not only to reduce the negativity that surrounds it, but to beat it. A former UH student, who wishes to remain anonymous, spoke to me about their experience with mental health during their time at university.
"Mental health is such a difficult thing to talk about. I did speak to my friend about it once, but she just said that I was feeling sad because I was stressed, and because she's my friend I kind of believed her. And because I didn't want to believe it was anything else. This feeling lasted for months and months though. I only sorted myself out because my parents found out. Over the Easter holidays I went home and tried to convince them that I shouldn't go back to uni. They knew something was wrong then."
"I have learnt though that it can take a while, but with help you can get there. I still have depression, but I receive help for it now. Times can still be pretty trying, but learning how to deal with my depression, and that I'm not alone, has taught me that it's okay to not be okay sometimes."

Student wellbeing team: 
Email: studentwellbeing@herts.ac.uk | Tel: +44 (0)1707 284453
Twitter @WellbeingSvcs
Rethink Mental Health: www.rethink.org
Article by Callie Watling. Callie is a Media Communications graduate. She has a passion for writing and aspires to have a career in journalism.

In the run up to, and on World Mental Health Day on the 10 October, the University of Hertfordshire is publishing a series of blog posts by alumni on how to manage your mental health at university. The blog posts in this series are also available in the magazine Fresh Start, which was distributed to Humanities students at the start of term.

Monday, 9 October 2017

Could the University's peer mentoring scheme help you?

I can still recall the first day I arrived at the University of Hertfordshire. I was a bundle of nervous and excited energy as I walked into Watton Hall and saw my room for the first time. Once I had bid farewell to my parents, who had left me to unpack my life’s belongings into one small room, I began setting up my new home. Left alone with my thoughts as I unpacked, an overwhelming feeling of anxiousness crept over me. What if my flatmates didn’t like me? What if I didn’t make any friends on my course? What if I didn’t make any friends at all? Despite my reservations about my ability to morph into a social butterfly, I skipped to my first lecture confident that I would at least excel in my studies. Or so I thought.

Instead I spent countless hours over the next few weeks with my head in my hands, despairing over what the phrase ‘critically analyse’ meant. Around me the library buzzed with the sound of students furiously typing, while I desperately tried to summon words to the blank page before me. As a self-proclaimed introvert, I obsessively worried over the social side of University, but it never occurred to me that I might find the academic side just as difficult.

I am not the only one to have doubts. According to a study by Aston University, many students entering their first year of university find the academic shift between studying at school/college and university challenging. With an increasing number of students struggling to adjust, universities are looking at ways to make the transition easier for them. To help combat this issue, the University of Hertfordshire has implemented a peer mentoring scheme.

What is peer mentoring?

The University of Hertfordshire’s peer mentoring scheme offers Humanities students who need support the chance for one-to-one mentoring with a fellow student. Mentors are typically second or third year students who act in an advisory capacity, helping mentees make use of their existing skills and supporting them in developing new ones.

“Students tend to feel more comfortable speaking with a fellow student rather than a lecturer,” says Graça Martins, the University of Hertfordshire’s Student Engagement and Peer Mentoring Coordinator, and it’s easy to understand why. Student mentors are more likely to have experienced similar struggles to their mentees and to have found ways to cope with the demands of university life.

Despite helping students achieve higher grades and adjust to University life, there are still students who shy away from joining the scheme. Graça tells us that we all need support every now and again.

How does it work?

Think of mentoring as a tool: just like a book from a library, it has the potential to equip you with the skills required to succeed at university. Students are given the opportunity to sign up to the scheme as a mentee, or, if they are in their second or third year, as a mentor. All mentors are required to meet certain criteria and complete training sessions before they are matched with a mentee.

As a mentee, the ball is in your court. You are given the opportunity to pick your preferred mentor from a list that details the course and interests of potential mentors. Based on your needs, you can choose who will be best suited to support you.

Not only does mentoring help students with academic study, it also improves their communication skills. It’s a two-way street. In order for mentees and mentors to get the most from the experience, there needs to be a mutual agreement about what their needs are. A mentor can’t help you, if you don’t tell them what you need guidance on.

Sabahat Malik joined the peer mentoring scheme in her first year, after struggling with referencing. “I had a really good relationship with my mentor,” says Sabahat. “They not only told me where I had gone wrong, they showed me and gave me advice on how to improve.”

Everyone’s needs are different and mentees can choose how frequently they wish to work with their mentor. “I worked with my mentor more frequently when deadlines were looming. They looked over my essay structure, referencing and proof-read my work,” she adds. Sabahat, now a final year student, is using her experiences as a mentee to help others and has become a mentor on the scheme.

Case Study

Meet Alexandra Delasalle, a Primary School Education Student, who joined the peer mentoring scheme in her first year at the University of Hertfordshire.

“During my first year, I struggled with the jump from A-Level to University. The assignments were a lot different from those that I had completed at A-Level and I found it difficult to understand what was being asked of me.

“I chose a mentor who also studied Primary School Education and they helped me understand that I had all of the information, I just needed to approach the assignment in a different way. With their help, I learned what was needed to meet the criteria for my assignments and achieve a good grade.

“Not only did my mentor help me with my assignments, they also answered any questions I had about placements. It was reassuring to know that I had someone I could go to with questions.

“Mentoring completely transformed my grades – I went from getting 2:2s in my first year to achieving firsts and high 2:1s in my final year assignments.

“I would definitely recommend joining the peer mentoring programme, especially if there are aspects of your assignments, or just university life in general, that you are uncertain of. It helps you understand what is required at university level and equips you with the tools you need to meet those requirements.”

The University of Hertfordshire’s peer mentoring scheme is a great way for students to access support and guidance, especially if they feel uncomfortable talking to their lecturer. Sadly, it was not available during my first year. If it had been, there is no doubt in my mind that a mentor would have saved me from a lot of sleepless nights and countless hours despairing in the library.

If you would like to find out more about peer mentoring or would like to sign up to the scheme, please contact Graça Martins: g.m.martins@herts.ac.uk

Article by Katie Lonslow. Katie is an English Literature BA(Hons) graduate, and recently a Journalism and Media Communications (MA) graduate, who is ready to take on the world of Communications and PR. 

In the run up to, and on World Mental Health Day on the 10 October, the University of Hertfordshire is publishing a series of blog posts by alumni on how to manage your mental health at university. The blog posts in this series are also available in the magazine Fresh Start, which was distributed to Humanities students at the start of term.

Wednesday, 4 October 2017

At last, young people's voices are being heard about the future of the NHS

Making their mark: the NHS England Youth Forum. University of Hertfordshire, Author provided
Lisa Whiting, University of Hertfordshire; Gary Meager, University of Hertfordshire; Julia Petty, University of Hertfordshire, and Sheila Roberts, University of Hertfordshire
'For me, [being part of the NHS forum] was like being introduced to a whole new world. I wasn’t aware that young people could be offered opportunities like that, to actually talk to key decision makers and get people from really important organisations wanting to come and talk to us … It’s helped me with my communication skills … it’s taught me how to speak properly and confidently'.
This was Georgia talking about her involvement in the NHS England Youth Forum (NHSEYF) in 2016. The forum aims to improve health services for young people and to give them a voice on the health issues that matter most to them.
A team from the University of Hertfordshire carried out an examination of the work of this forum. We found that the young people were highly motivated and committed to being involved in decision-making about NHS services. They found contributing to society through this forum a valuable opportunity and welcomed having their voices heard.
What emerged from our interviews was how committed young people are to the future of the NHS. Here’s Josh:
'It’s a major concern for me about the NHS … and I want to improve it, I want to give back … After being elected as young mayor in our local area … we get lots of opportunities about how we can contribute back to society and one of them was the NHS Youth Forum … I saw it and I thought what a brilliant opportunity that would be to kind of get my voice heard, obviously as a service user but also as someone who represents young people locally. It was a brilliant opportunity'.
Georgia, who we heard from earlier, had another, more personal reason for being committed to having a say in the running of the NHS:
'The reasons behind why I wanted to join were more personal … I was quite passionate about mental health because my [relative] suffers from schizophrenia'.
It is important to listen to young people about services that directly affect them. In the UK, the idea of youth forums is now well recognised. There are more than 620 youth councils and forums in existence aiming to give young people the opportunity to be involved in decision-making in their local communities. One example is the High Trees Community Development Trust which focuses on social issues that affect young people and provides training and support so that they can feel confident to participate in the decision-making process.

What is the NHS England Youth Forum?

The NHSEYF was established in 2014 to allow young people to participate in decision-making about the NHS. The aim was to give young people the opportunity to have a voice and “to contribute to improving and developing services for young people”.
There are 25 members of the NHSEYF, aged between 11 and 25. Publicity snowballed with the introduction of their own website, Facebook page and Twitter feed. Following the establishment of the NHSEYF, a number of other local forums for children and young people have developed within local hospitals and other areas across England, Scotland, Wales and Northern Ireland.

Getting involved

We found that NHSEYF members were involved in an extensive range of activities and commitments at local level – including hospital committee membership, local youth forum events and seminars as well as high-profile national events such as the National Children’s Inpatient Survey, national conferences and attendance at the NHS Citizen’s Assembly.
Attending these events raised the profile of children and young people’s needs and allowed the NHSEYF’s members to be active in consultancy-type roles. Our interviews with participants provided clear evidence that the young people were highly motivated and committed to giving their own time to ensure the youth voice was heard and represented.
The young people play a pivotal role within NHS England, and their knowledge of their home communities enables them to network with professionals and peers within local and national government arenas in order to influence and get involved in decisions about children and young people’s care needs. Evidence from the data collected suggests that the personal growth and development of the young people involved has also contributed to the success of the NHSEYF.

Measuring impact

Our evaluation of the NHSEYF clearly demonstrates the impact of the voice of young people. The Youth Forum Wheel (below) was developed to highlight key areas of importance, as a model that can be applied elsewhere.

The YFW is offered as a model that has the potential to underpin the development of other youth forums, both within and outside of a health context. University of Hertfordshire, Author provided

It’s important that central and local government measure improvement outcomes for people’s health and/or lifestyles by listening to their views directly rather than focusing on statistics or figures. There is also a growing emphasis on services actively involving children, young people and parents and/or carers in the commissioning, development and evaluation of services.
There is a need for ongoing research and funding to ensure that this youth forum model is widely recognised and extended. At the heart of this is recognising the commitment, motivation and enthusiasm shown by these young people in positively influencing service provision for children and young people. As one of our interview subjects concluded:
'I think the key point is showing adults that young people want to have their voices heard … yes, the NHS England Youth Forum has done its job because health professionals were coming to speak to us and saying: "Oh, how do we engage with people?"'
It is about time we listened to the young people who will determine the future health of the country and take their views seriously. The NHS England Youth Forum aims to do just that.

Youth Forum members’ names have been changed in line with the ethics requirements of the project.
Lisa Whiting, Principal Lecturer and Professional Lead, Children's Nursing, University of Hertfordshire; Gary Meager, Lecturer in Children's Nursing & Children's Community Rapid Response Nurse Practitioner, University of Hertfordshire; Julia Petty, Senior Lecturer in Children's Nursing, University of Hertfordshire, and Sheila Roberts, Senior Lecturer, Children's Nursing, University of Hertfordshire
This article was originally published on The Conversation. Read the original article.