A new director for academic freedom and freedom of speech – some relevant topical issues

Today the government has announced that Arif Ahmed, Professor of Philosophy at Cambridge, will be the first director for freedom of speech and academic freedom at the Office for Students, a position created as part of the new Higher Education (Freedom of Speech) Act 2023. I wholeheartedly welcome this appointment; indeed, I can imagine few individuals in UK academia more appropriate for the task. Ahmed has long been a staunch campaigner on these subjects: he gave written evidence to a parliamentary committee examining the bill during its progress, drawing attention in particular to self-censorship on the part of academics in response to a perceived hostile climate for free thought, and he led a partially successful campaign against questionable clauses in new speech policies at the University of Cambridge. The policy originally called for academics to ‘be respectful of the differing opinions of others’; ‘be respectful’ was then modified to ‘tolerate’, but a new description was subsequently added calling for ‘a safe, welcoming and inclusive community which nurtures a culture of mutual respect and courtesy’, as well as certain forms of compulsory diversity training which could amount to political dogma – see this article for more details. Respect may seem a reasonable expectation in a university environment, but Ahmed and others rightly questioned what such an amorphous concept might mean in practice – should views which deny climate change, for example, be treated with ‘respect’ if one believes they are fundamentally flawed and erroneous, and that it is one’s duty to demonstrate why? Such clauses can easily be weaponised to close down lines of argument and debate, which is precisely what proponents of academic freedom oppose. Ahmed was also an important panellist in a key debate in 2022 at the UCL Institute of Education, on academic freedom and gender-critical views, about which I wrote more here.

In an article published in The Times today to coincide with the announcement of his appointment, Ahmed makes powerful arguments on the issues:

A university is not a club. It is not a political lobby. It is not a seminary. It is not a “brand”. It exists to seek and speak truth, whatever it costs and whoever it upsets. Therefore, without freedom to explore controversial or “offensive” ideas, a university is nothing. […]

We settle disputes by discussion, not censorship or violence. Today that idea is fading across our institutions. Universities must defend it. Democracy itself is at stake. New legislation means universities and colleges must promote, and take steps to secure, academic freedom and free speech within the law. The regulator will interpret this broadly. Breaches could include: cancelling a talk on women’s rights due to internal political pressure, or disciplining a lecturer for provocative anti-monarchist tweets. In response to a breach the regulator can issue fines. […]

I will defend free speech within the law for all views and approaches: post-colonial theory as much as gender critical feminism. Free speech for just one side is not free speech at all. Free speech for all sides benefits all sides. This, not censorship, is the only real engine of both scientific discovery and social progress.

Amongst the examples he cites in the article is the International Holocaust Remembrance Alliance’s working definition of antisemitism. Many of the clauses in this are, in my view, definitely indicative of real antisemitism, but I worry about the clause which says ‘Denying the Jewish people their right to self-determination, e.g., by claiming that the existence of a State of Israel is a racist endeavor.’ As for the first part of this: Zionism, in the strict sense of advocacy of a Jewish state, was an ideological view which faced significant opposition amongst some Jewish people prior to World War Two, and even since then a minority have opposed it, or the events which led to the establishment of the state, involving the displacement of a large Palestinian population. Whether or not one believes this to be a legitimate view, I think it would be worrying to prohibit it in academia, and Ahmed is rightly concerned that the working definition might ‘restrict legitimate political speech and protest.’ This example demonstrates that the arguments with respect to academic freedom go beyond those others – relating to decolonisation, Brexit, pro-life views – which some of a more conservative persuasion might be more likely to invoke.

Ahmed’s appointment comes just two days after the highly publicised and unhappy disputes surrounding the Oxford Union’s invitation to Kathleen Stock for an interview and public debate. In 2021, Stock felt forced to resign her professorship at the University of Sussex after a sustained mobbing campaign against her, led by both students and colleagues, on account of her gender-critical views. The affair saw written campaigns from both staff and students, in both support of and opposition to Stock’s right to speak. Inevitably, those motivated to lend their name to either view are a small number relative to the totals at the university, but the earlier votes by a range of college Junior Common Rooms to condemn the Union’s decision suggest the scale of the issue. The Union stuck to its guns, and the Vice-Chancellor, Irene Tracey, helpfully spoke out about Stock’s right to speak, and how at university ‘You’ve got to get used to views that are not going to be absolutely aligned with your own, and ones that you’ll find distasteful.’

But the protests, featuring hundreds of students and others, were not a pleasant sight. None of the protesters was forced to listen to Stock (no Oxford Union event has any type of compulsory attendance); they simply wanted to stop others from being able to. The protests point to a significant cohort who do not accept the validity of this type of free speech and academic freedom. As Stock herself argued in a contribution to BBC Radio 4’s Moral Maze, many of the impulses behind so-called ‘cancel culture’ have been around for some time, but they are exacerbated by the effects of social media, the ethos of student-as-customer in universities, and a wider culture of virtue signalling, creating a more toxic situation. At the Union, Stock argued that some universities were becoming ‘propaganda machines’ for particular points of view. Hopefully this is some rather than most, but I have certainly observed and heard evidence, in a variety of contexts, of what she is describing. This is related, in my view, to the growth of an aspect of postmodern ideology which surrendered classical ideas of rationality and empirical evidence as sources of truth. Helen Pluckrose and James Lindsay, in their book Cynical Theories: How Activist Scholarship Made Everything about Race, Gender, and Identity – and Why This Harms Everybody, argue that this developed into what they describe as a form of ‘reified postmodernism’ in a second wave in the 1980s, whereby claims of the impossibility of objective knowledge were nonetheless married to absolutist axioms relating to structures of power and domination.

In a 2022 article, Stock reminisced about philosophy seminars in the 1990s and the pugilistic exchanges between speakers and faculty questioners which would invariably ensue. That state of affairs, whilst certainly problematic in some respects, might be preferable to a new climate of synthetic positivity and avoidance of difficult questioning, in which disputes are instead displaced to other arenas (social media, complaints or personalised back-chat), all of which can be quite poisonous and show a marked tendency towards the ad hominem fallacies which were previously deemed off-limits in serious philosophical discussion. Earlier this week, en route to Oxford, Stock said to one accompanying her in the car that ‘Philosophy is a very combative discipline. We expose the weak spots in each other’s arguments. Test them to destruction. It can be brutal. But afterwards we all go to the pub.’

I will save wider thoughts on this for a separate forthcoming blog post on scholarly disagreement, but I cite these things to highlight a disturbing tendency among some academics: to eschew proper and rigorous debate on actual issues, and to re-focus it on individual academics and/or their identities, viewed through the absolutist lens of power structures and domination which is characteristic of reified postmodernism, and bolstered by a refusal to engage in the more established means of scholarly debate. Those who disagree are pathologised, even demonised, and viewed as problems ideally to be eliminated from academic life. All aspects of this ideological outlook entail hostility to rational or evidence-based debate, and as such the possibilities for fruitful communication and interaction between those of different scholarly, political, methodological and other views become ever more limited.

I do not see this as a positive development in academia, but it would be rash to suggest there is an easy solution. The most I could suggest for those who are committed to the possibility of at least relative scholarly ‘truth’ which is not wholly reducible to ideology is to continue to promote and propagate work which makes best use of established scholarly criteria and methods, in the hope of convincing others of the cogency of such approaches.

But I believe there are some other fundamental issues at stake here, which I hope Ahmed will address in his new role. Few philosophers need to be convinced of the value of the humanities (or at least I would hope few), but these values are by no means unanimously endorsed, either within academia or amongst those in politics and government making decisions which impact upon academic life. Calls to ‘blow up the humanities’, made most explicitly by the cultural and media studies scholar Toby Miller in a 2012 book, may not always be presented so starkly, but similar underlying ideological positions have contributed to a situation (not new, but perhaps more pressing today than ever) in which many in academia who wish to offer principled opposition to increasing cuts to arts and humanities degrees in the UK (a debate also mirrored in the United States) have found it harder than ever to be heard.

However, wider rhetoric, not least from some politicians who seem intent on driving a wedge between arts and humanities subjects on one side and STEM subjects on the other, is also unproductive in this context. Wider scholarly values – critical thinking, evidence-based research, proper consideration of heterogeneous perspectives on an area of study (including those undertaken from different political perspectives), free and open debate and enquiry – all of which might be viewed as having roots in a very traditional view of the humanities, have relevance and value to practically all scholarly subjects, including STEM. The latter are not just about learning technical skills – to portray them solely as such is, I believe, an idealised fantasy of those without real experience of such disciplines – but involve many critical (and often creative) decisions, engagement with any number of contingencies, competing methods, and advocates for disparate approaches. I do not see a fundamental difference between some of the exacting skills I learned as an undergraduate in mathematics, having to find creative solutions to sometimes fearsome tasks which first required formulation, and those which I employ today when seeking cogent historical or analytical models relevant to particular music, or for that matter when formulating an interpretation of a piece of music I might play.

Other rhetoric from government and politicians of various persuasions relating to higher education tends to emphasise its role in equipping a workforce, stimulating a knowledge economy, or sometimes bolstering the communities around institutions. In absolutely no sense would I wish to deny the value of each of these, nor for that matter dismiss the priorities of students, sometimes from less than privileged backgrounds, who see a university degree primarily as a means to achieve better job opportunities than they would otherwise have. But I do not believe any of these concerns should be used as a means to deny the vital role of universities as centres of independent critical scholarly inquiry. On this aspect, I have argued elsewhere that critical engagement with external practice is not the same as subservience; critical independence by no means implies a malign view of external partners, and the results of scholarly inquiry may be very favourable towards such partners, but it is vital that academics and universities are not simply bound to a secondary position relative to such partners, and are free in principle to challenge them. If critical scholarly inquiry disappears or is marginalised, if courses featuring it prominently are downgraded or abolished, or others are modified to remove such elements, that is one of the biggest threats to academic freedom.

New article in the Weekend Australian Review on issues relating to #notationgate and deskilling

A new cover article in The Weekend Australian Review (Rosemary Neill, ‘Notes on a Scandal: The raging debate over our next generation of composers and musicians: should they be able to read a score?’, 29-30 August 2020) brings to a further readership many of the key issues debated a few years ago as part of #notationgate, and also those of deskilling (see here and here). The article is behind a paywall, but can currently be accessed here by those with a subscription.

Neill speaks at the outset to student composer Dante Clavijo, who surprises some people by saying that he still composes using pen and paper, rather than relying entirely upon digital audio workstations. Clavijo argues that songwriters and composers ‘absolutely benefit from knowing notation; it’s just a logical way to organise musical thought.’ But this then leads to the question of whether even those studying music at tertiary level need to learn notation. On this, Neill quotes my collaborator Peter Tregear:

Yet Peter Tregear, a former head of the ANU’s school of music, points out that these days, students can graduate with music degrees without being able to read music, particularly if they are studying popular music and music technology subjects or degrees, and he is scathing about this trend.

“I find it concerning,” says Tregear, who obtained a PhD in musicology from Cambridge University and has worked at Cambridge, Melbourne and Monash universities. “It’s a misunderstanding of what universities are there to do. We’re meant to be expanding minds and opening horizons. … If you no longer teach musical notation, you effectively wipe out not just a good deal of recent Australian music history, but a large swathe of music history full-stop.”

Tregear presided over the ANU’s music school from 2012 to 2015 and waged a battle to keep several notation-centred subjects in the music degree. He lost.

He attributes the decoupling of music education and traditional notation to the march of new technologies and – more controversially – to a push to “decolonise” the music curriculum, because the classical canon was largely created by “dead white men”.

The outspoken academic, who has also won a Green Room Award for conducting, tells Review: “There has been, I think, a false or at least a very dubious conflation of arguments around the fact that western music notation is western music notation, and the idea that we shouldn’t favour it for that reason.

“To borrow an Orwellian phrase, ignorance is now a strength – it is considered that we’re actually better off not to teach this, which I find an extraordinary view for any higher education institution to take.”

In contrast, most European countries still comprehensively studied their own music histories. Still, even in Europe, there was a push at some conservatoriums and universities to “decolonise” the curriculum.

“There is a move away from musical notation as being central to a music education as a kind of exculpation for western historical wrongs,” he says.

Tregear argues that if a music student is incapable of engaging with music that was “increasingly written down” over the course of 1000 years, “a whole wealth of the global musical past is effectively closed to you”.

Tregear is opposed by composer and University of Melbourne professor Barry Conyngham, who claims that whether his institution’s students ‘can read sheet music or not’, they are ‘very musically capable of conveying musical performances and thoughts.’ But composer Matthew Hindson, of the Sydney Conservatorium of Music, notes that all students there must study music theory and notation.

Other examples are cited, such as Paul McCartney and the Beatles, but Clavijo, like others before him, points out the important contributions of figures such as George Martin, who certainly did have a more traditional and formal musical training. Others claim that any objections to the removal of traditional skills are little more than resistance to ‘decolonisation’.

This article obviously comes from an Australian context, from a country in which (as in the US, and even to some extent the UK) art music traditions have a much less central cultural role than in much of continental Europe, and with fewer living musical traditions developed over centuries or millennia than in various Asian and African countries. But it points to a wider trend by which a mixture of over-elevated claims for certain technology, allied to populist and commercialist attitudes (invariably favouring Western popular musics – the study of non-Western musical traditions is faring no better in this environment, for all the rhetoric of decolonisation), is said to obviate any requirement for more rigorous training.

My online timelines fill up with videos and websites promising to teach people how to compose in a few weeks, without requiring any learning of harmony, use of instruments, and so on. Furthermore, in an interview from two years ago, film composer Hans Zimmer, recently renowned for his slowed-down version of Elgar’s ‘Nimrod’ accompanying the arrival of the pleasure boats rescuing British soldiers in Dunkirk (a film accurately described as fuelling Brexit fantasies), boasts of having ‘no technique’ and ‘no formal education’: instead, ‘the only thing I know how to write about is something that’s inside of me.’ This sort of argument is not new; it was encountered in the nineteenth century amongst a range of Russian composers opposed to the professionalisation of music-making and the establishment of conservatoires for that purpose. Appealing to some sense of inner authenticity, and to the notion that somehow anyone can be a composer so long as they have something ‘inside of them’, has a long and dishonourable history, as was debated extensively in the responses to Stella Duffy posted on this blog in 2017. It speaks to a wider culture of anti-intellectualism and deskilling, in which the only measure of art is commercial and popular success.

I continue to believe that it would be a great loss if those who go on to teach music in primary and secondary schools cannot read music, and thus will be unable to impart it to pupils, or if composition becomes merely about copying and pasting others’ work. This is not to deny the importance throughout musical history of musical borrowing, an issue about which there are a range of sophisticated theoretical models (of which I undertake a critical survey, in order to arrive at models for analysing the work of Michael Finnissy, in my book chapter, ‘Negotiating borrowing, genre and mediation in the piano music of Finnissy: strategies and aesthetics’). A good deal of very superficial writing on postmodernism, intertextuality and so on is founded essentially on a dichotomy between two straw men – an insistence upon absolute originality or total plagiarism – when in reality almost all music of any quality inhabits differing positions on a spectrum. That Bach, Mozart, Beethoven, Rossini, Schubert, Schumann, Chopin, Liszt, Wagner, Verdi, Brahms, Debussy, Stravinsky or any number of others drew upon existing musical forms, genres and styles, and sometimes explicitly borrowed musical materials (for example Liszt’s huge range of ‘transcriptions’ for piano, or Brahms’s many pieces alluding to Renaissance or early Baroque choral music), has never seriously been in doubt to anyone familiar with their work.
Such examples as Stravinsky’s transformation of baroque musical materials into an angular, askew, sometimes dissonant, and alienated musical experience, Finnissy’s transformations of small groups of pitches and rhythms from Sardinian folk song into wild, rampaging musical canvasses, Ives’s hallucinatory and terrifying visions incorporating the residues upon consciousness of mangled hymns, allusions to brass bands, Beethoven and more, Berio’s carefully-judged fragmentations and superimpositions of a wide range of music from nineteenth- and twentieth-century orchestral and other repertoire on top of parallel threads provided by the scherzo from Mahler’s Second Symphony and a text from Beckett’s The Unnamable, to create an unsettling tapestry of commentary and critique, or for that matter Chopin’s use of known dance and other genres (waltz, polonaise, mazurka, etc.) allied to a Bellinian sense of vocal line and an ultra-refined contrapuntal sensibility, are all a world away from music which simply lifts others’ work or hackneyed clichés for ready-made, tried and tested, effects and moods. What distinguishes the above (and many others, including many in non-‘classical’ fields of composition) is a highly developed and refined level of musicianship, including detailed musical understanding of the properties of the sources upon which they draw. These are not achieved easily, and empty claims that anyone can be a composer comparable with the above, without having to go through the training, are no more convincing than equivalent claims about becoming a surgeon.

The UK EU Referendum and the decline of democracy in a time of social media, safe spaces and postmodern relativism

The 2016 UK referendum campaign on EU membership has not been a happy time for democracy, even before the tragic murder of Jo Cox. There have certainly been decent and principled protagonists involved with both the Remain and Leave campaigns, who have drawn upon issues and data to form solid arguments (and I think here the role of Jeremy Corbyn, perhaps the ultimate politician driven solely by issues, has been underestimated). But to a very high degree the campaign has not been like this: it has been saturated with cheap populist pandering, lies and misinformation, conflation of the EU with only tangentially-related issues (such as that of refugees from the Middle East), and above all a type of campaigning which appeals on an emotional rather than rational level, stoking fear, playing to tribal identity (including racism and xenophobia), and crudely dismissing opponents’ positions without proper argument. ‘Experts’ have been summarily dismissed and denigrated, facts have been little appreciated or understood, and the whole campaign has played out in sitting rooms, offices, bars and cafés amongst large numbers of voters who, I would wager, know very little about the actual nature or workings of the EU, the policies and voting records of their democratically elected MEPs, or which of the EU horror stories reported by the tabloid press are fact and which fiction or gross distortion.

This is all a very great shame, as this campaign should have provided an opportunity for a new level of public education about the EU, its history and operations, and indeed about Britain’s relationship with continental Europe as a whole. I realise that it is over-idealistic to expect all or even most of the population to make highly intelligent, rational and educated decisions based on issues rather than personalities, but the referendum campaign has sunk to new lows in this respect.

Many have, not unreasonably, questioned the wisdom of holding a referendum at all on such an issue, in the knowledge that it would likely be determined more by prejudice than by any more mature politics. I have little doubt that it was called because David Cameron needed to temper a split down the middle of the Conservative Party, just as Harold Wilson did in 1975, when his own party and cabinet were deeply split on the same issue. But I am hesitant about saying that referenda on major constitutional issues are wrong; if one accepts the validity of the referenda on devolution (and independence) in Scotland and Wales, for example, it is hard to argue against giving the British people a chance to vote on this.

However, I think we are now living in the worst possible time for such a campaign, and a low point for cynical dismissal of all politicians (at least those who have ever held any power or office) and democratic debate in general. And I wish to suggest a few hypotheses about some factors which have brought about this situation.

The last decade has seen the growth of social media, especially Facebook and Twitter. As a regular user of both, I would be the last person in a position to start arguing that these are bad things, but I do see some major problems they engender. Facebook is ubiquitous, especially amongst younger generations; Twitter is particularly favoured by journalists, media types, many politicians and others, which gives it a different general political complexion. Online communication is not so new – many used online messageboards and chatrooms before either Facebook or Twitter was created – but these more recent sites create a means by which many people’s whole lives are partially spent, and documented, online, to be seen by others, who often provide solace by expressing their approval. But of course, on Facebook in particular, one gets to choose who is in one’s circle (Twitter is much more public, a likely reason why it is used less often by those who simply wish to communicate with their friends). That in itself is not so different from some of the wider world, though offline it is hard to avoid coming into contact with strangers and those who might look at the world in a quite different way, unless one lives a relatively hermetic existence. That is not the case on Facebook; one can inhabit a realm entirely populated by like-minded people. In the face of cyber-bullying (much easier from the safety of a computer screen or smartphone than face-to-face bullying), many increasingly choose to do this. This is more than understandable, but with it comes the problem of creating an ‘echo chamber’, whereby one puts out views and opinions mostly in order to have them echoed by others (at least this can be the result, if not the intention), and gains self-esteem by being regularly ‘liked’.

In itself, this phenomenon might not be so bad, except for when it blinds some to the possibility that the wider world might be quite unlike the comfort zone they inhabit on social media. Worse, it can generate a good deal of in-group/out-group hostility, leading to disdain, dismissal or even hatred towards anyone who breaks with a narrow consensus. This is how group bullying works in general (and mirrors wider prejudice and ostracisation of minority groups), but the relative safety of social media makes the bullying easier for the bullies, and arguably even more devastating for the victims (perhaps especially in the case of Twitter storms against those who have made some careless, ignorant, or mildly bigoted remarks there).

As the new Vice-Chancellor of Oxford, Louise Richardson, recently argued in a radio interview, a new generation of students have grown up spending their formative years within the echo chambers of social media, and these are the ones now demanding trigger warnings, safe spaces and the like (I would extend Richardson’s arguments to include many older adults too). Whilst it is perfectly reasonable for individuals to ask for some protection from hatred, highly personalised attacks, harassment and bullying, I fear many have lost a sense of the distinction between these and proper argument, robust debate, or rational critique (even if severe) of work, when applied fairly (i.e. not applying radically different standards to different work or individuals because of other motivations).

In many ways I do believe that many students and academics are attempting to demand that their working lives resemble the type of pampered realm to which they have become accustomed on social media, or simply from surrounding themselves with crowds of acolytes and other true believers. This is especially detrimental to academia and education in general, which should provide spaces where all types of positions and arguments can be presented and properly debated, and which can militate against easy complacency and unexamined positions. Lecturers should challenge students, students should challenge lecturers, members of each group should regularly challenge each other, and the frameworks of the institutions should ensure that this can happen. Safe spaces and trigger warnings are the very opposite of this, as are highly emotive or rhetorical modes of argument or teaching. Obviously not all students, or lecturers, necessarily have the emotional or intellectual maturity to cope with proper debate and challenge when they start out in these places, but I believe it is imperative that they learn to develop such maturity. Other factors can work against this though; one is the simple narcissism of some students and lecturers, in the latter case countenancing no dissenting viewpoints or literature, and seeking to personally demean or undermine anyone who thinks otherwise; such individuals are invariably extremely poor teachers, rarely interested in learning, only in being adored. Another is the growth of corporate academic culture, by which top-down directives are issued for management, and the wider culture rewards all types of conformity, in flagrant contradiction of the principles of academic freedom. Also, I see many academics organising into narrow factions, only containing those who agree or at least share a range of basic assumptions, with the same techniques of ostracisation of dissenters to be found in social media. 
This is another form of bullying which I have experienced and witnessed far too often.

This may seem a big tangent, from an academic too focused upon the type of environment in which they work. This may be the case (I would mention that I do also inhabit a very different – if equally problematic – realm as a professional musician), but I think when even the most hallowed spaces for free debate and argument are becoming corrupted in this manner, then this bodes very ill for other areas of public life. If those in academic life cannot separate issues and personalities, what are the chances of the wider public being able to do the same?

But the type of ideal democratic debate I have been outlining does require a belief in the very possibility of facts and rational debate; a belief which some who identify as ‘postmodern’ do not hold. In a feature earlier this year on BBC Newsnight, the reporter suggested that US Republicans had been having a ‘postmodern moment’ with the rise of Donald Trump, who ultimately does not care that much about facts, nor makes any real attempt to hide this. It may seem very surprising to link a right-wing demagogue like Trump to postmodernism, and I would hesitate to do so, but I do see reasons why the phenomena may not be unrelated.

In the postmodern realm (about which inevitably I generalise a little), truth says more about the power held by those proclaiming it, ‘subject positions’ (which, as Terry Eagleton has argued, are the nearest contemporary thing to older ideals of ‘authenticity’) matter more than the cogency of arguments presented, ‘facts’ are mostly an illusion, rational debate is little more than an ideological conceit of the privileged, and ultimately arguments are better judged on political allegiance than any supposedly more disinterested criteria. These are the extreme positions, for sure, not all of which are held (or held in such a fundamentalist fashion) by all of those identifying as postmodern, but they are not imaginary. In certain modified forms, I would not disagree that some of these positions have value; some ‘facts’ are somewhat spurious, but have been accepted because certain people have propagated them, whilst certain narrowly ‘rational’ approaches to debate can have a dehumanising effect through the ways in which they are framed (with associated rhetoric, for example that of ‘collateral damage’). But I would challenge these in the name of better conceptions of facts, rationality, and so on, not in order to dismiss the concepts in general. Experts should be challenged, including by political campaigners in a referendum such as this one, but in order that they are required to substantiate and explain their expert views and conclusions, not because anyone else can lay an equal claim to expertise.

As Richard Evans pointed out in his book In Defence of History, when a position appeals purely on the basis of the politics it espouses, there is little if any chance of ever being able to convince someone of a different political persuasion, for that requires some appeal to wider knowledge beyond allegiances. I would say the same applies to appeals to identity; most fatally, the very legitimation of identity as a criterion of political value has ultimately most emboldened the right-wing Leave campaign, enabling it to appeal to a sense of national belonging and identity, with a concomitant fear of and hostility towards foreigners, amongst white working-class and older people (see this pertinent article by John Harris).

Modern democracy is a deeply flawed system in many ways. It has developed in line with the modern Western nation state, and no-one has yet really found a workable system which is not enclosed within the borders of such nation states (ironically, the European Parliament might be one of the better attempts at so doing). The late historian Tony Judt (in interview in the volume Thinking the Twentieth Century) pointed out that with the fall of the Hapsburg monarchy, Jewish people in Austria faced a new threat as a minority within mass democratic society, after having received some degree of protection under Emperor Franz Joseph I. Democracy within a nation state will always be problematic for minority groups within that nation state, for simple numerical reasons, when there is some degree of conflict. And beyond this, it is no easy task to convince an electorate, especially one undergoing difficult economic and other conditions, to factor in the interests of other non-citizens (here including other Europeans, migrants and refugees) when this is presented as being against their own self-interest.

But I believe these problems can at least be mitigated: with a properly operative media representing a genuine plurality of opinion, a high degree of education about the political process and issues at schools, a functioning public sphere (for which a different type of social media could play an important role), and an acceptance that ‘democracy’ is a wider concept than simply putting some Xs in boxes from time to time, and involves a degree of engagement and respect for all types of groups in society. I wish I could say I see this happening in the UK, but I am currently pessimistic. There is a growing level of generalised disenchantment with the political process and politicians in general, declining turnout at elections, especially amongst the young (though the Scottish Referendum was a marked exception), and a wider culture which is increasingly anti-intellectual and even tribal. Unelected and unaccountable celebrities, media personalities and even industry leaders seem to garner more respect than those who regularly submit themselves to electoral ratification.

The writer Edward Bernays, father of modern propaganda and public relations, realised the much greater potency of campaigns which operate on an emotive or atavistic level than of those appealing to rational decisions (Bertolt Brecht would have agreed, but drawn very different conclusions). Bernays’ ideas, and their application in PR, advertising, politics and more, have been explored and chronicled in Adam Curtis’s documentary The Century of the Self. In the process, powerful tools have been developed which feed into an increasingly irrationalist political sphere. Extreme relativists, those cocooned in social media and echo chambers, and many of the advocates of safe spaces, should all consider whether they are playing a part in forfeiting the possibility of any alternative.