New article in the Weekend Australian Review on issues relating to #notationgate and deskilling

A new cover article, Rosemary Neill, ‘Notes on a Scandal: The raging debate over our next generation of composers and musicians: should they be able to read a score?’, Weekend Australian Review, 29-30 August 2020, brings to a wider readership many of the key issues debated a few years ago as part of #notationgate, and also those relating to deskilling (see here and here). The article is behind a paywall, but can currently be accessed here by those with a subscription.

Neill speaks at the outset to student composer Dante Clavijo, who surprises some by saying that he still composes using pen and paper, rather than relying entirely upon digital audio workstations. Clavijo argues that songwriters and composers ‘absolutely benefit from knowing notation; it’s just a logical way to organise musical thought.’ But this leads to the question of whether even those studying music at tertiary level need to learn notation. On this, Neill quotes my collaborator Peter Tregear:

Yet Peter Tregear, a former head of the ANU’s school of music, points out that these days, students can graduate with music degrees without being able to read music, particularly if they are studying popular music and music technology subjects or degrees, and he is scathing about this trend.

“I find it concerning,” says Tregear, who obtained a PhD in musicology from Cambridge University and has worked at Cambridge, Melbourne and Monash universities. “It’s a misunderstanding of what universities are there to do. We’re meant to be expanding minds and opening horizons. … If you no longer teach musical notation, you effectively wipe out not just a good deal of recent Australian music history, but a large swathe of music history full-stop.”

Tregear presided over the ANU’s music school from 2012 to 2015 and waged a battle to keep several notation-centred subjects in the music degree. He lost.

He attributes the decoupling of music education and traditional notation to the march of new technologies and – more controversially – to a push to “decolonise” the music curriculum, because the classical canon was largely created by “dead white men”.

The outspoken academic, who has also won a Green Room Award for conducting, tells Review: “There has been, I think, a false or at least a very dubious conflation of arguments around the fact that western music notation is western music notation, and the idea that we shouldn’t favour it for that reason.

“To borrow an Orwellian phrase, ignorance is now a strength – it is considered that we’re actually better off not to teach this, which I find an extraordinary view for any higher education institution to take.”

In contrast, most European countries still comprehensively studied their own music histories. Still, even in Europe, there was a push at some conservatoriums and universities to “decolonise” the curriculum.

“There is a move away from musical notation as being central to a music education as a kind of exculpation for western historical wrongs,” he says.

Tregear argues that if a music student is incapable of engaging with music that was “increasingly written down” over the course of 1000 years, “a whole wealth of the global musical past is effectively closed to you”.

Tregear is opposed by composer and University of Melbourne professor Barry Conyngham, who claims that whether his institution’s students ‘can read sheet music or not’, they are ‘very musically capable of conveying musical performances and thoughts.’ But composer Matthew Hindson, of the Sydney Conservatorium of Music, notes that all students there must study music theory and notation.

Other examples are cited, such as Paul McCartney and the Beatles, but Clavijo, like others before him, points out the important contributions of figures such as George Martin, who certainly did have a more traditional and formal musical training. Others claim that any objections to the removal of traditional skills are little more than resistance to ‘decolonisation’.

This article obviously comes from an Australian context, from a country in which (as in the US, and even to some extent the UK) art music traditions have a much less central cultural role than in much of continental Europe, and with fewer living musical traditions developed over centuries or millennia, as in various Asian and African countries. But it points to a wider trend by which a mixture of over-elevated claims for certain technology, allied to populist and commercialist attitudes (invariably favouring Western popular musics; the study of non-Western musical traditions is faring no better in this environment, for all the rhetoric of decolonisation), is said to obviate any requirement for more rigorous training.

My online timelines fill up with videos and websites promising to teach people how to compose in a few weeks, without requiring any learning of harmony, use of instruments, and so on. Furthermore, in an interview from two years ago, film composer Hans Zimmer (recently renowned for his slowed-down version of Elgar’s ‘Nimrod’, used to accompany the arrival of the pleasure boats rescuing British soldiers in Dunkirk, a film accurately described as fuelling Brexit fantasies) boasts of having ‘no technique’ and ‘no formal education’, saying instead that ‘the only thing I know how to write about is something that’s inside of me.’ This sort of argument is not new; it was encountered in the nineteenth century amongst a range of Russian composers opposed to the professionalisation of music-making and the establishment of conservatoires for this purpose. Appealing to some sense of inner authenticity, and to the notion that somehow anyone can be a composer so long as they have something ‘inside of them’, has a long and dishonourable history, as was debated extensively in the responses to Stella Duffy posted on this blog in 2017. It speaks to a wider culture of anti-intellectualism and deskilling, in which the only measure of art is commercial and popular success.

I continue to believe that it would be a great loss if those who go on to teach music in primary and secondary schools cannot read music, and thus will be unable to impart it to pupils, or if composition becomes merely a matter of copying and pasting others’ work. This is not to deny the importance throughout musical history of musical borrowing, an issue about which there is a range of sophisticated theoretical models (of which I undertake a critical survey, in order to arrive at models for analysing the work of Michael Finnissy, in my book chapter, ‘Negotiating borrowing, genre and mediation in the piano music of Finnissy: strategies and aesthetics’).

A good deal of very superficial writing on postmodernism, intertextuality and so on is founded essentially upon a dichotomy between two straw men, an insistence upon absolute originality or total plagiarism, when in reality almost all music of any quality inhabits differing positions on a spectrum between these poles. That Bach, Mozart, Beethoven, Rossini, Schubert, Schumann, Chopin, Liszt, Wagner, Verdi, Brahms, Debussy, Stravinsky or any number of others drew upon existing musical forms, genres and styles, and sometimes explicitly borrowed musical materials (for example in Liszt’s huge range of ‘transcriptions’ for piano, or Brahms’s many pieces alluding to Renaissance or early Baroque choral music), has never seriously been in doubt to anyone familiar with their work.

Such examples as Stravinsky’s transformation of baroque musical materials into an angular, askew, sometimes dissonant and alienated musical experience; Finnissy’s transformations of small groups of pitches and rhythms from Sardinian folk song into wild, rampaging musical canvases; Ives’s hallucinatory and terrifying visions, incorporating the residues upon consciousness of mangled hymns, allusions to brass bands, Beethoven and more; Berio’s carefully judged fragmentations and superimpositions of a wide range of nineteenth- and twentieth-century orchestral and other repertoire on top of the parallel threads provided by the scherzo from Mahler’s Second Symphony and a text from Beckett’s The Unnamable, creating an unsettling tapestry of commentary and critique; or, for that matter, Chopin’s use of known dance and other genres (waltz, polonaise, mazurka, etc.), allied to a Bellinian sense of vocal line and an ultra-refined contrapuntal sensibility: all of these are a world away from music which simply lifts others’ work, or hackneyed clichés, for ready-made, tried-and-tested effects and moods. What distinguishes the above (and many others, including many in non-‘classical’ fields of composition) is a highly developed and refined level of musicianship, including detailed musical understanding of the properties of the sources upon which they draw. Such things are not achieved easily, and empty claims that anyone can be a composer comparable with the above, without having to go through the training, are no more convincing than equivalent claims about becoming a surgeon.


The UK EU Referendum and the decline of democracy in a time of social media, safe spaces and postmodern relativism

The 2016 UK referendum campaign on EU membership has not been a happy time for democracy, even before the tragic murder of Jo Cox. There have certainly been decent and principled protagonists involved with both the Remain and Leave campaigns who have drawn upon issues and data to form solid arguments (and I think here the role of Jeremy Corbyn, perhaps the ultimate politician driven solely by issues, has been underestimated). But to a very high degree, the campaign has not been like this: it has been saturated with cheap populist pandering, lies and misinformation, conflation of the EU with only tangentially-related issues (such as that of refugees from the Middle East), and above all, a type of campaigning which appeals on an emotional rather than rational level, by stoking fear, playing to tribal identity (including racism and xenophobia), crudely dismissing opponents’ positions without proper argument, and so on. ‘Experts’ have been summarily dismissed and denigrated, facts have been little appreciated or understood, and the whole campaign has played out in sitting rooms, offices, bars and cafés amongst large numbers of voters who, I would wager, know very little about the actual nature or workings of the EU, the policies and voting records of their democratically elected MEPs, or which of the EU horror stories reported by the tabloid press are fact, and which fiction or gross distortion.

This is all a very great shame, as the campaign should have provided an opportunity for a new level of public education about the EU, its history and operations, and indeed about Britain’s relationship to continental Europe as a whole. I realise that it is over-idealistic to expect all or even most of the population to make highly intelligent, rational and educated decisions based on issues rather than personalities, but the referendum campaign has sunk to new lows in this respect.

Many have, not unreasonably, questioned the wisdom of holding a referendum at all on such an issue, in the knowledge that it would likely be determined more by prejudice than by any more mature politics. I have little doubt that it was called because David Cameron needed to temper a split down the middle of the Conservative Party, just as Harold Wilson did in 1975, when his own party and cabinet were deeply divided on the same issue. But I am hesitant to say that referenda on major constitutional issues are wrong; if one accepts the validity of the referenda on devolution (and independence) in Scotland and Wales, for example, it is hard to argue against giving the British people a chance to vote on this.

However, I think we are now living through the worst possible time for such a campaign: a low point of cynical dismissal of all politicians (at least those who have ever held any power or office) and of democratic debate in general. I wish to suggest a few hypotheses about some of the factors which have brought about this situation.

The last decade has seen the growth of social media, especially Facebook and Twitter. As a regular user of both, I would be the last person in a position to start arguing that these are bad things, but I do see some major problems they engender. Facebook is ubiquitous, especially amongst younger generations; Twitter is particularly favoured by journalists, media types, many politicians, and others, which gives it a different general political complexion. Online communication is not so new – many used online messageboards and chatrooms before either Facebook or Twitter was created – but these more recent sites create a means by which a large part of many people’s lives is spent, and documented, online, to be seen by others, who often provide solace by expressing their approval. But of course, on Facebook in particular, one gets to choose who is in one’s circle (Twitter is much more public, a likely reason why it is used less often by those who simply wish to communicate with their friends). That in itself is not so different from the wider world, though there it is hard to avoid coming into contact with strangers and those who might look at the world in a quite different way, unless one lives a relatively hermetic existence. That is not the case on Facebook, where one can inhabit a realm entirely populated by like-minded people. In the face of cyber-bullying (much easier from the safety of a computer screen or smartphone than face-to-face bullying), many increasingly choose to do this. This is more than understandable, but with it comes the problem of creating an ‘echo chamber’, whereby one puts out views and opinions mostly in order to have them echoed by others (at least this can be the result, if not the intention), and gains self-esteem by being regularly ‘liked’.

In itself, this phenomenon might not be so bad, except when it blinds some to the possibility that the wider world might be quite unlike the comfort zone they inhabit on social media. Worse, it can generate a good deal of in-group/out-group hostility, leading to disdain, dismissal or even hatred towards anyone who breaks with a narrow consensus. This is how group bullying works in general (and mirrors wider prejudice against, and ostracisation of, minority groups), but the relative safety of social media makes the bullying easier for the bullies, and arguably even more devastating for the victims (perhaps especially in the case of Twitter storms against those who have made some careless, ignorant, or mildly bigoted remarks there).

As the new Vice-Chancellor of Oxford, Louise Richardson, recently argued in a radio interview, a new generation of students have grown up spending their formative years within the echo chambers of social media, and these are the ones now demanding trigger warnings, safe spaces and the like (I would extend Richardson’s arguments to include many older adults too). Whilst it is perfectly reasonable for individuals to ask for some protection from hatred, highly personalised attacks, harassment and bullying, I fear many have lost a sense of the distinction between these and proper argument and robust debate, or rational critique (even if severe) of work, when applied fairly (i.e. not applying radically different standards to different work or individuals because of other motivations).

In many ways I do believe that many students and academics are attempting to demand that their working lives resemble the type of pampered realm to which they have become accustomed on social media, or simply from surrounding themselves with crowds of acolytes and other true believers. This is especially detrimental to academia and education in general, which should provide spaces where all types of positions and arguments can be presented and properly debated, and which can militate against easy complacency and unexamined positions. Lecturers should challenge students, students should challenge lecturers, members of each group should regularly challenge each other, and the frameworks of the institutions should ensure that this can happen. Safe spaces and trigger warnings are the very opposite of this, as are highly emotive or rhetorical modes of argument or teaching. Obviously not all students, or lecturers, necessarily have the emotional or intellectual maturity to cope with proper debate and challenge when they start out in these places, but I believe it is imperative that they learn to develop such maturity.

Other factors work against this, though. One is the simple narcissism of some students and lecturers, in the latter case countenancing no dissenting viewpoints or literature, and seeking personally to demean or undermine anyone who thinks otherwise; such individuals are invariably extremely poor teachers, rarely interested in learning, only in being adored. Another is the growth of corporate academic culture, by which top-down directives are issued by management, and the wider culture rewards all types of conformity, in flagrant contradiction of the principles of academic freedom. I also see many academics organising into narrow factions, containing only those who agree or at least share a range of basic assumptions, with the same techniques of ostracisation of dissenters to be found on social media. This is another form of bullying which I have experienced and witnessed far too often.

This may seem a big tangent, coming from an academic too focused upon the type of environment in which they work. That may be so (I would mention that I also inhabit a very different, if equally problematic, realm as a professional musician), but I think that when even the most hallowed spaces for free debate and argument are becoming corrupted in this manner, it bodes very ill for other areas of public life. If those in academic life cannot separate issues and personalities, what are the chances of the wider public being able to do the same?

But the type of ideal democratic debate I have been outlining does require a belief in the very possibility of facts and rational debate, a belief which some who identify as ‘postmodern’ do not hold. In a feature earlier this year on BBC Newsnight, the reporter suggested that US Republicans had been having a ‘postmodern moment’ with the rise of Donald Trump, who ultimately does not care that much about facts, nor really hides this. It may seem very surprising to link a right-wing demagogue like Trump to postmodernism, and I would hesitate to do so, but I do see reasons why the phenomena may not be unrelated.

In the postmodern realm (about which inevitably I generalise a little), truth says more about the power held by those proclaiming it than about anything else, ‘subject positions’ (which, as Terry Eagleton has argued, are the nearest contemporary thing to older ideals of ‘authenticity’) matter more than the cogency of arguments presented, ‘facts’ are mostly an illusion, rational debate is little more than an ideological conceit of the privileged, and ultimately arguments are better judged on political allegiance than by any supposedly more disinterested criteria. These are the extreme positions, for sure, not all of which are held (or held in such a fundamentalist fashion) by all of those identifying as postmodern, but they are not imaginary. In certain modified forms, I would not disagree that some of these positions have value: some ‘facts’ are somewhat spurious, but have been accepted because certain people have propagated them, whilst certain narrowly ‘rational’ approaches to debate can have a dehumanising effect through the ways in which they are framed (with associated rhetoric, for example that of ‘collateral damage’). But I would challenge these in the name of better conceptions of facts, rationality, and so on, not in order to dismiss the concepts in general. Experts should be challenged, including by political campaigners in a referendum such as this one, but in order that they are required to substantiate and explain their expert views and conclusions, not because anyone else can lay an equal claim to expertise.

As Richard Evans pointed out in his book In Defence of History, when a position appeals purely on the basis of the politics it espouses, there is little if any chance of ever being able to convince someone of a different political persuasion, for that requires some appeal to wider knowledge beyond allegiances. I would say the same applies to appeals to identity; most fatally, the very legitimation of identity as a criterion of political value has ultimately most emboldened the right-wing Leave campaign, enabling it to appeal to a sense of national belonging and identity, with a concomitant fear of and hostility towards foreigners, amongst white working-class and older people (see this pertinent article by John Harris).

Modern democracy is a deeply flawed system in many ways. It has developed in line with the modern Western nation state, and no-one has yet really found a workable system which is not enclosed within the borders of such nation states (ironically, the European Parliament might be one of the better attempts at doing so). The late historian Tony Judt (in interview in the volume Thinking the Twentieth Century) pointed out that with the fall of the Hapsburg monarchy, Jewish people in Austria faced a new threat as a minority within mass democratic society, after having received some degree of protection from the Emperor Franz Joseph I. Democracy within a nation state will always be problematic for minority groups within that nation state, for simple numerical reasons, when there is some degree of conflict. And beyond this, it is no easy task to convince an electorate, especially one undergoing difficult economic and other conditions, to factor in the interests of non-citizens (here including other Europeans, migrants and refugees) when this is presented as being against their own self-interest.

But I do not believe that these problems cannot at least be mitigated, given a properly operative media representing a genuine plurality of opinion, a high degree of education about the political process and issues in schools, a functioning public sphere (in which a different type of social media could play an important role), and an acceptance that ‘democracy’ is a wider concept than simply putting Xs in boxes from time to time, one which involves a degree of engagement with and respect for all types of groups in society. I wish I could say I see this happening in the UK, but I am currently pessimistic. There is a growing level of generalised disenchantment with the political process and with politicians in general, declining turnout at elections, especially amongst the young (though the Scottish Referendum was a marked exception), and a wider culture which is increasingly anti-intellectual and even tribal. Unelected and unaccountable celebrities, media personalities and even industry leaders seem to garner more respect than those who regularly submit themselves to electoral ratification.

The writer Edward Bernays, father of modern propaganda and public relations, realised that campaigns which operate on an emotive or atavistic level have much greater potency than those appealing to rational decisions (Bertolt Brecht would have agreed, but drawn very different conclusions). Bernays’s ideas, and their application in PR, advertising, politics and more, have been explored and chronicled in Adam Curtis’s documentary The Century of the Self. In the process, powerful tools have been developed which feed into an increasingly irrationalist political sphere. Extreme relativists, those cocooned in social media and echo chambers, and many of the advocates of safe spaces should all consider whether they are playing a part in forfeiting the possibility of any alternative.