I read somewhere that most academic articles in the humanities never receive a single citation. It seemed sad and odd, so I thought to look it up. It’s actually 92% that go uncited!
And yet social media is full of people advertising their latest papers or calls for more papers. Why? Well, obviously they’re proud of their hard work and want to let people know about it. And they’d rather like their hard work to be among the 8% of papers which do get noticed.
But most people connect with more than just other academics in their field, so they know that such posts might come across as a bit spammy to those outside Higher Education, like family or friends.
So why do they continue to do so? Because they are required to as part of their job.
Publishers will demand to know how academics intend to promote their work. And universities measure academics against things that are actually impossible for them to guarantee – the shibboleths of ‘impact’ and ‘outreach’ primarily. And these are counted in purely numerical terms – the clicks, the tweets, the retweets and reposts, but most of all the citations.
So there’s a lot of work being produced and hence a lot of people trying to get their voice heard in the shouting gallery of social media, pleading with anyone who’s listening to read their work, or even just click on it.
And yet I rarely hear anyone say that they READ a great academic article recently.
Since academia became a publish-or-perish game, it has incentivised academics to churn out endless articles but it doesn’t actually incentivise them to read any, except perhaps as footnote fodder for their own outputs. No academic is set targets for how much they will read in the forthcoming semester or year.
Universities don’t just apply arbitrary (and sky-high) publishing targets to academics. They also demand things it simply isn’t in an academic’s control to offer, ie impact, outreach, virality and citations. It all adds up to a lot of pressure, and undoubtedly affects the quality of the papers being written.
The downstream effects of this ought to be obvious, especially in fields and disciplines which move quickly and require scholars to stay up-to-date with the latest discoveries or thinking.
Now that I no longer have work targets for publication set by paper-shuffling administrators, I’ve really come to appreciate the ability to read and learn.
Universities should find ways to incentivise academics to read more and write less. Students and scholars and the sum of human knowledge would all benefit. But then, the bean counters wouldn’t have anything to count, and would no longer be able to bully or overwork staff with arbitrary and often punitive targets, or metrics they have no way of being able to guarantee.
In most areas of life, you can pursue quantity or quality. If universities really want their research to reach people, inspire them, change the game of a research field, or make a difference to society or the public, then they need to facilitate time and space for academics to read, and let them deliver work that really counts.
It’s that time of the year when the students are facing their exam periods. I used to see quite a few meltdowns each year from anxious kids inhabiting a grade borderline. The pressure seems so vast for some of them.
So much seems to ride on their results. Will they outcompete their peers sufficiently to nab a good job? Where even are there any jobs anymore? AI seems to be devouring entire sectors of human labour voraciously. Copywriting, graphic design, news reporting, even creative writing. Even the students’ own essays. All their future labour is already being replaced before they even get there.
Most of the answers out there tend to say STEM. Go into science or tech. That’s where the last stand of human labour takes place. That’s where there’s still a wage, still some future perhaps.
As a result, the humanities are allegedly dying. Literature and History now look as threatened as Modern Languages did a half-generation ago. Market forces are eroding Literature enrolments so much that entire departments are closing en masse. It doesn’t currently look like Literature is going to have a happy ending.
I had a ringside seat in academia for some of the death throes of Modern Languages as a discipline. As the world, particularly the online bit of it, converges on Globish as its chosen lingua franca, there will always be a need for TEFL and TESOL, though increasingly even that pedagogy will be delivered partly online, learned from machines.
But to an entire generation in Britain and Ireland, it suddenly seemed kinda pointless to study French, or German, or Spanish, or Italian, never mind Russian, or Chinese, or Arabic. Entire departments dissolved overnight. In just a few years the discipline has shrunk to a fraction of its former size, and its condition may be close to terminal.
And now it seems like English Lit is dying too, or at least, so the media is scaremongering. For full disclosure, I’m happy to admit I have an English Lit degree. In fact, I’ve got two (a BA (Hons) and a PhD, both from Trinity College Dublin; I never bothered doing a Master’s.)
When I was doing my second, in what we might politely refer to as early middle age, I was frogmarched to a fascinating talk given to doctoral students by former doctoral graduates of English. We were told that only one in ten of us would end up permanently employed in academia, such was the imbalance between the number of graduates and the number of already threatened positions worldwide. Such jobs as did seem to be available were already primarily in China.
We got given inspirational little mini-lectures from people working in publishing, in parliament, in accountancy and law, and entrepreneurs of all kinds. The message was clear: this is the future most of you should expect.
Then they told me something which has haunted me since, primarily because they excluded me from it on the grounds of my more advanced age. They said: the demands of the world are now constantly changing. Most of you will have about ten jobs in your career, where previous generations might have had merely one or two, the ‘job for life’ of lore. And of those ten jobs, they added, six haven’t yet been invented.
Of course, it’s true, or at least it’s a truism of sorts. Tech is accelerating fast enough to require entire squadrons of people, from programmers to imagineers, that didn’t exist a generation ago, and it will continue to do so. Hence the allegedly safe haven of STEM.
University Careers Advisor: Six out of ten of your future jobs haven’t been invented yet, yay!
Digital Careers Advice Avatar ten years from now: The AI overlords are hiring radiation sludge technicians for the prohibited zone. It’s that or we’re uploading you to the cloud to save the cost of feeding you. Which do you prefer?
I was lucky enough to be the one in ten who got a career in academia, though I’m currently out of it. But that talk did get me thinking about the need to embed some form of adaptability and resilience into student curricula at all levels, from primary school to post-doc. (I’ve spoken about this extensively before.) Because those are the only attributes that will truly allow young people to future-proof themselves for the demands of their adult lives.
And this is where I think studying the literature of the past can come in useful. English Literature is a degree which teaches critical thinking, use of language, aesthetic appreciation and a range of other comprehensive techniques. But it also frames the world as stories. As Yuval Noah Harari has pointed out, the human superpower, the ability which shot us past predator species and all other creatures to dominate this planet, was and remains our collective abilities to tell and share stories. And English Lit graduates learn how those stories work, which is another way of saying how the world works.
So I have two English Lit degrees. I can’t exactly say that they always directly impacted on all the jobs I’ve had. I have been among other things a roulette croupier, a barman in a lesbian pub, the Olympics correspondent for the Morning Star newspaper, a wine bar sommelier, a roofer, a film critic, maître d’ of a Creole restaurant, a playwright, and a member of the Guinness quality control taste panel at St James’s Gate brewery.
Most of those experiences didn’t make it to my formal LinkedIn CV. Pretty much all the things that did make it – primarily my careers in journalism and academia – are very clearly connected to my initial course of study.
But whether I was serving ales to Dublin’s lesbian community, reporting on an international soccer match, describing that night’s special in the restaurant, or assessing an exotic variant of Guinness, I think my undergrad study of literature and language always served me well.
The literature I studied taught me lessons of adaptability and resilience. I can’t think of another degree that might have prepared me better for life. This world is made of stories and I was privileged to spend some years learning how those stories work.
Due to the production lag of academic work appearing in public (peer review, people working for no remuneration, etc), some of the early work I’ve done on Science Fiction and Buddhism is only now emerging in print/pixels.
The first chapter of the slowly emerging book, which looked at Buddhist Reception in Pulp SF, appeared about 18 months ago, as I noted at the time.
Now two more chapters have emerged, long awaited and then arriving in tandem, like buses of legend. The first of these examines the crypto-Buddhism of Arthur C. Clarke, and can be found in a very excellent essay collection entitled Rendezvous with Arthur C. Clarke: Centenary Essays, published unsurprisingly to commemorate 100 years since ACC’s birth.
Contact the publisher Gylphi if it’s of interest, or else pick up a copy on Amazon or via your favourite brainy bookshop. Like all Gylphi books, it costs only a fraction of most academic texts.
The second publication likewise focuses on a single author, this time Frank Herbert of Dune fame, and therefore it will come as no surprise that it’s entitled “The Dharma of Dune: Frank Herbert and Zen Buddhism”.
It can be found in volume one (of two) of Fantastic Religions and Where to Find Them, or, to name it in its original language, Religioni fantastiche e dove trovarle: Divinità, miti e riti nella fantascienza e nel fantasy. If you follow that link, it will take you to the page of publisher Edizioni Quasar, where you will find both volumes, which contain an eclectic range of clever and considered takes on religion in a vast array of SFF texts, literary, televisual, cinematic and cultural.
Many thanks are due to the editors of both volumes, Paul March-Russell and Andrew Butler in the first instance, and Igor Baglioni, Ilaria Biano and Chiara Crosignani in the second.
You might reasonably wonder why all three of these chapters feature rather old SF. That’s partly because I’ve been working on this book chronologically, though I’d expect that the next chapter to emerge blinking into the light might jump a few decades to consider Cyberpunk. Don’t ask me when that will be though. The gestation period of academic texts is an arcane mystery. When it happens, I’ll be sure to let you know.
The Ponying the Slovos team were honoured to be able to contribute to the volume, and eagerly grasped the opportunity to compare Burgess’s Nadsat to that which features in Stanley Kubrick’s script (and thereafter, the movie itself.)
Alas, as is so often the case with academic research these days, the purchase price is not so cheap. My suggestion is to ask a friendly academic librarian to consider purchasing it on your behalf. However, I can offer you a flavour of what we discovered, and subsequently wrote about in the volume.
Last January, I flew out of Cappadocia and left academia, which was a strange thing for me to do really, since I’d aspired to be an academic for decades before I finally achieved it.
So what lured me away? The opportunity to work with Professor Yuval Noah Harari and his NGO Sapienship, a social impact company that aims to focus global attention on issues of global importance, including the climate crisis, technological disruption and the prevalence of war.
So what have I been doing with them for the past year? Largely, I’ve been working with my colleagues to develop the Sapienship Lab, which launches today. There’s a lot of content in there already and much more to come over the coming weeks and months. It even includes some audio dramas I wrote, which I guess count as my first published science fiction in a while.
Most of the content is factual, educational, and intended to act as a guide through the labyrinth that is our fast-moving now. A lot is aimed at middle schoolkids through to undergrads, but we hope that everyone can learn something from it.
I miss teaching but I feel that in my new role I can still educate, albeit remotely, and contributing to the Lab is how I’m doing that now. I hope you’ll take a look at some of what we’ve prepared. It’s taken a lot of people a long time to put all this together.
And perhaps you might share it too with anyone who might be interested, which hopefully will turn out to be everyone, because we strongly believe that we’re all in this together and only by talking and listening to everyone will we manage to improve our world.
Over at Ponying the Slovos, our ongoing project on invented languages in art and literature, I wrote a series of posts on Anthony Burgess’s other invented languages a couple of years back, of which there are more than a few.
These collected thoughts have now been expanded, revised and published in the peer-reviewed Hungarian journal of English literature, The Anachronist, and (almost all) the journal is free to read or download in the spirit of open access thanks to the publishers at ELTE, Hungary’s foremost university.
In this paper, Burgess is used to demonstrate that the role of invented languages in literature goes far beyond the existing well-explored territories of Science Fiction (SF) or High Fantasy, though they predominate therein, and can also be found in historical novels, and even realist fiction, as Burgess’s variegated novels reveal.
This is Ponying the Slovos’s second publication for 2023, and it’s not even two weeks in. We might need a little lie-down!
Anyhow, feel free to read the article here, and the whole journal, all of which will be of interest to Burgess scholars, may be accessed from this page.
Or, you are DEFINITELY the data they’re looking for.
Do you remember when AI was nothing to worry about? It was just an oddity, a subject of humour. And yet people with lots of money and power kept taking it extremely seriously. They kept training up AIs, even when they turned out to be hilarious, or racist, or just downright incompetent.
And then all of a sudden AI got good at things. It began to be able to draw pictures, or write basic factual articles in a journalistic style. Then more recently, it began to write plausible student essays. I say plausible, even if it did seem to be doing so with artificial tongue placed firmly in virtual cheek, penning histories of bears in space.
Nevertheless, this was an example of the sole virtue which Silicon Valley values – disruption. And so everyone took notice, especially those who had just gotten disrupted good and hard. Best of luck to academic institutions, particularly those responsible for grading student work, as they scramble to find a way to ensure the integrity of assessment in a world where Turnitin and similar plagiarism software systems are about to become defunct.
And yet there are still some people who would tell you that AI is just a toy, a gimmick, nothing to worry about. And yes, as AI begins to get good at some things, mostly we are enjoying it as a new toy, something to play with. Isn’t it, for example, joyous to recast Star Wars as if it had been made by Akira Kurosawa or Bollywood?
(Answer: yes, it very much is, and that’s why I’m sharing these AI-generated images of alternative cinematic histories below):
So where, if anywhere, is the dark side of this new force? Isn’t it fun to use the power of algorithms to invent these dreamscapes? Isn’t it fascinating to see what happens when you give AI an idea, like Kurosawa and Star Wars, or better again, a human-written script, and marvel at what it might produce?
(Answer: Yes, it is fascinating. Take for example this script written by Sapienship, inspired by Yuval Noah Harari, and illustrated by algorithm. Full disclosure: I wrote a very little bit of this.)
The one thing we all thought was that some jobs, some industries, some practices were immune to machine involvement. Sure, robots and automation might wipe out manufacturing and blue collar work. What a pity, eh? The commentariat for some time has shown little concern for the eradication of blue collar employment. Their mantra of ‘learn to code’ is now coming back to bite them on the ass, as first jobs in the media itself were eviscerated, and then this year jobs in the software sector too.
But those old blue collar manufacturing industries had mostly left the West for outsourced climes anyhow. So who exactly would lose their jobs in a wave of automation? Bangladeshi garment factory seamstresses? Chinese phone assemblers? Vietnamese machine welders? (In fact, it turns out to be lots of people in Europe too, like warehouse workers in Poland for example.)
But the creative industries were fine, right? Education was fine. Robots and automation weren’t going to affect those. Except now they are. Increasingly, people learn languages from their phones rather than from teachers. (Soon they won’t have to, when automation finally and successfully devours translation too.)
Now AI can write student essays for them, putting the degree mills and Turnitin out of business, and posing a huge challenge for educational institutions in terms of assessment. These are the same institutions whose overpaid vice-chancellors have already fully grasped the monetary benefits of remote learning, recorded lectures, and cutting frontline teaching staff in record numbers.
What’s next? What happens when someone takes deepfakes out of the porn sector and merges it into the kind of imagery we see above? In other words, what happens when AI actually releases a Kurosawa Star Wars? Or writes a sequel to James Joyce’s Ulysses? Or some additional Emily Dickinson poems? Or paints whatever you like in the style of Picasso? Or sculpts, via a 3D printer, the art of the future? Or releases new songs by Elvis, Janis Joplin, Whitney Houston or Tupac?
Newsflash: we’re already there. Here are some new tracks dropped by Amy Winehouse, Jim Morrison and some other members of the 27 Club, so named because they all died at 27.
What happens, in other words, when AI starts doing us better than we do us? When it makes human culture to a higher standard than we do? It’s coming rapidly down the track if we don’t very quickly come up with some answers about how we want to relate to AI and automation, and how we want to restrict it (and whether it’s even possible to persuade all the relevant actors globally of the wisdom of doing so.)
In the meantime, we can entertain ourselves with flattering self-portraits taken with Lensa, even as we concede the art of photography itself to the machines. Or we can initiate a much-needed global conversation about this technology, how fast it is moving, and where it is going.
But we need to do that now, because, as Yoda once said in a movie filmed in Elstree Studios, not Bollywood nor Japan, “Once you start down the dark path, forever it will dominate your destiny.” As we generate those Lensa portraits, we’re simultaneously feeding its algorithm our image, our data. We’re training it to recognise us, and via us, other humans, including those who never use their “service”, even those who have not been born yet.
Let’s say that Lensa does indeed delete the images afterwards. The training their algorithm has received isn’t reversed. And less ethical entities, be they state bodies like the Chinese Communist Party or corporate like Google, might not be so quick to delete our data, even if we want them to.
Aldous Huxley, in his famous dystopia Brave New World, depicted a nightmare vision of people acquiescing to their own restraint and manipulation. This is what we are now on the brink of, dreaming our way to our own obsolescence. Dreams of our own unrealistic and prettified faces. Dreams of movies that never were filmed, essays we never wrote, novels the authors never penned, art the artists never painted.
Lots of pretty baubles, ultimately meaningless, in return for all that we are or can be. It’s not so great a deal, really, is it?
Plagiarism isn’t illegal, says this article. No, it isn’t, anywhere, but it ought to be. Because plagiarism kills.
I heard a decade ago of someone making six figures a year from writing papers for cash. That’s a big temptation for people struggling to get by as a postgrad or post-doc. And indeed, those who provide this service to cheats do get the tiny violins out to justify their decisions in the article linked above.
The bottom line is that everyone knows this is going on, and knows that it is wrong. But universities simply don’t care, in their marketised, stack-students-high, sell-degrees-expensively model.
Academics are actively discouraged from challenging cases of plagiarism. It’s a ton of extra work, and often the students simply get a slap on the wrist anyway. The prospect of reporting a suspected case, providing the evidence to a faculty committee, engaging in a formal investigation and then watching the student be told “don’t be naughty again” is sufficient discouragement for most lecturers, whose time is already at a premium.
But this approach isn’t good enough. Plagiarists devalue the degrees and credentials of their honest peers, and the vast market which has sprung up like foul toadstools to service those who’d rather pay than study is not simply ethically dubious but actively threatens the integrity of the entire educational system.
And it is a real issue for society. Do you want to drive over a bridge designed by an engineer who handed in other people’s essays to get their degree? Or would you like your child to be delivered by a midwife who did likewise? I am aware of cases of students who obtained degrees in both disciplines, and in many others, via consistent and continuous cheating in this manner. We must assume those graduates are in jobs now, in critical positions for which they are not actually qualified.
This is why you should care. It’s time to put the paper mills out of business for good, by making them illegal, and by properly punishing students who engage in plagiarism. Expulsion and a ban from higher education should be the minimum response.
Plagiarism has ramifications in many other areas too. Once uncovered, it generally leads to a significant loss of commercial, institutional or individual credibility and reputation. It’s hard to come back from. And of course, where authors have their work stolen (rather than sold), it’s literally beggaring people and thieving unearned benefits from the work of others.
But to create a truly anti-plagiarism culture, we need to start with education. Perhaps it may even be too late in the cycle to do so at university level, since reports of students plagiarising work in secondary schools is also rife in very many nations. But we badly need to start somewhere.
And if higher education doesn’t address its plagiarism problem, it will soon find its expensive degrees becoming increasingly worthless, as more and more people simply purchase rather than earn those credentials.
Already it’s October, when the leaves turn red and fall from the trees, the nights grow longer and the days colder, and the Nobel prizes are awarded.
The Nobel committee for literature does tend to go leftfield when possible. One is therefore required to read into their decisions, a little like ancient haruspices reading the entrails of chickens or 20th century Kremlinologists interpreting the gnomic actions of the politburo.
How then should we read the decision to anoint the sparse, harsh and uncompromising pseudo-autobiographical work of Annie Ernaux?
To me it seems like a commentary upon Michel Houellebecq and Karl Ove Knausgård. All three are known for writing their big books of me, but perhaps the men are better known than Mme Ernaux internationally. Equally, both Houellebecq and Knausgård have been heavily criticised, among other things, for their misogyny. Awarding Ernaux seems to me to be a reaction to their popularity and the fact that both have been tipped for this prize previously. Your mileage may vary.
(Full disclosure: I’ve never read Knausgård or Ernaux and have at best a passing familiarity with Houellebecq, who I found to be a very rude interviewee at the Dublin Impac Award in a previous millennium.)
Also elevated to laureate this year was Svante Pääbo, the man who proved that ancient hominid species such as Neanderthals did not entirely die out but in fact persist to this day within non-African human genomes. In fact, I likely owe some Neanderthal ancestor the gene which oversees my melanocortin-1 receptor proteins, which gave me my once russet beard.
What’s intriguing personally for me about this year’s Nobels for medicine and literature isn’t that I’d not previously heard of the literature recipient, nor that I had previously heard of the medicine recipient, but the fact that both these things occurred in the same year. I guess my interests have shifted over the decades away from solely literary pursuits, and towards scientific interests, especially in early hominids. This year’s prizes have brought that home to me, and congratulations to the winners.
I’ve long criticised the Nobel Prize for Peace, because the Norwegian parliament committee which awards it has a knack for often choosing inappropriate recipients. Hello Henry Kissinger, Aung San Suu Kyi, Barack Obama, UN “peace-keeping” forces, etc.
Nevertheless, I’d argue they got it right this year. The 2022 Nobel Peace Prize has been awarded to human rights advocate Ales Bialiatski from Belarus, the Russian human rights organisation Memorial and the Ukrainian human rights organisation Center for Civil Liberties. Congratulations to them too.
POST-SCRIPT: The newest Nobel physics laureates have also been announced, and their award is for proving that reality, as we understand it currently, is not real in the ways we think it is. Not awarded, though clearly the forefather of all this research, was my compatriot John Stewart Bell, whose hypotheses the prize-winning experiments aimed to prove; alas, he died in 1990 while those experiments were still in progress.
Congratulations to Alain Aspect, John F. Clauser and Anton Zeilinger for proving once again that the universe is not only stranger than we think, but most likely as Heisenberg noted, stranger than we can think.
I recently got the chance to appear on the excellent Art of Problem Solving podcast on behalf of Sapienship, talking about how to raise and educate a generation whose jobs may not exist yet, or who may find automation erodes their employment opportunities.
To date, I haven’t spoken much on my personal site here about my work with Sapienship, largely because most of it has yet to reach the public domain. I expect that to change quite a lot in the next few months.
Anyhow, one of the benefits of migrating to an academic-adjacent position, especially one as wide-ranging as mine, is the ability to escape the narrow pigeon-holes of expertise which the artificial boundaries of academic disciplines enforce.
In my career, as noted elsewhere, I’ve had a number of very different roles. As a journalist alone, I gained expertise in a very varied range of topics and subjects including healthcare, politics and international sport. Hence it always seemed somewhat constrictive to me that academia was so insistent that I stay in my narrow lane, even as it nominally espoused interdisciplinary practices.
This is why my current areas of personal research are fundamentally interdisciplinary – in particular Religious Futurisms and Invented Languages. But it also informs why I have always been keen to teach students to be resilient and adaptable. I’ve finally been offered the chance by the Art of Problem Solving podcast to expound on this pedagogical ethos, and I feel especially privileged that in this area, as in many others, I find my personal values echoed and amplified by Sapienship.
I did not have a role model or a teacher to show me how to become resilient and adaptable in a world where change seems to be perpetually accelerating. I had to develop those skills myself, on the hoof, as I migrated from the Arts to Journalism to Academia and then to the position I now hold.
Hopefully this podcast can help others to shorten that learning process, because the world is not slowing down anytime soon, and resilience and adaptability are going to become the defining traits of success, or possibly even survival, in the decades to come.