Where exactly did the Roman Empire end?

Like a lot of questions about history, this is both superficially straightforward and on closer reflection highly philosophical. I have a very straightforward answer for you, one that I have never seen referred to in print or pixel before, but let’s take the complex route first.

We’d need firstly to define what we mean by Roman Empire. The Ottomans, the Germans, the Venetians, the Bulgarians, the Russians and a whole host of other civilisations all considered themselves in one way or another to be true heirs of Rome. Should we consider them as true continuations or not?

Then we’d need to consider what we mean by where. Where is a subset of when in this instance. If we define the Roman Empire as ending with the sack of Rome by Alaric the Visigoth in 410 CE, then obviously the empire fell at Rome itself. But Alaric was dead within a few months, and there was still an emperor in Rome over 60 years later.

And what of what we now call the Byzantine Empire, but which knew itself as the Roman Empire? Founded in the fourth century when the Empire split into Eastern and Western administrations, the empire based at Byzantium (later Constantinople, later Istanbul) continued until it was overrun by the Ottomans in the fifteenth century, just as its Western twin had been overrun by Germanic invaders in the fifth.

For me, as for themselves, the Byzantines were Roman. Greek-speaking, yes, but Roman all the same, with a continuity of culture all the way back to the founding of Rome as a city state in the eighth century BCE. So if we consider the Byzantines to be the last vestige of the Roman empire, then surely it fell when Constantinople was captured by Sultan Mehmed II (known understandably as ‘the conqueror’) in May 1453 CE?

Close, but not quite. Some of the Byzantine empire still stood even as Constantinople was sacked and burned. One standout was the Maniot territory in the Peloponnese in southern Greece, which at the time Constantinople fell was part of the wonderfully named Despotate of the Morea, ruled in practice by two Byzantine despots (who promptly fled). The Maniot people did not flee, however, and the Ottomans didn’t bother invading this mountainous and difficult territory until 1770 CE. But with the departing despots went any vestige of ruling Byzantine (and hence Roman) culture. This was a Maniot defiance of Ottoman rule, not a Byzantine one.

Then there was the Principality of Theodoro, a sliver of Crimea under Byzantine rule sandwiched between the coastal Genoese colonies and the inland Khanate of Crimea. Technically, again, this was Byzantine territory. But in reality, it was populated by Goths.

What? Yes, in fact the Ostrogoths had been in Crimea for over a thousand years, since the FOURTH CENTURY CE! Byzantine rule (following the fourth crusade) was merely yet another imperial vassalage for the Goths of Crimea. At various times they had fallen under the nominal rule of a bewildering range of imperial powers, including the Huns, Khazars, Mongols and Genoese. Ultimately, the principality was conquered by the Ottomans in 1475 CE, around the same time the neighbouring Khanate became an Ottoman vassal. So, not exactly the last stand of Rome.

Which brings me to my own answer to the question: where did the Roman Empire end? The Empire of Trebizond was a secessionist state of the Byzantine imperium. Formed during the fourth crusade as an opportunistic power grab by a local potentate, the Trebizond empire survived only a little longer than its parent state at Constantinople. The Trebizond secessionists were, if anything, even more beset by the combined threat of Turkmen and Ottoman forces than the Byzantines were. Throughout the 1440s and 1450s, they repelled repeated attempts at invasion.

The end finally came in 1461, a mere eight years after the fall of Constantinople. There is a wonderful, almost contemporaneous painting depicting the departure of the Byzantines from Trebizond following the Emperor David’s surrender to Mehmed II.

So what happened exactly? Mehmed swooped in from the west to isolate Trebizond and place it under siege, which continued for a month. To achieve this, his forces had to climb into the high hills immediately behind the coastal city and outflank it, so that the defenders could receive neither reinforcements (which David hoped would come from Christian Europe) nor supplies via the harbour.

Trebizond was a high-walled city located between two freshwater streams flowing into the Black Sea, so a direct assault was ill-advised. For Mehmed, it was easier to maintain negotiations while besieging the city. And the inhabitants were well aware of what had happened to Constantinople for refusing to negotiate.

This map, taken from Wikipedia, gives a good sense of the geography of the time.

The formal surrender would of course likely have taken place in the citadel or the palace (both under archaeological exploration at the time of writing), following the agreement between David and Mehmed for a negotiated surrender. With the Ottoman forces primarily located to the east of the city, along the freshwater river that is now only a dry valley in the modern city, it is possible that they first entered through the lower gate closest to the harbour and market, but more likely that they came through the double gate closer to the citadel.

Amazingly, this gate is still standing, entirely unremarked upon, and can be found down a narrow cobblestoned alleyway strewn with graffiti, with children’s laundry drying at head height. There is no plaque or commemorative item of any kind to inform you that this was the geographic spot where over 2,200 years of continuous Roman culture came to its final end. And yet, that’s exactly what it is:

The inner gate of Trebizond’s double gate, where 2,200 years of Roman culture came to an end.

Think in 5-D: Learn a Language

There are around 140 language families on the planet. Nearly half of all people speak a language from just one of those families as their native tongue, never mind all those who speak one as a second or subsequent language.

That family is Indo-European, and it includes English, Spanish, Hindi, Russian and some other very big hitters in terms of global speakers.

As the world continues to globalise, we will inevitably lose languages and even entire language families. Some projections suggest we might be down to only five or six major languages by 2500. Of those, probably only Arabic and Chinese stand a chance of being non-Indo-European.

Once upon a time, I scoffed at learning my national language, Irish. What’s the use? Who gives a shit about old myths? Anyhow, it was all tied up with politics and my limited brain could only just about accommodate French.

Now I regret that decision, just as I regret not maintaining my knowledge of Attic Greek and Latin, not properly learning Italian, Russian or Turkish, and being so scared by Hebrew and Arabic that I gave up on day one.

Because languages aren’t just interchangeable modes of communication. Each one expresses an entire culture, and even more, a wholly unique way of conceiving of the world. To speak more than one language is to see the world in multiple dimensions at once.

I envy my five-year-old his bilingualism. It’s a gift I intend to jealously defend for him, and no doubt on occasion even against his future wishes.

If you want to save culture and add literal dimensions to your brain, learn a language. Start today.

Do Europeans Fear the African Columbus?

I’ve been researching the ‘discovery’ of the Americas recently, particularly the history of Columbus, Vespucci, and Magellan, as well as the conquering of the Aztecs by Hernan Cortes.

What strikes me, reading the letters of Vespucci or the affidavits of Columbus, is their braggadocio of adventure. It’s all couched, of course, in careful obsequiousness to lordly funders and rulers, and pious devotion to the mother church, whose command, one suspects, was at best tenuous aboard small, rickety ships traversing unknown oceans. But it’s easy to discern their sense of excitement, of being the first to see and claim terra incognita, to place the first footsteps on a new world.

They were, in short, adventurers who had little concern about the indigenes they encountered other than a kind of sociological curiosity to describe them as they might describe sea routes or the local flora and fauna, all filtered through their world view of manifest destiny and medieval Catholicism, and their barely-suppressed exhilaration.

But it was, as we now recognise, a somewhat dark and bloody history, replete with dehumanisation and erasure of the peoples who already lived in those locations, and interspersed with crimes of violence, atrocity and domination.

The Capture of Tenochtitlan by the forces of Hernan Cortes, signifying the end of the Aztec Empire

Much of the evidence of those times now exists as absence. In searching for the Taino indigenes of the Caribbean, one finds only their diluted bloodlines. Their civilisation, culture, language and polities have long since effectively vanished. Similarly, some 97% of Argentina’s population today is of at least partial, if not total, European descent. In Uruguay, it’s just under 90%. In neither country does a significant indigenous population remain.

Somewhere, buried perhaps in the genetics of modern Turks, still echoes the bloodline of the Hittite empire too. But the Hittites were builders and the Taino were not. The Hittites left correspondence and monuments by which we can remember them. The Taino did not. In some ways, the Hittites are more current three millennia after their demise than the Taino, who died out only in the past few hundred years.

Downstream over five centuries from those heady days, we might believe we are now in a position to consider them sanguinely, if you will forgive a pun in bad taste. We are now almost a century into the process, or thinking, of postcoloniality, of decolonisation. The spokes now speak to the hubs. The empires strike back.

Today, the flows of people which cause the most contention are those into Europe and the European-founded states in North America and Australasia. It’s unsurprising that this would be so. Firstly, those nations habitually top the tables for metrics like income, quality of life, education, happiness, security and so on. Who wouldn’t like to live in countries with those qualities?

And of course those coming to them are by definition coming from countries which lack those qualities. They suffer poverty, war, poor educational standards, insecurity in general. They aren’t happy, or they wouldn’t be moving.

But also, they are adventurers like Columbus, Vespucci and Magellan. They are primarily desperate young men with little to lose and much, potentially, to gain. They travel embedded within their own cultures, religions and languages. The increasingly loud and paranoid concern of European nationalists is that they may also come as conquerors, like Cortes.

As a scholar of uchronia, or history which never happened, I am always intrigued by the what ifs. What if Ming China had not turned its back on the world in 1433, but had instead beaten the Europeans to colonise the Americas by over half a century? Would Admiral Zheng He now enjoy the oscillation between celebration and opprobrium currently offered to the memory of Columbus?

Or what if it had been Africans or Amerindians who had first embarked on transcontinental sea travel and had arrived in small boats at the shores of a frightened and uncomprehending European populace, not unlike the fleets of dinghies which now traverse the English Channel daily? Would the cities of Benin, Lagos and Accra now boast the wealth of imperial buildings and infrastructure we instead find in London, Amsterdam, Paris and Lisbon?

We would be in a very different world perhaps. Or more likely, we would not. The processes of colonialism would most likely have remained intact. The resulting erasures, atrocities and domination would likely still have occurred, only with the positions of the colonised and colonisers reversed.

What evidence is there for this, outside of my fevered imagination of the multiverse? Well, firstly one might consider the Bantu Expansions of the 11th to 17th centuries. On encountering the sparse existing populations of pastoralist and nomadic peoples in central and southern Africa, the expanding Bantu largely either wiped them out or absorbed them, resulting in an African variant of what we might call the Argentina model.

And we don’t even need to look to history for examples of Chinese colonialism. It continues today, as Tibetans, Uyghurs and those in various South and East China Sea islands can testify.

In short, history teaches us that cultures do clash, and that all too often, if not indeed most of the time, one of those cultures is going to come off worse, often to the point of eradication. The process of cultural evolution, which exists both in isolation and in free associations via trade, commerce and technological development, continues ever faster in the globalised and techno-enabled world in which we find ourselves. Cultures do not atrophy by themselves. History indicates that when they die, it is not by suicide but more commonly at the hands of conquerors and colonisers.

The bafflement of the political class in Europe at the inexorable rise of ethnocentric, hypernationalist and insular right-wing parties is itself therefore baffling. History suggests that this is a manifestation of resistance to perceived colonial attack. The rhetoric on all sides illustrates this very clearly, whether it is assertions that Europe is inherently white and Christian and that Islam is an existential threat, or the counter-rhetoric of inflammatory Islamic preachers demanding Sharia law in Europe, and the misplaced triumphalism with which some Indians proclaim ownership of London.

Is it a sense of folk guilt which fuels the suspicion of Europeans encountering the African Columbus or subcontinental Vespucci today? Postcolonial theory suggests as much. But perhaps it is also something more deeply felt – an existential fear that they are instead meeting columns of modern-day Cortes.

Diversity by definition is divisive. It is not inherently a strength, otherwise the late Roman Empire would have been stronger than its earlier iteration. But diversity could become a strength if we could somehow harness a collective expansion of in-group sensibilities, a magnification from the gigatribes of nations to the teratribe of humanity.

For that to occur, however, a sea change in perspective is required by everyone. Those intent on building fortresses around their cultures need to understand that no walls can stand against the march of human adventure and ingenuity. And those who set sail for new worlds must leave their small-minded cultural and religious preconceptions at home in the past.

Only then can we truly move beyond zero-sum colonial mindsets.

When do wars actually end?

World War One started in July 1914, but when did it end? Conventionally, people assume it ended in November 1918, with the armistice and the defeat of Germany.

But people were still dying many years later. My own grandfather, his lungs rotted out by mustard gas at the Somme, suffered for decades, gasping and coughing nightly, before it finally killed him.

The most recent victims, astonishingly, died in March 2014, almost exactly a century after the conflict started. How is that possible? They were construction workers who accidentally triggered an unexploded bomb buried beneath where they were working.

During WW1, a ton of explosives was fired for every square metre of territory along the front.

As a result, the French Département du Déminage (Department of Mine Clearance) recovers about 900 tons of unexploded munitions every year. They call it the Iron Harvest.

Unexploded ordnance is left behind after all conflicts. Children are maimed and killed every year by uncleared mines and bombs in Asia and Africa.

The wars we fight today will kill not only us but our grandchildren and great-grandchildren too. It’s time to make war history.

The Cosy Sectarianism of the Great Irish Writers

How cosy and quaint the petty sectarian bigotries of 20th-century Irish writing seem today.

I’m not referring to the civil war in the North of Ireland, usually euphemistically diminished as the ‘Troubles’. I lived through most of that, and it was extremely unpleasant indeed.

Rather I mean the slightly earlier period of the early and mid-twentieth century, when Irish writing bestrode the world in the form of giants like Joyce, Beckett, Yeats and Behan.

What’s interesting, considering just these four (though we could add many other lesser names), is the variety of their personal reactions to the sectarian divide in Ireland. For the Protestant-raised, middle-class and cosmopolitan Beckett and Yeats, minor distinctions between flavours of Christianity were an irrelevance at best.

Yeats in later life veered into mysticism, theosophy, magick and the occult. Beckett by contrast tended to dismiss Christianity, if not all religion entirely, referring to it as “all balls”, though conceding that it amounted to more than merely “convenient mythology”. Raised in the era they were, however, both Yeats and Beckett imbibed plenty of Christian dogma at school and in the wider culture, and both demonstrate in their writing an easy and deep familiarity with Christian writings and the Bible.

Beckett, probably not considering conversion to Catholicism

By contrast, the Catholic, lower middle-class/working class Joyce and Behan seemed unable entirely to shake off the tribal Catholicism of their backgrounds and education. I was reminded of this recently when I re-encountered Behan’s hilarious take on Anglicanism:

Don’t speak of the alien minister,
Nor of his church without meaning or faith,
For the foundation stone of his temple
Was the bollocks of Henry VIII.

Behan wearing a rosette proclaiming what is undoubtedly the greatest sporting chant ever.

Behan was a self-described “daylight atheist”. This is often presented online in the form of a quote: “I’m a communist by day and a Catholic by night”. However, I’ve not found a reliable source for this variant. Anyhow, Behan clearly had not managed to transcend the petty sectarian rivalries which beset Ireland, and in this he echoes Joyce, who in the highly autobiographical A Portrait of the Artist as a Young Man describes his alter-ego protagonist Stephen Dedalus refusing to consider conversion to Protestantism:

– Then, said Cranly, you do not intend to become a Protestant?

– I said that I had lost the faith, Stephen answered, but not that I had lost self-respect. What kind of liberation would that be to forsake an absurdity which is logical and coherent and embrace one which is illogical and incoherent?

We might dismiss this passage as a mature adult’s depiction of his own prissy adolescence, were it not echoed elsewhere in his work, such as in the short story ‘Grace’ in Dubliners.

Joyce in his lengthy European exile.

It’s worth remembering too that Joyce and Behan both escaped the confines of petty Ireland, if anything more completely than Yeats ever did: Yeats became a senator in the newly independent Ireland, whereas Joyce relocated permanently to Europe and Behan spent much of his time in London and America. (Beckett, like his mentor Joyce, went to Europe and never looked back.)

So then, what fuels this seemingly pointless animus? The grounds of objection from both Joyce and Behan relate to an apparent illogicality inherent to Protestantism. Notably in both instances, there is no defence of Catholicism offered, merely a snide (and in Behan’s case, very funny) dismissal of Ireland’s second-largest faith.

And unlike Yeats, neither sought to construct a religious faith of his own, though in Joyce’s case at least there was an astonishing attempt to replace the religious impetus with an aesthetic one, succinctly underpinned, as Joyce said, by “silence, exile and cunning.”

I think Behan’s piece (a translation, as it happens, from 16th-century Irish) gives the game away here. In many places, the first line of his translation is misquoted as referring to “your Protestant minister”. But Behan, like his source material, makes clear that while Anglicanism is being referred to, the issue is less the protest against Catholicism underpinning it than its alienness: the fact that it was the faith of the foreign (i.e. English) overlords who governed Ireland from the time of bebollocked Henry to their present day.

In other words, it was an atavistic political tribalism rather than a theological objection. We still have those tribalisms in Ireland today, primarily in the North where those overlords remain in position, likely against their will and desire, due to the complexities of establishing a permanent and lasting peace. In the 26 counties of the Irish Republic however, these passages stand out as glaring anachronisms now.

And even in the North, the late great “famous” Seamus Heaney (like Yeats and Beckett a Nobel laureate) is best described as sociologically post-Catholic rather than a devotee of the creed of his birth. This runs counter to the opinions offered by some of his most astute critics, Conor Cruise O’Brien and Edna Longley in particular, but is it unfair to point out that both came from Protestant backgrounds, and hence saw the cultural references to Catholicism in Heaney’s work as more significant than they were, simply because those references were alien to them in the same way that Protestantism was to Behan?

So, will you be converting to Protestantism, Seamus?

In other words, the sensitivities may be reversed here. Perhaps it is as readers that we detect these curious emphases. Perhaps we misconstrue the petty cultural rivalries of sectarianism in mid-20th-century Ireland because religion played so much larger a role in cultural life in those days, in ways that anyone under 50 is unlikely to recognise in Ireland today.

The great Irish writers never stop teaching us, and one of their lessons is that we must challenge ourselves as readers with regard to what we find striking in their writing. What we notice and what we do not says perhaps as much about us as it does about them. They hold a mirror to our souls, even if, like Behan, we are daylight atheists.

The Curious Tale of the Metaverse and the Multiverse

One of the issues with trying to surf the zeitgeist is precisely that: you remain on the surface, with no depth of understanding of any individual issue. So high is the noise-to-signal ratio nowadays that many people find it almost overwhelming to ascertain what information IS relevant and important to their lives, and what is not.

It can be hard to find the time to think deeply about quickly moving events, or to link them correctly to one another. In fact, such are the time and cognitive pressures that many people end up succumbing to conspiracy theories which offer neat and totalising explanations for the state of the world, provide suitably nefarious-seeming scapegoats and attempt to rally the public to action.

Of course, a lot of this action devolves quickly into “send me money”, but at that point some people are already sufficiently relieved to find a handy explanation for everything, happy not to have to think deeply, and grateful enough to contribute to the professional liars.

Unfortunately, there are no quick fixes or easy answers. Not for the world, and not for those of us who live in it. And there are many ways to become confused, or to pursue dead-end fictions, in the attempt to comprehend the fast-moving reality we find ourselves in. Conspiracy theories are just the odious tip of a large iceberg of false information and fake news. Beneath the surface are many other attempts to explain the world simply, or to simplify it, most of which are not as nefarious as conspiracies, but are in some regards equally constructed and equally untrue.

Two terms which crop up often these days, though maybe not often enough in this context, are the multiverse and the metaverse. The multiverse refers to the idea, entertained by many theoretical physicists, that our universe is not the only one, but instead exists in relation to an infinitude of other universes, some highly similar to our own, some wildly different.

Many universes – but isn’t this one enough already?

By contrast, the metaverse is an as-yet hazy idea quickly gaining momentum in tech circles, which proposes itself as the future of the internet and seeks to displace or replace many aspects of contemporary life with a virtual reality alternative.

Mark Zuckerberg’s vision of your future

So the multiverse is an expansive concept and the metaverse a limiting one, but both seek to tackle the problem of explaining the complexity of the world by replacing it with something else. And they do so in different ways. While the metaverse is a collective effort by tech firms, Facebook (now renamed ‘Meta’) in particular, the multiverse, as popularly invoked, is an idea loosely borrowed from theoretical physics and science fiction which has grown, like conspiracy theories, primarily in the corners of communication that the mainstream media do not reach.

Already it seems that the brave new Metaversal world may not be about to materialise in quite the way its ‘imagineers’ were hoping. Only today, Facebook – sorry, Meta – announced swingeing job cuts across the company, a decision undoubtedly informed by the one billion dollars PER MONTH it has recently been spending on developing Metaverse tech.

Over the past three decades, we have, as individuals, as societies and even as a species, learned to adopt, adapt and accommodate the internet in our lives. But the prospect of a life spent primarily in virtual reality seems a bridge too far for many of us. We are not our avatars. We are not inputs into a global algorithm. We do not need to escape meatspace for metaspace.

But it seems some people do want to escape, though perhaps not into a corporate vision of virtual reality. After all, movies like The Matrix have warned the public to be wary of dreamscapes, especially when those dreams are programmed by others. Instead, they escape into their own dreams, where the complexity of reality can be argued away, in all its nuances and seeming contradictions, by the simple assertion that they have migrated between universes.

The growth of a subculture of people who appear to believe that they can traverse between universes is a particularly fantastikal form of failing to deal with how complex the world has become. It’s clearly not as nefarious as the various conspiracy theories circulating online, but of course any movement attracts shysters and wannabe leaders, in search of money or influence, and hence there are now people offering to teach others how to move between universes.

In one sense this is no less valid than teaching people to chant mantras, say the rosary or engage in any other religious practice that is more metaphorical than metaphysical. But one of the catalysing aspects of online culture is the ease with which people can find the like-minded. Hence conspiracy theorists can find communities where their toxic ideas are cultivated, while multiversers can source validation and endorsement from others who seek to explain the anomalies of their memories or the complexities of reality in the same way.

There are no doubt complex reasons to explain why so many people are subject to psychological phenomena like the Mandela Effect, but these explanations do not include watching YouTube videos on how to meditate your way into another universe while in the shower.

Both the multiverse and the metaverse offer simplistic and ultimately unsuitable resolutions to the ever-growing complexity of modern existence. Fundamentally, these escapist dreamscapes are coping mechanisms for dealing with this complexity.

The world is already too complex for any individual mind to comprehend, and probably too complex for even artificial intelligences to ever understand. But we can’t, or at least shouldn’t, escape it. Instead, we should try to understand it, and the best way to do that is to escape not from the world but from our online echo chambers.

If we can learn again to speak to one another, identify areas of agreement and try to find ways to foster collaboration despite disagreement, we stand a much better chance of improving our own collective futures.

At Sapienship, we believe everyone has a story to tell, and that all those stories add up to the story of us. We think everyone needs to be heard, debated and engaged with. It’s not easy, but it’s clearly the best way to resolve the major issues that face us, our planet and our reality.

We don’t need to hide in virtual realities or imagine alternative universes when the one we have is so rich with possibility and potential. Instead we need to come together to realise our hopes.

Have we been seduced by Dyst-hope-ia?

I am a scholar of dystopia – a dystopian, if you will – an aficionado and connoisseur of the literary and artistic genre in its myriad forms and nightmares.

I consider dystopian thinking to be an evolution, or sometimes an extrapolation, from the precautionary principle, which warns against change for the sake of change. Dystopia is a form of negative imagining, an attempt to envision and render in realistic terms a truly ‘negative place’, the etymological meaning of the term.

In this sense, I find dystopian thinking to be significantly more culturally useful than utopian thinking, which to a large extent has been reduced to a singular political ideology derived from a Marxist strain of post 1960s counterculture.

Whereas utopian thinking has devolved into activist academic attempts to plot routes towards one particular ‘positive place’ future, dystopian thinking has remained broader in its purview. After all, there are many nightmares.

If there is a structural flaw common to both modes of art and thinking, it is that in practice they generally extrapolate forward to complete visions: the totalising utopia or dystopia. Rarely if ever do we see depicted the many incremental stages between the world as we know it and the heavenly or nightmarish future world.

Where utopian thinkers in particular have addressed the explicit or implicit developments towards utopia or dystopia, they have, to my mind, missed the point somewhat. The terms ‘critical utopia’ and ‘critical dystopia’ emerged some four decades ago to describe incomplete elements of depicted utopias and dystopias. Thus these key depictions of complexity, nuance and evolution in such literature and art (and philosophy) were reduced to anomalies which could either be countered (in the case of ‘critical utopias’) or fostered (in the case of ‘critical dystopias’).

This was an innovative way of looking at things then, but it was always reductive, and ideologically driven, and at this point its limitations are becoming quite obvious. Actual examination of how society develops towards utopia or dystopia tends to be quite thin on the ground, despite examples existing all around us.

The exception, if there is one, is the regularly bruited risk of a return to 1930s-style fascist governance in current democratic societies. The election of leaders with authoritarian populist rhetoric, be they Trump, Orban or Meloni, is now routinely accompanied by dire extrapolations (and often incomplete historical parallels) which overtly suggest that a slippery slope to neo-Nazi rule is already well underway.

But dystopia, as I said, takes a myriad of forms, and each form evolves and devolves in different ways and at different rates in different cultural and historical circumstances. As a dystopian thinker, I try to look for patterns, for trends which suggest dystopian vectors in society: ways in which society is moving towards a less civilised state of being for most people.

In this way, many dystopian developments seem to pass under the radar. In fact, very often when they do occur, they are depicted as the opposite of what they are. They are reported as beacons of hope, anomalies which ‘critical utopias’ habitually accommodate in their positivist post-Enlightenment narrative of progress ratcheting ever forwards.

These instances are a little like ‘magic eye’ pictures, which were popular a generation back. Once you see it, you can’t unsee it, as they say. I refer to them as examples of dyst-hope-ia, as they are fundamentally dystopian developments, though usually incremental rather than totalising, swathed in a good-news suit of hope to make the bitter pill go down more easily.

In this way, a ratcheting towards a more dystopian society occurs in an almost Huxleyan sense, with the passive acceptance and approval of a population actively encouraged to associate such instances with hope rather than its opposite.

This is a little difficult to explain in the abstract, so let me offer some concrete examples. Many years ago, I noticed a large building being erected in my district in Dublin. Over many months the grand edifice came together. I didn’t pass it often, so I didn’t know what the building was intended to be, until one day I read in the local newspaper that it was due to open the following week. It was a new unemployment welfare office.

The local paper depicted this as a good thing. It was reported as a net good that the unemployed of the area now had a bigger, better dedicated office to deal with them efficiently. But beneath this patina of hope, one swiftly discerns that the expenditure of millions of euro on such a building is a commitment to continued unemployment in the area.

It is in fact an admission of failure – the failure to regenerate the area, or to provide employment for its inhabitants. At the time of its opening I wrote in my journalist’s notebook, “who approved this investment in indolence?” (I used a lot more alliteration in those days.)

Another example comes in today’s news from Britain, which in recent times can be relied upon as a stable and consistent source of such examples. The emergence of a charitable social phenomenon called ‘warm banks’ (though the term is never used) is a classic case of dyst-hope-ia.

What is a ‘warm bank’? Based on the similar concept of food banks, a warm bank is a public charitable space where people who cannot afford to heat their homes may go to stay warm during opening hours. Bloomberg is one of many outlets to report approvingly on the concept.

The welcoming warm bank, depicted as a jolly public community space – image courtesy of Getty Images.

Surely the hopeful depiction is legitimate? After all, the idea of a community rallying around to offer protection and support to its most vulnerable is supremely positive and human. This is the hope in dyst-hope-ia: the positive cloak in which the nightmare clothes itself, the sheep’s clothing on the dystopian wolf.

Because beneath this surface reaction lies the initial action causing the need for such support: the vast and rapidly escalating food and fuel costs which have left many vulnerable people in Britain with a choice between eating and heating.

And as with food banks before them, warm banks will function not only as a precarious safety net for the vulnerable, but also as a creeping normalisation of a more dystopian society, one in which it is taken for granted that people cannot afford to feed themselves or heat their homes.

What dystopian thinking teaches me is not to dismiss this patina of hope cynically, nor to be seduced into thinking of the overall scenario as a positive development either. It allows me instead to see through the sheep’s clothing to the wolf beneath.

I suggest always lifting the surface of the good news story to check what might be smuggled into normality underneath. I admire the efforts of each and every person who contributes their time or money to keeping their community warm. But I refuse to allow that kind-heartedness to obscure the fact that the government is attempting to normalise the concept of citizens who cannot heat their own homes.

Are we Sleepwalking into Slavery?

Usually, hardcore technophiles get hurt in the pocket. I still recall people spending £800 on VHS video recorders (about £3,900 in today’s money), only for prices to fall to a fraction of that soon afterwards. Likewise with early laptops and cellphones.

Cutting edge technology c. 1980.

What’s concerning about AI’s earliest adopters is both their blasé attitudes to its many flaws and weaknesses, and their insistence on foisting AI-driven “solutions” upon the rest of us.

Which brings us to the Synthetic Party, a Danish political party fronted by an AI chatbot. On paper, no doubt, it sounds great: remove those problematic humans from decision-making. But politics takes place in the very real world of human society, not on paper or in bits and bytes.

This scenario – an AI actually coming to power – was workshopped at the Athens Democracy Forum by a very interesting organisation called Apolitical. Our collective conclusion was clear: AI isn’t ready to rule, and perhaps never will be.

Even if the worst the advent of AI could do were to punish enthusiasts financially, as earlier technologies punished their early adopters, I’d still have issues with it. AI needs to be fed data in order to learn, and that data is your and my personal information, whether gathered legitimately with full consent or not.

However, AI could have ramifications far beyond our worst current nightmares. As always, we dream negatively in Orwellian terms, fearing technology will turn on us like Frankenstein’s monster or the Terminator, when history suggests that dystopia more often manifests in Huxleyan terms.

We are sleepwalking into this, and judging by these Danish early adopters, we will happily embrace our own slavery. It would be much preferable if the cost of AI was merely financial. But the ramifications are likely to be much more impactful.

Already in many democracies, a large proportion of the electorate simply don’t engage. And when they do, a growing proportion are voting for parties with extreme ideologies. On our current vector, we could easily end up volunteering for our own obsolescence.

What the Synthetic Party promises is true technocracy – rule by machines and algorithms rather than by the unelected officials we currently associate with the term. As always, be careful what you wish for.

Nobel Pursuits

Already it’s October, when the leaves turn red and fall from the trees, the nights grow longer and the days colder, and the Nobel prizes are awarded.

The Nobel committee for literature does tend to go leftfield when possible. One is therefore required to read into its decisions, a little like ancient haruspices reading the entrails of chickens, or 20th-century Kremlinologists interpreting the gnomic actions of the politburo.

How then should we read the decision to anoint the sparse, harsh and uncompromising pseudo-autobiographical work of Annie Ernaux?

To me it seems like a commentary upon Michel Houellebecq and Karl Ove Knausgård. All three are known for writing their big books of me, but perhaps the men are better known internationally than Mme Ernaux. Equally, both Houellebecq and Knausgård have been heavily criticised, among other things, for their misogyny. Awarding Ernaux seems to me a reaction to their popularity, and to the fact that both have previously been tipped for this prize. Your mileage may vary.

(Full disclosure: I’ve never read Knausgård or Ernaux and have at best a passing familiarity with Houellebecq, who I found to be a very rude interviewee at the Dublin Impac Award in a previous millennium.)

Also elevated to laureate this year was Svante Pääbo, the man who proved that ancient hominid species such as Neanderthals did not entirely die out but in fact persist to this day within non-African human genomes. In fact, I likely owe some Neanderthal ancestor the gene which oversees my melanocortin-1 receptor proteins, which gave me my once russet beard.

What’s intriguing personally for me about this year’s Nobels for medicine and literature isn’t that I’d not previously heard of the literature recipient, nor that I had previously heard of the medicine recipient, but the fact that both these things occurred in the same year. I guess my interests have shifted over the decades away from solely literary pursuits, and towards scientific interests, especially in early hominids. This year’s prizes have brought that home to me, and congratulations to the winners.

I’ve long criticised the Nobel Prize for Peace, because the committee of the Norwegian parliament which awards it has a knack for choosing inappropriate recipients. Hello Henry Kissinger, Aung San Suu Kyi, Barack Obama, UN “peace-keeping” forces, etc.

Nevertheless, I’d argue they got it right this year. The 2022 Nobel Peace Prize has been awarded to human rights advocate Ales Bialiatski from Belarus, the Russian human rights organisation Memorial and the Ukrainian human rights organisation Center for Civil Liberties. Congratulations to them too.

POST-SCRIPT: The newest Nobel physics laureates have also been announced, and their award is for proving that reality, as we currently understand it, is not real in the ways we think it is. Not among the laureates, though clearly the forefather of all this research (which aimed to test his hypotheses), is my compatriot John Stewart Bell, who alas died in 1990 while the experiments proving him correct were still in progress.

John Stewart Bell

Congratulations to Alain Aspect, John F. Clauser and Anton Zeilinger for proving once again that the universe is not only stranger than we think, but most likely as Heisenberg noted, stranger than we can think.