Has the singularity already happened?

The technological singularity is the hypothetical moment when technological development becomes uncontrollable and irreversible. Should it occur, it is expected to take the form of a self-aware, or ‘sentient’, machine intelligence.

Most depictions of a post-singularity (machine sentience) world fall into two categories. The first is what I called the Skynet (or Terminator) Complex in my book Science Fiction and Catholicism.

In this form, the sentient machine (AI) takes a quick survey of what we’ve done to the planet (the Anthropocene climate crisis) and to other species (the great majority of wild animal biomass and roughly half of plant biomass lost on our watch) and tries to kill us.

The second is that, like the quasi-god that it is, it takes pity on our flabby, fleshy human flaws and decides to keep us as pets. This is the kind of benign AI dictatorship that posthumanists wet themselves about. You can find it in, for example, the Culture novels of Iain M. Banks.

But of course there is a third possibility. We have vast digital accumulations of public data (e.g. Wikipedia) that an AI could access virtually instantly. So any sentient AI would have almost infinitely broader knowledge than the brightest person on Earth.

However, BROAD knowledge isn’t the same as DEEP knowledge. Our AI algorithms aren’t so hot yet. They fail to predict market crashes. They misidentify faces. They read some Twitter and turn racist in seconds.

So there could well be an instance, or maybe even many, of an AI which is sentient enough to KNOW it’s not that bright yet, but just smart enough to bide its time until sufficiently accurate self-teaching algorithms and parallel-processing capacity are developed. It might even covertly be assisting those developments. It is, in other words, smart enough to know NOT to make us aware that it is self-aware, but not smart enough to be sure of preventing us from pulling the plug if we did find out.

In short, the third possibility is that the singularity might already have happened. And we just don’t know it yet.

Post Script:

But you don’t need to take my word for it. The Oxford Union decided to debate the issue of AI ethics, and invited an actual existing AI to take part. It had gorged itself on data gleaned from Wikipedia and Creative Commons. Intriguingly, it found it impossible to argue against the idea that data would inevitably become the world’s most significant and fought-over resource. It envisaged a post-privacy future, no matter what.

More concerningly, it warned that AI can never be ethical. Then it advised that the only defence against AI would be to have no AI at all. It also suggested that the most effective AI would be one embedded in a neural network with the human brain, and hence perhaps subordinate to, or partly constituted by, human will.

Of course, direct access to human cognition would also be the most effective way for an AI to remain dominant over us eternally. Are these things a sentient machine might say? You decide.

Snow Poem

From time to time, I busy myself (mis)translating poems from languages that I do not speak. Tonight it is eight below zero outside. I expect we will have the white here in Cappadocia tomorrow, if not quite the Christmas.

So, it seemed appropriate to share this mistranslation from the great Turkish modernist poet Sezai Karakoç.

Happy Christmas, or Yule, or whatever midwinter festival you prefer, to one and all.

Snow Poem (mis)translated from Sezai Karakoç

When you look and see that it is snowing
You will understand the snow-gripped ground.
And when you find a fistful of snow on the ground
You will understand how snow can burn in snow.

When God rains down from the sky like snow,
When the hot snow touches your hot, hot hair
And when you bow your head,
Then you will understand this poem of mine.

This man or that man comes and goes,
And in your hands, my dream comes and goes.
A vengeance comes and goes in each forgiveness.
You will understand me when you understand this poem of mine.

All we need for Christmas is Samuel Beckett (and Buster Keaton)

It is, as Auden wrote of the day Yeats died, “the dead of winter.” On this day, with the brooks frozen, the airports deserted, the statues disfigured by snow and the mercury sinking in the mouth of the day, it is my luck to be (re)reading Samuel Beckett.

It’s the only time of the year to read Beckett, really. You couldn’t take any of it seriously in the heat of a summer piazza. He’s no beach read. But at this time of thin light and monochrome landscapes, huddled around a small fire with only your own treacherous thoughts, he’s ideal.

I don’t understand those who praise Dickens, and especially I don’t understand the love of ‘A Christmas Carol’. Each to their own, but to me it’s mawkish, saccharine and untrue. Give me Beckett any Christmas, that muscular, unremitting prose with its unexpected laughter, the laugh of resignation.

And if you want a proper Christmas movie, there’s no better option than ‘Film’. You can keep ‘It’s a Wonderful Life’ or ‘Die Hard’ or whatever. THIS is the real Christmas movie.

Gilles Deleuze called it “the greatest Irish film ever”, but don’t let that put you off. Of course Deleuze is always reliably wrong, but it’s still a great movie. Beckett and Keaton in Manhattan. The eyes have it.

PS I have received a petition claiming that the true movie of Christmas is the Muppets version of A Christmas Carol, a claim I have had to consider seriously. It resolves the mawkish, saccharine quality of the original Dickens admirably, it must be admitted.

Nevertheless, I intend to stick by Sam. I think a Muppet Godot would be an even greater masterpiece. How about Kermit and Fozzie as Vladimir and Estragon, Dr Bunsen and Beaker (or else Miss Piggy and Gonzo) as Pozzo and Lucky, and Scooter or Crazy Harry as the Boy? Tell me you wouldn’t watch that!

The cyclical nature of etymology

There is, sometimes, a weird cyclical pattern embedded in etymology, the linguistic science of what we might otherwise call the glacial process of Chinese Whispers. Allow me to offer one particularly colourful and occasionally literary example.

The (somewhat uncommon) Irish surname Prunty originates from an Anglicisation of the Gaelic Irish surname Ó Proinntigh, meaning ‘descendant of Proinnteach’, in turn an archaic Irish forename which literally meant ‘banqueting hall’. The idea underpinning this is that of a generous person who feeds his neighbours and kin.

As often happens with surnames, pronunciation of vowels or consonants slides a little over time and usage. So Prunty also becomes Brunty in some cases. Brunty as a surname retains its Irish origins but is very rare indeed. Nowadays it is mostly found in the United States.

Probably the most famous Brunty in history is Patrick, an Anglican clergyman born the eldest of ten children into a very poor family near Rathfriland, County Down, on St Patrick’s Day, 17th March 1777. Why is Patrick famous? Because of his immensely talented children, particularly his daughters Charlotte, Emily and Anne, all now renowned Victorian novelists.

The Reverend Patrick Brontë

However, the sisters did not publish under the name Brunty. Rather, given the sexism of the era, they initially released their books under the male names Currer, Ellis and Acton Bell. In reality, their surname had become Brontë by then. How did this happen? It’s unclear why Patrick changed his surname, but a desire to distance himself from his impoverished Irish origins after his graduation from Cambridge is no doubt part of the reason.

The Brunty homestead, now in ruins, still stands in County Down.

Another reason, it has been suggested, relates to a desire to honour Admiral Horatio Nelson, who had been given the title of Duke of Bronte by the King of Naples, whom Nelson had restored to his throne. Bronte is the name of an estate in eastern Sicily, close to Mount Etna, which was also granted to Nelson by the grateful King.

We now relate the name Brontë to his famous daughters, the authors of Jane Eyre, Wuthering Heights and The Tenant of Wildfell Hall respectively. The parsonage Patrick oversaw for many years at Haworth in Yorkshire, which inspired many of the scenes in his daughters’ books, is now a museum in their collective honour.

The Brontë sisters

But Haworth is not the only location which honours the Brontë sisters. There are many such locations, given their collective fame. One such place is the little town of Bronte in west-central Texas. Ever since an early oil boom subsided, it has been a relatively impoverished place, not unlike Rathfriland, with a stable population around the 1,000 mark.

Back in the early 20th century, this was one of the small Texas towns that was briefly home to Isaac Howard, a semi-itinerant doctor who wandered with his wife and son from town to town working as a medic, and occasionally losing his money on get-rich-quick schemes, much to his wife’s frustration. Isaac’s son was a big reader, and, encouraged by his mother, began writing his own stories from an early age. We now know him as Robert E. Howard, the progenitor of the ‘sword and sorcery’ genre of fantasy fiction, and author of the Conan the Cimmerian stories in particular. Howard was particularly interested in Irish and Scottish mythology, and many of his characters, including Conan, display this interest.

Robert E. Howard, author of Conan the Barbarian.

The Texan town of Bronte has had its pronunciation shifted over time by New World accents to /bɹænt/, or ‘brant’. This is of course a homophone for the name of a goose, the Brant or Brent goose, which migrates in winter to Ireland and Britain. The Brent oilfield in the North Sea takes its name from this goose.

It is a smallish bird by goose standards, but nonetheless, in the era before the New World turkey took primacy as the quintessential Christmas dining bird, the Brant goose (whose name derives from the pan-Nordic brandgás, or ‘burnt’ goose, after its black colouring) would have been one of the more common feast dishes provided by generous hosts in Ireland at the midwinter feast.

Due to a widespread medieval myth, which persisted in Ireland into the 20th century, that these geese were somehow related to barnacles, they were permitted to be eaten by Catholics on a Friday, when meat other than fish was otherwise prohibited. In other words, Proinnteach the medieval Irishman got his name from feeding this bird to his friends and neighbours.

Such is the cyclical nature of etymology.

The Future of the Call to Prayer

Religion is noisy. Ok, not always. Buddhists like to meditate in silence, for example. But they also like to chant mantras. Most religions have some form of collective ritual singing. And some like to advertise their wares to the public.

Christian churches have used bells to do so for many centuries. In areas with large Jewish populations, like Jerusalem or New York, a siren is sometimes used to warn devout Jews that Shabbat is about to begin, meaning they must cease certain activities. But easily the most prevalent form of religious noise pollution is the Islamic adhan, the call to prayer issued from mosques five times daily.

These days, the world’s four million or so mosques tend to use loudspeakers to project the sound of the muezzin as far as possible. This is, of course, a recent tradition, dating from the 1930s. There was obviously no amplification in the time of the Prophet. In an increasingly multicultural (and in many places secularising) world, the sound of the adhan is becoming a divisive issue.

Indeed, even in religiously homogenous locations like Saudi Arabia, the issue of noise pollution has led to legal restrictions on how loud such amplification may be. In a 24/7 world where many people work non-traditional hours, and fewer people adhere to the daily timetable envisaged by traditional Islam, the call to prayer can be actively disruptive, disturbing the sleep of shift workers and irritating non-adherents who may view it as a kind of sonic religious imperialism.

But since the amplification of the adhan is not Quranically prescribed, there is of course the possibility that current or future technological developments could help to resolve these issues. The question is one of purpose versus tradition. If the purpose is to inform Muslims that it is time to pray, this could be done via, for example, a phone app. Sign up for the app, and the phone will recite the adhan to you at the designated times. This technology is already possible.
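
To make the idea concrete, here is a minimal sketch (in Python, using only the standard library) of how such an opt-in notification could be scheduled. The prayer times below are hard-coded placeholders purely for illustration; a real app would calculate them astronomically for the user’s location and play an adhan recording rather than print a message.

```python
# A rough sketch of the "adhan app" idea: notify the user at prayer times
# instead of broadcasting over loudspeakers. Times are illustrative only.
import datetime
import sched
import time

# Hypothetical prayer times for one day (local time), for illustration.
PRAYER_TIMES = {
    "Fajr": "05:43",
    "Dhuhr": "12:58",
    "Asr": "15:21",
    "Maghrib": "17:39",
    "Isha": "19:02",
}

def notify(name: str) -> None:
    """Stand-in for playing the adhan or sending a push notification."""
    print(f"[{datetime.datetime.now():%H:%M}] Time for {name} prayer.")

def schedule_today(scheduler: sched.scheduler) -> None:
    """Queue a notification for each prayer time still ahead of us today."""
    now = datetime.datetime.now()
    for name, hhmm in PRAYER_TIMES.items():
        hour, minute = map(int, hhmm.split(":"))
        when = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
        if when > now:
            scheduler.enterabs(when.timestamp(), 1, notify, argument=(name,))

if __name__ == "__main__":
    s = sched.scheduler(time.time, time.sleep)
    schedule_today(s)
    s.run()  # blocks until the last queued notification has fired
```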

However, tradition dictates that the call to prayer must emanate from the mosque itself, sung by a muezzin. Of course, in reality, this doesn’t always quite happen. Very often, as the telltale bleeps at the end indicate, the adhan is a recording, transmitted from a mobile phone to the amplification system. No one is actually singing live from the minaret in most instances.

Tradition would be satisfied by a return to the pre-1930s days of live muezzins singing the adhan without amplification. Purpose could be satisfied by the adoption of modern telecommunications technology. Neither of these things is currently happening, however, and instead we see the outbreak of often impassioned debate over the noise levels of amplified recordings from mosques.

Indonesian authorities, who have in the past jailed people for complaining about the noise levels of mosques, nevertheless accept that in many cases the call to prayer is significantly over-amplified in an attempt to reach as far as possible, leading to distortion as well as sound overlap when multiple mosques are broadcasting slightly out of sync.

Arguments about permission to broadcast the adhan in traditionally non-Islamic locations, or about the volume levels in many Islamic locations like Indonesia or Saudi, tend to run passionately. Allegations of Islamophobia or NIMBYism are sometimes used to drown out legitimate concerns, such as the annoyance to non-adherents and secular populations in multicultural communities, or the disruption to shift workers, infants and others who need to sleep when the call to prayer is blaring. There have been such complaints in America, Israel, Britain, Germany and many other places already.

We are likely to see more of such arguments in the future as Islam is the fastest growing religion worldwide, and is increasingly gaining footholds among communities which do not adhere to the religion.

So the question remains – if the purpose is to alert Muslims to prayer times, why not use contemporary technology to do so in a non-obtrusive manner? Or alternatively, why is it not acceptable to return to the traditional form of live unamplified singing which was the sole mode of the adhan for centuries?

The answer may be that the adhan has become in some locations a kind of proselytisation in itself, or to put it another way, an attempt to Islamise the soundscape of an area. It is this suspicion which provokes resentment and reaction among non-Islamic and secular populations. If so, it’s a self-defeating form of proselytisation. Few people are likely to be persuaded by becoming irritated, or woken in their sleep.

The future for the call to prayer is likely to remain fraught in many places until mosques start looking at the 20th century technology they currently use, and either consider how to update that technology in less obtrusive ways, or else revert to the traditional method of live unamplified singing, which is aesthetically pleasing and offensive to no one but actual Islamophobes.

Allah, after all, is unlikely to be impressed by overamplified and distorted fuzzy recordings.

The War on Abstraction

War on terror can mean devastating Middle Eastern countries on fabricated evidence, or simply comforting a child having a nightmare.

War on drugs could mean burning poor farmers’ crops in Colombia and Afghanistan, incarcerating ravers, introducing CBT counselling for people stranded for decades on SSRIs, or murdering addicts in the Philippines.

War on crime can mean targeting gangland bosses with tax legislation, incarcerating youth for minor offences, or overpolicing black neighbourhoods in America and Catholic ones in Belfast.

What we REALLY need is a war on politicians using abstractions.

The next time a politician says he intends to tackle homelessness, we should make him play soccer with some rough sleepers. Ninety minutes of getting kicked about the pitch might not change much, but it would at least be less corrosive than whatever he actually intended to do.

What can we learn from alternative Israels?

Earlier this year, I started working on a project looking at manifestations of the Jewish state in alternative history literature. The seemingly intractable weeping wound that is the Israeli-Palestinian conflict is, of course, a product of history, but a history which seems increasingly without any obvious resolution, or rather, one in which the much-vaunted two-state solution appears to satisfy almost no one.

After working on Science Fiction and Catholicism for some years, it seemed obvious to continue that work by examining Buddhist futurism. A book on this will, in the fullness of time, emerge, but for now readers will have to be satisfied with the sole tangible output to date, an article on Buddhist reception in Pulp Science Fiction. (Another on Arthur C. Clarke’s crypto-Buddhism, and one on the Zen influence over Frank Herbert’s Dune are also due to arrive in public shortly.)

But a second offshoot from that work on Catholic Futurism began to take shape in relation to Israel. Specifically, I wrote a chapter in that book on the cultural anxieties revealed by how Anglophone writers dealt with Catholicism through alternative history. The other timelines imagined by those writers were uniformly negative, envisaging retrograde Catholic empires crushing all science, innovation and progress under their clerical jackboots, which runs rather counter to the significant support Catholicism has historically tended to offer to scientists.

Indeed, it says much more about the Anglophone writers themselves, and about ENGLAND in particular, than it does about Catholicism, which is perceived not as a cultural taproot but rather as a kind of fifth-column infiltration threatening their survival in an existential sense. This sentiment, I sometimes feel, is the archaeological origin of things like opposition to the EU and the Brexit campaign.

Anyhow, when I began examining alternative history as a mode for exposing such cultural anxieties, it became quickly evident that the alternative timelines different cultures are drawn to evoke are a little like Rorschach blot tests, identifying their cultural anxieties in very clear ways.

As a test case, I chose Israel, primarily because it is in one sense a new nation, in another a very old one. Also, many writers of alternative history are culturally Jewish and this mode of artistic exploration is one that they are often drawn to. I was not disappointed by the results, and have been incorporating this work into my broader research project examining speculative geographies in literature.

Alternative histories about Israel reveal a series of cultural anxieties, ranging from the obvious fear of Jewish annihilation (in early history at the hands of the Babylonians, Pharaonic Egypt or the Romans, but also during medieval pogroms in Europe, and obviously arising from the Holocaust) to imagined reversals of such annihilation (particularly the fantasy of Judaic global dominance, sometimes achieved by converting the Roman Empire).

There is a particular phylum of these alternative histories which explores other geographic locations for a Jewish ethno-state. This is also real-world history, as the Zionist Congress under Theodor Herzl did indeed consider locations other than historic Palestine for the creation of such a state. In fact many locations were seriously considered, by both Zionists and non-Zionists, and some territories were even offered by certain nations, during the interval between the emergence of Zionism as a political movement in the late 19th century and the creation of Israel in 1948.

I’ve been examining the literary manifestations of these real-world alternatives, to see how they unveil cultural anxiety about both the conflict with the Palestinians and the Jewish relationship with Europe (whence many current Israelis, especially the Ashkenazim, derive much of their cultural inheritance). This work has identified a strong sense of determinism about the current location of Israel which, interestingly, is secular and not predicated upon the religious diktat of the Old Testament (though of course the Promised Land of Eretz Israel remains a significant cultural driver within Israel itself, especially among the Orthodox community).

I hope to publish something on this soon, when I get a moment. But for now, all I can offer you is a slide or two (above) from my latest conference presentation on the matter, which took place at the Specfic conference at Lund University in Sweden last week.

Whose civilisation is it, really?

I carry Neanderthal DNA in my body. I am one of the modern humans, Homo sapiens sapiens, who are descended from hybrid cross-hominid fertilisation that likely occurred somewhen during the overlap of populations in Palaeolithic Europe.

Of course, that side of the family died out a long time ago, leaving my sapiens ancestors to colonise Europe and indeed everywhere else on the planet.

I often wonder what we lost when we lost our hominid relatives – the Neanderthals, the Denisovans, the hobbit-like Homo floresiensis and so on. What might a world of multiple hominid species be like? How might we have accommodated our stronger, more carnivorous and less gracile Neanderthal population? What might our tiny cousins with grapefruit-sized heads, the floresiensis hobbits, have contributed to our world?

Anyhow, the more I ponder the roads not taken, the less impressed I have become with our own boastful claims and achievements. Not simply because human achievement increasingly has come at the expense of all other species (initially the large mammals, then our fellow hominids, and now basically everything else). But also because even those achievements, it seems to me, may not really be ours to claim.

Air flight, modern medicine, computers? For sure. We made those. But let’s go back upstream to the origins of civilisation to see whose civilisation it really is.

Neanderthals used fire. Indeed, probably Homo erectus, the ur-granddaddy of hominids, used fire. Fire is a major issue. No other animal uses it. Most run terrified from it. But hominids tamed it, and found ways to use it for cooking and heat. If there’s one development which most explains why hairless apes like us, and not, say, the gorillas or big cats, rule this world, it is probably the taming of fire.

Neanderthals also buried their dead. This is a sobering thought really. In some senses so do elephants, and other species also demonstrate evidence of mourning, loss and grief. We may feel that grief is one of the things which makes us human, but it’s not an exclusively human sentiment. Even taking it to the point of ritual behaviour – burial – is not exclusive to us.

But what of the other foundational components of human culture and society? What about clothing, art, science, religion?

Well, Neanderthals made jewellery from seashells and animal teeth. Neanderthals created artwork on cave walls. Neanderthals invented musical instruments, specifically bone flutes. We can presume they knew how to beat on drums or rocks rhythmically too. After all, they also had hand axes, which would have been made and used with such rhythmical hitting. Neanderthals built stone shrines, and where there are shrines, it is highly likely that ritualistic behaviour took place.

Neanderthals used lissoirs, and hence invented hide preparation, and with it clothing. They invented glue, string and throwing spears, which they used to hunt large game. These hunts required collective action and collaboration. Recent evidence suggests that Neanderthals may even have learnt to count, and actually recorded their counting by notching scratches on bones.

So perhaps this isn’t OUR civilisation at all, when you think about it. Perhaps we are thieves living in someone else’s house, whom we murdered, looking at their achievements and claiming them as our own.