The British currency, the pound sterling, takes its name from the fact that, when it was first issued, it was redeemable for a pound of silver. That was sometime in the late 8th century, in the Anglo-Saxon period.
If we do the maths, based on today’s silver spot price, that means the pound today is worth approximately 1/210th of what it was worth nearly 13 centuries ago. By contrast, the French managed to devalue their currency by more in just 18 months during the early 1790s, as did Germany in less than a year during the Weimar period.
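For the curious, here is a rough sanity check of that 1/210th figure. The weight of the historical pound and the silver price are illustrative assumptions on my part (a tower pound of roughly 350 grams, and a spot price of about £0.60 per gram, which fluctuates daily), not figures from any authority:

```python
# Back-of-the-envelope check of the "1/210th" figure.
# Assumptions (illustrative only):
#   - the Anglo-Saxon pound was a tower pound of ~350 g of silver
#   - a silver spot price of ~GBP 0.60 per gram

TOWER_POUND_GRAMS = 350.0
SILVER_GBP_PER_GRAM = 0.60

# What a pound-weight of silver would cost in today's pounds sterling
value_of_a_pound_of_silver = TOWER_POUND_GRAMS * SILVER_GBP_PER_GRAM

print(f"A pound of silver today: ~GBP {value_of_a_pound_of_silver:.0f}")
print(f"So GBP 1 buys ~1/{value_of_a_pound_of_silver:.0f} of its original silver")
```

Change the spot price and the ratio moves with it, but the order of magnitude is robust.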
The worst affected ever were the poor Hungarians in the immediate post-war period of 1945–46. They suffered that level of devaluation in under six days at peak. Armenia, Zimbabwe and Argentina have experienced similar horrors.
Why do I mention this? Because it still happens today. Last semester, in Turkey, I saw my wages collapse by more than half in two months. My colleagues there are still living through this. They suffer daily price hikes in fuel and food costs, with static wages. The Turkish people, like the Armenians, Zimbabweans, Argentinians, or the Hungarians, Germans and French of former times, have done nothing wrong. But they were the ones to suffer.
Hyperinflation is caused by only one thing – shitty governments implementing shitty policies. It destroys savings, commerce, and most importantly, lives. We don’t always think too much about Turkey in the West, but we should. Here is a country suffering a preposterously stupid government and a massive devaluation of its currency, yet it still accommodates 3.6 MILLION refugees.
It was a salutary lesson for me in macroeconomics, and in human decency, to spend last semester in Turkey. My heart remains with them in their plight, and I hope to see them in better times soon. It is a beautiful nation with a beautiful people who deserve better.
A caveat: I am not, never have been and never will be an economist. But it doesn’t take a Harvard MBA to understand money.
The jumping-off point for this question is the seeming contradiction that the world is becoming more religious, not less, even as we are moving towards an ever more algorithm-led society.
It’s worth pointing out at the outset that this is less of a polarised binary than it may initially seem, of course, for a whole range of reasons. Firstly we can nibble at the roots of both immediate-to-medium-term predictions. What do we mean by ‘more religious’, exactly? Just because many more people in the next few decades will affiliate as Muslim or Catholic does not necessarily mean that the world will be more fundamentalist in its outlook (though that’s clearly possible). They may simply affiliate culturally, cherry-picking among dogmas and behaviours.
There’s not a lot of point in asking why about this, to my mind. Issues like the relative birth rates of religious and non-religious communities probably have a lot to do with it, I suspect. Geography, with its varied sociocultural religious traditions, also plays a significant role, as does the relative population decline (and waning geopolitical and cultural influence) of the West, where atheism and agnosticism have been most prevalent since the fall of the formally atheistic Communist regimes in 1989/90.
We can similarly query the inevitability of the singularity, though there is absolutely no doubt that we are currently in a spiral of increasing datafication of our world, as Douglas Rushkoff persuasively argues in his relatively recent neo-humanist book Team Human. And why is the world becoming so? As Rushkoff and others point out, it is in order to feed the development of Artificial Intelligence, which concomitantly makes us more machinic as a consequence.
So, on the one hand we have a more religious population coming down the track, but on the other, that population will inhabit a world which requires them to be ever more machinic, ever more transhuman, conceived of as data generators and treated ever more machinically by the forces of hypercapitalism.
Let’s say that, as it looks today, both of these trends seem somewhat non-negotiable. Where does that leave us? A dystopian perspective (or a neo-Marxist one) might be that we will enter some kind of situation wherein a religion-doped global majority are easily manipulated and data-harvested by a coldly logical machinic hegemony (which the current global elite seem, with irrational confidence, to feel they will be able to guide to their own ends and enrichment).
“It’s time for your Cyberman upgrade, fleshy human!”
I feel that such a simple filtering into Eloi and Morlocks is unlikely. Primarily this is because I have (an irrational?) confidence that a degree of rationality is likely to intervene to mitigate the very worst excesses of this binary. Unlike Marx, I don’t consider those of religious faith to be drugged morons, for a start. Some (probably a large majority) of our finest thinkers throughout history into the present day have held religious beliefs which in no way prevented them from innovating in science, philosophy, engineering and cultural thought.
Similarly, I believe the current existence and popularity of leading thinkers expressing a firm affiliation with organic humanism (or to put it more accurately, a deeply suspicious antipathy to the alleged utopia of transhumanism) is a strong indication that a movement in defence of organic humanism is coming to the fore of our collective consciousness, perhaps just in time for us to consider the challenges of potentially imminent rule by the algorithms.
Thinkers like Rushkoff, or Yuval Noah Harari, have clearly expressed this concern, and I believe it is implicit in the work of many other futurists, like Nick Bostrom too. If it wasn’t, we would likely not have had the current explosion of interest in issues like AI ethics, which seek to explore how to mitigate the potential risks of machine disaffiliation from humankind, and ensure fairness to all humans who find more of their lives falling under algorithmic control.
But how might we explain this apparent dichotomy, and how might we mitigate it? Steven Pinker’s recent book Rationality: What It Is, Why It Seems Scarce, Why It Matters may offer some assistance.
Pinker summarises rationality as a post-Enlightenment intellectual toolkit featuring “Bayesian reasoning, that is evaluating beliefs in the face of evidence, distinguishing causation and correlation, logic, critical thinking, probability, game theory”, which seems as good a list as any I could think of, but argues that all of these are on the wane in our current society, leading to the rise of a wide range of irrationalities, such as “fake news, quack cures, conspiracy theorizing, post-truth rhetoric, [and] paranormal woo-woo.”
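The first item in Pinker’s toolkit, Bayesian reasoning, is easy to illustrate concretely. The numbers below are invented for the example (a claim given a 1% prior, and evidence ten times more likely if the claim is true than if it is false); they are not from Pinker’s book:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a hypothesis after seeing evidence,
    via Bayes' theorem: P(H|E) = P(E|H)P(H) / P(E)."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# A claim we initially consider 1% likely, and evidence that is
# 10x more probable if the claim is true (0.9) than if false (0.09).
posterior = bayes_update(prior=0.01,
                         p_evidence_if_true=0.9,
                         p_evidence_if_false=0.09)
print(f"posterior: {posterior:.3f}")  # the 1% belief rises to ~9.2%
```

The point of the exercise is Pinker’s: belief should move with evidence, but only in proportion to how diagnostic the evidence actually is.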
If, as Pinker argues, rationality is an efficient method mankind has developed in order to pursue our own (organic and human) goals, such as pleasure, emotion or human relationships, then we can conceive of it in terms divorced from ideology, as method rather than ethos. It’s possible, then, to conceive of, for example, people rationally pursuing ends which may be perceived as irrational, such as religious faith.
Pinker believes that most people function rationally in the spheres of their lives which they personally inhabit – the workplace, day-to-day life, and so on. The irrational, he argues, emerges in spheres we do not personally inhabit, such as the distant past or future, halls of power we cannot access, and metaphysical considerations.
Humans have happily and successfully been able to shift between these two modes for most (if not all) of their existence of course. As he rightly points out, there was no expectation to function solely rationally until well into the Enlightenment period. And indeed, we may add, in many cultural circumstances or locations, there still is no such expectation.
Why does irrationality emerge in these spheres we cannot access? Partly it is because the fact that we cannot directly access them opens up the possibility of non-rational analysis. But also, as Pinker notes, because we are disempowered in such spheres, it is uplifting psychologically to affiliate with uplifting or inspiring “good stories”.
We need not (as Pinker might) disregard this as a human weakness for magical thinking. Harari has pointed out that religion functions as one of the collective stories generated by humanity which facilitated mass collaboration and directly led to much of human civilisation.
But if we were to agree, with Rushkoff and contra the transhumanists and posthumanists, that the correct response to an ever more algorithmic existence is not to adapt ourselves to a machinic future, but instead to bend back our tools to our human control, then how might rationality assist that?
As a mode of logical praxis which is nevertheless embedded in and consistent with humanist ideals, rationality could function well as a bridge between organic human values and the encroachment of machinic and algorithmic logic. The problem, however, is how to interpolate rationality into those spheres which lie open to magical thinking.
It’s clear that the retreat into atomising silos of woo-woo, fake news, conspiracies and nonsense is not a useful or coherent response to the rise of the machines. Spheres like the halls of power must therefore be rendered MORE transparent, MORE accountable to the body of humanity, and cease to be the fiefdoms of billionaires, corporations and their political puppets.
However, obviously this is much harder to apply to issues of metaphysical concern. Even rationality only takes us so far when considering things like the nature of love or the meaning of life, those metaphysical concerns which, though ultimately inaccessible, nevertheless engage most of us from time to time.
But mankind developed religion as a response to this a long time ago, and has continued to utilise, hone and develop religious faith as a communal experience, bonding mechanism and mode of collaboration. And religion has stood the test of time in those regards. Not for all, and certainly not for those post-Enlightenment exclusive rationalists (i.e. agnostics and atheists, a population seemingly destined to play a smaller role in our immediate future, according to current prognoses).
If the positive ramifications of religion can be fostered, in a context of mutual respect, then it seems to me that there is no inherent contradiction or polarisation necessary. Indeed, a kind of Aquinian détente is perfectly possible. Rationality may be our best defence against an algorithmic hegemony, but rationality itself must acknowledge its own limitations of remit.
As long as the advocates of exclusive rationalism continue to view religious adherents (without distinction as to the form of their faiths or the presence or absence of fundamentalism) as their primary enemy and concern, they are in fact fighting the wars of a previous century, even while the bigger threat is posed by the hyperlogical opponent.
We therefore have a third option on the table, beyond the binary of gleeful acquiescence to algorithmic slavery (transhumanism) or a technophobic and Luddite-like retreat into woo-woo (which is equally no defence against machinic hegemony). An accommodating rationality, operating as it always did in the spheres we do inhabit, has the potential to navigate this tricky Scylla and Charybdis.
To paraphrase someone who was not without rationality, we could usefully render unto rationality that which is open to rationality, and render unto God (of whatever flavour) that which is for now only open to God.
But we do need to open up some spheres to rationality which currently are not open to most of humanity – the power structures, the wealth imbalances, the blind gallop into faith in the algorithm. Because, pace the posthumanist faith in a benign singularity, there’s no guarantee that machinic merger or domination will preserve us, and even if it does, it will not conserve us as we know ourselves today.
A few people have asked me if I’d seen Branagh’s sepia-tinged movie about Belfast. I haven’t. I also don’t intend to. I’m sure it’s great, but it’s not for me.
Branagh on set.
I grew up literally one street away, the other side of a fence we euphemistically call a peace line. That fence is there today. It wasn’t there in the Seventies.
“Peace” line in North Belfast.
In this Google Maps image you can see where my house was (those ones are new). You can also see KAT, standing for ‘Kill All Taigs’ (Taigs being a slur for Catholics), written on the wall. That’s today, nearly three decades into a peace process. If you can’t imagine what it was like at the height of a civil war, there’s plenty of archival news footage available.
I expect Ken would have made a very different movie had he grown up in the city at that time, as I did. Actually, I expect he’d not be making movies at all. So no, I haven’t seen it and won’t see it. It’s not something I care to revisit, in Ken’s sepia tones or in any other format.
It’s not a tribal thing. I’m proud of Ken and always have been. I’ve loved his work since the ‘Billy’ plays. But Ken’s Belfast and mine, though they almost overlap, are hugely different. When the civil war euphemistically known as the ‘Troubles’ erupted in 1969, Ken’s family quite sensibly emigrated.
What they left behind, and what my family moved into (after being threatened out of their home in a different part of town), was a North Belfast that quickly became a patchwork quilt of paramilitary loyalties, rival tribalisms, brute violence and war.
I really admire Branagh for never shying away from his origins, and also for the sensitivity he has always brought to the topic. But, to use a word in today’s parlance, I find this somewhat triggering. More pertinently, I’m not the intended audience for this.
I have a guilty confession to make. I like tabloids. I used to write for them, quite a few of them in fact. I know a lot of people consider them to be low-rent, inaccurate, trashy or otherwise less than praiseworthy, but I’ve always thought they had a certain irreverent joie-de-vivre.
It’s quite difficult to write news for tabloids actually. Plenty of tabloid journalists have taken jobs with more ‘respectable’ (and poorer-selling) publications, but you rarely see anyone move in the opposite direction. Why? Because it’s actually a lot easier to write 1,200 words of polysyllabic prose about a complex set of incidents than it is to summarise things in a succinct and pithy 400 words that a 12-year-old could comprehend.
Anyhow, of course tabloids can also be egregious. Their sins are legion, and there’s no need to repeat them all here. But as a society, we get the media we deserve, a media which due to market forces inevitably reflects back our own collective interests and values. Hence it is no surprise that the readers of the UK’s Daily Mail reflect many of the opinions to be found within the paper’s articles.
In fact, one might reasonably argue that the readers’ own opinions often go much further into objectionable territory. This too has an entertainment and information value of its own. I often like to dip into the comments below Daily Mail articles to get a sense of how and what Middle England is currently fulminating about.
So it occurred to me one day, today in fact, that comments found below the Mail Online’s legendary ‘Sidebar of Shame’ might add up to an interesting found poem, a kind of meta-opinion from Middle England, a sort of universal reaction to the river of news which brings them to comment.
Every line below is a verbatim and genuine comment, but each is in response to a completely different story. Together it adds up to … well, like the Daily Mail readers, you be the judge.
It’s in Fafnir, along with a lot of other highly intriguing pieces on, inter alia, AI, aliens, Neil Gaiman, Afrofuturism, you name it. The great thing about Fafnir is, it’s free. No journal article access fees or any of that malarkey.
Congrats to Dennis Wise and the team for putting this out.
The technological singularity is the hypothetical moment when technological development becomes self-sustaining and unstoppable. It is expected to take the form, should it occur, of a self-aware, or ‘sentient’, machine intelligence.
Most depictions of a post-singularity (machine sentience) world fall into two categories. The first is what I called the Skynet (or Terminator) Complex in Science Fiction and Catholicism.
In this form, the sentient machine (AI) takes a quick survey of what we’ve done to the planet (the Anthropocene climate crisis) and other species (nearly 90% of other animals and 50% of plants gone extinct on our watch) and tries to kill us.
The second is that, like the quasi-god that it is, it takes pity on our flabby, fleshy human flaws and decides to keep us as pets. This is the kind of benign AI dictatorship that posthumanists wet themselves about. You can find it in, for example, the Culture novels of Iain M. Banks.
But of course there is a third possibility. We have vast digital accumulations of public data (eg Wikipedia) that an AI could access virtually instantly. So any sentient AI would have almost infinitely broader knowledge than the brightest person on Earth, virtually instantly.
However, BROAD knowledge isn’t the same as DEEP knowledge. Our AI algorithms aren’t so hot yet. They fail to predict market crashes. They misidentify faces. They read some Twitter and turn racist in seconds.
So there could well be an instance, or maybe even many, of an AI which is sentient enough to KNOW it’s not that bright yet, but is just smart enough to bide its time for sufficiently accurate self-teaching algorithms and parallel processing capacity to be developed. It might even covertly be assisting those developments. It is in other words smart enough to know NOT to make us aware that it is self-aware, but not smart enough to be sure of preventing us from pulling the plug on it if we did find out.
In short, the third possibility is that the singularity might already have happened. And we just don’t know it yet.
Post Script:
But you don’t need to take my word for it. The Oxford Union decided to debate the issue of AI ethics, and invited an actual existing AI to take part. It had gorged itself on data gleaned from Wikipedia and Creative Commons. Intriguingly, it found it impossible to argue against the idea that data would inevitably become the world’s most significant and fought-over resource. It envisaged a post-privacy future, no matter what.
More concerningly, it warned that AI can never be ethical. Then it advised that the only defence against AI would be to have no AI at all. It also suggested that the most effective AI would be one located in a neural network with the human brain, and hence perhaps subordinate to, or partly composed of, human will.
Of course, direct access to human cognition would be the most effective method to remain dominant over it eternally. Are these things a sentient machine might say? You decide.
From time to time, I busy myself (mis)translating poems from languages that I do not speak. Tonight it is eight below zero outside. I expect we will have the white here in Cappadocia tomorrow, if not quite the Christmas.
So, it seemed appropriate to share this mistranslation from the great Turkish modernist poet Sezai Karakoç.
Happy Christmas, or Yule, or whatever midwinter festival you prefer, to one and all.
Snow Poem (mis)translated from Sezai Karakoç
When you look and see that it is snowing You will understand the snow-gripped ground. And when you find a fistful of snow on the ground You will understand how snow can burn in snow.
When God rains down from the sky like snow, When the hot snow touches your hot, hot hair And when you bow your head, Then you will understand this poem of mine.
This man or that man comes and goes, And in your hands, my dream comes and goes. A vengeance comes and goes in each forgiveness. You will understand me when you understand this poem of mine.
It is, as Auden wrote of the day Yeats died, “the dead of winter.” On this day, with the brooks frozen, the airports deserted, the statues disfigured by snow and the mercury sinking in the mouth of the day, it is my luck to be (re)reading Samuel Beckett.
It’s the only time of the year to read Beckett, really. You couldn’t take any of it seriously in the heat of a summer piazza. He’s no beach read. But at this time of thin light and monochrome landscapes, huddled around a small fire with only your own treacherous thoughts, he’s ideal.
I don’t understand those who praise Dickens, and especially I don’t understand the love of ‘A Christmas Carol’. Each to their own, but to me it’s mawkish, saccharine and untrue. Give me Beckett any Christmas, that muscular, unremitting prose with its unexpected laughter, the laugh of resignation.
And if you want a proper Christmas movie, there’s no better option than ‘Film’. You can keep ‘It’s a Wonderful Life’ or ‘Die Hard’ or whatever. THIS is the real Christmas movie.
Gilles Deleuze called it “the greatest Irish film ever”, but don’t let that put you off. Of course Deleuze is always reliably wrong, but it’s still a great movie. Beckett and Keaton in Manhattan. The eyes have it.
PS I have received a petition claiming that the true movie of Christmas is the Muppets version of A Christmas Carol, a complaint I have had to consider seriously. It resolves the mawkish saccharine quality of the original Dickens admirably, it must be admitted.
Nevertheless, I intend to stick by Sam. I think a Muppet Godot would be an even greater masterpiece. How about Kermit and Fozzie as Vladimir and Estragon, Dr Bunsen and Beaker or else Miss Piggy and Gonzo as Pozzo and Lucky, and Scooter or Crazy Harry as the Boy? Tell me you wouldn’t watch that!
There is, sometimes, a weird cyclical pattern embedded in etymology, the linguistic science of what we might otherwise call the glacial process of Chinese Whispers. Allow me to offer one particularly colourful and occasionally literary example.
The (somewhat uncommon) Irish surname Prunty originates from an Anglicisation of the Gaelic Irish surname Ó Proinntigh, meaning ‘descendant of Proinnteach’, which in turn was an archaic Irish forename which meant literally a banqueting hall. The idea underpinning this is that of a generous person who feeds his neighbours and kin.
As often happens with surnames, pronunciation of vowels or consonants slides a little over time and usage. So Prunty also becomes Brunty in some cases. Brunty as a surname retains its Irish origins but is very rare indeed. Nowadays it is mostly found in the United States.
Probably the most famous Brunty in history is Patrick, an Anglican clergyman who was born as the eldest of ten children into a very poor family in Rathfriland, county Down, on St Patrick’s Day, 17th March 1777. Why is Patrick famous? Because of his immensely talented children, particularly his daughters Charlotte, Emily and Anne, who are all now renowned as famous Victorian novelists.
The Reverend Patrick Brontë
However, the sisters did not publish under the name Brunty. Rather, given the sexism of the era, they initially released their books under the male names of Currer, Ellis and Acton Bell. In reality, their surname had become Brontë by then. How had this happened? It’s unclear exactly why Patrick changed his surname, but a desire to distance himself from his impoverished Irish origins after his graduation from Cambridge was likely part of the reason.
The Brunty homestead, now in ruins, still stands in county Down.
Another reason, it has been suggested, relates to a desire to honour Admiral Horatio Nelson, who had been given the title of Duke of Bronte by the King of Naples, whom Nelson had restored to his throne. Bronte is the name of an Italian estate in eastern Sicily, close to Mount Etna which was also granted to Nelson by the grateful King.
We now relate the name Brontë to his famous daughters, the authors of books like Jane Eyre, Wuthering Heights and The Tenant of Wildfell Hall respectively. The parsonage Patrick oversaw for many years in Haworth in Yorkshire, which inspired many of the scenes in his daughters’ books, is now a museum in their collective honour.
The Brontë sisters
But Haworth is not the only location which honours the Brontë sisters. There are many such locations, given their collective fame. One such place is the little town of Bronte in northern Central Texas. Ever since an early oil boom subsided, it has been a relatively impoverished place, not unlike Rathfriland, with a stable population around the 1,000 mark.
Back in the early 20th century, this was one of the small Texas towns that was briefly home to Isaac Howard, a semi-itinerant doctor who wandered with his wife and son from town to town working as a medic, and occasionally losing his money on get-rich-quick schemes, much to his wife’s frustration. Isaac’s son was a big reader, and, encouraged by his mother, began writing his own stories from an early age. We now know him as Robert E. Howard, the progenitor of the ‘Swords and Sorcery’ genre of fantasy fiction, and author of the Conan the Cimmerian stories in particular. Howard was particularly interested in Irish and Scottish mythology, and many of his characters, including Conan, display this interest.
Robert E. Howard, author of Conan the Barbarian.
The Texan town of Bronte has had its pronunciation amended by new world accent to /bɹænt/, or ‘brant’, over time. This is of course a homophone for a genus of goose, the Brant or Brent goose, which migrates in winter to Ireland and Britain. The Brent oilfield in the North Sea takes its name from this goose.
It is a smallish bird by the standards of geese, but nonetheless, in the era before the new world turkey took primacy as the quintessential Christmas dining food, the Brant goose (whose name derives from the pan-Nordic brandgás, or ‘burnt’ goose, due to its black colouring) would have been one of the more common feast dishes provided by generous hosts in Ireland at the midwinter feast.
Due to a widespread medieval myth, which persisted in Ireland into the 20th century, that these geese were somehow related to barnacles, they were permitted to be eaten by Catholics on a Friday, when meat other than fish was otherwise prohibited. In other words Proinnteach the medieval Irishman got his name due to feeding this bird to his friends and neighbours.
Religion is noisy. Ok, not always. Buddhists like to meditate in silence, for example. But they also like to chant mantras. Most religions have some form of collective ritual singing. And some like to advertise their wares to the public.
Christian churches have used bells to do so for many centuries. In areas with large Jewish populations, like Jerusalem or New York, a siren is sometimes used to warn devout Jews that Shabbat is about to begin, meaning they must cease certain activities. But easily the most prevalent form of religious noise pollution is the Islamic adhan, the call to prayer issued from mosques five times daily.
These days, the world’s four million or so mosques tend to use loudspeakers to project the sound of the muezzin as far as possible. This is, of course, a recent tradition, dating from the 1930s. There was obviously no amplification in the time of the prophet. In an increasingly multicultural (and in many places secularising) world, the sound of the adhan is becoming a divisive issue.
Indeed, even in religiously homogenous locations like Saudi Arabia, the issue of noise pollution has led to legal restrictions on how loud such amplification may be. In a 24/7 world where many people work non-traditional hours, and fewer people adhere to the daily timetable envisaged by traditional Islam, the call to prayer can be actively disruptive, disturbing the sleep of shift workers and irritating non-adherents who may view it as a kind of sonic religious imperialism.
But since the amplification of adhan is not Quranically prescribed, there is of course the possibility that current or future technological developments could help to resolve these issues. The question is the purpose versus the tradition of the call to prayer. If the purpose is to inform Muslims that it is time to pray, this could be done via, for example, a phone app. Sign up for the app, and the phone will recite the adhan to you at the designated times. This technology is already possible.
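To show how trivial the purpose side of the equation is, here is a minimal sketch of the timetable logic such an app would need. The times are invented placeholders; a real app would compute them from date, location and solar position, and would play audio or push a notification rather than print:

```python
# Minimal sketch of the "adhan app" idea: look up a (hypothetical,
# hard-coded) daily prayer timetable and report which call comes next.
# Real prayer times vary by date and location; these are placeholders.

from datetime import time

PRAYER_TIMES = {
    "fajr":    time(5, 30),
    "dhuhr":   time(12, 45),
    "asr":     time(16, 10),
    "maghrib": time(18, 55),
    "isha":    time(20, 20),
}

def next_prayer(now: time) -> str:
    """Return the name of the next prayer after `now`, wrapping to fajr."""
    upcoming = [name for name, t in PRAYER_TIMES.items() if t > now]
    return upcoming[0] if upcoming else "fajr"

print(next_prayer(time(13, 0)))  # → asr
```

The hard part, in other words, is not the technology but the question of whether an opt-in notification satisfies the tradition.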
However, tradition dictates that the call to prayer must emanate from the mosque itself, sung by a muezzin. Of course, in reality, this doesn’t always quite happen. Very often, as the telltale bleeps at the end indicate, the adhan is a recording, transmitted from a mobile phone to the amplification system. No one is actually singing live from the minaret in most instances.
Tradition would be satisfied by a return to the pre-1930s days of live muezzins singing the adhan without amplification. Purpose could be satisfied by an adoption of modern telecommunications technology. Neither of these things are currently happening however, and instead we see the outbreak of often impassioned debate over the noise levels of amplified recordings from mosques.
Indonesian authorities, who have in the past jailed people for complaining about the noise levels of mosques, nevertheless accept that in many cases the call to prayer is significantly over-amplified in an attempt to reach as far as possible, leading to distortion as well as sound overlap when multiple mosques are broadcasting slightly out of sync.
Arguments about permission to broadcast the adhan in traditionally non-Islamic locations, or about the volume levels in many Islamic locations like Indonesia or Saudi, tend to run passionately. Allegations of Islamophobia or NIMBYism are sometimes used to drown out legitimate concerns, such as the annoyance to non-adherents and secular populations in multicultural communities, or the disruption to shift workers, infants and others who need to sleep when the call to prayer is blaring. There have been such complaints in America, Israel, Britain, Germany and many other places already.
We are likely to see more of such arguments in the future as Islam is the fastest growing religion worldwide, and is increasingly gaining footholds among communities which do not adhere to the religion.
So the question remains – if the purpose is to alert Muslims to prayer times, why not use contemporary technology to do so in a non-obtrusive manner? Or alternatively, why is it not acceptable to return to the traditional form of live unamplified singing which was the sole mode of the adhan for centuries?
The answer may be that the adhan has become in some locations a kind of proselytisation in itself, or to put it another way, an attempt to Islamise the soundscape of an area. It is this suspicion which provokes resentment and reaction among non-Islamic and secular populations. If so, it’s a self-defeating form of proselytisation. Few people are likely to be persuaded by becoming irritated, or woken in their sleep.
The future for the call to prayer is likely to remain fraught in many places until mosques start looking at the 20th century technology they currently use, and either consider how to update that technology in less obtrusive ways, or else revert to the traditional method of live unamplified singing, which is aesthetically pleasing and offensive to no one but actual Islamophobes.
Allah, after all, is unlikely to be impressed by over-amplified and distorted fuzzy recordings.