Post Truth

“I can handle big news and little news. And if there’s no news, I’ll go out and bite a dog.” This is how Charles Tatum, played by Kirk Douglas in Billy Wilder’s noir classic Ace in the Hole (1951), pitched for a job at the Albuquerque Sun-Bulletin.

Manufacturing the news has always been part of the job. Newspapers were founded, like the stock market, in the coffee houses of 17th Century London, where false news stories were often published to shift the value of stocks. Pamphleteers, like Daniel Defoe, were adept at blurring fact with fiction. The exploits of Robinson Crusoe were, for years, passed off as true. Fake news pays. In ancient Greece, corn merchants spread rumours of disastrous storms and shipwrecks to raise the price of grain. False rumours were considered so powerful that, in Athens in 413 BC, a barber was tortured for spreading unrest after repeating news, heard from an escaped soldier, of a disastrous military defeat in Sicily. Even though the story was true, they gave him, as Roger McGough called it, “a short back and insides.”

It wasn’t until 1914 that a journalists’ creed appeared which included “clear thinking and clear statement, accuracy and fairness are fundamental to good journalism.”

Even so, when I started work as a journalist in the 1980s, one of our standard playbook persuasive tactics to get people to talk was: “if you don’t, I’ll just have to make it up.” The line was credible because journalists weren’t, and the threat was so effective that I never did.

That’s not to say, on slow news days, small stories weren’t ‘embellished’. With their policy of never confirming or denying stories, the royals were always a good target for a little fiction; which is probably why the Daily Mail is currently baiting the Duchess of Sussex to see how far they can go in painting her as a bossy Hollywood diva from hell… before she bites. On a slow day at one national newspaper, I once wrote about a football match between two youth clubs in Hyde Park which Prince Charles had passed by. By the time the story got to the copy desk, the Prince of Wales had stopped, shown off some fearsome keepy-uppy and even scored a goal. The Palace said nothing and, as Tatum puts it, “It’s a good story today. Tomorrow, they’ll wrap a fish in it.” But then you’re going to have to start all over again and find something else to “report on”.

The temptation to disregard facts is driven by the media’s endless demand for content: a relentless pressure that used to be reserved for journalists. Now, with the monetising of click-bait, a vast number of vloggers, bloggers and twitter-feed cloggers experience that same pressure.

Just look at the volume of news in comparison to “comment” in any modern newspaper (including this one), news-site or news programme like Radio 4’s Today, and it’s clear that fact, accuracy and empirical evidence consistently lose out to outrage, opinion, belief, fluff or uninformed celebrity views. And this decline has been going on for decades; the armchair rage-rousing internet content providers are only accelerating it.

Facts, reality and truth have become dangerously plastic. Brexiteers and Trump dismiss not only expert opinion but facts too. When challenged over telling an outright lie, White House spokesperson Kellyanne Conway coined the phrase “Alternative Facts.” Breitbart News editor Joel Pollak argued this was a harmless and accurate term in a legal setting, “where each side of a dispute will lay out its own version of the facts for the court to decide.” Justifying one “Alternative Fact” with another.

But Pollak unwittingly pointed to an age-old fundamental problem in the Anglo-American pathway to truth… our systems are adversarial. Instead of facts or the balance of probability, we allow the concept of multiple truths until we decide which is the most compelling. Which means that in most court disputes one side has to lie. And if that’s true, why aren’t the majority of court verdicts accompanied by perjury charges? Could our tolerance of multiple truths be leading us to lose our grip on reality?

For centuries, we have believed that we are the children of the Enlightenment and that we use empirical truth as the basis of our values. Capitalism is built on the agreement of value. But the great Capitalist manifestos were all written when money was still commodity-backed (like the gold standard). Money represented something unwaveringly material. With the loss of Empire after WWI, Britain could no longer claim new gold reserves ripped from its colonies, and for a small island with finite precious resources, money had to become notional, virtual, a matter of conviction. Today, money in exchange for goods or labour is just an idea, an agreement, numbers in computers, a belief, and so it’s vulnerable to intangibles, human frailties, opinions and feelings such as hope, despair, confidence, and trust.

Value is an act of faith. It is as real, rational, or empirically substantial as gods, demons, fairy tales, vampires and the origin of a denied fart. It is a mass delusion based on the stories we tell. Banksy adds an impossible story about a shredder on standby for a decade in the frame of a work of art and doubles its value. I trust my expensive Duchy organic milk is really produced by cows listening to Mozart and not weeping from their udders because their calves have been placed in the next pen so they can hear and smell them but will never be allowed to feed them. In every shape and form, stories and provenance create value, and fiction is an integral part of stories.

What is so amazing is that it is so hard for us to see that. Non-Western cultures have looked on in awe at the grandstanding of our culture, backed by a fiction as ridiculous, unprovable and insubstantial as any faith.

In Sapiens, Yuval Noah Harari argues that our willingness to believe in fictions has been the secret of our species’ success. “We are the only mammals that can cooperate with numerous strangers because only we can invent fictional stories, spread them around, and convince millions of others to believe in them.”

Good stories have grains of truth, but the balance of truth and fiction in our stories has passed the tipping point. In 2008, when the subprime mortgage story – that overpriced housing can be supported by selling poor people into continuous debt – was exposed as pure fiction, the banking world collapsed. The consequences of that lapse in confidence have sent us spinning ever since. Distrust of “The Man” has informed the Brexit and Trump “base”. “Truth be buggered”, they say, “nothing was ever true.”

Democracy, the cornerstone of enlightened politics, is proving unfit for a culture willing to believe stories while no longer needing to trust the source. The democracy that May is currently defending from a second referendum is as much a sham as a Russian voting booth or “fun size” Mars bars.

In HyperNormalisation (available on iPlayer), filmmaker Adam Curtis argues that Russia has, for years, purposefully created entertaining chaos to keep its populace thrilled and scared and confused and at odds with each other, and so unable to work together in a democratic way, as consensus is always frustrated. Vladislav Surkov, Putin’s chief political technologist, calls it non-linear warfare. He would finance different extremist factions, even anti-Putin groups, and then reveal that he had backed them. Nobody knew what was real or not. In a broken post-perestroika Russia there was only one thing you could be absolutely sure of: Vladimir Putin.

Surkov harnessed the anxiety of falling down the rabbit hole to keep Putin in power for years. Through the internet, this policy of undermining reality has been leaking into the West, most obviously with Trump and Brexit, both now provably supported by Russian disinformation tactics.

If the first casualty of war is the truth, we have to realise we’ve been in a non-linear war now for some time. And even if there are no bombs dropping, and even though we cannot be sure who our enemies are or what is really happening, with the Brexit implosion it seems harder and harder to deny we are at war, that we’re losing, and that’s the truth.


I Vant To Be Alone

“Alone now, there is nothing but my breath.

I scream and that turbulent fury that rages from my mouth,

Is silence. In the darkness.”

Marius Brill (Aged 14)

So, whilst wiry, post-pubescent me was wondering if Faber & Faber was going to recognise my genius, fat, middle-aged me would happily put the gun I was clearly looking for to the temple of that spotty youth and end his torturous self-obsession whilst simultaneously sparing the world his awful poetry. And, you know, juvenile me would have thanked the wrinkly old git with the Glock because, “yeah, I’ll be dead and then they’ll be sorry… and then someone’ll publish my poetry and the world will mourn the loss of my depth and potential.”

But somehow future me shouted loud enough through the temporal ether to convince teen me to bury my cheerless odes in a Woolies notebook at the back of a cupboard. And though I was almost certainly anticipating a time when the world would finally recognise my virtuosity and cry out for my youthful insight, I can now only be grateful that it went no further than beneath a shoe box of multi-coloured Kickers.

Rediscovering that blank verse is not just acutely embarrassing, it’s a painful reminder of how difficult and angst-ridden and confusing and troubling becoming a teenager can be. But if the pain of post-adolescence is like permanently having fillings drilled, being a parent of teenagers is unceasing root canal treatment.

When 14-year-old me stomped up to my bedroom to be alone, it wasn’t actually to be alone. Yes, I wanted to physically create the separation I felt from the parents who were failing to understand my vast emotional intellect, and I wanted them to feel guilty for their shortcomings, but mainly I wanted to be with my pen, a spiral-bound notepad, and my entire, imagined, future audience of adoring poetry-reading fans. Which, as it turned out, was exactly the same number as my present, real, adoring poetry-reading fans.

Now my youngest child has entered the teen trials. And she wants to be alone and she stomps up to her bedroom, but not, fortunately, to write nauseating poetry (I don’t think) but to be alone with all her peers, who have also disappeared, phones in hand, to be “alone” too. You’re never alone with a phone! Except, you’re always alone with your phone. The teenage bedroom has, since its invention, been the human cocoon from which, hopefully, a butterfly will eventually eat her way out of. And as hard as it is to be an adult and end a sentence with a preposition, it is harder still to transition from a constantly hovering “helicopter parent” to a long-distance bomber parent who rarely gets flight clearance so tends to store up complaints and carpet-bomb a month’s worth of problems in explosive, eye-streaming outbursts.

Being alone is never actually about being alone, it’s about not being with others. The times change, the technology changes, the aspirations change, but the urge, and the history, repeats itself from one generation to the next. Being alone is an act of glorious selfishness, only really possible in the narrow vision of an adolescent mind, because it tries to deny, and in denying proves, that you have an integral role as part of a larger community. The temptation to indulge in a “splendid isolation,” when you feel misunderstood or treated unfairly, is partly to punish those you are leaving and partly to try to prove independence. Both, of course, core adolescent impulses.

Kids need to grow up and feel independent. What’s wrong with a bit of alone time?

Nothing…. when you’re talking about kids.

But of course, I’m not. The problem comes when an entire nation behaves like a moody, depressed, anxiety-ridden teenager.

The term “splendid isolation” is one John Major has used several times in his attacks on Brexit. It, in fact, refers to the 19th century British government practice of avoiding permanent alliances with other European countries, particularly during Lord Salisbury’s premierships between 1885 and 1902. Historian Harold Temperley summarised it as, “Non-intervention; no European police system; every nation for itself, and God for us all; balance of power; respect for facts, not for abstract theories; respect for treaty rights, but caution in extending them … England not Europe… Europe’s domain extends to the shores of the Atlantic, England’s begins there.”

Of course we had an Empire to fall back on, so we held all the cards in trade agreements, but by not getting involved, by withdrawing our political strength, we allowed European countries to escalate their posturing and military peacocking. Aspirant empire builders – Russia, France, Germany, Austria-Hungary, even Belgium – went through a series of treaties, agreements and alignments trying to leverage their own might. By the turn of the century, when a Britain vs Germany naval arms race began, politicians started to realise the country was dangerously exposed by its lack of European allies. In 1898, Colonial Secretary Joseph Chamberlain, who tried twice unsuccessfully to negotiate an alliance with Germany, said, “We have had no allies. I am afraid we have had no friends … We stand alone.”

By abandoning involvement in Europe, Britain had no skin in the game. It was unable to broker peace while the bristling European powers were getting ready to prove their superiority. As any GCSE history student will begin their ubiquitous 20th Century essay, “There were many causes of the First World War…” But first among them was a policy that came to be known as “Splendid Isolation.”

This month, a hundred years after the guns fell silent, when the death toll could finally be tallied, we will remember the 40 million soldiers and civilians killed in the war to end all wars, and by the flu, famine and disease that followed in its wake. So why are we forgetting what got us there in the first place?

Yes, we’ve all wanted to do a Greta Garbo and get some time out. We’ve all wanted to storm up to our rooms and lock the doors. But, when we leave and prioritise our needs, we ignore how intrinsic we are to the framework, the stability, the unit that, as social animals, we are all a part of.

No one, thankfully, will remember my teen poetry, and policy during Britain’s “Splendid Isolation” is historical marginalia in comparison with what followed, but on the eleventh hour of the eleventh day of the eleventh month, we will remember. At the going down of the sun and in the morning, we will remember.


Ctrl-Z

I’ve done some pretty shitty things in my time. I’ve definitely put the ain’t in Saint and, what’s more, being too spineless to admit my own flaws, I constantly lie to myself about it. So in my version of the world, after drunkenly stealing some roadworks diversion signs and thinking it hilarious to reroute all the traffic on Cheyne Walk the wrong way up Old Church Street, none of the potentially fatal car crashes I may have caused ever happened. Nope. Never happened.

Through the 20th century, as the fledgling science of “psychology” blossomed, we started to realise how flawed the human brain is. Now neuroscience can show that our brains physically rewire in response to stimuli; we are biologically capable of rewriting past experiences. Hangover amnesia is probably the most common phenomenon of this neurological self-editing. “Drinks? I’ve had a few. But then again… I’ve rerouted my synapses so I’ll never be reminded.” Done something you’re not proud of? Don’t worry, your brain will fix it because it still needs to function and get some sleep.

If we’re rewriting the events we’re uncomfortable with, what we’re replacing them with are just, well, stories. Concepts such as “the truth” and “the past” have not only become subject to relativism, but the very fact that I can call them “concepts” without your alarm bells ringing “logical fallacy” shows how far down the rabbit hole we’ve already descended.

Dr Blasey Ford did herself no favours explaining how she could remember US Supreme Court Judge nominee Brett Kavanaugh assaulting her over 30 years ago. “Just basic memory functions,” she said, “and also just the level of norepinephrine and the epinephrine in the brain that as you know encodes that neurotransmitter that codes memories into the hippocampus.” The entire inquisitorial panel stared at her with open mouths and then dismissed it as clearly a herstory/history.

It seems that, before the Western Liberal orthodoxy began to implode in 2016, no one had noticed how far reality and truth had actually drifted from each other. Now we can only gawp at Trump’s flat denials or the Boris-Mogg glorious pre-Euro British idyll – when there definitely wasn’t abject poverty, child-killing measles, general strikes, and whole generations toiling in servitude – a mad-as-a-cricket-bat, village-green fantasy that Brexit can take us “back” to.

“Make America Great Again” relies on a mass delusion that America was ‘great’ at some point. America has done great things: won wars, assured peace, policed the world, and promoted democracy. But, from the Native American genocide of its founding, through slavery, to the “No Coloreds” white picket fences of the Madison Avenue/Hollywood dream sellers, America has only ever been a “great” place to live in stories, and for a privileged few. The version of the past that the Alt-Right succeeds in selling is a selective nostalgia.

So if the victors write history and our memories are neurological shifting sands, can we say the past actually exists?

I know that’s the sort of question philosophers like to ask whilst skinning up rollies outside the jobcentre. But then, if we didn’t have philosophers to ask stupid questions we’d have to rely on Piers Morgan, and no one, not even Mrs Morgan, wants to do that. Of course it’s possible that, like homework-eating dogs, working late at the office or some really bad heroin, the past could be one gigantic fabrication we spin ourselves to explain the terrible state we’re in right now. But, beyond the solipsism, even if we can’t agree on what the past is, we all seem to function by agreeing that there is one.

In The Battle of the Books (1704), Jonathan Swift contended that all arguments eventually come down to one difference: past or future, you’re either an Ancient or a Modern. Ancients look to the past and fight to preserve it, maintaining the status quo (read conservatism), whilst the Moderns want to move forward, improve, reform and change (read Whig/liberalism). It’s Stasis vs Kinesis, Stability vs Innovation, Status Quo vs Prog Rock.

You would think that, with the growth of our post-Freudian relativism about the past, as it gradually fractured and became less distinct, conservatives would have had less to hold on to and the progressives would have won the day. But 21st century conservatives have found a new opportunity in all this. Can’t be sure of the past? Build a better one! They’ve finally realised it’s all in the stories you tell. There’s a Spitfire forever flying in the blue skies above Kent in their make-believe past; it’s plucky Britain against the world, before all the foreigners arrived. Make your story compelling enough, and you’ll attract 17.4 million moths to your flame, many apparently willing to fly right in, and go down in smoke.

But the real fantasy behind Trump’s pre-civil rights America; the Brexiteers’ Blitz-spirit Britain; ISIS’s “Caliphate”; Russia’s rerun of Cold War sabre rattling; the Alt-Right’s climate change denial and even the Anti-Vaxxers’ hankering for lost childhood diseases, is the notion that there’s a Ctrl-Z for life, an undo button, something that will magically revert us to a happier place, even if it’s a version that never existed in the first place.

Along with nearly every Remain voter, and the 1.1 million Leave voters who say they regret the way they voted, I too want us to go back in time. For us though, it’s not to some Downton Abbey fantasy Britain but, specifically, to 23rd June 2016 so that, infinitely better informed, completely aware of how messy, ridiculous and utterly damaging the alternative is, we vote again.

As any salesman will tell you, if you’ve sold someone a crock and a story, better close quick. You can’t give the suckers time to think. You certainly can’t give them a two-year cooling-off period when they might realise how the dream just doesn’t stack up with reality: the tearing up of the Good Friday Agreement, the return of the IRA and expensive foreign holidays while nurses and doctors disappear… they might just change their minds. They might just want a people’s vote.

We’re constantly rewriting history. Now, more than ever, you need to help rewrite it for the better and give Parliament a reality check.


Noted

Holidays are over. Back to the grind: school, work, uni, the girl locked in the basement. With summer, lazy days, late rising and endless sunshine all still fresh in the memory, the misery of the graft is all the more obvious, the contrast more jarring; post-vacation life is a depressing bitter pill we all swallow. Unless you’ve got yourself a doctor’s note – in which case enjoy a few more days of the holiday vibe.

For some, the note will become permanent. Not because they’ve been sacked or contracted a terminal illness, just because the back-to-work shock was too much. If you’re going to make a life-changing decision, to jack it all in and pursue your dreams, it’s the second week after the holidays in which you’re most likely to do it. You’ve had just enough time to think (and not enough time to think again): bugger the drudgery, what’s the point?

So I’ll tell you about Sammy. At school Sammy had it made. We all envied him. When we were splashing through freezing slimy puddles, spattering our legs with foul-smelling mud, Sammy was warm in the library clutching his “note”, his cherished paper prize that perpetually excused him from the cross-country. In fact, having been diagnosed with mild dyspraxia, Sammy’s note was continually updated by his concerned parents, who excused him from all sports for his entire school career.

Today, though, the envy has gone. You see, Sam has lived his life as if he always had a note excusing him from any of its potential adventures. Pushing fifty now, his circumference has been multiplied by pie; chronically obese, he suffers from continuous vertigo, countless phobias and, for most of his life, desperately unhappy singleness. Had he been born twenty years before, or to a less privileged class, his dyspraxia would have just been called “clumsiness”, for which there was no known note, and he would have been plunging the puddles with everybody else, or at least behind everybody else.

This could just be the old nurture/nature debate. If you’re a nurturist you’d argue that, had Sammy got into the habit of exercise, and practised some coordination drills, at an early enough stage in his life, he might have been spared such a waisted life. Or if you’re a naturist, put some clothes on, it’s getting chilly.

It’s always possible that Sam was genetically “programmed” to end up as he did. But I have a terrible feeling that it wasn’t his genes or his dyspraxia that did for him, it was his “note”.

I honestly think we all have, or long for, our own notes. The desire to have some way, some thing, that excuses us for behaving the way we actually do – rather than the way we know we should – seems an almost quintessential human urge.  So many advances in the human sciences get redeployed as steps towards that Holy Grail: The Universal Excuse Note.

From Humanism to Phrenology, social theories and the genome project, we have quickly hijacked each philosophical, sociological or scientific breakthrough to furnish more excuses for our own uncivilised conduct.

My own “note” lurks somewhere in the back of my head, waiting for the moment I’m caught being me, and not being the someone I know I really ought to be:

“To whom it may concern.

Please excuse Marius’ thoughtless borderline racism today; he suffers from the chronic social pressures of his white middle-class background. Please will you also excuse him from any punishment for cheating on his wife – I’m sure you understand that he is merely a hostage to his selfish genes vying for survival. Lastly, if you catch him picking his nose, please will you excuse him as he is, after all, only a slightly evolved chimpanzee.”

[Image: “Under Construction” – brain at work.]

In the last few years, with advances in electroencephalograph (EEG) scanning, the brain and its workings have become the latest fodder for our universal “note”. Scientific American publishes a monthly magazine devoted to the brain, the Science and Self-Help sections of bookshops are bursting with brain books and you can tell when a subject has become truly ubiquitous – there’s a “Rough Guide” to it.

As our understanding of the workings of the “normal” mind, and the chemicals that are released in the face of various stimuli, improves, lawyers – and parents – are armed with an ever more sophisticated arsenal of mitigating and evidential factors. Today, a murderer influenced by the brain-chemical surges of pre-menstrual tension, despite it being an apparently “natural” part of body function, is not a murderer. Or, as I may once have explained to the parents of two of my son’s school mates, “He’s not actually an aggressive bully who enjoys sticking smaller boys’ heads down toilets, he’s suffering from melatonin underproduction and struggling to manage his teenage testosterone spikes.” And I could tell from their contrite gawping that they truly understood that he was just as much a victim as their own precious offspring.

But, if we cannot control the chemicals and processes of our minds and bodies that affect our behaviour, are we actually responsible for anything we do?

At what point might we be forced to recognise that “my” brain, “my” rushes of adrenalin or floods of pheromones, is “me”? If, when healthy, we can’t, or won’t, take responsibility for our own brains – and by association our own minds – then aren’t we in danger of losing our identities, our individuality, ourselves?

Perhaps the question shouldn’t be “why are we looking for excuses?”, but “why do we feel so guilty in the first place?”  Before humanism, we saw evil, or vanity, or stupidity, we did awful or idiotic things, but we had the ultimate “note”, “The devil made me do it.”

Now, although we’re still reaching for excuses, albeit more “scientific” ones – so that we can act like the venal selfish animals that we actually are with some impunity –  the very fact that so many of us are looking for a “note” means we also recognise when we fail to be what we aspire to be: better. We know we need to evolve and evolution requires constant failure to recognise the virtue of success.

Unfortunately, the danger of the “note” will always be if we start to believe it and allow ourselves to stop striving, as Sam did.

Sammy’s parents, concerned for their child, gave in and encouraged Sammy to do the same.  Dyspraxia is, after all, a neurological disorder.  Had they looked deeper at neurological findings, they would have discovered that brains, especially young ones, have a phenomenal ability to “rewire” themselves through familiarity and conditioning.  It’s called “learning” or, if you prefer, “the process of creating new synaptic pathways in the brain”.

It seems counterintuitive to take the punishment, to run the race, to pound through the stinking puddles rather than find an excuse, but tearing up your own “note” might just be the best thing you’ll ever do.


On/Off

Battleship, Earl, Slate, Ash, Dolphin, Overcast, Elephant, Silvery, Smoker’s Lung: it’s easy to forget there used to be innumerable shades of grey. When the Apple Macintosh was launched in 1984 its screen boasted 256 shades of grey. By the time E. L. James’s soft-porn bonkbuster was published, there were only 50 (even if she found a few more for the sequels). Now shades of grey, along with stretched metaphors for subtlety, are so last century.

One of the main causes of the death of our cognitive greyscale was a little invention, also in 1984, of psychiatrist Karen Kempner and dentist Ed Zuckerberg: Little Marky, the Dr Frankenstein of Facebook.

In the early 90s, when Ed bought an Atari 800 to teach his son some BASIC programming, he thought he was ahead of the curve, little realising that the very idea of a curve, with all the continuous subtle gradations of angles it implies, would be anathema to his kid, who found more empathy with the machine. Things are much simpler in computer languages: ultimately, everything is either on or off. Things either are or they are not. It is the Danish distillation: “to be or not to be” – why bother with anything else?

And the political climate that Mark, and our other techno i-dols, grew up in was dominated by Dubya Bush’s rhetoric and the neo-con world view. There was good and there was evil, right and wrong, black and white.

So when the blessed geeks inherited the Earth, maybe it was inevitable that their genius for binary, designing a new world out of ones and zeroes, would end up seeping through everything they made, becoming the dominant factor in the way we receive information, process ideas and even think.

My kids claim their constant screens make them more connected than any previous generation; they’re more in tune with their peers, and the world, than I ever was, limited to a land-line and a tiny circle of geographically local kids. And yet, for all that, all they seem capable of communicating is endless self-contained, unarguable micro-statements and gurning, emoji-aping Snapchat poses. The idea that they are both connected and simultaneously disconnected, that the medium both enables and limits and changes and shifts the very multilogue they believe they are having in a million different subtle ways, seems impossible. Either they are connected or they’re not. On/Off.

There is no “conversation” on social media. The intricacies of different rhetorical devices utilised to argue or persuade or flirt or cajole or urge or elicit are lost and the only audible voices are the ones in caps, with the simplest messages. LEAVE, REMAIN, TRUMP IN, TRUMP OUT, LIVE, DIE, WHATEVER.

So do not wonder why our elected representatives seem to have no way of understanding, or feel empowered to work through, a negotiated settlement. For them you either have a deal or no deal. You’re in power or you’re out of power, you’re anti-Semitic or you’re not anti-Semitic. The black and white referendum question allowed no room for subtlety or “third ways”, which is presumably why it was originally billed as an advisory straw poll; a fact ignored because it is incomprehensible in the on/off digital binararium we now live in. Even if the switch is 52% off and 48% on, then we’re off. Obvs.

Is it any wonder that we feel we’re living in an age of extremes when extremes are all that are talked about?

And there’s money in extremism. Last month, Channel 4’s Dispatches sent an undercover reporter to work for Facebook and find out how Zuckerberg’s minions decide what is good for the platform to share.

[Image: Mark Zuckerberg – CEO of FaceBookstein]

Working as a Content Moderator, the reporter discovered videos of child abuse allowed to stay up and shared thousands of times. Good for Facebook were: eating live animals, beatings from school bullies, disturbing images of self-harm and eating disorders. Racist content and hate speech were flagged to stay; far-right pages supporting Britain First and Tommy Robinson got special protection thanks to their large and rabid followings. Censorship was simply “bad for business”, the reporter’s boss pointed out. What kept people on the site was shocking content and extreme politics; more clicks mean more exposure to advertising and more money for Facebook.

Extremism is in the DNA of social media. When you see something moderately likeable you might read it and forget it, but something repulsive or incredible you’ve got to react, like, share and drive advertising as you do.

Every day we find ourselves needled and prodded to react by our social networks courting controversy. Our knees involuntarily jerk as we decry or promote, like, share, frown and retweet the messages that appear to agree or disagree with the bare surface of our own polarising opinions. We are being led into extremism by a digital version of a happy slapper, someone who spills their own drink in order to get into a fight with you, someone who will inanely insist on the opposite to any orthodoxy as long as it offends and baits clicks. Forget research, forget reasoned argument, forget ethics. Facebook, Twitter, Reddit, Buzzfeed and the rest are mass personal sensationalism: it’s Piers Morgan, it’s blood sports – it’s the Colosseum in ancient Rome – it’s bread and circuses – who would be surprised if this was the decline and fall?

Before global warming, Britain offered a vast array of drab greys, from the soot-covered buildings to the smogs and endless overcast rainy skies. We understood our greys. When looking out of a window, rather than at a screen, was still a thing, we understood how to look for the tiniest chink of light in the unvarying gloom. Our dreary environment taught us to interpret all our greys, and gave us the skills to spot the tiniest disparities of brightness that might be trying to shine through; and maybe even appreciate the barely perceptible subtleties of life’s often infinitesimal variations to find some colour within.

Socialising, conversation, with all the body language and facial expressions that accompanied it, was multi-layered, complex, alchemical, and nuanced. It took effort and maturity to decipher; everything our spectrumed geeklords found disturbing growing up. Why try to work out the human when you have a machine? And slowly their software and interfaces have eroded all our subtleties, until now “we’re all on the spectrum.” It turns out that it was far easier to make machines that programme humans than to programme machines that interpret humans. Perhaps it’s time to Make Britain Grey Again.


Feedback – Everyone’s a Critic

A short documentary based on the original article below was selected for the Earl’s Court Film Festival 2018.

Everyone’s A Critic

No. Literally. Everyone is a critic. Not just personally judgemental, but you, me, we’re all verified published critics, our words heeded by countless others, our opinions disseminated around the world.

“Feedback” used to describe the ear-piercing noise made when a microphone got too close to a speaker. Now we’re all squawking, screaming feedback. Anything we buy, any hotel or tea shoppe we visit, every experience we have, from a hospital appointment to the dog kennel, we’re expected to rate and review. Our digital opinion matters. Soon, you can forget getting a eulogy at your funeral: I’m expecting three and a half stars, six likes and a bunch of negative feedback reviews – five of them rated as “helpful”.

Between the penis enlargements and the diet pills, a third of my spam is feedback requests. And the requirement for feedback has mechanised me; I’ve become a human response bot. I’ve applied artificiality to my intelligence and learned to be like a computer – I have a document with a list of stock feedback phrases that I copy and paste to fulfil my social network contract.

“Brilliant eBayer, great communication, thanks.”

“On time, well packaged, delivered safely, many thanks.”

“Stiff for hours, best blue pills I’ve come across [winky face].”

But the question is: why are they asking me? Why don’t they ask someone who actually knows what they’re talking about? Or maybe even someone who gives a f***?

B.B., Before Bezos, “Critic” was an actual, real-life job description. Budding wannabe reviewers worked their way up through journalism; they became trusted as knowledgeable experts in their fields and then they would find inventive and entertaining ways to praise, admonish, suggest improvements and inform readers about what was good or bad in the world. It wasn’t easy: universal ideas of “taste” had to be found, agreed and applied. Mozart’s music was good, William McGonagall’s poetry was bad and nobody bothered to have an opinion about All Trade Adhesive Cloth Duct Tape.

Now, A.B., the duct tape gets four and a half stars and 193 people have, apparently, reviewed it on Amazon; it’s received more user insight than my last novel – but then it’s stickier and, I have to accept, a lot more effective at preventing kidnappees from screaming.

Professional critics are hard to find nowadays, mainly because most of them are stuck at home, chained to their laptops, zero-hour minimum-wage contractor drones hired to big up products on online marketplaces. For today’s critic, “Verified Purchaser” is the only expert credential needed because “The people of this country have had enough of experts.”

The most terrifying thing about Gove’s blinkered pro-Brexit-despite-all-the-evidence pronouncement is its insight, its awful truth: we really have had enough of experts. I mean obviously we still want them flying our aeroplanes and diagnosing our tumours but bugger their opinions, that’s not democracy.

The reason a national newspaper could get away with calling our judiciary “Enemies of the people” (Daily Mail, 4 Nov 2016) is because, since the advent of online consumerism, we have accepted that the only valid judgement is by the people. Not twelve peers good and true, but hundreds, thousands, even millions, and the only criterion for their judgement to be counted is being in a wealth and age bracket that puts them online. We have become conditioned to populist ideology through the relentless drive of giant online retailers desperate to sell stuff. Even satire, the biting funny bone of personal criticism, has been overwhelmed by the internet meme. Produced, shared and adapted by hundreds of thousands of anonymous Photoshop users, memes have collectively elevated the idea of the repeated joke ad infinitum et absurdum.

Now we numbly accept that judgement must be crowdsourced, despite the well-worn fallacy: “faeces is good because a billion flies can’t be wrong.” But we should probably be asking, is it really “the wisdom of crowds”? Or is it a “mob mentality”?

Look at what really happens when we face this democratisation of opinion, now that we’re forced to try and understand value and worth from the massed judgements of equally ignorant amateurs.

Booking a holiday? Want to know if you’ll love the place you’re going? Go on TripAdvisor, of course. This is why your heart sinks when you think about it. Because you know that now you’re going to have to painstakingly trawl through hundreds of reviews, paying attention to what’s being said but, at the same time, trying to assess the character of each reviewer from their digital signature. With a Sherlockian eye to human psychology you must review each reviewer: who is holding a grudge, who brought the rose-tinted specs, who is petty-minded, who is OCD about bathroom sanitation, who got a freebie, who was “a bit menstrual”, who was paid, who got laid, and who just wanted to get their review read, their opinion and experience verified, hungry for the approval of other users?

Forming judgements by aggregating judgements is exhausting, time-consuming and usually self-defeating. Eventually you just shrug and book anyway based on the price, location and photos.

No one, however well read, or experienced, or educated in the nuances of a thing being reviewed, has a voice any louder than anyone else’s. In fact the loudest voices are the ones that game the system, who know how the machine works, who sock-puppet and put the work in rating their own opinions to feature higher on the site. So now we need to search through the 193 Duct Tape reviews to see if we can find someone, anyone, who thinks a bit like us, who might fit the voice of our own echo chamber or, failing that, someone who might have had a duct that actually needed taping.

One day we may look back and realise that “Critic” was the first non-manual job to be entirely wiped out by computers. Not by artificial intelligence, but by the hive mind. And yet, even though we may have had to start thinking like machines to get our judgements heard, programmed by the system to be opinionbots, the critic, however inexpert or unpaid, may also be the last human voice to succumb to the singularity.

Alan Turing proposed that a human might judge Artificial Intelligence successful when he or she fails to tell the difference between a computer’s response and a human one. The flaw, though, was not in his understanding of computers, but of humans. The human urge to judge and be critical and superior would never allow us to admit a failure in our judgement; we love the sound of our own voices too much. So it’ll be our voices, our vicious little opinions and petty judgements, which will be the very last thing the robots wrest from our dead larynxes.

IMHO


Let me demonstrate

At times it’s a struggle to keep up with modern Middle Class DOs: smile at minorities; encourage your daughter to date one, not just for the inclusivity points but because she’ll “get it out of her system”; moderate your voice even when speaking to people who voted Brexit; confidently bypass the Quinoa and Kale sections in Waitrose because you’re ahead of the curve and on to Cassava; counter bigotry by bravely using the awkward silence or the slight nod and an “I hear what you’re saying” – convince yourself this has nothing to do with sparing your BUPA insurance a claim for getting into a fight and everything to do with your passion for free speech; oh, and remember to follow a handful of right and left wing nutters with deplorable views on Twitter to keep your ‘bubble’ in check.

Virtue has its benefits, which has long been the argument against the possibility of true altruism – but does that mean you shouldn’t try?

To practitioners of the art of MiddleClassdom, it’s all just riffing on good old-fashioned Anglican-style pragmatic tolerance. But the Right call it “virtue signalling” (as if virtue becomes a vice when it’s shown), the Left call it “bourgeois” (which in itself, using the French, seems très bourgeois), while both see it as self-satisfied dissembling that perpetuates neoliberal elitism and economic inequality. In other words, they hate it.

One thing being Middle Class will, up until now, have excluded you from is demos, political rallies or protest marches. Why? Because being Middle Class also places you, give or take a little wriggle left or right, bang in a centrist political milieu which is ever the resting ground between the swings. Political isms come and go but in the end there’s always pragmatism waiting patiently to bury them. So who goes on demos to support the orthodoxy? Aren’t they just a catharsis for the marginalised?

Demonstrations are a complex meme. They are, for the protestor, an acknowledgement that democracy isn’t working and, at the very same time, for the onlooker, a confirmation that it is; free speech being as central to democracy as the freedom to ignore it. A march on Downing Street always seemed to me, like golf to Mark Twain, “a good walk spoiled.”

But amassing in protest is, according to current bestselling Sapiens author Yuval Noah Harari, one of humanity’s finest achievements. “One on one, even ten on ten,” he writes, “we are embarrassingly similar to chimpanzees. Significant differences begin to appear only when we cross the threshold of 150 individuals, and when we reach 1,000–2,000 individuals, the differences are astounding. If you tried to bunch together thousands of chimpanzees into Tiananmen Square … the result would be pandemonium. By contrast, Sapiens regularly gather by the thousands in such places.”

A few protest rallies through history have become the tinderboxes for revolutions, but most of the time the officials in power barely twitch their curtains to gaze out at the rabble shouting slogans at the walls. The protest can be ignored because it’s not the ballot box. That’s where the sensible Middle Class lodge their displeasure before turning on The Archers on Radio 4.

Corbyn’s Momentum movement takes pride in its ability to assemble a horde whenever their leader is challenged. Right-wing activists are no less capable. Within hours of Islam-antagonistic English Defence League founder Tommy Robinson being arrested a couple of weeks ago, a vast crowd of right-wing protestors had mobilised and blocked Whitehall. They knew the ropes, they knew what would get noticed, they knew how to push the media’s buttons. Of course that didn’t stop Robinson getting a 13-month sentence, handed down by one of the Daily Mail’s “Enemies of the People,” AKA a judge. Even so, by then his supporters, as marginal as they might appear, had marshalled the internet to create an international pressure group, with Change.org hosting a petition to free Robinson that scored half a million signatures within days. Protest today is a 21st Century fusion of high-tech crowdsourcing and good old-fashioned boots – hob-nailed or steel toe-capped – on the ground.

Pity then the poor moderate centrists who have no experience, or history, of protest; they’ve never had a reason to raise a mob, to march or storm the gates. But with Brextremists both in power and in opposition, centrists are realising that they need to learn. Fast. They have already been marginalised to such an extent that the cool young left feel justified in coining the shame name “Centrist Dad”. Which means if you still believe in a middle ground, reasonable compromise or liberal values you’re basically old, in line for a hip replacement (and they know about hip) and way behind the times. You probably remember D:Ream’s Things Can Only Get Better and the Blair ‘third way’ with nostalgia; basically YOU, you moderate, you Middle Class, middle of the road, middle muddler, with your university degree and holidays abroad, you are the minority now… you just haven’t realised it yet.

And they’re not entirely wrong. The middle class meme is a latecomer to human evolution, only really gathering strength in the 14th century, born out of the Black Death. With the sudden shortage of workers, peasant plague survivors began to take control of the economy; the feudal barons and kings could only watch as a merchant class emerged and prospered. Capitalism nurtured an ever-expanding band of people who neither worked the land nor entirely owned it. By 2011, 71% of the country defined themselves as Middle Class. But in the last seven years, Austerity has taken its toll. Yesterday’s aspiring are, today, perspiring again, dragged back into working, or benefits, class survival. The “squeezed middle” has got ever tinier as the belt has inexorably tightened.

June 23rd 2018 saw a March for a ‘People’s Vote’, 100,000 people or more requesting a referendum on the final Brexit deal. Even the demand was pathetically reasonable: not a demand to stay in the EU, no full capitulation of a failing Brexit-frozen government, just a jolly practical, inclusive, ‘People’s Vote’ on whether the Brexit terms are actually a good idea or not. Veteran protestors from left to right, from Animal Rights activists to Poll Tax rioters, could only sit back and laugh into their balaclavas as these finally fired up, finally spurred, finally angry, moderate liberal (small L) centrist nouveau protestors, with only the sketchiest idea of how it’s done, marched on Parliament. How does one find one’s way from the prosciutto counter at Waitrose to the frontline of a mass protest? Do the Molotov cocktails require Maraschino Cherries? Is one’s Smythson A2 card rigid enough for a placard?

And yet they came out on that Saturday afternoon, feeling a bit Pankhurst or Gandhi, railing against both a Government and an Opposition that nearly all of them voted for, one way or the other, in the last election. And although these new naïfs to the world of protest represented 48% of the country (and, by most recent polls in this failing Brexit environment, a lot more), as far as protesting went, these rally virgins, who could only presume “kettling” refers to whose turn it is to make the tea, were given a soft ride. Police kept a pro-Brexit rally of a few hundred stoked people half a mile away. One elderly vicar I met on the march, a veteran of anti-Vietnam and Poll Tax marches, was saddened by the lack of conflict. “It’s important,” he said, “it’s how you get noticed. It shows how passionate we are.”

We’ve come to a strange place when people have to protest for some common sense, some moderation, some centrist middle-of-the-road rationality. But finally we’re starting to make our voice heard – at a moderate volume, of course.


Re: Born

The leaves are out, the flowers blossoming, spring has not only sprung, its coils are stretched and it’s bloomin’ bouncing like a budding summer’s day. It’s all glorious and we should be enjoying it and, though I’m loath to poop a party, I still feel we shouldn’t forget that it’s Springtime, at least the notion of it, that directly led to one of the most dangerous ideas, if not THE most dangerous idea, to affect humanity.

Rebirth.

The word itself is a paradox and an oxymoron and shouldn’t, by rights, even exist. It is a meme that has survived centuries, has barely evolved, and remains the basis of all sorts of mystical thinking. Even though, empirically, nothing, ever, gets reborn. Thousands of years of human history and there’s not a jot of confirmable evidence that something dead can come to life again. Religions are founded on the concept, wars fought in their names, millions of lives lost in the fighting – and not a soul knowingly returned. All we know is life starts and ends, but the idea that we might come back, or live on elsewhere, in some other sphere, is so attractive we’ve been unable to shake the rebirth meme; however enlightened we think we are.

The idea of Rebirth seems to come from early man’s fundamental misunderstanding of nature. Trees lose their leaves and plants bear no fruit in the Winter. They die. And then each Spring they come back to life.

But, of course, trees don’t die in the winter. They undergo “dormancy”, when everything slows down: metabolism, energy consumption, growth. They don’t need food, so they have no use for leaves that require energy to maintain. Abscisic acid is produced in the tip of each stem that connects to a leaf, and the leaf falls off.

We can’t blame early man, even if his wife did, for being premature. He knew nothing of acids or metabolism and didn’t even have a decent lab coat. He did, however, have a lot to fear, especially death. It’s easy enough to think of plants as dead in Winter and alive again in Spring. And if plants can die and come back to life, why not humans? It’s no coincidence that the day celebrating Christ’s Resurrection is not a fixed date like Christmas. It’s more laden with rebirth symbolism and each year changes depending on the first full moon after the Spring Equinox. Rebirth is the thing for Spring.

Although we may, centuries from now, still be classed as “early man”, I’d like to think we’ve come on a little since assigning gods to everything we don’t understand and witches to duckweed. We know, now, hibernation and death are very different things.

And yet there is a moment in a David Blaine TV show when he is talking to some New York cops. He finds a “dead” fly on the ground, picks up the lifeless body and holds it in his hand. After a few moments the fly starts to move, then walk, and then flies off. The cops are agog. There is a palpable delight that such a thing could happen. It is Lazarus in miniature, a return from the dead. It’s also, I imagine, an appliance of science: a refrigerator.


Of course I don’t know exactly how Blaine did it. If I did I couldn’t tell you because, as a member of the Magic Circle, I have sworn to keep magical secrets. But conceptually we know flies go into suspended animation when their temperature drops; it’s why flies that have sat out the winter in suspended animation start appearing from nowhere when warm weather returns. So, I suppose it’s just possible Blaine could have cooled down the fly, dropped it on the street, picked it up when the cameras were rolling and, from the heat of his hands, reanimated it. Of course I don’t know this, it could have been a miracle, or some fancy trickery, but the effect itself tapped into our deepest fears and greatest hopes: that death is not forever and life could be.

Maybe it’s our egos that insist we are too grand, too elevated, too important, that it would be too great a loss, if our deaths were permanent. And as soon as we allow ourselves the possibility of one tiny non-empirically provable concept, we have Frankenstein’s Monster; we plunge into the murky waters of magical thinking, of miracles and religious credo. So embedded is the magical concept of rebirth that the most influential cultural turning point in modern history was named after it: the Renaissance. The flowering of lost classicism was subliminally used to perpetuate the “not dead forever” notion.

Whether it’s ghosts or reincarnation, Heaven, Limbo or Hell, an eternity for the soul has, no doubt, also brought some comfort to millions who would not go gently into that good night. The hope of being reborn has an aspirational allure for anybody contemplating their own mortality or not ready to face the loss of one they loved. Eternal life, even if it involves re-emerging, followed by a placenta, every now and again – what’s not to like?

Well, for one: False Hope, one of the more depressing falsies we’ve come up with. History tells us that when it’s curtains, when it’s time to shuffle off the mortal coil, when you kick the bucket, it stays kicked, shuffled, drawn. But belief in rebirth, life after death, the beyond, the “other side”, the immortal realm, just encourages the living to accept their lot. Why rise up? Why change the world? All we have to do is be good and die and we’ll come back as something much better. It is the way religious and societal hegemonies maintain their hierarchies and keep their one-percenters at the top. Re-birth, born again, life eternal: it is snake oil for the masses, it is the suppression of the poor, it is the dream of the put-upon, it is the squalid deaths of billions wanting something better, and it all starts with Spring.

But the flowers are out, the blossoms are lovely, go out and enjoy.


Democrafatigue

It’s raining. I’m upstairs on the no. 49 to Clapham Junction in a humid breath fug. Two gents are sitting in front of me: “May 3rd. More bloody elections. Might as well move into the voting booth, I’m in there so often.”

“Yeah.”

“It’s bigger than most flats round here anyway.”

“And you get a free pencil.”

“I don’t understand why MPs can’t decide anything on their own without asking everybody to go and vote all the time.”

I groan slightly at the luxury of their Democrafatigue and they shut up. But I’m left with a couple of thoughts: Democracy’s clearly not getting the respect it used to; and, though it’s been over a hundred years since he was first cited in the Court of Appeal, the man on the Clapham Omnibus/Routemaster is alive and well and reeking of Old Spice.

On the other hand, Democracy in the internet age seems much less robust.

For a start, the technology – a piece of paper, a pencil and a balsa wood booth – is centuries behind most other opinion collating mechanisms. According to the Office for National Statistics, “90% of men and 88% of women” and “virtually all adults aged 16 to 34 years (99%)” are internet users. Compare that to the historic high voting booth user rate of just 68.8% in the last election.

Most of the UK has access to instant polling and voting on everything from a blue/gold dress to feeding celebrities live cockroaches, and they attract more voters than your average election. More people know how to use Facebook than a voting slip. If we cherish democracy, rather than just give it lip service, and expect it to be relevant, we really need to help it adapt to modern life. For my university-aged kids, visiting a school hall with makeshift booths – that don’t even take whacky photos to upload to Instagram – is akin to entering some ancient church. You go there out of respect for an historical idea but everyone knows it’s barely fit for purpose when, in their pockets, they have the ability to instantly connect to the 65 million other people who live in the UK, to say nothing of the rest of the developed world.

You might think that the enforced slow speed of our pencil and paper democracy might encourage serious contemplation of the issues being voted on but, nowadays, my opinion changes seven times before breakfast. I scroll through headlines on my phone before I get dressed, listen to the radio as I shave, and before the first oat hits the bottom of my bowl I’ve encountered countless arguments and I’m ready to voice my outrage, or support, for things I had no idea existed an hour before, let alone cared about.

The system has problems. Even back in its early days, when chaps with beards were still working out how to plague generations of schoolchildren with fiendish calculations for hypotenuses and other tangents, Plato spotted the fundamental flaw in democracy. “The insatiable desire of [freedom],” he says in his Republic, “introduces the change in democracy, which occasions a demand for tyranny. … When a democracy which is thirsting for freedom has evil cupbearers presiding over the feast, and has drunk too deeply of the strong wine of freedom, then, unless her rulers are very amenable and give a plentiful draught, she calls them to account and punishes them.” Democracy, Plato argues, naturally leads to tyranny unless democratic leaders are benign. He recognised that democratic voters elect personalities, not policies. We vote in our image. We vote for people. Which means that the policies of those with charisma, or sheer force of personality, always trump (forgive the pun) those of boring people with ideas of good governance. Democracy leads to populism, populism is a cult of personality, and the ultimate personality is a demagogue, a tyrant, a dictator.

Democracy is one of humanity’s most sacred memes; even Popes get elected. It’s an idea that’s been around so long you’d have thought we would have a handle on it by now. Yet it remains so nebulous that Wikipedia just says “No consensus exists on how to define democracy, but legal equality, political freedom and rule of law have been identified as important characteristics.” Like ‘Art’, its lack of definition is both its strength and its weakness. We can’t dismantle it but we can’t enshrine it either. It’s anything we want it to be. A chore in Clapham, a liberation in Soweto. Whether it’s to elect another bunch of narcissists into Parliament or rip away our European citizenship, it can be both inane and profound.

On Thursday 3rd of May, on the face of it, all we’re doing is electing a bunch of faceless Council bureaucrats to oversee our parking permits and deny us planning permission for our basements. But, in reality, this is the last official democratic opportunity to put our opinion to Parliament before Brexit.

It can’t be overstated how important this local election is. If it doesn’t matter what colour of politics runs your rubbish collection, if you believe that British sense of decency and fair play means tolerance even of bally foreigners, if your life (like mine) only exists because immigrants were allowed to escape here from war-torn Europe, if you’re happy being a European, if you were born a European, if your family are European, if you have children who may need the work opportunities that a 27 country bloc can offer, if your property is losing value as London loses international significance, if you want the Troubles in Northern Ireland to never return, if you want the economy to turn around, if you want the focus of our politicians to go back to genuinely pressing domestic issues like the funding of the NHS… then only vote for a party that definitively supports a referendum on the final Brexit deal: Vote Lib Dem, Vote Green, Vote Renew, Vote Remain. Send a message and use democracy like it’s still in fashion.

This article first appeared in

Would smell as sweet

Merry Arse. Mari-Arty. Fat Slob. Dick. Bastard. Cockroach. Arsehole. Shit. Muvverfucca. Weirdo. Lefty Loon. Twat. Complete (Immanuel) Kant. Disingenu. Freak. And, according to one of my teachers who struggled to get beyond the 18th century: Dionysian Strumpet. But that’s enough about me, what do they call you?

It’s marvellous how one person can mean so many different things to different people, but I earned each and every one of those names. I’m not exactly proud of them, but at least the things I’ve done have inspired people to reach into their personal lexicons to find an appropriate way to define me, just as my birth inspired my parents to call me something which even I am not sure how to pronounce.

But nowadays I, and I dare say you too (unless you are using this newspaper as insulation on a frosty park bench, in which case I recommend the collected works of Don Grant, sufficient to keep a gentleperson toasty for life), find ourselves with a name which we did nothing to earn. A name that just collects us together as one side, a contemptible enemy. For you and I, almost certainly, are the “Metropolitan Elite.” We have been united in a name, despite the fact that we have probably never met, and even if we did we might find the only thing we had in common was our desire to meet someone better looking.

“Metropolitan Elite” is just a handy term reserved for hate speech, one that smacks of exclusivity and money in a time of austerity because (a) you can somehow afford to live or work in a city and (b) that in itself makes you elite. There’s no name for the other side – The Rural Rabble? – because this collection is an illusion. Over half the country lives or works in a metropolitan area. But then “One Percenters” probably seemed a little too small, and actually elite, to explain the 48% of the UK who voted to Remain or the same percentage of Americans who voted for Hillary.

Hate terms can, of course, be adopted and repurposed by the hatees. Rappers use the N-word as an empowered and exclusive term of brotherhood. And for the rest of us, hearing it and never wishing to say it lets us never forget where it came from; but then its perpetuation also makes it all the more attractive for white supremacists to use as a badge of bravery, doubling down and challenging the taboo. Do I like being called “Elite”? You can bet your Top Gun I do; the OED says an “elite” is “superior in terms of ability or qualities to the rest of a group or society.” I didn’t ask to be one but if you insist…

We have been branded like very posh cattle, so would any of us adopt “Metropolitan Elite” as a badge of honour? Unless we own it and try to change its meaning it will always sound like we fret over avocado shortages at Waitrose and the dreadful accent the nanny is teaching Imogen and Hugo.

For 20th Century Marxists the “Bourgeoisie” was the collective bête noire. But the word literally means “those who live in a borough”: city dwellers or, if you prefer, “Metropolitans”. Living in a city seems to inspire a political paranoia: all those people living near each other must be colluding against the interests of the rest. Creating a collective enemy from an economic perception is an old political con. The Bolsheviks inspired a poor, mainly rural, Red Army to march on Russian cities. Pol Pot’s Khmer Rouge forced all city dwellers on long marches to start farming, declaring anybody with glasses an “intellectual” to be put to death. In 1918 in Russia, in 1970s Cambodia, it didn’t end well.

Despite these examples, it’s not an exclusively Left Wing strategy. “Metropolitan Elite” is a term bandied about by Left and Right because, if you hold an extreme view, you need to create an enemy of the middle, a way to define them as a collective who mean you harm. It’s no good just going after your polar opposite; that’s just sectarianism. So for the Nazis it was the Jewish conspiracy; for ISIL or Al Qaeda it’s the “Kafir”, literally anyone who does not believe the same thing.

The neatest part of political paranoia is that once you start attacking your made up enemy, you force those you’ve declared as working together to, well, work together; you create the conspiracy you made up in the first place.

Name-calling is as old as the meth user Methuselah and, in the modern era of Twitter and Snapchat, limited by either number of characters or, simply, juvenile vocabulary, it is the easiest shorthand for expressing complex ideas. There is no room for subtlety or debate on social media. Just statements or reactions, anger or mockery, puffs or put-downs.

America’s Right have forged “young liberals” into “special snowflakes”, leading what might otherwise have been disparate “snowflakes” to adopt the insult and rebut it with “beware of avalanches” rhetoric. The Alt-Right have created their enemy; now they can start recruiting.

But even the sound of names can have an effect. Time and again, the names that dominate the political narrative lead the day. Both Trumps and Clintons are types of cards, but Trumps win. “Brexit”, with its novelty portmanteau, its plosive and fricative phonemes, defined the entire referendum, while “Remain”, sounding weak and ineffectual, was always the opposition rather than the lead. Even now the people pushing the leave campaign are “Brexiteers”, sounding romantic and swashbuckling, whilst “Remoaners” fail to set the agenda. Why haven’t we got Brexshits and e-Uniters at the very least?

When the political middle is given a name, forced to become a side, extremism is on the rise. And, in this climate, we need to ask whether we should adopt the names we’re called, refute them, or try to ignore them. Metropolitan? That’s me. Elite? If you say so. But if you call us that to dismiss us, we need to stand… for something.

 

This article first appeared in