A letter of resignation, in both the active and emotional senses of the word

Dear pals, followers and denizens of the internet,

I’m off for a bit. Well, sort of.

To be precise, I’m retiring – for the time being, at least, though I pretty much plan/hope forever – from the bafflingly extended social media ‘universe’ that exists under the name ‘ravcasleygera.’

Bye, Twitter. Sayonara, Tumblr. Arrivederci, Google+. Not to mention WordPress, Quora, more WordPress, more Tumblr, a few Bloggers, Flickr, Friendfeed, and god knows how many others I’ve forgotten about.

Let me be clear about what this isn’t – some Paul Miller-style fleeing of the internet. I’m not even going to stop using social media. I have a Twitter account as @casleygera, devoted to my (hopefully) developing development career, and I intend to keep using that. And I have a professional website, too, which I can kinda-sorta blog on.

But the way I use social media under that name is going to be – has to be – very different.

A brief history of ME (in online form)

I got my first blog in 2004, I think. It was on an entirely wretched service called ‘20Six’. Before long, it moved to Blogger, then WordPress.org, then WordPress.com. Then I was sort-of-on-Tumblr for a while, though I had a Tumblr before that, too.  Somewhere along the line I set up a Blogger-hosted micro-blog, before moving to Twitter when that came along. And I used a variety of services, like Friendfeed, Lifestream.fm and others I can’t remember, to keep track of it all. (I had my Friendfeed RSS Feedburner’d, so people could subscribe to everything I did by RSS!)

I tried auto-importing comments I made on the web into my blog, then turned that off again. I had a public Facebook page, and agonized about whether I could find a way to transfer the ‘ravcasleygera’ username from my profile to the page. Then I changed my mind and deleted the page. Then I thought about bringing it back again, but didn’t.

(The nadir probably came early on, actually, when instead of having a blog, my Blogger page was just a background image with two embedded RSS feed modules, one showing the feed of another blog called ‘Things Rav Likes’, and another showing ‘Things Rav Made’. Because that makes sense!)

As recently as two weeks ago, I was still fiddling, deciding to ‘blog on Google+’ (that lasted two days) and signing up for Medium (where my only post is called something like ‘This is a Test Post of blogging on Medium’).

Finally, slowly, over the last couple of years, it dawned on me that the way I was using social media was not really about connecting with people, having conversations, learning, sharing interests. It was just about plastering myself all over the internet, the way a child might write their name in big letters again and again on their exercise book, their school bag, their pencil case. Grabbing the username, grabbing the custom URL, putting it on my Flavors.me profile. It was about obtaining Full Spectrum Dominance.

“And I can’t tell you when you’ll see your name up in lights,” sang Bros. But web 2.0 – which, to be clear, I believe has tremendous value – makes it very, very easy to see your name up in lights – or at least, beautifully typographied on the PC screen. I spent hours – hours! – designing and redesigning how my name would appear on my blog. I put those header images on the background of my Twitter. I changed the colours of my Friendfeed to match the colours of my blog. I, obviously, frequently changed profile pictures, and went to pains to get them matched up across the slew of services I was signed up to.

I had a toolbar, folks. There was a Rav Casley Gera toolbar. I was 27! Not 12.

At one point my brother pointed out to me that it might be better to have one blog, about something, with a title, than to just plaster my name on every website that would have it and contrive more-and-more-complex ways of bringing it all together (god forbid that anyone would miss anything). I felt genuine shock at the idea. Eventually, I relented, and the website long known as ‘Rav Casley Gera’s blog’ finally got a title, and a good one at that: ‘Here’s the Thing’. But of course I was careful to ensure my name was still right there at the top – and in the URL, too.

I don’t want to make it sound like this was all ego. Some of it was just plain old-fashioned geekiness. Where older nerds might have geeked out maximising their PC performance or something, I geeked out maximising my social media presence. Here’s the Thing had its own Twitter feed, as did my short-lived photo blog. They had Facebook pages, too. God knows why.

(Not to mention the fact that a lot of this was repeated under different personas – I had my silly anonymous account that friends knew about, which had a blog, Twitter, Tumblr and home page; and a couple of other, anonymous, short-lived personas, too, that I used to talk about things I feared personal or professional consequences for discussing.)

And during all of this – needless to say: nobody was paying any attention.

I don’t remember the first time I realised that nobody was actually really reading any of the acres of gunk I was spewing online. The first time I looked at my WordPress stats, probably. I guess I’d always assumed that I was doing better on Twitter – friends would reply, I’d get the occasional retweet from a random. But when Twitter opened up their analytics tools a few months ago, I got a nasty shock – most of the links I was sharing on @ravcasleygera got zero, or maybe one or two, clicks. It was at that moment I realised that maybe this kind of unfocused, look-at-me social media engagement was no good to anyone.

I’m not whining here. Why should anyone have been paying attention? There was nothing much to look at – no expertise, no pathos, not much humour. The sad fact undermining web 2.0 is that most people don’t actually have anything to say that’s of interest to that many other people. There’s a reason that, for 99% of the population (in those places where folks even have the internet), ‘social media’ means ‘Facebook, Snapchat and WhatsApp’ – tools people use for communicating, privately, with their real-life friends. Other online communities, like Reddit and forums and whatever, can connect people with strangers of shared interests, and that’s swell. But the here’s-a-broadcasting-platform aspect of social media, the one that underpins blogging, Twitter, YouTube and Tumblr – most of us really don’t have anything worth broadcasting.

Of course, an ordinary person can achieve a following on social media – look at Joe, for example. But I’ve realised that such people tend to thrive on one social media service. You’re a YouTube star, or a Twitter genius, or a blogger. You probably have a thing you talk about – it might just be your life, but it won’t be a bewildering array of wannabe-opinion-writer material on every topic going. You’re probably genuinely funny, as opposed to making a lot of vaguely gnomic, semi-amusing remarks.

The future

And that brings me back to the future, and why @casleygera is my focus from now on – because it’s going to be about something I at least aspire to be truly expert about. Hopefully people will have a reason to pay at least a little attention. More to the point, though, it won’t take up as much of my time – not because I won’t be producing ‘content’, but because I won’t be fiddling around signing up under that name for every new social media service under the sun.

For example: the Full Spectrum Dominance approach would be to have a Twitter, a WordPress blog, maybe a Tumblr for image posts and quotes, and a .com as a sort of central repository for it all. Well, fuck Full Spectrum Dominance. I’ll post blogs on my main website so people can learn more about me if they visit. I could set it up so people can follow it on Facebook, but until someone tells me they actually want me to, what’s the point?

This is an approach grounded in actually thinking about how you can interact with people, rather than just about being everywhere you possibly can. If you write something useful for people, they’ll find it, regardless of whether you have a presence on whichever social network they happen to use.

At least, I hope so.

I could keep up some small level of @ravcasleygera activity, of course. I could try to focus on one thing – US politics, tech, pop culture, or one of the other million topics I’m interested in outside development – and edit myself better and publish regularly and try again to build a ‘following’. But I don’t see the point. I’ve always been more enthusiastic about setting up social media accounts than updating them. And I worry that if I don’t draw a line under all of this, I’ll find myself drawn back into fiddling about instead of producing – yuck – ‘content’.

So, no. It’s over. If I have something to say, be it an opinion or a joke or whatever, I’ll send it to a friend who I think will like it. Maybe, if it deserves more of an airing, I’ll put it on Facebook, for my friends to see. But I won’t put it on Twitter for a theoretical audience of millions and an actual audience of about 12.

Let me be super-clear – I’m not saying any of you should quit Twitter, or anything else. I’m not even quitting Twitter entirely, after all, just this account. I think most people who use these tools use them in a way that’s broadly mature and healthy. (Although the recent wave of rage over Google+’s restricted usernames shows that many people still value vanity URLs and the like very highly, so I’m not alone.) Like an alcoholic who’s the only one giving up drinking among his friends, I have to accept I’m the one who can’t have these toys until I learn to play nicely with them – in a way that’s grounded in something more than OCD and look-at-me ego.

In the meantime, I’m not taking anything down. Go read my 2000-word post about Whitney Houston, my fast food reviews, my elaborate analyses of the Obama campaign. Some of them aren’t even shit.

Take care of yourselves, diffuse online ‘audience’. I look forward to checking the stats next week and finding out seven whole people have read this – and then not telling you all about it.

The end of ‘coming out’

In death, Sally Ride did what singer and rapper Frank Ocean did a few weeks ago – ‘came out’ without actually, really, coming out. Which is to say that neither Ride’s obituary nor Ocean’s oddly poetic statement actually used the word ‘gay’, ‘lesbian’, ‘homosexual’ or ‘bisexual.’ Ocean admitted that his first love had been a man; Ride’s family admitted, presumably with her pre-death consent, that she had a female romantic partner.

I have a feeling this is going to become more and more the way ‘coming out’ is done in future, both at a personal level and in public life.

Read more on the blog

‘Right’ and ‘left’ are directions, not places

In truth, the public’s view on the big basic questions of political structure hasn’t shifted all that much in forty years. They don’t believe in nationalised industry. They do believe in nationalised public services, but aren’t opposed to some private sector role if it’s controlled and improves results. They believe in the right of unions to strike, but not to cripple the economy. And so on.

When the electorate supposedly swings from left to right, in truth, they’re changing their mind about which party will push things towards the relatively centrist reality they want. In the 1970s, the country’s structure was further to the left – with a bigger role for the state – than the electorate wanted, so they chose the party that would push towards a smaller state. But of course, Thatcher overreached, and by the 1990s it was clear the state was too small – principally, that public services were desperately under-funded. So the public turned to a left-wing(ish) party to swing the pendulum back. By 2010, things had swung a little too far in the state direction, at least in terms of its sheer spending size (the public is still in favour of a bigger role in terms of things like regulation). So the public swung a little in favour of a state-shrinking party again. And of course, the Tories seem to be overreaching again, which may explain Labour’s lead in the polls.

Politicians almost inevitably overinterpret their mandate. The genius of Tony Blair was that, despite his vast victory, he never fooled himself that the electorate had ‘swung to the left.’ In truth, the electorate had stayed pretty much put – but by 1997 the parties had moved far enough to the right that it was Labour, not the Tories, who were closer to the electorate’s ideas of what form the state should take.

Read more on the blog

A short, untitled piece on homoerotic – or not – art

The Advocate has a nice slideshow of paintings by Henry Scott Tuke, a once highly-regarded member of art’s Cornish School (after a few years in my much-loved Newlyn, he did most of his work in Falmouth) now mostly enjoyed by collectors of so-called ‘gay art’. (Elton John has many of his originals.)

Writing about Tuke tends to hedge carefully over the question of his own sexual orientation, merely noting his friendships with the likes of Oscar Wilde (who was, of course, gay) and John Singer Sargent (who many people think was). But while Tuke produced painting after painting of boys bathing nude, there’s very little in his work that you’d call homoerotic in the strict sense – or erotic at all. There’s no playful splashing about and certainly no touching. The boys in Tuke’s paintings are enjoying each other’s company, but they’re not enjoying each other – and neither is the viewer invited to, except in the wholesome sense that we’re invited to enjoy the body of Michelangelo’s David. Indeed, unlike in the case of David (or, for that matter, Sargent’s most famous male nude), there are no actual sex organs in sight in Tuke’s sunny world by the sea.

It would be naive to suggest the popularity of Tuke’s work with modern gay men is entirely based on an aesthetic appreciation of his pleasant brushwork. But the popularity with gay men of images such as this – of innocent, sexless, homosocial nudity – raises all sorts of interesting questions.

Read more on my blog

The real Golden Rule of cultural nostalgia

Since at least the 1980s, the primary focus of cultural nostalgia has always been the decade three decades earlier. In the 80s, the 50s were far more a stylistic influence than the 1940s. Think of Levi’s Norman Rockwell-like TV ads with their early Sam Cooke soundtracks. Stonewash denim. The pastel-shaded, wide-lapel styles of the B-52s and late Talking Heads. Stand by Me. Back to the Future, for Christ’s sake. Think, in fact, of Converse All-Stars: that revival began well before the Berlin Wall fell.

In the 1990s, pop culture was heartily obsessed with the 1960s. Remember that interminable period when the Anthology series was released and it seemed that, in the words of one magazine I read at the time, “the remainder of this decade has been legally handed over to the Beatles”? At the cinema, there was Austin Powers and the revival of James Bond; in music, the 60s were inescapable. Gopnik cites Arctic Monkeys as evidence of the noughties’ 60s adoration, but what about Blur, Oasis, and other Beatles-obsessed 90s bands literally too numerous to list?

The noughties, I’ll admit, were rather all over the place; it does seem as if the greater cultural complexity allowed by the shift of culture from mass platforms like TV to customisable tools like the internet will undermine this trend somewhat. But even so, a thorough thread of 70s revival ran through the noughties – think of The Strokes, skinny jeans and That 70s Show. (Towards the end of the decade, recession, energy crisis and a pervading sense of general decline also helped conjure up that dreary decade.)

And the current decade, whatever we end up calling it, has been thoroughly 80s-obsessed. Shops are full of neon, leopard-print and tribal patterns; shoulder pads, leg-warmers and even the moustache have seen a revival. Synths are unavoidable across the pop music world, while our cinemas have seen the Transformers series (which began last decade; the 80s revival did arrive ahead of schedule), The A-Team, and Super 8, a film openly designed in tribute to Spielberg’s early-80s heyday.

Read more on the blog

The failure of Google+ – and the failures of Twitter

Endless column inches have been devoted, deservedly, to Facebook and its impact on the world. And as a tool for helping people connect and organise, it’s without parallel – witness its role in helping kick off Egypt’s Tahrir Square protests.

But on a day-to-day basis it’s on Twitter, not Facebook, that serious civic engagement and discussion happens online. Most journalists and public intellectuals are active on Twitter and barely visible, at least publicly, on Facebook.

A vast public conversation, with serious minds and ordinary Joes taking part, all with equal status? That sounds remarkably like the online incarnation of Jürgen Habermas’ ideal of the public sphere. The potential for such a vivid, rapid-fire and radically open conversation is immense. My personal Twitter feed is packed with friends, thinkers and people of note from all over the world. I should be frothing with excitement every day at the thought of logging on and seeing what’s being talked about.

But I’m not. Instead, reading through Twitter seems a complete chore, because it means decoding bizarre IRL-speak, looking at ugly visible web links, and generally feeling like you’re in a late-90s bulletin board.

Read more on the blog

So I wrote 2000 untitled words about the death of Whitney Houston, to general bemusement

A funny thing happened to me last week. I heard that Whitney Houston died and I found that I was really quite upset.

I mean, I didn’t cry or anything. But neither was this the standard ‘oh, shame’ response you get when a celebrity dies. I was more upset, for example, than I was when Amy Winehouse died, although Winehouse was younger and her death therefore perhaps more tragic. (I know, I know, but come on: we all agree the death of an 88-year-old is less tragic than that of an 8-year-old; how close do the ages have to be before we stop seeing age as significant?)

Now, a disclaimer: I was drunk. But still, my mood of vague despondency continued for several days. But why? I’m an intelligent adult with a life of my own; I know that hundreds of people die tragic drug-related deaths every day; I know that musically speaking, Whitney is much less significant than Etta James, whose death I barely noticed. And despite writing about it for a living sometimes, I generally see the world of celebrity for the mildly distracting nonsense it is. Why – shoehorned song title alert! – did Whitney’s death make me So Emotional?

Read more on Here’s Another Thing