Showing all posts tagged twitter:

Algorithmic Networks and Their Malcontents

The thing that really annoys me about the death of Twitter1 is that there is no substitute. As I wrote:

none of these upstart services will become the One New Twitter. Twitter only had the weight it had because it was (for good and ill) the central town square where all sorts of different communities came together. With the square occupied by a honking blowhard and his unpleasant hangers-on, people have dispersed in a dozen different directions, and I very much doubt that any one of the outlet malls, basement speakeasies, gated communities, and squatted tenements where they gather now can accommodate everyone who misses what Twitter was.

It’s worth unpacking that situation to understand it properly. Twitter famously had not been growing for a long time, leading users to speculate that:

Maybe we already saw the plateau of the microblog, and it turns out that the total addressable market is about the size that Twitter peaked at. It is quite possible that Twitter did indeed get most of the users who like short text posts, as opposed to video (TikTok), photo (Instagram), or audio.

In their desperation to resume growing, Twitter started messing with users’ timelines, adding algorithmic features that were supposedly designed to help users see the best content — but of course, being Twitter, they went about it in a ham-fisted way and pissed off all the power users instead of getting them excited.

The thing is, Twitter is far from the only social network to fail to land the tricky transition to an algorithmic timeline. All of the big networks are running scared of the Engagement that TikTok is able to bring, but they seem to have fundamentally misunderstood their respective situations.

All of the first-generation social networks — Twitter, Facebook, LinkedIn — rely on the, well, network as the key. You will see posts from people you are connected to, and in turn the people who are connected to you will see your posts. Twitter was always at a disadvantage here, because Facebook and LinkedIn built on existing networks: family and friends for Facebook, and work colleagues and acquaintances for LinkedIn. Twitter always had a "where do I start from?" problem: when you signed up, you were presented with a blank feed, because you were not yet following anybody.

Twitter flailed about trying to figure out how to recommend accounts to follow, but never really cracked that Day One problem, which is a big part of the reason why its growth plateaued2: Twitter had already captured all of the users who were willing to go through the hassle of figuring that out, building their follow graph, and then pruning it and maintaining it over time. Anyone less committed bounced off the vertical cliff face that Twitter offered in lieu of an on-ramp.

The Algorithm Shall Save Us All!

TikTok was the first big network to abandon that mechanism, and for good reason: at this point, all the other networks guard their users’ social graphs jealously for themselves. It is hard to bootstrap a social network like that from nothing. Instagram famously got its start by piggybacking on Twitter, but that’s a move you can only pull off once. Instead, TikTok went fully algorithmic: what you see in your feed is determined by the algorithm, not by whom you are connected to. The details of how the algorithm actually works are secret, controversial, and constantly changing anyway, but at a high level it’s some combination of your own past activity (what videos you have watched), the activity of people like you, and some additional weighting that the network applies to show you more videos that you might like to watch.

This means that a new account with no track record and no following is shown a feed full of videos from the moment they first sign in. The quality might initially be a bit hit or miss, but it refines rapidly as you use the platform. In the same way, a good video from a new account can break out and go viral without that account having to build a following first, as it would have had to on the first-wave social networks.
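
At a very high level, that kind of feed ranking can be sketched as a weighted blend of signals. To be clear, this is purely an illustrative sketch: the real ranking systems are secret and vastly more complex, and every name and weight below is invented for the example.

```python
# Illustrative sketch of an algorithmic feed ranker. All signal names and
# weights are invented; real systems are secret and far more complex.
from dataclasses import dataclass

@dataclass
class Video:
    topic: str
    creator_follower_count: int

def score(video: Video,
          my_watch_history: dict[str, float],
          cohort_affinity: dict[str, float],
          platform_boost: float = 1.0) -> float:
    """Blend three signals: my own past activity, the activity of
    people like me, and an extra platform-applied weighting."""
    personal = my_watch_history.get(video.topic, 0.0)
    cohort = cohort_affinity.get(video.topic, 0.0)
    # Note: follower count plays no role here, which is the point --
    # a brand-new account can still rank highly on content signals alone.
    return (0.5 * personal + 0.4 * cohort) * platform_boost

history = {"cooking": 0.9, "cats": 0.6}
cohort = {"cooking": 0.7, "woodworking": 0.8}
new_account_video = Video(topic="cooking", creator_follower_count=0)
print(score(new_account_video, history, cohort))  # ranks high despite zero followers
```

The key design property this toy captures is that the follow graph does not gate distribution: the score depends on content signals, not on who follows whom.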

When people started talking about algorithmic timelines like this, Twitter thought they had finally struck gold: they could recommend good tweets, whether they were from someone the user followed or not. This would fill those empty timelines, and help onboard2 new users.

The problem is that users who had put in the effort to build out their graph placed a lot of value in it, and were incandescently angry when Twitter started messing with it. I liked Old Twitter because I had tuned it, over more than a decade, to be exactly what I wanted it to be, and I know a lot better than some newly-hatched algorithm what sort of tweets I want to see in my timeline.

An algorithmic timeline doesn’t have to be bad, mind; Twitter’s first foray into this domain was a feature called "While you were away" that would show you half a dozen good tweets that you might have missed since you last checked the app. This was a great feature that addressed a real user problem: once you follow more than a few accounts, it’s no longer possible to be a "timeline completionist" and read every tweet. Especially once you factor in time zones, you might miss something cool and want to catch up on it once you’re back online.

The problem was the usual one with algorithmic features, namely, lack of user control. Twitter gave users no control over the process: the "While you were away" thing would appear whenever it cared to, or not at all. There was no way to come online and call it up as your first stop to see what you had missed; you just had to scroll and hope it might show up. And then they just quietly dropped the whole feature.

Sideshow X

Twitter then managed to step on the exact same rake again when it rolled out a fully algorithmic timeline, but, in response to vociferous protests from users, grudgingly offered the option of switching back to the old-style purely chronological one. Initially, it was possible to have the two timelines (algorithmic and chronological) in side-by-side tabs, but, apparently out of fear that the tabbed interface might confuse users, Twitter quickly removed this option and forced users to choose between a purely chronological feed or one managed by a black-box algorithm with no user configurability or even visibility. Of course, power users who used lists were already very familiar with tabs in the Twitter interface, but this was not a factor in Twitter's decision-making.

To be clear, this dilemma between serving newbies and power users is of course neither new nor unique to Twitter. This particular variation of it is new, though. Should social networks focus on supporting power users who want to manage their social graph and the content of their feed themselves — or should they chase growth by using algorithms to make it as easy as possible for new users to find something fun enough to keep them coming back?

There is also one factor exacerbating the dilemma that is somewhat unique to Twitter. Before That Guy came in and bought the whole thing, Twitter had been consistently failing to live up to an IPO valuation that was predicated on them achieving Facebook levels of growth. Instead, user growth had pretty much stalled out, and advertisers looking for direct-action results were also not finding success on Twitter in the same way as they did on Facebook or Instagram. The desperation for growth was what drove Twitter to over-commit to the algorithmic timeline, in the hope of being able to imitate TikTok’s growth trajectory.

There is irony in the fact that an undersung Twitter success story saw the company play what is normally more of a Facebook move: successfully ripping off the buzzy new entrant Clubhouse with its own Twitter Spaces feature and then simply waiting for the attention of the Net to move on. Now, if you want to do real-time audio, Twitter Spaces is where it's at — a status achieved largely thanks to Clubhouse's ballistic trajectory from Next Big Thing to Yesterday's News, with the rapidity of the ascent ruthlessly mirrored by the suddenness of the descent.

A more competently managed company — well, they wouldn’t have been bought by That Guy, first of all, but also they might have learned something from that lesson, held firm to their trajectory, and remained the one place where everything happened, and where everything that happened was discussed.

Instead, we have somehow wound up in a situation where LinkedIn is the coolest actually social network out there. Well done, everyone, no notes.

🖼️ Photos by Nastya Dulhiier and Anne Nygård on Unsplash

  1. Yeah, still not calling it X. That guy destroyed my favourite thing online, I’m not giving him the satisfaction. 

  2. Verbing weirds language. 

Can You Take It With You?

Here’s a thought: could Threads be a test case for social graph portability?

I am thinking here of both feasibility (can this be done technically) and demand (would the lack of this capability slow adoption). I am on record as being sceptical on both fronts, pace Cory Doctorow.

the account data is not the only thing that is valuable. You also want the relationships between users. If Alice wants to join a new network, let's call it Twitbook1, being able to prepopulate it with her name and profile picture is the least of her issues. She is now faced with an empty Twitbook feed, because she isn't friends with anyone there yet.

People like Casey Newton are asserting that Instagram can serve as a long-term growth driver for Threads, but I’m not so sure, precisely because of the mismatch in content. I don’t use Instagram, but what I hear of how people use it is all about pretty pictures and, more recently, video.

This is the point I made in my previous post: should a relationship in one social network be transitive with a different network? Does the fact that I like the pretty pictures someone puts out mean that I also want to consume short text posts they write? Or is it not more likely that my following on Threads would be different from that on Instagram, much as my following on Twitter is?

The closest direct comparison to the sort of fluid account portability that Cory Doctorow advocates would in fact be the ability to import my Twitter following directly into Threads or Bluesky, since those services are so very similar. Even such a direct port would still run afoul of the dangling-edges problem, though: what if the person I have a follow relationship with on Old Twitter isn't on I Can't Believe It's Not Twitter? Or what if they have different identities across the two services?

I still have questions about how much actual demand is out there for the format that Twitter (accidentally) pioneered. Maybe we already saw the plateau of the microblog, and it turns out that the total addressable market is about the size that Twitter peaked at. It is quite possible that Twitter did indeed get most of the users who like short text posts, as opposed to video (TikTok), photo (Instagram), or audio2.

On the other hand, I am also not too exercised about the fact that Threads users are already spending less time in the app. It’s simply too early to tell whether this is an actual drop-off in usage, or just normal behaviour. Users try something once, but they have not had the time to form a habit yet — and there isn’t yet the depth of content being generated on Threads to pull them into forming that habit.

Anyway, this question of portability or interoperability between networks is the aspect of the Threads story that I am watching most closely. For now, I continue to enjoy Mastodon, so I’m sticking with that, plus LinkedIn for work. When the Twitter apps shifted to 𝕏, I deleted them from my devices, and while I have viewed tweets embedded in newsletters, I haven’t yet caved in and gone back there.

🖼️ Photo by Graham Covington on Unsplash

  1. Twitbook: that’s basically what Threads is. I hereby claim ten Being Right On The Internet points. 

  2. Audio is interesting because it feels like it is still up for grabs if someone can figure out the right format. Right now there is a split between real-time audio chat (pioneered by Clubhouse, now mostly owned by Twitter Spaces), and time-shifted podcasts. I think it’s fair to say that both of those are niches compared to the other categories. 

Pulling On Threads

No, I have not signed up for Threads, Facebook’s1 would-be Twitter-killer, but I couldn’t resist the headline.

I am also not going to get all sanctimonious about Facebook sullying the purity of the Fediverse; if you want that, just open Mastodon. Not any particular post, it’ll find you, don’t worry. Big Social will do its thing, and Mastodon will do its thing, and we’ll see what happens.

No, what I want to do is just reflect briefly on this particular moment in social media.

Twitter became A Thing due to a very particular set of circumstances. It arrived in 2006, at roughly the same time as Facebook was opening up to the masses, without requiring a university email address. Twitter then grew almost by accident, at the same time as Facebook was flailing about wildly, trying to figure out what it actually wanted to be. Famously, many of what people today consider key features of Twitter — at-replies, hashtags, quote tweets, and even the term "tweet" itself — came from the user community, not from the company.

This was also a much emptier field. Instagram was only founded in 2010, and acquired by Facebook in 2012. LinkedIn also stumbled around trying to get the Activity Feed right, hiding it before reinstating it. Mastodon was first released in 2016, but I think it’s fair to call it a niche until fairly recently.

The lack of alternatives was part of what drove the attraction of early Twitter. Brands loved the simplicity of just being @brand; you didn’t even have to add "on Twitter", people got it. Even nano-influencers like me could get a decent following by joining the right conversations.

Bring Your Whole Self To Twitter

A big part of the attraction was the "bring your whole self" attitude: in contrast to more buttoned-down presentations elsewhere, Twitter was always more punk, with the same people having a professional conversation one moment, and sharing their musical preferences or political views the next. Twitter certainly helped me understand the struggles of marginalised groups more closely, or at least as closely as a white middle-class cis-het2 guy ever can.

This "woke" attitude seems to have enraged all sorts of people who absolutely deserved it. The problem for Twitter is that one of those terrible people was Elon Musk, who not only was a prolific Twitter user, but also had the money to just buy out the whole thing, gut it, and prop up its shambling corpse as some sort of success.

The ongoing gyrations at Twitter have prompted an exodus of users, and a consequent flowering of alternatives: renewed and more widespread interest in Mastodon, the launch of Bluesky by Twitter founder Jack Dorsey (and if that endorsement isn’t enough to keep you away, I don’t know what to tell you), and now Threads.

Where Now?

My view is that none of these upstart services will become the One New Twitter. Twitter only had the weight it had because it was (for good and ill) the central town square where all sorts of different communities came together. With the square occupied by a honking blowhard and his unpleasant hangers-on, people have dispersed in a dozen different directions, and I very much doubt that any one of the outlet malls, basement speakeasies, gated communities, and squatted tenements where they gather now can accommodate everyone who misses what Twitter was.

The point of Twitter was precisely that it brought all of those different communities together — or rather, made it visible where they overlapped. Now, there is not the same scope for spontaneous work conversations on the various Twitter alternatives, because LinkedIn is already there. In the usual way of Microsoft, they have put in the work and got good — or at least, good enough for most people’s purposes. You can follow influential people in your field, so the feed is as interesting as you care to make it (no, it’s not just hustle-porn grifters). Those people have separate lives on Instagram, though, where they post about non-work stuff, with a social graph that only overlaps minimally with their LinkedIn connections.

Would-Be Twitter Replacements

So, my expectation is that Mastodon will continue to be a thing, but will remain a niche, with people who like tinkering with the mechanics of social networks (both the software that runs them and the policies that keep them operating), and various other communities who find their own congenial niches there. Me, I like Mastodon, but there is a distinct vibe of it being the sort of place where people who like to run Linux as a desktop OS would like to hang out. Hi, yes, it me: I did indeed start messing with Linux back in the 90s, when that took serious dedication. It also has a tang of old Usenet, something that I caught the tail end of and very much enjoyed while it lasted. Lurking on alt.sysadmin.recovery was definitely a formative experience, and Mastodon scratches the same itch.

Threads will have at least initial success, thanks to that built-in boost from anyone being able to join with their Instagram account — and crucially, their existing following. There is an inherent weirdness to Threads being tied to Instagram, of all Facebook's properties. Instagram is fundamentally about images, while Threads is aiming to be a replacement for Twitter, which is fundamentally about text. Time will tell whether the benefits of a built-in massive user base outweigh that basic mismatch.

The long-term future of Threads is determined entirely by Facebook’s willingness to keep it going. Not many people seem to have noted that signing up for Threads is a one-way door: to delete your Threads account, you have to delete your whole Instagram account. This is a typical Facebook "We Own All Your Data"3 move, but also guarantees a baseline of "active" accounts that Facebook can point to when shopping Threads around to their actual customers — advertisers.

Bluesky? I think it’s missed its moment. It stayed private too long, and fell out of relevance. The team there got caught in a trap: the early adopters were Known Faces, and they quite liked the fact that Bluesky only had other people like them, with nobody shouting at the gates. Eventually, though, if you want to grow, you need to throw open those gates — and if you wait too long, there might be nobody outside waiting to come in any more.
I may be wrong, but that’s what it looks like right now, in July 2023.

🖼️ Photo by Talin Unruh on Unsplash.

  1. I’m not going to give them the satisfaction of calling them "Meta" — plus if they’re not embarrassed by the name yet, they will be pretty soon. 38 active users, $470 in revenue (not a typo, four hundred and seventy dollars). By the numbers, I think this may be the rightest I have ever been about anything. 

  2. Not a slur, don’t fall for the astro-turfing and engage with the latest "controversy" — and if you’re reading this in the future and have no idea what I’m talking about, thank your lucky stars and move on with your life. 

  3. We won’t get into the fact that Threads wasn’t even submitted for approval in the EU. The reason is generally assumed to be that its data retention policy is basically entirely antithetical to the GDPR. However, since it doesn’t really seem to differ significantly from Instagram’s policy, one does wonder whether Instagram would be approved under the GDPR if it were submitted today, rather than being grandfathered in as a fait accompli, with ever more egregious privacy violations salami-sliced in over the years by Facebook. 

Twitter of Babel

It's fascinating to watch this Tower of Babel moment, as different Twitter communities scatter — tech to Mastodon, media to Substack Notes, many punters to group chats old & new, and so on.

Twitter used to be where things happened, for good or for ill, because everyone was there. It was a bit like the old days of TV, where there was a reasonable chance of most people around the proverbial office water cooler having watched the same thing the previous evening. We are already looking back on Twitter as having once filled a similar role, as the place where things happened that we could all discuss together. Sure, some of the content was reshared from Tumblr, or latterly, TikTok, but that's the point: it broke big on Twitter.

Now, newsletter writers are having to figure out how to embed Mastodon posts, and meanwhile I'm having to rearrange my iPhone screen to allow for the sudden explosion of apps, where previously I could rely on Twitter in the dock and an RSS reader on the first screen.

Whether Twitter survives and in what form, it's obvious that its universality is gone. The clarity of being @brand — and not having to specify anything else! — was very valuable, and it was something that Facebook or Google, for all their ubiquity, could never deliver.

There is value in a single digital town square, and in being able to be part of a single global conversation. Twitter was a big part of how I kept up with goings-on in tech from my perch in provincial Italy. Timezones aside, Twitter meant that not being in Silicon Valley was not a major handicap, because I could catch up with everything that was being discussed in my own time (in a way that would not have been possible if more real-time paradigms like Clubhouse had taken off).

Of course town squares also attract mad people and false prophets, for the exact same reason: because they can find an audience. This is why it is important for town squares to have rules of acceptable behaviour, enforced by some combination of ostracism and ejection.

Twitter under Musk appears to be opposed to any form of etiquette, or at least its enforcement. The reason people are streaming out of the square is that it is becoming overrun with rude people who want to shout at them, so they are looking for other places to meet and talk. There is nothing quite like the town square that was Twitter, so everyone is dispersing to cafes, private salons, and underground speakeasies, to continue the conversation with their particular friends and fans.

These days few of us go to a physical town square every day, even here in Italy where most of the population has access to one. They remain places where we meet, but the meeting is arranged elsewhere, using digital tools that the creators of those piazzas could not even have imagined.

As the Twitter diaspora continues, maybe more of us — me included! — should remember to go out to the town square, put the phone away, and be present with people in the same place for a little while.

Then, when we go back online — because of course we will go back online, that's where we live these days — we will have to be more intentional about who we talk to. Intentionality is sometimes presented as being purely positive, but it also requires effort. Where I used to have Twitter and Unread, now I have added Mastodon, Artifact, Substack, and Wavegraph, not to mention a reinvigorated LinkedIn, and probably more to come. There is friction to switching apps: if I have a moment to check in, which app do I turn to — and which app do I leave "for later"?

This is not going to be a purely negative development! As in all moments of change, new entrants will take advantage of the changed situation to rise above the noise threshold. Meanwhile, those who benefited from the previous paradigm will have to evolve with the times. At least this time, it's an actual organic change, rather than chasing the whims of an ad-maximising algorithm, let alone one immature meme-obsessed billionaire man-child.

🖼️ Photo by Inma Santiago on Unsplash

Help, I'm Being Personalised!

As the token European among the Roll For Enterprise hosts, I'm the one who is always raising the topic of privacy. My interest in privacy is partly scarring from an early career as a sysadmin, when I saw just how much information is easily available to the people who run the networks and systems we rely on, without them even being particularly nosy.

Because of that history, I am always instantly suspicious of talk of "personalising the customer experience", even if we make the charitable assumption that the reality of this profiling is more than just raising prices until enough people balk. I know that the data is unquestionably out there; my doubts are about the motivations of the people analysing it, and about their competence to do so correctly.

Let's take a step back to explain what I mean. I used to be a big fan of Amazon's various recommendations: products often bought together with the one you are looking at, or bought by people who viewed the same product. Back in the antediluvian days when Amazon was still all about (physical) books, I discovered many a new book or author through these mechanisms.

One of my favourite aspects of Amazon's recommendation engine was that it didn't try to do it all. If I bought a book for my then-girlfriend, who had (and indeed still has, although she is now my wife) rather different tastes from me, this would throw the recommendations all out of whack. However, the system was transparent and user-serviceable. Amazon would show me why it had recommended Book X, usually because I had purchased Book Y. Beyond showing me, it would also let me go back into my purchase history and tell it not to use Book Y for recommendations (because it was not actually bought for me), thereby restoring balance to my feed. This made us both happy: I got higher-quality recommendations, and Amazon got a more accurate profile of me, which it could use to sell me more books — something it did very successfully.
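
That user-serviceable mechanism can be sketched in a few lines. This is a toy model, not Amazon's actual system; the data structures and item names are all invented for illustration:

```python
# Toy sketch of a user-serviceable co-purchase recommender, with the
# ability to exclude gift purchases from the recommendation inputs.
def recommend(purchases: list[str],
              also_bought: dict[str, list[str]],
              excluded: set[str]) -> list[str]:
    """Suggest items co-purchased with my history, skipping excluded seeds."""
    recs: list[str] = []
    for item in purchases:
        if item in excluded:
            # "Don't use this purchase for recommendations": the gift
            # book no longer pollutes my profile.
            continue
        for suggestion in also_bought.get(item, []):
            if suggestion not in purchases and suggestion not in recs:
                recs.append(suggestion)
    return recs

also_bought = {"book_x": ["scifi_sequel"], "book_y": ["romance_novel"]}
mine = ["book_x", "book_y"]  # book_y was a gift for someone else

print(recommend(mine, also_bought, excluded=set()))       # both tastes mixed together
print(recommend(mine, also_bought, excluded={"book_y"}))  # balance restored
```

The point of the sketch is the `excluded` parameter: the profile stays accurate precisely because the user can see and correct its inputs, which is the control that the black-box systems discussed below refuse to offer.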

Forget doing anything like that nowadays! If you watch Netflix on more than one device, especially if you ever watch anything offline, you'll have hit that situation where you've watched something but Netflix doesn't realise it or won't admit it. And can you mark it as watched, like we used to do with local files? (insert hollow laughter here) No, you'll have that "unwatched" episode cluttering up your "Up next" queue forever.

This is an example of the sort of behaviour that John Siracusa decried in his recent blog post, Streaming App Sentiments. This post gathers responses to his earlier unsolicited streaming app spec, where he discussed people's reactions to these sorts of "helpful" features.

People don’t feel like they are in control of their "data," such as it is. The apps make bad guesses or forget things they should remember, and the user has no way to correct them.

We see the same problem with Twitter's plans for ever greater personalisation. Twitter defaulted to an algorithmic timeline a long time ago, justifying the switch away from a simple chronological feed with the entirely true fact that there was too much volume for anyone to be a Twitter completist any more, so bringing popular tweets to the surface was actually a better experience for people. To repeat myself, this is all true; the problem is that Twitter did not give users any input into the process. Also, sometimes I actually do want to take the temperature of the Twitter hive mind right now, in this moment, without random twenty-hour-old tweets popping up out of sequence. The obvious solution of giving users actual choice was of course rejected out of hand, forcing Twitter into ever more ridiculous gyrations.

The latest turn is that for a brief shining moment they got it mostly right, but hilariously and ironically, completely misinterpreted user feedback and reversed course. So much for learning from the data… What happened is that Twitter briefly gave users the option of adding a "Latest Tweets" tab with chronological listing alongside the algorithmic default "Home" tab. Of course such an obviously sensible solution could not last, for the dispiriting reason that unless you used lists, the tabbed interface was new and (apparently) confusing. Another update therefore followed rapidly on the heels of the good one, which forced users to choose between "Latest Tweets" or "Home", instead of simply being able to have both options one tap apart.

Here's what it boils down to: to build one of these "personalisation" systems, you have to believe one of two things (okay, or maybe some combination):

  • You can deliver a better experience than (most) users can achieve for themselves
  • Controlling your users' experience benefits you in some way that is sufficiently important to outweigh the aggravation they might experience

The first is simply not true. It is true that it is important to deliver a high-quality default that works well for most users, and I am not opposed in principle to that default being algorithmically generated. Back when, Twitter used to have a "While you were away" section which would show you the most relevant tweets since you last checked the app. I found it a very valuable feature — except for the fact that I could not access it at will. It would appear at random in my timeline, or then again, perhaps not. There was no way to trigger it manually, or any place where it would appear reliably and predictably. You just had to hope — and then, instead of making it easier to access on demand, Twitter killed the entire feature in an update. The algorithmic default was promising, but it needed just a bit more control to make it actually good.

This leads us directly to the second problem: why not show the "While you were away" section on demand? Why would Netflix not give me an easy way to resume watching what I was watching before? They don't say, but the assumption is that the operators of these services have metrics showing higher engagement with their apps when they deny users control. Presumably what they fear is that, if users can just go straight to the tweets they missed or the show they were watching, they will not spend as much time exploring the app, discovering other tweets or videos that they might enjoy.

What is forgotten is that "engagement" just happens to be one metric that is easy to measure — but the ease of measurement does not necessarily make it the most important dimension, especially in isolation. If that engagement is me scrolling irritably around Twitter or Netflix, getting increasingly frustrated because I can't find what I want, my opinion of those platforms is actually becoming more corroded with every additional second of "engagement".

There is a common unstated assumption behind both of the factors above, which is that whatever system is driving the personalisation is perfect: unbreakable in its functioning, and without corner cases that deliver sub-optimal results even when the algorithm is working as designed. One of the problems with black-box systems is that when (not if!) they break, users have no way to understand why they broke, nor to prevent them breaking again in the future. If the Twitter algorithm keeps recommending something to me, I can (for now) still go into my settings, find the list of interests that Twitter has somehow assembled for me, and delete entries until I get back to more sensible recommendations. With Netflix, there is no way for me to tell it to stop recommending something — presumably because they have determined that a sufficient proportion of their users will simply be worn down over time into doing whatever the end goal is: watching Netflix original content, say, instead of something Netflix has to pay to license from outside.

All of this comes back to my oft-repeated point about privacy: what is it that I am giving up my personal data in exchange for, in the end? The promise is that all these systems will deliver content (and ads)(really it's the ads) that are relevant to my interests. Defenders of surveillance capitalism will point out that profiling as a concept is hardly new. The reason you find different ads in Top Gear Magazine, in Home & Garden, and in Monocle, is that the profile for the readership is different for each publication. But the results speak for themselves: when I read Monocle, I find the ads relevant, and (given only the budget) I would like to buy the products featured. The sort of ads that follow me around online, despite a wealth of profile information generated at every click, correlated across the entire internet, and going back *mumble* years or more, are utterly, risibly, incomprehensibly irrelevant. Why? Some combination of that "we know better" attitude, algorithmic profiling systems delivering less than perfect results, and of course, good old fraud in the adtech ecosystem.

So why are we doing this, exactly?

It comes back to the same issue as with engagement: because something is easy to measure and chart, it will have goals set against it. Our lives online generate stupendous volumes of data; it seems incredible that the profiles created from those megabytes, if not gigabytes, of tracking data deliver worse results than the single-bit signal of "is reading the Financial Times". There is also the ever-present spectre of "I know half of my ad spending is wasted, I just don't know which half". Online advertising, with its built-in surveillance mechanisms, holds out the promise of perfect attribution: of knowing precisely which ad it was that caused the customer to buy.

And yet, here we are. Now, legislators in the EU, in China, and elsewhere around the world are taking issue with these systems, and either banning them outright or demanding they be made transparent in their operation. Me, I'm hoping for the control that Amazon used to give me. My dream is to be able to tell YouTube that I have no interest in crypto, and then never see a crypto ad again. Here, advertisers, I'll give you a freebie: I'm in the market for some nice winter socks. Show me some ads for those sometime, and I might even buy yours. Or, if you keep pushing stuff in my face that I don't want, I'll go read a (paper) book instead. See what that does for engagement.

🖼️ Photos by Hyoshin Choi and Susan Q Yin on Unsplash


So there I was scrolling innocently through Twitter, when this Tweet popped up in my timeline:

A Promoted tweet? What, are they trying to sell me fighter jets?

The account seems legit, although I am baffled as to why the F-35 needs its own Twitter account. Also, who on Earth are these SIXTY-SIX THOUSAND people who follow it?

Anyway, I still don’t get why they are paying to show tweets to me. Fortunately Twitter lets you look into that, but it just raises more questions:

Why on Earth is Lockheed Martin trying to reach adults in Italy? This seems like a gross misuse of social media marketing. Or am I missing something?

This Is Where We Are, July 2017 Edition

A quick review of the status of the Big Three1 social networks as of right now.

It seems Facebook is testing ads in Messenger now, which is an incredibly wrong-headed idea:

Messenger isn’t really a "free time" experience the way Facebook proper is — you use the former with purpose, the latter idly. Advertisements must cater to that, just like anywhere else in the world: you don’t see the same ads on subway walls (where you have to sit and stare) as on billboards (where you have two or three seconds max and your attention is elsewhere).

I always hated Messenger anyway, purely out of reflex, because they felt the need to split it off into a separate app. In fact, I kept using Paper until Facebook finally broke it, in no small part because it kept everything together in one app. It also looked good, as opposed to the hot mess of FB’s default apps.

Between that and the "Moments" rubbish junking up the top of every one of the FB apps, I am actively discouraged from using them. At this point I pretty much only open FB if I have a notification from there.

Meanwhile, Twitter is continuing on its slow death spiral. It is finally becoming what it was always described as: a "micro-blogging" platform. People write 100-tweet threads instead of just one blog post, and this is so prevalent that there are tools out there that will go and assemble these threads in one place for ease of reading.

It’s got to the point that I read Twitter (and a ton of blogs via RSS, because I’m old-school that way), but most of my actual interaction these days is via LinkedIn. I even had a post go viral over there - 7000-odd views and more than a hundred likes, at time of writing.

So this is where we are, right now in July 2017: Twitter for ephemeral narcissism, Facebook for interacting with (or avoiding) the same people you deal with day to day, and LinkedIn for actually getting things done.

See you out there.

Photo by Osman Rana on Unsplash

  1. I don’t Instagram, I’m too old for Tumblr, and - oh sorry Snapchat, didn’t see you down there

I have come not to bury Twitter, but to praise it

Last week the Atlantic published a eulogy for Twitter, which of course was widely reshared via... Twitter.


Then the reactions started coming in. I think Slate is closest to getting it right:

Twitter is not a social network. Not primarily, anyway. It’s better described as a social media platform, with the emphasis on "media platform." And media platforms should not be judged by the same metrics as social networks.

Social networks connect people with one another. Those connections tend to be reciprocal. […]

Media platforms, by contrast, connect publishers with their public. Those connections tend not to be reciprocal.

I have some conversations on Twitter, but mainly I treat it as a publishing medium. I publish my content, and I follow others who publish there. The interactions on Twitter mainly replace what used to go on in various sites' comments.

The value of Twitter is in making it easy to discover and share content. The "social", meaning Facebook-like, aspects of the platform are entirely secondary to that value. The more Twitter tries to be Facebook, the worse it gets. It should focus on just being Twitter.