Showing all posts tagged work:

Turning Over A New Leaf

Yesterday was the LinkedIn equivalent of a birthday on Facebook: a new job announcement. I am lucky enough to have well-wishers¹ pop up from all over with congratulations, and I am grateful to all of them.

With a new job comes a new title – fortunately one that does not feature on this list of the most ridiculous job titles in tech (although I have to admit to a sneaking admiration for the sheer chutzpah of the Galactic Viceroy of Research Excellence, which is a real title that I am not at all making up).

The new gig is as Director, Field Initiatives and Readiness, EMEA at MongoDB.

Why there? Simply put, because when Dev Ittycheria comes calling, you take that call. Dev was CEO at BladeLogic when I was there, and even though I was a lowly Application Engineer, that was a tight-knit team and Dev paid attention to his people. If I have learned one thing in my years in tech, it’s that the people you work with matter more than just about anything. Dev’s uncanny knack for "catching lightning in a bottle", as he puts it, over and over again, is due in no small part to the teams he puts together around him – and I am proud to have the opportunity to join up once again.

Beyond that, MongoDB itself needs no presentation or explanation as a pick. What might need a bit more unpacking is my move from Ops, where I have spent most of my career until now, into data structures and platforms. Basically, it boils down to a need to get closer to the people actually doing and creating, and to the tools they use to do that work. Ops these days is getting more and more abstract, to the point that some people even talk about NoOps (FWIW I think that vastly oversimplifies the situation). In fact, DevOps is finally coming to fruition, not because developers got the root password, but because Ops teams started thinking like developers and treating infrastructure as code.
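To make "infrastructure as code" concrete, here is a deliberately tiny sketch of the idea – every name in it is invented for illustration, and real tools like Terraform or Ansible run this loop against actual cloud APIs rather than dictionaries: you declare the state you want, and a program works out how to converge reality towards it.

    # Toy illustration of infrastructure as code: desired state is data,
    # and a reconciler converges the real world towards it. All names are
    # invented for illustration; real tools (Terraform, Ansible,
    # CloudFormation...) run the same loop against actual cloud APIs.

    desired = {
        "web-01": {"size": "small", "running": True},
        "web-02": {"size": "small", "running": True},
        "db-01":  {"size": "large", "running": True},
    }

    actual = {
        "web-01": {"size": "small", "running": True},
        "db-01":  {"size": "small", "running": False},  # drifted from spec
    }

    def reconcile(desired, actual):
        """Print the actions needed to converge actual onto desired."""
        for name, spec in desired.items():
            if name not in actual:
                print(f"CREATE {name} {spec}")
            elif actual[name] != spec:
                print(f"UPDATE {name}: {actual[name]} -> {spec}")
        for name in actual:
            if name not in desired:
                print(f"DESTROY {name}")

    reconcile(desired, actual)
    # CREATE web-02 {'size': 'small', 'running': True}
    # UPDATE db-01: {'size': 'small', 'running': False} -> {'size': 'large', 'running': True}

The point is less the code than the workflow around it: definitions like these live in version control, get code-reviewed, and are applied by machines rather than typed into consoles.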

Between this cultural shift and the various technological shifts (to serverless, immutable infrastructure, and infrastructure as code) that precede, follow, and accompany it, it’s less and less interesting to talk about separate Ops tooling and culture. These days, the action is in the operability of development practices: building in ways that support business agility, rather than trying to patch the dam by addressing individual sources of friction as they show up.

More specifically to me, my particular skill set works best in large organisations, where I can go between different groups and carry ideas and insights with me as I go. I’m a facilitator; when I’m doing my job right, I break information out of silos and spread it around, making sure nobody gets stuck on an island or perseveres with some activity or mode of thinking that is no longer providing value to others. Coming full circle, this fluidity in my role is why I tend to have fuzzy, non-specific job titles that make my wife’s eyes roll right back in her head – mirroring the flow I want to enable for everyone around me, whether colleagues, partners, or users.

It’s all about taking frustration and wasted effort out of the working day, which is a goal that I hope we can all get behind.

Now, time to blow away my old life…


  1. Incidentally, this has been the first time I’ve seen people use the new LinkedIn reactions. It will be interesting to watch the uptake of this feature. 

Gatekeeping

There has been a bit of a Twitterstorm lately over an article (warning: Business Insider link), in which the executive managing editor of Insider Inc., who "has hired hundreds of people over 10 years", describes her "easy test to see whether a candidate really wants the job and is a 'good egg'": Did they send a thank-you email?

This test has rightly been decried as a ridiculous form of gatekeeping, adding an unstated requirement to the hiring process. Personally I would agree that sending a thank-you note is polite, and also offers the candidate an opportunity to chase next steps and confirm timelines. However, I would not rule out a candidate just because I didn’t receive such a note – and I have also received overly familiar notes which actively put me off those candidates.

These unstated rules matter because of the homogenising effect they tend to have: only people who are already familiar with "how things are done" will be hired or have a career, perpetuating inequality.

This effect has been borne out in a study recently summarised in The Atlantic:

The name that we gave to the culture there was "studied informality" — nobody wore suits and ties, nobody even wore standard business casual. People were wearing sneakers and all kinds of casual, fashionable clothes. There was a sort of "right" way to do it and a "wrong" way to do it: A number of people talked about this one man — who was black and from a working-class background — who just stood out. He worked there for a while and eventually left. He wore tracksuits, and the ways he chose to be casual and fashionable were not the ways that everybody else did.

There were all kinds of things, like who puts their feet up on the table and when they do it, when they swear — things that don’t seem like what you might expect from a place full of high-prestige, powerful television producers. But that was in some ways, I think, more off-putting and harder to navigate for some of our working-class respondents than hearing "just wear a suit and tie every day" might have been. The rules weren't obvious, but everybody else seemed to know them.

I have seen this mechanism in action myself – in much more trivial circumstances, I hasten to add.

One day I was in the office and unexpectedly had to attend a customer meeting at short notice. I was wearing a shirt and a jacket, but no tie, and I was in jeans and sneakers. I apologised to the customer, and there was no issue.

On the other hand, I once visited a "cool" cloud company in my normal business warpaint, and was told in the lift to remove my tie as "otherwise they won't listen to you"…

Let's not even get into the high-school sociological aspects of people wearing the wrong shoes. The distinctions between suits are subtle, but it's obvious when someone is wearing cheap sneakers rather than branded ones.

Instead of unstated and unspoken implicit rules like these, it is much better to have clear and explicit ones, which everyone can easily conform to, such as wearing a suit and tie (or female equivalent – and yes, I know that is its own minefield):

In fact, suits & ties are actually the ultimate nerd apparel. You have to put some effort into shopping, sure, and they tend to cost a bit more than a random vendor T-shirt and ancient combats, but the advantage is that you can thereafter completely forget about wondering what to wear. You can get dressed in the dark and be sure that the results will be perfectly presentable. If you want you can go to a little bit more effort and inject some personality into the process, but the great thing is that you don’t have to. By wearing a suit & tie, you lead people to pay attention to what you say and do, not to what you are wearing. And isn’t that the whole point?

Another unstated gatekeeping mechanism is "culture fit". This is all but explicitly saying "I only want to hire people from my social/class background, and will exclude candidates with a different background".

Here I do think there is some subtlety that is worth exploring. I attempted to respond to a tweet making this argument, but expressed myself poorly (always a risk on Twitter) and did not communicate my point well.

First of all, there is a collision here between "is this person like me" and "would I want to spend time socially with this person". I feel that the sort of people who do this implicit gatekeeping would indeed only want to associate with people from the same background as them, and so this question becomes problematic in that context.

However, some of the reactions to the original tweet appeared to me to take the objection too far, stating that looking for social compatibility at work was ipso facto wrong. Having made several friends through work, I disagree with that view. In fact, I would go so far as to say that my work friendships are influenced in no small part by the fact that my friends are good at their jobs, and the factors that make them good professionals are also the factors that make them good friends: intelligence, trustworthiness, honesty, high EQ, and so on.

The correlation is of course not 1:1; I have known many successful and effective professionals who are not my friends. However, when these factors are excluded entirely from the decision matrix, I see a particular failure mode: the fallacy that only people with abrasive personalities are effective, and that therefore all people with abrasive personalities are good hires because they will be effective. It is not surprising that those sorts of people do not make friends at work.

The particular weight placed upon these factors may vary depending on which role in an organisation is being looked at. Customer-facing positions, where it is important to establish and maintain a rapport, may place particular emphasis on high EQ, for instance.

Of course the opposite failure mode is the one where everybody looks the same, dresses the same, went to the same schools – and only hires people exactly like them. This is why explicit rules and failsafes in the process are important, to avoid "culture fit" becoming – or remaining, if we’re honest – a fig leaf used to cloak institutional racism and classism.

As ever, the devil is in the details.


Images by Kelly Sikkema and Hunters Race via Unsplash

Scheduling

I live my life a half-hour meeting at a time. Nothing else matters: not the mortgage, not the store, not my team and all their bullshit. For those ten seconds or less between meetings, I'm free.

(with apologies to fans of The Fast And The Furious)


🖼️ Photo by rawpixel on Unsplash

Let’s Shed Some Light on This

Do you know what would be really great when I fly in from a time zone quite a few hours misaligned with the local one? It would be fantastic if you all did not immediately lock me in a room without any windows for several hours. That’s bad enough for the people who have to work there all the time, but it’s murder for visitors whose body clocks are still many hours adrift.

Meeting rooms with actual daylight would be a great help to the much-abused circadian rhythms of overseas visitors. That way we can avoid coming down with a bad case of SAD.

Meanwhile, I will continue to self-medicate with my own cocktail of melatonin and gin&tonic. Cheers!


Image by Rawpixel via Unsplash.

The VP of Nope

I have a character in my head, the VP of Nope. This is pure wish-fulfilment on my part: when everyone was in the room taking an utterly wrong and bone-headed decision, I wish there had been someone present who was sufficiently senior to just say "nnnope" and move on.

It seems I’m not the only one to feel that way, judging by the reactions to my tweet mentioning this idea.

(Scoff all you want, but those are pretty big engagement numbers for me.)

The VP of Nope has to be a VP in order not to get bogged down in particulars. Software engineers are especially susceptible to getting ideas into their heads that work well in a small context, but cause all sorts of problems if you take a step back and look at them again in a wider one.

Here’s an example from my own history. I used to work for a company whose products used fat clients – as in, natively compiled applications for each supported platform. This was fine at the time, but web applications were obviously the future, and so it came to pass that a project was initiated to write thin web clients instead. I was part of the early review group, and we were all horrified to find that the developers had opted to write everything in Flex.

If you are not familiar with Adobe Flex¹, it had a very brief heyday as a way to write rich web interfaces, but it came with one major drawback: it ran on top of Adobe’s late, unlamented Flash technology. That dependency caused several serious problems:

  • Corporate IT security policies almost never allowed the Flash plugin to be installed on people’s browsers. This meant that a Flex GUI was either a complete non-starter, or required exceptions to be requested and granted for every single machine that was going to connect to the application back-end, thereby losing most of the benefits of moving away from the fat client in the first place.
  • Thin clients are supposed to be less resource-hungry on the client machine than fat clients (although of course they are much more dependent on network performance). While web browsers were indeed lighter-weight than many fat clients, especially Java-based ones, the Flash browser plugin was a notorious resource hog, nullifying or even reversing any savings.
  • While Apple’s iPad was not yet nearly as dominant as it is today, when it is the only serious tablet, it was still very obvious that tablets and mobile devices in general were The Future. Every company was falling over itself to provide some sort of tablet app, but famously, Steve Jobs hated Flash, and articulated why in his open letter, Thoughts on Flash. All of Steve’s reasons were in themselves valid and sufficient reasons not to develop anything in Flex, or indeed to require Flash in any way, but the fact that Steve Jobs was committing to never supporting Flash on Apple devices killed Flash dead (and there was much rejoicing). Sure, it took a couple of years for the corpse to stop twitching, but the writing was on the wall.

Building any sort of strategic application in Flex after the release of that letter in April 2010 was a brain-meltingly idiotic and blinkered decision – and all of us on the early review programme said so, loudly, repeatedly, and (eventually) profanely. However, none of us had sufficient seniority to make our opinions count, and so the rough beast, its hour come round at last, slouched towards GA to be born into an uncaring world.

This Is Not A Rare Event

I have any number of examples like this one, where one group took a narrow view of a problem, unaware of or even wilfully ignoring the wider context. In this particular case, Engineering had determined that they could develop a thin web client more quickly and easily by using a piece of Adobe technology than by dealing with the (admittedly still immature) HTML5 tools available at the time. Given their internal metrics and constraints, this may even have been the right decision – but it resulted in an outcome that was so wrong as to actively damage the prospects of what had been until then perfectly viable products.

In such situations, the knock-on effects of the initial fumble are often even worse than the immediate impact, and so it was to prove in this case as well. First, enormous amounts of time, energy, and goodwill were wasted arguing back and forth, and then the whole GUI had to be re-written from scratch a second time without Flex, once it became apparent to enough people what a disaster the first rewrite was. Meanwhile, customers were continuing to use the old fat client, which was falling further and further behind the state of the art, since all of Engineering’s effort was being expended on either rewriting the GUI yet again, or strenuously defending the most recent rewrite against its critics. All of this wasted and misdirected effort was a major contributing factor to later strategic stumbles whose far-reaching consequences are still playing out now, nearly a decade later.

This is what is referred to as an omnishambles, a situation that is comprehensively messed up in every possible way – and the whole thing could have been headed off before it even began by the VP of Nope, quietly clearing their throat at the back of the room and shaking their head, once.

Their salary would be very well earned.


Photo by Vladimir Kudinov on Unsplash


  1. Originally developed by Macromedia and later Adobe, it now seems to be staggering through an unloved half-life as an open-source project under the umbrella of the Apache Software Foundation. Just kill it already! Kill it with fire!

How To Run A Good Presentation

There are all sorts of resources about creating a good slide deck, and about being a good public speaker – but there seems to be a gap when it comes to the actual mechanics of delivering a presentation. Since I regularly see even experienced presenters get some of this stuff wrong, I thought I’d write up some tips from my own experience.

I Can’t See My Audience

The first question is, are you presenting to a local audience, or is your audience somewhere else? This seriously changes things, and in ways that you might not have considered. For a start, any sort of rich animation in your slides is probably bad for a remote presentation, as it is liable to be jerky or even to fail entirely.

You should definitely connect to a remote meeting a few minutes ahead of time, even if you have already installed the particular client software required, as there can still be weird issues due to some combination of the version of the plugin itself, your web browser, or their server-side software. If the meeting requires some software you have not used before, give yourself at least fifteen minutes to take care of downloading, installing, and setting that up to your satisfaction.

Even when people turn on their webcam (and assuming you can see something useful through it, as opposed to some ceiling tiles), once you start presenting you probably won’t be able to see them any more, so remember to stop every few minutes to check that everyone is still with you, that they can see whatever you are currently presenting, and whether they have any questions. This is good advice in general, but it’s easier to remember when the audience is in the room with you. When you’re just talking away to yourself, it can be hard to remember that there are other people listening in – or trying to.

Fancy "virtual meeting room" setups like Cisco’s TelePresence are all very well – as long as all participants have access to the same setup. Most times that I have used such systems, a few participants were connecting in from their computers or even from their phones, which of course gave them far less rich functionality. Don’t assume that everyone is getting the full "sitting right across the table from each other" experience!

My Audience Can’t See Me

In one way, presenting remotely without a webcam trained on you can be very freeing. I pace a lot; I do laps of the room while talking into a wireless headset. I think this helps me keep up the energy and momentum of a live presentation, which otherwise can be hard to maintain – both when I’m presenting and when I’m in the audience.

One complication is the lack of presenter mode. I’m on the record as a big fan of presenter mode, and I rely on this feature heavily during live presentations, both for speaker notes on the current slide and to remind myself about the next slide. Depending on the situation, I may also use the presenter view to jump around in my deck, presenting slides in a different order than the one they were saved in. Remote presentation software won’t let you do this, or at least, not easily. You can hack it if you have two monitors available, by setting the "display screen" to be the one shared with the remote audience, and setting the other one to be the "presenter screen", but this is a bit fiddly to set up, and is very dependent on the precise meeting software being used.

This is particularly difficult when you’re trying to run a demo as well, because that generally means mirroring your screen so the remote audience sees the same thing as you do. This is basically impossible to manage smoothly in combination with presenter view, so don’t even try.

Be In The Room

If you are in the room with your audience, there’s a different set of advice. First of all, do use presenter mode, so that you can control the slides properly. Once you switch over to a demo, though, mirror your screen so that you are not craning your neck to look over your own shoulder like a demented owl while trying to drive a mouse that is backwards from your perspective. Make it so you can operate your computer normally, and just mirror the display. Practise switching between these modes beforehand. A tool that can really help here is the free DisplayMenu utility, which lives in your menu bar and lets you toggle mirroring and set the resolution of all connected displays independently.

Before you even get to selecting resolutions, you need to have the right adapters – and yes, you still need to carry dongles for both VGA and HDMI, although in the last year or so the proportions have finally flipped, and I do sometimes see Mini DisplayPort too. I have yet to see even the best-equipped conference rooms offer USB-C cables, but I am seeing more and more uptake of wireless display systems, usually either an AppleTV, or Barco ClickShare. The latter is a bit fiddly to set up the first time, so if you’re on your own without someone to run interference for five minutes, try to get a video cable instead. Once it’s installed, though, it’s seamless – and makes it very easy to switch devices, so that you can do things like use an iPad as a virtual whiteboard.

Especially during the Q&A, it is easy to get deeply enough into conversation that you don’t touch your trackpad or keyboard for a few minutes, and your machine goes to sleep. Now your humorous screensaver is on the big screen, and everyone is distracted – and even more so while you flail at the keyboard to enter your password in a hurry. To avoid this happening, there’s another wonderful free utility, called Caffeine. This puts a little coffee cup icon in your menu bar: when the cup is full, your Mac’s sleep settings are overridden and it will stay awake until the lid is closed or you toggle the cup to empty.
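As an aside: if you’d rather not install anything, macOS ships a built-in caffeinate command-line tool that does much the same job as Caffeine. A minimal sketch, wrapping it from Python so the wake assertion only lasts as long as your talk (the 45-minute figure is just an assumption):

    # Minimal sketch: keep a Mac awake for the length of a talk by calling
    # the built-in macOS `caffeinate` tool (no third-party install needed).
    # -d prevents display sleep; -t sets how long the assertion lasts, in
    # seconds. The talk length below is an assumption -- adjust to taste.
    import subprocess

    TALK_MINUTES = 45

    subprocess.run(["caffeinate", "-d", "-t", str(TALK_MINUTES * 60)])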

Whether the audience is local or remote, Do Not Disturb mode is your friend, especially when mirroring your screen. Modern presentation software is generally clever enough to set your system to not display on-screen alerts while you are showing slides (unless you are one of those monsters who share their decks in "slide sorter" view, in which case you deserve everything you get), but that won’t save you once you start running a demo in your web browser. Some remote meeting software lets you share a specific application rather than your whole screen, but all that means is that instead of the remote audience seeing the specific text of your on-screen alerts, they see ugly great redacted rectangles interfering with the display. Either way, it does not look great.

I hope these tips have been useful. Good luck with your presentations!


Photos by Headway and Olu Eletu on Unsplash

Work From Home

I was reading an interesting blog post about working from home by Julia Evans. I also work from home, so I enjoyed comparing my experience of remote work with hers.

The two main benefits are the obvious ones – I get to live where I want (Montreal) and have the job that I want. And San Francisco tech companies in general pay a lot more than Montreal tech companies, so working for a SF tech company while living outside SF is great.

I can confirm this 100%. I live in Italy rather than in Canada, but the same factors apply: I’d rather be here than there, and the salary is very competitive with what I could make locally.

  • I have a lot of control over my working environment. It’s relatively easy to close Slack and focus.

True! I hated working in an open-plan office, and wore headphones a lot so that I could get some peace and quiet. It did not help that none of my team were in that office, so I was only going there to satisfy some HR mandate.

  • I basically haven’t had to set an alarm for 4 years.

Ha. Nnnope – I still have my alarm set far too early every weekday to take the kids to school. On the other hand, I can have breakfast with them and take them to school, and still get a full day of work in. Part of that is down to the time zone shift, which is both good and bad; more on that later.

  • There’s a nice community of remotes across the company. I’ve gotten to know a lot of wonderful people.

Yes! My team is spread out across four sites and three time zones, and so are many other teams, so there isn’t the sort of downside to being remote that there can be if it’s an exception.

  • I can work from another city/country if I want (like I went to Berlin for 6 weeks in 2016 and it wasn’t disruptive, especially since my 2 teammates at the time lived in Europe).

I haven’t tried this one (those kids and their schools again), but I know other people who’ve done it very successfully. This also works if your area of coverage gets large enough. I knew someone who was in charge of one particular technology alliance partner across the whole of EMEA, which meant that he spent a lot of his time flying. Soon, he realised that this meant he didn’t have to be anywhere in particular, as long as he was near an international airport – so he decamped to Barcelona for a year. Why not?

  • I live in a slightly shifted timezone (3 hours ahead of many people I work with), so I can get stuff done before anybody gets to work.

I am shifted a lot more than that: the difference from Italy to San Francisco is nine hours. The upside is I get a nice quiet start to my day to read, write, and think, and then the US wakes up and I start getting into conference calls. The downside is that there are only a few usable hours of overlap in our schedules, so compatible time slots go quickly. Sometimes you have to do careful triage of what actually needs an interactive voice call, and what can be discussed asynchronously over email or Slack. I make it a hard rule to keep family dinner time free, but I do take calls after dinner several times a month, when we can’t work out other slots.
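Out of curiosity, I worked out what that overlap actually looks like. A quick sketch using Python’s standard zoneinfo module, lining up a slightly stretched working day in Italy (9:00–19:00, my assumption) against an early-starting one in San Francisco (8:00–18:00, also my assumption):

    # How much of the working day overlaps between Italy and San Francisco?
    # The (slightly stretched) working hours are assumptions for illustration.
    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo  # standard library in Python 3.9+

    def workday_utc(tz_name, start_h, end_h, year=2019, month=3, day=1):
        """Return one local working day as a (start, end) pair in UTC."""
        tz = ZoneInfo(tz_name)
        start = datetime(year, month, day, start_h, tzinfo=tz)
        end = datetime(year, month, day, end_h, tzinfo=tz)
        return start.astimezone(timezone.utc), end.astimezone(timezone.utc)

    rome = workday_utc("Europe/Rome", 9, 19)        # staying a bit late
    sf = workday_utc("America/Los_Angeles", 8, 18)  # starting a bit early

    overlap = min(rome[1], sf[1]) - max(rome[0], sf[0])
    hours = max(overlap.total_seconds(), 0) / 3600
    print(f"Usable overlap: {hours:.0f} hours")  # -> Usable overlap: 2 hours

Two hours, and that is with both sides flexing their day; with strict nine-to-six hours on both ends, the overlap is exactly zero. Hence the triage.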

Shout Louder

That last point is important: I joined a team that had previously been able to shout across a table at each other, and suddenly half the team was remote. We had to figure out how to communicate and manage projects across the time zone gap, and there were some stumbles and false starts along the way.

What we ended up figuring out was that different channels work for different tasks. Perhaps not revolutionary, I know, but we took the time while we were all together in person and thrashed it out with a whiteboard: what type of requests should go to which channel, what response times could be expected, and so on.

This is what is known as a "communications charter", and is recommended by HBR for virtual teams:

Communication on virtual teams is often less frequent, and always is less rich than face-to-face interaction, which provides more contextual cues and information about emotional states — such as engagement or lack thereof. The only way to avoid the pitfalls is to be extremely clear and disciplined about how the team will communicate. Create a charter that establishes norms of behavior when participating in virtual meetings, such as limiting background noise and side conversations, talking clearly and at a reasonable pace, listening attentively and not dominating the conversation, and so on. The charter also should include guidelines on which communication modes to use in which circumstances, for example when to reply via email versus picking up the phone versus taking the time to create and share a document.

Get In Their Face

Note that when we were working out our communications charter, we did it with a whiteboard. This is because we made it a goal to get together in person once a quarter or thereabouts. Don’t skimp on this! It’s not cheap: airfare, hotels, and meals all add up. However, the time you spend together face to face will pay off over and over. There is so much that gets done when the team is together, and the benefits continue after the remote team members fly out, because that face time has strengthened relationships and clarified questions.

In fact, face time is so important that it’s the very first point in that HBR list:

It may seem paradoxical to say in a post on virtual teams, but face-to-face communication is still better than virtual when it comes to building relationships and fostering trust, an essential foundation for effective team work. If you can’t do it, it’s not the end of the world (focus on doing some virtual team building). But if you can get the team together, use the time to help team members get to know each other better, personally and professionally, as well to create a shared vision and a set of guiding principles for how the team will work. Schedule the in-person meeting early on, and reconnect regularly (semi-annually or annually) if possible.

Feed The Mind

However, there is one final point that I have not seen listed anywhere else, and that is food. When I work from home, I can make my own meals, and share them with whoever else is around: my kids if they don’t have school, my wife if she is also working from home, or friends who might be taking their lunch breaks at the same time as me.

What do you think? Beats a soggy sandwich at your desk, right?


Top image by Seemann via Morguefile; bottom image courtesy of author.

Unbundling and Rebundling

What is a plumber, anyway?

Tim Harford, better known as the Undercover Economist, always has reliably entertaining thoughts. His latest piece explains Why Microsoft Office is a bigger productivity drain than Candy Crush Saga. Drawn in by that title like moths to a flame, we find the following critique:

Microsoft Office may be as much a drag on productivity as Candy Crush Saga. To see why, consider Adam Smith’s argument that economic progress was built on a foundation of the division of labour. His most celebrated example was a simple pin factory: "One man draws out the wire, another straights it, a third cuts it, a fourth points" and 10 men together made nearly 50,000 pins a day.
[…]
In a modern office there are no specialist typists; we all need to be able to pick our way around a keyboard. PowerPoint has made amateur slide designers of everyone. Once a slide would be produced by a professional, because no one else had the necessary equipment or training. Now anyone can have a go — and they do.
Well-paid middle managers with no design skills take far too long to produce ugly slides that nobody wants to look at. They also file their own expenses, book their own travel and, for that matter, do their own shopping in the supermarket. On a bill-by-the-minute basis none of this makes sense.

Superficially, this take is as amusing as ever, but on reflection I find it a little disingenuous. Leaving the slides out of it (those actually are part of my job), it is true that nowadays we all book our own travel and so on – but on the other hand, very few of us office drones keep our own vegetable gardens or even know how to, and those that do mostly treat it as a hobby.

All that has happened is that the frontier of specialisation has moved, and what was once common knowledge for everyone is now a specialised job, while what once required specialists is now expected of everyone. Where once everybody knew how to grow their own food, now we delegate that to small groups of professionals using specific tools. Meanwhile, data processing, which used to be literally the preserve of a priestly caste, has been democratised to the point that any office job will require people to know at least the basics.

I would love to have an assistant to do my expenses and so on, and instead here I am toiling in the salt mines – but let’s face it, if your expense platform is at all decent and you have reasonable IT skills, this should take roughly no time at all. Booking your own travel ensures that you get what you want, making your own compromises within the corporate travel policy.

This definitional error has some interesting consequences. It is certainly true that most people are slower typists than the professionals who worked in typing pools, when such things still existed; if your measure of productivity is words banged out on a keyboard per minute, it is almost certainly less efficient for professionals to do their own typing. And yet, hitting the keys yourself is always a far quicker way of getting your ideas out than dictating to even the fastest typing pool. How do you measure the productivity difference between an exec tapping out three lines on their iPhone while waiting to board a flight, versus having to wait until they get back to the office on Monday? Sure, those three lines are terse, jargon-filled, and probably stuffed with typos or interesting autocorrect-isms, but they get the point across.

All of this transformation informs Tim Harford’s predictions for 2118:

In an insightful essay from 1996, Paul Krugman predicted that there would be "no robot plumbers" in 2096. I agreed with him then. I am no longer so confident. It seems quite plausible that in 100 years’ time — and perhaps much sooner — plumbers, taxi drivers and many journalists, too, will simply have nothing serious to contribute to the labour market.

I would be seriously impressed by a robot with the combination of agility, strength, and inferential reasoning required to work as a plumber. I may well be proved wrong by events (and if so, I will take refuge in probably not being around to be embarrassed by my wrongness), but I expect it won’t quite work out that way. Instead, I suspect that the job of "plumber" is one of the safest out there, for many of the same reasons that made it impossible to outsource: in addition to knowledge, it requires great situational awareness, problem-solving capability, and flexibility – all of which are weak points for automated systems.

More vulnerable are the jobs that delaminate neatly into separate tasks, some of which will be re-bundled into new and different jobs, while others are automated away. The job of "typist" has gone the way of the dodo because it encapsulated a single task which could either be made part of other jobs (we all do our own typing) or automated (mail merge and variables make it easy to generate and circulate even quite complex documents, without human typing).
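To see how little "typing" is left in a merge like that, here is a minimal sketch using only Python’s standard library – the template and records are invented for illustration, and a real merge would just swap in a CSV export or a database query:

    # Minimal mail-merge sketch, standard library only. Template and
    # records are invented for illustration; a real merge would pull
    # records from a CSV file, a database, or a CRM export instead.
    from string import Template

    letter = Template(
        "Dear $name,\n"
        "Your renewal for $product is due on $due_date.\n"
    )

    records = [
        {"name": "Ada", "product": "WidgetPro", "due_date": "1 May"},
        {"name": "Grace", "product": "WidgetLite", "due_date": "12 May"},
    ]

    for record in records:
        print(letter.substitute(record))  # one personalised letter each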

The job market will certainly be radically different in 2118 – that prediction is fairly safe – but I expect that there will still be jobs, and people doing them: people augmented by automated capabilities.


Photo by Jouni Rajala on Unsplash

Biting My Tongue

So I'm working with a prospect in the fashion and luxury goods area. We've been doing a Proof of Value for the last few weeks, and we're now at the point of presenting the results.

I built the slide deck as if it were a fashion collaboration – "Moogsoft X $PROSPECT_NAME", "Spring-Summer 2017", and so on. I'm super proud of it – not just the conceit, but also the results we were able to deliver for very little effort – but I'm also kind of bummed that I can never show it to anyone outside the company.

This prospect does not want its name used anywhere, so even if – I mean, when – we close the deal, they will only ever appear as a "fashion & luxury goods house".

This is not the first time this has happened to me. At a previous startup, we sold to, umm, let’s call them a certain automotive manufacturer and motorsports team based near Modena. While negotiating the price, the customer asked for "a last effort" in terms of discounting; in exchange, we asked them to provide an official reference. Once they had consulted their brand marketing people, it turned out that the fee for use of their trademark would have been nearly twice the total value of the software deal… We respectfully declined their kind offer.

After all, the main thing is to do the deal and provide value; even if we can't get the logo on our site, it's still a win.

My only remaining problem (beyond actually getting the deal over the line) is that my wife wants me to be paid for this current opportunity in handbags, while the Moogsoft colleague who helped me out wants her share in eau de toilette…

New Paths to Helicon

I was chatting to a friend last week, and we got onto the topic of where sysadmins come from. "When two sysadmins love each other very much…" - no, that doesn't bear thinking about. BRB, washing out my mind with bleach.

But seriously. There is no certification or degree that makes you a sysadmin. Most people come into the discipline by routes that are circuitous, sideways, if not entirely backwards. The one common factor is that most people scale up to it: they start running a handful of servers, move or grow to a 50-server shop, build out some tools and automation to help them get the job done, then upgrade to 500 servers, and so on.

The question my friend and I had was: what happens when there are no 10- and 50-server shops around? What happens when all the jobs that used to be done with on-premises servers are now done in SaaS or PaaS platforms? My own employer is already like that – we’re over a hundred people, and we are exactly the stereotypical startup that features in big infrastructure vendors' nightmares: a company that owns no physical compute infrastructure beyond a clutch of stickered-up MacBooks, and runs everything in the cloud.

The 90s and Noughties, when I was cutting my teeth in IT, were a time when there was relative continuity between desktop and enterprise computing, but that is no longer the case. These days you’ve got to be a pretty technical home user before anything you’re doing is relevant at enterprise scale, because those in-between cases have mostly gone away. I got my start in IT working at the local Mac shop, but neighbourhood computer stores have gone the way of the dodo. There simply are not many chances to manage physical IT infrastructure any more.

Where Are Today’s On-Ramps?

There is one part of that early experience of mine which remains valid and replicable today. My first task was pure scut-work, transferring physical mail-in warranty cards into the in-house FileMaker Pro "database". After two weeks of this, I demanded (and received) permission to redo the UI, as it was a) making my eyes bleed, and b) frustrating me in my data entry. Once I’d fixed tab order and alignments, I got ambitious and started building out data-queries for auto-suggestions and cross-form validation and all sorts of other weird & wonderful functions to help me with the data entry. Pretty soon, I had just about automated myself out of that job; but in doing so, I had proven my value to the company, and received the traditional reward for a job well done - namely, another job.

That is today’s path into computing. People no longer have to edit autoexec.bat on their home computers just to play games, but on the other hand, they will start to mess around behind the scenes of their gaming forum or chat app, or later on, in Salesforce or ServiceNow or whatever. This is how they will develop an understanding of algorithms, and some of them will go on from there, gradually growing their skills and experience.

A Cloudy Future?

To be clear, this cloud-first world is not yet a reality - even at Moogsoft, only a fairly small percentage of our customer base opts for the SaaS deployment option. More use it for the pilot, though, and interest is picking up, even in unexpected quarters. On the other hand, these are big companies, often with tens or hundreds of thousands of servers. They have sunk costs that mean they lag behind the bleeding edge of the change.

Even if someone does have 50 servers in an in-house server room today, as the hardware reaches its end-of-life date, more and more organisations are opting not to replace them. I was talking to someone who re-does offices, and a big part of the job is ripping out the in-house "data closet" to make more work space. The migration to the cloud is not complete, and won't be for some time, but it has definitely begun, even for existing companies.

What will save human jobs in this brave new world is "intersection theory": people finding niches where different sub-fields and specialisations meet. Intuitive leaps and non-obvious connections between widely separated fields are what humans are good at. Those intersections will be among the last bastions of human jobs, augmented by automation of the more narrowly-focused and predictable parts of the work.

There will be other hold-outs too, notably tasks that are too niche to be worth the compute time of training up a neural network. My own story sits somewhere between the two, and would probably remain a viable on-ramp to IT – assuming, of course, that there are still local firms big enough to need that kind of service.

Constant Change Is The Only Constant

And this is not me opining from atop an ivory tower: making those unexpected, non-obvious connections, and doing so in a way that makes sense to humans, is the most precise definition I’d be willing to sign up to of the job I expect to have twenty years from now.

As we all continue to reinvent ourselves and our worlds, let's not forget to bring the next generations in. Thinking that being irreplaceable is an unalloyed win is a fallacy; if you can't be replaced, you also can't be promoted. We had to make it up as we went along, but now it's time to systematise what we learned along the way and get other people in to help us cover more ground.

See you out there.