Form over Substance

The Grauniad informs us that Tolkien's myths are a political fantasy.

We’re left to take on trust from Gandalf, a manipulative spin doctor, and the Elves, immortal elitists who kill humans and hobbits for even entering their territory, when they say that the maker of the one ring is evil. Isn’t it more likely that the orcs, who live in dire poverty, actually support Sauron because he represents the liberal forces of science and industrialisation, in the face of a brutally oppressive conservative social order?

This is an amusing undergraduate thought-experiment, but as a serious critique it is severely flawed. The writer of the article is hardly the first to point out the problems with the descriptions of the Orcs, not to mention the even more problematic descriptions of the Easterlings and Southrons.

Personally, I have always found that this passage, in which Sam Gamgee witnesses at close hand the death of one of these allies of the Enemy, settles the argument:

It was Sam's first view of a battle of Men against Men, and he did not like it much. He was glad that he could not see the dead face. He wondered what the man's name was and where he came from; and if he was really evil at heart, or what lies or threats had led him on the long march from his home; and if he would not really rather have stayed there in peace.

Sam at least certainly seems to consider the dead man fully the equal of other Men1, morally and otherwise. Since Sam is generally the voice of empathy, I think we can safely say that Tolkien did not intend any racism. Here is a more in-depth treatment of the topic.

Regardless, I think it is clear that the world of Men according to Tolkien is more like Bree, which is full of people neither wholly good nor wholly bad, although many of them are weak. After the noble first few Ages have passed, the binary distinctions go away, and with them the possibility of the simple good-versus-evil mythology of his books.

The writer of the Guardian article has got hung up on the form of the books, with their dualistic structure. This form is taken from classic mythology, which is explicitly what Tolkien was engaged in creating. However, the substance of the books does not bear out the article's charges: even within that form, Tolkien manages to introduce many shadings into his supposedly simplistic representation.

Could it be that the Guardian is simply trying to get into the slipstream of attention generated by the last of the Hobbit films with some juvenile right-on conspiracy theorising? In that case perhaps it is hardly surprising that they completely missed the point.


  1. I am using the term as Tolkien does, as a placeholder for all humans. Gender relations in Tolkien's work are a whole other kettle of fish, and I am not going to get into that today!

Not NoOps, but SmartOps

Or, Don't work harder, work smarter

I have always been irritated by some of the more extreme rhetoric around DevOps. I especially hate the way DevOps often gets simplified into blaming everything that went wrong in the past on the Ops team, and explicitly minimising their role in the future. At its extreme, this tendency is encapsulated by the NoOps movement.

This is why I was heartened to read There is no such thing as NoOps, by the reliably acerbic IT Skeptic.

Annoyingly, in terms of the original terminology, I quite agree that we need to get rid of Ops. Back in the day, there was a distinction between admins and ops. The sysadmins were the senior people, who had deep skills and experience, and generally spent their time planning and analysing rather than executing. The operators were typically junior roles, often proto-sysadmins working through an apprenticeship.

Getting rid of ops in this meaning of the word makes perfect sense. The major cause of outages is human error, and not necessarily the fairly obvious moment when the poor overworked ops realise, one oh-no-second after hitting Enter, that the login was not where they thought it was. What leads to these human-mediated outages is complexity, so the issue is the valid change that is made here but not there, or the upgrade that happened to one component but did not flow down to later stages of the lifecycle. These are the types of human error that can cause either failures on deployment, or those more subtle issues which only show up under load, or every second Thursday, or only when the customer's name has a Y in it.

There have been many attempts to reduce the incidence of these moments by enforcing policies, review, and procedures. However, none of these well-meaning attempts has succeeded, because none of them eliminates the weakest link in the chain: the human one. Instead of saying "it will work this time, really!", we should aim to eliminate downtime and improve performance by removing every possible human intervention and hand-over, allowing one single original design to propagate everywhere automatically.
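
To make the idea concrete, here is a toy sketch in Python of what "one design, propagated automatically" means in practice: a single declarative desired state, plus an idempotent convergence loop that applies only the missing delta to each host. The hosts and state below are invented for illustration; this is the core idea behind configuration management tools like Puppet, Chef, and Ansible, not a replacement for them.

```python
# Toy illustration only: invented host names and state, print()
# instead of real package management. Real tools in this space
# are Puppet, Chef, Ansible, and friends.

DESIRED = {"packages": {"nginx", "openssl"}, "config_version": 42}

# Stand-in for the real fleet; note that two hosts have drifted.
fleet = {
    "web-01": {"packages": {"nginx", "openssl"}, "config_version": 42},
    "web-02": {"packages": {"nginx"}, "config_version": 42},
    "db-01":  {"packages": {"nginx", "openssl"}, "config_version": 41},
}

def converge(host, actual, desired):
    """Apply only the delta between the desired and actual state."""
    for pkg in desired["packages"] - actual["packages"]:
        print(f"{host}: installing {pkg}")
        actual["packages"].add(pkg)
    if actual["config_version"] != desired["config_version"]:
        print(f"{host}: config v{actual['config_version']} -> v{desired['config_version']}")
        actual["config_version"] = desired["config_version"]

# The same single design is applied everywhere, automatically;
# running it twice is safe, because the second run finds no delta.
for host, actual in sorted(fleet.items()):
    converge(host, actual, DESIRED)
```

The point is the shape, not the code: the design lives in exactly one place, and no environment depends on a human remembering to apply a change to it.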

So yes, we get rid of ops by automating their jobs - what I once heard a sysadmin friend describe to a colleague as "monkey-compatible tasks": basically, low-value-added, tactical, hands-on-keyboard activity. However, that does not mean that there is no role for IT! It simply means that IT's role is no longer in execution - no longer, in other words, being the bottleneck in every request.

Standard requests should not require hands-on-keyboard intervention from IT.

This is what all these WhateverOps movements are about: preventing IT from becoming a bottleneck to other departments, whether the developers in the case of DevOps, or the GRC team in the case of SecOps that I keep banging on about lately, or whatever other variation you like.

IT still has a very important role to play, but it is not the operator's role, it is the sysadmin's role: to plan, to strategise, to have a deep understanding of the infrastructure. Ultimately, IT's role is to advise other teams on how best to achieve their goals, and to emplace and maintain the automation that lets them do that - much as sysadmins in the past would have worked to train their junior operators to deliver on requests.

The thing is, sysadmins themselves can't wait to be rid of scut work. Nothing would make them happier! But the state of the art today makes that difficult to achieve. DevOps et al. are friends of IT, not its enemies - at least when they're done right. Done wrong, they are the developer's enemies too.

In that sense, I say yes to NoOps - but let's not throw the baby out with the bathwater! Any developer trying to do completely without an IT team will soon find that they no longer have any time to develop, because they are so busy with all this extraneous activity: managing their infrastructure1, keeping it compliant, updating components, and the thousand and one other tasks IT performs to keep the lights on.


  1. No, Docker, "the cloud", or whatever fad comes next will not obviate this problem; there will always be some level of infrastructure that needs to be looked after. Even if it works completely lights-out in the normal way of things, someone will need to understand it well enough to fix it when (not if) it breaks. That person is IT, no matter which department they sit in. 

Debug mode for humans

I have been speaking a fair amount of German lately, for one reason or another, both socially and professionally. I find casual conversation much easier, especially when well lubricated; my Bierdeutsch is super-fluent!

Delivering a professional presentation is completely different. It strikes me that speaking in one's non-primary language is like running in debug mode, at least in my experience.

First of all, I am conscious of several different threads, all running at the same time but at different speeds: what do I want to say, how am I going to phrase it, what is the word I want, make sure it isn't a "false friend", make sure the case of the adjective agrees with the noun it modifies, don't forget the verb at the end, … None of these are fully synced up, either (except at the height of Bierdeutsch), so there is also a monitor thread watching all the others. Speaking on a serious subject for any length of time in a language you are not fully comfortable in is exhausting.
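
For the programmers in the audience, the metaphor translates almost literally into code. A tongue-in-cheek Python sketch, with invented task names and timings:

```python
# Tongue-in-cheek illustration: several "speech production" threads
# running at different speeds, plus a monitor of sorts watching
# the others. Task names and delays are invented.
import threading
import time

def worker(task, delay):
    for step in range(3):
        time.sleep(delay)               # each thread runs at its own pace
        print(f"[{task}] step {step}")

tasks = {
    "what do I want to say":       0.1,
    "how am I going to phrase it": 0.2,
    "which word do I want":        0.3,
    "does the adjective agree":    0.4,
    "verb goes at the end":        0.5,
}

threads = [threading.Thread(target=worker, args=item) for item in tasks.items()]
for t in threads:
    t.start()

# The exhausting part: nothing is fully synced up, so something
# has to keep watching until every other thread is done.
while any(t.is_alive() for t in threads):
    print("[monitor] checking all the other threads...")
    time.sleep(0.25)
```

Bierdeutsch, presumably, is what happens when the monitor thread clocks off.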

Interestingly, there seems to be some reality behind the metaphor of debug mode. According to a study in PLOS ONE, Your Morals Depend on Language, reactions in a non-primary language are more considered and less subject to empathy.

This is a really interesting finding, if you think about it for a moment: our thoughts are dependent on our ability to express them.

At its extreme, of course, this turns into 1984's Newspeak. According to Orwell,

"the purpose of Newspeak was not only to provide a medium of expression for the world-view and mental habits proper to the devotees of IngSoc, but to make all other modes of thought impossible. Its vocabulary was so constructed as to give exact and often very subtle expression to every meaning that a Party member could properly wish to express, while excluding all other meaning and also the possibility of arriving at them by indirect methods."

Could this mean that it is also possible for us to train ourselves to be better people by expanding our vocabulary and our facility with it?

People have been known to worry about the impact of the internet in general, and social media in particular, on "culture", for want of a better word - but one overriding aspect of the internet is that to participate, you need at least a minimum level of comfort with language. To emerge and to excel, you need mastery.

In other words, Twitter and Facebook will save us - by forcing people to think.


Image by Florian Klauer via Unsplash

Suits You, Sir

Via Coté, I learned that an Australian TV anchor has been wearing the same suit for a year.

Stefanovic, who co-presents Channel Nine’s Today show with Lisa Wilkinson, has been wearing the same blue suit – day in, day out, except for a few trips to the dry cleaner – to make a point about the ways in which his female colleagues are judged. "No one has noticed," he said. "No one gives a shit."

Setting his point on equality aside for a moment - although it is very valid - this is why I tell my nerd(ier) friends that suits are the ultimate nerd attire. You can get dressed in the dark and still be sure of being perfectly presentable, as I have had occasion to write before. On top of that, we now know that as long as the minimum standard of suit-ness is met, nothing further is required or even noticed, even when you are operating very much in public view.

Then again, at an event today in Stockholm I was congratulated several times on my suit1, so maybe it's cultural differences again?

Also, the news media have failed me once again. I need to know who makes this guy's suit; it sounds like it should be great for travel, being pretty much indestructible!


  1. A very nice and - importantly in Sweden - warm grey wool check from Lardini. 

The Modern IT Service Desk

Email from the service desk today:

Please contact Service Desk to verify that your Outlook is compatible with Exchange 2013

This is almost a perfect example of non-user-centred process design.

Note that I'm not calling out our corporate IT department; they do some great work on the back-end systems, and individual members of IT are very helpful on internal message boards and such. It's just some of their processes that are a bit backwards.

Most IT departments work like this, with the user interaction as a bit of an afterthought. Training people won't work, because anyone who is good at both IT and people will quickly migrate out of a service desk role. The solution can only be to automate as much as possible and make IT invisible to end users.

For instance, in my case, they know what version of Outlook I use from the server logs. They also know whether that version is compatible with Exchange 2013. Also, as of this writing Outlook 2011 is the latest version available for Mac, so it's not as if they could upgrade me or something. Why, then, even ask me to contact them?
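
To illustrate, the process I would rather see amounts to a few lines of Python. Everything below is invented (the log-derived versions, the addresses, the compatibility table), but every piece of information in it is something the IT department already has:

```python
# Sketch of a user-invisible compatibility check: mine the client
# versions out of the server logs, compare against a compatibility
# table, and contact only the users who actually have a problem.
# All data here is invented for illustration.

# Assume the Exchange logs have already been reduced to this
# mapping of user -> client version string.
client_versions = {
    "alice@example.com": "Outlook for Mac 2011 (14.4.5)",
    "bob@example.com":   "Outlook 2003 (11.8)",
}

# Hypothetical compatibility table for Exchange 2013.
COMPATIBLE = ("Outlook for Mac 2011", "Outlook 2010", "Outlook 2013")

for user, version in sorted(client_versions.items()):
    if version.startswith(COMPATIBLE):
        continue        # compatible: the user never needs to hear about it
    print(f"Notify {user}: {version} will not work with Exchange 2013")
```

Alice never hears about any of this; Bob gets a message that tells him what is actually wrong, rather than an invitation to contact the service desk.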

So what did I do? After scratching my head for a moment, I forwarded the email back to the service desk, adding:

Can you please verify compatibility? I am on Outlook for Mac 2011, version 14.4.5.

Let's see what they say.

Security Theatre

There are many things in IT that are received knowledge, things that everyone knows.

One thing that everyone knows is that you have to manage employees' mobile devices to prevent unauthorised access to enterprise systems. My employer's choice of MDM agent is a bit intrusive for my personal taste, so I opted not to install it on my personal iPad. The iPhone is the company's device, so it's their own choice what they want me to run on it.

Among other things, this agent is required to connect to the company Exchange server from mobile devices. You can't just add an Exchange account and log in with your AD credentials, you need this agent to be in place.


But why the focus on mobile devices?

When I upgraded my work and home Macs to Yosemite, I finally turned on the iCloud Keychain. I hadn't checked exactly what was syncing, and was surprised to see work calendar alerts turning up on my home Mac. My personal Mac had just grabbed my AD credentials out of iCloud and logged in to Exchange, without any challenge from the corporate side.

So how is that different from my iPad? Why is a Mac exempt from the roadblock? A Mac is arguably less secure than an iPad if it gets forgotten in a coffee shop or whatever - never mind a Windows machine. Why is "mobile" different? Just because?

Many enterprise IT people seem to lose their minds when it comes to mobile device management. I'm not necessarily arguing for dropping the requirement, just for a sane evaluation of the risks and of the responses they actually require.

Adventures in Screen Sharing

I'm having an odd issue, and I wonder whether anyone else has seen anything like this.

I have a headless Mac mini1, named "cooper" for reasons that should be obvious. The mini lives in a cupboard (not under the stairs), and its main job is to run iTunes and feed the AppleTV, as well as any other long-duration tasks. It also occasionally acts as a test bed for my projects, but those have been few and far between lately. Surprise! It turns out that having kids takes up a bunch of time that would otherwise be available for projects, and once they're in bed I'm usually too shattered to do anything very serious.

Because it's headless, the main way I interact with it is via Share Screen from my MacBook Air. The problem is that the mini occasionally loses the ability to advertise itself as a Shared device in the Finder sidebar.

In this screenshot, I only see the NAS. There should be another entry above that, like so:

The thing is, the mini is still reachable via VNC - just not from the Finder, because the Finder in its wisdom only lets you Share Screen with a machine that is visible under Shared. Using the "Connect to Server…" menu action, or for that matter iSSH on the iPad, however, I can still VNC in and see that everything is running fine.

The only fix I have found is to reboot the mini. Since I can get in via both VNC and SSH, this isn't a huge problem, because I can shut things down cleanly first, but it's still annoying. I haven't been able to figure out a cause, either; sometimes it happens while I'm connected via Share Screen if the Air goes to sleep, while at other times it happens when the mini is asleep - it wakes up but doesn't advertise itself in the Finder sidebar.

Both the Air and the mini are running Yosemite. Any suggestions?


UPDATE: Ars Technica has published a deeper investigation than I got into. It seems the root of the problem is indeed in discovery, as I had surmised: with Yosemite, Apple switched from mDNSResponder to discoveryd, and it looks like the latter has some issues.

That said, the Ars suggestion of restoring mDNSResponder seems insane to me. I guess I will just muddle through until Apple fixes discoveryd.
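
In the meantime, if the reboots get old, a lighter workaround might be to bounce discoveryd itself rather than the whole machine. This is an untested sketch, assuming Yosemite's launchd plist path and working SSH and sudo on the mini - very much at-your-own-risk territory:

```python
# Untested sketch: restart discoveryd on the mini over SSH instead
# of rebooting it, in the hope that a fresh daemon re-advertises
# the machine. Assumes Yosemite's plist path and passwordless
# sudo for launchctl on the target; adjust to taste.
import subprocess

PLIST = "/System/Library/LaunchDaemons/com.apple.discoveryd.plist"

def bounce_discoveryd(host="cooper.local"):
    """Unload and reload discoveryd via launchctl on the remote host."""
    for verb in ("unload", "load"):
        subprocess.check_call(["ssh", host, "sudo", "launchctl", verb, PLIST])

if __name__ == "__main__":
    bounce_discoveryd()
```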


  1. Yes, that is the correct capitalisation, TYVM. 

Models and Examples

So Tim Cook came out.

I have always felt that this is not my battle. Given that I'm married, someone's sexual orientation is one of the least interesting things about them as far as I'm concerned, since I won't be engaging with them in that way. The only time it might become relevant is if we become sufficiently simpatico that I am trying to figure out which of my friends to set them up with, in which case I'd like to avoid an impedance mismatch. I have so little interest in this topic that I only discovered about a year ago that Michael Stipe is gay!

That said, Tim Cook coming out is significant, given how private he has kept his personal life to date. As he says in that Businessweek piece:

I don’t consider myself an activist, but I realize how much I’ve benefited from the sacrifice of others. So if hearing that the CEO of Apple is gay can help someone struggling to come to terms with who he or she is, or bring comfort to anyone who feels alone, or inspire people to insist on their equality, then it’s worth the trade-off with my own privacy.

All too often, the tech industry ends up in the news for the wrong reasons: someone is being harassed, too many women are leaving the field, someone has done something ridiculously insensitive, and so on and so forth. It is of course important to shed light on all of these things, but at a certain level I worry that the negative reporting itself may contribute to the problem. If Tim Cook can give a positive image and example of acceptance and integration, then that is all to the good. My hope is that this will both help people who worry about being excluded or marginalised in the industry, and serve as a powerful counter-statement to intolerance, saying loudly that it is NOT OKAY.

Now, let's all talk about what is really important about Tim Cook: couldn't he just tuck his shirt-tails in?

Gender and Language

John Scalzi performed an interesting experiment in his recent book, Lock In - which is excellent, by the way.


SPOILER WARNING

FOR SERIOUS

YOU'LL REGRET IT

Okay, I hope that persuaded anyone who hasn't read the book to stay away.

Basically, the protagonist of Lock In, Chris, is never explicitly gendered. This makes sense in the context of the book; Chris has grown up with Haden's syndrome, and therefore has not experienced puberty and such in the "normal" way. John Scalzi simply avoided declaring what gender Chris is, and left readers to reach their own conclusions.

Now this is an interesting experiment, as far as it goes, but as usual things are more complicated. For instance, I assumed Chris was male because "Chris" is a male name to me. I have never encountered females who go by "Chris". This could just be a US thing, of course: while I am intellectually aware that in the US, "Andrea" is a female name, if I see it without context I assume an Italian male - because that is where I am most used to seeing that name. I didn't even get to the point of wondering, as I might have if Chris had been called "Lesley" or something like that.

I also wonder how translators will play this in gender-obligatory languages. This is something I have [written about](http://findthethread.postach.io/minimum-acceptable-standard "Minimum Acceptable Standard") before: English has it relatively easy, making it possible not only to write an entire book without ever stating the protagonist's gender, but to do it so subtly that readers may not even notice.

Doing this would be impossible in any other language I am familiar with. Adjectives, verb endings, and all sorts of other bits and pieces would force an explicit gender in the very first sentence where the protagonist appears, or would cause such obvious linguistic contortions that readers would know something was up. In Italian, for instance, merely saying that Chris was tired would force a choice between stanco and stanca.

Regardless, interesting experiment, not least because, going by an unscientific survey of the comments on the Tor piece, many women also read Chris as male. Gender assumptions are tricksy.