Reference This!

One of the downsides of working for a little startup that is going to change the world, but doesn’t quite have the name recognition yet, is that you get asked for customer references all. the. time.

Now on one level, I have absolutely no problem with this. It makes perfect sense from the point of view of a prospective customer: some rando just showed up, and sure, he’s got some good PowerPoint game, but I’ve never heard of him or his company - why should I waste any time on him?

The problem that a prospective customer might not appreciate is that by the nature of things, a growing startup has many more prospects than existing customers. Every single member of the sales team, if they are doing their job, has as many sales prospects at any one time as there are total customers in production. If we were to walk each of those prospects past a real customer, just so they could kick the tyres and see whether we have something real, pretty soon our existing customers would stop taking our calls - not a clever strategy, when we sell subscriptions and rely on customers renewing their contracts.

The trick, then, is to balance these requirements. On the side of the prospective customer, the goal is to validate whether this interesting startup has an actual product - or just an interesting idea and some vapourware slides. This is an absolutely valid goal - but prospective customers should recognise vendors’ incentives as well.

Reputable vendors who actually intend to build a lasting business have no more interest than customers do in wasting time and resources on projects that do not go anywhere. We know that our tech works (again, assuming for the sake of argument that the vendor you’re talking to is not just a straight-up scammer), so our goal is not to waste our limited time and resources chasing after something that is never going to be a successful, referenceable production implementation.

So, all of that being said - please don’t ask vendors for references on the first date. If the vendor you're talking to is any good, they will be qualifying you as aggressively as you are qualifying them. We vendors are very protective of our customers - once more, assuming you’re dealing with a reputable vendor in the first place! Please don’t see this as us being difficult or having something to hide; rather, it’s a preview of how your own relationship with us as a customer would be. If you trust us with your business, we will be equally protective of you. You want to be sure that if we come to you in the future with a request to talk to someone about your experience with our products, it’s for a good reason, and not something we will ask you to do every other week.

Once everyone is comfortable that there is a real opportunity - that is when we can get other parties involved. Until then, here’s a white paper, here’s a recorded webinar, here’s an article by an analyst who spoke to some of our customers - but my current customers are sacred to me, and I won’t introduce you to them until you convince me that you’re serious.

This has been your subtweet-as-blog-post of the day.


Image by María Victoria Heredia Reyes via Unsplash

Mr Cook, Tear Down This Geo-Fence

Among other signs and portents, the first few days of 2017 have also provided more examples of the negative consequences of geofencing: first China demanded removal of the NYT app from app stores, and now Russia requires removal of the LinkedIn app.

I joked that the next story would involve Saudi Arabia demanding the removal of Grindr - but it was pointed out to me that Grindr and other similar apps are already banned in the Kingdom, unless of course you use a VPN. I was not that surprised, since one of my memories from my first visit to the sandbox was of censored episodes of The Big Bang Theory, where not only were Penny’s legs blurred, but something was being bleeped out in the dialogue. I couldn’t work out what it was at the time, so when I got back home I went looking for the episode in question. It turned out to be… menstruation. Yes. Bleeped out.

I am not arguing (today) about the right of a privately-owned TV channel to choose to censor its content in the country it operates in - although I do question what is left of The Big Bang Theory once the censors have had their way with it. They might as well just not show it.

No, my beef is with the big international companies such as Apple enabling this sort of local ban. Geofencing - the practice of restricting content by geographic region - was originally instituted in the iTunes Store and App Store to comply with IP protection requirements, but it was always consumer-hostile. Basically the idea was to enable differential pricing and different release dates for the same content in different regions. This replicated what region-coding on DVDs delivered: a disc bought in one region would not play on a player sold in another. This policy was presented as enabling the studios to charge less for films in India or Africa or SE Asia or wherever than in the US and Europe, but given rampant piracy in those regions, I doubt it had much impact. Meanwhile, region coding made it extremely difficult for US consumers to watch European films, or for European consumers to watch US films at the same time as they were released in the US.

Nowadays, nobody really does staggered DVD releases any more anyway. With global fanbases who communicate over the Internet, even TV shows - let alone feature films - have been forced to launch more or less simultaneously in all territories. Content producers had believed that the most die-hard consumers would wait patiently for six months, ignoring spoilers from their US peers, and be grateful for the content when it finally arrived in their region. What actually happened was that they would simply hit BitTorrent or YouTube or whatever the next morning. Underground "fansub" communities grew up to provide subtitles for foreign-language content, more or less overnight.

Studios and broadcasters eventually figured out that it was better to enable their fans than obstruct them. I remember Lost as being the first series to really embrace this, to the point that in Italy at least, Lost episodes were broadcast in English with subtitles instead of being redubbed, because this could be done in near real time. The dubbed version would be broadcast some time later, but at least the true fans had got their real-time fix through an approved channel.

Today, the online stores are the only ones that still strictly enforce geofencing. I still cannot buy TV shows through the Italian iTunes Store - not even TV shows that are available in Italy through other means. I am also tied to the Italian Store (as opposed to the UK or US one) by my credit card’s billing address. This is in fact the last thing that is keeping me as a cable TV customer. If I could just buy my TV shows through iTunes - the ones that aren’t already on Netflix, that is - I’d kill my cable subscription in a heartbeat.

The thing is, this sort of restriction used to be "just" hostile to consumers. Now, it is turning into a weapon that authoritarian regimes can wield against Apple, Google, and whoever else. Nobody would allow Russia to ban LinkedIn around the world, or China to remove the New York Times app everywhere - but because dedicated App Stores exist for .ru and .cn, they are able to demand these bans as local exceptions, and even defend them as respecting local laws and sensibilities. If there were one worldwide App Store, this gambit would not work.

What are the downsides of a worldwide App Store without geographic restrictions? When the App Store was set up, Apple needed to pacify the studios to get access to their content libraries. But now, in 2017, what would the studios do - turn down all the revenue from the iTunes Store? I don’t think so.

Mr Cook, tear down this geo-fence!



Images by Cole Patrick and Gili Benita via Unsplash

IoT Future: Saved by Obsolescence?

It’s that most magical time of year… no, not Christmas, that’s all over now until next December. No, I mean CES, the annual Consumer Electronics Show in Las Vegas. Where better than Vegas for a million ridiculous dreams to enjoy brief moments of fame, only to fade soon after?

It used to be that the worst thing that could come out of CES was a drawer full of obsolete gadgets. These days, things can get a bit more serious. Pretty much every gadget on display is now wifi-enabled and internet-connected - yes, even the pillows and hairbrushes.

The reason this proliferation of connectivity is a problem is the "blinking twelves" factor, which I have written about before:

Back in the last century, digital clocks with seven-segment displays became ubiquitous, including as part of other items of home electronics such as VCRs. When first plugged in, these would blink "12:00" until the time was set by the user.

Technically-minded people soon noticed that when they visited less technical friends or relatives, all the appliances in the house would still be blinking "12:00" instead of the correct time. The "blinking twelves" rapidly became short-hand for "civilians" not being able to – or not caring to – keep up with the demands of ubiquitous technology.

The problem that we are facing is that computing has begun to spread beyond the desktop. Even the most technophobic now carry a phone that is "smart" to a greater or lesser degree, and many people treat these devices much like their old VCRs, installing them once and then forgetting about them. However, all of these devices are running 24/7, connected to the public Internet, with little to no management or updates.

Now we are starting to see the impact of that situation. Earlier this year, one of the biggest botnets in history was created from hacked smart CCTV cameras and took down big chunks of the Internet.

That’s just crude weight-of-numbers stuff, though; the situation will get even more… interesting as people figure out how to use all of the data gathered by those Things - and not just the owners of the devices, either. As people introduce always-on internet-connected microphones into their homes, it’s legitimate for police to wonder what evidence those microphones may have overheard. It is no longer totally paranoid to wonder what the eventual impact will be:

Remember that quaint old phrase "in the privacy of your own home". I wonder how often we will be using it in 20 years' time.

What can we do?

Previous scares have shown that there is little point in the digerati getting all excited about these sorts of things. People have enough going on with their lives; it takes laws to force drivers to take care of basic maintenance of their cars, and we are talking about multi-tonne hunks of metal capable of speeds in excess of 100mph. Forget about getting them to update firmware on every single device in their home, several times a year.

Calls for legislation of IoT are in my opinion misguided; previous attempts to apply static legal frameworks to the dynamic environment of the Internet have tended to be ineffective at best, and to backfire at worst.

Ultimately, what will save us is that same blinking twelves nature of consumers. There is a situation right now in San Francisco, where the local public transport system’s display units that should show the time until the next bus or train are giving wildly inaccurate times:

To blame is a glitch that's rendered as many as 40 percent of buses and Muni vehicles "invisible" to the NextMuni system: A bus or light rail train could arrive far sooner than indicated, but the problem, which emerged this week, is not expected to be resolved for several weeks.

Muni management have explained the problem (emphasis mine):

NextMuni data is transmitted via AT&T’s wireless cell phone network. As Muni was the first transit agency to adopt the system, the NextMuni infrastructure installed in 2002 only had the capacity to use a 2G wireless network – a now outdated technology which AT&T is deactivating nationwide.

What took down NextMuni - the obsolescence of the 2G network that it relied on - will also be the fix for all the obsolete and insecure IoT devices out there, next time there is a major upgrade in wifi standards. More expert users may proactively upgrade their wifi access points to get better speed and range, but that will not catch most of the blinking twelves people. However, it’s probably safe to assume that most of the Muggles are relying on devices from their internet provider, and when their provider sends them a new device or they change provider, hey presto - all the insecure Things get disconnected from their botnets.

Problem solved?


Image by Arto Marttinen via Unsplash

How To Lose Friends and Influence People

I am a huge fan of Evernote. I have used their software for many years, and its many features are key parts of my workflow. I take notes on multiple devices and keep them in sync, use tagging to organise everything, take snapshots of business cards and let the OCR and the access to LinkedIn sort out the details, annotate images and PDFs, and more.

I should say, I used to be a fan of Evernote. They recently made some changes to their privacy policy that have users up in arms. Here is the relevant entry from their changelog:

Privacy Policy

January 23, 2017 updates to the October 4, 2016 version:

We clarified that in building a more personalized Evernote service that can adapt to the way you think and work, a small group of engineers may need to oversee these automated technologies to ensure they are working as intended. Also, we added that we will be using data from other sources to tailor your Evernote experience and explain how you can get more out of your Evernote account. Please see our FAQ for more information on these changes.

Updates to our legal documents | Evernote

This may be fairly inoffensive, but it is worrying to me and to many users. These days, "personalisation" is often code for "gathering data indiscriminately for obscure purposes that may change at any time". This exchange is generally presented as a bargain where users sacrifice (some) privacy to the likes of Google in exchange for free use of their excellent services such as Gmail or Maps.

Evernote's case is different. Because Evernote is a paid app, we users like to assume that we are opting out of that bargain, paying directly for our services - instead of paying indirectly by authorising Evernote to resell our personal data to advertisers.

In addition, we use Evernote to store data that may be personal, sensitive, or both. Evernote have always had some weasel words in their Privacy Policy about their employees having access to our notes:

  • We believe our Terms of Service has been violated and confirmation is required or we otherwise have an obligation to review your account Content as described in our Terms of Service;
  • We need to do so for troubleshooting purposes or to maintain and improve the Service;
  • Where necessary to protect the rights, property or personal safety of Evernote and its users (including to protect against potential spam, malware or other security concerns); or
  • In order to comply with our legal obligations, such as responding to warrants, court orders or other legal process. We vigilantly protect the privacy of your account Content and, whenever we determine it possible, we provide you with notice if we believe we are compelled to comply with a third party’s request for information about your account. Please visit our Information for Authorities page for more information.

So basically, Evernote employees have always had access to our stuff. This part of the privacy policy has not changed substantially, but the changes are worrying (emphasis mine):

  • New: Do Evernote Employees Access or Review My Data?
  • Old: Do Evernote Employees Access or Review My Notes?

  • New: Below are the limited circumstances in which we may need to access or review your account information or Content:
  • Old: As a rule, Evernote employees do not monitor or view your personal information or Content stored in the Service, but we list below the limited circumstances in which our employees may need to access or review your personal information or account Content:

  • New: We need to do so for troubleshooting purposes or to maintain and improve the Service;
  • Old: We need to do so for troubleshooting purposes;
Privacy Policy | Evernote
Privacy Policy - 2017 update

Now, here is why people are all up in arms. We would like service providers to tread extremely carefully when it comes to our personal data, accessing it only when warranted. 2016 has provided plenty of object lessons in why we are so sensitive; just today I received an email from Yahoo detailing their latest hack. Yahoo hack: Should I panic? - BBC News

In this case, Evernote appear to have made two mistakes. First, they designed and built new functionality that requires access to users’ personal data and content in order to do… well, it’s not entirely clear what they want to do, beyond the fact that it involves machine learning.

Second, they completely mishandled the communication of this change. I mean, they even removed the disclaimer that "As a rule, Evernote employees do not monitor or view your personal information or Content stored in the Service"! How tone-deaf can you get?

It’s also very unclear why they even made this change. In their response to the outrage, they say this:

We believe we can make our users even more productive with technologies such as machine learning that will allow you to automate functions you now have to do manually, like creating to-do lists or putting together travel itineraries.

A Note From Chris O’Neill about Evernote’s Privacy Policy

The problem is, users are perfectly capable of managing to-do lists and itineraries, and based on an informal sample of Twitter reactions to this new policy, do not see enough value to want to give unknown Evernote employees access to their data.

An unforced error

This is such a short-sighted decision by Evernote. As one of the few cloud services that are used primarily through fat clients, Evernote is in a privileged position when it comes to pushing processing out to the users’ devices.

Apple have the same advantage, and do the right thing with it: instead of snooping around in my mail and calendar on the server side, my local Mail app can detect dates in messages and offer to create appointments in Calendar. Also, CloudKit’s sync services are encrypted, so nobody at Apple has access to my data - not even if law enforcement asks.

Evernote have chosen not to take that approach, and have not (yet) clarified any benefit that they expect or promise to deliver by doing so. This mis-step has now caused loyal, paying users like me to re-evaluate everything else about the service. At this point, even cancelling the new machine-learning service would not be enough to mollify users; nothing short of a new and explicit commitment to complete encryption of user data - including from Evernote employees! - would suffice.

Evernote's loss will be someone else’s gain

One possible winner from this whole mess is Bear, a new note-taking app that does use CloudKit and is therefore able to provide the encryption at rest that Evernote does not.

Bear - Notes for iPhone, iPad and Mac

The Bear team have even been having some fun on Twitter at Evernote’s expense:

I composed this post in Bear, and I have to say, it is very nice. I copied it over to Evernote to publish here, but it’s the first crack in my loyalty. Before this mess, I was a vocal advocate of Evernote. Now? I am actively evaluating alternatives.

Respect your users, yo.

Bimodal or Bi-Model

I just attended1 the Gartner Data Center, Infrastructure & Operations Management Summit in London. It was an interesting event, as ever; Gartner events are expensive for both exhibitors and attendees, but they are also the highest-value events around.

With all the excitement there has been about Gartner’s "bimodal IT" concept, I was curious to see whether they would double down, or whether the original idea would be modified in any way.

Given that this was the sixth slide of the opening keynote, I think it’s safe to say that they are not giving up on bimodal:

As is often the case with these high-level frameworks, hearing bimodal IT explained by its practitioners helps with reaching a deeper understanding of what it actually means. The concern that many people have with the soundbite version of bimodal IT is that it will devolve into "two-speed" IT, with one well-funded group forging ahead with new technologies while the other scrounges around for scrap to keep old creaking systems alive. Of course nobody with any sense would want to be in the latter group, and therefore there will be substantial brain drain from the latter to the former.

After spending two days listening to Gartner analysts, whether during formal presentations, in our briefings, or just hanging out during the evening reception, I think I can safely say that there is a bit more subtlety going on here.

What does the business need?

Just a few slides further into the keynote, we have this one:

It is clear that Gartner are trying to move the conversation beyond the details of which technology fits into which category. Once you move the focus to the business outcome, talking about Mode 1 and Mode 2 technologies is a short-hand for some of their characteristics.

Throughout the conference, Gartner people used the terms in this way, talking about the varying levels of agility of the different approaches, but always in terms of the business goals that are being enabled.

It’s unfortunate to my mind that "bimodal" has become such a hot-button topic, because that is absolutely a conversation that needs to be had. If you start "moving fast and breaking things", where those "things" support critical aspects of the business, pretty soon you’re going to find an angry mob of front-line people bearing down on the IT department, waving torches and pitchforks and calling for the head of the idiot who changed something they relied on, just because the change let him refactor it into something more technically pleasing to him.

Instead, the public conversation has pretty much stopped at this binary opposition between boring, uncool Mode 1 that gets starved of resources, and cool, shiny Mode 2 that gets all the love.

As I have pointed out before, this misses the time-based aspect. Just because something is all the rage now, and genuinely needs to evolve quickly to respond to rapidly changing business needs, does not mean that will always be the case.

Don’t get stuck in the past

Where I do encounter visibly unappreciated IT teams, without resources or standing, it is because somebody somewhere missed that key transition. Instead of their job being "supporting the business", in their minds the job description has become "provisioning servers" or "deploying patches" or whatever.

The Gartner people do call out this attitude - see this slide, from that same keynote:

I encounter this type of push-back all the time when pitching a new solution that disrupts an existing process. IT people need to be persuaded of the value of changing something that is known to work. The mantra is "if it ain’t broke, don’t mess with it" - and for very good reason! There is a point, though, when that precautionary attitude crosses over into being full-blown reactionary.

A good sanity check is to go through this deck from a presentation by Casey West, and see if you recognise yourself or your routine activities.

I get it, it’s highly technical work, it takes time, you’ll notice it if it goes wrong - but ultimately, the main metric - the only metric that matters - is whether IT delivered the outcome that the business required.

Behind the scenes, we IT people need to figure out all sorts of things, but we need to recognise that, to the business, this is about as interesting as the facilities maintenance. Sure, if the bins don’t get emptied or the lift plummets down its shaft, that would be bad, but as long as everything is humming along nicely, it’s invisible. To the business, IT is the same: as long as business outcomes are being achieved, all is well, and if the business result isn’t there, nothing else matters.

Arguing about what is Mode 1 and what is Mode 2 misses the point

The ongoing argument about bimodal IT is a mirror image of the ITIL conversation. ITIL is a useful roadmap, but it goes bad when people are so focused on the map that they start to try to adapt reality to the map. In the case of bimodal IT, nobody is actually suggesting that organisations split their IT in two and starve Peter to pay Paula. Instead, the notion of bimodal IT is a useful short-hand to talk about existing realities within IT.

Once the time factor is added, bimodal IT is not that different from pace layering, but that model never really seemed to catch on - perhaps because it was overly complex and dynamic. Instead, we are (still) arguing about how bimodal is too binary and static. After spending time actually talking to Gartner people about this stuff, I recognise it as a description of a spectrum, and one that is dynamic. A particular technology will move along the spectrum over time, and that movement needs to be recognised in the processes that deal with that particular technology.

Now that’s taken care of, we can all go back to arguing about private cloud.


  1. Normally people say "I got back from" whatever event, but being me, I’m still on the road, and won’t get home for some time yet.

What about those new Macs?

The progression is so established, it's now entirely predictable. Several times a year, Apple puts on an event to announce their latest hardware or software product - if indeed that is still a valid distinction, now that our devices are basically spimes already.

Even before the event, the leaks begin. Soon they are coming thick and fast - and hot on their heels are the think pieces, decrying how this time Apple have really lost it, and what can they be thinking in Cupertino?

Finally, the day of the event actually arrives. The actual hardware is greeted with yawns - after all, nearly-final pictures of the products have been available for weeks. Any surprises are limited to software - and even then, extended and increasingly public betas mean that it is only minor software that is actually still new by the time it is officially unveiled. The TV app is an example of an announcement that Apple managed to keep the lid on. Ultimately this was possible because it had no leaky supply chain of physical parts and accessories, and also no need to make a beta available to developers.

Then, as the last lingering effects of the Reality Distortion Field fade, the post-event hangover begins. That’s when the Macalope starts rubbing his hooves together in glee because of all the wonderful source material that people create just for him to make fun of.

This time around, the major criticism appears to be that Apple are not making enough different devices to satisfy different use cases. The two new MacBook Pro models (with and without Touch Bar) do not have enough RAM or enough ports, or they should still have older ports and SD card slots instead of going over wholesale to USB-C, or whatever. Oh, and while they were at it, Apple should also have updated all of their other computers.

Newsflash: this is how Apple has always done things, at least this century. Android users have always criticised the iPhone for its limited storage options and complete lack of expandability or external connectivity. None of that criticism has stopped the iPhone from taking basically all of the profit in the smartphone market, propelling Apple to being either the biggest company in the world or a close runner-up, depending on exactly when you make your measurement.

And I have yet to need to charge my iPhone 7 while also using the Lightning-to-TRS adapter that came in the box. I have also literally never used the SD card slot on any of my MacBooks over the years.

There was a time when Apple did offer many different options - and it was a massive disaster that almost sank the company. Seriously, check out this table just listing out all of the different models that Apple had under the Performa sub-brand, and how they mapped to the almost-but-not-quite identical "professional" models sold under different names.

That image is from Riccardo Mori, who adds the following bit of context:

Yes, it is a crowded space. The strategy behind this offering seems to be "Let’s try to cover every possible point of the spectrum, with regard to form factor, expandability, target audience, etc." This of course led to confusion, because there were some Macintosh models just as powerful as others, but coming in a different shape, or with one less card slot or expansion bay. And also because there were many Macintosh models delivering a similar performance. There was a lot of differentiation and little differentiation at the same time, so to speak.

On top of the confusion just within Apple’s own line-up, this was the period when you could also buy a legitimate and authorised Macintosh clone. In other words, Macintosh buyers could buy exactly what they wanted, and no two Macs were alike.

Apple nearly died from the results. Once Steve Jobs returned, applied the paddles, and revived the business, he set about rationalising the product line-up, to the point that people joked that "Steve hates SKUs".

So Apple didn’t update the MacBook Air with a Retina screen? Big deal - the computer you want is the MacBook Pro without Touch Bar (known as the "MacBook Escape" to listeners of ATP), which has basically all the good bits from the Air and upgrades everything else. That’s for a 13" screen size; if you wanted the 12" option, the MacBook (aka "MacBook One" or "MacBook Adorable") is your ultra-portable choice.

YES, these are more expensive devices - again, if you’re surprised, you have not been paying attention. Apple’s products have always been positioned at the premium end of the market, with a price tag to match. It is important to note that those prices are generally not completely out of touch with reality. While you certainly can buy cheaper phones or laptops, once an Android or Windows device is specced up to the same power and size as the Apple equivalent, the price is usually not too far off the Apple option. Apple famously burns people with the price of upgrades to RAM or storage, but again, they have been doing this since I was a Mac tech in high school, literally twenty years ago. This has always been part of their business model; it's not some unwelcome surprise that they only just sprang on people now.

Fundamentally, Apple does not believe in giving people too many options. They are famously opinionated in their product design, and if you’re after ultimate flexibility - well, these may not be the products for you. However, they are the right products for very many people. Apple has made moves like this for a very long time; remember the howls of derision and outrage when they first announced the original iMac with no floppy disk drive? Or remember the MacBook Air - only USB and wifi? And look at it now - basically the default laptop for everyone, if you count its many imitators. These days, even famously brick-like ThinkPads come with dongles, because they’re too thin to accommodate all ports!

On the other hand, the last time Apple tried to be flexible and accommodate too many different user populations, it almost killed them. Is it any wonder that they are doubling down on what worked?

Anyway, this is all academic for me as I’m not due to replace my "Early 2015" MacBook Pro for another year or so, so I will get to see how this Touch Bar thing works in practice - and then get the improved and updated second-generation Touch Bar.


UPDATE: Right after I posted the above, I came across an interview with Phil Schiller in the Independent. The whole thing is worth a read, but this line is particularly illuminating:

We know we made good decisions about what to build into the new MacBook Pro and that the result is the best notebook ever made, but it might not be right for everyone on day one. That’s okay, some people felt that way about the first iMac and that turned out pretty good.

That’s exactly it: Apple made decisions about how to make the best product in their opinion, while recognising that the result may not be perfect for everyone. The opposite of this approach would be that ridiculous - and mercifully stillborn - Project Ara phone design.

What if…

While it may seem obvious to those of us who have been around this market for a while, it was interesting to read that at the recent Puppetconf 2016 event, Puppet still felt the need to state that "In the future code is going to be managed and deployed by other code". If you’re surprised that this sentiment still needs to be articulated explicitly in 2016, you have not been paying attention.

It is certainly true that the leading edge is all "cattle, not pets" and "automate all the things", but there’s a pretty long tail behind that head. Only 2% of workloads are currently running in "the cloud" - although the precise figure depends on how you define that nebulous term. Everything else? Still running on premises, or at best in a colo.

The same goes for automation: for every fully-automated containerised full-stack deployment, there are fifty that are not automated.

Nevertheless, Puppet has built a $100M business on automation. I know a bit about this space, having worked at BladeLogic, one of the pioneers of automation. While BladeLogic and Puppet have a history, today I am wondering whether things might have gone differently.

Luke Kanies, the founder of Puppet, was a BladeLogic employee, although he left before I ever joined. From what I gather, he was a proponent of extending BladeLogic’s foundation in Network Shell, or NSH, into a free open-source platform, on which a commercial product could be built. Instead, BladeLogic’s management preferred to shut down the open-source NSH project and just use the technology inside the commercial BladeLogic product.

For those in the know, NSH was a fantastic tool. At root it was a shell based on ZSH, but with network awareness on top. What this meant was that you could do things like this:

 host $ cp /etc/hosts //host1/etc/hosts

 host $ cd //host2/home

 host2 $ ps -ef | grep inetd

 host2 $ diff //host3/etc/passwd //host4/etc/passwd

 host2 $ iostat 2 5

 host2 $ vi //nthost/c/AUTOEXEC.BAT

 host2 $ nexec nthost reboot Let's reboot NT

You could copy files between systems, compare them or even edit them in place, and generally do all sorts of good things - including developing scripts to automate those tasks. For me at least, this was the first hint of the new world in which systems are no longer managed one by one, with admins ssh’ing into them individually, but in bulk, deploying a single config to many systems in one action. Best of all, it was multi-platform, abstracting the differences between different UNIX variants, and even working on Windows. ZSH on NT? That’s a major selling point right there!
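
Because NSH was zsh underneath, those interactive commands composed straight into scripts. As a rough sketch from memory - the hostnames, paths and service restart are hypothetical, and it uses only the cp and nexec commands shown above - pushing one config file out to a set of servers looked something like this:

 # one file, many servers, one action - no logging into each box by hand
 for h in web01 web02 web03; do
     cp /etc/ntp.conf //$h/etc/ntp.conf
     nexec $h /etc/init.d/ntpd restart
 done

Crude as it is, that is already closer in spirit to Puppet than to the one-box-at-a-time ssh workflow it replaced.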

However, even among BladeLogic employees and users, the interactive mode of NSH was a well-kept secret, with most people working exclusively within the BladeLogic GUI. What might the combination of NSH and BladeLogic have become if it had been allowed to flourish? Could a free NSH have taken the place in sysadmins' hearts that is currently occupied by Puppet? Would this have prevented the long, quiet death of BladeLogic?

20/20 Hindsight

Of course hindsight is a wonderful thing, and what is a fairly uncontroversial strategy to propose in 2016 was not so obvious fifteen years ago. Back then, there were vanishingly few successful hybrid business models that combined an open-source platform with a commercial component. It would not be fair to criticise BladeLogic’s management at the time for not taking that route - especially since they were outstandingly successful with the strategy that they did choose. The hybrid model would have been a major strategic choice, and there is no guarantee that VCs and other investors would have gone along with it.

I just wonder sometimes - what might have been, in a world where a free download of NSH had gained mindshare in the data center, at the same time that high-powered, PTC-trained sales people were gaining the trust of the C-suite?

Today, in 2016, Robert Stroud, a Forrester analyst at the Puppet event, is saying the following:

Businesses services now involve infrastructure, middleware, and applications, said Stroud. "Moving forward, to be a complete automation environment, the successful player in the space will have a role in all three," he said.

At BladeLogic, we were saying that ten years ago. Commercials aside, the market for automated server configuration management is arguably ten years behind where it should be. Sure, we can deploy things at scale, but managing them at scale is still a challenge - although the challenge is as much one of process as of tools. The cloud has enabled all sorts of new businesses and even entire new business models, but it is still constrained by the complexity and consequent fragility of the underlying infrastructure.

What might be possible if we had solved that problem ten years ago? What new possibilities might have been enabled, that we will only find out about years from now?

A Conversation about AppleTV

A frustrating conversation with AppleSupport over Twitter DM:

Me: I can't enable Siri on my AppleTV, despite language & locale being set to en-US. Is this because my iTunes Store account is tied to the Italian Store?

@AppleSupport: If your Apple ID is tied to the Italian Store, then Siri won't work for your Apple TV as it's not available in Italy at this time.

Me: Why? The whole OS is in English, and I only want Siri to speak English. Plus it works on iOS; why make tvOS different? It should key off language & locale, not where my credit card bills are sent.

@AppleSupport: If the feature isn't available to a specific country, then any Apple ID connected to the country will not be able to access the feature when it's used on the Apple TV. You can keep an eye on this article to see when Siri will be available for Italy under the 'Here's where you can use Siri' section: apple.co/1ppjfUB1

Ugh. This is classic Apple, not giving any explanation.

My assumption is that the limitation is precisely because my Apple ID ties me to the Italian iTunes Store, and to its catalogue. I have complained before about the problems this causes. The way this would affect Siri would be me saying: "Hey Siri, please play The Godfather" and Siri not being able to find it - because in the Italian iTunes Store it’s listed as Il Padrino.

The obvious solution is to let people choose which iTunes Store they want to purchase from, but I suspect this will never happen, for two reasons. One is that Apple is presumably constrained by licenses from the content owners that only cover specific countries. In the same way, DVDs (remember DVDs?) were locked to specific regions, and multi-region DVD players were grey-market items.

The other reason is that mine is an edge case, shared only by a relatively small number of expats and other deracinated cosmopolitans. Edge cases that affect Apple employees and their testers get addressed quickly, as John Gruber and Serenity Caldwell discussed with reference to the use case of multiple Watches connected to a single iPhone. Anything that does not affect those users? Wait and hope.

We saw the same thing around the initial roll-out of Maps, with high quality data for the Bay Area, and problems elsewhere. The first version of the Watch arguably had the same issue, with one entire physical control dedicated to a feature that was only useful to people all of whose friends were Watch users - not the best idea for a product whose appeal at launch was unclear, and most of whose buyers would be the first adopter in their circle of friends.

I suppose this is almost the definition of a first-world problem, but it’s still frustrating to me when Apple stumbles on something this easy to fix.2


  1. From that page: "Siri is currently available on Apple TV (4th generation) in these countries and languages: Australia (English), Canada (English, French), Germany (German), France (French), Mexico (Spanish), Netherlands (Dutch), Norway (Norwegian Bokmål), Japan (Japanese), Spain (Spanish), Sweden (Swedish), UK (English), US (English, Spanish)." Seriously? Dutch and Norwegian before Italian? NL population is 16.8 M, NO population is 5 M, and Italy is nearly 60 M - not counting Italian-speaking Switzerland. Maybe there’s less AppleTV penetration, but it’s not exactly a small market, and Siri has been able to speak Italian almost since launch.

  2. My other pet peeve: the Control Center on iOS should allow users to 3D-Touch the wifi and Bluetooth controls to select networks and devices respectively. Especially for Bluetooth, the extra step of going into Settings > Bluetooth and waiting for the device to connect adds annoyance and friction when I just want to listen to a podcast or some music.

Updating the Car

As I mentioned in my one-year review of my car, the one aftermarket upgrade I made was to swap the rather dated factory ICE for a CarPlay head unit. That modification is itself now about a year into its service, so it is also about due a review.

The reason for the upgrade is that the factory PCM 2.1 unit was really showing its age, with no USB, Bluetooth, or even Aux-in. In other words, Porsche were way ahead of Apple in removing the headphone jack… Courage!

This meant it was not possible to connect my phone to the car. Instead, I had a second SIM card which lived in the dash itself, and a curly-cord handset in the armrest between the front seats. Very retro, but not the most practical solution.

The worst part, though, was the near decade-old maps. While we do have some roads around here that are a couple of thousand years old, lots of them are quite a bit newer, and even on the Roman roads, it’s important to know about one-way systems and traffic restrictions.


My solution for these problems was to swap the PCM 2.1 system for a head unit that is basically just a dumb screen driven by an iPhone, with no functionality of its own beyond an FM tuner. The reason is that I change phones much more frequently than I change cars, and upgrade the software on my phone more frequently than that.

The specific device is an Alpine ILX-007, and I am quite satisfied with it. It has a decent screen, the lack of which seems to be one of the key complaints people have about other CarPlay systems. There is occasionally a little lag, but I assume that’s software rather than hardware, since it’s not reproducible. It did crash on me once, losing my radio presets, but that’s it.

Upgrades

Adding this system to my car has been a substantial upgrade. I have all my music, podcasts and so on immediately available, I can make phone calls, and there is even a dedicated button to talk to Siri. I use this a lot to add reminders to myself while driving, as well as obvious stuff like calling people.

Siri also reads messages that come in while the phone is in CarPlay mode, which is occasionally hilarious when she tries to read something written in a language other than English. On the other hand Siri handles emoji pretty well, reading their name (e.g. "face blowing kisses"), which is very effective at getting the meaning across - although it’s a bit disconcerting the first time it happens!

Contrary to my early fears about CarPlay, it works perfectly with my steering-wheel controls too, so ergonomics are great.

The main win though is that my in-car entertainment now benefits from iOS upgrades in a big way. In particular, iOS 10 brings a redesigned Music screen and a major update to Maps.

Show me around

The Music screen used to have five tabs, which is way too many to navigate while driving. The new version has three tabs, and is generally much clearer to use. I don’t use Apple Music, and one of the things that I hated about the old version was that it would default to the Apple Music tab. The biggest reason why I don’t use streaming services like Apple Music is that the only time I really get to listen to music is while I’m out and about. That means either in aeroplanes, where connectivity is generally entirely absent, or in the car, where it is unreliable and expensive. Therefore, I only listen to music stored locally on my phone, but I had to switch away from the Apple Music tab each and every time I launched the Music app. iOS 10 fixes that.

The biggest change iOS 10 brings to the CarPlay experience is to Maps. Many people have pointed out that Maps will now add a waypoint when the iPhone is disconnected from the car, so that drivers can easily retrace their steps to their parked car. I have to admit that I have never lost my car, but it’s good to know that it’s, say, ten minutes’ walk away when it’s raining.


There are also updated graphics, which are much clearer to read in a hurry. These are not just limited to pretty icons, though; there is actual improved functionality. Previously, users had to switch manually between separate Overview and Detail modes. Annoyingly, there was a significant gap between the greatest zoom on Overview and the widest area on Detail. Also, Detail did not include traffic alerts, while Overview by default showed the entire route, not just currently relevant parts, so a typical journey required a fair amount of switching back and forth between modes.

The new Maps zooms gradually over the course of the journey, always showing the current position near one edge of the screen and the destination near another. This is much more useful, allowing the driver to focus on alerts that are coming up rather than being distracted by ones that are already past. There is also more intelligence about proposing alternate routes around congestion.


And yes, Maps works perfectly well for me, thank you. I would probably use it anyway given that, as the system-level mapping service, it plugs into everything, so I can quickly get directions to my next appointment from the calendar or go to a contact’s home or office address. The search could still be better, requiring very precise phrasing, but contrary to Maps’ reputation out there, landmarks generally exist and are in the correct place.

I am on record as an Apple Maps fan even in the early days, and it’s improved enormously since then. Don’t believe the hype, give it a go.

The integration is a big deal, as I saw last Wednesday. I was supposed to meet a colleague out and about, so I used Messages to send him my current location. To be extra sure, I chose the actual restaurant I was in, rather than just my GPS location. All my colleague needed to do was tap on the location in the chat to be routed straight to me. Unfortunately, he is one of those who prefer Google Maps, so he eyeballed the pin and entered the location in Google Maps himself. Unluckily for him, the place he eyeballed turned out to be part of a chain, and Google, in its eagerness to give a result (any result), gave him the nearest branch of that chain rather than the one where I actually was.

It all worked out in the end, after a half-hour detour and a second taxi trip…

Trust the system, it works.

The System Works

This is exactly why I got a CarPlay unit in the first place: so I would get updates in the car more frequently than every few years when I get a whole new car. So far, that’s working out just perfectly. The iOS 10 upgrade cleaned up some annoyances and added convenient new features without requiring me to rip out all my dashboard wiring. I won’t consider another car without CarPlay support.

Dinosaurs Evolving


Right now, basically the entire Internet is having a massive collective tantrum over the fact that Apple dropped the headphone jack from the newest iPhone. This, despite the fact that (in a very un-Apple move) the box includes both a Lightning-to-TRS audio jack adapter, and a pair of EarPods with a Lightning connector.

Speaking for myself, I already specced out the iPhone I want, but I’m just waiting to pick it up when I go to San Francisco next month. Sometimes, geo restrictions actually work in my favour, as even with SF sales tax, the US price is a couple of hundred Euros cheaper than my local price. EarPods don’t fit my ears (which also means the new AirPods are out), so I’ll use the adapter while I look for W1 wireless earphones that I like.

The hysteria over the whole thing reminded me of a situation that is the exact opposite, one where an "obsolete" standard keeps soldiering on, despite repeated attempts to kill it or just declare it dead by fiat.

I am of course referring to email1.

A bit of history

To recap, everything started back in those tie-dyed days of 1965. This was not yet email as we know it, however; even the @-sign was not added until 1971, although for a while there things like bang paths were viable alternatives.

In those days the Internet in general and email specifically were still things that only academics and governments used. That changed in September of 1993 - the September that never ended - when the general public began to arrive en masse on what had by then grown out of Arpanet into the Internet2. It didn’t take long for the whole thing to degenerate into the wretched hive of scum and villainy that we know and love today.


So why did email survive the transition to the Internet, when many other protocols, including beloved ones like Usenet, withered and died? And why are people still trying to kill it now, with the likes of Slack or Cisco Spark or Microsoft's Yammer or Salesforce Chatter or whatever?

The key thing about email is that it is extremely simple. If you want (and if you can still find an SMTP server that does not require authentication), you can still send email from the command line in just a couple of lines.

 > telnet mail.domain.ext 25

 Trying ???.???.???.???...

 Connected to mail.domain.ext.

 Escape character is '^]'.

 220 mail.domain.ext ESMTP Sendmail ?version-number?; ?date+time+gmtoffset?

 > HELO local.domain.name

 250 mail.domain.ext Hello local.domain.name [loc.al.i.p], pleased to meet you

 > MAIL FROM: mail@domain.ext

 250 2.1.0 mail@domain.ext... Sender ok

 > RCPT TO: mail@otherdomain.ext

 250 2.1.0 mail@otherdomain.ext... Recipient ok

 > DATA

 354 Enter mail, end with "." on a line by itself

 > Subject: This is a subject

 >

 > This is the body of the email

 > .

 250 2.0.0 ???????? Message accepted for delivery

 > QUIT

 221 2.0.0 mail.domain.ext closing connection

 Connection closed by foreign host.

Try that with Chatter.

Of course nobody would do that except for a stunt - but this is what is going on in the background of every mail client you would actually use on a regular basis. The simplicity of this protocol means that anyone can implement their own tool, offering specific capabilities. Email clients can be arbitrarily simple or complex, and anyone can choose one that suits their own requirements.
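
To make the point about simplicity concrete, here is about the smallest "mail client" imaginable: a few lines of shell wrapped around a sendmail-compatible MTA. This is just a sketch - it assumes a local MTA is installed, the mail_to function is something I made up for illustration, and the -t flag tells sendmail to read the recipients from the message headers:

 # a minimal, hypothetical "mail client": build the headers and body, hand them to the MTA
 mail_to() {
     printf 'To: %s\nSubject: %s\n\n%s\n' "$1" "$2" "$3" | sendmail -t
 }

 # usage, with the same placeholder addresses as the transcript above
 mail_to mail@otherdomain.ext 'This is a subject' 'This is the body of the email'

Everything a full-blown client adds - threading, search, snoozing - is layered on top of that same simple format.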

Email is email is email

One of the consequences of that simplicity is universality and flexibility. Anyone using email can communicate with anyone else, regardless of what client or server software they are using. Email is email is email.

In contrast, most would-be email killers are walled gardens, consisting of a service that is tightly integrated with its client app and does not allow third-party clients. This makes it much harder for innovation to happen, because there is only one provider, and they deliver only the functionality that they want and can build. If you want a feature to be added to Slack, you can’t build your own Slack client; you have to petition Slack to do it, and they choose whether to implement that feature or not.

Even now, more than fifty years into the age of email, there is constant experimentation, with new email clients popping up all the time. Right now I am using one called Notion, which implements all sorts of gestures to triage your inbox. You can "star" messages, file them, and even snooze them so that they go away but come back to your inbox later. Even in the simplest clients, you still have the option to read something and then mark it as unread so that you don’t forget about it.

Try snoozing a notification from Facebook Messenger, or marking a WhatsApp message as unread to return to it later. Can’t be done.

You don’t need a fancy client, either. There are a ton of features built right into the protocol. Think of the concise power of the CC and BCC headers, or the simple "forward" action. With CC ("carbon copy", a coelacanth term surviving from a previous age of office technology) you can make people aware of a conversation, while also making it clear that they are being informed but are not expected to take action. BCC ("blind carbon copy") lets you send a message without making each participant aware of all of the others, so you can let your boss see the email you sent without the recipients ever knowing your boss was copied. BCC should also be used by anyone sending mass emails, to avoid disclosing the entire recipient list to every recipient, but people regularly forget - with hilarious consequences.
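
To make the CC and BCC semantics concrete, here is another sendmail -t sketch - the addresses are hypothetical, and the behaviour is the one documented for classic sendmail: -t scans the To:, Cc: and Bcc: headers for recipients, then deletes the Bcc: line before the message goes out, so the boss gets a copy that nobody else knows about.

 # -t: recipients come from To:, Cc: and Bcc:; the Bcc: header is stripped before sending
 printf '%s\n' \
     'From: me@domain.ext' \
     'To: customer@otherdomain.ext' \
     'Cc: colleague@domain.ext' \
     'Bcc: boss@domain.ext' \
     'Subject: Following up' \
     '' \
     'Summary of where we got to today.' \
 | sendmail -t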

In contrast, chat systems are symmetrical. You can add people to a group chat, but it’s a flat hierarchy; there is no way to mark someone as informed rather than an active participant, or as a silent observer. Forwarding a message with its context is also usually impossible. Sure, you can easily copy the text, but not the group participants and so on. Email’s simplicity makes all of these features universal, independent of the generosity of one particular developer.

Email just won’t die

Email is unkillable because it provides substantial utility, and it is easy for people to build additional value on top of a common standard. In other words, if it's a dinosaur, it's the sort that didn't get killed by an asteroid, but instead grew feathers and is still around today.

The old TRS audio jack has only ubiquity in its favour. It does not offer any particular functionality; the TRRS extended spec that lets in-line remotes work is a horrible hack, and it’s kind of surprising that it works as well as it does.

Also, most iPhone users just use the EarPods they get with their device, so I would not have been surprised if, absent the media firestorm and rending of vestments, people had just used the Lightning EarPods and not even noticed the change.

And if you feel that strongly about it, use the adapter that Apple puts right in the box.

Who wants to bet that inside of two years, all the major Android manufacturers offer phones with audio over Micro USB or something similar, instead of TRS? Some vendors already do…


  1. Yes, I have given up on calling it "e-mail", although I still think that is more correct.

  2. An internet, the Internet. Come at me.