
Thoughts about WWDC '17

First of all, let’s get the elephant in the room out of the way; no new iPhone was announced. I was not necessarily expecting one to show up - that seems more suited to a September event, unless there were specific iOS features that were enabled by new hardware and that developers needed to know about.

We did get a whole ton of new features for iOS 11 (it goes up to eleven!), but many of them were aimed squarely at the iPad. With no new iPhone, the iPad got most of the new product glory, sharing only with the iMac Pro and the HomePod (awful name, by the way).

On that note, some people were confused by the iMac Pro, but Apple has helpfully clarified that there is also going to be a Mac Pro and external displays to go with it:

In addition to the new iMac Pro, Apple is working on a completely redesigned, next-generation Mac Pro architected for pro customers who need the highest-end, high-throughput system in a modular design, as well as a new high-end pro display.

I doubt I will ever buy a desktop Mac again, except possibly if Apple ever updates the Mac mini, so this is all kind of academic for me - although I really hope the dark-coloured wireless extended keyboard from the iMac Pro will also be available for standalone purchase.

What I am really excited about is the new 10.5" iPad Pro and the attendant features in iOS 11¹. The 12.9" is too big for my use case (lots of travel), and the 9.7" Pro always looked like a placeholder device to me. Now we have a full lineup, with the 9.7" non-Pro iPad significantly different from the 10.5" iPad Pro, and the 12.9" iPad Pro there for people who really need the larger size - or maybe just don’t travel with their iPad quite as much as I do.

My current iPad (an Air 2) is my main personal device apart from my iPhone. The MacBook Pro is my work device, and opening it up puts me in "work mode", which is not always a good thing. On the iPad, I do a ton of reading, but I also create a fair amount of content. The on-screen keyboard and various third-party soft-tip styluses (styli?) work fine, but they’re not ideal, and so I have lusted after an iPad Pro for a while now. However, between the lack of sufficient hardware differentiation compared to what I have², and lack of software support for productivity, I never felt compelled to take the plunge.

Now, I can’t wait to get my hands on an iPad Pro 10.5".

I already use features like the sidebar and side-by-side multitasking, but what iOS 11 brings is an order of magnitude beyond - especially with the ability to drag & drop between applications. Right now, while I may build an outline of a document on my iPad, I rarely do the whole thing there, because it is just so painful to do any complex work involving multiple switches between applications - so I end up doing all of that on my Mac.

The problem is that there is friction in working with a Mac; I need (or feel that I need) longer stretches of time and a more work-like environment to pull out my Mac. That friction is completely absent with an iPad; I am perfectly happy to get it out if I have more than a minute or so to myself, and there is plenty of room to work on an iPad in settings (such as, to pick an example at random, an economy seat on a short-haul flight) where there is simply no room to type on a Mac.

The new Files app also looks very promising. Sure, you can sort of do everything it does in a combination of iCloud Drive, Dropbox, and Google Drive, and I do - but I always find myself hunting around for the latest revision, and then turning to the share sheet to get whatever I need to where I can actually work on it.

With iOS 11, it looks like the iPad will truly start delivering on its promise as (all together now) a creation device, not just a consumption device.

Ask me again six months from now…

And if you want more exhaustive analysis, Federico Viticci has you covered.


  1. Yes, there was also some talk about the Watch, but since I gave up on fitness tracking, I can't really see the point in that whole product line. That's not to say that it has no value, just that I don't see the value to me. It certainly seems to be the smartwatch to get if you want to get a smartwatch, but the problem with that proposition is that I don't particularly want any smartwatch. 

  2. To me this is the explanation for the 13 straight quarters of iPad sales drop: an older iPad is still a very capable device, and outside of very specific use cases, or people upgrading from something like an iPad 2 or 3, there hasn’t been a compelling reason to upgrade - yet. For me at least, that compelling reason has arrived, with the combination of 10.5" iPad Pro and iOS 11. After the holiday quarter, I suppose we will find out how many people feel the same way. 

Talk Softly

With the advent of always-on devices that are equipped with sensitive microphones and a permanent connection to the Internet, new security concerns are emerging.

Virtual assistants like Apple’s Siri, Microsoft’s Cortana and Google Now have the potential to make enterprise workers more productive. But do “always listening” assistants pose a serious threat to security and privacy, too?

Betteridge’s Law is in effect here. Sure enough, the second paragraph of the article discloses its sources:

Nineteen percent of organizations are already using intelligent digital assistants, such as Siri and Cortana, for work-related tasks, according to Spiceworks’ October 2016 survey of 566 IT professionals in North America, Europe, the Middle East and Africa.

A whole 566 respondents, you say? From a survey run by a help desk software company? One suspects that the article is over-reaching a bit - and indeed, if we click through to the actual survey, we find this:

Intelligent assistants (e.g., Cortana, Siri, Alexa) used for work-related tasks on company-owned devices had the highest usage rate (19%) of AI technologies

That is a little bit different from what the CSO Online article is claiming. Basically, anyone with a company-issued iPhone who has ever used Siri to create an appointment, set a reminder, or send a message about anything work-related would fall into this category.

Instead, the article makes the leap from that limited claim to extrapolating that people will be bringing their Alexa device to work and connecting it to the corporate network. Leaving aside for a moment the particular vision of hell that is an open-plan office where everyone is talking into the air all the time, what does that mean for the specific recommendations in the article?

  1. Focus on user privacy
  2. Develop a policy
  3. Treat virtual assistant devices like any IoT device
  4. Decide on BYO or company-owned
  5. Plan to protect

These are actually not bad recommendations - but they are so generic as to be useless. Worse, when they do get into specifics, they are almost laughably paranoid:

Assume all devices with a microphone are always listening. Even if the device has a button to turn off the microphone, if it has a power source it’s still possible it could be recording audio.

This is drug-dealer-level paranoia. Worrying that Alexa might be broadcasting your super-secret and valuable office conversations does not even make the top ten list of concerns companies should have about introducing such devices into their networks.

The most serious threat you can get from Siri at work is co-workers pranking you if you enable access from the lock screen. In that case, anyone can grab your unattended iPhone and instruct Siri to call you by some ridiculous name. Of course I would never sabotage a colleague’s phone by renaming him “Sweet Cakes”. Ahem. Interestingly, it turns out that the hypothetical renaming also extends to the corresponding entry in Contacts…

The real concern is that these misguided recommendations draw attention away from advice that would actually be useful in the real world. For instance, if you must have IoT devices in the office for some reason, this is good advice:

One way to segment IoT devices from the corporate network is to connect them to a guest Wi-Fi network, which doesn’t provide access to internal network resources.

This recommendation applies to any device that needs Internet access but does not require access to resources on the internal network. This will avoid issues where, by compromising a device (or its enabling cloud service), intruders are able to access your internal network in what is known as a “traversal attack”. If administrators restrict the device’s access to the network, that will also restrict the amount of damage an intruder can do.
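To make the quoted advice concrete, here is a minimal sketch of what that segmentation might look like on a Linux-based router using iptables. The interface names and subnets are my own illustrative assumptions, not anything from the article; commercial routers usually express the same policy as a "guest network isolation" checkbox.

```shell
# Hypothetical setup: IoT/guest devices live on 192.168.50.0/24 (interface iot0),
# the internal LAN on 192.168.1.0/24 (interface lan0), with wan0 facing the Internet.
# All names and addresses here are illustrative assumptions.

# Allow IoT devices out to the Internet...
iptables -A FORWARD -i iot0 -o wan0 -j ACCEPT

# ...and allow return traffic for connections they initiated.
iptables -A FORWARD -i wan0 -o iot0 -m state --state ESTABLISHED,RELATED -j ACCEPT

# Drop any traffic from the IoT segment towards the internal LAN.
# This is the rule that blocks a "traversal attack" from a compromised device.
iptables -A FORWARD -i iot0 -o lan0 -j DROP
```

The point is simply that a compromised device on the guest segment has no route to internal resources, so the blast radius of a compromise is limited to the segment itself.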

Thinking about access to data is a good idea in general, not just for voice assistants or IoT devices:

Since personal virtual assistants “rely on the cloud to comprehend complex commands, fetch data or assign complex computing tasks to more resources,” their use in the enterprise raises issues about data ownership, data retention, data and IP theft, and data privacy enforcement that CISOs and CIOs will need to address.

Any time companies choose to adopt a service that relies on the cloud, their attack surface is not limited to the device itself, but also extends to that back-end service - which is almost certainly outside their visibility and control. Worse, in a BYOD scenario, users may introduce new devices and services to the corporate network that are not designed or configured for compliance with organisations’ security and privacy rules.

Security is important - but let’s focus on getting the basics right, without getting distracted by overly specific cybersecurity fantasy role-playing game scenarios involving Jason Bourne hacking your Alexa to steal your secrets.

New Mac Fever

Apple bloggers are all very excited about the announcement of a new Mac Pro. The best roundup I have seen is on Daring Fireball: The Mac Pro Lives.

I'm not a Mac Pro user, nor frankly am I ever likely to be. My tastes lie more at the other end of the spectrum, with the ultra-portable MacBook (aka MacBook Adorable). However, there was one interesting tidbit for me in the Daring Fireball report:

Near the end, John Paczkowski had the presence of mind to ask about the Mac Mini, which hadn’t been mentioned at all until that point. Schiller: “On that I’ll say the Mac Mini is an important product in our lineup and we weren’t bringing it up because it’s more of a mix of consumer with some pro use. … The Mac Mini remains a product in our lineup, but nothing more to say about it today.”

While there are certainly Mac Mini users who choose it as the cheapest Mac, and perhaps as a way to keep using a monitor and other peripherals that used to be plugged into a PC, there is a substantial contingent of Mac Mini "pro" users. Without getting into Macminicolo levels of pro-ness, I run mine headless in a cupboard, where it serves iTunes and runs a few other services. It's cheap, quiet, and reliable, which makes it ideal for that role. I don't necessarily need ultimate power - average utilisation is extremely low, although there is the odd peak - but I do want to be reassured that this is a product line that will stick around, just in case my current Mac Mini breaks.

The most important Macs are obviously the MacBook and MacBook Pros, but it's good to know that Apple recognises a role for the Mac Pro - and for the Mac Mini.

Smart Swatch

Remember Swatch? The must-have colourful plastic watches of the 80s and 90s? They are back in the news, with their new plan to produce their own smartwatch operating system.

Swatch plans to develop its own operating system as the Swiss watchmaker seeks to combine smart technology with the country’s expertise in making timepieces and miniaturisation, chief executive Nick Hayek has said.

Mr Hayek added that he wanted to avoid relying on Apple’s iOS and Google’s Android and provide a “Swiss” alternative offering stronger data protection and ultra-low energy consumption.

This new plan has caused all sorts of consternation around the Internet, but I was disposed to ignore it - until now. I just received this week's Monday Note, by the usually reliable Jean-Louis Gassée.

M. Gassée makes some initially good points about the complexity of operating systems, the immaturity of the smartwatch market, and the short timescales involved. Swatch intends to ship actual products by the end of 2018, which is barely any time at all when it comes to developing and shipping an entirely new physical product at mass-market scale. However, I do wonder whether he is falling into the same trap that he accuses Hayek and Swatch of falling into.

… in 2013, Hayek fils publicly pooh-poohed smart watches:
"Personally, I don’t believe it’s the next revolution… Replacing an iPhone with an interactive terminal on your wrist is difficult. You can’t have an immense display."

I tend to agree with Hayek, as it happens; the "terminal on the wrist" is pretty much a side show. The one stand-out use case for smart watches¹ right now appears to be sensors and fitness. If that's not compelling, then there is very little else to attract you to smartwatches, even if you are a committed technophile like me. For myself, after wearing a Jawbone Up! for a year or two, I determined that I was not making use of the data that were gathered. The activity co-processor in my iPhone is ample for my limited needs.

What Is A Smartwatch?

The key point, however, is that Swatch have not announced an actual smart watch, but rather "an ecosystem for connected objects". M. Gassée even calls out some previous IoT form within CSEM, Swatch's partner in this venture, which recently produced the world's smallest Bluetooth chip.

The case against the wisdom of the Swatch project - the complexity of OS development and maintenance, the need for a developer ecosystem, and so on - assumes that Swatch are contemplating a direct rival for Apple's watchOS and Google's Android Wear. What if that's not what's going on at all?

What if Swatch is going back to its roots, and making something simple and undemanding, but with the potential to be ubiquitous? The ecosystem for a smartwatch is now widespread: everyone has a smartphone, NFC is everywhere, from payment terminals to subway turnstiles. What if Swatch just intends to piggyback on that by embedding a few small and cheap sensors in its watches, without even having a screen at all?

Now that would be a Swatch move. In fact, it's such a Swatch move that they've done it before, with their Snow Pass line:

Its ski watch stores ski pass information and has an antenna that communicates with a scanner at the fast-track ski lift entrance. One swipe of the wrist and you're through.

That description sounds a lot like Apple Pay to me - or really any NFC system. Add some pretty basic sensors, and you've got 80% of the smartwatch functionality that people actually use for 20% of the price.

Seen through this lens, the focus on privacy and security makes sense. It has been said that “the S in IoT stands for 'security'”, and we could certainly all use an IoT player that focuses on that missing S. If the sensors themselves are small and simple enough, they would not need frequent updates and patches, as there would be nothing to exploit. The companion smartphone app would be the brains of the operation and gateway to all the data gathered, and could be updated as frequently as necessary, without needing to touch the sensors on the watch.

So What Is Swatch Really Up To?

As to why Swatch would even be interested in entering into such a project, remember that these days Swatch is part of a group that sprawls across 70 different brands, most far more up-scale (albeit less profitable) than lowly Swatch with its plastic watches. Think Omega, Breguet, Glashütte, Longines, or Blancpain. The major threat to those kinds of watches is not any single other watch; most watch lovers own several different mechanical watches, and choose one or another to wear for each day, activity, or occasion. In my own small way, I own three mechanical watches (and two quartz), for instance.

For a while now, and accelerating since the release of the iPhone, the competition for watches was - no watch at all. Why bother to wear a watch, the thinking went, when your smartphone can tell the time much more accurately? But now, insidiously, the competition is a watch again - but it is the last watch its owners will ever wear. Once you start really using an Apple Watch, you don't want to take it off, lest you miss out on all those activities being measured. Circles will go unfilled if you wear your Rolex to dinner.

But what if every watch you buy, at least from The Swatch Group, gives you the same measurements and can maintain continuity through the app on your phone? What if all of your watches can also let you on the subway, pay for your groceries, and so on? Other players such as Breitling and Montblanc have also been looking into this, but I think Swatch has a better chance, if only because they start from scale.

Now we are back to the comfortable (and profitable) status quo ante for the Swiss watch industry, in which watch aficionados own several different watches which they mix and match, but with each one part of the same connected experience.

Analogies are dangerous things. The last few years have conditioned us to watch out for the "PC guys are not just going to figure this out"-type statements from incumbents about to be disrupted. What if this time, the arrow points the other way? What if Swatch has finally figured out a way for the traditional watch industry to fight back against the ugly, unclassy interloper?


  1. In a further sign of the fact that this is still a developing market, even auto-correct appears to get confused between "smartwatch" and "smart watch". 

Own Your Interfaces

The greatest benefit of the Internet is the democratisation of technology. Development of customised high-tech solutions is no longer required for success, as ubiquitous commodity technology makes it easy to bring new product offerings to market.

Together with the ongoing move from one-time to recurring purchases, this process of commoditisation moves the basis of the competition to the customer experience. For most companies, the potential lifetime value of a new customer is now many times the profit from their initial purchase. This hoped-for future revenue makes it imperative to control the customer's experience at every point.

As an illustration, let us consider two scenarios involving outsourcing of products that are literally right in front of their users for substantial parts of the day.

Google Takes Its Eye Off the Watch

The first is Google and Android's answer to the Apple Watch, Android Wear. As is (usually) their way, Google have not released their own smartwatch product. Instead, they have released the Android Wear software platform, and left it to their manufacturing partners to build the actual physical products.

Results have been less than entirely positive:

If Android Wear is to be taken as seriously as the Apple Watch, we actually need an Android version of the Apple Watch. And these LG watches simply aren't up to the task.

Lacking the sort of singular focus and vertical integration between hardware and software that Apple brings to bear, these watches fail to persuade, and not by a little:

I think Google and LG missed the mark on every level with the Style, and on the basis of features alone that it is simply a bad product.

So is the answer simply to follow Apple's every move?

It is certainly true Google have shown with their Nexus and Pixel phones just how much better a first-party Android phone can be, and it is tempting to extrapolate that success to a first-party Google Watch. However, smartwatches are still very much a developing category, and it is not at all clear whether they can go beyond the current fitness-focused market. In fact, I would not be surprised to see a contraction in the size of the overall smartwatch market. Many people who bought a first-generation device out of curiosity and general technophilia may well opt not to replace that device.

Apple Displays Rare Clumsiness

In that case, let us look at an example outside the smartwatch market - and one where the fumble was Apple's.

Ever since Retina displays became standard first on MacBooks¹ and then on iMacs, Mac users have clamoured for a large external display from Apple, to replace the non-Retina Apple Thunderbolt Display that still graces many desks. Bandwidth constraints meant that this was not easy to do until a new generation of hardware came to market, but Apple fans were disappointed when, instead of their long-awaited Apple Retina 5K Display, they were recommended to buy a pretty generic-looking offering from LG.

Insult was added to injury when it became known that the monitor was extremely sensitive to interference, and in fact became unusable if placed anywhere near a Wi-Fi router:

the hardware can become unusable when located within 2 meters of a router.

Two metres is not actually that close; it's over six feet, if you're not comfortable with metric units. Many home office setups would struggle with that constraint - I know mine would.

Many have pointed out that one of the reasons for preferring expensive Apple solutions is that they are known to be not only beautifully designed, but obsessively over-engineered. It beggars belief that perfectionist, nit-picking Apple would have let a product reach the market with such a basic flaw - and yet, today, if an Apple fan spends a few thousand dollars on a new MacBook Pro and a monitor in an Apple Store, they will end up looking at a generic-looking LG monitor all day - if, that is, they can use the display at all.

Google and Apple both ceded control of a vitally important part of the customer experience to a third party, and both are now paying the price in terms of dissatisfied users. There are lessons here that also apply outside of manufacturing and product development.

Many companies, for instance, outsource functions that are seen as ancillary to third parties. A frequent candidate for these arrangements is support - but to view support this way is a mistake. It is a critical component of the user experience, and all the more so because it is typically encountered at times of difficulty. A positive support experience can turn a customer into a long-term fan, while a negative one can put them off for good.

Anecdata Time

A long time ago and far far away, I did a stint in technical support. During my time there, my employer initiated a contract with a big overseas outsourcing firm. The objective was to add a "tier zero" level of support, which could deal with routine queries - the ones where the answer was a polite invitation to Read The Fine Manual, basically - and escalate "real" issues to the in-house support team.

The performance of the outsourcer was so bad that my employer paid a termination fee to end the contract early, after less than one year. Without going into the specifics, the problem was that the support experience was so awful that it was putting off our customers. Given that we sold mainly into the large enterprise space, where there is a relatively limited number of customers in the first place, and that we aimed to cross-sell our integrated products to existing customers, a sudden increase in the number of unhappy customers was a potential disaster.

We went back to answering the RTFM queries ourselves, customer satisfaction went back up into the green, and everyone was happy - well, except for the outsourcer, presumably. The company had taken back control of an important interface with its customers.

Interface to Differentiate

There are only a few of these interfaces and touch-points where a company has an opportunity to interact with its customers. Each interaction is an opportunity to differentiate against the competition, which is why it is so vitally important to make these interactions as streamlined and pleasant as possible.

This requirement is doubly important for companies who sell subscription offerings, as they are even more vulnerable to customer flight. In traditional software sales, the worst that can happen is that you lose the 20% (or whatever) maintenance, as well as a cross-sell or up-sell opportunity that may or may not materialise. A cancelled subscription leaves you with nothing.

A customer who buys an Android Wear smartwatch and has a bad experience will not remember that the watch was manufactured by LG; they will remember that their Android Wear device was not satisfactory. In the same way, someone who spends their day looking at an LG monitor running full-screen third-party applications - say, Microsoft Word - will be more open to considering a non-Apple laptop, or not fighting so hard to get a MacBook from work next time around. Both companies ceded control of their interface with their customers.

Usually companies are very eager to copy Apple and Google's every move. This is one situation where instead there is an opportunity to learn from their mistakes. Interfaces with customers are not costs to be trimmed; instead, they can be a point of differentiation. Treat them as such.


Image by Austin Neill via Unsplash


  1. Yes yes, except for the Air. 

Mr Cook, Tear Down This Geo-Fence

Among other signs and portents, the first few days of 2017 have also provided more examples of the negative consequences of geofencing: first China demanded removal of the NYT app from app stores, and now Russia requires removal of the LinkedIn app.

I joked that the next story would involve Saudi Arabia demanding the removal of Grindr - but it was pointed out to me that Grindr and other similar apps are already banned in the Kingdom, unless of course you use a VPN. I was not that surprised, since one of my memories from my first visit to the sandbox was of censored episodes of The Big Bang Theory, where not only were Penny’s legs blurred, but something was being bleeped out in the dialogue. I couldn’t work out what it was at the time, so when I got back home I went looking for the episode in question. It turned out to be… menstruation. Yes. Bleeped out.

I am not arguing (today) about the right of a privately-owned TV channel to choose to censor its content in the country they operate in - although I do question what is left of The Big Bang Theory once the censors have had their way with it. They might as well just not show it.

No, my beef is with the big international companies such as Apple enabling this sort of local ban. Geofencing - the practice of restricting content by geographic region - was originally instituted in the iTunes Store and App Store to comply with IP protection requirements, but it was always consumer-hostile. Basically the idea was to enable differential pricing and different release dates for the same content in different regions. This policy would replicate what region-coding on DVDs delivered, preventing DVDs from one region from showing content on players from another region. This policy was presented as enabling the studios to charge less for films in India or Africa or SE Asia or wherever than in the US and Europe, but given rampant piracy in those regions, I doubt it had much impact. Meanwhile, region coding made it extremely difficult for US consumers to watch European films, or for European consumers to watch US films at the same time as they were released in the US.

Nowadays, nobody really does staggered DVD releases any more anyway. With global fanbases who communicate over the Internet, even TV shows - let alone feature films - have been forced to launch more or less simultaneously in all territories. Content producers had believed that the most die-hard consumers would wait patiently for six months, ignoring spoilers from their US peers, and be grateful for the content when it finally arrived in their region. What actually happened was that they would simply hit BitTorrent or YouTube or whatever the next morning. Underground “fansub” communities grew up to provide subtitles for foreign-language content, more or less overnight.

Studios and broadcasters eventually figured out that it was better to enable their fans than obstruct them. I remember Lost as being the first series to really embrace this, to the point that in Italy at least, Lost episodes were broadcast in English with subtitles instead of being redubbed, because this could be done in near real time. The dubbed version would be broadcast some time later, but at least the true fans had got their real-time fix through an approved channel.

Today, the online stores are the only ones that still strictly enforce geofencing. I still cannot buy TV shows through the Italian iTunes Store - not even TV shows that are available in Italy through other means. I am also tied to the Italian Store (as opposed to the UK or US one) by my credit card’s billing address. This is in fact the last thing that is keeping me as a cable TV customer. If I could just buy my TV shows through iTunes - the ones that aren’t already on Netflix, that is - I’d kill my cable subscription in a heartbeat.

The thing is, this sort of restriction used to be “just” hostile to consumers. Now, it is turning into a weapon that authoritarian regimes can wield against Apple, Google, and whoever else. Nobody would allow Russia to ban LinkedIn around the world, or China to remove the New York Times app everywhere - but because dedicated App Stores exist for .ru and .cn, they are able to demand these bans as local exceptions, and even defend them as respecting local laws and sensibilities. If there were one worldwide App Store, this gambit would not work.

What are the downsides of a worldwide App Store without geographic restrictions? When the App Store was set up, Apple needed to pacify the studios to get access to their content libraries. But now, in 2017, what would the studios do - turn down all the revenue from the iTunes Store? I don’t think so.

Mr Cook, tear down this geo-fence!

Previously


Images by Cole Patrick and Gili Benita via Unsplash

What about those new Macs?

The progression is so established, it's now entirely predictable. Several times a year, Apple puts on an event to announce their latest hardware or software product - if indeed that is still a valid distinction, now that our devices are basically spimes already.

Even before the event, the leaks begin. Soon they are coming thick and fast - and hot on their heels are the think pieces, decrying how this time Apple have really lost it, and what can they be thinking in Cupertino?

Then the day of the event arrives. The hardware itself is greeted with yawns - after all, nearly-final pictures of the products have been available for weeks. Any surprises are limited to software - and even then, extended and increasingly public betas mean that it is only minor software that is actually still new by the time it is officially unveiled. The TV app is an example of an announcement that Apple managed to keep the lid on. Ultimately this was possible because it had no leaky supply chain of physical parts and accessories, and also no need to make a beta available to developers.

Finally, as the last lingering effects of the Reality Distortion Field fade, the post-event hangover begins. That’s when the Macalope starts rubbing his hooves together in glee because of all the wonderful source material that people create just for him to make fun of.

This time around, the major criticism appears to be that Apple are not making enough different devices to satisfy different use cases. The two new MacBook Pro models (with and without Touch Bar) do not have enough RAM or enough ports, or they should still have older ports and SD card slots instead of going over wholesale to USB-C, or whatever. Oh, and while they were at it, Apple should also have updated all of their other computers.

Newsflash: this is how Apple has always done things, at least this century. Android users have always criticised the iPhone for its limited storage options and complete lack of expandability or external connectivity. None of that criticism has stopped the iPhone from taking basically all of the profit in the smartphone market, propelling Apple to being either the biggest company in the world or a close runner-up, depending on exactly when you make your measurement.

And I have yet to need to charge my iPhone 7 while also using the Lightning-to-3.5 mm headphone adapter that came in the box. I have also literally never used the SD card slot on any of my MacBooks over the years.

There was a time when Apple did offer many different options - and it was a massive disaster that almost sank the company. Seriously, check out this table just listing out all of the different models that Apple had under the Performa sub-brand, and how they mapped to the almost-but-not-quite identical “professional” models sold under different names.

That image is from Riccardo Mori, who adds the following bit of context:

Yes, it is a crowded space. The strategy behind this offering seems to be “Let’s try to cover every possible point of the spectrum, with regard to form factor, expandability, target audience, etc.” This of course led to confusion, because there were some Macintosh models just as powerful as others, but coming in a different shape, or with one less card slot or expansion bay. And also because there were many Macintosh models delivering a similar performance. There was a lot of differentiation and little differentiation at the same time, so to speak.

On top of the confusion just within Apple’s own line-up, this was the period when you could also buy a legitimate and authorised Macintosh clone. In other words, Macintosh buyers could buy exactly what they wanted, and no two Macs were alike.

Apple nearly died from the results. Once Steve Jobs returned, applied the paddles, and revived the business, he set about rationalising the product line-up, to the point that people joked that "Steve hates SKUs".

So Apple didn’t update the MacBook Air with a Retina screen? Big deal - the computer you want is the MacBook Pro without Touch Bar (known as the "MacBook Escape" to listeners of ATP), which has basically all the good bits from the Air and upgrades everything else. That’s for a 13" screen size; if you want something smaller, you haven’t been paying attention, because the 12" MacBook (aka "MacBook One" or "MacBook Adorable") is your ultra-portable choice.

YES, these are more expensive devices - again, if you’re surprised, you have not been paying attention. Apple’s products have always been positioned at the premium end of the market, with a price tag to match. It is important to note that those prices are generally not completely out of touch with reality. While you certainly can buy cheaper phones or laptops, once an Android or Windows device is specced up to the same power and size as the Apple equivalent, the price is usually not too far off the Apple option. Apple famously burns people with the price of upgrades to RAM or storage, but again, they have been doing this since I was a Mac tech in high school, literally twenty years ago. This has always been part of their business model; it's not some unwelcome surprise that they only just sprang on people now.

Fundamentally, Apple does not believe in giving people too many options. They are famously opinionated in their product design, and if you’re after ultimate flexibility - well, these may not be the products for you. However, they are the right products for very many people. Apple has made moves like this for a very long time; remember the howls of derision and outrage when they first announced the original iMac with no floppy disk drive? Or the MacBook Air, with only USB and wifi? And look at it now - basically the default laptop for everyone, if you count its many imitators. These days, even famously brick-like ThinkPads come with dongles, because they’re too thin to accommodate all the old ports!

On the other hand, the last time Apple tried to be flexible and accommodate too many different user populations, it almost killed them. Is it any wonder that they are doubling down on what worked?

Anyway, this is all academic for me as I’m not due to replace my "Early 2015" MacBook Pro for another year or so, so I will get to see how this Touch Bar thing works in practice - and then get the improved and updated second-generation Touch Bar.


UPDATE: Right after I posted the above, I came across an interview with Phil Schiller in the Independent. The whole thing is worth a read, but this line is particularly illuminating:

We know we made good decisions about what to build into the new MacBook Pro and that the result is the best notebook ever made, but it might not be right for everyone on day one. That’s okay, some people felt that way about the first iMac and that turned out pretty good.

That’s exactly it: Apple made decisions about how to make the best product in their opinion, while recognising that the result may not be perfect for everyone. The opposite of this approach would be that ridiculous - and mercifully stillborn - Project Ara phone design.

A Conversation about AppleTV

A frustrating conversation with AppleSupport over Twitter DM:

Me: I can't enable Siri on my AppleTV, despite language & locale being set to en-US. Is this because my iTunes Store account is tied to the Italian Store?

@AppleSupport: If your Apple ID is tied to the Italian Store, then Siri won't work for your Apple TV as it's not available in Italy at this time.

Me: Why? The whole OS is in English, and I only want Siri to speak English. Plus it works on iOS; why make tvOS different? It should key off language & locale, not where my credit card bills are sent.

@AppleSupport: If the feature isn't available to a specific country, then any Apple ID connected to the country will not be able to access the feature when it's used on the Apple TV. You can keep an eye on this article to see when Siri will be available for Italy under the 'Here's where you can use Siri' section: apple.co/1ppjfUB1

Ugh. This is classic Apple, not giving any explanation.

My assumption is that the limitation is precisely because my Apple ID ties me to the Italian iTunes Store, and to its catalogue. I have complained before about the problems this causes. The way this would affect Siri would be me saying: "Hey Siri, please play The Godfather" and Siri not being able to find it - because in the Italian iTunes Store it’s listed as Il Padrino.

The obvious solution is to let people choose which iTunes Store they want to purchase from, but I suspect this will never happen, for two reasons. One is that Apple is presumably constrained by its licenses with the content owners, which cover only specific countries. In the same way, DVDs (remember DVDs?) were locked to specific regions, and multi-region DVD players were grey-market items.

The other reason is that mine is an edge case, shared only by a relatively small number of expats and other deracinated cosmopolitans. Edge cases that affect Apple employees and their testers get addressed quickly, as John Gruber and Serenity Caldwell discussed referring to the use case of multiple Watches connected to a single iPhone. Anything that does not affect those users? Wait and hope.

We saw the same thing around the initial roll-out of Maps, with high-quality data for the Bay Area and problems elsewhere. The first version of the Watch arguably had the same issue, with an entire physical control dedicated to a feature that was only useful if all of your friends were also Watch users - not the best idea for a product whose appeal at launch was unclear.

I suppose this is almost the definition of a first-world problem, but it’s still frustrating to me when Apple stumbles on something this easy to fix.2


  1. From that page: "Siri is currently available on Apple TV (4th generation) in these countries and languages: Australia (English), Canada (English, French), Germany (German), France (French), Mexico (Spanish), Netherlands (Dutch), Norway (Norwegian Bokmål), Japan (Japanese), Spain (Spanish), Sweden (Swedish), UK (English), US (English, Spanish)." Seriously? Dutch and Norwegian before Italian? NL population is 16.8 M, NO population is 5 M, and Italy is nearly 60 M - not counting Italian-speaking Switzerland. Maybe there’s less AppleTV penetration, but it’s not exactly a small market, and Siri has been able to speak Italian almost since launch. 

  2. My other pet peeve: the Control Center on iOS should allow users to 3D-Touch the wifi and Bluetooth controls to select networks and devices respectively. Especially for Bluetooth, the extra step of going into Settings > Bluetooth and waiting for the device to connect adds annoyance and friction when I just want to listen to a podcast or some music. 

Flattering Apple

It’s long been obvious that other phone1 manufacturers largely follow Apple’s lead. Phones used to look like all sorts of things, but now they all look like iPhones.

This is part of Apple’s modus operandi: they are never the first into a particular field, but they do tend to refine it, if not define it outright. They didn’t make the first personal computer, but they did make the one that influenced all the others. They didn’t make the first MP3 player, but can you even name one apart from the iPod? In the same way, they didn’t make the first mobile phone or even the first smartphone - but they refined both form and function in a way that instantly made everything else obsolete.

Some manufacturers of Android handsets at least make an effort to differentiate themselves, but most are pretty shameless, even copying advertising and packaging. However, most of the time they do at least go to the effort of tracing over Apple’s designs onto their own paper. Now Samsung has just given up, and submitted a patent application (for removable watch bands) which actually includes Apple’s own drawings of an Apple Watch!

This sort of thing goes on all the time - seriously, there are entire blogs just dedicated to documenting instances of Samsung shamelessly copying Apple.

Things have finally got to the point that a group of eminent designers has filed an amicus curiae brief documenting and explaining the negative impacts of this practice.

Well done to them, and I hope it helps. I don’t wish Samsung any ill, but the Android world needs its own identity. By all means copy ideas from Apple - every desktop GUI around (and many that are dead) follow conventions first set by Apple in the original Lisa and Macintosh2. The execution of the idea needs to be original, though.

I’m not saying there need to be gimmicks; if that sort of thing were useful, we’d all be using Windows phones, with those Live Tiles and dockable apps. I just wish there were more recognition that Apple makes choices for reasons, and others may wish to make different choices for different reasons.


  1. In 2016, they’re no longer smartphones, they’re just phones. 

  2. And of course Apple took ideas from Xerox PARC, and so on - but Apple took those ideas and adapted them to a vision that they already had, rather than just slavishly copying what the researchers at PARC had been doing. 

Send In The Clones

Since my last post rehashing ancient IT industry history seemed to go over well, here’s another one.

In that previous post, I used the story of the HP acquisition of Mercury and its rumoured impending spin-off as a cautionary tale about handling acquisitions correctly. There is never any lack of “fantasy M&A” going on in this industry, and one of its longest-running subjects is Apple.

I’ve actually been a Mac user long enough that I can remember when the rumour of the week would be, not about whom Apple should buy, but about who was going to buy Apple. Would it be Dell? Would it be Sony? Would it be Silicon Graphics? Would it be Sun? Would it be IBM?

Twenty years later, that catalogue is ridiculous on the face of it. Only one of those companies even still meets the two core criteria, namely a) existence, and b) being a PC manufacturer. However, in the mid-90s, things were not at all rosy at Apple, and management was getting desperate. How desperate? They approved a programme that licensed the MacOS to other manufacturers, who could then make and sell their own fully-legal and -compatible MacOS computers.

As it happened, I had a front-row seat for all of this. In the mid-90s I was still in high school, but given that in Italy high school is a morning-only affair, I took on an afternoon job at the local Apple reseller. Unbeknownst to me, they had also just signed up to be the Italian reseller for UMAX, one of those MacOS clone makers (also known as SuperMac in the US).


UMAX had already been around for a while, and had made a name for themselves with a range of scanners that went from consumer-grade to very definitely pro-grade. The most expensive machine I dealt with was a $25k A3 flat-bed scanner with 9600 dpi optical resolution. Photographers and other graphic artists from all over Italy were already dealing with this company, so the value proposition of a cheaper Mac for their work was pretty obvious.

Where things got exciting was when performance of the UMAX machines started to overtake that of contemporary Macs. This was in the days of the Motorola/IBM PowerPC CPU, and Mac performance was already starting to suffer compared to contemporary Intel chips. Therefore, when UMAX brought to market a dual-604e motherboard, available with not one but two screaming-fast 200 MHz CPUs, this was big news - not least because they not only undercut the price of the equivalent PowerMac 9600, but beat it to market as well.

(Embarrassingly, I blew up the very first one of those machines to come to Italy. It had a power supply with a physical switch to change from 115v US-style power to the full-strength 230v juice we enjoy in Europe. I did check the switch before plugging in the cable, but BANG! Turned out, the switch was not properly connected on the inside of the PSU… Luckily, all that had blown was the power supply itself, not the irreplaceable motherboard, and we got it swapped out in double-quick time and nobody ever found out… until now.)

Anyway, this was all great fun for me, still in high school and all, and everyone was doing very well out of the arrangement - except for Apple. The licensing fee for MacOS that they were receiving did not even come close to replacing the profit they missed out on from all the lost sales of Apple hardware1. As soon as Steve Jobs returned to Apple, he killed the programme. UMAX was the last of the cloners to fall, managing to secure the only license to ship MacOS 8 (everyone else’s licenses ended with System 72), but the writing was on the wall. UMAX switched to making Wintel PCs - a market they since exited, reverting to their core strength of imaging products.


Today, a handful of dedicated people still build “hackintosh” computers from commodity parts, and then try to force OS X3 to run on them, with varying degrees of success. However, there is no officially sanctioned way of running OS X on any hardware not sold by Apple.


So, given this history and the results for Apple, why exactly do people feel the need to advise Apple to license iOS? Both the Macalope and Nick Heer of Pixel Envy have already done the hard work of eviscerating this wrong-headedness, but I couldn’t resist getting my own blow in.

First of all, iOS runs on its own system-on-a-chip (SoC) - currently, the A9 and A9X. Sure, this is based on the industry-standard ARMv8 architecture, but with substantial refinements added by Apple, which they would presumably be even more reluctant to license than iOS itself.

So let’s say Samsung or whoever either licenses the SoC design or builds their own (not a trivial exercise in itself), installs iOS, and sells the resulting device as the iGalaxy. Where would they position this frankenphone? It can’t be priced above Apple’s own offerings unless it brings something novel to the table.

What could that be? Maybe some device that spans the gap between Android and iOS? Well, here too, history can be our guide.

Back in my UMAX days, we did sell one very popular accessory. Basically it was a full-length PCI card with an entire x86 chipset and its own Intel CPU on it. Seriously, this thing was the biggest expansion board I have ever seen - the full width of the motherboard, so wide that it had a special support bracket in the case to prevent it sagging under its own weight. It also had its own CPU fan, of course, so it took up a fair amount of vertical space too. This allowed owners to run Windows on Intel side by side with MacOS on PowerPC, sharing a graphics card and input devices. Mind-blowing stuff in the mid-Nineties!

So in that vein, could a cloner conceivably sell a handset that could run Android apps natively side-by-side with iOS ones? Frankly, I doubt it. These days, it’s easier to emulate another platform, or just carry two phones. Maybe a few developers would be interested, but the market would be tiny.

It used to be the case that if you wanted a large phone (I refuse to call it a “phablet") you had to go with Android, because iPhones came in one size only. These days, Apple sells phones in a variety of sizes, from the small iPhone SE, through the standard iPhone, up to the iPhone Plus - so I can’t see the form factor being enough of a draw for people to go with a third-party device.


The only variable that’s left is price. Any iOS clone manufacturer would have to substantially undercut Apple’s cheapest devices to get sales. To do this, they would cut corners. By giving the device less RAM, or a non-Retina display, or less storage, or whatever, the cloners could lower the price point enough to get the initial sale - but Apple would be stuck with the horrible customer satisfaction issues from running on this below-par hardware.

That last point is particularly problematic because Apple’s entire business model is predicated upon taking, not the whole of the smartphone market, but the most profitable slice of it. One important consequence of this is that iOS is also the most profitable market for developers, because iOS users by definition have money to spend on apps. This is a virtuous circle for Apple, as the richer app ecosystem draws more users, which draws more development, and so on.4

If users - many of them first-time users, who are tempted into trying iOS by new low-cost clone devices - have a terrible experience, never buy apps, and replace their iOS device with an Android one as soon as they get the chance, that virtuous cycle turns vicious fast.

And that’s not even getting into the strategy tax Apple would be paying on other decisions. To cite another rumour that’s doing the rounds, could Apple drop the headphone jack from their own devices if there were cloners still manufacturing iOS devices that featured it? Maybe they could - but the decision would be much more fraught.

Bottom line, there is no iOS license fee that the cloners would pay that would also compensate Apple for both lost sales of their own hardware and for the wider market impact.

Apple tried this once, and it nearly killed them.

Can we please stop bringing up this idiotic idea now?5


  1. For more context from 1997, see here and search for “Why Apple Pulled the Plug". 

  2. What, you thought confusing name changes to Apple’s operating systems were a new thing? Hah. 

  3. See what I mean? Are we supposed to call it macOS already, or is it still OS X for now? So confused. 

  4. And of course Apple takes its cut from the App Store, too. 

  5. Of course not: when it comes to Apple, we’re always fighting the same battles. 