Showing all posts tagged macos:

Dragging the Anchor

Apple events may have become routine, and recorded events don't hit quite the same as ones with a live audience — even if I only ever viewed them remotely. However, they still have the potential to stir up controversy, at least among the sorts of people who follow Apple announcements religiously.

If you are not part of that group, you may not be aware that Apple’s MacBook Pro memory problem is worse than ever. Wait, what is going on? Is the RAM catching fire and burning people's laps or something?

No, nothing quite that bad. It's just that even in Apple's newest M3 MacBook Pro, the base configuration comes with a measly 8 GB of RAM, which is simply not adequate in the year 2023.

There has been a certain amount of pushback claiming that 8 GB is fine, actually — and it is true that Apple Silicon does use RAM differently than the old Intel MacBooks did, so 8 GB is not quite as bad as it sounds. But it sounds pretty bad, so there is still plenty of badness to be had!

Jason Koebler took on the critics in a piece titled In Defense of RAM at increasingly essential tech news site 404 Media:

It is outrageous that Tim Cook is still selling 8GB of RAM as the default on a $1,600 device. It is very similar to when Apple was selling the iPhone 6S with 16GB of storage as its base device, and people were talking themselves into buying it. It is not just a performance and usability problem, it’s a sustainability and environmental one, too. This is because RAM, historically one of the easiest components to upgrade in order to get more life out of your computer, on MacBook Pros cannot be upgraded and thus when 8GB inevitably becomes not enough, users have to buy a new computer rather than simply upgrade the part of the computer that’s limiting them.

This is the key point. If I may age myself for a moment, my first computer, a mighty Macintosh LC, had a whole whopping 4 MB of RAM — yes, four megabytes. But the default was two. The motherboard let owners expand the RAM up to a screaming 10 MB by swapping SIMMs (yes, this machine predated DIMMs).

These days, RAM is soldered to the motherboard of MacBooks, so whatever spec you buy is the most RAM that machine will ever have. If it turns out that you need more RAM, well, you’ll just have to buy a new MacBook — and figure out what to do with your old one.

This is obviously not great, as Jason Koebler writes in the piece I quoted above — but sustainability and environmental concerns can only count for so much when set against more frequent upgrades and the increased profit they bring.

Here's the thing: that forced binary choice between environment and profit is a false dilemma, in this as in so many other cases.

Default configurations are extremely important to customer satisfaction and brand perception because they anchor the whole product line. Both uninformed consumers and large corporate buyers will gravitate to the default, so that is the experience that most of the users of the product will have.

We are talking here about the experience of using a MacBook Pro — not an Air, where you might expect a trade-off, but the nominally top-of-the-tree model that is supposedly designed for Professionals. If that experience is unsatisfactory and causes users to develop a negative opinion of their MacBook Pro, this becomes a drag on their adoption of the rest of the Apple ecosystem.

Is this the issue that is going to kill Apple? No, of course not. But it comes on top of so many other stories: we've had Batterygate, Antennagate, Bendgate, and I'm probably forgetting some other 'gates, not to mention iPhone sales being halted in France due to radiation concerns. None of these issues is actually substantive, but in the aggregate, slowly but surely, they erode Apple’s brand perception.

Negative press is a problem for any company, but it is a particular problem for Apple, because a lot of the value comes from the ecosystem. The all-Apple lifestyle is pretty great: MacBook unlocks with Apple Watch syncs with iPhone AirPlays to Apple TV served by Mac mini together with iPad — and that's just my house.

But I've been a Mac user since that little pizzabox LC in the 90s. If my first Apple experience was to be handed a nominally "Pro" machine, open a handful of browser tabs, and find it immediately slowing down, would I consider any other Apple devices? Or would I get an Android phone, a Garmin smartwatch, an Amazon Fire TV stick, and so on? Sure, Apple fans talk about how nice their world is, but this computer is just hateful.

That's the risk. Will Apple recognise it in time?

Nice Tech, Pity About The Product

Like many IT types, I have a workspace with a tendency to acquire obsolete technology. When I shared a flat in London with somebody else who lives with the same condition, computers significantly outnumbered people; heck, operating systems sometimes outnumbered people, even after our then-girlfriends/now-wives moved in! At one point, we even had an AS/400 desk-side unit that we salvaged, until we realised we really didn't have anything fun to do with it and moved it on again.

In the big clear-out last year, I got rid of a bunch of the old stuff — yes, even some of the cables! One item made the opposite journey, though, from the depths of a box inside a cupboard of toner cartridges underneath a monitor so old it still has a 4:3 aspect ratio, to pride of place in my line of sight from my desk chair.

That item is the installation media for a thoroughly obsolete computer operating system from the 90s.

What Even Is BeOS?

BeOS was the brain-child of a bunch of ex-Apple people, including Jean-Louis Gassée, who worked for Apple through the 80s and was instrumental in the creation of the Newton, among other things. While Apple spent the 90s trying and failing to create a new operating system to replace the aging MacOS, Gassée and his merry band created a brand-new operating system called BeOS. The 90s were probably the last time in history that it was possible to do something like that; the platforms that have emerged since then (iOS and Android) are variations on existing platforms (NeXTSTEP/OS X, which slightly predates BeOS, and Linux respectively).

Initially targeted at AT&T's Hobbit CPUs, BeOS was soon ported to the PowerPC architecture. These were the CPUs that powered Apple computers at the time, the product of an alliance between Apple, IBM, and Motorola. Between them, the three companies hoped to foster the emergence of an ecosystem to rival (or at least provide an alternative to) Intel's dominant x86. In those days, Apple licensed a handful of manufacturers to build MacOS-compatible PowerPC computers, so Be quickly stopped manufacturing their own BeBox hardware and switched to offering the BeOS to people who owned these computers — or actual Apple Macs, I suppose, but even at the time you didn't hear of many people doing that.

This is where BeOS first entered my life. If you can believe it, the way you found out about cool software in those pre-broadband days was to buy a printed magazine that would come with a CD full of demos, shareware, utilities, wallpapers, icon sets, and more. There were a few magazines that catered to the Apple enthusiast market, and in 1997, I happened to pick one up that included Preview Release 2 of the BeOS.¹

Luckily for me, I owned a whopping 500 MB external SCSI drive, so I didn't have to mess around with reformatting the main HDD of the family computer (which would have been all of 2 GB at the time, kids!). I was quickly up and running with the BeOS, which absolutely blew away the contemporary Macintosh operating system.

Why Bother With BeOS?

The performance was the first and most obvious difference between BeOS and MacOS. Just watching GLTeapot spinning around in real time was amazing, especially compared to what I was used to in MacOS on the same hardware. Check out this contemporary review, focusing specifically on BeOS’ multimedia capabilities.

This was also my first exposure to a bash terminal, or indeed any command-line interface beyond MS-DOS, and I can safely say that it was love at first sight, especially once I started understanding how the output of one command could be passed to another, and then the whole thing wired up into a script.
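For anyone who never got to have that experience, here is the sort of one-liner that so delighted teenage me. To be clear, this is a reconstruction using standard Unix tools (all of which BeOS's bash environment shipped with), not something recovered from a 1997 backup:

    # List the unique words in a text file, most frequent first.
    # ("essay.txt" is just a stand-in for whatever file is to hand.)
    tr 'A-Z' 'a-z' < essay.txt | tr -cs 'a-z' '\n' | sort | uniq -c | sort -rn | head

Each little program does one job and hands its output to the next; once a pipeline like this works, you can paste it into a script file and reuse it forever.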

BeOS offered proper preemptive multitasking and protected memory, in a way that Classic MacOS very definitely didn't. This made me consider it as a full-time replacement for MacOS on the family computer, but the lack of hardware support killed that idea. Specifically, the Global Village Teleport fax/modem which was our connection to the early Internet, running at a blazing-fast 14.4 kbps, did not work in BeOS.

This lack was doubly annoying since BeOS shipped with an actual web browser: NetPositive, one of whose claims to fame was its haiku error messages. At the time, Mac users were stuck between Netscape Navigator, Microsoft Internet Explorer, Apple's almost wilfully obscure Cyberdog, and early versions of Opera.

What Happened To BeOS?

This is where we get to the point of the story. What killed BeOS was not any sort of issue with the technology. It was leaps and bounds ahead of both dominant operating systems of the day, with massive developer interest.

Unfortunately, Be did not own its own destiny. After failing to sell itself to Apple, Be staggered on for a few more years. Once it became obvious that Apple was going to kill the MacOS clone business which powered the ecosystem of non-Apple PowerPC hardware that BeOS ran on, an x86 port was quickly added. By this point, dual-booting operating systems on x86 had become, if not exactly mainstream, at least somewhat common in technical circles. Unfortunately for Be, the second OS (after Windows, of course) was almost always Linux. A second commercial operating system was always going to be a hard sell in a world where everyone had already paid for a Windows license as part of the purchase price of their PC, to the point that Be literally couldn't even give BeOS away. In fact, Be actually sued Microsoft over its alleged monopolistic practices, possibly the last gasp of the First Browser War of the late 90s.²

Be was eventually sold to Palm, and after Palm's own travails, the last vestiges of BeOS disappeared from public view only a few years later.

The lesson here is that the best technology does not always win — or at least, does not win unaided. Execution is key, and Be, despite some very agile pivots, failed to execute to the point of making any meaningful dent in the personal-computer-OS market.

What could Be have done differently? It's hard to say, even with the benefit of hindsight. None of the alternative desktop operating systems that sprang up in the late 80s and early 90s have survived. BeOS? Gone. OS/2 Warp? Gone. All the commercial UNIX systems? Gone — but maybe next year will be the year of Linux on the desktop. NeXT? It got acquired by Apple, and the tech is still with us in every current Apple platform — but if Be had been the one to get bought to replace the failed Copland project, NeXT would certainly have been the one to disappear.

That is the one inflection point really worth considering: what if Gassée had managed to negotiate a deal with Apple back then? What would OS X be like today if it were based on BeOS rather than on NeXTSTEP?³ And… what would Apple be like without Steve Jobs, in hindsight the most valuable part of the NeXT acquisition? There would probably still be a mobile product; one of the key Be employees was Steve Sakoman, godfather of the Newton, so it seems fairly certain that a descendant of some sort would have emerged from a Be-infused Apple. But would it have become the globe-spanning success of the iPhone (and iPad) without Steve Jobs to market it?

One day I would like to own both a BeBox and a NeXTcube,³ but for now I just keep that BeOS PR2 CD as a tech industry memento mori, a reminder to myself not to get caught up in the elegance of the tech, but always to remember the product and the use cases which that tech enables.


  1. I could have sworn it was MacAddict, which was definitely my favourite magazine at the time, but the only references I can find online say it was MacTech, and it's been long enough that I can't be sure. 

  2. Be's travails did inspire at least one high-profile fan, with Neal Stephenson discussing BeOS in his book-length essay In the Beginning... Was the Command Line, as well as giving it a cameo in Cryptonomicon (alongside "Finux", his gossamer-thin Linux-analogue). 

  3. Yes, weird capitalisation has always been part of the computer industry. 

Why The M1 Won't Kill The iPad Pro

Quick, it's happening again!

This is my third CPU architecture transition since I've been a Mac user. I started on the 68k, with the weedy 68020 in my first Mac LC. When Apple moved to PowerPC, I cajoled my parents into getting a 603e — still relatively weedy in the context of the dual 604s I got to play with at work, but a massive upgrade from the LC. By the time of the Intel transition I was out of the Apple fold — I couldn't afford one as a student, and later prioritised gaming and dual-booting with Linux.

However, when the MacBook Air launched — yes, the very first 11" model, with no ports and no power — I spent my own money on one rather than use the massive corporate-issue Dell that was assigned to me. Since then I've never looked back; every work computer I've had since that tiny MacBook Air has been a MacBook. My personal computer is my iPad Pro, but I also have a 2nd-gen Mac mini¹ which runs headless to take care of various things around the house. An SSD upgrade and a maxed-out 16 GB of RAM keep it chugging away, nearly a decade on.

When Apple announced the new M1-based Macs, I was blown away like everyone else by the performance on offer. I was particularly relieved to see the M1 Mac mini in the line-up, not because I have any urgent need to upgrade, but just to know that it remains a product in Apple's line-up, for whenever I might need to upgrade in the future. In the same way, I'm not pushing for an early upgrade of my work-issued MacBook Pro, because one of the must-haves for me is support for multiple monitors. I'm assuming that will come with the rumoured 14" Pros that are more clearly differentiated from the Air, so that's what I'm waiting for there.

Most of the commentary tries to differentiate between the new Air and Pro, or to work out whether to replace an iMac Pro (or even a Mac Pro!) with the M1 Mac mini. Some, though, have gone the other way, comparing the new MacBook Air to the iPad Pro. One ZDNet article concludes that "Apple's M1 MacBook Air kills the iPad Pro for the rest of us", but I'm not so sure.

Over-reach

My iPad is a substantially different device from my MacBook, and it gets used for different things, even when I have both within arm's reach. Let's dig into those differences, because they are key to understanding what (I think) Apple's strategy will be for the Mx MacBook and the iPad Pro in the future.

Form Factor

All of the comparisons in that ZDNet article pit the big 12.9" iPad Pro against the 13" MacBook Air — which is fair enough on the MacBook side, since that's what Apple has announced so far. On the iPad side, though, most people have the smaller size — currently 11" — and that is the more meaningful basis for differentiation. We'll see whether that changes when and if Apple ever releases a successor to my beloved MacBook Air 11", or SWMBO's (just) MacBook 12", aka the MacBook Adorable — but for now, if you want an ultra-portable device without sacrificing power, the smaller iPad Pro still has an edge.

External Display

Seriously, who connects an external display to an iPad? AirPlay is far more relevant for that use case. Meanwhile, I'm actually more bothered about the fact that no M1 MacBook allows for more than one monitor to be connected.

Webcam

This is a long-standing weak point of the MacBook line, and it's going to be hard to remedy simply due to physics. A better webcam requires more depth, meaning a thicker cover around and behind the screen. Again, though, the use case matters: it's more important for the iPad to have a good built-in webcam, because a MacBook is more likely to have an external one for people who really do care about image quality, resting on top of that external monitor. People who use their MacBook for work care a lot less about image quality anyway, because they may well be looking at a shared document rather than headshots most of the time.

What's Missing

A surprising omission from the list of differences between MacBook and iPad is the operating system. iOS — or rather, iPadOS — is a big differentiator here, because it affects everything about how these devices are actually used. This is the same mistake as we see in those older PC reviews that only compared the hardware specs of Macs to Wintel devices, missing out entirely on the differentiation that came from running macOS as opposed to Windows.


Confusion

I think the confusion arises from the Magic Keyboard, and how it makes the iPad Pro look like a laptop. This is the foundational error in this list of recommendations to improve the iPad Pro.

Adopt a landscape-first mindset. Rotate the Apple logo on the back and move the iPad’s front-facing camera on the side beneath the Apple Pencil charger to better reflect how most people actually use their iPad Pros.

No! Absolutely not! I use my iPad in portrait mode a lot more than I use it in landscape! Does it bug me that the Apple is rotated when I'm using it with the keyboard? Sure, a little bit — but by definition, I can't see it while I'm doing that.

Release a new keyboard + trackpad case accessory that allows the iPad to be used in tablet mode without removing it from the case.

Now this one I can stand behind: I still miss my origami keyboard case for my iPad Pro, which sadly broke. You could even rotate the Apple logo on the case, while leaving the one on the device in its proper orientation, if you really wanted to.

The reason I still miss that origami case is that I didn't replace it when it broke, thinking I would soon be upgrading my iPad Pro and would get a new keyboard case to fit the new-style flat-edged models. Then Apple did not refresh the iPad Pro line this year, so I still have my 10.5" model.

I do wonder whether this could be the reason why the iPad Pro didn't get updated when the new iPad and iPad Air did. That is, could there be an even better one coming, that differentiates more clearly against the M1 MacBook Air?

Then again, Apple may be getting ready to release a convergent device: a fold-over, touch- & Pencil-enabled MacBook. They would never tell us, so we'll just have to wait and see, credit cards at the ready.


  1. Yes, that really is how you're supposed to capitalise it. No, really

Whammo, Camo

Everyone is raving about a new app called Camo that lets you use your iPhone as a webcam for your computer.

This is a great idea at first blush, since any recent iPhone has way better cameras than the webcam on any Mac you can buy. This is why webcams sold out in the early days of lockdown, when everyone was on Zoom all the time, realised just how bad the image quality was, and rushed out to buy better options. Serious streamers have of course long used full-on DSLRs as their webcams, but that’s a whole other level of expense — and the little dongles to connect those as webcams also sold out for ages.

Camo therefore seems like a really good idea, taking advantage of the great cameras in the phone that you already have. Unfortunately, it has a number of downsides in practice.

The big one is that it’s not wireless¹; your iPhone will need to be connected to your Mac² by a cable. If you have a (just plain) MacBook like SWMBO’s, with a single USB-C port, you’re already in trouble. Her non-Pro iPhone 11 came with a USB-A cable, so she would need to find her dongle, and then hope that she doesn’t need the single port for anything else. If she did have a USB-C cable, of course, she could choose between charging the laptop or connecting the phone — still not ideal. Newer models don’t have the single-port conundrum, but there are still plenty of MacBooks out there that only have one or two ports.

The other problem with the wired setup is that it means you can’t just prop your phone up and go; you’re going to need either a dock of some description with a connector already in place, or a stand or tripod to hold your phone. I had avoided buying one of these by getting a case with a kickstand built into it, but it’s not possible to connect a cable this way. I could of course put the phone in landscape mode, but that way the camera is far too low, giving viewers the full NostrilCam effect.

So okay, I can pick up a tripod of some sort from Amazon for not too much money — but speaking of money, here’s the big Achilles’ heel: while the free version of the app is fairly functional, upgrading to Pro costs €41.47. That’s nearly fifty bucks! Sure, I’d like to be able to use all my cameras; in free mode the app shows me the Wide 1x camera and the front selfie camera, but is it worth that much to use the Telephoto 2x or the Ultrawide 0.5x? Pro unlocks higher resolutions, and there are also a bunch of options to control focus, lighting, flash, zoom, and so on, which I would definitely have bought for a fiver or so — but not for this much. I already have a decent webcam at home (a Razer Kiyo), so I’d be using Camo only away from home, and I’d have to acquire and carry a separate piece of kit to do so.

I do wonder how many of this app’s target market are going to make the same evaluation as me. Most people who wanted a better webcam already bought one, which already limits the target market, and while many might be attracted to the simplicity of using their phone as a webcam, once the reality of what it takes to do that starts to sink in, I doubt many will pony up. The thirty-day refund does go some way to reduce the downsides, but at least for me it’s not enough.

I hate to be the person who quibbles at paying for software, but this is a lot for a very single-purpose app. I pay for other software I use — but a year’s subscription to Evernote costs about the same as this thing, and I get a lot more value from Evernote.

Nice app, though.


  1. The developer pinged me after this post went live, and apparently wifi support is coming in a month or so, and portrait mode too. That might just be enough to change my opinion. If so, I’ll come back and update this post again to link to a more complete review — one in which I actually use the app. 

  2. Windows support is coming, per the Camo website. 

The User-Unfriendly Web

Why I love Reader mode in Safari – a tale in two screenshots.

Before:

I have to scroll down more than an entire screen to see even the first line of the text. This design is actively user-hostile.

After:

Much better! I still have the title and the sub-heading, but now I can also see the image and the first paragraph of the text. Result.

What is even better is configuring Safari to use Reader mode automatically on offending websites:

Just a pity I can’t do that on iOS; the setting is only available in the macOS version of Safari. Still, a guy can dream…


UPDATE: I was wrong! It is in fact possible to do the same thing on iOS; just hold down on the Reader Mode icon.

Not especially discoverable, perhaps, but very useful. Better still, since finding out about this I have discovered that the same long-press works in desktop Safari too.

Going From Caffeine To Amphetamine

No, this is not a post about controlled substances – even though it is Friday!

In the past, I have recommended some useful apps to improve your presenter game, including a great little tool called Caffeine. Unfortunately the upgrade to macOS Mojave seems to have finally killed off Caffeine, which is fair enough really since it has not been updated in some years.

Luckily, there is a fantastic alternative called Amphetamine, which sticks to Caffeine’s attractive pricing of "free". It does the same job that Caffeine did, sitting quietly in the menu bar until you need to prevent your display from sleeping – perhaps because it is connected to an external projector and you are trying to show something other than your cool screensaver.

On top of that, though, Amphetamine offers a ton of configuration options. My favourite is that you can create triggers which will automatically prevent your Mac from sleeping in certain conditions. I created a trigger so that my Mac will automatically stay awake when I connect my presentation remote, for instance.

If you don’t want a pill icon in your menu bar, you can also change it to something else, including a version of Caffeine’s classic coffee cup.
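As an aside: if you ever find yourself on a Mac where you can't install anything at all, macOS itself ships a bare-bones equivalent in the form of the caffeinate command-line tool. It has no triggers and no menu bar icon, but it covers the basics:

    # Keep the display awake until you press Ctrl-C
    caffeinate -d

    # Keep the system awake for an hour (the timeout is in seconds)
    caffeinate -i -t 3600

    # Stay awake only while a given command runs
    # (./backup.sh stands in for whatever long-running job you care about)
    caffeinate -i ./backup.sh

Handy in a pinch, but Amphetamine's triggers are far more civilised for everyday use.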

Do check out Amphetamine and let me know how you get on.

Thoughts about WWDC '17

First of all, let’s get the elephant in the room out of the way; no new iPhone was announced. I was not necessarily expecting one to show up - that seems more suited to a September event, unless there were specific iOS features that were enabled by new hardware and that developers needed to know about.

We did get a whole ton of new features for iOS 11 (it goes up to eleven!), but many of them were aimed squarely at the iPad. With no new iPhone, the iPad got most of the new product glory, sharing only with the iMac Pro and the HomePod (awful name, by the way).

On that note, some people were confused by the iMac Pro, but Apple has helpfully clarified that there is also going to be a Mac Pro and external displays to go with it:

In addition to the new iMac Pro, Apple is working on a completely redesigned, next-generation Mac Pro architected for pro customers who need the highest-end, high-throughput system in a modular design, as well as a new high-end pro display.

I doubt I will ever buy a desktop Mac again, except possibly if Apple ever updates the Mac mini, so this is all kind of academic for me - although I really hope the dark-coloured wireless extended keyboard from the iMac Pro will also be available for standalone purchase.

What I am really excited about is the new 10.5" iPad Pro and the attendant features in iOS 11¹. The 12.9" is too big for my use case (lots of travel), and the 9.7" Pro always looked like a placeholder device to me. Now we have a full lineup, with the 9.7" non-Pro iPad significantly different from the 10.5" iPad Pro, and the 12.9" iPad Pro there for people who really need the larger size - or maybe just don’t travel with their iPad quite as much as I do.

My current iPad (an Air 2) is my main personal device apart from my iPhone. The MacBook Pro is my work device, and opening it up puts me in "work mode", which is not always a good thing. On the iPad, I do a ton of reading, but I also create a fair amount of content. The on-screen keyboard and various third-party soft-tip styluses (styli?) work fine, but they’re not ideal, and so I have lusted after an iPad Pro for a while now. However, between the lack of sufficient hardware differentiation compared to what I have², and lack of software support for productivity, I never felt compelled to take the plunge.

Now, I can’t wait to get my hands on an iPad Pro 10.5".

I already use features like the sidebar and side-by-side multitasking, but what iOS 11 brings is an order of magnitude beyond - especially with the ability to drag & drop between applications. Right now, while I may build an outline of a document on my iPad, I rarely do the whole thing there, because it is just so painful to do any complex work involving multiple switches between applications - so I end up doing all of that on my Mac.

The problem is that there is friction in working with a Mac; I need (or feel that I need) longer stretches of time and a more work-like environment to pull out my Mac. That friction is completely absent with an iPad; I am perfectly happy to get it out if I have more than a minute or so to myself, and there is plenty of room to work on an iPad in settings (such as, to pick an example at random, an economy seat on a short-haul flight) where there is simply no room to type on a Mac.

The new Files app also looks very promising. Sure, you can sort of do everything it does in a combination of iCloud Drive, Dropbox, and Google Drive, and I do - but I always find myself hunting around for the latest revision, and then turning to the share sheet to get whatever I need to where I can actually work on it.

With iOS 11, it looks like the iPad will truly start delivering on its promise as (all together now) a creation device, not just a consumption device.

Ask me again six months from now…

And if you want more exhaustive analysis, Federico Viticci has you covered.


  1. Yes, there was also some talk about the Watch, but since I gave up on fitness tracking, I can't really see the point in that whole product line. That's not to say that it has no value, just that I don't see the value to me. It certainly seems to be the smartwatch to get if you want to get a smartwatch, but the problem with that proposition is that I don't particularly want any smartwatch. 

  2. To me this is the explanation for the 13 straight quarters of iPad sales drop: an older iPad is still a very capable device, and outside of very specific use cases, or people upgrading from something like an iPad 2 or 3, there hasn’t been a compelling reason to upgrade - yet. For me at least, that compelling reason has arrived, with the combination of 10.5" iPad Pro and iOS 11. After the holiday quarter, I suppose we will find out how many people feel the same way. 

New Mac Fever

Apple bloggers are all very excited about the announcement of a new Mac Pro. The best roundup I have seen is on Daring Fireball: The Mac Pro Lives.

I'm not a Mac Pro user, nor frankly am I ever likely to be. My tastes lie more at the other end of the spectrum, with the ultra-portable MacBook (aka MacBook Adorable). However, there was one interesting tidbit for me in the Daring Fireball report:

Near the end, John Paczkowski had the presence of mind to ask about the Mac Mini, which hadn’t been mentioned at all until that point. Schiller: "On that I’ll say the Mac Mini is an important product in our lineup and we weren’t bringing it up because it’s more of a mix of consumer with some pro use. … The Mac Mini remains a product in our lineup, but nothing more to say about it today."

While there are certainly Mac Mini users who choose it as the cheapest Mac, and perhaps as a way to keep using a monitor and other peripherals that used to be plugged into a PC, there is a substantial contingent of Mac Mini "pro" users. Without getting into Macminicolo levels of pro-ness, I run mine headless in a cupboard, where it serves iTunes and runs a few other services. It's cheap, quiet, and reliable, which makes it ideal for that role. I don't necessarily need ultimate power - average utilisation is extremely low, although there is the odd peak - but I do want to be reassured that this is a product line that will stick around, just in case my current Mac Mini breaks.

The most important Macs are obviously the MacBook and MacBook Pros, but it's good to know that Apple recognises a role for the Mac Pro - and for the Mac Mini.

What about those new Macs?

The progression is so established, it's now entirely predictable. Several times a year, Apple puts on an event to announce their latest hardware or software product - if indeed that is still a valid distinction, now that our devices are basically spimes already.

Even before the event, the leaks begin. Soon they are coming thick and fast - and hot on their heels are the think pieces, decrying how this time Apple have really lost it, and what can they be thinking in Cupertino?

Finally, the day of the event actually arrives. The actual hardware is greeted with yawns - after all, nearly-final pictures of the products have been available for weeks. Any surprises are limited to software - and even then, extended and increasingly public betas mean that it is only minor software that is actually still new by the time it is officially unveiled. The TV app is an example of an announcement that Apple managed to keep the lid on. Ultimately this was possible because it had no leaky supply chain of physical parts and accessories, and also no need to make a beta available to developers.

Then, as the last lingering effects of the Reality Distortion Field fade, the post-event hangover begins. That’s when the Macalope starts rubbing his hooves together in glee because of all the wonderful source material that people create just for him to make fun of.

This time around, the major criticism appears to be that Apple are not making enough different devices to satisfy different use cases. The two new MacBook Pro models (with and without Touch Bar) do not have enough RAM or enough ports, or they should still have older ports and SD card slots instead of going over wholesale to USB-C, or whatever. Oh, and while they were at it, Apple should also have updated all of their other computers.

Newsflash: this is how Apple has always done things, at least this century. Android users have always criticised the iPhone for its limited storage options and complete lack of expandability or external connectivity. None of that criticism has stopped the iPhone from taking basically all of the profit in the smartphone market, propelling Apple to being either the biggest company in the world or a close runner-up, depending on exactly when you make your measurement.

And I have yet to need to charge my iPhone 7 while also using the Lightning-to-TRS adapter that came in the box. I have also literally never used the SD card slot on any of my MacBooks over the years.

There was a time when Apple did offer many different options - and it was a massive disaster that almost sank the company. Seriously, check out this table just listing out all of the different models that Apple had under the Performa sub-brand, and how they mapped to the almost-but-not-quite identical "professional" models sold under different names.

That image is from Riccardo Mori, who adds the following bit of context:

Yes, it is a crowded space. The strategy behind this offering seems to be "Let’s try to cover every possible point of the spectrum, with regard to form factor, expandability, target audience, etc." This of course led to confusion, because there were some Macintosh models just as powerful as others, but coming in a different shape, or with one less card slot or expansion bay. And also because there were many Macintosh models delivering a similar performance. There was a lot of differentiation and little differentiation at the same time, so to speak.

On top of the confusion just within Apple’s own line-up, this was the period when you could also buy a legitimate and authorised Macintosh clone. In other words, Macintosh buyers could buy exactly what they wanted, and no two Macs were alike.

Apple nearly died from the results. Once Steve Jobs returned, applied the paddles, and revived the business, he set about rationalising the product line-up, to the point that people joked that "Steve hates SKUs".

So Apple didn’t update the MacBook Air with a Retina screen? Big deal - the computer you want is the MacBook Pro without Touch Bar (known as the "MacBook Escape" to listeners of ATP), which has basically all the good bits from the Air and upgrades everything else. That’s for a 13" screen size; if you wanted the 12" option, the MacBook (aka "MacBook One" or "MacBook Adorable") is your ultra-portable design choice.

YES, these are more expensive devices - again, if you’re surprised, you have not been paying attention. Apple’s products have always been positioned at the premium end of the market, with a price tag to match. It is important to note that those prices are generally not completely out of touch with reality. While you certainly can buy cheaper phones or laptops, once an Android or Windows device is specced up to the same power and size as the Apple equivalent, the price is usually not too far off the Apple option. Apple famously burns people with the price of upgrades to RAM or storage, but again, they have been doing this since I was a Mac tech in high school, literally twenty years ago. This has always been part of their business model; it's not some unwelcome surprise that they only just sprang on people now.

Fundamentally, Apple does not believe in giving people too many options. They are famously opinionated in their product design, and if you’re after ultimate flexibility - well, these may not be the products for you. However, they are the right products for very many people. Apple has made moves like this for a very long time; remember the howls of derision and outrage when they first announced the original iMac with no floppy disk drive? Or remember the MacBook Air - only USB and wifi? And look at it now - basically the default laptop for everyone, if you count its many imitators. These days, even famously brick-like ThinkPads come with dongles, because they’re too thin to accommodate all ports!

On the other hand, the last time Apple tried to be flexible and accommodate too many different user populations, it almost killed them. Is it any wonder that they are doubling down on what worked?

Anyway, this is all academic for me as I’m not due to replace my "Early 2015" MacBook Pro for another year or so, so I will get to see how this Touch Bar thing works in practice - and then get the improved and updated second-generation Touch Bar.


UPDATE: Right after I posted the above, I came across an interview with Phil Schiller in the Independent. The whole thing is worth a read, but this line is particularly illuminating:

We know we made good decisions about what to build into the new MacBook Pro and that the result is the best notebook ever made, but it might not be right for everyone on day one. That’s okay, some people felt that way about the first iMac and that turned out pretty good.

That’s exactly it: Apple made decisions about how to make the best product in their opinion, while recognising that the result may not be perfect for everyone. The opposite of this approach would be that ridiculous - and mercifully stillborn - Project Ara phone design.

Send In The Clones

Since my last post rehashing ancient IT industry history seemed to go over well, here’s another one.

In that previous post, I used the story of the HP acquisition of Mercury and its rumoured impending spin-off as a cautionary tale about handling acquisitions correctly. There is never any lack of "fantasy M&A" going on in this industry, and one of its longest-running subjects is Apple.

I’ve actually been a Mac user long enough that I can remember when the rumour of the week would be, not about whom Apple should buy, but about who was going to buy Apple. Would it be Dell? Would it be Sony? Would it be Silicon Graphics? Would it be Sun? Would it be IBM?

Twenty years later, that catalogue is ridiculous on the face of it. Only one of those companies even still meets the two core criteria, namely a) existence, and b) being a PC manufacturer. However, in the mid-90s, things were not at all rosy at Apple, and management was getting desperate. How desperate? They approved a programme that licensed the MacOS to other manufacturers, who could then make and sell their own fully-legal and -compatible MacOS computers.

As it happened, I had a front-row seat for all of this. In the mid-90s I was still in high school, but given that in Italy high school is a morning-only affair, I took on an afternoon job at the local Apple reseller. Unbeknownst to me, they had also just signed up to be the Italian reseller for UMAX, one of those MacOS clone makers (also known as SuperMac in the US).

UMAX had already been around for a while, and had made a name for themselves with a range of scanners that went from consumer-grade to very definitely pro-grade. The most expensive machine I dealt with was a $25k A3 flat-bed scanner with 9600 dpi optical resolution. Photographers and other graphic artists from all over Italy were already dealing with this company, so the value proposition of a cheaper Mac for their work was pretty obvious.

Where things got exciting was when performance of the UMAX machines started to overtake that of contemporary Macs. This was in the days of the Motorola/IBM PowerPC CPU, and Mac performance was already starting to suffer compared to contemporary Intel chips. Therefore, when UMAX brought to market a dual-604e motherboard, available with not one but two screaming-fast 200 MHz CPUs, this was big news - not least because they not only undercut the price of the equivalent PowerMac 9600, but beat it to market as well.

(Embarrassingly, I blew up the very first one of those machines to come to Italy. It had a power supply with a physical switch to change from 115v US-style power to the full-strength 230v juice we enjoy in Europe. I did check the switch before plugging in the cable, but BANG! Turned out, the switch was not properly connected on the inside of the PSU… Luckily, all that had blown was the power supply itself, not the irreplaceable motherboard, and we got it swapped out in double-quick time and nobody ever found out… until now.)

Anyway, this was all great fun for me, still in high school and all, and everyone was doing very well out of the arrangement - except for Apple. The licensing fee for MacOS that they were receiving did not even come close to replacing the profit they missed out on from all the lost sales of Apple hardware¹. As soon as Steve Jobs returned to Apple, he killed the programme. UMAX was the last of the cloners to fall, managing to secure the only license to ship MacOS 8 (everyone else’s licenses ended with System 7²), but the writing was on the wall. UMAX switched to making Wintel PCs - a market they have since exited, reverting to their core strength of imaging products.

Today, a handful of dedicated people still build "hackintosh" computers from commodity parts, and then try to force OS X³ to run on them, with varying degrees of success. However, there is no officially sanctioned way of running OS X on any hardware not sold by Apple.


So, given this history and the results for Apple, why exactly do people feel the need to advise Apple to license iOS? Both the Macalope and Nick Heer of Pixel Envy have already done the hard work of eviscerating this wrong-headedness, but I couldn’t resist getting my own blow in.

First of all, iOS runs on its own system-on-a-chip (SoC) - currently, the A9 and A9X. Sure, this is based on the industry-standard ARMv8 architecture, but with substantial refinements added by Apple, which they would presumably be even more reluctant to license than iOS itself.

So let’s say Samsung or whoever either licenses the SoC design or builds their own (not a trivial exercise in itself), installs iOS, and sells the resulting device as the iGalaxy. Where are they going to position this frankenphone? It can’t be priced above Apple’s own offerings unless it brings something novel to the table.

What could that be? Maybe some device that spans the gap between Android and iOS? Well, here too, history can be our guide.

Back in my UMAX days, we did sell one very popular accessory. Basically it was a full-length PCI card with an entire x86 chipset and its own Intel CPU on it. Seriously, this thing was the biggest expansion board I have ever seen - the full width of the motherboard, so wide that it had a special support bracket in the case to prevent it sagging under its own weight. It also had its own CPU fan, of course, so it took up a fair amount of vertical space too. This allowed owners to run Windows on Intel side by side with MacOS on PowerPC, sharing a graphics card and input devices. Mind-blowing stuff in the mid-Nineties!

So in that vein, could a cloner conceivably sell a handset that could run Android apps natively side-by-side with iOS ones? Frankly, I doubt it. These days, it’s easier to emulate another platform, or just carry two phones. Maybe a few developers would be interested, but the market would be tiny.

It used to be the case that if you wanted a large phone (I refuse to call it a "phablet") you had to go with Android, because iPhones came in one size only. These days, Apple sells phones in a variety of sizes, from the small iPhone SE, through the standard iPhone, up to the iPhone Plus - so I can’t see the form factor being enough of a draw for people to go with a third-party device.

The only variable that’s left is price. Any iOS clone manufacturer would have to substantially undercut Apple’s cheapest devices to get sales. To do this, they would cut corners. By giving the device less RAM, or a non-Retina display, or less storage, or whatever, the cloners could lower the price point enough to get the initial sale - but Apple would be stuck with the horrible customer satisfaction issues from running on this below-par hardware.

That last point is particularly problematic because Apple’s entire business model is predicated upon taking, not the whole of the smartphone market, but the most profitable slice of it. One important consequence of this is that iOS is also the most profitable market for developers, because iOS users by definition have money to spend on apps. This is a virtuous circle for Apple, as the richer app ecosystem draws more users, which draws more development, and so on.⁴

If users - many of them first-time users, who are tempted into trying iOS by new low-cost clone devices - have a terrible experience, never buy apps, and replace their iOS device with an Android one as soon as they get the chance, that virtuous cycle turns vicious fast.

And that’s not even getting into the strategy tax Apple would be paying on other decisions. To cite another rumour that’s doing the rounds, could Apple drop the headphone jack from their own devices if there were cloners still manufacturing iOS devices that featured it? Maybe they could - but the decision would be much more fraught.

Bottom line, there is no iOS license fee that the cloners would pay that would also compensate Apple for both lost sales of their own hardware and for the wider market impact.

Apple tried this once, and it nearly killed them.

Can we please stop bringing up this idiotic idea now?⁵


  1. For more context from 1997, see here and search for "Why Apple Pulled the Plug". 

  2. What, you thought confusing name changes to Apple’s operating systems were a new thing? Hah. 

  3. See what I mean? Are we supposed to call it macOS already, or is it still OS X for now? So confused. 

  4. And of course Apple takes its cut from the App Store, too. 

  5. Of course not: when it comes to Apple, we’re always fighting the same battles