
Dragging the Anchor

Apple events may have become routine, and recorded events don't hit quite the same as ones with a live audience — even if I only ever viewed them remotely. However, they still have the potential to stir up controversy, at least among the sorts of people who follow Apple announcements religiously.

If you are not part of that group, you may not be aware that Apple’s MacBook Pro memory problem is worse than ever. Wait, what is going on? Is the RAM catching fire and burning people's laps or something?

No, nothing quite that bad. It's just that even in Apple's newest M3 MacBook Pro, the base configuration comes with a measly 8 GB of RAM, which is simply not adequate in the year 2023.

There has been a certain amount of pushback claiming that 8 GB is fine, actually — and it is true that Apple Silicon does use RAM differently than the old Intel MacBooks did, so 8 GB is not quite as bad as it sounds. But it sounds pretty bad, so there is still plenty of badness to be had!

Jason Koebler took on the critics in a piece titled In Defense of RAM at increasingly essential tech news site 404 Media:

It is outrageous that Tim Cook is still selling 8GB of RAM as the default on a $1,600 device. It is very similar to when Apple was selling the iPhone 6S with 16GB of storage as its base device, and people were talking themselves into buying it. It is not just a performance and usability problem, it’s a sustainability and environmental one, too. This is because RAM, historically one of the easiest components to upgrade in order to get more life out of your computer, on MacBook Pros cannot be upgraded and thus when 8GB inevitably becomes not enough, users have to buy a new computer rather than simply upgrade the part of the computer that’s limiting them.

This is the key point. If I may age myself for a moment, my first computer, a mighty Macintosh LC, had a whole whopping 4 MB of RAM — yes, four megabytes. But the default was two. The motherboard let owners expand the RAM up to a screaming 10 MB by swapping SIMMs (yes, this machine predated DIMMs).

These days, RAM is soldered to the motherboard of MacBooks, so whatever spec you buy is the most RAM that machine will ever have. If it turns out that you need more RAM, well, you’ll just have to buy a new MacBook — and figure out what to do with your old one.

This is obviously not great, as Jason Koebler writes in the piece I quoted above — but sustainability and environmental concerns can only do so much when weighed against more frequent upgrades and the increased profits they bring.

Here's the thing: that forced binary choice between environment and profit is a false dilemma, in this as in so many other cases.

Default configurations are extremely important to customer satisfaction and brand perception because they anchor the whole product line. Both uninformed consumers and large corporate buyers will gravitate to the default, so that is the experience that most of the users of the product will have.

We are talking here about the experience of using a MacBook Pro — not an Air, where you might expect a trade-off, but the nominally top-of-the-tree model that is supposedly designed for Professionals. If that experience is unsatisfactory and causes users to develop a negative opinion of their MacBook Pro, this becomes a drag on their adoption of the rest of the Apple ecosystem.

Is this the issue that is going to kill Apple? No, of course not. But it comes on top of so many other stories: we've had Batterygate, Antennagate, Bendgate, and I'm probably forgetting some other 'gates, not to mention iPhone sales being halted in France due to radiation concerns. None of these issues is actually substantive, but in the aggregate, slowly but surely, they erode Apple’s brand perception.

Negative press is a problem for any company, but it is a particular problem for Apple, because a lot of the value comes from the ecosystem. The all-Apple lifestyle is pretty great: MacBook unlocks with Apple Watch syncs with iPhone AirPlays to Apple TV served by Mac mini together with iPad — and that's just my house.

But I've been a Mac user since that little pizza-box LC in the 90s. If my first Apple experience were to be handed a nominally "Pro" machine, open a handful of browser tabs, and find it immediately slowing down, would I consider any other Apple devices? Or would I get an Android phone, a Garmin smartwatch, an Amazon Fire TV stick, and so on? Sure, Apple fans talk about how nice their world is, but this computer is just hateful.

That's the risk. Will Apple recognise it in time?

Fun In The Sun

A reliable way for companies to be seen as villains these days is to try to roll back concessions to remote work that were made during the pandemic1. Apple is of course a perennial scapegoat here, and while it seems reasonable that people working on next year's iPhone hardware might have to be in locked-down secure labs with all the specialised equipment they need, there is a lurking suspicion that much of the pressure on other Apple employees to return to the office is driven by the need to justify the massive expense of Apple Park. Jony Ive's last project for Apple supposedly cost over $4B, after all. Even for a company with Apple's revenues, that sort of spending needs to be justified. It's not a great look if your massive new vanity building is empty most of the time.

The same mechanisms are playing out in downtown business districts around the world, with commercial landlords worried about the long-term value of their holdings, and massive impacts on the services sector businesses (cafes, restaurants, bars, dry-cleaners, etc etc) that cluster around those office towers.

With all of this going on, it was probably inevitable that companies would try to jump on the bandwagon of being remote-work friendly — some with greater plausibility than others. I already mentioned Airbnb in a past post; they have an obvious incentive to facilitate remote work.

Other claims are, let's say, more far-fetched.

In a recent example of the latter genre, it seems that Citi is opening a hub in Málaga for junior bankers:

  • Over 3,000 Málaga hopefuls applied for just 27 slots in the two-year program, which promises eight-hour days and work-free weekends — practically unheard of in the traditional banking hubs in Manhattan and London. In exchange, Málaga analysts will earn roughly half the starting salaries of their peers.
  • The new Spain office will represent just a minuscule number of the 160 analysts Citi hired in Europe, the Middle East, and Africa, on top of another 300+ in New York.

This is… a lot less than meets the eye. 27 people, out of a worldwide intake of ~500 — call it 5% — will be hired on a two-year contract in one admittedly attractive location, and in exchange for reasonable working hours will take a 50% hit on their starting salary. In fairness, the lower cost of living in Málaga compared to London will make up a chunk of that gap, and having the weekends free to enjoy the place is not nothing, but apart from that, what is the upside here?

After the two years are up, the people who have been busy brown-nosing and visibly burning the midnight oil at head office will be on the promotion track. That is how banking works; if you can make it through the first few years, you have a) no social life any more, and b) a very remunerative career track in front of you. Meanwhile, it is a foregone conclusion that the people from the Málaga office will either not have their contract renewed after the two years are up, or will have to start their career track all over again in a more central location.

In other words, what this story boils down to is some short-term PR for Citi, a bunch of cheap(er) labour with a built-in termination date, and not much more.

Then again, it could be worse (it can always be worse). Goldman Sachs opted for the stick instead of the carrot with its own return to the office2 mandate, ending the free coffee that had been a perk of its offices.

Even after all these years in the corporate world, I am amazed by these utterly obvious PR own goals. The value of the coffee cart would have been infinitesimal, completely lost in Goldman's facilities budget. But what is the negative PR impact of this move? At one stroke they have hollowed out all the rhetoric of teamwork and empowerment that is the nominal justification for the return to the office.

Truly committing to a remote work model would look rather different. I love the idea of Citi opening a Málaga hub. The difference is that in a truly remote-friendly organisation, that office would not have teams permanently based in it (apart from some local support staff). Instead, it would be a destination hub for teams that are truly remote to assemble on a regular basis for planning sessions. The rest of the time, everyone would work remotely wherever they currently live.

Some teams do need physical proximity to work well, and some customer-facing roles benefit from having access to meeting space at a moment's notice — but a lot of the work of modern companies does not fall into these categories. Knowledge workers can do their work anywhere — trust me, I've been working this way for more than fifteen years. Some of my most productive work has been done in airport lounges, not even in my fully equipped home office! With instant messaging, video calls, and collaboration tools, there is no real downside to working this way. Meanwhile, the upside is access to a global, distributed talent pool. When I did have to go into an office, it was so painful to be in an open-plan space with colleagues who were not on my actual team that I wore noise-cancelling headphones. If that's the situation, what's the point of commuting to an office?

This sort of reorganisation would admittedly not be great for the businesses that currently cluster around Citi offices and cater to the Citi employees working in those offices — but the flip side would be the massive benefits to businesses in those Citi employees' own home neighbourhoods. If you're not spending all your waking hours in Canary Wharf or Wall Street, you can do your dry cleaning at your local place, you can buy lunch around the corner instead of eating some over-priced plastic sandwich hunched over your desk, and you can get a better quality of life that way — maybe even in Málaga!

The only downside of working from home is that you have to pay for your own coffee and can't just get Goldman to foot the bill.


🖼️ Photos by Carles Rabada, Jonas Denil, and Tim Mossholder on Unsplash


  1. Not that the pandemic is quite over yet, but let's not get into that right now. 

  2. Never "return to work". This is a malicious rhetorical framing that implies we've all been slacking off at home. People are being asked to continue to work, and to return to the office to do so. They may want to pick up noise-cancelling headphones on their way in. 

Growing Pains

The iPad continues to (slowly, slowly) evolve into a Real Computer. My iPad Pro is my only personal computer — I don't have a Mac of my own, except for an ancient Mac mini that is plugged into a TV and isn't really practical to use interactively. It's there to host various network services or display to that TV.

For reasons I don't feel like going into right now, I don't currently have a work Mac to plug into my desk setup, so I thought I'd try out the new Stage Manager feature in iPadOS 16.

So, the bottom line is that it does work, and it suddenly makes the iPad feel like a rather different machine.

Some setup is required. Of course Stage Manager needs iPadOS 16; I've been running the beta on my iPad all summer, and it seems pretty stable. The second display needs to connect via USB-C; I already have my CalDigit dock set up that way, so that part was no problem. Using Stage Manager with an external display also requires an external keyboard and mouse, and these have to be connected by Bluetooth; the USB keyboard connected to my dock was not recognised. Without those peripherals, the external display only works for screen mirroring, which is a bit pointless in my opinion. Mirroring the iPad's display to another screen makes sense if you are showing something to someone, but then, why would you need Stage Manager?

Anyway, once I had everything connected, the external display started working as a second display. I was able to arrange the two displays correctly from Settings; some new controls appeared under Display & Brightness to enable management of the second display.

It's interesting to see what does and does not work. The USB microphone plugged into the dock — and the analogue headphones daisy-chained from that — worked without any additional configuration, but the speakers connected to the dock's SPDIF port were not visible to iPadOS. Luckily these speakers also support Bluetooth, so I'm still able to use them; it’s just a bit of a faff to have to connect three Bluetooth devices (keyboard, mouse, and speakers) every time I want to sit at my desk. The Mac is way easier: one USB-C cable, and you’re done. The second desktop display does not show up at all, but that's fair enough; even the first generation of M1 Macs didn't support two external displays. External cameras also do not show up, and there's not even any control, so it's the iPad's built-in camera or nothing.

There's some other weird stuff that I assume and hope is due to the still-beta status of iPadOS 16.

  • The Settings app does not like being on the external display in the least, and appears all squashed. My display is an Ultrawide, but weirdly, the Settings window is squashed horizontally. Maybe the Settings app in iPadOS has not received much attention given the troubled gestation of the new Settings app in macOS Ventura?
  • Typing in Mail and a couple of other apps (Evernote, Messages, possibly others I haven’t encountered yet) sometimes lagged — or rather, the keystrokes were all being received, but they would not be displayed, until I did something different such as hitting backspace or clicking the mouse. At other times, keystrokes showed up normally.
  • The Music app goes straight into its full-screen display mode when it's playing, even when the window is not full-screen. The problem is that the touch control at the top of that window, which would normally return to the usual display mode, does not work. Also, Music is one of the apps whose preview in the Stage Manager side area does not work, so it's always blank. This seems like an obvious place to display static cover art, even if we can't have live-updating song progress or whatever.
  • Sometimes apps jump from the external display to the iPad’s built-in, for instance if you open something in Safari from a different app.

What does work is that apps can be resized and rearranged, giving a lot more flexibility than the previous single-screen hover or side-by-side multitasking options. App windows can also be grouped to keep apps together in logical groups, such as the editor I'm typing this into and a Safari window to look up references. Again, this is something that I already did quite a lot with the pre-existing multitasking support in iPadOS, but it only really worked for two apps, plus one in a slide-over if you're really pushing it. Now, you can do a whole lot more.

I am glad that I came back to give Stage Manager another chance. I had played with the feature on my iPad without connecting it to anything, and found it unnecessarily complex. I do wonder how much of that is because I'm rocking an 11" rather than a 13" model. Certainly, I can see this feature being much more useful on a Mac, even standalone. However, Stage Manager on iPadOS truly comes into its own with an external display. This is a big step on the way to the iPad becoming a real computer rather than merely a side device for a Mac or a bigger iPhone.

It's worth noting that Stage Manager only works with the very latest iPads that use Apple silicon: iPad Air (5th generation), 11-inch iPad Pro (2021), and 12.9-inch iPad Pro (2021). It's probably not the time to be buying a new iPad Pro, with rumours that it's due for a refresh soon, maybe to an M2, unless you really really want to try Stage Manager right now. However, if you have an iPad that can support it, and an external display, keyboard, and mouse, it's worth trying it out to get a better idea of the state of the iPadOS art.


🖼️ Photos by author, except Stage Manager screenshot from Apple

Nice Tech, Pity About The Product

Like many IT types, my workspace has a tendency to acquire obsolete technology. When I shared a flat in London with somebody else who lives with the same condition, computers significantly outnumbered people; heck, operating systems sometimes outnumbered people, even after our then-girlfriends/now-wives moved in! At one point, we even had an AS/400 desk-side unit that we salvaged, until we realised we really didn't have anything fun to do with it and moved it on again.

In the big clear-out last year, I got rid of a bunch of the old stuff — yes, even some of the cables! One item made the opposite journey, though, from the depths of a box inside a cupboard of toner cartridges underneath a monitor so old it still has a 4:3 aspect ratio, to pride of place in my line of sight from my desk chair.

That item is the installation media for a thoroughly obsolete computer operating system from the 90s.

What Even Is BeOS?

BeOS was the brain-child of a bunch of ex-Apple people, including Jean-Louis Gassée, who worked for Apple through the 80s and was instrumental in the creation of the Newton, among other things. While Apple spent the 90s trying and failing to create a new operating system to replace the aging MacOS, Gassée and his merry band created a brand-new operating system called BeOS. The 90s were probably the last time in history that it was possible to do something like that; the platforms that have emerged since then (iOS and Android) are variations on existing platforms (NeXTSTEP/OS X, which slightly predates BeOS, and Linux respectively).

Initially targeted at AT&T's Hobbit CPUs, BeOS was soon ported to the PowerPC architecture. These were the CPUs that powered Apple computers at the time, the product of an alliance between Apple, IBM, and Motorola. Between them, the three companies hoped to foster the emergence of an ecosystem to rival (or at least provide an alternative to) Intel's dominant x86. In those days, Apple licensed a handful of manufacturers to build MacOS-compatible PowerPC computers, so Be quickly stopped manufacturing their own BeBox hardware and switched to offering the BeOS to people who owned these computers — or actual Apple Macs, I suppose, but even at the time you didn't hear of many people doing that.

This is where BeOS first entered my life. If you can believe it, the way you found out about cool software in those pre-broadband days was to buy a printed magazine that would come with a CD full of demos, shareware, utilities, wallpapers, icon sets, and more. There were a few magazines that catered to the Apple enthusiast market, and in 1997, I happened to pick one up that included Preview Release 2 of the BeOS.1

Luckily for me, I owned a whopping 500 MB external SCSI drive, so I didn't have to mess around with reformatting the main HDD of the family computer (which was probably all of 2 GB at the time, kids!). I was quickly up and running with the BeOS, which absolutely blew away the contemporary Macintosh operating system.

Why Bother With BeOS?

The performance was the first and most obvious difference between BeOS and MacOS. Just watching GLTeapot spinning around in real time was amazing, especially compared to what I was used to in MacOS on the same hardware. Check out this contemporary review, focusing specifically on BeOS’ multimedia capabilities.

This was also my first exposure to a bash terminal, or indeed any command-line interface beyond MS-DOS, and I can safely say that it was love at first sight, especially once I started understanding how the output of one command could be passed to another, and then the whole thing wired up into a script.

BeOS offered proper preemptive multitasking and memory protection, in a way that Classic MacOS very definitely didn't. This factor made me consider it as a full-time replacement for MacOS on the family computer, but the lack of hardware support killed that idea. Specifically, the Global Village Teleport fax/modem which was our connection to the early Internet, running at a blazing-fast 14.4 kbps, did not work in BeOS.

This lack was doubly annoying since BeOS shipped with an actual web browser: NetPositive, one of whose claims to fame was its haiku error messages. At the time, Mac users had to choose between Netscape Navigator, Microsoft Internet Explorer, Apple's almost wilfully obscure Cyberdog, and early versions of Opera.

What Happened To BeOS?

This is where we get to the point of the story. What killed BeOS was not any sort of issue with the technology. It was leaps and bounds ahead of both dominant operating systems of the day, with massive developer interest.

Unfortunately, Be did not own its own destiny. After failing to sell itself to Apple, Be staggered on for a few more years. Once it became obvious that Apple was going to kill the MacOS clone business which powered the ecosystem of non-Apple PowerPC hardware that BeOS ran on, an x86 port was quickly added. By this point dual-booting operating systems on x86 had become, if not exactly mainstream, at least somewhat common in technical circles. Unfortunately for Be, the second OS (after Windows, of course) was almost always Linux. A second commercial operating system was always going to be a hard sell in a world where everyone had already paid for a Windows licence as part of the purchase price for their PC, to the point that Be literally couldn't even give it away. In fact Be actually sued Microsoft over its alleged monopolistic practices, possibly the last gasp of the Microsoft antitrust wars of the late 90s.2

Be was eventually sold to Palm, and after Palm's own travails, the last vestiges of BeOS disappeared from public view only a few years later.

The lesson here is that the best technology does not always win — or at least, does not win unaided. Execution is key, and Be, despite some very agile pivots, failed to execute to the point of making any meaningful dent in the personal-computer-OS market.

What could Be have done differently? It's hard to say, even with the benefit of hindsight. None of the alternative desktop operating systems that sprang up in the late 80s and early 90s have survived. BeOS? Gone. OS/2 Warp? Gone. All the commercial UNIX systems? Gone — but maybe next year will be the year of Linux on the desktop. NeXT? It got acquired by Apple, and the tech is still with us in every current Apple platform — but if Be had been the one to get bought to replace the failed Copland project, NeXT would certainly have been the one to disappear.

That is the one inflection point really worth considering: what if Gassée had managed to negotiate a deal with Apple back then? What would OS X be like today if it were based on BeOS rather than on NeXTSTEP?3 And… what would Apple be like without Steve Jobs, in hindsight the most valuable part of the NeXT acquisition? There would probably still be a mobile product; one of the key Be employees was Steve Sakoman, godfather of the Newton, so it seems fairly certain that a descendant of some sort would have emerged from a Be-infused Apple. But would it have become the globe-spanning success of the iPhone (and iPad) without Steve Jobs to market it?

One day I would like to own both a BeBox and a NeXTcube,3 but for now I just keep that BeOS PR2 CD as a tech industry memento mori, a reminder to myself not to get caught up in the elegance of the tech, but always to remember the product and the use cases which that tech enables.


  1. I could have sworn it was MacAddict, which was definitely my favourite magazine at the time, but the only references I can find online say it was MacTech, and it's been long enough that I can't be sure. 

  2. Be's travails did inspire at least one high-profile fan, with Neal Stephenson discussing BeOS in his book-length essay In the Beginning... Was the Command Line, as well as giving it a cameo in Cryptonomicon (alongside "Finux", his gossamer-thin Linux-analogue). 

  3. Yes, weird capitalisation has always been part of the computer industry. 

App Stores & Missing Perspectives

In Apple-watching circles, there has long been significant frustration with Apple's App Store policies. Whether it's the opaque approvals process, the swingeing 30% cut that Apple takes out of any purchase, or the restrictions on what types of apps and pricing models are even allowed, developers are not happy.

It was not always this way: when the iPhone first launched, there was no App Store. Everything was supposed to be done with web apps. Developers being developers, people quickly worked out how to "jailbreak" their iPhones to install their own apps, and a thriving unofficial marketplace for apps sprang up. Apple, seeing this development taking place out of their control, relented and launched an official App Store. The benefit of the App Store was that it would do everything for developers: hosting, payment processing, a searchable catalogue, everything. Remember, the App Store launched in 2008, when all of that was quite a bit harder than it is today, and would have required developers to make up-front investments before even knowing whether their apps would take off — without even thinking about free apps.

With the addition of in-app purchase (IAP) the next year, and subscriptions a couple of years after that, most of the ingredients were in place for the App Store as we know it today. The App Store was a massive success, trumpeted by Apple at every opportunity. In January, Apple said that it paid developers $60 billion in 2021, and $260 billion since the App Store launched in 2008. Apple also reduced its cut from 30% to 15%, initially for the second year of subscriptions, but later for any developer making less than $1M per year in the App Store.
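To make those percentages concrete, here is the arithmetic for a hypothetical $4.99/month subscription. The price point is my example, not anything Apple quotes; only the commission rates come from the paragraph above.

    // Developer proceeds under the two App Store commission rates mentioned above.
    // The $4.99 price point is purely illustrative.
    let monthlyPrice = 4.99

    let standardCut = 0.30  // Apple's original 30% commission
    let reducedCut  = 0.15  // 15% from year two of a subscription, or for small developers

    let standardProceeds = monthlyPrice * (1 - standardCut)  // ≈ $3.49 per subscriber per month
    let reducedProceeds  = monthlyPrice * (1 - reducedCut)   // ≈ $4.24 per subscriber per month

    print(standardProceeds, reducedProceeds)

Per subscriber, the difference between the two rates works out to about 75 cents a month, which adds up quickly at scale.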

What's Not To Like?

This all sounds very fine, but developers are up in arms over Apple's perceived high-handed or even downright rapacious behaviour when it comes to the App Store. Particular sticking points are the requirements that apps in the App Store use only Apple's payment system, and that Apple's own in-app purchasing mechanism be used for any digital experience offered to groups of people. The first requirement touched off a lawsuit from Epic, who basically wanted to have their own private store for in-game purchases, and the second resulted in some bad press early in the pandemic when Apple started doing things like chasing fitness instructors who were providing remote classes while they were unable to offer face-to-face sessions.

The bottom line is that many of these transactions simply do not have a 30% margin in the first place, let alone the ability to still make any profit after giving Apple a 30% (or even a 15%) cut. This might seem to be a problem for developers and nobody else — but what gave the issue resonance beyond the narrow market of iOS developers is that the world has moved on since 2008.

Hosting an app and setting up payment for it is easy and cheap these days, thanks to the likes of AWS and Stripe. Meanwhile, App Store review is capricious, while also allowing through all sorts of scams, generally based on subscriptions — what is becoming known as fleeceware.

The long and the short of it is that public opinion has shifted against Apple, with proceedings not just in the US, but in Korea, Japan, and the Netherlands too. Apple are being, well, Apple, and refusing to budge except in the most minor and grudging ways.

Here is my concern, though: this situation is being looked at as a simple conflict between Apple and developers. In all the brouhaha, nobody ever mentions another very important perspective: what do users want?

Won't Somebody Think Of The Users?

Developers rightly point out that the $260B that Apple trumpeted having paid them was money generated by their apps, not Apple's generosity, and that a big part of the reason users buy Apple's devices is the apps in the App Store. However, that money was originally paid by users, and we also have opinions about how the App Store should work for our needs and purposes.

First of all, I want all of the things that developers hate. I want Apple's App Store to be the only way of getting apps on iPhones, I want all subscriptions to be in the App Store, and I want Apple's IAP to be the only payment method. These are the factors that make users confident in downloading apps in the first place! Back when I had a Windows machine, it was just accepted that every twelve months or so, you'd have to blow away your operating system and reinstall it from scratch. Even if you were careful and avoided outright malware, bloat and cruft would take over and slow everything to a crawl — and good luck ever removing anything. Imagine a garden that you weed with a flamethrower.

The moment Apple relaxed any of the restrictions on app installation and payment, shady developers would stampede through — led by Epic and Facebook, who both have form when it comes to dodgy sideloading. It doesn't matter what sort of warnings Apple put into iOS; if that were to become how people get their Fortnite or their WhatsApp, they would tap through any number of dialogues without reading them, just as fast as they could tap. And once that happens, all bets are off. Subscriptions to Epic's games or to whatever dodgy thing lives in Facebook's platform would not be visible in users' App Store profiles, making it all too easy for money to be drained out, through forgetfulness and invisibility if not outright scams.

Other Examples: The Mac

People sometimes bring up the topic of the Mac App Store, which operates along the same notional lines as the iOS (and iPadOS) App Store, but without the same problems. The Mac App Store is actually a great example, but not for the reasons its proponents think. On the Mac, sideloading — deploying apps without going through the Mac App Store — is very much a thing, and in fact it is a much bigger delivery channel than the Mac App Store itself. The problem is that it is also correspondingly harder to figure out what is running on a Mac, or to remove every trace of an app that the user no longer wants. It's nowhere near as bad as Windows, to be clear, but it's also not as clean-cut as iOS, where deleting an app's icon means that app is gone, no question about it.

On the Mac, technical users have all sorts of tools to manage this situation, and that extra flexibility also has many other benefits, making the Mac a much more capable platform than iOS (and iPadOS — sigh). But many more people own iPhones and iPads than own Macs, and they are comfortable using those devices precisely because of the sandboxed1 nature of the experience. My own mother, who used to invite me to lunch and then casually mention that she had a couple of things she needed me to do on the computer, is fully independent on her iPad, down to and including updates to the operating system. This is because the lack of accessible complexity gives her confidence that she can't mess something up by accident.

More Examples: Google

Over the pandemic, I have had the experience of comparing Google's and Apple's family controls, as my kids have required their own devices for the first time for remote schooling. We have a new Chromebook and some assorted handed-down iPads and iPhones (without SIM cards). The Google controls are ridiculously coarse-grained and easily bypassed — that is, when they are not actively conflicting with each other: disabling access to YouTube breaks the Google login flow… In contrast, Apple lets me be extremely granular in what is allowed, when it is allowed, and for how long. Once again, this is possible because of Apple's end-to-end control: I can see what apps are associated with each kid's account, and approve or decline them, enforce limits, and so on. I don't want to have to worry that they will subscribe to a TikTok creator or something, outside the App Store, and drain my credit card, possibly with no way to cancel or get a refund.

What Now?

Good developers like Marco Arment want to build a closer relationship with customers and manage that process themselves. I do trust Marco to use those tools ethically — but I don't trust Mark Zuckerberg with the same tools, and this is an all-or-nothing decision. If the status quo is the price of keeping Mark Zuckerberg out of my business, then I'd rather keep the status quo.

All of that said, I do think Apple are making things harder on themselves. Their unbending attitude in the face of developers' complaints is not serving them well, whether in the court of public opinion or in the court of law. I do hope that someone at Apple can figure out a way to give enough to developers to reduce the noise — cut the App Store take, make app review more transparent, enable more pricing models, perhaps even refunds with more developer input, whatever it takes. There are also areas where the interests of developers and users are perfectly aligned: search ads in the App Store are gross, especially when they are allowed against actual app names. It's one thing (albeit still icky) to allow developers to pay to increase their ranking against generic terms, like "podcast player"; it's quite another to allow competing podcast players to advertise against each other by name. Nobody is served by that.

If Apple does not clear up this mess themselves, the risk is that lawmakers will attempt to clear it up for them. This could go wrong in so many ways, whether it's specific bad policies (sideloading enforced by law), or a patchwork of different regulations around the world, further balkanising the experience of users based on where they happen to live.

Everyone — Apple, developers, and users — wants these platforms to (continue to) succeed. For that to happen, Apple and developers need to talk — and users' concerns must be heard too.


🖼️ Photos by Neil Soni on Unsplash


  1. Yes, I am fully aware that the sandboxing is at the OS level and technically not affected by any App Store changes, but it's part of a continuum of experience, and I would rather not rely on the last line of defence in the OS; I would prefer a continuum between the OS and the App Store to give me joined-up management. In fact, I would like the integration to go even further, such that if I delete an app that has an active subscription, iOS prompts me to cancel the subscription too. 

Spending Tim Cook's Money

Mark Gurman has had many scoops in his time covering Apple, and they have led him to a perch at Bloomberg that includes a weekly opinion column. This week's column is about how Apple is losing the home, and it struck a chord with me for a few reasons.

First of all, we have to get one thing out of the way. There is a long and inglorious history of pundits crying that Apple must make some particular device or risk ultimate doom. I mean, Apple must be just livid at missing out on that attractive netbook market, right? Oh right, no, that whole market went away, and Apple is doing just fine selling MacBook Airs and iPads.

That said, the reason this particular issue struck home is that I have been trying to get stuff done around the house, and really felt the absence of what feel like some obvious gap-filling devices from Apple. As long as we are spending Tim Cook's money, here are some suggestions of my own — and no, there are no U2 albums on this list!

Can You See Me Now?

FaceTime is amazing; it is by far the most pleasant video-chat software to use. Adding Center Stage on the iPad Pro makes it even better. It has the potential to be a game-changer for group calls — not the Zoom calls where each person is in their own box, but calls where several people are in one place, trying to talk to several people in another place. Examples are families with the kids lined up on the couch, or trying to play board or card games with distant friends. What I really want in those situations is a TV-size screen, but the Apple TV doesn't support any sort of camera. Yes, you can sort of fudge it by mirroring the screen of a smaller device onto the TV via AirPlay, but it's a mess and still doesn't work right. In particular, your eye is still drawn to the motion on the smaller screen, plus you have to find a perch for the smaller device somewhere close enough to the TV that you are "looking at" the people on the other end.

What I want is a good camera, at least HD if not 4K, that can perch somewhere around the TV screen and talk to the Apple TV directly, so that we can do a FaceTime call from the biggest screen in the house. Ideally, this device would also support Center Stage so that it could focus in on the speaker. Going the other way, the Apple TV should be able to use positional audio to make the voices of speakers on the far end come from the right place in your sound stage.

Can You Hear Me?

This leads me to the next question: I have dropped increasingly less subtle hints about getting a HomePod mini for Christmas, but if people decide against that (some people just don't like buying technology as a gift), I will probably buy at least one for myself. However, the existence of a HomePod mini implies the existence of a HomePod Regular and perhaps even a HomePod Pro — but since the killing of the original, no-qualifiers HomePod, the mini is the only product in its family. Big speakers are one of those things that are worth spending money on, in my opinion, but Apple simply does not want to take my money in this regard. Maybe they have one in the pipeline for 2022 and I will regret buying the mini, but right now I can only talk about what's in the current line-up.

Me, I Disconnect From You

This lack of interest in speakers intersects with a similar indifference when it comes to wifi. I loved my old AirPort base station, and the only reason I retired it is that I wanted a mesh network with some more sophisticated management options. If we are going to put wifi-connected smart speakers all over our homes, why not make them also act as repeaters of that same wifi signal? And they should also work as AirPlay receivers for external, passive speakers, for people who already have good speakers and just want them to be smart.

People Have Families

These additions to Apple's line-up would do a lot more to help Apple "win the home" than Mark Gurman's suggestion of a big static iPad that lives in the kitchen. Apart from the cost of such a thing, it would also require Apple to think much more seriously about multi-user capabilities than they ever have with i(Pad)OS, so that the screen recognises me and shows me my reminders, not my wife's.

Something Apple could do today in the multi-user space is to improve CarPlay. My iPhone remembers where I parked my car and puts a pin in the map. This is actually useful, because (especially these days) I drive my car infrequently enough that I often genuinely do have to think for a moment about where I left it. Sometimes, though, I drive my wife's car, and then it helpfully updates that "parked car" pin, overwriting the location where I parked my car with the last location of my wife's car — which is generally the garage under the building we live in… The iPhone knows that they are two different cars and lets me maintain car-specific preferences; it just doesn't track them separately in Maps. As long as we are wishing, it would be even better if, when my wife drives her car and leaves it somewhere, the pin could update on my phone too, since we are all members of the same iCloud Family.
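To make the wish concrete, here is a minimal sketch in Swift of the data model change I am asking for. Everything in it is hypothetical (Apple exposes no such "parking store" API): the point is simply that the pin should be keyed by vehicle rather than being a single global value, and synced across the iCloud Family.

    import Foundation
    import CoreLocation

    // Hypothetical sketch: one parked location per vehicle, instead of a
    // single global "parked car" pin that each drive overwrites.
    struct ParkedCar {
        let vehicleID: String  // e.g. the CarPlay pairing that preferences already key off
        var location: CLLocationCoordinate2D
        var parkedAt: Date
    }

    final class ParkingStore {
        // One entry per vehicle, not one pin per household.
        private var parked: [String: ParkedCar] = [:]

        // Called when a vehicle disconnects and the phone concludes it has been parked.
        func recordParking(vehicleID: String, at location: CLLocationCoordinate2D) {
            parked[vehicleID] = ParkedCar(vehicleID: vehicleID,
                                          location: location,
                                          parkedAt: Date())
            // In the wishlist version, this update would also propagate to other
            // members of the same iCloud Family, so my wife parking her car
            // never touches the pin for mine.
        }

        func lastKnownLocation(of vehicleID: String) -> ParkedCar? {
            parked[vehicleID]
        }
    }

The phone already has every input this needs: it knows which car it was connected to (that is how the car-specific preferences work), and it knows who is in the Family. It just throws the distinction away when it drops the pin.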

This would be a first step to a better understanding of families and other units of multiple people who share (some) devices, and the sorts of features that they require.


🖼️ Photo by Howard Bouchevereau on Unsplash

Lessons in Hiring

Some of the most insightful and succinct commentary on the whole Antonio García Martínez debacle comes from an ungulate with a Classic Mac for a head:

the Macalope believes Apple should not have hired García Martínez only to fire him. He believes it never should have hired him in the first place.

I'm not going to go over all of the many (many, many) red flags about this person's opinions that should have at the very least triggered some additional scrutiny before hiring him. The reaction from Apple employees was entirely predictable and correct. Even if the misogynistic opinions expressed in his public writing were exaggerated for effect, as he now claims, there would always be a question mark around his interactions with female employees or those from minority backgrounds. At the very least, that would be enormously disruptive to the organisation.

Leaving that aspect aside for a moment: even if this had been someone with the most milquetoast opinions possible (and no NYT bestselling book in which to trumpet them), it's still not great that Apple was looking for someone with his specific professional experience — honed at Facebook.

This particular hire blew up in Apple's face — but it's extremely concerning for Apple users that they were actively recruiting for this type of experience in the first place.

I'll lay my cards on the table: I dislike the idea of search ads as a category, especially in the App Store. We can argue the merits of allowing apps to "jump the queue" of results for generic searches, but as it is today, you can buy yourself into a position ahead of your competitor even for direct searches on that competitor app's name. Where is the value to users in that?

Display ads in Apple News or Stocks, which are the other two Apple properties discussed, might be acceptable — as long as they are not too intrusive. I don't have as much of a philosophical issue as some do with Apple using first-party tracking data within iOS, precisely because those data are not available to other parties or to other platforms. It's easy to opt out of Apple's tracking, simply by not using those apps, and ads from there won't follow me around the rest of the web.

The lesson I hope that Apple takes away from this whole situation is not "don't hire people with big public profiles" but "users really hate sleazy adtech". I would hate for Apple to go the way of YouTube, which is becoming unusable due to ad load. I understand that Apple is trying to boost its Services revenue, and App Store search ads are a way to do that, but if it makes my user experience worse, that's a problem. Apple products command a premium in large part because of how nice they are for users; anything that undermines that niceness weakens the rationale for staying in the Apple camp.

Why The M1 Won't Kill The iPad Pro

Quick, it's happening again!

This is my third CPU architecture transition since I've been a Mac user. I started on the 68k, with the weedy 68020 in my first Mac LC. When Apple moved to PowerPC, I cajoled my parents into getting a 603e — still relatively weedy in the context of the dual 604s I got to play with at work, but a massive upgrade from the LC. By the time of the Intel transition I was out of the Apple fold — I couldn't afford one as a student, and later prioritised gaming and dual-booting with Linux.

However, when the 11" MacBook Air launched — yes, the very first one, with no ports and no power — I spent my own money rather than use the massive corporate-issue Dell that was assigned to me. Since then I've never looked back; every work computer I've had since that tiny MacBook Air has been a MacBook. My personal computer is my iPad Pro, but I also have a 2nd-gen Mac mini1 which runs headless to take care of various things around the house. An upgrade to SSD and maxed-out 16 GB of RAM keeps it chugging away, nearly a decade in.

When Apple announced the new M1-based Macs, I was blown away like everyone else by the performance on offer. I was particularly relieved to see the M1 Mac mini in the line-up, not because I have any urgent need to upgrade, but just to know that it remains a product in Apple's line-up, for whenever I might need to upgrade in the future. In the same way, I'm not pushing for an early upgrade of my work-issued MacBook Pro, because one of the must-haves for me is support for multiple monitors. I'm assuming that will come with the rumoured 14" Pros that are more clearly differentiated from the Air, so that's what I'm waiting for there.

Most of the commentary is trying to differentiate between the new Air and Pro, and figuring out whether to replace an iMac Pro (or even a Mac Pro!) with the M1 Mac mini. Some, though, have gone the other way, comparing the new MacBook Air to the iPad Pro. One ZDNet article's conclusion is that "Apple's M1 MacBook Air kills the iPad Pro for the rest of us", but I'm not so sure.

Over-reach

My iPad is a substantially different device from my MacBook, and it gets used for different things, even when I have both within arm's reach. Let's dig into those differences, because they are key to understanding what (I think) Apple's strategy will be for the Mx MacBook and the iPad Pro in the future.

Form Factor

All of the comparisons in that ZDNet article set the big 12.9" iPad Pro against the 13" MacBook Air — which is fair enough on the MacBook side, since that's what Apple has announced so far. On the iPad side, though, most people have the smaller size — currently 11" — and that is the more meaningful basis for differentiation. We'll see whether that changes when and if Apple ever releases a successor to my beloved MacBook Air 11", or SWMBO's (just) MacBook 12", aka the MacBook Adorable — but for now, if you want an ultra-portable device without sacrificing power, the smaller iPad Pro still has an edge.

External Display

Seriously, who connects an external display to an iPad? AirPlay is far more relevant for that use case. Meanwhile, I'm actually more bothered about the fact that no M1 MacBook allows for more than one monitor to be connected.

Webcam

This is a long-standing weak point of the MacBook line, and it's going to be hard to remedy simply due to physics. A better webcam requires more depth, meaning a thicker cover around and behind the screen. Again, though, the use case matters: it's more important for the iPad to have a good built-in webcam, because a MacBook is more likely to have an external one for people who really do care about image quality, resting on top of that external monitor. People who use their MacBook for work care a lot less about image quality anyway, because they may well be looking at a shared document rather than headshots most of the time.

What's Missing

A surprising omission from the list of differences between MacBook and iPad is the operating system. iOS — or rather, iPadOS — is a big differentiator here, because it affects everything about how these devices are actually used. This is the same mistake as we see in those older PC reviews that only compared the hardware specs of Macs to Wintel devices, missing out entirely on the differentiation that came from running macOS as opposed to Windows.


Confusion

I think the confusion arises from the Magic Keyboard, and how it makes the iPad Pro look like a laptop. This is the foundational error in this list of recommendations to improve the iPad Pro.

Adopt a landscape-first mindset. Rotate the Apple logo on the back and move the iPad’s front-facing camera to the side beneath the Apple Pencil charger to better reflect how most people actually use their iPad Pros.

No! Absolutely not! I use my iPad in portrait mode a lot more than I use it in landscape! Does it bug me that the Apple logo is rotated when I'm using it with the keyboard? Sure, a little bit — but by definition, I can't see it while I'm doing that.

Release a new keyboard + trackpad case accessory that allows the iPad to be used in tablet mode without removing it from the case.

Now this one I can stand behind: I still miss my origami keyboard case for my iPad Pro, which sadly broke. You could even rotate the Apple logo on the case, while leaving the one on the device in its proper orientation, if you really wanted to.

The reason I still miss that origami case is that I didn't replace it when it broke, thinking I would soon be upgrading my iPad Pro and would get a new keyboard case for the new-style flat-edged design. Then Apple did not refresh the iPad Pro line this year, so I still have my 10.5" model.

I do wonder whether this could be the reason why the iPad Pro didn't get updated when the new iPad and iPad Air did. That is, could there be an even better one coming, one that differentiates more clearly against the M1 MacBook Air?

Then again, Apple may be getting ready to release a convergent device: a fold-over, touch- & Pencil-enabled MacBook. They would never tell us, so we'll just have to wait and see, credit cards at the ready.


  1. Yes, that really is how you're supposed to capitalise it. No, really.

In-App Drama

Everyone and their dog has followed the saga of Hey and Apple — but in case you missed some of the twists and turns, this is a decent recap from The Verge.

My own opinion can be summed up as follows: "Wait, a hundred bucks a year?1 For email? In 2020? Are you insane?" (We also discussed the Hey saga on the most recent episode of the Roll For Enterprise podcast.) In fact, I am far more interested in Bye, the Hey parody that promises to reply to all your email with insults.

That said, there are a couple of different aspects to this story that I think are worth looking at in more detail. One is the PR debacle that this whole saga has been for Apple, and the other is what any of it means for users.

PR Ju-Jitsu

The fact that all this drama went down in the week before WWDC, at the very same time as the EU opened antitrust investigations into Apple’s App Store practices, led many to wonder whether this could be some mastermind move to generate the sort of PR money can’t buy for an email app (because, again, email simply is not exciting in 2020. Ahem).

I don’t buy it. Oh, I am sure that the Hey team chose to launch the week before WWDC very consciously to get more attention, but they could never have expected Apple to approve their initial release, then reject a bug fix, and finally to be so ham-fisted in all of their subsequent moves. To be sure, David Heinemeier Hansson (DHH on Twitter, Hey and Basecamp cofounder) rode the PR wave masterfully, positioning himself as the David (ha!) to Apple’s Goliath. He was largely successful in this effort, judging by an entirely unscientific survey of my Twitter feed.

On the other hand, I am equally sure that Apple did not deliberately set out to pick a fight with a Twitter loudmouth in the week before the biggest event of their year. It does seem that they have been trying for some time to get more paid apps to use their own in-app-purchase (IAP) mechanism, and the reviewer(s) for Hey didn’t anticipate this level of blowback from one more enforcement decision in what is already a long list.

Apple PR did make some pretty heavy-handed and tone-deaf moves. At one point, a letter to Hey was apparently released to the press before it was sent to Hey, which is bad enough, but that letter contains language that DHH was easily able to present as a threat to his other apps in the App Store, which also do not use IAP:

Thank you for being an iOS app developer. We understand that Basecamp has developed a number of apps and many subsequent versions for the App Store for many years, and that the App Store has distributed millions of these apps to iOS users. These apps do not offer in-app purchase — and, consequently, have not contributed any revenue to the App Store over the last eight years. We are happy to continue to support you in your app business and offer you the solutions to provide your services for free — so long as you follow and respect the same App Store Review Guidelines and terms that all developers must follow.

To me this is not a threat, merely a statement of fact. Operating the App Store is not free, and Basecamp, by not offering IAP, has not contributed any revenue whatsoever to Apple.

Mob Tactics?

This is the key point: is Apple merely rent-seeking by attempting to extract their 30% cut from developers, or do they actually offer a service that is worth that overhead?

Ben Thompson has consistently been critical of the App Store’s regulations and their enforcement; in fact he goes so far as to consider it an antitrust issue, and made hay (or Hey) with this story:

I would go so far as to say that executives in the tech industry are more afraid of Apple in 2020 than they were of Microsoft two decades ago. App Store Review is such an absolute gatekeeper, and the number of ways that Apple can retaliate are so varied and hard to verify, that no one is willing to publicly breathe a word against the company — again, except for Basecamp. I wish I could prove this to you — the stories I have received the last few days tell the tale — but no one is willing to go on the record, to me or to regulators. The risk is too great, because Apple’s level of control, and willingness to use it, is so overwhelming. I wish I were exaggerating, but I’m not.

It’s certainly true that the App Store extracts rent from developers, but the key point is that it also adds substantial value. All of the coverage of Hey has focused on Apple and on developers, but I have not seen any significant discussion of the users’ point of view. Customers are more willing to engage with a single trusted intermediary like Apple than with vast numbers of unknown developers. Especially with subscriptions, which are notorious for being easy to start and difficult to impossible to cancel, Apple’s role in the process is invaluable.

The user experience is better because of Apple’s aggressive curation of the App Store, and users are more willing to take a chance on apps because of that curation, and because of the established trust relationship they already have with Apple.

Friction Is Traction

It’s easy for DHH to say that Apple is interposing itself between him and his customers. He would rather have a direct relationship with them, and keep the 30% for his bottom line. In his view, the App Store and IAP add unnecessary friction to what should be a smooth, direct exchange.

Here’s the thing, though: friction is not just a negative. If we remove all friction, we also lose all traction. Intermediaries like Apple add both friction and traction. The way they justify their 30% cut — the friction that DHH complains about — is by offering traction: the technical underpinnings of the App Store — hosting, payments, marketing, and so on — but also by enabling developers to take advantage of the trust that Apple has built up with its customers.

I am happy to have my credit card on file with Apple, so buying an app (or a book, or a film, or music, back before I subscribed to Apple Music) is a one-click process. One of the reasons I trust Apple with my credit card is that they let me see and manage my subscriptions in one place, and they let me cancel them, and even offer refunds on purchases, simply and quickly. I have spent thousands of euros with Apple if you add up apps, books, and media; if I had had to register separately for each one of those purchases, and ask myself "do I trust this vendor not to scam me or just make my life difficult in some way?", I would not have bought nearly as much.

The restrictions that Apple imposes on iOS — no sideloading of apps outside the App Store, sandboxing of individual apps, Apple ID login — may annoy developers and power users, but they also lower the barrier to installing new apps, because those apps cannot mess up anything else, either accidentally or on purpose. People who have experienced Windows are trained to be extremely reluctant to install new apps; no such caution is needed on iOS, in large part due to Apple’s oversight.

None of this is to say that the App Store experience is perfect for users. I could definitely use better search, as scammy developers seem to be winning this round against Apple and have made searching within the App Store almost pointless. The review process itself needs to be more aggressive in my opinion; especially with my eldest now using the App Store, I have discovered a whole lot of scammy IAP practices! Even then, though, the parental controls built into iOS beat anything Google offers.

Hey Hey, Bye Bye

Personally I hope Apple gets a fright and figures out a better way to continue to give me what I like as a user, without developers feeling ripped off. And regardless, there is no way I am dropping a hundred bucks a year2 on email.


  1. And it turns out, shorter account names cost even more: "Ultra-short 2-character addresses like ab@hey.com are $999/year, and 3-character addresses like abc@hey.com are $349/year." I mean, genius business model, charge whatever the traffic will bear and so on, but I just can’t even. 

  2. In fairness, Hey are hardly the only ones at the super-premium end of the email market. Superhuman charges $30/month to improve your Gmail experience, although this review is pretty uncomplimentary.

Won’t Somebody Think of the (Virtual) Users?

Here’s the thing with VR: nobody has yet figured out what — or who — it’s actually for.

It seems like you can’t throw a rock without hitting some wild-eyed evangelist for VR. Apparently the next big thing is going to be VR tourism. On the one hand, this sort of thing could solve problems with overcrowding. Imagine if instead of the Mona Lisa, smaller than you expected, behind a barrier of smudged glass and smartphone-wielding fellow tourists, you could spend undisturbed time as close as you wanted to a high-pixel-count scan. And of course, being VR, you could take selfies from any angle without needing to wield a selfie stick or worry about permits for your camera drone.

On the other, you wouldn’t get to spend time in Paris and experience everything else that the city has to offer. At that point, why not just stay home in your favourite chair, enjoying a piped-in virtual experience, like the passengers of the cruise ship in WALL-E?

That’s the question that the VR industry has yet to answer successfully. Much like commercial-grade fusion power, it remains fifteen years away, same as fifteen years ago, and fifteen years before that. In fact, back at the tail end of last century, I played Duke Nukem 3D in a pub1 with goggles, a subwoofer in a backpack, and something called a 3D mouse. The whole thing was tethered to a pretty hefty gaming PC, which back then probably meant a 166 MHz CPU and maybe a first-gen 3dfx Voodoo graphics card.

It was fun, in the immature way that Duke Nukem was, but once the novelty of looking around the environments had worn off, I didn’t see anything that would make me pay the not-inconsiderable price for a similar setup for myself.

A couple of years ago I was at some tech event or other — maybe MWC? — and had the chance to try the then-new Oculus headset. I was stunned at how little the state of the art had moved forward — but that’s what happens when there is no clear use case, no pull from would-be users of the product, just push from people who desperately want to make it happen.

Now, the (virtual) chickens are coming home to roost. This piece in Fast Company admits the problems, but punts on offering any solutions.

The industry raised an estimated $900 million in venture capital in 2016, but by 2018 that figure had plummeted to $280 million. Oculus—the Facebook-owned company behind one of the most popular VR headsets on the market—planned to deliver 1 billion headsets to consumers, but as of last year had sold barely 300,000.

Investments in VR entertainment venues all over the world, VR cinematic experiences, and specialized VR studios such as Google Spotlight and CCP Games have either significantly downsized, closed down, or morphed into new ventures.

[…]

Ultimately it is down to VR developers to learn from existing success stories and start delivering those "killer apps." The possibilities are limited only by imagination.

Apple, more clear-headed than most, is postponing the launch of its own VR and AR efforts. This is particularly significant because Apple has a history of not being the first mover in a market, but of defining the use case such that every other player follows suit. They did not have the first smartphone, or even the first touchscreen, but it’s undeniable that these days almost every phone out there looks like an iPhone.

It’s not clear at this stage whether the delay in their AR/VR efforts is due to technology limitations or the lack of a clear use case, but either way, the fact that they could not see a way to a useful product does not bode well for anyone else trying to make a go of this market.

Shipping The Org Chart

The players who are staying in are the ones who want VR and AR to succeed for their own reasons, not because they see huge numbers of potential users clamouring for it. This is a dangerous road, as Sun found out to their cost, back in the day.

Read the whole thread; it’s gold.

Here’s the problem for VR: while I don’t doubt that there is a small population of hardcore gamers who would love deeper immersion, there is no killer app for the rest of us. Even console gaming is struggling, because it turns out that most people don’t graduate from casual gaming on their smartphones to "serious gaming". This is the other thing that will kill Google Stadia.

The one play that Apple might have is the one that seems to be working with Apple Arcade: first get devices everywhere, then slowly add capabilities. If Apple came out with a physical controller, or endorsed a third-party one, Apple TV would be an interesting contender as a gaming platform. The same thing could work with AR/VR, if only they can figure out a use case.

If it’s just the Google Glass thing of notifications, but RIGHT IN YOUR EYEBALLS, I don’t think it will go anywhere. The only convincing end-user demo I’ve seen is walking or cycling navigation via a virtual heads-up display, but again, that’s a niche use case that won’t support an entire industry.

🖼️ This one image set back the AR industry by decades.

I already don’t have time for video, because it requires me to be somewhere I can pay attention to the screen, listen to the audio, and not be interrupted for maybe a quarter of an hour. Add the requirement for substantial graphics support and power consumption, not to mention the headset itself, and extend that timeline to match, and the applicability of this technology shrinks even further.

But go ahead, prove me wrong.


🖼️ Top photo by Juan Di Nella on Unsplash


  1. This was back in the good old days before drinking-age laws were introduced, which meant that all of us got our drinking done when all we were in charge of was bicycles, limiting potential damage. By the time we got driving licenses, drinking was somewhat old-hat, so there was much less drive to mix the two.