Showing all posts tagged tech:

Internet of Meh

There was some excitement when it seemed that 100,000 "smart" devices had been corralled into a botnet used for sending spam. While Ars Technica says there’s more (or less) to that story, I think the situation is both worse and better than reported.

Bad first: of course those devices are vulnerable! Think: once we get past the early adopters, these things are going to be in the hands of people running unpatched Windows XP, who want to call the fire brigade if you mention firewalls, and whose oven (or VCR, heaven help us) has been blinking 12:00 since it was installed.

The manufacturers will also stop updating the things after about two months of shelf life. Most of the apps on my four-year-old "smart" TV no longer work, to the point that I never even bothered connecting it to the net when we moved house. I threw out a Skype phone because it was never updated for Windows 7, never mind any other platform. And I could go on...

Even after we have accounted for incompetence and laziness, there’s always malice. What happens if the low-powered smart devices that are going to be running the Internet of Things are actually hiding out inside other Things?

We’re doomed, then? The Internet of Things will actually be an Internet of (Even More) Spam?


Well, smeg.1

Well, no. Most of these smart devices will never be connected to the internet in the first place, because the owners won’t be bothered to do it. They will just keep using the TV as a TV and the fridge as a fridge, without worrying about the extra feeping creatures.

Saved by sloth. Result.


  1. Smeg

Where is cloud headed in 2014?

Cross-posted to my work blog


There's an old joke that in China, Chinese food is just called food. The main thing that will happen in 2014 is that cloud will become just computing.

Cloud has gone mainstream. Nobody, whether start-up or enterprise, can afford to ignore cloud-based delivery options. In fact, in many places it's now the default, which can lead to its own problems.

The biggest change in 2014 is the way in which IT is being turned inside out. Whereas before the rhythm of IT was set by operations teams, now the tempo comes from users, developers, and even outside customers. IT operations teams had always relied on being able to set their own agenda, making changes in their own time and drawing their own map of what is inside or outside the perimeter.

The new world of IT doesn't work like that. It's a bit like when modern cities burst their medieval walls, spreading into what had been fields under the walls. The old model of patrolling the walls, keeping the moat filled and closing the gates at night was no longer much use to defend the newly sprawling city.

New strategies were required to manage and defend this new sort of city, and new approaches are required for IT as well.

One of my first customer meetings of 2014 brought a new term: "polyglot management". This is what we used to call heterogeneous management, but I think calling it polyglot may be more descriptive. Each part of the managed infrastructure speaks its own language, and the management layer is able to speak each of those languages to communicate with the infrastructure.
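To make the metaphor concrete, here is a minimal sketch of the pattern in Python. Everything in it is made up for illustration - the class names, the platforms, the pretend provisioning calls are not any real product's API - but it shows the shape: one management layer in front, one driver per infrastructure "language" behind.

# Hypothetical "polyglot management" sketch: names are illustrative only.
from abc import ABC, abstractmethod

class PlatformDriver(ABC):
    """Speaks one platform's native dialect on behalf of the manager."""

    @abstractmethod
    def provision(self, spec: dict) -> str: ...

class MainframeDriver(PlatformDriver):
    def provision(self, spec: dict) -> str:
        # A real driver might submit work through z/OS automation interfaces.
        return "LPAR-" + spec["name"]

class PublicCloudDriver(PlatformDriver):
    def provision(self, spec: dict) -> str:
        # A real driver would call the cloud provider's REST API.
        return "vm-" + spec["name"]

class PolyglotManager:
    """One request format in, many infrastructure dialects out."""

    def __init__(self, drivers: dict):
        self.drivers = drivers

    def provision(self, platform: str, spec: dict) -> str:
        return self.drivers[platform].provision(spec)

manager = PolyglotManager({"mainframe": MainframeDriver(),
                           "cloud": PublicCloudDriver()})
print(manager.provision("cloud", {"name": "web01"}))  # -> vm-web01

Adding a new platform means writing a new driver, not rewriting the management layer - which is exactly the flexibility the polyglot approach is after.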

That same customer meeting confirmed to me that the polyglot cloud is here to stay. The meeting was with a customer of many years' standing, a bank with a large mainframe footprint as well as distributed systems. The bank's IT team had always tried to consolidate and rationalise their infrastructure, limiting vendors and platforms, ideally to a single choice. Their initial approaches to cloud computing were based on this same model: pick one option and roll it out everywhere.

Over time and after discussions with both existing suppliers and potential new ones, the CTO realised that this approach would not work. The bank would still try to limit the number of platforms, but now they are thinking in terms of two to three core platforms, with the potential for short-term use of other platforms on a project basis.

When a team so committed to consolidation adopts the heterogeneous, polyglot vision, I think it's safe to say that it's a reality. They have come down from their walls and are moving around, talking to citizens/users and building a more flexible structure that can take them all into the future.

This is what is happening in 2014. Cloud is fading into the background because it’s everywhere. It's just... computing.


Image by Kelly Sikkema via Unsplash

Feeping creatures

If you’ve ever found yourself alone in a datacenter, with the white noise of the cooling fans lulling you into complacency, you may have caught a movement out of the corner of your eye, or thought you heard something go "feep" back in the rows and racks of machinery. You didn’t dream it; there are creatures back there, wandering around and occasionally feeping softly to each other.


From the Feeping Creatures Tumblr

Actually, the truth is far more prosaic. "Feeping creatures" is just a spoonerism for "creeping features", a disease of software products where they sprout features for no good reason. This would not normally be a problem, except for the pesky relationship between code and bugs. Basically, the more features and code you put in, the more bugs you will have.2

If debugging is the process of removing software bugs, then programming must be the process of putting them in. -- Edsger Dijkstra

However, even assuming your code is perfect, with no bugs whatsoever, the rapidly multiplying creatures, all feeping away madly to each other, are probably not doing your users any good. Implementing features that users don’t want or can’t use is a bad idea even if those features are implemented correctly.

Without requirements or design, programming is the art of adding bugs to an empty text file. -- Louis Srygley

Users by and large don’t want features. I’m a nerd, and I geek out on new features, but most people are not like me.1 Most people don’t know how to find geotags in Instagram, so they ask for locations in the comments. My own wife sits watching YouTube videos on her phone screen or in a default-sized window, rather than full-screening them or streaming them to the big TV with AirPlay. She’s not multi-tasking; she just can’t be bothered to start messing around with features.

The same thing happens with enterprise tools. These are famously infested with hordes of creatures, their incessant feeping deafening users. In fact, people have to be forced to use these tools, and if there is any workaround, they will take it like a shot.


If people can’t find the feature or use it once they have found it, it’s not useful. Get rid of it and/or figure out another approach.

People have a certain model of what they want to get done with your tool. Since it’s rather unlikely that the models will be the same from one person to the next, one of two things will have to change: either the model they have in their heads, or the usage model for the tool you built.

Guess what? Almost everyone will try to figure out a way to use your tool in a way that makes sense for them. Hardly anyone will Read The Fine Manual, or watch the video walkthrough you made, or access the wiki. Most will try to bash your tool into submission, and in the process develop a hatred for you and your tool that will last for years after they have moved to another job to get away from it.


Adding features may feel like you’re helping the users, but it’s not. Make a tool that does one thing well - and make sure that one thing is what your users wanted, and that they can actually get it done.


  1. This is only one of many reasons why I stay the heck away from working on consumer products. 

  2. More on the bugs per line of code ratio. I’ve seen this first-hand: remember, I used to work for a software-testing outfit, and still have many friends in that industry. 

Adapt and evolve

A couple of days ago I was on the panel for a Hangout with Mark Thiele, of Switch fame. We had an interesting and wide-ranging chat, but Mark's answer to the last question stuck with me. The question was what advice he would give to new graduates, or more generally to young people contemplating a career in IT. His answer was long and considered, but if I had to distil it into a single word, that word would be adaptability.

The idea of adaptability resonates very strongly with me. In my own career, I have had various very different jobs, and I don't expect to have the same job ten years from now. In fact, I fully expect that the job I will have ten years from now does not even exist today, at least in that form! This is the world we live in nowadays, and while IT may be at the bleeding edge of the change, it is coming to all of us. Nobody can assume that they will go to university, study a topic, and work in that narrow field from graduation to retirement. Our way of working is changing, and the pace of change is accelerating.

Perhaps unsurprisingly, some of the stiffest resistance to these changes comes from the field most outsiders would expect to be leading the charge. Many IT people do not like the changes that are coming to the nice cozy world of the datacenter - or, well, the frosty world of the datacenter, unless the cooling system has failed…

I used to be a sysadmin, but that was more than a decade ago. The skills I had then are obsolete, and the platforms I worked on are no more. Some of the habits I formed then are still with me, but their applications have evolved over time. Since then, I've done tech support, both front-line and L2. I've done pre-sales, I've done training, I've done sales, and now I'm in marketing. Each time I changed jobs, my role changed and so did what was expected of me. This is the new normal.

In the last few years, my various roles have had one thing in common: I have been travelling around, showing IT people new technology that can make their jobs better and trying to persuade them to adopt it. My employers have always had competition, of course, but far and away the most dangerous competitor was what a previous boss used to call the Do Nothing Corporation: the status quo. People would say things like "we're doing fine now" or "we don't need anything new", all while the users were beating at the doors of the datacenter with their fists in a combination of frustration and supplication.

This is not your grandfather's IT

Every time you have to execute a task yourself, you've lost. Your goal is to automate yourself out of ever having to do anything manually. Your job is not to install a system or configure a device, to set up a monitor or to tail a log file; that's not what you were hired for. Your job is to make sure that users have what they need to do their jobs, and that they can get access to it quickly and easily.
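To put some flesh on that, here is a toy example of the mindset, in Python; the config file path and settings are entirely invented. Instead of clicking through the same setup by hand, you describe the desired state once and let an idempotent script converge on it.

# Toy sketch: declare the desired state, converge on it, safe to re-run.
import json
from pathlib import Path

DESIRED = {"max_connections": 200, "log_level": "info"}
CONFIG = Path("/tmp/example-service.json")  # stand-in for a real config file

def converge(path: Path, desired: dict) -> bool:
    """Write the desired config only if it differs; return True if changed."""
    current = json.loads(path.read_text()) if path.exists() else {}
    if current == desired:
        return False  # already correct: nothing to do, and nothing to break
    path.write_text(json.dumps(desired, indent=2))
    return True

if __name__ == "__main__":
    print("updated" if converge(CONFIG, DESIRED) else "already in desired state")

Run something like this from cron or a CI job and the manual step disappears; a human only gets involved when the desired state itself needs to change.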

This might mean that you have to let go of doing it all yourself - and that's fine. The WOPR was a big, impressive piece of kit, but it was a prop. If only half of "your" IT runs in your datacenter, and the rest is off in the cloud somewhere… well, as long as the users are happy and the business objectives are being met, you're ahead of the game!

Right now there is a certain amount of disillusionment with all this cloud nonsense. According to Gartner, we're about half-way down the slope from the Peak of Inflated Expectations to the Trough of Disillusionment. Much of this disillusionment stems partly from those inflated expectations, built on some overheated rhetoric by cloud boosters, and partly from a refusal to accept the changes that are needed.


This is what it looks like if you try to treat the cloud like the same old IT

Of course cloud is built on servers and hypervisors and storage arrays and routers and switches and firewalls and all the rest of it, and we forget that at our peril. What makes the cloud different, and forces a long-overdue change in how IT works, is the expectation on the part of users. People expect their IT to be instantly available, to work nicely with what they already have - and with what they will add in the future - and to make it easy to understand costs. This is where IT can add value. Provisioning a server - that's a solved problem. That's not even table stakes; that's walking-into-the-casino stakes.

It's not so bad. Join the evolution!

All Software Sucks

It is a truism in high-tech that All Software Sucks. (There is an equally valid corollary that All Hardware Sucks.) The usual reason for restating this universally accepted truth is to deflate someone who is lapsing into advocacy of one platform over another. There is a deeper truth here, though, and it's about who builds the software, who for, and why.

Many open-source projects have terrible interfaces, major usability issues, and appearances that only a mother could love. This is because the creators by and large did not get into the project for that part; they wanted a tool that would perform a specific, often quite technical, task for them, and assembled a band of like-minded enthusiasts to work on the project in their off-hours.

This is great when the outcome of the project is something like the Linux kernel, which is safely hidden away from users who might cut themselves on its sharp edges. Problems start to occur when the band of hackers try to build something closer to the everyday surface that people see and use. There must be a thousand Linux desktop environments around when you count the combinations, and while each of them suits somebody's needs, they are otherwise pretty uniformly horrible to look at and inconsistent to use.

The same applies to enterprise software, but for a slightly different reason. Cynics have long suggested that the problem with enterprise software is that the buyer is not the user: purchasing departments and CIOs don't have to live with the consequences of their decisions. While I don't doubt that some of this goes on, in my experience both groups of people are usually trying to do their best. The problem is with the selection process itself.

Who specs enterprise software? Until very recently, only the IT department, and even now they still do most of it. Much like open-source hackers, they draw up the specification based on their own needs, drivers, and experience. These days, though, more and more people within the enterprise are doing more and more with the software tools, and they expect even more. Gone are the days of sending a supplication through inter-office mail to the high tower of IT. People have become used to self-service systems with attractive interfaces in their personal lives, and they expect the same at work.

Enterprise IT departments are struggling to adapt to this brave new world:

  • Their security policies are crafted around the concept of a perimeter, but that perimeter no longer exists. Personally-owned devices and corporate devices used for personal purposes have fuzzed the edges, and great chunks of the business process and even the infrastructure now live outside the firewall.

  • Their operational procedures are based on the idea that only a small band of experts who all work together and understand each other will work on the infrastructure, but more and more of the world doesn't work that way any more. Whether it's self-service changes for users, shared infrastructure that IT does not have full control over, developers doing their own thing, or even the infrastructure changing itself through automation, there is simply no way for every change to be known, in advance or indeed at all.

  • Their focus on IT is narrow and does not encompass all the ways people are interacting with their devices and systems. In fact, the IT department itself is often divided into groups responsible for different layers, so that an overall view even of the purely technical aspects is difficult to achieve.

This is important right now because enterprise IT departments are looking at a phase change, and many are failing or shying away from the new challenges. These days I am working on cloud management platforms, which are at the intersection of all of these issues. Too many of these projects take too long, fail to achieve all of their objectives, or never even get off the ground.

How does this happen?

The reasons for these failures are exactly what I described above. Here are a few real-life examples (names have been removed to protect the guilty).

The CIO of a large corporation in the energy and resource sector walked me through his cloud roadmap. The roadmap had been developed with the advice of a large consultancy, and was ambitious, long-term, and complete - except for one thing. After he had finished, I asked him who he expected the users and use cases to be. His answer was: "The IT department, of course!" Noting my flabbergasted expression, he added: "Why, what else are they going to do?" What else, indeed? As far as I know, that roadmap has never made it beyond the CIO's PowerPoint slides.

A major international bank did begin its cloud project, and implemented everything successfully. New, state-of-the-art hardware was procured, all the infrastructure and management software was installed, everyone congratulated each other, and the system was declared open for business. A few months later, the good cheer had evaporated, as usage of the system was far below projections: a few tens of requests per month, instead of the expected several hundred. It seems that nobody had thought to ask the users what they needed, or to explain how the new system could help them achieve it.

Something even worse happened to a big European telco. The cloud platform was specified, architected, evaluated, selected, and implemented according to a strict roadmap that had been built jointly by the telco’s own in-house architects and consultants from a big-name firm. Soon after the first go-live milestone, though, we all realised that utilisation was well below projections, just as in the case of the bank above.

As it happened, though, I was also talking to a different group within the company. A team of developers needed a way to test their product at scale before releasing it, and were struggling to get hold of the required infrastructure. This seemed to me like a match made in heaven, so I brokered a meeting between the two groups.

To cut a long story short, the meeting was a complete train wreck. The developers needed to provision "full stack" services: not just the bare VM, but several software components above that. They also needed to configure both the software components and the network elements in between to make sure all the bits & pieces of their system were talking to each other. All of this was right in the brochure for the technology we had installed - but the architects flatly refused to countenance the possibility of letting developers provision anything but bare VMs, saying that full-stack provisioning was still nine months out according to their roadmap.

That project managed to stagger on for a while, but eventually died quietly in a corner. I think it peaked at 600 simultaneous VMs, which is of course nothing in the cloud.

What is the lesson of these three stories?

The successful projects and products are the ones where people are not just designing and building a tool for themselves, but for a wide group of users. This is a fundamentally different approach, especially for enterprise IT, but it is necessary if IT is to survive.

So what do we do now?

If you are in enterprise IT, the new cloud services that users are asking for or even adopting on their own are not your competition. If someone out there can offer a better service more cheaply than you can operate it in house, that's great; one less headache for you. Your job is not to be engaged in the fulfilment of each request - because that makes you the bottleneck in the process. That sort of thinking is why IT is so often known as "the department of No". Instead, focus on making sure that each request is fulfilled, on time, on spec, and on budget.

If you are selling to enterprise IT, help them along this road. Talk to the users, and share your findings back with the IT department. This way everybody wins.

Talk to the users, they'll tell you what the problem is. Have no doubt about that.

Missing the point

Another day, another misguided article claiming that "bad attitudes to BYOD put off prospective employees". At least this time they missed out the hitherto obligatory reference to "millennials", whatever more-than-usually-misleading category that might be.

Look, the issue is rarely with BYOD as such. If you're as entitled a know-it-all as to make your employment choices based on whether your prospective employer will let you spend your own money on work technology, there is no help for you. Plenty of companies, my own sainted employer included, offer company-issued Macs and iPhones as optional alternatives to Dells and Blackberries. Wouldn't that be a better trait to look out for?

The problem people have with anti-BYOD policies is that they're generally the tip of an iceberg of bad policy and straitjacketed thinking. Companies that ban BYOD are not far from whitelisting approved executables, restricting admin privileges to users with a "valid and documented reason" for having that access, configuring ridiculously restrictive content firewalls, and so on and so forth.

Others have already explained in depth why BYOD is a symptom of unhealthy IT practices. In fact, the BYODers are arguably doing the company a favour by identifying problem areas. As I had occasion to say on Twitter, users interpret bad IT policies as damage and route around them.

BYOD just happens to be the latest buzzword on which people can hang their Dilbertian complaints, but reversing that one clause would not fix the problem. In fact, a greater worry is a future in which everyone is required to purchase and maintain IT equipment for work use at their own expense. I might be able to do this now, and in fact I did Spend My Own Money and bought myself an 18-month reprieve from lugging the monster Dull around, but I certainly couldn't have afforded to do that when I started out in my career - at least, not without cutting into other areas of my budget, like food.

Stick to the important concerns. BYOD will fix itself, if all the other pieces are in place.

Stop doing that.

Generally I prefer apps on my iDevices to web pages or "web apps". I like the offline access to historical data, I like the streamlined navigation, and I like the fact that interesting navigational concepts don't kill Safari with megabytes of JavaScript and CSS.

There is one thing that I hate about apps: they all insist on opening web pages inside the app.

Don't do that, not even if John Gruber likes it.

For one thing, Safari has all the cookies, and I don't want to log in to things all over again just because I tapped on a link in an app rather than going through the browser. For another, Reader mode only works in real Safari, not embedded Safari. Finally, all my useful bookmarklets are also only available in Safari; things like "Save to Instapaper", for instance. Even Flipboard, possibly my very favourite iPad app, does this: if you're reading something and you want to bookmark it so that it will persist after you close Flipboard, you have to first "View on Web" and then "Open in Safari". At least these days you can "Read Later" directly from Flipboard without having to back all the way out to Safari, but waiting for the developers of other apps to adopt your service is a major stumbling block for new, useful apps.

Images are fine inline, but complete web pages should go to Safari, full stop.

A change will be needed in how Safari manages tabs for this to work. Either it needs a limitless number of tabs, to be managed like the iOS app list, or it needs a warning when opening a new tab will cause an existing one to be closed.

One other feature I want for iOS 7 is a central router for URLs, so that for instance everything to do with twitter.com gets sent to the Twitter app, no matter where it comes from. Some app developers seem to be on board with this idea; twitter.com now displays a bar along the top of the page offering to open the current view in the native Twitter app, but Google+ and Facebook don't. This leads to the sort of idiocy we see in this screenshot, where clicking on a link in the Google+ iPad app spawns an embedded browser which does not have my G+ ackles.

[Screenshot: Google+ login page, in Italian]

No, grazie.
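For what it's worth, the routing idea is not complicated. Here is a back-of-the-envelope sketch in Python - the domains and app names are just examples, and this is of course not any real iOS API - showing the sort of table-driven dispatch I have in mind:

from urllib.parse import urlparse

# Hypothetical routing table: domain -> preferred handler.
ROUTES = {
    "twitter.com": "Twitter app",
    "plus.google.com": "Google+ app",
    "facebook.com": "Facebook app",
}

def route(url: str) -> str:
    host = (urlparse(url).hostname or "").lower()
    for domain, handler in ROUTES.items():
        if host == domain or host.endswith("." + domain):
            return handler
    return "Safari"  # default: hand the link to the real browser

print(route("https://twitter.com/somebody/status/123"))  # -> Twitter app
print(route("https://example.org/article"))              # -> Safari

Anything the table doesn't claim falls through to Safari, which is exactly where it should go.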

That Italian-language login page then triggers another rant of mine, because Google in their wisdom send you all their content in the local language of wherever their geo-IP code thinks you're located, instead of, oh, for instance, respecting HTTP Accept-Language headers.
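For the record, honouring that header is not rocket science. Here is a deliberately minimal Python sketch (it ignores wildcards and some header edge cases, and the supported-language list is just an example) of picking a language from Accept-Language instead of from geo-IP:

def pick_language(accept_language: str, supported: list, default: str = "en") -> str:
    """Return the best supported language according to the header's q-values."""
    prefs = []
    for part in accept_language.split(","):
        piece = part.strip()
        if not piece:
            continue
        lang, _, q = piece.partition(";q=")
        try:
            weight = float(q) if q else 1.0  # no q-value means full weight
        except ValueError:
            weight = 0.0
        prefs.append((weight, lang.strip().split("-")[0].lower()))
    for _, lang in sorted(prefs, key=lambda p: p[0], reverse=True):
        if lang in supported:
            return lang
    return default

# A browser asking for English first and Italian as a fallback:
print(pick_language("en-GB,en;q=0.8,it;q=0.5", supported=["it", "ar", "en"]))  # -> en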

At least Google seem to have fixed another pet peeve of mine, where the menu with all the different language options was itself localised. While one of the less-publicised benefits of a classical education is the ability to identify Αγγλικα in the menu when browsing from a beach bar somewhere in the Cyclades, this works less well in Riyadh or Bangkok.

[Screenshot: Google's language-selection menu]

Go on, now find English.

Nowadays there's a nice "Google is also available in English" popup pretty much everywhere, so there's less call for appending /ncr to Google URLs. Progress, finally!

The good news is that things are moving in the right direction, as we can see in the examples of Flipboard and Google, but if Daring Fireball is issuing plaudits for apps that reinvent their own wheel^W browser, maybe continued progress is not a given.

I brought my device for me, not for you!

Some of the hottest topics right now are Mobile Device Management (MDM), and Bring Your Own Device (BYOD). BYOD was memorably redefined by Vittorio Viarengo on stage at VMworld 2012 as SYOM, which stands for Spend Your Own Money.

[Photo: Vittorio Viarengo presenting "SYOM" at VMworld 2012]

Today however I want to talk about the intersection of those two topics. BYOD is not new; even before laptops were a general-issue item, I was building unofficial machines at work out of scavenged parts to run Linux on. Plenty of people brought their own machines from home, even back then when it was a pretty major logistics challenge.

Techies could not easily be prevented from doing things like reinstalling their corporate-issued devices or adding unofficial devices to the network because they were often the same people who were in charge of enforcing any rules. In other words, they either had the root password to do their jobs, or they were the drinking buddies of the people who did. Since installing Linux on a repurposed desktop was probably the absolute least amount of mischief these people could get up to with that sort of access, since they knew how to stay much safer than average users even on unofficial systems, and since far from interfering with their jobs, all this often made for happier and sometimes even more productive techies, the Powers That Be tended to turn a blind eye.

I was into it before it was cool

With the arrival of devices as light and as simple as iPads and iPhones, this behaviour has moved from being something a few techies might do to become something anyone might do. I still remember the day my mother, a woman who would invite me to lunch just so she could dictate a few e-mails to me and get me to format and print out some bills for her, asked me a question about her iPad, and I gradually understood that she had upgraded the thing to iOS6 on her own, completely unaided. Until then, I would have stated with confidence that my mother was about as likely to update an operating system as to take up competitive unicycling. This was something different, opening up new capabilities to a very different audience.

With that change in audience came a marked change in attitudes. Suddenly BYOD was visible, because people would show up to meetings with iPads or flagrantly non-company-issue MacBook Airs (yes, that would be me), and so suddenly it was a Problem.

In the Enterprise world, for every Problem there is a Solution, or sometimes a Suite. However most of these Solutions are very short-sighted. The whole reason I went out and Spent My Own Money on a MacBook Air when my employer had bought me a perfectly good Dull was that a) the Air weighed about as much as one of the Dull's hinges, so I could actually carry it without one shoulder ending up lower than the other like some fakir, and b) the Air was not weighted down with all the Security Solutions that meant the (quite powerful) Dull took half an hour from boot to when I could actually use it.

Forcing employees - the most dedicated employees, the ones who spend their own money in order to do their job better - to place that yoke back on their own necks is like the cliché of the drunk looking for his keys under the streetlight, even though he lost them somewhere else, because "that's where the light is". The problem isn't my unofficial MacBook, the problem is that my corporate-issue laptop is unusable.

Thou shalt not have a life

MDM applies specifically to mobile devices, such as iPhones or iPads. Many of these are also brought in by employees, although that tide is beginning to turn as the Blackberry loses its grip on enterprise mobile customers. The problem is that where controls on an open device like a laptop can be fairly fine-grained (write-protect this directory, block that port, prevent services from starting, and so on), on phones the controls are much coarser. Often they are limited to preventing installation of particular apps entirely.

Phones and tablets are even more personal than laptops. Just because I volunteer to read my business e-mail on a device, you want to prevent me from installing Dropbox and sharing pictures with my family? No thank you!


Split personality

One solution which has been proposed is to have "personas" on the device, so that at work the phone goes into work mode and only lets you do work things, and at home it locks up all the work content and lets you do personal things. The problem is that we don't live our lives that way any more. My Twitter feed is about one-third work ("look at my company's cool product"), one-third personal ("guess where I am this week!"), and one-third mixed (friends met through work, professional conversations spilling over into conversations about beer, and so on). Should Twitter be blocked, filtered, or left alone? What about Facebook? Hey, what about Foursquare? Isn't it a security risk to know which customers' offices I'm visiting, or which colleagues I'm travelling with?

Get off!


Look, very little of what I do has wider relevance. If a competitor were to get hold of the sort of documents I might be likely to put in my Dropbox, I seriously doubt it would have any effect whatsoever on their planning. The worst thing that might happen if my entire laptop got leaked is that some customers get annoyed because their name gets associated in public with my employer's without their approval, or perhaps some analysts get miffed because information got out before their turn in the briefing schedule. Our stock price would be completely unaffected.

There are maybe a dozen people in the typical company with access to that type of data - M&A plans, that sort of thing - and a few more who hold data that, while not sensitive in itself, is legally protected - personal data on employees or customers, material that is subject to shareholder or SEC disclosure rules - but there are few enough of these people that they can be handled as an exception, without getting in everybody else's way.

There's an old joke about a CEO meeting his CIO and CFO. The CIO is asking for more budget for training his staff. The CFO asks: "What if we train all our staff, and then they leave?". The CIO shoots back: "What if we don't - and they don't?".

Treat your employees like adults, and most of them will behave like adults. The ones who won't will figure out ways over, around or under any walls you care to erect, so you might as well empower the good eggs instead of annoying them.

Windows 8 fun

Since I want to be able to disparage it scientifically and with knowledge aforethought, I downloaded the Windows 8 Consumer Preview.

First off, and already quite questionable: the first download users are offered is actually an installer which will run on a live Windows 7 install and irreversibly upgrade it to Windows 8 CP. Excuse me? Why in the name of Great Cthulhu would I ever want to do that? Fortunately it's also possible to get ISOs, so that is what I did.

The free VMware Player makes it super easy to try out a different OS for a while. I was already using it to power my nostalgia trip into BeOS (you can actually download a pre-configured image from here - how cool is that?).

Once the ISO is downloaded, Windows 8 installs pretty quickly. It wants a Windows Live! account for the default login, so I used mine and signed in. At this point you get the now-famous Metro UI, with all its various tiles. Behind this there is actually a normal Windows desktop, which apart from the absence of a Start menu would not be too unfamiliar even from Windows 95. The Metro tiles replace the Start menu, or at least its launcher functions, but more on that later.

I decided to spend my time in Metro, since that is the main innovation in Windows 8 as far as I can tell. It quickly becomes apparent that this is by no means a desktop UX model, though. On a 26" monitor running at 1920x1200 pixels, the wasted screen real estate is horrific. I usually spend most of my customization effort in trying to cram more information onto the screen at any given time. Metro, with its drive to full-screen everything, is the antithesis of that model.

The apps look nice enough, although it took me a moment to figure out that a lot of functions require the "charm key" (Windows+C). Just pressing the Windows key takes you back to the Metro tiles, while a lot of app customization options, not to mention features like the control panel, only appear after pressing that "charm key" combination.

[Screenshot: the Metro Mail app]

One annoyance is that IE is not the same in Metro as in the old-style desktop. For instance, Flash works on the desktop, but not in the full-screen Metro IE. This sort of minor inconsistency is par for the course from Microsoft, but it shows the pitfalls inherent in trying to create a unified user experience.

[Screenshot: the Metro People app]

The main reaction to all this is "meh". Why bother with this thing? We live in a world which includes the MacOS and all manner of X-based Linux desktop environments for desktop PCs, not to mention of course Microsoft's own efforts in that direction. On the tablet side, there's iOS and Android, both doing a pretty decent job already. What extra sauce does Windows 8 bring to the party?

Leaving aside Metro, which all desktop users will be eager to do after about five minutes, this is a Windows 7 service pack, if that. Metro might be a good fit for tablets, but not if it has Windows Vista crouching in its guts the way this preview does. Installing any of the millions of existing Windows apps sounds great in theory, but what actually happens is that each app creates several new tiles: the app itself, the uninstall icon, and the various other "helpful" icons that all apps seem mandated to install these days. It's one thing when this cruft is hidden several levels down the start menu, but if it's all going to be out in the open, things are going to get messy fast.

[Screenshot: the Windows 8 desktop]

And if Windows 8 doesn't include the Windows desktop everywhere, then what's the point of unifying the desktop and tablet interfaces? A mouse pointer has an enormously greater level of precision than fingers, so any interaction model which has to cater to both will end up being compromised.

Tablets and desktops have different form factors and use cases. While there is a case to be made for some convergence, as Apple is showing with increased sharing between iOS and MacOS, I think Windows 8 takes that convergence a bit too far, and the result is half-baked. It's a pity, because I do like the Metro UI - just not on my desktop PC.