Normally I really enjoy getting as much information as I can about a new Apple product, but the iPad is different. For me two things stand out – the price (at least in the US – no UK-specific pricing yet) and the fact that computing has just taken its first step along the path towards no-IT-support-necessary computing appliances.
I can see the iPad taking off as a cheap and easy way for businesses to keep employees connected and productive when out and about. Having a totally closed operating system on a phone seems just fine – no real impact on the rest of the desktop / laptop / netbook market (the basis of my argument). By simply increasing the screen size but keeping the closed operating system, you have removed the need for IT support – or at least the ability for you or anyone else to dig in and sort out a problem.

Basically it comes down to whether you have a wifi or 3G connection. No internet means the iPad loses its principal attraction of practical connectivity – and there’s nothing you can do except go somewhere with a better signal or double-check the logon details you have been given. No use calling IT; they can’t help – they have no more access to the operating system than you do. You wouldn’t call IT because you have a bad signal on your phone (although it does happen occasionally…), so why with a “tablet” machine? But then, with the coming of 4G networks over the next few years, even connectivity on the move should undergo a revolution.
The iPad is a computer appliance rather than a computer – closed in both hardware and software – just like the iPhone is more computer than phone. Although this is a challenge for the whole of IT support (there will be other machines that come along, on different platforms – and what’s wrong with making computing more straightforward, even just plain simple?), those of us who work principally on the Apple platform (i.e. me) need to make sure we offer more than just support.
And that’s Apple’s big achievement this time round…
UPDATE: Just saw this from John Gruber, summing it up in automotive fashion:
Used to be that to drive a car, you, the driver, needed to operate a clutch pedal and gear shifter and manually change gears for the transmission as you accelerated and decelerated. Then came the automatic transmission. With an automatic, the transmission is entirely abstracted away. The clutch is gone. To go faster, you just press harder on the gas pedal.
That’s where Apple is taking computing. A car with an automatic transmission still shifts gears; the driver just doesn’t need to know about it. A computer running iPhone OS still has a hierarchical file system; the user just never sees it.
That’s not to say there aren’t trade-offs involved. Car enthusiasts (and genuine experts like race car drivers) still drive cars with manual transmissions. They offer more control; they’re more efficient. But the vast majority of cars sold today are automatics. So too it’ll be with computers. Eventually, the vast majority will be like the iPad in terms of the degree to which the underlying computer is abstracted away. Manual computers, like the Mac and Windows PCs, will slowly shift from the standard to the niche, something of interest only to experts and enthusiasts and developers.
UPDATE 2: Dave Winer raises the spectre of the end of the Mac, amongst other thoughts on the iPad:
At some point Steve is going to get up on stage and tell us it’s the end of the road for the Mac, because the iPad/iPhone OS has sucked all the energy from the Mac.
UPDATE 3: An excellent comparison of “Old World” and “New World” computing – worth reading the whole thing, but here is a taste:
In the New World, computers are task-centric. We are reading email, browsing the web, playing a game, but not all at once. Applications are sandboxed, then moats dug around the sandboxes, and then barbed wire placed around the moats. As a direct result, New World computers do not need virus scanners, their batteries last longer, and they rarely crash, but their users have lost a degree of freedom. New World computers have unprecedented ease of use, and benefit from decades of research into human-computer interaction. They are immediately understandable, fast, stable, and laser-focused on the 80% of the famous 80/20 rule.
Is the New World better than the Old World? Nothing’s ever simply black or white.
The reason I’m starting to think the Old World is ultimately doomed is because we are bracketed on both sides by the New World, and those people being born today, post-iPhone and post-iPad, will never know (and probably not care) about how things used to work. Just as nobody today cares about floppies, and nobody has to care about manual transmissions if they don’t want to.
…what do you think is more important? An easy-to-use, crash-proof device? Or a massively complex tangle of toolbars, menus, and windows because that’s what props up an entrenched software oligarchy?
The question is not “will the desktop metaphor go away?” The question is “why has it taken this long for the desktop metaphor to go away?”
It looks like IBM’s focus on Smart technologies has distinctly helped its bottom line. The latest IBM investor webcast is probably the driest, most sterile thing I’ve ever listened to, but check out the first 10 minutes. Also, Big Sam is quoted in the press release:
In 2009, we invested in opportunities such as Smarter Planet solutions, cloud computing and advanced analytics. These new capabilities position IBM to grow as the economy recovers. The increased operational leverage we have established by creating a globally integrated enterprise will enable us to drive greater profits as revenue growth returns. We are confident about 2010 and our ability to achieve the high end of our long-term roadmap.
This Week in Google does it again – the best discussion yet of Google’s threat to pull out of China after the attacks on its servers. The Chinese government stands accused, even if there is no smoking-gun proof… The show’s guest, Siva Vaidhyanathan, puts his points across brilliantly.
• This Week in Google: The People’s Republic of Google
UPDATE: Eric Schmidt says that Google wants to stay in China…
“We like the Chinese people, we like our Chinese employees. We like the business opportunities there, but we’d like to do that on somewhat different terms than we have but we remain quite committed to being there.”
Mr Schmidt refused to provide any further details, beyond confirming that Google was “in conversation” with the Chinese government. However, sources at Google said there were no plans to backtrack on its stand on censorship.
I’d not heard of this Wolfram brother, so it was interesting to hear him talk (very eloquently) about his company, Wolfram Research, and its products Wolfram Alpha and Mathematica. One of the things on my list for this year is to check out Mathematica in a lot more detail…
• Nodalities: In conversation with Conrad Wolfram
Also, here he is at a recent TED conference, speaking about the way that maths is (and should be) taught in schools:
He’s a bit defensive and a little uneasy, but he has some good points to make.
I’ve been looking for good predictions of what 2010 will bring, and I’ve found a great one via @timoreilly:
• J.D. Meier’s Blog: Trends for 2010
Plenty to think about.
Any other predictions that I find worth checking out will be linked to from Spare Cycles’ Delicious list for 2010…
Looking back, 2009 was the year that I clicked with the idea of Smart technologies – analysis of data to make better decisions and predictions, preferably in (near) real time, from a range of sources but principally large data sets and sensors.
It built on a Wired article and a book that I’d read in 2008, but only really kicked in with the arrival of the IBM Smarter Planet advertising campaign halfway through the year. I understand that the campaign is designed to sell IBM kit and services, but there is definitely momentum to the idea (look up things like “smart grid” or “the internet of things”), and not just from Big Blue.
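To give a flavour of what that means at the smallest possible scale, here is a minimal Python sketch of the general idea – watching a stream of sensor readings in near real time and flagging anomalies. Everything in it (the readings, the window size, the threshold) is hypothetical; a real smart-grid system does this across millions of meters with far more sophisticated models.

    import random
    import statistics
    from collections import deque

    WINDOW = 20       # number of recent readings to compare against
    THRESHOLD = 3.0   # flag readings more than 3 standard deviations out

    def simulated_readings(n=200):
        """Stand-in for a real sensor feed: mostly steady, with rare spikes."""
        for _ in range(n):
            value = random.gauss(50.0, 2.0)        # normal behaviour
            if random.random() < 0.02:             # occasional fault
                value += random.choice([-1, 1]) * 25.0
            yield value

    def monitor(readings):
        """Flag any reading that strays too far from the recent average."""
        window = deque(maxlen=WINDOW)
        for value in readings:
            if len(window) == WINDOW:
                mean = statistics.mean(window)
                stdev = statistics.stdev(window)
                if stdev > 0 and abs(value - mean) > THRESHOLD * stdev:
                    print(f"ALERT: reading {value:.1f} vs recent mean {mean:.1f}")
            window.append(value)

    monitor(simulated_readings())

Toy stuff, but it captures the shape of the problem: continuous data in, decisions out, no human in the loop.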
One step was to start an Open University course on “Data, computing and information”, which, although demanding on my time, is proving interesting. It is a compulsory course for any computing degree with the OU, so may ultimately lead to something bigger…
2010 will be the year to take these ideas and turn them into something more solid as I get my head around what’s really involved. Apologies if some posts here get distinctly geeky over the course of the next year…