They took the credit for your second symphony
Rewritten by machine on new technology
And now I understand the problems you could see.
Could be this has no predictive value regarding how regular people will think about Windows 8, but it’s an eye-opener regarding the risk Microsoft is taking by making essential UI navigation elements hidden until you hover the mouse in the right spots. People navigate with their eyes, not by scrubbing the screen with the mouse. It’s a few minutes long but worth watching for the payoff at the end.
James Whittaker:
The Google I was passionate about was a technology company that empowered its employees to innovate. The Google I left was an advertising company with a single corporate-mandated focus.
Technically I suppose Google has always been an advertising company, but for the better part of the last three years, it didn’t feel like one. Google was an ad company only in the sense that a good TV show is an ad company: having great content attracts advertisers.
Speaking of pals, my friends and former colleagues Bryan Bell and Chris Morris have just released an excellent free game for the iPhone: Instagram meets Memory. A really fun way to play with Instagram, and the retro Polaroid-inspired UI is just exquisitely well done.
Reuters:
CNN is in talks to buy social media news site Mashable for more than $200 million, according to a source familiar with the discussion.
Going to be hard for anyone to beat my pal Mike Monteiro’s take on this.
Michael Lopp:
Reasonable people are often scared by the new. This is because reasonable people are not Barbarians and they are not hackers. They appreciate the predictable, profitable, and knowable world that comes with a well-defined process, and I would like to thank each and every one of them because these people keep the trains running and on time. No one likes Barbarians because the Barbarian strategy is one at odds with civilization. By definition, a Barbarian, a hacker, is building on a strategy that is at odds with the majority.
It’s awesome.
Brad McCarty, TheNextWeb:
It’s true, today I’m disappointed in Apple. Not because of the iPad thing. I’m pretty impressed with what the company announced today. My disappointment is a matter of something deeper – a sign that Apple gave in to a carrier, rather than standing up for the customers. Anyone with iOS 5.1 and an AT&T iPhone 4S will now see a 4G symbol in certain areas. The only problem? It’s not really 4G, it’s marketing BS from AT&T.
It is bullshit, but when a new iPad is connected via LTE, it says “LTE” up in the status bar, not “4G”, so it’s not like Apple is pretending “4G” and “LTE” are the same thing.
Marco Arment:
Codifying “via” links with confusing symbols is solving the wrong problem.
The London Evening Standard:
Q: What are your goals when setting out to build a new product?
A: Our goals are very simple — to design and make better products. If we can’t make something that is better, we won’t do it.
Not sure I see Twitter’s angle on this one, unless they see Tumblr as a serious competitive threat.
Second prize: three Motorola Droid Xyboard tablets.
Justin Miller of MapBox on the new maps Apple is using in iPhoto for iOS.
iWork.com always seemed a little silly because Apple never got around to making it work with the iWork Mac apps. They’ve come a long way in a few years, though — by this time next year I’d wager that most iWork users will be storing most of their active documents in iCloud.
I was unavailable, but don’t fret: my pal Jim Dalrymple took my place on this week’s episode of The Talk Show. The topic, of course, is the new iPad.
Farhad Manjoo, writing for Slate:
Let’s say you’re Steve Ballmer, Michael Dell, Meg Whitman, Larry Page, or Intel’s Paul Otellini. How are you feeling today, a day after Apple CEO Tim Cook unveiled the new iPad? Are you discounting the device as just an incremental improvement, the same shiny tablet with a better screen and faster cellular access? Or is it possible you had trouble sleeping last night? Did you toss and turn, worrying that Apple’s new device represents a potential knockout punch, a move that will cement its place as the undisputed leader of the biggest, most disruptive new tech market since the advent of the Web browser? Maybe your last few hours have been even worse than that. Perhaps you’re now paralyzed with confusion, fearful that you might be completely boxed in by the iPad — that there seems no good way to beat it.
Exactly right.
My thanks to MacLegion for sponsoring this week’s DF RSS feed to promote their spring 2012 bundle. It’s a great deal: Billings Pro, Kinemac, MoneyWell, Hydra Pro, Circus Ponies NoteBook, GarageSale, Home Inventory, My Living Desktop, App Tamer, and WhatSize all for only $49.99. That’s $800 worth of apps.
No gimmicks, no tricks, and each app is the latest version and includes full upgrade privileges for future updates.
Paula Rooney, writing for ZDNet:
Apple and Microsoft are getting all the ink in the tablet wars these days but no doubt Android tablets will be matching if not outselling iPads within a year or so.
Or so.
Good piece by Christina Warren at Mashable on the implications of Netflix’s “just use your iTunes account to subscribe” setup on the new Apple TV.
Nice explanation by Iljitsch van Beijnum at Ars, on how iTunes 1080p content can look better without doubling the download size from 720p.
Mika Mobile:
From a purely economic perspective, I can no longer legitimize spending time on Android apps, and the new features of the market do nothing to change this.
So great.
Alistair Barr, reporting for Reuters:
Google Inc has been pressuring applications and mobile game developers to use its costlier in-house payment service, Google Wallet, as the Internet search giant tries to emulate the financial success of Apple Inc’s iOS platform.
Google warned several developers in recent months that if they continued to use other payment methods — such as PayPal, Zong and Boku — their apps would be removed from Android Market, now known as Google Play, according to developers, executives and investors in mobile gaming and payment sectors.
Open beats closed, every time.
“Only Apple could deliver this kind of innovation, in such a beautiful, integrated, and easy-to-use way. It’s what we love to do. It’s what we stand for. And across the year, you’re going to see a lot more of this kind of innovation. We are just getting started.”
That was Tim Cook, closing yesterday’s event introducing the new retina display iPad. Here’s the thing: he was right. To pretend otherwise you have to put your head in the sand (or some other hole).
Cook’s remarks may be immodest, but they are not hyperbole. No other company could today produce something like this new iPad. Not at these prices, at these quantities, at a worldwide scope, with a content ecosystem and user experience of the iPad’s quality. Apple is in a league of its own, and the iPad exemplifies it.
Two years after announcing the original iPad, Apple has produced a version that simply blows that original model away in every single regard. It’s faster, it’s thinner, it feels better in hand, it supports LTE networking, and yet battery life is better. The retina display is simply astounding to behold. Eight days from today they’re shipping a product that two years ago would have been impossible at any price, and they’ve made it look easy.
Nothing is guaranteed to last. The future’s uncertain and the end is always near. Apple’s position atop the industry may prove fleeting. But right now, Apple is Secretariat at the Belmont. And the company, to a person, seems hell-bent on not letting any competitor catch up. ★
I couldn’t disagree more strongly with this piece by Jolie O’Dell, starting with her criticism of the closing image of the event, a photo of which accompanies her article. Evoking the classic six-color Apple logo warmed my heart, and the message could not be more clear: Apple is still Apple.
Claim chowder of the day.
I’m happily surprised by this — I expected them to charge more for LTE than they do for 3G.
Ryan Block’s first impressions of the new iPad:
It’s the best display I’ve ever seen. Anywhere, period. And it makes a meaningful difference to the experience — it’s not just a spec.
Yes.
The Onion covers the new iPad.
Bill Holmes, Netflix:
Starting today, you can sign up for Netflix directly on your Apple TV and pay via your iTunes account. Plus, with the new third generation Apple TV, you’ll also be able to watch thousands of hours of great movies and television streaming on Netflix in 1080p high definition and with room-filling Dolby Digital 5.1 audio.
You can sign up for MLB.tv on Apple TV through your iTunes account now, too. All of this is coming to everyone with the previous Apple TV, too — the only difference between the new and previous Apple TV is 1080p.
Correction: Another difference: the new Apple TV supports Bluetooth 4, which means going forward, it might support advanced remote peripherals that the old 720p Apple TV doesn’t.
Jonas Lekevicius on Apple’s decision to call the new iPad just “iPad”, and refer to it in marketing as “the new iPad”:
And here’s a prediction: the next iPhone will simply be “The new iPhone”.
I like that prediction.
Speaking of cool new photo editing and management software.
Unbelievably impressive software. The tools are useful and innovative, the use of touch is both natural and fun, and it’s chock full of nice little touches, like being able to choose which side of the display you want the thumbnails on.
I thought the understatement of the day was at the very end of Randy Ubillos’s demo, when he added, as a mere aside, that iPhoto is a universal binary that runs on the iPhone too.
They sure don’t look like Google Maps to me.
Update 1: But I asked, and was told that the maps data is indeed still from Google Maps.
Update 2: OK, what I’m hearing now is that Places still uses Google Maps, but the maps in Journals and slideshows are not using Google Maps, and are Apple’s own stuff.
Whole thing is worth watching, as usual, but I thought Tim Cook’s wrap-up was especially interesting:
“Only Apple could deliver this kind of innovation, in such a beautiful, integrated, and easy-to-use way. It’s what we love to do. It’s what we stand for. And across the year, you’re going to see a lot more of this kind of innovation. We are just getting started.”
Fraser Speirs:
You’re either buying into a platform or you’re buying gadgets. The fundamental disconnect between the apparently solid Android engineering that’s happening at Google and the actual packaging and deployment that’s happening to end-users is turning into a real problem. To my mind, it’s a dealbreaker for schools or anyone thinking beyond their next carrier subsidy.
Toy Story meets The Shining — perfectly-cast storyboards by Kyle Lambert. (Via Andy Baio.)
Ben Brooks:
Lastly I just think they are largely a copycat business with a free model and a heavy focus on UI design over UX design.
Like Brooks, I’ve never been comfortable with the way they collect money on behalf of publishers. And their app is nowhere near as good as Instapaper.
From zero to the U.S. president’s daily briefing in two years.
Chris Sauve examines iOS version fragmentation:
Some folks have told me that it is unfair to compare iOS and Android on this metric because iOS is effectively just three devices (iPod Touch, iPad, iPhone), whereas Android is a multi-manufacturer ecosystem with dozens of devices. This line of thinking is extremely frustrating to me. Developers and users don’t care that the two platforms aren’t the same. Users want the most recent features and security updates, and will demand them either directly (by complaining) or indirectly (by making a different purchasing decision), and developers want a unified base to minimize testing. Android apologists can list off the differences between the two all day long but it doesn’t change the fact that more versions with smaller share is worse for, at the very least, developers and users.
I expect major-new-version adoption rates for iOS to get even better now that Apple has implemented over-the-air software updates.
Joel Hruska, writing for Hot Hardware, which I swear is not a porno site:
So bits of data are just $10 per GB if you buy 3GB in advance, but $67 per GB if you buy a 300MB plan — and this somehow reflects the reality of a competitive situation, or maps in some reasonable fashion to issues like spectrum usage and bandwidth availability. The goal here is to push 3GB+ users with unlimited plans over to tiered options where they’ll pay at least $40 for that use. If this was truly about keeping the network balanced, AT&T would implement a throttling solution that didn’t choke users by as much as 95% once they exceeded the 3GB threshold. It would also offer data plans that created more reasonable tiers of service. As things stand now, AT&T has a major selling point — if you exceed 300MB a month on the $20 plan, you’ll actually end up paying $40 — $10 more than you’ll pay with that nice, roomy 3GB option.
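For what it’s worth, here’s the per-gigabyte arithmetic behind those figures. The $20-for-300 MB price comes from the passage above; the $30-for-3 GB price is my assumption about the tier being referenced, so treat the numbers as illustrative, not as AT&T’s published rate card:

```typescript
// Effective per-GB cost of the two tiers discussed above. Prices are
// assumptions (see lead-in); 1 GB is treated as 1,000 MB to match the
// quoted figures.
const perGB = (dollars: number, gigabytes: number): number => dollars / gigabytes;

console.log(perGB(20, 0.3).toFixed(2)); // "66.67", roughly the quoted $67 per GB
console.log(perGB(30, 3).toFixed(2));   // "10.00", the quoted $10 per GB
```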
Don’t get me wrong regarding yesterday’s piece on AT&T “unlimited” plan users — AT&T’s data plans have never been fair and the “unlimited” plan was never honest.
Actual search results are falling below the fold.
A week ago, John Battelle wrote a curious response to this Wall Street Journal report about Google circumventing Safari’s (and, notably, Mobile Safari’s) default setting of accepting cookies only from visited websites.
Long story short: Web cookies are small bits of saved data that websites can store in your browser. Cookies are restricted by domain; if example.com stores a cookie in your browser, the only website your browser sends that cookie back to is example.com. But, by default, most desktop web browsers allow “third-party” cookies. That means if a page on example.com loads JavaScript from a different domain, that JavaScript is able to use cookies too. One common use is by ad networks; an ad network can set a cookie and then access that same cookie from any website that uses the same ad network. Google makes use of such cookies to display its ads. Ad networks that use cookies in this manner do so in order to track users across websites.
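To make the third-party case concrete, here’s a minimal sketch, in TypeScript using Node’s built-in http module, of how an ad network’s cookie round-trips. The domain setup and the “uid” cookie are hypothetical; this is not Google’s code, just an illustration of the mechanism described above:

```typescript
// Stand-in for an ad server (hypothetically "ads.example") whose script
// is embedded on many unrelated publisher sites.
import { createServer } from "node:http";

createServer((req, res) => {
  const uid = /uid=([^;]+)/.exec(req.headers.cookie ?? "")?.[1];

  if (uid) {
    // The same uid cookie comes back from every page that embeds this ad
    // network, which is what makes cross-site tracking possible.
    console.log(`seen ${uid} again, via ${req.headers.referer ?? "unknown"}`);
  } else {
    // First visit: assign an identifier. A browser that blocks third-party
    // cookies (Safari's default) simply never stores this in an embedded context.
    const newUID = Math.random().toString(36).slice(2);
    res.setHeader("Set-Cookie", `uid=${newUID}; Path=/; Max-Age=31536000`);
  }

  res.writeHead(200, { "Content-Type": "application/javascript" });
  res.end("/* ad script payload */");
}).listen(8080);
```

Whether that Set-Cookie header gets honored when the request comes from an embedded, third-party context is exactly what the browser’s cookie policy decides.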
All major browsers give the user control over cookie permissions, usually with three options: accept all cookies; accept cookies only from websites you visit (blocking third-party cookies); or never accept cookies.
The difference with Safari is in the default for this setting. Most major browsers default to the first option, allowing all cookies. Safari and Mobile Safari default to the second, allowing only first-party cookies.
What the WSJ discovered is that Google (and a few other ad networks) found a way to store third-party cookies in Safari and Mobile Safari even when the option was set only to accept cookies from visited websites, as it is by default.
That brings us to Battelle’s response, “A Sad State of Internet Affairs: The Journal on Google, Apple, and ‘Privacy’”. (Background information: Battelle is an expert on Google, and is the founder and executive chair of Federated Media, an ad network.)
Battelle writes:
Here’s the lead in the Journal’s story, which requires a login/registration:
“Google Inc. and other advertising companies have been bypassing the privacy settings of millions of people using Apple Inc.’s Web browser on their iPhones and computers — tracking the Web-browsing habits of people who intended for that kind of monitoring to be blocked.”
Now, from what I can tell, the first part of that story is true — Google and many others have figured out ways to get around Apple’s default settings on Safari in iOS — the only browser that comes with iOS, a browser that, in my experience, has never asked me what kind of privacy settings I wanted, nor did it ask if I wanted to share my data with anyone else (I do, it turns out, for any number of perfectly good reasons).
Battelle has a good point here, which is that the Journal’s use of “intended” is too broad a stroke. Some Safari users have deliberately specified their cookie privacy settings, but most (and I’d wager nearly all) have never changed the default, and don’t even know what cookies are. But that’s true for users of all browsers, not just Safari. And it’s true for all settings, not just cookie preferences. Most users don’t change settings and just use the defaults. Default settings are incredibly important.
I can’t recall any browser prompting me about cookie privacy settings before using it.
Apple assumes that I agree with Apple’s point of view on “privacy,” which, I must say, is ridiculous on its face, because the idea of a large corporation (Apple is the largest, in fact) determining in advance what I might want to do with my data is pretty much the opposite of “privacy.”
The difference with Safari isn’t that Apple has made an assumption about the user’s view regarding cookie privacy; it’s that Apple has made a different assumption than that made by other browser vendors.
Then again, Apple decided I hated Flash, too, so I shouldn’t be that surprised, right?
Deciding that iOS would be better without Flash is not the same thing as deciding that all iOS users “hate” Flash.
In short, Apple’s mobile version of Safari broke with common web practice, and as a result, it broke Google’s normal approach to engaging with consumers.
I’d have used “tracking” in place of “engaging with”, but that’s semantics. My quibble is with the notion that Safari “broke with common web practice”. All major browsers have an option to block third-party cookies. And I’ll bet Safari is not the first to block them by default. What’s new is that Safari (a) blocks third-party cookies by default, and (b) is popular and growing (particularly in mobile).
Safari hasn’t broken the web; it has simply broken the heretofore safe assumption that an overwhelming majority of web surfers accepted third-party cookies.
Was Google’s “normal approach” wrong? Well, I suppose that’s a debate worth having — it’s currently standard practice and the backbone of the entire web advertising ecosystem — but the Journal doesn’t bother to go into those details. One can debate whether setting cookies should happen by default — but the fact is, that’s how it’s done on the open web.
Here, I think Battelle falls off the rails. No one is criticizing Google for using third-party tracking cookies in general. No one. What’s being criticized is Google devising and implementing a method to store third-party cookies in web browsers which are set not to accept third-party cookies. It didn’t happen by accident. Google wrote code specifically to circumvent this setting in Safari.
The Journal article does later acknowledge, though not in a way that a reasonable reader would interpret as meaningful, that the mobile version of Safari has “default” (ie not user activated) settings that prevent Google and others (like ad giant WPP) from tracking user behavior the way they do on the “normal” Web. That’s a far cry from the Journal’s lead paragraph, which again, states Google bypassed “the privacy settings of millions of people.” So when is a privacy setting really a privacy setting, I wonder? When Apple makes it so?
Again: we’re all in agreement here that this is a dispute about default settings, not which settings are available for user tweaking. So let’s concede that Battelle has a point that Google didn’t bypass the privacy settings of millions of people so much as they bypassed the privacy settings of millions of web browsers. What Battelle is implicitly arguing here is that it’s OK — or at least not so bad — for Google to bypass browser privacy settings if most users didn’t specify those settings manually. (There’s no way for Google to tell which Safari users block third-party cookies simply by default and which ones block them because they understand what’s going on and have made an explicit choice.)
Since this story has broken, Google has discontinued its practice, making it look even worse, of course.
I’d argue that Google would look worse if they had continued the practice, even after it had been publicized.
But let’s step back a second here and ask: why do you think Apple has made it impossible for advertising-driven companies like Google to execute what are industry standard practices on the open web (dropping cookies and tracking behavior so as to provide relevant services and advertising)? Do you think it’s because Apple cares deeply about your privacy?
Really?
But they haven’t made it impossible. They’ve only changed the default. But the truth is the default setting is all that matters to Google and other immense ad networks, because what matters to them is aggregate user tracking, not individual user tracking.1
I certainly can’t prove that Apple specified this default setting for the sake of user privacy, rather than out of competitive spite against Google.2 But I’m thinking that if you took a thousand random iOS and Mac users, sat them down and explained to them in layman’s terms what browser cookies are and how Google uses them to track their behavior across the web, and then conducted a survey among them as to what Safari’s default cookie privacy setting should be, we’d find out that Apple chose well to break with tradition here.
In this case, what Google and others have done sure sounds wrong — if you’re going to resort to tricking a browser into offering up information designated by default as private, you need to somehow message the user and explain what’s going on.
Sounds wrong, or is wrong?
Then again, in the open web, you don’t have to — most browsers let you set cookies by default.
So Safari isn’t part of the “open web” because it doesn’t allow ad networks to track users across websites by default? Used to be that all major browsers allowed websites to create pop-up (and pop-under) windows for advertising; are browsers that block such pop-ups by default not part of the “open web” as well?
What I detect in Google’s behavior (and Battelle’s more-or-less defense of it) is a sense of entitlement. That because in the past ad networks could track almost all users via cookies, they are entitled to continue tracking almost all users across the web via cookies, even when a large (and growing) number of them begin using a web browser which, by default, tries to prevent it.
Arguing that Google didn’t do anything wrong — or all that wrong — is one thing. But trying to spin this into an argument that Apple has done something wrong, and that Google was just reacting naturally, is something else. ★
The same goes for Flash. Mobile Safari wasn’t the first browser to ship without support for Flash or other media plugins. What made Mobile Safari’s lack of Flash support controversial is that it was the first popular browser to ship without Flash support, and thus the first to disrupt the assumption that “almost all” browsers had Flash support. ↩
It would be interesting to know what this setting defaulted to on the original iPhone back in 2007, when Apple and Google were buddy-buddy and Eric Schmidt even got to come on stage during the iPhone introduction and talk about what great corporate friends Apple and Google were.
Update 1: This iLounge screenshot gallery of iPhone OS 1.0.2 suggests the default setting has always been to disallow third-party cookies in Mobile Safari.
Update 2: I haven’t found any proof I can link to, but several DF readers attest that Safari for Mac debuted in 2003 with the same cookie privacy default setting. This is also reported by Jonathan Mayer, the privacy/security researcher at Stanford who first uncovered Google’s circumvention of Safari’s cookie privacy settings. ↩
“We’re starting to do some things differently,” Phil Schiller said to me.
We were sitting in a comfortable hotel suite in Manhattan just over a week ago. I’d been summoned a few days earlier by Apple PR with the offer of a private “product briefing”. I had no idea heading into the meeting what it was about. I had no idea how it would be conducted. This was new territory for me, and I think, for Apple.
I knew it wasn’t about the iPad 3 — that would get a full-force press event in California. Perhaps new retina display MacBooks, I thought. But that was just a wild guess, and it was wrong. It was about Mac OS X — or, as Apple now calls it almost everywhere, OS X. The meeting was structured and conducted very much like an Apple product announcement event. But instead of an auditorium with a stage and theater seating, it was simply with a couch, a chair, an iMac, and an Apple TV hooked up to a Sony HDTV. And instead of a room full of writers, journalists, and analysts, it was just me, Schiller, and two others from Apple — Brian Croll from product marketing and Bill Evans from PR. (From the outside, at least in my own experience, Apple’s product marketing and PR people are so well-coordinated that it’s hard to discern the difference between the two.)
Handshakes, a few pleasantries, good hot coffee, and then, well, then I got an Apple press event for one. Keynote slides that would have looked perfect had they been projected on stage at Moscone West or the Yerba Buena Center, but instead were shown on a big iMac on a coffee table in front of us. A presentation that started with the day’s focus (“We wanted you here today to talk about OS X”) and a review of the Mac’s success over the past few years (5.2 million Macs sold last quarter; 23 (soon to be 24) consecutive quarters of sales growth exceeding the overall PC industry; tremendous uptake among Mac users of the Mac App Store and the rapid adoption of Lion).
And then the reveal: Mac OS X — sorry, OS X — is going on an iOS-esque one-major-update-per-year development schedule. This year’s update is scheduled for release in the summer, and is ready now for a developer preview release. Its name is Mountain Lion.1
There are many new features, I’m told, but today they’re going to focus on telling me about ten of them. This is just like an Apple event, I keep thinking. Just like with Lion, Mountain Lion is evolving in the direction of the iPad. But, just as with Lion last year, it’s about sharing ideas and concepts with iOS, not sharing the exact same interaction design or code. The words “Windows” and “Microsoft” are never mentioned, but the insinuation is clear: Apple sees a fundamental difference between software for the keyboard-and-mouse-pointer Mac and that for the touchscreen iPad. Mountain Lion is not a step towards a single OS that powers both the Mac and iPad, but rather another in a series of steps toward defining a set of shared concepts, styles, and principles between two fundamentally distinct OSes.
Major new features
iCloud, with an iOS-style easy signup process upon first turning on a new Mac or first logging into a new user account. Mountain Lion wants you to have an iCloud account.
iCloud document storage, and the biggest change to Open and Save dialog boxes in the 28-year history of the Mac. Mac App Store apps effectively have two modes for opening/saving documents: iCloud or the traditional local hierarchical file system. The traditional way is mostly unchanged from Lion (and, really, from all previous versions of Mac OS X). The iCloud way is visually distinctive: it looks like the iPad springboard — linen background, iOS-style one-level-only drag-one-on-top-of-another-to-create-one “folders”. It’s not a replacement of traditional Mac file management and organization. It’s a radically simplified alternative.
Apps have been renamed for cross-OS consistency. iChat is now Messages; iCal is now Calendar; Address Book is now Contacts. Missing apps have been added: Reminders and Notes look like Mac versions of their iOS counterparts. Now that these apps exist for the Mac, to-dos have been removed from Calendar and notes have been removed from Mail, leaving Calendar to simply handle calendaring and Mail to handle email.
The recurring theme: Apple is fighting against cruft — inconsistencies and oddities that have accumulated over the years, which made sense at one point but no longer — like managing to-dos in iCal (because CalDAV was being used to sync them to a server) or notes in Mail (because IMAP was the syncing back-end). The changes and additions in Mountain Lion are in a consistent vein: making things simpler and more obvious, closer to how things should be rather than simply how they always have been.
Schiller has no notes. He is every bit as articulate, precise, and rehearsed as he is for major on-stage events. He knows the slide deck stone cold. It strikes me that I have spoken in front of a thousand people but I’ve never been as well-prepared for a presentation as Schiller is for this one-on-one meeting. (Note to self: I should be that rehearsed.)
This is an awful lot of effort and attention in order to brief what I’m guessing is a list of a dozen or two writers and journalists. It’s Phil Schiller, spending an entire week on the East Coast, repeating this presentation over and over to a series of audiences of one. There was no less effort put into the preparation of this presentation than there would have been if it had been the WWDC keynote address.
What do I think so far, Schiller asks. It all seems rather obvious now that I’ve seen it — and I mean obvious in a good way. I remain convinced that iCloud is exactly what Steve Jobs said it was: the cornerstone of everything Apple does for the next decade. So of course it makes sense to bring iCloud to the Mac in a big way. Simplified document storage, iMessage, Notification Center2, synced Notes and Reminders — all of these things are part of iCloud. It’s all a step toward making your Mac just another device managed in your iCloud account. Look at your iPad and think about the features it has that would work well, for a lot of people, if they were on the Mac. That’s Mountain Lion — and probably a good way to predict the future of the continuing parallel evolution of iOS and OS X.3
But this, I say, waving around at the room, this feels a little odd. I’m getting the presentation from an Apple announcement event without the event. I’ve already been told that I’ll be going home with an early developer preview release of Mountain Lion. I’ve never been at a meeting like this, and I’ve never heard of Apple seeding writers with an as-yet-unannounced major update to an operating system. Apple is not exactly known for sharing details of as-yet-unannounced products, even if only just one week in advance. Why not hold an event to announce Mountain Lion — or make the announcement on apple.com before talking to us?
That’s when Schiller tells me they’re doing some things differently now.
I wonder immediately about that “now”. I don’t press, because I find the question that immediately sprang to mind uncomfortable. And some things remain unchanged: Apple executives explain what they want to explain, and they explain nothing more.
My gut feeling, though, is this. Apple didn’t want to hold an event to announce Mountain Lion because those press events are precious. They just used one for the iBooks/education thing, and they’re almost certainly on the cusp of holding a major one for the iPad. They don’t want to wait to release the Mountain Lion preview because they want to give Mac developers months of time to adopt new APIs and to help Apple shake out bugs. So: an announcement without an event. But they don’t want Mountain Lion to go unheralded. They are keenly aware that many observers suspect or at least worry that the Mac is on the wane, relegated to the sideline in favor of the new and sensationally popular iPad.
Thus, these private briefings. Not merely to explain what Mountain Lion is — that could just as easily be done with a website or PDF feature guide — but to convey that the Mac and OS X remain both important and the subject of the company’s attention. The move to a roughly annual release cycle, to me, suggests that Apple is attempting to prove itself a walk-and-chew-gum-at-the-same-time company. Remember this, five years ago?
iPhone has already passed several of its required certification tests and is on schedule to ship in late June as planned. We can’t wait until customers get their hands (and fingers) on it and experience what a revolutionary and magical product it is. However, iPhone contains the most sophisticated software ever shipped on a mobile device, and finishing it on time has not come without a price — we had to borrow some key software engineering and QA resources from our Mac OS X team, and as a result we will not be able to release Leopard at our Worldwide Developers Conference in early June as planned. While Leopard’s features will be complete by then, we cannot deliver the quality release that we and our customers expect from us. We now plan to show our developers a near final version of Leopard at the conference, give them a beta copy to take home so they can do their final testing, and ship Leopard in October. We think it will be well worth the wait. Life often presents tradeoffs, and in this case we’re sure we’ve made the right ones.
Putting both iOS and OS X on an annual release schedule is a sign that Apple is confident it no longer needs to make such tradeoffs in engineering resources. There’s an aspect of Apple’s “now” — changes it needs to make, ways the company needs to adapt — that simply relate to just how damn big, and how successful, the company has become. They are in uncharted territory, success-wise. They are cognizant that they’re no longer the upstart, and are changing accordingly.
It seems important to Apple that the Mac not be perceived as an afterthought compared to the iPad, and, perhaps more importantly, that Apple not be perceived as itself considering or treating the Mac as an afterthought.
I’ve been using Mountain Lion for a week, preinstalled on a MacBook Air loaned to me by Apple. I have little to report: it’s good, and I look forward to installing the developer preview on my own personal Air. It’s a preview, incomplete and with bugs, but it feels at least as solid as Lion did a year ago in its developer previews.
I’m interested to see how developer support for Mac App Store-only features plays out. Two big ones: iCloud document storage and Notification Center. Both of these are slated only for third-party apps from the Mac App Store. Many developers, though, have been maintaining non-Mac App Store versions of their apps. If this continues, such apps are going to lose feature parity between the App Store and non-App Store versions. Apple is not taking the Mac in iOS’s “all apps must come through the App Store” direction, but they’re certainly encouraging developers to go Mac App Store-only with iCloud features that are only available to Mac App Store apps (and, thus, which have gone through the App Store approval process).
My favorite Mountain Lion feature, though, is one that hardly even has a visible interface. Apple is calling it “Gatekeeper”. It’s a system whereby developers can sign up for free-of-charge Apple developer IDs which they can then use to cryptographically sign their applications. If an app is found to be malware, Apple can revoke that developer’s certificate, rendering the app (along with any others from the same developer) inert on any Mac where it’s been installed. In effect, it offers all the security benefits of the App Store, except for the process of approving apps by Apple. Users have three choices for which types of apps can run on Mountain Lion: only apps from the Mac App Store; apps from the Mac App Store plus apps signed with a registered developer ID; or apps from anywhere, signed or not.
The default for this setting is, I say, exactly right: the one in the middle, disallowing only unsigned apps. This default setting benefits users by increasing practical security, and also benefits developers, preserving the freedom to ship whatever software they want for the Mac, with no approval process.
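Here’s a tiny conceptual sketch of that launch-time decision, purely to illustrate the policy logic and the revocation lever described above. None of these names are Apple’s APIs or implementation; everything below is hypothetical:

```typescript
// Illustrative model of the Gatekeeper policy described above; the types
// and names here are hypothetical, not Apple's.
type AppSource = "mac-app-store" | "developer-id-signed" | "unsigned";
type GatekeeperPolicy = "app-store-only" | "signed-only" | "anywhere";

// Developer IDs Apple has revoked (e.g. one used to ship malware).
const revokedDeveloperIDs = new Set<string>(["REVOKED-EXAMPLE-ID"]);

function mayLaunch(source: AppSource, policy: GatekeeperPolicy, developerID?: string): boolean {
  // A revoked certificate renders every app from that developer inert,
  // regardless of which policy the user chose.
  if (developerID !== undefined && revokedDeveloperIDs.has(developerID)) {
    return false;
  }
  if (policy === "anywhere") return true;
  if (policy === "signed-only") return source !== "unsigned"; // Mountain Lion's default
  return source === "mac-app-store"; // "app-store-only"
}

console.log(mayLaunch("developer-id-signed", "signed-only", "ABC123")); // true
console.log(mayLaunch("unsigned", "signed-only"));                      // false
```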
Call me nuts, but that’s one feature I hope will someday go in the other direction — from OS X to iOS. ★
As soon as Schiller told me the name, I silently cursed myself for not having predicted it. Apple is a company of patterns. iPhone 3G, followed by a same-form-factor-but-faster 3GS; iPhone 4 followed by a same-form-factor-but-faster 4S. Leopard followed by Snow Leopard; so, of course: Lion followed by Mountain Lion. ↩
On the Mac, Notification Center alerts are decidedly inspired by those of Growl, a longstanding open source project that is now sold for $2 in the Mac App Store. I hereby predict “Apple ripped off Growl” as the mini-scandal of the day. ↩
There is a feature from the iPhone that I would love to see ported to the Mac, but which is not present in Mountain Lion: Siri. There’s either a strategic reason to keep Siri iPhone 4S-exclusive, or it’s a card Apple is holding to play at a later date. ↩