With WWDC 2016 upon us and its expected focus on the much-anticipated third-party Siri API, I’d like to reflect on how Apple has been preparing native app developers for a world where the experience of a given application service or brand is multifaceted, continuous across platforms, extensible, and, ironically, as a consequence of increasing reach, also at risk of being fragmented, subsumed, and erased.
Last year while attending WWDC, I had the good fortune to participate in the inaugural Layers conference, which was introduced as a design-oriented complement to the generally developer-focused WWDC “main event” of the week. The conference size, the diversity of its events, the approachability of its speakers, the range of topics covered, and the engagement level of the audience all came together to make Layers a truly inspiring and memorable experience. One presentation from last year in particular has stuck with me as I’ve thought through the changes we are seeing in the way we interact with services and brands on our iPhones.
Neven Mrgan spoke about some of his favorite things and, loosely, the sometimes unexpected goodness that comes from sharing what you love. Chuck Jones’ seminal Duck Amuck cartoon was one of the beloved things Mrgan shared (@ around 20:00), a rare treasure which I agree is wildly entertaining and endures as required viewing for anyone interested in the art and craft of animation, filmmaking, or storytelling and visual art in general. But what caught my attention that day was Mrgan’s off-handed comment about the sequence near the end of the cartoon where the frame literally begins to close in on poor Daffy, collapsing from all directions. He joked that the image reminded him of his time designing responsive web applications.
The analogy was spot on and, as I thought about it more, seemed appropriate not only to web app design but equally reflective, if in a more abstract way, of the fundamental changes native mobile apps have witnessed in recent years, especially in light of Apple’s announcements that very week, which included: app “thinning” and targeting application assets for different devices, Siri Proactive and app deep linking, and multitasking and split-screen window management in both OS X and iOS. Through the lens of Duck Amuck, Apple’s announcements could be seen as the latest in a series of operating system enhancements (cf. iOS 8’s share sheets and extensions framework, notification center widgets) designed to deconstruct what it means to be, and what it means to experience, a modern, mobile (iOS) application.
The trend continued with the hardware-specific “peek and pop” feature enabled by 3D Touch, announced later in the year as an exclusive of the iPhone 6S and 6S Plus. So it wasn’t surprising that many industry commentators began to question the long-term fate of native apps. And given recent investments in messaging and chat bots, major platform players, especially those who have been laggards in mobile, appear anxious to speed the demise of the standalone app (store), instead encouraging companies to deliver their services on their respective “aggregator apps” or “portals,” as Ben Evans astutely describes them, such as Facebook Messenger, Microsoft Skype, Kik, Slack, and the most successful to date, WeChat. Even Amazon’s Echo requires a companion mobile app to manage third-party aggregated services through so-called “skills”.
In parallel, responsive web design emerged to address the proliferation of devices accessing the world wide web today and, within limits, to provide a native app-like experience to web apps when accessed from mobile devices. Rather than developing and serving mobile-specific sites (remember WAP?), the responsive web is built on the idea that a site should be fluid in nature and able to gracefully adapt its content and services to accommodate the way in which that content is accessed and used.
As with responsive web design, native apps will continue to evolve with greater adaptability based on device type and screen size (including no screen at all) and, by extension, their respective interaction models. Traditional graphical UIs will co-exist with conversational voice and text options. The challenge will be how best to maintain context across these interaction models and move users as seamlessly as possible between them when necessary. And users, for their part, should prove equally adaptable as long as expectations are clearly communicated, transitions are consistent and predictable, and transactions are fast, accurate, and add value.
Rest in peace, and thank you for the films you leave with us.
Netscape Navigator was a browser created by a group led by a twenty-four-year-old named Marc Andreessen, who was described in Newsweek as “the über-super-wunder whiz kid of cyberspace.” The company’s I.P.O., on August 9, 1995, was a huge success. Five million shares went on sale on Nasdaq, at twenty-eight dollars a share; they closed the day at $58.25. The Times called it “the best opening day for a stock in Wall Street history for an issue of its size.”
A little more than two weeks later, Microsoft released Windows 95, backed by what was reported to be a three-hundred-million-dollar marketing campaign, along with its own browser, Internet Explorer 1.0, and the browser wars were on. Netscape, of course, was quickly and easily outmuscled by Microsoft. In 1998, Netscape was acquired by AOL, and it faded into insignificance.
On the eve of Apple's "Spring Forward" event, I thought it would be worthwhile to revisit Robert Cringely's prediction for 2015 as the year "when nothing happened" and, particularly, his take on the significance and reception of the Apple Watch:
The Apple Watch is Cupertino grabbing mindshare and early adopter wallets, nothing else.
Even those who are typically bullish on Apple seem to be restraining their enthusiasm and predictions of massive success for the Apple Watch. All except the stock market itself, with AAPL up over 14% YTD. The naysayers are hedging a bit too. If you widen the view on Cringely's comment, he isn't exactly dismissing the Apple Watch to the dustbin of history, instead pointing to 2016 as the year to watch.
I am hopeful he's wrong. I personally find even the most fanciful aspects of the Apple Watch experience intriguing, at least as they have been described up to this point, and applaud Apple for the tack they have taken as they methodically enter the nascent wearables market. I expect many recent enhancements to the iOS experience, like Touch ID and the application extensions framework, will be all the more relevant once the Watch is in the wild.
Critically, though, unlike the release of the iPhone in 2007, there isn't an obvious problem begging to be simplified and redefined. We aren't dissatisfied with our timepieces in the same way that so-called smartphones left much to be desired eight years ago. The Apple Watch is a much harder sell because of this; it is trying to extend, and in many situations ideally replace, the iPhone experience itself (which obviously is a bit thorny for Apple, though they have been famously comfortable with product cannibalization before) rather than displace something already taking up space on everyone's wrist.
Will the convenience, and the attempt at a kind of naturalness, of situating tech on your arm rather than in your pocket seem as obvious when we look back on 2015 as portable touch screens appear now, when we reflect on the dark ages of 2006?
It might just come down to the distillation Apple is promising with the interface elements pictured in the photo above: the new pressure-sensitive screen, the much-fetishized Digital Crown, and, simply, the Button. While much has been written recently about the attention Apple is paying to the fashion-related aspects of the Apple Watch (e.g., luxury options, extensive customization compared to previous products), perhaps the obviousness and must-haveness of the device will emerge in its everyday use, where routine things will get done faster and information will be transmitted with less friction and without the encumbrances of even the most modern of smartphone interactions. And that is to say nothing of the integration of Apple's Siri personal assistant and speech recognition tech, which Tim Cook boasts using "all the time". Will Apple Watch be Siri's debutante coming-of-age?
One thing is certain: as my father-in-law reminded me tonight, given Apple's recent history, it would be foolhardy to categorically dismiss anything they aspire to do with the Apple Watch. For perhaps the first time in the company's history, it seems Apple has earned the benefit of the doubt.
We are moving the company’s downtown Chicago offices to a larger space closer to Union Station; both good things. We invited Eastlake Studio to help make the new location awesome and I recently visited their offices in the Chicago Tribune Tower. I’ve walked and driven past the 1920s landmark countless times before but never had the opportunity to go inside.
The above photo is the view from Eastlake's offices looking west-southwest. Not bad, right? How wonderful to have such inspiration just outside your window every day.
A couple of technical notes:
1) This capture was taken very quickly with an iPhone 5S. A snapshot. Nonetheless, I am thoroughly impressed with the detail and resolution the 5S pulls off — from street-level sidewalks to the distant top floors of the Willis Tower to the weathered stonework of the Wrigley Building and the shadow detail of IBM Plaza (now AMA Plaza). Perhaps I’ve just grown accustomed to digital aesthetics, and a future print likely will be the true test, but I'm not sure I would have been able to produce anything like this with my 35mm gear, certainly not without preparation.
2) If you peek at the metadata, you will find that I used Adobe Lightroom 5 to post-process the image (monochrome conversion, lens correction, and perspective adjustment). As a long-time Apple Aperture user and evangelist, this marks my first tentative steps toward an all-Adobe workflow. It’s no small decision and part of me is holding out hope that the mere thought of ditching Aperture will have some cosmic effect resulting in the announcement and release of Aperture 4.0 at WWDC next month, delivering all the goodness of Lightroom and more. But I am doubtful. Despite my deep (some would say non-rational) reservations about Adobe generally, for anyone looking to get more serious about photography today, I would be hard-pressed to recommend Aperture. I wish things were different.
Since one cannot easily move one's work and time invested in one tool to the other, perhaps the best choice is to rely on them as little as possible. There are likely alternatives out there which I am not aware of, but it seems to me that photo file management and lightweight image post-processing is an area begging for innovation, including cross-platform support, long-term scalable network storage, and auto-curation beyond map views, face recognition, and “on this day” flashbacks (as provided by the now-defunct Everpix).
It only took six years, but I am pleased to report I am one state away from visiting all fifty United States, a long-standing box to be checked on the bucket list.
Again, SG came through and arranged for a weekend excursion to nearby Little Rock, Arkansas. Unlike our surprise getaway to Omaha, Nebraska eight years ago, we had a couple of extra travelers in tow this time, which definitely put a different spin on things (note to self: next time, fly). We all agreed the River Rail Streetcar was a highlight. Other recommended destinations include the William J. Clinton Presidential Library, Little Rock Central High School and National Historic Site, and the Arkansas State Capitol Building. While the boys were a bit young to fully appreciate the significance of LRCHS in the history of African-American civil rights, the visit was a good learning opportunity for all of us.
With tornadoes in the forecast, we decided to head back home early on Sunday. Waking Monday morning, I was surprised to hear how much damage had occurred; my thoughts are with those who lost loved ones to the storms.
The Sports Critic writing on the 10th anniversary of the 2003 NLCS:
The Cubs led the 2003 NLCS three games to one [sic]. In Game 6, they led the Marlins 3-0 going into the top of the 8th at Wrigley Field. With one out and a runner on second, the Marlins’ Luis Castillo lofted a foul ball destined for infamy. Left fielder Moises Alou chased it to the stands. He leaped for the ball that was directly over the wall. A fan attempting to catch the ball himself knocked it away from Alou. Castillo ended up walking, and the Marlins then scored eight runs in the inning to eventually win the game.
In Game 7, the Cubs led 5-3 after two, thanks to home runs from Kerry Wood and Alou before the pitching gave up six runs to lose it.
Speaking at this year's San Francisco International Film Festival, for its State of Cinema address, Steven Soderbergh offered the following definition of cinema (emphasis mine) as part of his general assessment of today's Hollywood film industry:
The simplest way that I can describe it is that a movie is something you see, and cinema is something that’s made. It has nothing to do with the captured medium, it doesn’t have anything to do with where the screen is, if it’s in your bedroom, your iPad, it doesn’t even really have to be a movie. It could be a commercial, it could be something on YouTube. Cinema is a specificity of vision. It’s an approach in which everything matters. It’s the polar opposite of generic or arbitrary and the result is as unique as a signature or a fingerprint. It isn’t made by a committee, and it isn’t made by a company, and it isn’t made by the audience. It means that if this filmmaker didn’t do it, it either wouldn’t exist at all, or it wouldn’t exist in anything like this form.
He later adds . . .
But the problem is that cinema as I define it, and as something that inspired me, is under assault by the studios and, from what I can tell, with the full support of the audience.
Writing for the New York Times, A.O. Scott suggests that Soderbergh's self-described rant is more about the realization of his much-publicized retirement from traditional filmmaking and embrace of other modes of cinematic production (e.g., television and even Twitter) in order to express one's "vision" than a fully baked notion of cinema with a capital C. His embrace of new technologies, especially in terms of where and how cinema might be encountered (say, in contrast to David Lynch's colorful and unambiguous contempt for watching movies on mobile phones) is open-minded and provocative, though risks too broad a stroke; as Scott points out, Soderbergh uses the term [cinema] "more or less as a synonym for art".
Yet I find it curious that, along with a casual dismissal of generic conventions and the accidental ("arbitrary") aspects of the creative process, he is quick to implicate those who would feed him, his audience, while adopting a seemingly old-school auteurist view, where movies attain the status of cinematic endeavor at the hands of their director-author, especially through his or her (not necessarily literal) struggle with an indifferent, even hostile studio system. Soderbergh further contends that the narrowing of options for filmmakers today goes beyond the studio executive's stereotypical intolerance for ambiguity and narrative complexity, and is symptomatic of an American appetite for escapism in response to 9/11, the trauma of which still haunts the box office, if not our everyday lives.
When the Lumières first exhibited their new invention the cinématographe and accompanying short films, including La Sortie des usines Lumière à Lyon (1895), in Paris on December 28, 1895, the idea of cinema (as the intersection of a paying audience watching moving images projected on a screen) was born. Since then, I'm not sure there has ever been a time when its identity, especially in terms of how films should be presented and truly experienced, hasn't been in some sort of crisis; for example in response to the emergence of television and the "domestication" of cinema during the 1950s and 1960s or the advent of cable television, VCRs and laser discs in the early 1980s, to name just two of the better known threats. Today, cinema is experiencing redefinition through the lens of the Internet, tablet computers and iPhones, and digital projection. I am thankful for Soderbergh's candid and obviously passionate observations concerning the economic realities of contemporary Hollywood but I also think it is important not to discount the role of the audience as we contemplate what makes cinema (beyond aesthetics, tools, and authorship). As with new methods for the signature production and mass distribution of something that might be considered cinematic (per Soderbergh's qualifications), new audiences also emerge and are equally relevant to cinema's continuing evolution and transformation.