Did 1995 Change Everything?

Twenty years (and some change) ago:

Netscape Navigator was a browser created by a group led by a twenty-four-year-old named Marc Andreessen, who was described in Newsweek as “the über-super-wunder whiz kid of cyberspace.” The company’s I.P.O., on August 9, 1995, was a huge success. Five million shares went on sale on Nasdaq, at twenty-eight dollars a share; they closed the day at $58.25. The Times called it “the best opening day for a stock in Wall Street history for an issue of its size.”

A little more than two weeks later, Microsoft released Windows 95, backed by what was reported to be a three-hundred-million-dollar marketing campaign, along with its own browser, Internet Explorer 1.0, and the browser wars were on. Netscape, of course, was quickly and easily outmuscled by Microsoft. In 1998, Netscape was acquired by AOL, and it faded into insignificance.

In the midst of Netscape’s I.P.O., which W. Joseph Campbell contends woke the world up to the Internet, I was happily ensconced on campus at the University of Chicago, spending my time reading, attending screenings and workshops, and, on occasion, shivering in the computer lab (curiously reachable only through Harper Library, if memory serves) exploring the nascent World Wide Web, but more likely checking email, by far the most used “app” at the time. Thinking back on the “browser wars,” I don’t agree that Netscape faded into insignificance so quickly, though I do remember acknowledging their impossible odds. That said, I recall well into 2003 still contending with the vagaries of Netscape and IE while designing and coding (i.e., grappling with JavaScript) websites. By that point, though, the seeds of change had certainly been planted. On January 7, 2003, Steve Jobs announced Safari (built on Apple’s fork of KHTML), helping to set the stage for the improved cross-browser standardization that is still largely the trend today.

Apple Watch and the Benefit of the Doubt

Predictions on the significance and reception of Apple’s latest product.

On the eve of Apple’s “Spring Forward” event, I thought it would be worthwhile to revisit Robert Cringely’s prediction for 2015 as the year “when nothing happened” and, particularly, his take on the significance and reception of the Apple Watch:

The Apple Watch is Cupertino grabbing mindshare and early adopter wallets, nothing else.

Even those who are typically bullish on Apple seem to be restraining their enthusiasm and predictions of massive success for the Apple Watch. All except the stock market itself, with AAPL up over 14% YTD. The naysayers are hedging a bit too: if you widen the view on Cringely’s comment, he isn’t exactly consigning the Apple Watch to the dustbin of history; instead, he points to 2016 as the year to watch.

I am hopeful he’s wrong. I personally find even the most fanciful aspects of the Apple Watch experience intriguing, at least as they have been described up to this point, and I applaud Apple for the tack they have taken as they methodically enter the nascent wearables market. I expect many recent enhancements to the iOS experience, like Touch ID and the application extensions framework, will be all the more relevant once the Watch is in the wild.

Critically, though, unlike the release of the iPhone in 2007, there isn’t an obvious problem begging to be simplified and redefined. We aren’t dissatisfied with our timepieces in the same way that so-called smartphones left much to be desired eight years ago. The Apple Watch is a much harder sell because of this; it is trying to extend, and in many situations ideally replace, the iPhone experience itself (which is obviously a bit thorny for Apple, though they have famously been comfortable with product cannibalization before) rather than displace something already taking up space on everyone’s wrist.

Will the convenience, and the attempted naturalness, of situating tech on your arm rather than in your pocket seem as obvious when we look back on 2015 as portable touch screens appear now, when we reflect on the dark ages of 2006?

It might just come down to the distillation Apple is promising with the interface elements pictured in the photo above: the new pressure-sensitive screen, the much-fetishized Digital Crown, and, simply, the Button. While much has been written recently about the attention Apple is paying to the fashion-related aspects of the Apple Watch (e.g., luxury options, extensive customization compared to previous products), perhaps the obviousness and must-haveness of the device will emerge in its everyday use, where routine things will get done faster and information will be transmitted with less friction and without the encumbrances of even the most modern of smartphone interactions. And that is to say nothing of the integration of Apple’s Siri personal assistant and speech recognition tech, which Tim Cook boasts of using “all the time.” Will the Apple Watch be Siri’s debutante ball?

One thing is certain: as my father-in-law reminded me tonight, given Apple’s recent history, it would be foolhardy to categorically dismiss anything they aspire to do with the Apple Watch. For perhaps the first time in the company’s history, it seems Apple has earned the benefit of the doubt.

View from Chicago Tribune Tower, 3:08 PM

We are moving the company’s downtown Chicago offices to a larger space closer to Union Station; both good things. We invited Eastlake Studio to help make the new location awesome and I recently visited their offices in the Chicago Tribune Tower. I’ve walked and driven past the 1920s landmark countless times before but never had the opportunity to go inside.

The above photo is the view from Eastlake’s offices looking west-southwest. Not bad, right? How wonderful to have such inspiration just outside your window every day.

A couple of technical notes

1) This capture was taken very quickly with an iPhone 5S. A snapshot. Nonetheless, I am thoroughly impressed with the detail and resolution the 5S pulls off — from street-level sidewalks to the distant top floors of the Willis Tower to the weathered stonework of the Wrigley Building and the shadow detail of IBM Plaza (now AMA Plaza). Perhaps I’ve just grown accustomed to digital aesthetics, and a future print likely will be the true test, but I’m not sure I would have been able to produce anything like this with my 35mm gear, certainly not without preparation.

2) If you peek at the metadata, you will find that I used Adobe Lightroom 5 to post-process the image (monochrome conversion, lens correction, and perspective adjustment). For this long-time Apple Aperture user and evangelist, that marks a first tentative step toward an all-Adobe workflow. It’s no small decision, and part of me is holding out hope that the mere thought of ditching Aperture will have some cosmic effect resulting in the announcement and release of Aperture 4.0 at WWDC next month, delivering all the goodness of Lightroom and more. But I am doubtful. Despite my deep (some would say non-rational) reservations about Adobe generally, I would be hard-pressed to recommend Aperture to anyone looking to get more serious about photography today. I wish things were different.
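
For the curious, peeking at that metadata takes only a few lines of code. Here is a minimal sketch in Python, assuming the Pillow library and a filename of my own invention; it reads only the basic EXIF tags (the detailed develop settings live in XMP, which this doesn’t parse).

```python
from PIL import Image, ExifTags  # assumes the Pillow library is installed

# Hypothetical filename; any JPEG exported from Lightroom should carry similar tags.
img = Image.open("tribune-tower-view.jpg")
exif = img.getexif()

# Translate numeric EXIF tag IDs into readable names and print a few of interest.
for tag_id, value in exif.items():
    name = ExifTags.TAGS.get(tag_id, tag_id)
    if name in ("Make", "Model", "Software", "DateTime"):
        print(f"{name}: {value}")
```

On an image exported from Lightroom, the Software tag is where you would most likely spot it.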

Since one cannot easily move one’s work and time invested in one tool to the other, perhaps the best choice is to rely on them as little as possible. There are likely alternatives out there that I am not aware of, but it seems to me that photo file management and lightweight image post-processing together form an area begging for innovation: cross-platform support, long-term scalable network storage, and auto-curation beyond map views, face recognition, and “on this day” flashbacks (as provided by the now-defunct Everpix).


It only took six years, but I am pleased to report I am one state away from visiting all fifty United States, a long-standing item on the bucket list.

Again, SG came through and arranged for a weekend excursion to nearby Little Rock, Arkansas. Unlike our surprise getaway to Omaha, Nebraska, eight years ago, we had a couple of extra travelers in tow this time, which definitely put a different spin on things (note to self: next time, fly). We all agreed the River Rail Streetcar was a highlight. Other recommended destinations include the William J. Clinton Presidential Library, Little Rock Central High School and National Historic Site, and the Arkansas State Capitol Building. While the boys were a bit young to fully appreciate the significance of LRCHS in the history of African-American civil rights, the visit was a good learning opportunity for all of us.

With tornadoes forecasted, we decided to head back home early on Sunday. Waking Monday morning, I was surprised to hear how much damage had occurred; my thoughts are with those who lost loved ones to the storms.

Game 7

The Sports Critic writing on the 10th anniversary of the 2003 NLCS:

The Cubs led the 2003 NLCS three games to one [sic]. In Game 6, they led the Marlins 3-0 going into the top of the 8th at Wrigley Field. With one out and a runner on second, the Marlins’ Luis Castillo lofted a foul ball destined for infamy. Left fielder Moises Alou chased it to the stands. He leaped for the ball that was directly over the wall. A fan attempting to catch the ball himself knocked it away from Alou. Castillo ended up walking, and the Marlins then scored eight runs in the inning to eventually win the game.

In Game 7, the Cubs led 5-3 after two innings, thanks to home runs from Kerry Wood and Alou, before the pitching gave up six runs to lose it.

Steven Soderbergh on Cinema

Don’t discount the role of the audience as we contemplate cinema’s latest crisis.

Speaking at this year’s San Francisco International Film Festival, for its State of Cinema address, Steven Soderbergh offered the following definition of cinema (emphasis mine) as part of his general assessment of today’s Hollywood film industry:

The simplest way that I can describe it is that a movie is something you see, and cinema is something that’s made. It has nothing to do with the captured medium, it doesn’t have anything to do with where the screen is, if it’s in your bedroom, your iPad, it doesn’t even really have to be a movie. It could be a commercial, it could be something on YouTube. Cinema is a specificity of vision. It’s an approach in which everything matters. It’s the polar opposite of generic or arbitrary and the result is as unique as a signature or a fingerprint. It isn’t made by a committee, and it isn’t made by a company, and it isn’t made by the audience. It means that if this filmmaker didn’t do it, it either wouldn’t exist at all, or it wouldn’t exist in anything like this form.

He later adds . . .

But the problem is that cinema as I define it, and as something that inspired me, is under assault by the studios and, from what I can tell, with the full support of the audience.

Writing for the New York Times, A.O. Scott suggests that Soderbergh’s self-described rant is more a reflection of his much-publicized retirement from traditional filmmaking and his embrace of other modes of cinematic production (e.g., television and even Twitter) as ways to express one’s “vision” than a fully baked notion of cinema with a capital C. His embrace of new technologies, especially in terms of where and how cinema might be encountered (say, in contrast to David Lynch’s colorful and unambiguous contempt for watching movies on mobile phones), is open-minded and provocative, though it risks too broad a stroke; as Scott points out, Soderbergh uses the term [cinema] “more or less as a synonym for art”.

Yet I find it curious that, along with his casual dismissal of generic conventions and the accidental (“arbitrary”) aspects of the creative process, he is so quick to implicate those who would feed him, his audience, and to adopt a seemingly old-school auteurist view, in which movies attain the status of cinematic endeavor at the hands of their director-author, especially through his or her (not necessarily literal) struggle with an indifferent, even hostile, studio system. Soderbergh further contends that the narrowing of options for filmmakers today goes beyond the studio executive’s stereotypical intolerance for ambiguity and narrative complexity; it is symptomatic of an American appetite for escapism in response to 9/11, the trauma of which still haunts the box office, if not our everyday lives.

When the Lumières first exhibited their new invention, the cinématographe, and its accompanying short films, including La Sortie des usines Lumière à Lyon (1895), in Paris on December 28, 1895, the idea of cinema (a paying audience watching moving images projected on a screen) was born. Since then, I’m not sure there has ever been a time when its identity, especially in terms of how films should be presented and truly experienced, hasn’t been in some sort of crisis: in response, for example, to the emergence of television and the “domestication” of cinema during the 1950s and 1960s, or to the advent of cable television, VCRs, and laser discs in the early 1980s, to name just two of the better-known threats. Today, cinema is being redefined through the lens of the Internet, tablet computers and iPhones, and digital projection. I am thankful for Soderbergh’s candid and obviously passionate observations concerning the economic realities of contemporary Hollywood, but I also think it is important not to discount the role of the audience as we contemplate what makes cinema (beyond aesthetics, tools, and authorship). As new methods emerge for the signature production and mass distribution of something that might be considered cinematic (per Soderbergh’s qualifications), new audiences emerge as well, and they are equally relevant to cinema’s continuing evolution and transformation.


Cross-posted to Medium on July 3, 2013.

Roger Ebert (1942 – 2013)

I never met Roger Ebert and I doubt he knew of me directly. Yet he played an important if brief role in my graduate education that I’d like to share in his memory.

During my stint at the U of C, I volunteered in various roles at Doc Films. For me, spending evenings threading a projector or dreaming up (and sometimes programming) film series with fellow movie buffs was a welcome antidote to the removed, sometimes too abstract, relationship one has with cinema as a student of critical theory and cultural studies. In 1998, for a series I co-curated of films by foreign directors with the idea of America and American culture as central themes, a fellow Doc volunteer, who also worked as a projectionist for Roger’s continuing ed course at the Graham School, asked Roger one night for his advice on movies we should consider including. His response surprised me: W.R.: Mysteries of the Organism (Dušan Makavejev, 1971), not the kind of film I would have ever expected from the critic whom I had dismissed over the years as too mainstream and forgiving in his tastes, responsible for reducing film criticism to thumbs pointed in one direction or another. The obscurity and uniqueness of the suggestion, and (I’m told) the alacrity with which it was offered, opened my eyes to the breadth of Ebert’s knowledge of film history and to his unpretentious approach to and appreciation of movies, in all forms. For him, it may very well have been a passing thought at the end of a long day or week, one of countless recommendations made over a brilliant, sustained, and unprecedented career (even then), but in my mind’s eye, I felt squarely put in my place, knowing there was much still to learn and to see.

No doubt my story will be joined by many other, perhaps similar, remembrances in the coming days and weeks, of the small yet profound way Roger Ebert touched our lives. It has always been reassuring to know he was there to turn to — whether through his movie reviews, books, blog, interviews, or in casual conversation wrapping up a course screening — ready to share his passion for movies and his love of life.

Timed Comments and a Call for Blog Side Notes

The most impressive thing about social media site SoundCloud is its signature feature: a graphical, spatial representation of what is usually experienced non-graphically in time, the waveform of an uploaded audio clip. By laying out the amplitude of the audio recording, SoundCloud emphasizes the duration of the experience, points to its peaks and valleys, and, most important for my purposes here, allows for the insertion of time-coded feedback.

As consumption of web-based media has evolved over the past decade-plus, we’ve grown accustomed to eating this or that bit whole and then, when offered the opportunity, providing feedback at the end and participating in a comment thread. Granted, one can excerpt the relevant content (or time stamp, in the case of audio or video) to which a comment is addressed, but this localization is still displaced temporally. With SoundCloud, we are given the opportunity to attach commentary to a specific moment within the clip so that it becomes part of the initial experience, encountered as one is “reading” the audio stream. The site marks the referenced time stamp visually, a feature called “timed comments.” It seems pretty simple and obvious in hindsight, but I haven’t encountered a precedent.
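
To make the mechanics concrete, here is a rough sketch, in Python, of the kind of data model a timed comment implies. To be clear, this is not SoundCloud’s actual API or schema, just an illustration with names of my own choosing; the same shape would serve blog side notes if the time offset were swapped for a paragraph or character anchor.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TimedComment:
    """A comment anchored to a moment in the clip rather than appended at the end."""
    author: str
    body: str
    offset_seconds: float  # where in the clip the comment belongs

@dataclass
class AudioClip:
    title: str
    duration_seconds: float
    comments: List[TimedComment] = field(default_factory=list)

    def comments_near(self, t: float, window: float = 5.0) -> List[TimedComment]:
        """Return comments whose anchors fall within `window` seconds of playback time t."""
        return [c for c in self.comments if abs(c.offset_seconds - t) <= window]

# Attach a remark to a specific moment, then surface it as playback approaches.
clip = AudioClip(title="Demo mix", duration_seconds=240.0)
clip.comments.append(TimedComment("a listener", "love this transition", offset_seconds=92.0))
print(clip.comments_near(90.0))
```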

One risk of this approach is fragmentation, a letting go of the way a comment thread as it exists today draws disparate voices into a kind of dialogue. Perhaps localizing commentary also runs the risk of losing context, of misinterpreting an argument by pressing too hard at the sentence or word level. And it isn’t immediately obvious how best to visually encapsulate a myriad of localized comments within the current blogging paradigm; a blog post could easily be overwhelmed with side notes demanding equal attention. Perhaps for this reason especially, the idea has not yet been widely adopted, SoundCloud notwithstanding. Still, I find the prospect compelling: a means to engage web writing with greater specificity and intimacy.