Early Site Design (1997 - 2002)

Below is a sample of designs for this website from very early on. Back then, HTML was primitive and JavaScript was in its infancy compared to modern web standards and tools. That said, as has been suggested more eloquently elsewhere, much was accomplished in the face of these limitations. There was a real sense of possibility and experimentation, which resulted in a wide range of styles and layouts that gradually became more generic and predictable as major template-based CMS platforms (e.g., WordPress, Squarespace) grew in popularity.

Apps Amuck

Duck Amuck, Chuck Jones, Warner Bros., 1953


With WWDC 2016 upon us and its expected focus on the much-anticipated third-party Siri API, I’d like to reflect on how Apple has been preparing native app developers for a world where the experience of a given application service or brand is multifaceted, continuous across platforms, extensible, and, ironically, as a consequence of increasing reach, also at risk of being fragmented, subsumed, and erased.

Last year while attending WWDC, I had the good fortune to participate in the inaugural Layers conference, which was introduced as a design-oriented complement to the generally developer-focused WWDC “main event” of the week. The conference size, the diversity of its events, the approachability of its speakers, the range of topics covered, and the engagement level of the audience all came together to make Layers a truly inspiring and memorable experience. One presentation from last year in particular has stuck with me as I’ve thought through the changes we are seeing in the way we interact with services and brands on our iPhones.

Neven Mrgan spoke about some of his favorite things, and, loosely, the sometimes unexpected goodness that comes from sharing what you love. Chuck Jones’ seminal Duck Amuck cartoon was one of the beloved things Mrgan shared (@ around 20:00), a rare treasure which I agree is both wildly entertaining and endures as required reading for anyone interested in the art and craft of animation, filmmaking, or storytelling and visual art in general. But what caught my attention that day was Mrgan’s off-handed comment about the sequence near the end of the cartoon where the frame literally begins to close in on poor Daffy, collapsing from all directions. He joked the image reminded him of his time designing responsive web applications.

The analogy was spot on and, as I thought about it more, seemed appropriate not only to web app design but equally reflective if perhaps in a more abstract way of the fundamental changes native mobile apps have witnessed in recent years, especially in light of Apple’s announcements that very week, which included: app “thinning” and targeting application assets for different devices, Siri Proactive and app deep linking, and multitasking and split-screen window management in both OS X and iOS. Through the lens of Duck Amuck, Apple’s announcements could be seen as the latest in a series of operating system enhancements (cf. iOS 8’s share sheets and extensions framework, notification center widgets) designed to deconstruct what it means to be — and what it means to experience — a modern, mobile (iOS) application.

Instagram “Peek” on iPhone 6S running iOS 9


The trend continued with the hardware-specific “peek and pop” feature enabled by 3D Touch, announced later in the year as an exclusive of the iPhone 6S and 6S Plus. So it wasn’t surprising that many industry commentators began to question the long-term fate of native apps. And given recent investments in messaging and chat bots, major platform players, especially those who have been laggards in mobile, appear anxious to speed the demise of the standalone app (store), instead encouraging companies to deliver their services on their respective “aggregator apps” or “portals,” as Ben Evans astutely describes them, such as Facebook Messenger, Microsoft Skype, Kik, Slack, and the most successful to date, WeChat. Even Amazon’s Echo requires a companion mobile app to manage third-party aggregated services through so-called “skills”.

In parallel, responsive web design emerged to address the proliferation of devices accessing the web and, within limits, to deliver a native app-like experience when sites are visited from mobile devices. Rather than developing and serving mobile-specific sites (remember WAP?), the responsive web is built on the idea that a site should be fluid in nature, able to gracefully adapt its content and services to the way in which that content is accessed and used.
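To make the fluidity idea concrete, here is a minimal sketch of the underlying logic: mapping a viewport width to a layout. The function name and breakpoint values are my own illustrative choices, not a standard.

```javascript
// Illustrative only: pick a layout name from a viewport width in pixels.
// The breakpoints (600, 1024) are hypothetical, not a recommendation.
function layoutFor(widthPx) {
  if (widthPx < 600) return "single-column"; // phones
  if (widthPx < 1024) return "two-column";   // tablets
  return "full";                             // desktops and larger
}

// In a browser this decision is usually expressed declaratively via
// CSS media queries or window.matchMedia; here we just call it directly.
console.log(layoutFor(375));  // "single-column"
console.log(layoutFor(768));  // "two-column"
console.log(layoutFor(1440)); // "full"
```

The same content flows into whichever layout fits, rather than a separate mobile site being served.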

As with responsive web design, native apps will continue to evolve with greater adaptability based on device type and screen size (including no screen at all) and, by extension, their respective interaction models. Traditional graphical UIs will co-exist with conversational voice and text options. The challenge will be how best to maintain context across these interaction models and move users as seamlessly as possible between them when necessary. And users, for their part, should prove equally adaptable as long as expectations are clearly communicated, transitions are consistent and predictable, and transactions are fast, accurate, and add value.

Did 1995 Change Everything?

1995: The Year of Monica Lewinsky, O.J. Simpson, Timothy McVeigh... and the Internet.


Twenty years (and some change) ago:

Netscape Navigator was a browser created by a group led by a twenty-four-year-old named Marc Andreessen, who was described in Newsweek as “the über-super-wunder whiz kid of cyberspace.” The company’s I.P.O., on August 9, 1995, was a huge success. Five million shares went on sale on Nasdaq, at twenty-eight dollars a share; they closed the day at $58.25. The Times called it “the best opening day for a stock in Wall Street history for an issue of its size.”
A little more than two weeks later, Microsoft released Windows 95, backed by what was reported to be a three-hundred-million-dollar marketing campaign, along with its own browser, Internet Explorer 1.0, and the browser wars were on. Netscape, of course, was quickly and easily outmuscled by Microsoft. In 1998, Netscape was acquired by AOL, and it faded into insignificance.

In the midst of Netscape's I.P.O., which W. Joseph Campbell contends woke the world up to the Internet, I was happily ensconced on campus at The University of Chicago, spending my time reading, attending screenings and workshops, and, on occasion, shivering in the computer lab (curiously reachable only through Harper Library, if memory serves) exploring the nascent world wide web, but more likely checking email, by far the most used "app" at the time. Thinking back on the "browser wars," I don't agree that Netscape faded into insignificance so quickly, though I do remember acknowledging their impossible odds. That said, I recall well into 2003 still contending with the vagaries of Netscape and IE while designing and coding (i.e., grappling with JavaScript) websites. By that point, though, the seeds of change certainly had been planted. On January 7, 2003, Steve Jobs announced Safari (Apple's fork of KHTML), helping to set the stage for the improved cross-browser standardization that remains largely the trend today.

Apple Watch and the Benefit of the Doubt

On the eve of Apple's "Spring Forward" event, I thought it would be worthwhile to revisit Robert Cringely's prediction for 2015 as the year "when nothing happened" and, particularly, his take on the significance and reception of the Apple Watch:

The Apple Watch is Cupertino grabbing mindshare and early adopter wallets, nothing else.

Even those who are typically bullish on Apple seem to be restraining their enthusiasm and predictions of massive success for the Apple Watch. All except the stock market itself, with AAPL up over 14% YTD. The naysayers are hedging a bit too. If you widen the view on Cringely's comment, he isn't exactly dismissing the Apple Watch to the dustbin of history, instead pointing to 2016 as the year to watch.

I am hopeful he's wrong. I personally find even the most fanciful aspects of the Apple Watch experience intriguing, at least as they have been described up to this point, and applaud Apple for the tact they have taken as they methodically enter the nascent wearables market. I expect many recent enhancements to the iOS experience like Touch ID and the application extensions framework will be all the more relevant once the Watch is in the wild.

Critically, though, unlike the release of the iPhone in 2007, there isn't an obvious problem begging to be simplified and redefined. We aren't dissatisfied with our timepieces in the same way that so-called smartphones left much to be desired eight years ago. The Apple Watch is a much harder sell because of this; it is trying to extend, and in many situations ideally replace, the iPhone experience itself (which obviously is a bit thorny for Apple, though they have been famously comfortable with product cannibalization before) rather than displace something already taking up space on everyone's wrist.

Will the convenience, and the attempt at a kind of naturalness, of situating tech on your arm rather than in your pocket be as obvious when we look back on 2015 as portable touch screens appear now, when we reflect on the dark ages of 2006?

It might just come down to the distillation Apple is promising with the interface elements pictured in the photo above: the new pressure-sensitive screen, the much-fetishized Digital Crown, and, simply, the Button. While much has been written recently about the attention Apple is paying to the fashion-related aspects of the Apple Watch (e.g., luxury options, extensive customization compared to previous products), perhaps the obviousness and must-haveness of the device will emerge in its everyday use, where routine things will get done faster and information will be transmitted with less friction and without the encumbrances of even the most modern of smartphone interactions. And that is to say nothing of the integration of Apple's Siri personal assistant and speech recognition tech, which Tim Cook boasts using "all the time". Will Apple Watch be Siri's debutante coming-of-age?

One thing is certain: as my father-in-law reminded me tonight, given Apple's recent history, it would be foolhardy to categorically dismiss anything they aspire to do with the Apple Watch. For perhaps the first time in the company's history, it seems Apple has earned the benefit of the doubt.

View from Chicago Tribune Tower, 3:08 PM

View from Chicago Tribune Tower, 3:08 PM, 2014


We are moving the company’s downtown Chicago offices to a larger space closer to Union Station; both good things. We invited Eastlake Studio to help make the new location awesome and I recently visited their offices in the Chicago Tribune Tower. I’ve walked and driven past the 1920s landmark countless times before but never had the opportunity to go inside.

The above photo is the view from Eastlake's offices looking west-southwest. Not bad, right? How wonderful to have such inspiration just outside your window every day.

A couple of technical notes

1) This capture was taken very quickly with an iPhone 5S. A snapshot. Nonetheless, I am thoroughly impressed with the detail and resolution the 5S pulls off — from street-level sidewalks to the distant top floors of the Willis Tower to the weathered stonework of the Wrigley Building and the shadow detail of IBM Plaza (now AMA Plaza). Perhaps I’ve just grown accustomed to digital aesthetics, and a future print likely will be the true test, but I'm not sure I would have been able to produce anything like this with my 35mm gear, certainly not without preparation.

2) If you peek at the metadata, you will find that I used Adobe Lightroom 5 to post-process the image (monochrome conversion, lens correction, and perspective adjustment). As a long-time Apple Aperture user and evangelist, this marks my first tentative steps toward an all-Adobe workflow. It’s no small decision and part of me is holding out hope that the mere thought of ditching Aperture will have some cosmic effect resulting in the announcement and release of Aperture 4.0 at WWDC next month, delivering all the goodness of Lightroom and more. But I am doubtful. Despite my deep (some would say non-rational) reservations about Adobe generally, for anyone looking to get more serious about photography today, I would be hard-pressed to recommend Aperture. I wish things were different.

Since one cannot easily move one's work and time invested in one tool to the other, perhaps the best choice is to rely on them as little as possible. There are likely alternatives out there which I am not aware of, but it seems to me that photo file management and lightweight image post-processing is an area begging for innovation, including cross-platform support, long-term scalable network storage, and auto-curation beyond map views, face recognition, and “on this day” flashbacks (as provided by the now-defunct Everpix).

Arkansas

2012-1129-States-Visited-Map-Arkansas.png

It only took six years, but I am pleased to report I am one state away from visiting all fifty United States, a long-standing box to be checked on the bucket list. 

Again, SG came through and arranged for a weekend excursion to nearby Little Rock, Arkansas. Unlike our surprise getaway to Omaha, Nebraska eight years ago, we had a couple of extra travelers in tow this time, which definitely put a different spin on things (note to self: next time, fly). We all agreed the River Rail Streetcar was a highlight. Other recommended destinations include the William J. Clinton Presidential Library, Little Rock Central High School and National Historic Site, and the Arkansas State Capitol Building. While the boys were a bit young to fully appreciate the significance of LRCHS in the history of African-American civil rights, the visit was a good learning opportunity for all of us.

With tornadoes forecasted, we decided to head back home early on Sunday. Waking Monday morning, I was surprised to hear how much damage had occurred; my thoughts are with those who lost loved ones to the storms.

Game 7

Game 7, 2003


The Sports Critic writing on the 10th anniversary of the 2003 NLCS:

The Cubs led the 2003 NLCS three games to one [sic]. In Game 6, they led the Marlins 3-0 going into the top of the 8th at Wrigley Field. With one out and a runner on second, the Marlins’ Luis Castillo lofted a foul ball destined for infamy. Left fielder Moises Alou chased it to the stands. He leaped for the ball that was directly over the wall. A fan attempting to catch the ball himself knocked it away from Alou. Castillo ended up walking, and the Marlins then scored eight runs in the inning to eventually win the game.
In Game 7, the Cubs led 5-3 after two, thanks to home runs from Kerry Wood and Alou before the pitching gave up six runs to lose it.

Steven Soderbergh on Cinema

Cinematographe-Lumieres.jpg

Speaking at this year's San Francisco International Film Festival, for its State of Cinema address, Steven Soderbergh offered the following definition of cinema (emphasis mine) as part of his general assessment of today's Hollywood film industry:

The simplest way that I can describe it is that a movie is something you see, and cinema is something that’s made. It has nothing to do with the captured medium, it doesn’t have anything to do with where the screen is, if it’s in your bedroom, your iPad, it doesn’t even really have to be a movie. It could be a commercial, it could be something on YouTube. Cinema is a specificity of vision. It’s an approach in which everything matters. It’s the polar opposite of generic or arbitrary and the result is as unique as a signature or a fingerprint. It isn’t made by a committee, and it isn’t made by a company, and it isn’t made by the audience. It means that if this filmmaker didn’t do it, it either wouldn’t exist at all, or it wouldn’t exist in anything like this form.

He later adds . . .

But the problem is that cinema as I define it, and as something that inspired me, is under assault by the studios and, from what I can tell, with the full support of the audience.

Writing for the New York Times, A.O. Scott suggests that Soderbergh's self-described rant is more about the realization of his much-publicized retirement from traditional filmmaking and embrace of other modes of cinematic production (e.g., television and even Twitter) in order to express one's "vision" than a fully baked notion of cinema with a capital C. His embrace of new technologies, especially in terms of where and how cinema might be encountered (say, in contrast to David Lynch's colorful and unambiguous contempt for watching movies on mobile phones) is open-minded and provocative, though risks too broad a stroke; as Scott points out, Soderbergh uses the term [cinema] "more or less as a synonym for art".

Yet I find it curious that, alongside his casual dismissal of generic conventions and the accidental ("arbitrary") aspects of the creative process, he is quick to implicate those who would feed him, his audience, and to adopt a seemingly old-school auteurist view, in which movies attain the status of cinematic endeavor at the hands of their director-author, especially through his or her (not necessarily literal) struggle with an indifferent, even hostile studio system. Soderbergh further contends that the narrowing of options for filmmakers today goes beyond the studio executive's stereotypical intolerance for ambiguity and narrative complexity; it is symptomatic of an American appetite for escapism in response to 9/11, the trauma of which still haunts the box office, if not our everyday lives.

When the Lumières first exhibited their new invention, the cinématographe, and accompanying short films, including La Sortie des usines Lumière à Lyon (1895), in Paris on December 28, 1895, the idea of cinema (as the intersection of a paying audience watching moving images projected on a screen) was born. Since then, I'm not sure there has ever been a time when its identity, especially in terms of how films should be presented and truly experienced, hasn't been in some sort of crisis; for example, in response to the emergence of television and the "domestication" of cinema during the 1950s and 1960s, or the advent of cable television, VCRs, and laser discs in the early 1980s, to name just two of the better known threats. Today, cinema is experiencing redefinition through the lens of the Internet, tablet computers and iPhones, and digital projection. I am thankful for Soderbergh's candid and obviously passionate observations concerning the economic realities of contemporary Hollywood, but I also think it is important not to discount the role of the audience as we contemplate what makes cinema (beyond aesthetics, tools, and authorship). As new methods emerge for the signature production and mass distribution of something that might be considered cinematic (per Soderbergh's qualifications), new audiences also emerge, and they are equally relevant to cinema's continuing evolution and transformation.

Roger Ebert (1942 - 2013)

Roger-Ebert-Pulitzer.jpg

I never met Roger Ebert and I doubt he knew of me directly. Yet he played an important if brief role in my graduate education that I'd like to share in his memory.

During my stint at the U of C, I volunteered in various roles at Doc Films. For me, to spend evenings threading a projector or dreaming up (and sometimes programming) film series with fellow movie buffs was a welcome antidote to the removed, sometimes too abstract, relationship one has with cinema as a student of critical theory and cultural studies. In 1998, for a series I co-curated of films by foreign directors with the idea of America and American culture as central themes, a fellow Doc volunteer, who also worked as a projectionist for Roger's continuing ed course at the Graham School, asked Roger one night for his advice on movies we should consider including. His response surprised me: W.R.: Mysteries of the Organism (Dušan Makavejev, 1971), not the kind of film I would have ever expected from the critic whom I had dismissed over the years as too mainstream and forgiving in his tastes, responsible for reducing film criticism to thumbs pointed in one direction or another. The obscurity and uniqueness of the suggestion, and (I'm told) the alacrity with which it was offered, opened my eyes to better appreciate and understand the breadth of Ebert's knowledge of film history and his unpretentious appreciation of movies in all forms. For him, it may very well have been a passing thought at the end of a long day or week, one of countless recommendations made over a brilliant, sustained, and unprecedented career (even then), but in my mind's eye, I felt squarely put in my place, knowing there was still much to learn and to see.

No doubt my story will be joined by many other, perhaps similar, remembrances in the coming days and weeks, of the small yet profound ways Roger Ebert touched our lives. It has always been reassuring to know he was there to turn to — whether through his movie reviews, books, blog, interviews, or in casual conversation wrapping up a course screening — ready to share his passion for movies and his love of life.

Timed Comments and a Call for Blog Side Notes

The most impressive thing about social media site SoundCloud is its signature feature: to graphically represent in spatial terms what is usually experienced non-graphically in time, the waveform of an uploaded audio clip. By laying out the amplitude of the audio recording, SoundCloud emphasizes the duration of the experience, pointing to its peaks and valleys, and, most important for my purposes here, allows for the insertion of time-coded feedback.

As consumption of web-based media has evolved over the past decade-plus, we’ve grown accustomed to eating whole this or that bit and then, when offered the opportunity, providing feedback at the end and participating in a comment thread. Granted, one can excerpt the relevant content (or time stamp, in the case of audio or video) to which a comment refers, but this localization is still displaced temporally. With SoundCloud, we are given the opportunity to attach commentary to a specific moment within the audio stream so that it can be part of the initial experience, as one is “reading” the audio stream. The site provides a visual representation of the referenced clip time stamp, a feature called “timed comments”. It seems pretty simple and obvious in hindsight, but I haven’t encountered a precedent.
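The data model behind such a feature is simple to sketch. The following is my own illustrative sketch, not SoundCloud's actual API: each comment is anchored to a position in the clip, and playback can query the comments falling inside the window currently passing the playhead.

```javascript
// Hypothetical sketch of timed comments: each comment is anchored
// to a moment (in seconds) within an audio clip.
function createTimedComments() {
  const comments = [];
  return {
    // Attach a comment at a specific moment in the clip.
    add(timeSec, author, text) {
      comments.push({ timeSec, author, text });
      comments.sort((a, b) => a.timeSec - b.timeSec); // keep in clip order
    },
    // Return comments anchored within [fromSec, toSec), e.g. the
    // stretch of waveform currently passing the playhead.
    between(fromSec, toSec) {
      return comments.filter(c => c.timeSec >= fromSec && c.timeSec < toSec);
    },
  };
}

const thread = createTimedComments();
thread.add(12.5, "ana", "love this bass line");
thread.add(3.0, "ben", "intro is too long");
console.log(thread.between(0, 10).map(c => c.text)); // ["intro is too long"]
```

The point of the design is that commentary becomes part of the playback experience itself, surfacing at its anchored moment rather than pooling at the end of the page.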

One risk of this approach is fragmentation, a letting go of the way in which a comment thread as it exists today coheres disparate voices into a kind of dialogue. Perhaps localizing commentary also runs the risk of losing context, of misinterpreting an argument by pressing too hard at the sentence or word level. It also isn’t immediately obvious how best to encapsulate visually a myriad of localized comments within the current blogging paradigm. A blog post could easily be overwhelmed with side notes demanding equal attention. Perhaps for this reason especially, we’ve not yet seen its adoption, SoundCloud notwithstanding. Still, I find the prospect compelling, a means to engage web writing with greater specificity and intimacy.

Day has Come

Photo: Lea Suzuki for The Chronicle


First, let me say, I hope that Steve Jobs’s no doubt difficult decision to resign as CEO of Apple allows him to focus his energy and strength on a speedy recovery from illness and return to good health. That is job number one. Yesterday’s announcement, delivered by personal letter to the Apple Board and Apple community at large, has generated considerable reaction in the tech and media communities, for good reason I think. Even though we’ve all known about Jobs’s health issues, I think we’ve held out hope that it wouldn’t have to come to this, that it was something that could be willed and managed into permanent remission, part-time. Acknowledging this is not the case, that Steve is human, we all are, is difficult but also liberating.

Much of the talk has circulated around the fate of Apple. After the initial flutter, I think most folks are concluding that the company has a “deep bench” and with Tim Cook at the helm in particular, there is little risk of execution flagging in the wake of Jobs’s transition to Chairman. Cook appears to be cut from the same cloth when it comes to restraint, quality, and attention to detail. We will have to wait and see how cultivated a sense of whimsy and invention he has, the “hacker” pedigree which has also been an important strand of Apple’s DNA under Jobs.

Apple Macintosh (1984)

I basically grew up with Apple gear: Apple II in high school, original Mac 128K in college (nicknamed the MacMelt due to a faulty power supply), my first laptop the PowerBook 140, a Performa(!) desktop during my (lean) days in graduate school, and of course numerous devices over the incredible run of the past decade after a brief Apple-free stint in the late 90s. These things have helped shape my thinking, have helped me express who I am. And for most of that history, Jobs has been a significant part of the buy-in, especially when Macs were dismissed as toys at best. There has been a trust in his vision, his passion, his origins, and a reassurance in knowing that he is dreaming in California of the next thing and sweating the details too. I can try to convince myself that nothing changes much with this announcement, and maybe that is largely true in the immediate day-to-day, but, naturally, I will also have to adjust my attitude about Apple. It isn’t business as usual on a gut level.

That’s not necessarily a bad thing. There is no question that Apple is in the best position it has ever been, and it is a testament to Jobs on down the line that they’ve created a nice cushion to weather this transition. And it just might be a wonderful opportunity, a time to double down on the team that Jobs has assembled and now trusts with his baby. Those same reasons that have kept me loyal, the memories of wonder and amazement when first using Apple computers, have also framed my expectations to some extent. Those are big shoes to fill and a tremendous responsibility, no doubt, but what a prize Cook has been handed! I welcome getting to know him better and seeing where he and the rest of the management team take the company next.

And in the meantime, get well Steve and keep us posted.

Launch

Hello world! I’ve decided to return joecarey.com to its roots as a personal website, a place to share and promote my photography, recommend articles and other items of interest, and sketch ideas.

It’s been an interesting journey since I shuttered the site three years ago. I tried various hosted, sometimes social, venues (e.g., Facebook notes, MobileMe, Flickr, Posterous, and, most recently, Tumblr) to publish my photos and infrequent notes. In each case, over time these services fell a bit short of my particular requirements, which is not to say they all don’t excel at what they do for their target audience(s). So, I’ve turned to a relative newcomer, Squarespace, in hopes that their approach will strike the right balance between convenience and flexibility. So far, I’ve been impressed with the features and speed of the platform, the company’s close attention to detail, and the obvious thought that has gone into the design of their UI/dashboard. All good stuff.

In migrating the blog archive from previous incarnations (for a good stretch, managed using Movable Type), I’ve spent some time tidying things up (not least, link rot), excising a few entries, and adding or updating supporting material where it seemed useful. The availability of new software tools and APIs, the mass of content available online, and the pervasiveness of the web now in the guise of social media and mobile devices have changed the landscape of options available to experiment with here. Happy to be back.

Make Way for Tomorrow Release

2010-Seth-Criterion-Make-Way-for-Tomorrow-2.jpg

Taking a little break from photography, I wanted to alert everyone about a new Criterion DVD release of Make Way for Tomorrow. If you are into Depression-era classic Hollywood masterpieces (and who isn’t!), you might want to pick it up, or add to your Netflix queue.

Special bonus is cover art by the very talented Seth.

Paolo Ventura's Winter Stories

2007-Paolo-Ventura-Winter-Stories-39.jpg

Paolo Ventura’s much-anticipated Winter Stories has arrived. A departure from what I am typically drawn to in photography, it is Ventura’s depiction of the details of the everyday that really wins me over. The gun-metal bed frame and smoky mirror, the muddy puddles, the smudged window panes, all give his imaginary tableaux a rumpled yet vibrant lived-in-ness.

The artist discusses his process here. And if you are in NYC, his work is at Hasted Hunt Kraeutler through January 23, 2010.

Terra Incognita

2012-1216-Coover-Voyage-Into-The-Unknown.jpg

For as long as I’ve known Rod Coover, his web-based media projects have regularly gone against the grain of convention and often, almost by definition, pushed the limits of modern browsers. With his latest publication, Voyage Into The Unknown, it seems he is still pushing those limits, as he warns on the landing page:

Voyage Into The Unknown is designed for 1024X768 or greater. If you have a small screen please go into FULL SCREEN viewing mode in your browser. You are entering a very wide landscape; if you have a smaller screen size you will need to scroll more to travel into the landscape–use all the space you can get!

Rod’s project got me thinking about how landscapes stand in for a kind of knowledge of place and one’s brief time in it; as Rod points out, we might name anew a bend in a river, but how many names may have come before, or after? We think of unknown territories as somehow a thing of the past in the age of Google Maps and GPS positioning, and we can easily forget that today’s maps are not the territories to which they point and can only, at best, approximate them (even with street-level photographic evidence).

Friendly Neighborhood Psychotherapist

Robert Wiene, The Cabinet of Dr. Caligari, 1919


Dave Kehr reviews a new box set of German Expressionist films issued by Kino International and name-drops so-called naïve realist Siegfried Kracauer and his 1947 study From Caligari to Hitler: A Psychological History of the German Film. It’s good to see Robert Wiene’s iconic The Cabinet of Dr. Caligari given some historical and stylistic context and, moreover, to see this period in film history brought into the light of mainstream, non-academic attention. Now if only I could convince SG to “revisit” these classics.

What Makes a Great Portrait?

Ansel Adams, Dorothy and Cole Weston at Home, Ca. 1940


With the recent arrival of my now 2 1/2-month-old son, I’ve been struggling with this very question, especially with a “portrait” of someone who is, at best, just figuring out who he is, and who is changing so dramatically from week to week. It’s as if the metamorphosis itself is what I am trying to capture when I press the shutter. It’s really made me rethink my approach to taking pictures, and the results thus far have been more the product of sheer chance than any kind of skill. The experience has led me to appreciate portraiture all the more.

Timothy Archibald:

Trying to really pinpoint what makes a great portrait is almost like trying to figure out why it feels good when someone smiles at you or why it is disturbing when someone yells at you.

Jörg Colberg posed the question to various photographers, curators and bloggers. Their responses, including example portraits, are definitely worth a read (via JK).