Monday, February 1, 2010

Stop complaining and start refining!

Increasingly, I feel the long-lived desktop model is a poorly implemented one. Anyone who has relatives with computers can attest to as much. My grandmother called me once because she couldn't find her files ... it turns out she had simply saved everything she had ever done to the desktop, until the desktop had run out of space, stacking files on top of files and rendering them un-clickable.

Now, in my Gentoo days, I would have (and probably did) rail on her. How does a lack of user training somehow reflect on a system's quality? "Gosh," I would say, "how can someone not understand something as simple as a hierarchical filesystem?!" One of the more transformative things that has happened to me is overcoming this attitude. Nobody worries about using their microwave because they're "going to break it," and yet this is a common concern we hear about the desktop. I feel we as developers and HCI experts have failed users every step of the way for over thirty years. We should be ashamed of what we have allowed computing to become.

Gizmodo would go on to put it in more concrete terms for me, describing what an HCI expert from the '70s named Jef Raskin called an "Information Appliance." I thought the description brilliant: a device that could immediately transform itself to be purpose-built for any task. Gone was the stress of being surrounded by ten attention-demanding windows floating on top of a hierarchical filesystem; what we were left with was a device that was good at everything, just when it needed to be. In retrospect, it was this power that initially attracted me to the iPhone. Though my interest was only piqued once I could develop for it, I immediately understood the potential of the design.

I was just as pleasantly surprised when Apple announced the iPad, but this time the potential seems greater. Instead of challenging meek opponents* in an underdeveloped market like smartphones, Apple now has laptops and netbooks in its crosshairs. They intend to expand this brilliant metaphor to increasingly robust devices. Yet one terrible caveat hangs over it all: the App Store.

As an iPhone developer I've been there: the ludicrous provisioning profiles, the vague App Store rejections, the $100-per-year fee just for the privilege of riding the new wave ... it was enough to make Joe Hewitt leave the development scene, a terrible loss for the community. Despite its shortcomings, however, I cannot bemoan this decision too much. I understand the desire to provide users with a cohesive, consistent experience. Apple has done it best for years and continues to do so now, with the App Store acting as a very public way to muscle developers into thinking the way Apple wants them to. "Most people" will benefit from this decision, and for that I can't fault it.

However, claims of a dystopian future where Steve Jobs beams Apple-approved software into our heads are knee-jerk reactions at best and deceptive FUD at worst. I'm not saying that the App Store is necessarily the healthiest decision ever made in computing history, but diversity will never disappear. Open source projects like Google's Android, and companies like Dell, Asus, and HTC, are making the same strides as Apple, but in a significantly more transparent manner. Nokia's N900 is probably the most "open" device you can buy, and from what I understand it is doing quite well with the Linux/hobbyist crowd. Palm's WebOS has its own application portal, and brings with it one of the best user interfaces available on an embedded device. This market space is booming right now, and yet we have focused our attention solely on one device and declared it the devil.

This sort of vilification is, in short, wrong. We should be applauding Apple's innovative approach, learning from it, and refining it. In my opinion, Android has done exactly this from the get-go. Many parts of its interface feel pleasantly reminiscent of my time with the iPhone, and yet the notification bar and multitasking bring back important elements of a powerful user interface. I'm extremely excited about the future of embedded computing with Android at the helm, just as I'm excited about the iPad and its potential transformative effects.

Rather than fearing new technologies, or dismissing them outright, we need to look at the bigger picture: what these devices bring to the table, and how they could change the way we interact with computers. The iPad is a great start, but we need to keep pushing, refining, tuning, and enhancing the user experience for everyone, "end users" and "power users" alike.

* If you honestly believe that RIM/Blackberry was moving in the right direction, you and I need to have a long discussion so I can ascertain exactly what people see in those devices. My experience with them has been so lackluster I can barely remember using them.