What Did the iPad Get Right?

In 1987, Apple released a visionary film featuring a book-like device that would manage data, schedules, email, and voice contacts. It had a voice-command interface as well as speech synthesis for interaction. They called it the Knowledge Navigator.

Our visions of what a tablet-sized mobile device should be capable of have changed considerably over the last 25 years or so, and although the iPad isn't voice-driven and doesn't have a bow-tie-wearing host, it is a pretty remarkable step forward in computer interaction. But with all the hype, our expectations were high, and it's easy to understand why there's so much negative press surrounding the release of the iPad. In short, it didn't meet our expectations.

I feel obligated to weigh in on the other side, though, because I see a device with the potential to become completely revolutionary (a user interface) disguised in a wrapper that's not so revolutionary (an oversized iPod Touch), and that means it has the key ingredients of a successful product.

Historically, technologies are either accepted or rejected based on two criteria: 1) Does it do something MUCH BETTER than anything else that's already available? And 2) Does it work like anything else that's already available? Hopefully you agree that these are mutually opposing forces, so it's understandable why any innovation in technology must strike a delicate balance between the familiar and the radical. I would argue that Apple has done an excellent job of finding this balance.

Since people reading this have likely already read a number of articles pointing out the tablet's flaws, I wouldn't be doing my job if I didn't at least acknowledge some key sticking points and explain why they're not that big a deal.

The biggest criticism is the use of DRM. The tired refrain is the same one that's been choking Digg with criticisms of the iPhone and the iPod Touch: it's locked down, and the App Store approval process sucks. If this bothers you, just go buy a netbook. With the iPhone and iPad OS, Apple is trying to give us the option of moving away from the plague of viruses, malware, and crapware that accompany an open computing platform. Even for gurus, managing these problems can be time-consuming. Apple is also moving away from long boot times and big, clunky programs loaded with features we'll never use.

Another key criticism is the inability to multitask, but what benefit does multitasking really offer besides distraction from what you're working on? Conceptually, all it means is the ability to switch from one task to another seamlessly, or to carry a piece of media (or clipboard contents) into another program. The iPad seems to have no trouble hopping from one application to the next, and the clipboard will likely function just as it does on the iPhone. For most people, this is more than adequate.

Last but not least is the lack of physical connectivity (USB, etc.). This may prove to be the Achilles' heel of the device, but maybe not. With Wi-Fi and cloud-computing tools like Dropbox, the need for a physical connection to a device is quickly becoming obsolete. I can't even remember the last time I plugged my iPhone into my computer; aside from making backups, it's just not that big a deal.

I want to leave you with some underrepresented features that I believe are truly revolutionary. The three productivity applications (Keynote, Pages, Numbers) represent the next generation of software that doesn't rely on a traditional mouse and keyboard, and that, to me, is the most exciting announcement of all. Adjusting images in presentations, designing a page layout, and even manipulating number tables in a spreadsheet just works better when you have direct contact with your content. I've mentioned that I've worked with SMART Boards before, and when you can literally touch the content you're interacting with, the interface just sort of melts away, and you don't even think about it. That's revolutionary.

What are Some Alternatives to Passwords?

This morning, I got yet another message from my work login system: "Your password will be expiring in 7 days. Customers are required to change their passwords at least once every 180 days."

Hmm. Already? It seems like it's only been 180 days since... oh right. But there's more! The new password must meet three of the following four criteria:

1. Must contain an English uppercase letter
2. Must contain an English lowercase letter
3. Must contain a Westernized Arabic numeral (0-9)
4. Must contain a special character (e.g., a punctuation mark)
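Out of curiosity, here's a minimal sketch of what a "three of four" check like this looks like in code. The function name and the exact character classes are my reading of the policy above, not the university's actual implementation:

```python
import string

def meets_complexity_rules(password):
    """Return True if the password satisfies at least three of the
    four character-class criteria listed above."""
    criteria = [
        any(c in string.ascii_uppercase for c in password),  # 1. uppercase
        any(c in string.ascii_lowercase for c in password),  # 2. lowercase
        any(c in string.digits for c in password),           # 3. numeral 0-9
        any(c in string.punctuation for c in password),      # 4. special char
    ]
    return sum(criteria) >= 3

print(meets_complexity_rules("Tr0ub4dor&3"))   # all four classes -> True
print(meets_complexity_rules("correcthorse"))  # lowercase only -> False
```

Note that nothing in this check says anything about whether the resulting password is actually memorable.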

It also can't be any of the last 5 passwords I've selected. Really? A password I used 2 years ago isn't secure now? I wrote a letter to the administrators:

I'll get right to the point: longer and more onerous passwords are not more secure if I have to write them down or store them somewhere. Over the last 8 years or so, the password policies have become progressively anti-user to ever-increasing levels of absurdity. More characters, upper- and lowercase letters, non-dictionary words, even MORE characters, and now I have to punctuate? It's not an essay, it's a password. I understand that faster processors and increased computing power make it theoretically easier for a machine to break my code, but many professors I know on campus keep an unlocked, unhidden Rolodex full of passwords because of requirements like this. I know that you're trying to protect us from hackers, and I appreciate that. But consider that our brains do not follow Moore's law to keep in step with increased processing power, and at some point you will need to rethink your strategy.
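To put rough numbers on that trade-off: assuming every character (or word) is chosen uniformly at random, which human-chosen passwords never are, an 8-character password drawn from the full printable-ASCII set and a plain four-word Diceware-style passphrase carry nearly the same entropy:

```python
import math

def entropy_bits(alphabet_size, length):
    # Entropy of a uniformly random string: length * log2(alphabet size)
    return length * math.log2(alphabet_size)

# 8 characters from ~95 printable ASCII characters:
print(round(entropy_bits(95, 8), 1))    # ~52.6 bits

# 4 words from a 7,776-word Diceware-style list:
print(round(entropy_bits(7776, 4), 1))  # ~51.7 bits
```

The passphrase is far easier to remember, which is exactly the letter's complaint: composition rules buy very little entropy compared to simple length.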

Sure, I can store the passwords in some kind of utility program designed for that purpose. But then don't I need another password for that system? And aren't I at the mercy of whatever encryption scheme it uses? And isn't it a far more desirable target for hackers and identity thieves? What worries me is that, at some future point, my university might consider doing what banks have done, which is to implement stupid security questions to verify my identity. You know what I'm talking about: "What was your first pet's name? What's the name of your high school?" It's the kind of stuff that appears on the average Facebook account, and it can be gleaned more easily than a moderately well-crafted password.

My question to you today is this: What is a secure alternative to passwords of ever-increasing complexity (and ever-declining usability)?

Does Multitasking Hurt Your Brain?

"Attention, multitaskers (if you can pay attention, that is): Your brain may be in trouble" writes Adam Gorlick of the Stanford Report. According to communication professor Clifford Nass, multitaskers are "suckers for irrelevancy." That's the claim that Stanford researchers are investigating in a recent study that compares media multitaskers with... um... non... media... multitaskers. Whatever that means. Although this study is riddled with confounding variables, such as inherent personality traits, the basic question behind it is sound, which is whether you really multitask, or you just spend less time doing any one thing. The early assumption was that people who multitask must have extraordinary skills in concentration, working efficiently in different capacities and compartmentalizing tasks in their brains. The research suggests that the opposite may be true, and that multitaskers don't have mental abilities superior to those of non... multi... yah... hopefully by now you're asking yourself the same question I am, which is "What is the definition of media multitasking, and what arbitrary criteria have been established for who's capable of multitasking, or who's good at it?" It's a safe bet that the 100 people in the sample self-identified whether they were either frequent multitaskers or not.

I'm going to conduct my own observational experiment right here. I don't have scientific equipment flashing obnoxious visual cues at you, and your answers will be based on real-world experiences, and I contend that it is because of those reasons (and not despite them) that my methods are no less valid than those of the Stanford study. Will you help me with it? I need 100 people to report in the comments 1) how you "multitask" and what that means to you, and 2) whether you find that it increases or decreases your mental capacity.