A UX Designer Takes Siri for a Drive
As a UX (user experience) designer, I make a living thinking about how people interact with technology.
So when I heard that Siri, perhaps the most ambitious, high-profile voice interaction system ever, was coming to an Apple store near me, I was eager to see whether she could take my daily drive time and finish the transformation my iPhone 3GS had started.
The 3GS was the first device that gave me everything I wanted for my commute:
- Car stereo integration
- Hands free calling
- One-touch, voice-controlled dialing
- GPS with driving directions
This was a big improvement on the old-school experience: limited listening options; blindly fumbling through CDs to switch them out; stealing lightning-fast glances at maps or directions scrawled on paper; or hoping for a stoplight long enough to dial a 10-digit phone number.
With the iPhone 3GS, I still needed to look at the device to launch most apps before I got on the road, but sending and receiving phone calls was completely eyes-off.
So I was intrigued by the idea that Siri might revolutionize my daily commute. Would she be my ideal co-pilot?
Sadly, no. Not yet, anyway.
The Siri technology is a cool new twist in iPhone’s evolution, but my (literal) test drive left me disappointed.
Siri’s advance billing promised she would make my commute even better and more productive:
- Hands-free text and email
- Hands-free reminders
- Location-based reminders (“Remember to pick up milk when I leave work”)
- A single touch to launch most functions
- Web search for businesses and phone numbers
But it turns out Siri and I are not a UX match.
She solves some problems quite elegantly, but doesn’t address my user needs in the context of driving a car. She fails my “eyes-off” test, meaning my driving experience isn’t an improvement over the voice commands in the iPhone 3GS.
- Long texts and email are so likely to contain errors that messages need to be reviewed before sending.
- Reminders also need visual confirmation.
- Siri is not yet savvy enough to handle location-based reminders for places not listed in contacts — e.g., “remind me to pick up milk when I’m near the grocery store.”
- Siri’s searches usually return multiple hits, requiring the user to read the list and select the correct item.
When I’m designing user interactions for a client project, I zero in on the needs of the person who will be using the product: What will they use it for? What problems will they solve with it? What is the context? What environment will they be working within?
So when I’m in the market for a personal gadget, you can bet that I put it through the same paces. In the end, my decision to buy or not to buy comes down to whether the gadget improves on what my existing products can already do.
Here’s the UX bump I’d hoped to gain via Siri:
- Touch-free activation mode (Siri actively listening for commands while I am driving)
- Integration with a turn-by-turn navigation system (“I need driving directions to the nearest gas station”)
- When presented with multiple options, Siri would read them out loud and accept voice commands for selection.
- “Call Domino’s Pizza”
- Voice prompt for selection from the list
- Dial #
In the end, Siri is a powerful voice command system … but it is a hybrid system, meaning it isn’t possible to use the device with only voice interaction. Sight and touch are still required.
That being said, I get that Siri is still in a ‘beta,’ or test, phase, even if it isn’t being advertised that way. Bugs are still being worked out and features are being enhanced. Once that happens, Siri will likely become everything I could hope for in a traveling companion.
Bottom line: Siri is not yet fulfilling its potential or the promise of its advertising.
It is very cool now, but when it fully enables true voice interaction it will be revolutionary.
Has Siri changed the way you use your iPhone?