I’ve found myself making the feedback sounds alongside VoiceOver while using my iPod.

Since the beginning, we’ve strived to make the experience of using Compeer with a disability not just functional, but delightful. User testing provides excellent feedback, but it has its limitations.

I think it’s hard as a maker to innovate unless you can relate—at some level—to what your users are describing.

That’s why I decided to spend 7 days using only VoiceOver on my iPod. Normally, I would test things with the screen curtain on (screen off) and then turn the screen curtain off (screen on) whenever I got stumped.

I realized I was missing a huge opportunity to learn which things frustrated me and which could be genuine roadblocks for users with vision impairments.

It’s been just over a day and already I’ve gained perspective on how I can make better design decisions in Compeer.

Realization #1: I’m forced to be better at VO

It’s amazing how much better I’m getting at using VoiceOver now that I don’t have the screen as my safety net. Whenever I get stuck, I tap around and try a bunch of gestures before giving up and pressing the home button. Meh! I’m getting better.

Realization #2: VO gives system cues to the user that otherwise aren’t provided

There’s certain information that’s only given to VoiceOver users. For instance, if I’m swiping through apps on the home screen, VoiceOver will tell me whether an app is running.

Realization #3: Siri + Dictation + Fleksy rule

Without Fleksy, speech dictation, and Siri, I’m not sure I would have finished typing a single sentence yesterday. Being able to offload the pain of text input onto Siri by saying “send a message saying I’m on my way” or “open Rdio” is an incredible time-saver.

Realization #4: Phrasing in apps is incredibly important

It’s amazing how the phrasing of an action can make me think I enabled a setting when it’s actually disabling it. For instance, a toggle button may read “location on,” leaving me unsure whether that action turns it on or whether it’s already on.

If VoiceOver instead read “location is on,” the user would have a much clearer idea of the button’s current state.
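A sketch of how this distinction might look in an iOS app (a hypothetical custom toggle button, not Compeer’s actual code): keeping the control’s name in `accessibilityLabel` and moving its state into `accessibilityValue` lets VoiceOver announce the status as a status, rather than an ambiguous “location on.”

```swift
import UIKit

// Hypothetical custom toggle. The label is the control's stable name;
// the value reports the current state, so VoiceOver reads "Location: on"
// instead of an ambiguous "location on".
let locationButton = UIButton(type: .system)
var locationEnabled = true

func updateLocationAccessibility() {
    locationButton.accessibilityLabel = "Location"
    locationButton.accessibilityValue = locationEnabled ? "on" : "off"

    // Ambiguous phrasing to avoid — is this an action or a status?
    // locationButton.accessibilityLabel = "location on"
}
```

(Standard `UISwitch` controls get this behavior for free; the problem tends to show up in custom-drawn toggle buttons.)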

I’m not saying by any means that the barriers I experience are as severe as those a person with a real vision impairment faces. But it definitely helps me empathize in my design decisions.

Either way, this has been really beneficial and educational.