Once the team's happy with the scenarios we've come up with, we have a pretty good idea of which steps, pages or screens will be needed to create the experience. From there, I map out a user flow, taking each scenario from start to finish, illustrating user actions, key decision points, and basic system cues.
I use a combination of pen, paper, sticky notes and whiteboards for my user flows. When it's pretty much ready to share, I usually recreate everything in Illustrator, so it looks a bit more pro. Then it's on to wireframing.
This set of simple wires began with a navigation model and grew into a way for users to reach any of their contacts and projects from anywhere in the system in no more than two gestures or inputs.
The wireframes for One Drop's mobile app were created in Adobe XD, which does a great job of showing navigation between screens without having to switch into live mode.
These wires for a suite of instructional design services allow teachers not only to locate curriculum from a giant assortment of lessons, but also to create their own custom lesson plans for each class or individual student without having to switch products. This was a huge improvement over the previous experience, where instructors constantly had to leave one area to find another, often losing their way.
This was a browser extension designed to highlight multiple tweets and display them side by side. I made a set of interactive wireframes to show the developer exactly how the service should behave, and he admitted that this flow eliminated about half the steps from how he was originally planning it.
These Behr wires, built in Marvel, were used to show how users would be able to quickly locate and add colors to their palette, then preview in their own photos.
I often deliver wireframes and screen mockups together, along with interaction callouts, just to help make it extra clear what we're going for. With this e-commerce app, because I was dealing with a handful of outsourced developers, I wanted to make sure that nothing was lost when we moved from wires to mockups, so they were always presented together like this.
Okay, so there's a ton of images coming up. I wanted to show all of them here, because each one was important in improving the accessibility and comfort of the product I was working on. The original version had so many usability problems that I had to make sure the new version would accept any kind of input without failing the user. After testing, it was clear the new system was much easier for everyone to use.
This first set was a recreation of the click & drag interaction already built into the app. It worked fine, as long as you had a working mouse and perfectly functional motor skills. But it wasn't enough for every user. I watched so many students fail to complete tests not because they didn't know the answers, but because the system wasn't built to let them demonstrate their knowledge. I knew we had to do better.
For users whose abilities or hardware made it difficult to complete a click & drag operation in one continuous press, our second option let users click once to pick up the item, then click again to drop it. This greatly improved the experience for trackpad users and for those whose fingers couldn't hold the mouse button down while focusing on precisely moving the cursor to the right spot.
A third option was to allow keyboard input, using arrow keys and Space or Enter to make decisions. Why this wasn't already part of the testing environment, I still can't figure out, but once we had these guides in place, we had keyboard entry implemented in the next release.
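For anyone curious how the second and third options can share one interaction model, here's a minimal sketch. It's not the production code from that project; it's a hypothetical, DOM-free state machine I'm using to illustrate the idea, so the same pick-up/drop logic can be driven by a single mouse click or by keyboard focus moves (arrows) plus Space/Enter. All names (`DragState`, `step`, the slot layout) are my own for this example.

```typescript
// Items live in a row of "slots"; an item is picked up by one activation
// (click, Space, or Enter) and dropped by a second activation on the target.
type Input =
  | { kind: "activate"; slot: number }   // mouse click, or Space/Enter on the focused slot
  | { kind: "arrow"; delta: -1 | 1 };    // Left/Right arrow moves keyboard focus

interface DragState {
  slots: (string | null)[]; // item in each droppable slot, or null if empty
  focus: number;            // which slot keyboard focus is on
  held: number | null;      // slot index of a picked-up item, or null
}

function step(state: DragState, input: Input): DragState {
  if (input.kind === "arrow") {
    // Clamp focus to the row so arrows can't run off either end
    const focus = Math.min(
      state.slots.length - 1,
      Math.max(0, state.focus + input.delta)
    );
    return { ...state, focus };
  }
  const slot = input.slot;
  if (state.held === null) {
    // First activation: pick the item up, but only if the slot holds one
    return state.slots[slot] !== null ? { ...state, held: slot, focus: slot } : state;
  }
  if (state.slots[slot] === null) {
    // Second activation on an empty slot: drop the held item there
    const slots = [...state.slots];
    slots[slot] = slots[state.held];
    slots[state.held] = null;
    return { slots, focus: slot, held: null };
  }
  return state; // occupied target: keep holding so the user can try elsewhere
}
```

A keyboard driver would just call `step(state, { kind: "activate", slot: state.focus })` on Space or Enter, while a mouse driver passes the clicked slot directly; that's what lets one flow serve both input methods.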