Rescue Lens is a live, interactive video support product: an operator can see a problem through the camera of a customer's mobile phone.
Lens enables a real-time video stream between technicians and end users via the camera of their Android or iOS smartphone, so they can quickly identify technical issues with just about any product or service, from a broken dishwasher to machinery too large to move. Technicians can remotely guide end users through troubleshooting, problem resolution, product setup, and more.
See more here: Rescue Lens official page
Normally, when launching a new product, the first steps would be a competitive analysis: checking similar products, interviewing users, and so on.
Lens, however, was a completely new product at the end of 2014: there were no competitors and no existing users, so we could not run any user interviews. On the other hand, we had a very serious deadline from the leadership.
I had an initial idea, but no app yet.
So I decided to run user test No. 1 with a paper prototype.
Why paper? It was fast, and it gave us initial feedback on whether we were on the right track!
Two weeks later the developers delivered an initial version of the mobile app.
Hooray! Time for user test No. 2, with a very early version of the app!
We tested the following cases:
One week later we had a newer version of the mobile app, so we repeated the same test: user test No. 3!
We tested the same scenarios as in test No. 2.
By this time we had more or less the full app flow, covering almost all scenarios.
This is the flow for the Android version of the app:
Designing the menu system
One of the biggest challenges was designing the menu for this app. The UI designer, the product owner, and I preferred a kind of "hamburger" menu:
The developers, however, proposed "three dots" and "floating" menu styles in addition to our (the designers') hamburger menu idea. How could we decide which menu to use in the app?
The developers were able to build all three versions of the app, so we could test them with users (user test No. 4)!
This was a huge test, with six users: each menu version was tested by two of them.
Although the main goal of the test was to decide which menu to use, we had additional goals:
Results of the test
All three menus proved usable, so we finally chose the hamburger version, our original idea.
1. Overall: a very good experience, no serious usability issues
2. Should be improved:
The most important finding was:
For Pause, and for when the app was disconnected and trying to reconnect, we used an upper bar. However, users usually used the menu instead of the upper bar to resume from pause.
Consequence: the affordance of the action text in the upper bar is low.
Next step: give the action part of the text more affordance: break the text into two lines, add more spacing, and highlight the action text: "Tap here to resume".
The test was useful: we detected some problems (none serious) and had the chance to fix them.
Initial screen design
The design of the starting screens was a bit tricky:
Plus, on iOS there is an additional system screen asking the user to allow camera access.
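To illustrate why this extra screen cannot be skipped, here is a minimal sketch of the iOS camera permission check using Apple's AVFoundation API (the function name and callback shape are my own; the wording of the system dialog itself comes from the app's Info.plist, not from code):

```swift
import AVFoundation

// iOS presents its own system dialog the first time access is requested;
// the app cannot customize or bypass it, which is why the initial flow
// needs a screen preparing the user for the permission prompt.
func requestCameraAccess(onResult: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        onResult(true)
    case .notDetermined:
        // This call triggers the additional system screen mentioned above.
        AVCaptureDevice.requestAccess(for: .video) { granted in
            DispatchQueue.main.async { onResult(granted) }
        }
    default:
        // .denied or .restricted: the video stream cannot start.
        onResult(false)
    }
}
```

If the user denies access here, the app can only direct them to the Settings app; it cannot show the system dialog again, which is another reason the initial screens had to set expectations carefully.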
As a result, we finally arrived at this flow for the initial screens:
After almost a year, we decided to add a significant update to the product: an audio feature!
This time we thought we should use direct buttons instead of the hamburger menu.
The developers created a prototype of the app with the new UI, and we tested it with some users. Since we did not encounter any usability issues, we decided to go ahead with the new interface.