When you move content up or down on that screen, you're expecting stuff to move accordingly. It has worked that way in mobile touch interfa... STOP!
A solid plan no doubt.
It seems like the smartphone industry, with no shortage of apps to go around, is guiding itself toward maximizing the variety of horizontal gesture use. No holds barred.
Let's just flick it.
- In the first app, by flicking left to right, you open a menu of some sort with additional controls or content categories. Just like pressing the menu button does from the app header.
- Another app does the same but using a different direction, from right to left. Surprise!
- The third one behaves differently again, and a much longer flick is required. Son, you need to be more specific. A thumb extension surgery is a good idea if one-handed use is your thing.
- A fourth app requires a specific speed for the flick. It didn't understand what your kind-of-a-flick was trying to accomplish. Not just any flick qualifies. Go on, try again.
- The fifth subject has only been reading user comments for the past two years, and hasn't yet implemented the required horizontal flick support. Please press a button in the top right corner. Or left, wait, it was at the bottom somewhere. Google it.
- The sixth contender is insecure, but polite. It asks you to define what you would prefer that particular flick to do. Neither developers, designers nor managers could pick one, out of all the possible features the monster of an app offers. They're hoping you'll solve the problem for them. You open the flick manager. While browsing through all those options to select one, you forget the purpose of the app.
- Our last example application is the most advanced of them all. By flicking on top of an individual content item, and altering the flick direction, speed and blood pressure; you can delete, manage, reply, call back, link, fold protein and travel through time. In every multiverse. Times Pi.
Fully independent, self-sufficient, self-governed and self-centered.
How did it get so fragmented?
If you want to create a Sailfish OS app, you save time in both design and implementation, because the default behavior and functionality for those flicks is already built into the way the OS handles application pages.
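As a rough illustration, here's what that looks like in practice. This is a minimal sketch using the standard Sailfish Silica QML components; the page name "SecondPage.qml" is a hypothetical file in your own project.

```qml
// Minimal Sailfish Silica page sketch. Pushing a page onto the page stack
// gives you the flick-based forward/back navigation automatically --
// there is no back button to design or implement.
import QtQuick 2.0
import Sailfish.Silica 1.0

Page {
    Button {
        anchors.centerIn: parent
        text: "Open details"
        // After this push, a right flick takes the user back to this page.
        onClicked: pageStack.push(Qt.resolvedUrl("SecondPage.qml"))
    }
}
```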
Now, let's take a closer look at them. To help remember and relate easier to Sailfish OS gestures, I'll give them names: "Symbolic swipes" and "Functional flicks". Don't worry, these are not official terms.
Start from the display bezel and slide your finger over any screen edge to perform a symbolic swipe. These swipes control the operating system in the same way a Home or power button does on other devices, symbolizing the function of those buttons.
In the first picture, swiping from either side takes you to the Home screen. In the middle, swiping from the bottom edge shows the Events view with all your notifications. The last image illustrates a swipe from the top edge: it ends your current activity by closing the application you're in. As mentioned in my previous post, moving notification access to the bottom edge greatly helps one-handed use.
In the first picture, flicking to the right takes you to the previous application page, replacing the common back button. In the middle, flicking to the left opens a page related to the current one (not shown in the video), replacing a common menu button. Close it by going back (a right flick). A dialog page uses right and left functional flicks for canceling or accepting a common yes/no confirmation from an app.
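For the dialog case, a sketch of how this maps to code, assuming the standard Sailfish Silica components (the title text is just an example):

```qml
// Silica Dialog sketch: the user accepts by flicking left (forward)
// and cancels by flicking right (back) -- no OK/Cancel buttons needed.
import QtQuick 2.0
import Sailfish.Silica 1.0

Dialog {
    DialogHeader {
        title: "Delete this message?"
    }
    // Triggered by the left (accept) flick.
    onAccepted: console.log("confirmed")
    // Triggered by the right (cancel) flick.
    onRejected: console.log("canceled")
}
```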
Flicking up or down moves the content vertically, as it does on many other devices.
However, controls that directly relate to the current page content (creating a new message, search, etc.) can be accessed with the same movement direction. Imagine the content as an extension of your hand, like a rope. When you pull the content down, a pulley menu starts to open. As you keep pulling down and revealing more menu options, they become highlighted, one at a time. Releasing selects and performs the highlighted action. The name comes from the apparatus used to lift heavy loads.
Because you use the content itself to access the menu, it doesn't matter what size your hand is, or whether it's a tiny phone or a tablet. You're not trying to reach and tap an icon or a button. Only the distance you pull down matters, and you feel a small vibration when a new option is selected. Move the content back up to hide the menu again.
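The pulley menu described above can be sketched like this, again assuming the standard Sailfish Silica components; the menu items and "Inbox" title are hypothetical examples:

```qml
// Pulley menu sketch: a PullDownMenu attached to scrollable content.
// Pulling the content down reveals the options one at a time;
// releasing performs the highlighted one.
import QtQuick 2.0
import Sailfish.Silica 1.0

Page {
    SilicaFlickable {
        anchors.fill: parent
        contentHeight: column.height

        PullDownMenu {
            MenuItem { text: "Search"; onClicked: console.log("search") }
            MenuItem { text: "New message"; onClicked: console.log("compose") }
        }

        Column {
            id: column
            width: parent.width
            PageHeader { title: "Inbox" }
            // ... content items would go here ...
        }
    }
}
```

Note that the menu lives inside the flickable content area, not in a toolbar, which is exactly why pulling the content is what opens it.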
Most of you have already performed these gestures in apps that exist out there, so it's hardly a first encounter.
Sailfish OS has simply harmonized and promoted common touch gestures.
If you think it's complicated, you most likely haven't tried it. Because it's not. It just takes a few days for your hands to wake up from button-based smartphone hibernation.
On almost all button-based smartphone operating systems, when you go to a sub-page, that page animates in from the right. The movement clearly communicates that it came from that direction. So what's the best way to undo that movement?
Yep, move it back to where it came from. I'll do another post at some point about interface animations and transitions.
Harmonizing and promoting gestures the way Sailfish OS does not only makes moving and working inside applications faster and more ergonomic, but also brings it much closer to how we're used to interacting with the physical world.
Nothing happens until you let go and stop affecting an object. If you lift a coffee mug from one table to another, the moment you release your grip has a big impact on whether it's a disaster or a graceful landing.
As long as you keep your finger on the screen, you're in control. You can test what a gesture does, watching what happens as you move your finger. If it was the wrong one, simply reverse the gesture back to your starting point without releasing.
Thanks for reading and see you in the next post. In the meantime, agree or disagree, debate or shout. Bring it on and spread the word.