By Ronak Modi
February 26, 2015

Introducing Gravity View

From conception to reality, here’s how we built Gravity View. Now tilt your way through products, only on Fynd.

One of the things we humans take for granted is the effect of gravity on our lives. When you watch astronauts on the International Space Station describe how challenging it is to do simple things like brushing their teeth or drinking a glass of water, you start appreciating life on Earth a little more. A few weeks back, Fynd’s product team started discussing how routine most mobile shopping apps are, while games, on the other hand, are phenomenally engaging and full of real-world physics. This got us thinking —

“Why isn’t there any physics in shopping apps and why not in Fynd?”

Behind the Scenes

This is a tricky question. Consumers don’t expect shopping to extend beyond the visual cues of a layered, two-dimensional space. What do we change first in the 2D world to transition the shopping experience into a 3D world with motion physics?

After much deliberation, debate, and countless rounds of fine-tuning, we are pleased to introduce Gravity View - the world’s first and only shopping app with this UX. You can now tilt your phone to browse through product images without having to swipe. Mobile shopping just got so much more fun! :)

Farooq’s vision was to empower users to navigate quickly through all product images without having to swipe through each of them. It was necessary to ensure seamless transitions between images to avoid nausea and irritation.

We took inspiration from Facebook’s Instant Articles and implemented a smooth tilt on a single image. We showed this around to folks at Fynd and they loved it. That got us excited about extending the feature to all images, all at once. Now this was a challenging task!

When we started building a proof of concept, we realised the implementation wasn’t as smooth as we wanted, because we had earlier enabled pagination to load each image in a separate canvas. The only solution was to load all images in a single canvas, which meant rewriting a lot of code. Now, achieving the right tilt angle along with acceleration was key. As usual, this led to more challenges :)

Engineering Approach

Here’s what Rahul has to say about his experience building Gravity View on iOS:

Our approach was to pan across the canvas depending on the amount of device inclination. The problem was that once you reach one end of the canvas and keep tilting, it gets harder to recenter the canvas, because the extra tilt at the edge keeps adding offset. So we fixed the angle of rotation, making sure the canvas moves only when the device is tilted within that range of angle relative to the starting position.
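The fix described above can be sketched as a clamped mapping from tilt to pan position. This is a minimal illustration, not Fynd’s actual code; the range constant and function names are assumptions.

```python
TILT_RANGE_DEG = 30.0  # respond only within +/-30 degrees of the start angle (assumed value)

def pan_fraction(current_angle_deg: float, start_angle_deg: float) -> float:
    """Return a pan position in [-1.0, 1.0] for the canvas.

    Tilt beyond the fixed range is clamped, so extra tilt at the canvas
    edge adds no further offset and recentring stays easy.
    """
    delta = current_angle_deg - start_angle_deg
    clamped = max(-TILT_RANGE_DEG, min(TILT_RANGE_DEG, delta))
    return clamped / TILT_RANGE_DEG
```

Because the clamp caps the offset, tilting back toward the starting position immediately moves the canvas again, which is exactly the recentering behaviour the fixed angle was meant to restore.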

This solution led to another problem. Since we had fixed the angle, the number of images had an impact too. Restricting the range meant that products with fewer images would take forever to tilt through, while those with more images would tilt too quickly. One option we considered was to keep the pan/degree factor constant. However, that meant a small twist of the wrist would always pan across 2 images irrespective of the total number of images, making it nearly impossible for a user to reach the last image in the canvas if there were more than 5 images.
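The trade-off in this paragraph can be made concrete with two toy mappings. All numbers here are illustrative assumptions, not the values Fynd used: with a fixed total tilt range, the degrees allotted per image shrink as the gallery grows; with a fixed pan/degree factor, the total tilt needed grows linearly with the gallery.

```python
TILT_RANGE_DEG = 30.0        # fixed half-range of responsive tilt (assumed)
DEG_PER_IMAGE_FIXED = 15.0   # fixed pan/degree factor (assumed)

def degrees_per_image_fixed_range(num_images: int) -> float:
    """Fixed range: large galleries get few degrees per image (twitchy),
    small galleries get many (sluggish)."""
    return (2 * TILT_RANGE_DEG) / num_images

def tilt_needed_fixed_factor(num_images: int) -> float:
    """Fixed pan/degree factor: total tilt needed grows with gallery size,
    so long galleries need an uncomfortably large wrist rotation."""
    return DEG_PER_IMAGE_FIXED * num_images
```

With these toy numbers, a 12-image gallery leaves only 5 degrees per image under the fixed range, while the fixed factor demands 90 degrees of tilt for a 6-image gallery, which matches the "nearly impossible to reach the last image" problem described above.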

It was even more difficult to implement Gravity View on Android because while 99% of devices have an accelerometer, not all have a gyroscope. The challenge with using only an accelerometer is that it does not return accurate orientation information, which resulted in a jerky Gravity View experience.

Fahim Sakri tells us about an approach he took to solve this problem:

Initially, we explored a few libraries to help us achieve a smooth tilt experience and played around with the g-vector values and tilt measurements, but none of them gave us the results we wanted.

Since an accelerometer alone wasn’t sufficient to avoid issues like gimbal lock, we used a sensor fusion technique, combining the rotation vector sensor, accelerometer, geomagnetic sensor, and gyroscope to provide accurate orientation information.
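One common sensor-fusion technique, shown here purely as an illustration of the idea (the app itself relied on Android’s rotation vector sensor), is a complementary filter: the gyroscope gives a smooth but drifting angle, the accelerometer a noisy but drift-free one, and blending the two yields a stable tilt estimate. The blend factor and function names below are assumptions.

```python
import math

ALPHA = 0.98  # trust the gyro short-term, the accelerometer long-term (assumed)

def fuse_tilt(prev_angle_deg: float, gyro_rate_deg_s: float, dt: float,
              accel_x: float, accel_z: float) -> float:
    """Complementary filter: blend integrated gyro angle with the
    accelerometer's gravity-derived angle to suppress both drift and jitter."""
    gyro_angle = prev_angle_deg + gyro_rate_deg_s * dt           # integrate angular velocity
    accel_angle = math.degrees(math.atan2(accel_x, accel_z))     # angle of gravity vector
    return ALPHA * gyro_angle + (1 - ALPHA) * accel_angle
```

The accelerometer term slowly pulls the estimate back toward the true gravity direction, so gyro drift never accumulates, while momentary accelerometer noise is scaled down by the small (1 - ALPHA) weight, which is what removes the jerkiness described above.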

Finally, we sat down and did some math to solve this problem. We considered the number of images, the range of tilting angle, the pan speed for the canvas, and a few other variables to form an equation. Here’s a snippet —
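The original snippet is not reproduced here; as a rough sketch of how such an equation might combine those variables (all names and constants below are assumptions, not Fynd’s actual formula):

```python
TILT_RANGE_DEG = 30.0  # half-range of responsive tilt (assumed)
PAN_SPEED = 1.0        # tunable gain on the tilt response (assumed)

def image_index(tilt_deg: float, num_images: int) -> int:
    """Map a tilt within [-TILT_RANGE_DEG, +TILT_RANGE_DEG] to an image index,
    spreading the full range across however many images the product has."""
    t = (tilt_deg + TILT_RANGE_DEG) / (2 * TILT_RANGE_DEG)  # normalise to [0, 1]
    t = max(0.0, min(1.0, t * PAN_SPEED))                   # apply gain, clamp
    return min(num_images - 1, int(t * num_images))
```

Dividing the fixed tilt range by the image count, rather than fixing a pan/degree factor, keeps the last image reachable for any gallery size, which is the balance the paragraphs above were working toward.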

You can view the code on GitHub. Don’t forget to write to us on Twitter! We’d love to hear your feedback.

Our upcoming releases will introduce more normalisation filters to make the Gravity View experience even smoother.


