Puma
MediaMonks
March - April 2019
User Experience
FWA of the Day - September 4, 2019
As part of Puma's campaign for their new Hybrid Astro shoes, a pop-up pavilion in one of Shanghai's busiest squares allows potential customers to test out the footwear in an interactive running experience. It's not about competition or how fast you can run, but the freedom of ‘running your own way’ and the beauty of exploration.
Runners customize their own Puma avatar, which is used to track their movement and speed as they run on a treadmill to navigate through three virtual worlds.
I went through many iterations during this process, as decisions about which technologies would be employed and the pavilion's floor plan were still in progress.
Here is the final high-level flow from the moment the user enters the pavilion to when they exit. I then went further into each collapsed flow to identify specific interactions.
Before running, customers can make their own version of a Puma avatar that represents them in the virtual world. With the goal of putting technology at the forefront of the experience, I explored motion gestures for interacting with the personalization screens. Based on research into current motion gestures, I narrowed down the most intuitive movements for the best usability and easiest onboarding.
The size and placement of components, and how far someone stands from the screen, all contribute to the layout, keeping in mind that every selection should be within reach while standing in one spot.
After figuring out the possibilities and limitations of gesture interactions, I created wireframes of the personalization screens.
Once users enter the personalization room, they are greeted with instructions to scan the QR code on screen to add the Puma ambassador as a friend on WeChat. The QR code is large enough to be scanned from a distance. Once added, an indicator on the left prompts them to continue by raising both arms.
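The "continue by raising both arms" check is a simple pose rule: both wrists above their shoulders, held briefly so a passing wave doesn't trigger it by accident. Below is a minimal TypeScript sketch of that idea, assuming a hypothetical pose sensor that reports shoulder and wrist keypoints each frame; the interface names and thresholds are illustrative, not the production implementation.

```typescript
// Hypothetical keypoint shape from a pose sensor (depth camera or pose estimator).
interface Keypoint {
  x: number;          // pixels, screen space
  y: number;          // pixels, y grows downward
  confidence: number; // 0..1 tracking confidence
}

interface Pose {
  leftShoulder: Keypoint;
  rightShoulder: Keypoint;
  leftWrist: Keypoint;
  rightWrist: Keypoint;
}

const MIN_CONFIDENCE = 0.6; // ignore poorly tracked joints
const HOLD_FRAMES = 15;     // roughly half a second at 30fps, to avoid accidental triggers

let framesHeld = 0;

// Returns true once both wrists have stayed above their shoulders
// for HOLD_FRAMES consecutive frames.
function bothArmsRaised(pose: Pose): boolean {
  const joints = [pose.leftShoulder, pose.rightShoulder, pose.leftWrist, pose.rightWrist];
  if (joints.some(j => j.confidence < MIN_CONFIDENCE)) {
    framesHeld = 0;
    return false;
  }
  const leftUp = pose.leftWrist.y < pose.leftShoulder.y;
  const rightUp = pose.rightWrist.y < pose.rightShoulder.y;
  framesHeld = leftUp && rightUp ? framesHeld + 1 : 0;
  return framesHeld >= HOLD_FRAMES;
}
```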
The user's progress through the personalization process is shown on the left of the screen. The first step scans the user's face and seamlessly morphs it into their own Puma likeness. Once the morphing is complete, the motion sensor detects the user's movements and mirrors them onto the Puma.
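Mirroring the user onto the Puma boils down to retargeting tracked joint rotations onto the avatar's rig every frame. A simplified TypeScript sketch under assumed interfaces (MotionSensor and AvatarRig are hypothetical names for illustration, not the actual tech stack used in the pavilion):

```typescript
// Hypothetical types: a sensor that reports joint rotations, and an avatar rig with matching bones.
type JointName = 'spine' | 'head' | 'leftArm' | 'rightArm' | 'leftLeg' | 'rightLeg';

interface Quaternion { x: number; y: number; z: number; w: number; }

interface MotionSensor {
  getJointRotation(joint: JointName): Quaternion | null; // null when the joint isn't tracked
}

interface AvatarRig {
  setBoneRotation(bone: JointName, rotation: Quaternion): void;
}

const TRACKED_JOINTS: JointName[] = ['spine', 'head', 'leftArm', 'rightArm', 'leftLeg', 'rightLeg'];

// Copy each tracked joint's rotation onto the matching bone of the Puma rig.
// A real build would also smooth the data and remap human proportions to the Puma's.
function mirrorPose(sensor: MotionSensor, puma: AvatarRig): void {
  for (const joint of TRACKED_JOINTS) {
    const rotation = sensor.getJointRotation(joint);
    if (rotation) {
      puma.setBoneRotation(joint, rotation);
    }
  }
}
```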
The Puma's colour, material, and running soundtrack can be customized. All the options sit on the right of the screen: hover a hand over an option and wait for the circle to complete to select it, then confirm the selection by raising both arms.
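The hover-to-select circle is essentially a dwell timer: it fills while the hand stays over an option and resets when the hand moves away. A rough TypeScript sketch of that logic, with made-up names and a one-second dwell time purely for illustration:

```typescript
const DWELL_TIME_MS = 1000; // illustrative: how long the hand must hover before an option is selected

interface DwellState {
  optionId: string | null; // option currently under the hand cursor
  elapsedMs: number;       // time accumulated over that option
}

const state: DwellState = { optionId: null, elapsedMs: 0 };

// Called every frame with the option under the hand cursor (or null) and the frame delta.
// progress (0..1) would drive the circular fill indicator; selected is set once the circle completes.
function updateDwell(
  hoveredId: string | null,
  deltaMs: number
): { selected: string | null; progress: number } {
  if (hoveredId !== state.optionId) {
    // The hand moved to a different option (or off all options): restart the timer.
    state.optionId = hoveredId;
    state.elapsedMs = 0;
  } else if (hoveredId !== null) {
    state.elapsedMs += deltaMs;
  }

  const progress = Math.min(state.elapsedMs / DWELL_TIME_MS, 1);
  if (progress >= 1) {
    state.elapsedMs = 0; // reset so the same option isn't re-selected every frame
    return { selected: hoveredId, progress: 1 };
  }
  return { selected: null, progress };
}
```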
Once finished, the view zooms out to show the Puma's entire body, and the Puma walks off the screen towards the running room, beckoning users to follow it.
The content has been condensed for the purpose of this portfolio. However, if you're interested, please feel free to contact me to discuss it further!