How AR Can Boost a Retail App Experience
Claire Lynch, Senior iOS Engineer

The release of ARKit at Apple’s 2017 WWDC conference was a game-changer. Augmented reality represents a truly new way for users to interact with mobile apps. I’m lucky to work within a programming culture that is excited by breakthroughs like this and encourages creativity and experimentation. But, truth be told, I don’t often have the opportunity to chart truly unexplored territory. Enter Build.

In the months since the release of ARKit, we’ve seen a number of home improvement companies and home furnishings retailers leap at this opportunity. The use case is clear: give prospective buyers a tool to visualize items in their homes to bridge the limitations of static photos in a digital marketplace. Even brick-and-mortar showrooms can’t compare to a real living room. If leveraged strategically, AR can convert customers who hesitate to make a purchase because they’re unable to see an item in its intended environment into confident, satisfied buyers.

The business impact is undeniable and potentially huge, but there are still risks. Unless AR is implemented seamlessly and strategically, it can cause disillusionment rather than adding value for users. Working with Build, an exclusively online home improvement retailer, my team at Prolific was able to avoid the pitfalls and capitalize on the power of AR. We created an AR feature that stood out to Apple, promoted Build’s business goals, and offered users a fun and informative experience.

Iterative UX: Introducing users to something totally new

As an engineer, it’s natural to act like a kid in a candy store when something as ground-breaking as ARKit gets added to your toolbox overnight. But what’s undeniably sweet and tempting for programmers must be broken down and digested from a user perspective in order to make waves. Which is all to say: AR is new for everyone, and not necessarily intuitive for middle-aged homeowners and renovators! A lot of experimentation and testing is required for UX to meet the needs of a public that is just beginning to interface with AR technology. Perusing reviews of some of the other home furnishing AR features out there, I found testimonies to broken UX. For example:

“It got the angles wrong and I couldn’t figure out how to rotate the furniture… Needs a tutorial! Do I literally point the camera straight at the floor, or do I need the place where the floor meets the walls? How do I rotate? It looks like I can, but tapping, swiping, or trying to grab the wheel-like circle under the furniture does nothing.”

We obviously wanted to avert such frustrations and guarantee a frictionless AR experience for users. To get there, we implemented an onboarding flow that provides image- and text-based instructions at each distinct step, from opening the AR view to positioning a 3D model in the environment. We adhered to two guiding principles throughout the process.
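The step-by-step onboarding described above can be sketched as a simple state machine that drives which indicator and which instructional copy are on screen. This is an illustrative Swift sketch; the step names and copy are hypothetical, not Build’s actual implementation:

```swift
// Hypothetical sketch of the onboarding steps described above.
enum OnboardingStep: Int {
    case openARView      // AR session starts, camera view appears
    case scanFloor       // "scanner" indicator searches for the floor plane
    case measureCeiling  // moving measuring stick finds the ceiling plane
    case placeModel      // target indicator confirms final 3D placement
    case done

    /// Instructional copy shown alongside the indicator for this step.
    var instruction: String {
        switch self {
        case .openARView:     return "Point your camera into the room."
        case .scanFloor:      return "Aim at the floor and move your phone slowly."
        case .measureCeiling: return "Tilt up until the stick reaches your ceiling."
        case .placeModel:     return "Tap the target to place the fixture."
        case .done:           return ""
        }
    }

    /// Advance to the next step, clamping at `.done`.
    func next() -> OnboardingStep {
        OnboardingStep(rawValue: rawValue + 1) ?? .done
    }
}
```

Keeping the flow in one enum made it easy for us to keep iterating on copy and indicator designs without touching the AR session code itself.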

Early sketch of features.

One, iterate relentlessly.

We designed and redesigned “indicator” elements, including a “scanner” for finding the floor plane, a moving measuring stick for determining the ceiling plane, and a target for final placement of the 3D model. We finessed and rewrote copy to clarify the meaning of each. We introduced animations showing the appropriate ways of angling the device and revised them according to feedback. The team grew accustomed to a constant back-and-forth between product manager, designer, and engineer to keep narrowing the gap between each new version and a perfectly seamless experience.

Designs for plane detection indicators.

Two, don’t be afraid to throw things away.

While prototyping the feature, we developed two different “modes”—automatic and manual—for users to detect the ceiling plane and place a 3D model of a light fixture. It was a resource-intensive decision to engineer both. But testing revealed that the automatic detection didn’t perform consistently, and it was cumbersome to switch between modes. We scrapped it and honed the manual detection mode exclusively.

Design for the original two-mode switch.

The Reality Promise: Showing models with real-life accuracy

Users expect and need AR visualizations to be accurate for viewing home improvement products. Otherwise, it’s no better than viewing a 2D picture of the product, and maybe worse, since photos are more performant. Consider this nightmare scenario: a user views the chandelier of their dreams in AR, trusting that the scale, color, texture, and other product details are rendered with real-life accuracy. They take the plunge, only to find with bitter disappointment that there’s some major discrepancy between AR and the real thing. Again, I found examples when reading reviews of other major AR features in the App Store:

“All the furniture it places is still hilariously gigantic and floats mid-air… If you try to move [items] they’re suddenly floating and either huge or tiny. How can I tell if something will look right in my house if the size is so off?”

Our team wanted to ensure that our promise of real-life fidelity was a true one for the AR feature. The main refrain of the product owner at Build was our responsibility to “bridge the touch-and-feel gap,” and we all rallied behind this goal. Again, we focused on two things to get it done.

Don’t cut corners when creating the 3D content.

The reality is that 3D model creation is still a time-consuming and highly skilled undertaking. We were lucky to have a true artist on our team. They optimized file sizes without sacrificing the complexity of each model’s polygonal geometry, even down to the detail of a light bulb’s filament. They set up the material structure manually, rather than using less time-consuming techniques, to leverage physically based rendering and achieve the most realistic light effects possible.
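In SceneKit terms, that manual setup amounts to choosing the physically based lighting model and assigning each texture slot by hand rather than relying on a converter’s defaults. A minimal sketch, with hypothetical texture asset names standing in for the real maps:

```swift
import SceneKit

// Minimal sketch of a manually authored physically based material.
// The texture file names are hypothetical placeholders.
let material = SCNMaterial()
material.lightingModel = .physicallyBased          // enable PBR shading
material.diffuse.contents   = "brass_albedo.png"   // base color
material.metalness.contents = "brass_metalness.png" // metal vs. dielectric response
material.roughness.contents = "brass_roughness.png" // micro-surface light scatter
material.normal.contents    = "brass_normal.png"   // fine detail, e.g. a filament coil
```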

It’s okay to sacrifice other conveniences to preserve and enhance realism.

This returns to our decision to nix automatic ceiling detection. It would have been optimal for the user to point their phone’s camera toward the ceiling and have our under-the-hood algorithm identify the correct plane without any manual measurement, but we dropped this capability because the algorithm would sometimes place the ceiling closer or farther than it actually was. The light fixtures would then be rendered larger or smaller, respectively, than their actual scale. We knew this would be a deal breaker for users looking to see the real thing. So, we agreed to prioritize realism of scale over the benefit of greater automation.
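The scale problem can be reasoned about with simple projection math: a fixture’s apparent on-screen size is roughly proportional to its physical size divided by its distance from the camera, so anchoring it to a misdetected ceiling plane skews its perceived scale by the ratio of the true and detected distances. A back-of-envelope Swift sketch (an illustration of the effect, not Build’s actual algorithm):

```swift
// Apparent on-screen size ~ physicalSize / distanceFromCamera, so a
// fixture anchored at `detectedCeilingDistance` instead of the real
// `trueCeilingDistance` looks off by this factor. A ratio above 1.0
// means the fixture appears larger than life.
func apparentScaleError(trueCeilingDistance: Double,
                        detectedCeilingDistance: Double) -> Double {
    precondition(detectedCeilingDistance > 0, "distance must be positive")
    return trueCeilingDistance / detectedCeilingDistance
}

// Ceiling detected 20% too close: fixture appears 25% too large.
let tooLarge = apparentScaleError(trueCeilingDistance: 2.5,
                                  detectedCeilingDistance: 2.0)
// Ceiling detected too far: fixture appears smaller than life.
let tooSmall = apparentScaleError(trueCeilingDistance: 2.5,
                                  detectedCeilingDistance: 3.0)
```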

Make it Fun: Allowing users to interact with 3D models

Real life isn’t static! I fully expect that as content creators, designers, and developers gain greater expertise working with AR, and as the technology itself continues to improve, we will see increasingly dynamic 3D content. Build is ahead of the game at this point. We made our AR feature fun and more realistic by allowing users to interact with the 3D models in various ways. Users can turn 3D light fixtures on and off, and extend or retract the chain length. With faucets, they can turn the water on and off, and adjust the temperature and spray pattern. Items from both product categories can be rotated, repositioned, and viewed with different finishes.

The example of empowering users to select different finishes is particularly illustrative. As an online-only business, Build is able to offer its customers an unparalleled selection of SKUs, with more finishes and configurations than could ever be held in a physical warehouse. By giving customers the capability to see all these variations, rather than just a swatch, Build also nullifies a former advantage of its brick-and-mortar competitors.
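These interactions boil down to plain view-model state that the AR scene renders on each change. An illustrative Swift sketch with hypothetical names and values, not Build’s production code:

```swift
// Hypothetical view model for an interactive AR light fixture.
struct LightFixtureViewModel {
    var isOn = false
    var chainLength = 0.5          // meters
    var finish = "Brushed Nickel"  // one of the available finish variations

    let chainRange = 0.3...1.5     // allowed chain extension, in meters

    /// Toggle the light; the scene would update the node's emission in response.
    mutating func toggleLight() { isOn.toggle() }

    /// Extend or retract the chain, clamped to the allowed range.
    mutating func adjustChain(by delta: Double) {
        chainLength = min(max(chainLength + delta, chainRange.lowerBound),
                          chainRange.upperBound)
    }
}
```

Keeping interaction state separate from the scene graph also makes each behavior easy to test without spinning up an AR session.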


I think back to the announcement of ARKit at WWDC and the brimming excitement among all the technologists at our organization. Our eagerness to start building and tinkering isn’t nearly enough, though. As with any other part of the mobile landscape, we need product leaders, design and UX experts, and talented content creators to make AR stick. Developing this feature allowed our team to move decisively in that direction. We went through countless iterations to optimize our UX. We allocated time for our content creators to bring a high level of artistry to the 3D models. We created something that’s performant, realistic, and fun. There is still ground to cover, particularly in scaling content creation and management, but when we get there, we will have transformed the digital marketplace into what feels like a tangible one.