The Next Product Shelf Is Your Visual Field
Research on recommendation interfaces in immersive environments shows that spatially highlighting products in a shopper's visual field outperforms list-based presentation. The next product discovery surface may not be a search bar or chatbot but a spatial overlay tied to the room the shopper is standing in.
Neritus Vale
Recommendation engines have spent two decades refining the ranked list. Close to a decade of research on spatial computing interfaces suggests the list may be the wrong surface. When a recommended item appears in spatial context rather than a ranked sidebar, shoppers evaluate it more favourably and act on it more readily. If that pattern holds at scale, the next product discovery surface will not be a search bar or a chatbot but a spatial overlay tied to physical context.
The evidence base runs deeper than one experiment. In 2018, researchers used a Microsoft HoloLens to place recommended products in the physical spaces where they would be used, varying recommendation quality across three interface types; the results suggested that evaluating a product in the space where it will live is a different cognitive act than evaluating it in a ranked grid. At SIGIR 2023, researchers showed that aligning a VR shopping environment with search context significantly shaped how eighteen participants experienced product results. The ARShopping prototype, presented at IEEE VIS 2022, proposed overlaying comparison data directly onto physical products in-store, targeting the decision friction that arises when shoppers cross-reference a phone screen with a shelf. Across different devices, sample sizes, and research groups, the pattern holds: spatial context changes what the shopper sees, and what the shopper sees changes the decision.
A ranked list asks the shopper to imagine the product in context; a spatial overlay removes the need to imagine.
The hardware that makes spatial overlays practical is shipping faster than the interface conventions they demand. Smart glasses shipments rose 110 per cent year-on-year in the first half of 2025, driven by Meta’s Ray-Ban line and a wave of AI-enabled competitors. That growth rate signals the transition from novelty to utility: when glasses ship with heads-up displays and real-time object recognition, spatial overlays stop being lab demonstrations and start being distribution channels. Shopify data shows shoppers are 65 per cent more likely to place an order after interacting with AR product views than after viewing flat images on a standard screen. The conversion data comes from screen-based AR, which still requires a phone or tablet; the studies above test what happens when the screen drops away entirely. Current AR conversion gains likely mark the lower bound for spatial interfaces that eliminate the device as a visible intermediary.
The competitive question is who owns the spatial layer. Meta opened pre-orders for prescription-optimised AI frames in March 2026, with retail availability from April 14, extending its Ray-Ban line’s existing object recognition to prescription wearers — the first consumer hardware that could plausibly render spatial recommendations at the point of purchase. Apple drew Mytheresa and J.Crew to build immersive shopping apps for visionOS within months of Vision Pro’s launch, establishing retail as a spatial commerce category. Shopify has begun embedding AI-powered discovery into its platform but has not yet extended it to spatial interfaces. Google’s search index feeds most product discovery today, yet its investment in spatial computing remains peripheral. None of the four has built a spatial recommendation engine as a core product, and the first to do so sets the rules for a surface with no established auction model and no standard ranking signal.
The strongest objection: eighteen participants in a controlled VR environment do not predict behaviour in the visual clutter of a physical retail floor, and consumers have shown limited willingness to wear headsets for everyday shopping. This counter holds if hardware adoption stays niche and spatial recommendation demands a device most people refuse to carry. But the form factor objection has a shelf life. Each generation of smart glasses moves closer to conventional eyewear in weight, price, and appearance; Meta’s prescription AI frames are the latest step in that convergence. If the device disappears into the frame, the adoption problem dissolves into a software problem, and the software problem is where the value accrues. The strongest version of the counter is not that spatial recommendation fails but that it arrives before the installed base can exploit it.
For fashion retail, the stakes are specific. Style is a spatial judgement: a shopper evaluates a jacket in a room, on a body, next to what she already owns. A recommender that places that jacket in her visual field — sized correctly, contextualised against her wardrobe — answers a question that no ranked list can pose. The models powering today’s personalisation engines were built to output a sorted array of product IDs for a scrollable screen; spatial commerce demands a different output, a placement decision about which item to illuminate at which position in the shopper’s environment. If the research trend holds, the most valuable position in retail will not be the top of a search result but the overlay that appears when a shopper looks at her closet. That is a different kind of real estate, and none of the incumbents own it yet.
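The shift in output contract can be made concrete. The sketch below is purely illustrative, not any vendor's API: the function names, the `Placement` fields, and the anchor format are all hypothetical, meant only to show that a ranked-list engine returns a sorted array of product IDs while a spatial engine must return placement decisions — which item, attached to which detected object, at what position and size.

```python
from dataclasses import dataclass

def ranked_list_recommender(scores: dict[str, float], k: int = 3) -> list[str]:
    """Classic contract: return a sorted array of product IDs for a scrollable screen."""
    return sorted(scores, key=scores.get, reverse=True)[:k]

@dataclass
class Placement:
    """Hypothetical spatial contract: a placement decision, not a rank position."""
    product_id: str
    anchor: str                             # detected object to attach to, e.g. "closet_door"
    position_m: tuple[float, float, float]  # offset from the anchor, in metres
    scale: float                            # render size relative to true product dimensions

def spatial_recommender(scores: dict[str, float],
                        anchors: dict[str, tuple[float, float, float]]) -> list[Placement]:
    """Assign the top-scored items to the anchors detected in the shopper's environment."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    return [
        Placement(product_id=pid, anchor=name, position_m=pos, scale=1.0)
        for pid, (name, pos) in zip(ranked, anchors.items())
    ]

# Toy data: relevance scores and two anchors detected in the room.
scores = {"jacket_a": 0.91, "jacket_b": 0.84, "scarf_c": 0.62}
anchors = {"closet_door": (0.4, 1.2, 0.0), "mirror": (-0.2, 1.5, 0.1)}

print(ranked_list_recommender(scores))       # same model score, list-shaped output
for p in spatial_recommender(scores, anchors):
    print(p.product_id, "->", p.anchor)      # same score, placement-shaped output
```

The point of the sketch is that both functions consume identical relevance scores; only the second commits to a position in the room, which is the extra decision the paragraph above describes.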