Next-generation AI technology promises to level up your shopping experience
What if you could unearth the most exclusive fashion finds... without even setting foot in a physical store?
When Prada sent oversized bowling bags down the men’s autumn/winter ’25/’26 runway – one in a jammy shade of wine, another in a luscious tri-tone of cognac, chocolate and cream – it set an early 00s memory whirring. That piping, the 70s curve of their tops and the deliberately worn patina were straight from Prada’s spring/summer 2000 show, one I hadn’t thought had made any special impact at the time, but which had sealed a latent desire deep in my fashion memory. Suddenly, I wanted to find one. In black with piping. Not yet armed with the correct season but prepared for nights of serious sleuthing on the internet, I entered the search term ‘Prada Bowling Bag early 2000s’. One similar style appeared, but alas, a re-edition from resort 2020 rather than the original. It was followed by another, this time in a sickly shade of green with white accents. Then a thought: what if AI, equipped with pattern recognition and the power to process vast amounts of data, could do the looking for me?
One right-click, several minutes of price comparison and a few days of watching items on eBay and Vestiaire Collective (check-ins that lasted only seconds each) later, the piece – an original spring/summer ’00 Prada bag in black with white piping and a silver padlock, the same style that swung from the arm of a young Gisele Bündchen in oxblood hotpants and patent pumps on the runway – was mine. And at a saving of $650 on the first version I’d found.
I had used Google Lens, one of many AI-powered tools making the shopping experience more efficient. Loading an image into the search bar or right-clicking on an existing one pulls up multiple similar images with shoppable links, acting as a fashion super-sourcer in an internet marketplace that holds more data and options than a single person could parse in a lifetime. While AI has long played a role in our shopping experience, from Alexa writing our shopping lists to AI-scribed descriptions on a seller’s eBay listing, in 2025 it is becoming increasingly sophisticated – and in demand. The Business of Fashion’s The State of Fashion 2025 report found 82 per cent of shoppers want AI to help reduce the time they spend researching what to buy. The possibilities for the fashion nerd and the time-poor (and those who are both) are even more exciting.
“AI is already transforming every layer of shopping, from virtual try-ons and personalised storefronts to supply chain optimisation and predictive inventory,” says Vivek Wadhwa, tech entrepreneur, academic and co-author of The Driver in the Driverless Car. Its current uses range from helping stores manage their stock flow to making recommendations more accurate.
Virtual try-ons, which helped me decide on and secure a current and sought-after pair of Miu Miu glasses online before they’d sold out, and chatbots that tell you when something’s next in stock, are now commonplace. In two years, Google Lens has gone from the fashion-insider search function of choice – one that doesn’t require words – to a mainstream tool, clocking 25 billion visual searches each month in 2025.
“Lens can be a personal showroom, helping people discover and follow what catches their eye and shop the world around them,” says Lilian Rincon, VP of product at Google Shopping. Users can snap photos of what they see in front of them – a shoe on a person in the street, say – or search from saved images. Currently, one in four Lens searches has commercial intent, and the number of people shopping on Lens grew 10 per cent in the first quarter of this year.
“Lens can offer people a real sense of delight and convenience when they can find exactly what they are looking for,” Rincon continues. “A few months ago, for example, I wanted to repurchase an eyeshadow and Lens was able to find it despite the label having rubbed off. That felt like magic.”
Beneath all of this, though, more AI is invisibly at work, improving the clicking and browsing experience by synthesising and analysing what is saved to a wish list and which searches are returned to time and again. Data pulled includes “clickstreams, purchase history, social media engagement and visual data from platforms like Instagram or Pinterest”, says Wadhwa.
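The mechanics are less mysterious than they sound. In highly simplified terms – and with invented weights and data rather than any retailer’s actual code – a recommender blends those signals into preference scores and uses them to rerank what you’re shown next, something like this sketch:

```python
# Toy illustration only: invented signal weights and shopper history,
# not any platform's real model. The idea is simply that stronger signals
# (a purchase, a wish-list save) count for more than a passing click.
from collections import Counter

SIGNAL_WEIGHTS = {"click": 1.0, "wishlist_save": 3.0, "purchase": 5.0}  # hypothetical

def brand_preferences(events):
    """events: (signal_type, brand) pairs from a shopper's history."""
    scores = Counter()
    for signal, brand in events:
        scores[brand] += SIGNAL_WEIGHTS.get(signal, 0.0)
    total = sum(scores.values()) or 1.0
    return {brand: score / total for brand, score in scores.items()}

def rank_items(candidates, prefs):
    """Order candidate items by how strongly their brand matches learned taste."""
    return sorted(candidates, key=lambda item: prefs.get(item["brand"], 0.0), reverse=True)

history = [("click", "Prada"), ("wishlist_save", "Prada"), ("purchase", "Miu Miu")]
print(rank_items([{"name": "bowling bag", "brand": "Prada"},
                  {"name": "ballet flat", "brand": "Miu Miu"}],
                 brand_preferences(history)))
```

The real systems are vastly larger, but the principle – weight the strong signals, then rank by learned taste – is the same.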
Meanwhile, a revamped version of Google Shopping debuted in the US last year and is now being tested in Australia. “We aim to show shoppers products, brands and merchants that they’re likely to love,” says Rincon of the platform. “To do this, we work to understand your preferences based on what products you’ve interacted with on Google Search in the past.” The new Google Shopping allows for hyper-specific recommendations that draw on user reviews and take learned preferences, such as favourite brands and categories, into account. “Personalisation must be central to shopping innovation, not just in the information presented, but also in how shoppers can interact with it,” continues Rincon.
Late last year, Vestiaire Collective announced that advanced AI tools would power its search function. Rather than relying on sellers’ sometimes inaccurate descriptions (would a seller know to describe my bag as ‘early 2000s Prada’?), AI would translate key search terms into image pattern recognition to serve “highly accurate” suggestions – aka, things we might actually be into. The company says the number of products sold through its ‘similar items’ suggestions has doubled since the technology was introduced.
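Vestiaire Collective hasn’t published the details, but the general technique behind ‘similar items’ matching is well established: represent each listing photo as a vector using an image model, then rank by visual similarity, so a bag a seller never thought to label ‘early 2000s Prada’ still surfaces because it looks right. A toy sketch, with invented numbers standing in for real image embeddings:

```python
# Illustrative sketch only: the embeddings below are made up. In practice
# they would be produced by a vision model run over the listing photos.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

catalogue = {  # hypothetical listing_id -> image embedding
    "black_bowling_bag_ss00": np.array([0.92, 0.10, 0.31]),
    "green_reedition_2020":   np.array([0.40, 0.88, 0.05]),
    "black_piped_bag":        np.array([0.90, 0.12, 0.35]),
}

def similar_items(query_embedding, catalogue, top_k=2):
    """Return the listings whose photos look most like the query image."""
    scored = [(lid, cosine_similarity(query_embedding, vec)) for lid, vec in catalogue.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]

query = np.array([0.91, 0.11, 0.33])  # embedding of the shopper's reference photo
print(similar_items(query, catalogue))
```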
Wadhwa sees the potential to democratise fashion by giving tailored recommendations to consumers who lack access to stylists or knowledge of where to buy something they want, while reducing decision fatigue and increasing confidence in style choices. But, he cautions, there are pitfalls inherent to machine learning.
“If AI only feeds us what we already like, it risks putting our creativity on autopilot,” he points out. “It narrows our exposure, suppresses serendipity and conditions us to stay in safe zones. True self-expression comes from surprise, discomfort and discovery – not repetition. AI is trained on past data, not cultural zeitgeist. Fashion lives in the now – it’s about subtext, rebellion, context.”
Which means that ‘recommended for you’ may not contain an unexpected statement shoe that pushes personal taste forward. Who wants to be known for totally predictable style? “We often don’t know why we make the choices we do,” Wadhwa continues. “AI can’t yet decode irony, nostalgia or the layered signals trendsetters emit.”
Monisha Klar, director of fashion at WGSN, however, feels AI can support an evolution of taste. WGSN launched an AI-driven trend-prediction platform meant to guide decisions for buyers – the people who decide what goes into boutiques. “We use our AI to synthesise what we know to be the greatest predictors of trend and project them two years into the future,” she says. “Through picking up early signals and movements in our e-commerce, social and catwalks data, we’re able to predict trends with 90 per cent accuracy. From the consumer vantage point, buying this way leads to more robust assortments. Traditionally, buyers are wedded to their historical information alone, which can lead to overreliance on bestsellers.” Translation: it helps to avoid same-same clothes.
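WGSN keeps the workings of its platform to itself, but the core idea of projecting an early signal forward can be sketched very simply: count how often a style appears across (here, invented) e-commerce, social and catwalk feeds, fit a trend line and extrapolate it two years out.

```python
# Toy illustration with invented monthly counts, not WGSN's methodology.
import numpy as np

months = np.arange(12)  # the past year
mentions = np.array([40, 42, 45, 50, 48, 55, 61, 64, 70, 75, 83, 90])  # hypothetical signal counts

slope, intercept = np.polyfit(months, mentions, 1)  # least-squares trend line

def project(months_ahead):
    """Extrapolate the fitted trend beyond the last observed month."""
    return slope * (months[-1] + months_ahead) + intercept

print(round(project(24)))  # rough signal level two years from now
```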
The technology also allows buyers to spend more time on creative and strategic thinking by giving them back the hours previously spent on data analysis.
As a consumer, however, knowing what we like is nebulous, shaped by the myriad intricacies that define personal taste. Do we really want a computer to serve us an approximation of this? “AI can track hashtags, runway shows and influencer posts, but it can’t read between the lines,” says Wadhwa. “A sophisticated fashion consumer senses shifts before they happen. AI can’t replicate that instinct.”
Klar is realistic about AI’s limitations. “We have an enormous amount of respect for buyers as a profession because we are former buyers,” she says. “So, while we utilise AI as sheer computing power and an aid to synthesising findings much more quickly, we still ultimately believe the buyer is the true expert on their individual consumer.” She describes software that can automatically re-order clothes for buyers but may not be able to tell when they are out of season. In this way, AI should be viewed as a tool. Real taste still needs human input.
“It’s for that reason the concerns about a ‘sea of sameness’ or consumers losing their voice in the process don’t concern us when we approach AI in this way,” Klar continues.
Like an archive tool, AI can also aggregate historical collections and brand histories, creating an easy way to access inspiration. “AI here really offers an opportunity to document and catalogue all of this to be preserved in the future,” says Klar.
Its next iterations will be designed to address both consumer pain points and the idea that we want to be inspired by newness when we shop. “Things like agentic AI, improvements to personalisation and tapping into new forms of inspiration will all be areas at the forefront of shopping innovation,” says Rincon, referring to ‘agents’ that will have the autonomous capability to purchase on your behalf. You could give one a budget and say, “Buy me a new wardrobe for a winter weekend away with my girlfriends, with a dressed-up lunch and downtime.” Amazon’s Rufus is imagined this way; for now, you can ask it questions about the best clothing for certain climates, materiality and size. Similarly, Perplexity can book a trip for users, using a conversational style to ascertain what you want. All of this would require a lot of trust and “systems that understand our preferences, but also our context, mood and evolving identity”, says Wadhwa. It would also require an AI model that could show us something we didn’t know we wanted yet, with a high hit rate. “Companies are experimenting with exploratory models that intentionally introduce novelty within the bounds of a user’s taste,” says Wadhwa. “The goal is inspiration, not error.”
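No shopping agent works quite like this yet, but the budget-constrained step such a system would need can be sketched as a toy ‘agent’ that scores hypothetical candidate pieces against the brief and keeps buying the best value until the money runs out:

```python
# Toy illustration only: items, prices and 'fit' scores are invented, and a
# real agent would do far more (sizes, returns, retailer checkout) than this.
BUDGET = 1500.0

candidates = [
    {"name": "wool coat",   "price": 600.0, "fit": 0.9},  # fit = how well it matches the brief
    {"name": "knit dress",  "price": 350.0, "fit": 0.8},
    {"name": "ankle boots", "price": 400.0, "fit": 0.7},
    {"name": "silk scarf",  "price": 120.0, "fit": 0.5},
    {"name": "sequin top",  "price": 300.0, "fit": 0.6},
]

def build_wardrobe(items, budget):
    """Greedily add the best value-for-money pieces until the budget runs out."""
    chosen, spent = [], 0.0
    for item in sorted(items, key=lambda i: i["fit"] / i["price"], reverse=True):
        if spent + item["price"] <= budget:
            chosen.append(item["name"])
            spent += item["price"]
    return chosen, spent

print(build_wardrobe(candidates, BUDGET))
```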
This is still a little way off. “It’s still primitive,” he notes. “Most systems operate like advanced filters, not autonomous buyers. They can assist with replenishment or impulse buys, but interpreting complex human motivations, like why we fall in love with a dress, is far beyond their grasp.” It’s why, he says, we have to build systems that are sensitive to, and respectful of, human values like self-expression and diversity, “or risk losing what makes us unique”. He is, however, excited by what’s next. “The boundary between intent and action is disappearing – shopping will soon feel like thought to fulfilment.”
One that starts with ‘a rare Prada bag I once loved’, and, within hours, ends with a computer finding, buying and having the same bag delivered to your door. Giving you even more time to think about the next one.
This story is from the June issue of Vogue Australia.