Google’s announcement at Google I/O 2025 introduces a new virtual try-on feature, allowing U.S. users to upload personal photos for a tailored clothing visualization experience. This survey note delves into the background, recent developments, technical aspects, and potential implications, ensuring a thorough understanding for stakeholders in e-commerce and technology.
Google has been innovating in AI-driven shopping experiences since 2023, initially launching a virtual try-on feature that displayed clothing on diverse models (XXS to XXXL, various ethnicities, skin tones), powered by generative AI to replicate in-store fitting rooms online. This was detailed in a June 2023 Google blog post and expanded to men’s tops and dresses by September 2024, as noted in TechCrunch. The latest update, announced on May 20, 2025, via a Google blog post, shifts to a user-centric model where individuals can upload their own full-length photos, aligning with broader AI advancements in shopping.
The new feature, part of Google’s AI Mode shopping experience, enables users to try on billions of apparel listings, including shirts, pants, skirts, and dresses, by uploading a photo. According to the blog post, it leverages the Shopping Graph, now holding over 50 billion product listings, refreshed hourly with more than 2 billion updates. Users can access it by opting into Search Labs, selecting the “try it on” button next to eligible items, and viewing results instantly, with options to save or share the generated images. The rollout began on May 20, 2025, exclusively for U.S. users, as confirmed in recent articles like Engadget and TechCrunch. This experimental phase suggests potential for future expansion, with user feedback likely shaping refinements.
The technology behind this feature is a custom image generation model for fashion, designed to understand human body shapes and clothing behavior, such as how materials fold, stretch, and drape. While specific technical papers are not publicly available, insights from previous implementations suggest it uses diffusion models, as mentioned in a 2023 Google blog post. For user photos, it likely involves pose estimation, body shape analysis, and AI-driven rendering to create photorealistic images, similar to techniques described in Vogue Business. The process requires a full-length photo, with guidelines likely including good lighting and fitted clothing, inferred from initial announcements.
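The stages described above (pose estimation, body-shape analysis, and generative rendering) can be sketched as a simple pipeline. The sketch below is purely illustrative: the function names, data structures, and stub logic are assumptions, since Google has not published its implementation, and the real rendering step would be a diffusion model rather than the placeholder used here.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical structures; Google's actual pipeline is not public.

@dataclass
class BodyModel:
    keypoints: List[Tuple[float, float]]  # estimated joint positions (x, y)
    shape_params: List[float]             # coarse body-shape coefficients

def estimate_pose(photo_pixels) -> List[Tuple[float, float]]:
    """Stub pose estimator: a real system would run a keypoint-detection
    network over the full-length photo."""
    h, w = len(photo_pixels), len(photo_pixels[0])
    # Fake keypoints at fixed fractions of the image, for illustration only.
    return [(w * 0.5, h * f) for f in (0.1, 0.3, 0.5, 0.7, 0.9)]

def fit_body_shape(keypoints: List[Tuple[float, float]]) -> BodyModel:
    """Stub shape fit: derives a crude proportion estimate from the
    vertical spread of the keypoints."""
    ys = [y for _, y in keypoints]
    return BodyModel(keypoints=keypoints, shape_params=[max(ys) - min(ys)])

def render_tryon(body: BodyModel, garment_id: str) -> dict:
    """Placeholder for the generative step: a diffusion model conditioned
    on the body model would synthesize how the garment folds and drapes."""
    return {"garment": garment_id, "conditioned_on": len(body.keypoints)}

def tryon_pipeline(photo_pixels, garment_id: str) -> dict:
    """End-to-end sketch: photo -> pose -> body model -> rendered try-on."""
    keypoints = estimate_pose(photo_pixels)
    body = fit_body_shape(keypoints)
    return render_tryon(body, garment_id)
```

The separation into stages reflects the general structure of published virtual try-on research (person representation first, garment synthesis second), not a confirmed description of Google's system.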
The driving force is Google, with key figures like Lilian Rincon, Senior Director of Consumer Shopping Product, mentioned in announcements. The development involves Google’s shopping and AI teams, leveraging the Shopping Graph’s infrastructure, which integrates data from global and local retailers.
As of May 21, 2025, no major controversies are reported, but potential concerns include privacy, accuracy, and inclusivity. Privacy is significant, given that users upload personal photos. While Google’s AR beauty try-on feature, detailed in Manufacturer Center Help, states that no biometric data is collected or stored, specific policies for clothing try-on are unclear and are governed by Google’s general privacy policy. It seems likely that similar on-device processing is used, but confirmation is needed. Accuracy concerns involve how realistically the AI depicts clothing on diverse body types, which could affect user trust. Inclusivity is another area, with the need to ensure the AI model represents all demographics fairly, avoiding biases seen in other AI applications. These aspects warrant monitoring as user adoption grows.
Initial reactions on X, seen in replies to Google’s announcement, are overwhelmingly positive, with users expressing excitement: “Omg this is for me only” and “This is what I wanted ever since I saw it in Clueless.” Humorous comments, such as “virtual clothes? nice, but what if the shirt tries to take over my mind?”, suggest engagement, though substantive reviews are limited given the recent launch.
Compared to the 2023–2024 versions, which focused on model-based try-ons, this update personalizes the experience by using the shopper’s own photo. Earlier features, as detailed in Retail Dive, supported selecting models from XXS to 4XL, addressing diversity. The new feature extends this by tailoring results to individual body shapes, potentially increasing buying confidence, as noted in Google Merchant Center Help.