Updated on July 24 at 9:30 a.m. ET: Mashable's Tech Editor Timothy Beck Werth initially tried the beta version of the Google Shopping "Try it on" feature in May, back when it first became available for testing. As of this writing, Google is rolling the feature out to all users in the US on desktop and mobile devices. You can try this virtual Clueless closet for yourself inside Google Shopping now; just click on an apparel product and look for the "Try it on" button.
At Google I/O 2025, the tech company announced a ton of new AI features, and one of the most interesting is a virtual clothing try-on tool.
The Google Shopping "Try it on" feature lets users upload a photo of themselves and then virtually try on clothes, basically the IRL version of the Clueless closet millennials have been dreaming about since 1995. Or, as Mashable Shopping Reporter Haley Henschel put it, "Google's latest shopping feature makes Cher Horowitz's computerized closet a reality."
Almost as soon as the feature launched, users started trying to "jailbreak" the tool, which is becoming a fun little tradition for tech writers whenever a new AI model or tool is released. On Friday, The Atlantic reported that "Google's new AI shopping tool appears eager to give J.D. Vance breasts." Hilarious, right? What's less hilarious: the same tool will also generate breasts for photos of underage users, again per The Atlantic.
I decided to give the "Try it on" feature a test spin, and I'll explore the good, the bad, and the mortifying below. As a shopping tool, I have to say I'm impressed.
How to use Google's "Try it on" AI shopping tool
The virtual try-on feature is one of the free AI tools Google released this week, and users can sign up to try it now. Officially, the product is part of Google Labs, where users can test experimental AI tools. Signing up is simple:
- Sign in to your Google account
- Head to Search Labs and click to turn the experiment on
- Take a full-body photo of yourself and upload it
- Navigate to Google Shopping and click on a product you want to "try on"
- Look for the "Try it on" button over the product image
The "Try it on" button appears over the product image.
Credit: Screenshot courtesy of Google
As a fashion tool, Google's "Try it on" feature really works
Purely as a tool for trying on clothes, the new virtual try-on experience is pretty damn impressive. The tool uses a custom image generation model trained for fashion, per Google.
I'm always skeptical of new AI tools until I've tried them myself. I also care about my personal style and consider myself up to date on men's fashion trends, so I wasn't sure what to expect here. Still, the tool does work as advertised. In a flashy I/O presentation, Google showed models seamlessly trying on one outfit after the next, and while the actual tool is a little slower (it takes about 15 seconds to generate an image), the real product experience is similar to the demo.
To show you what I mean, let's compare some selfies I recently took on a trip to Banana Republic here in New York City to the AI images Google generated for the same clothes. For reference, here's the original photo I uploaded (and remember that I'm a Tech Editor, not a fashion model):

The photo I used to virtually try on clothes.
Credit: Timothy Beck Werth / Mashable
In this first photo, I'm wearing a blue cashmere polo, and the AI image looks more or less like the real one taken in the Banana Republic dressing room:

Trying on a blue polo…
Credit: Timothy Beck Werth / Mashable

And here's how Google imagined the same shirt. AI-generated image.
Credit: Timothy Beck Werth / Mashable
I found the AI shopping tool came pretty close to capturing the overall fit and style of the shirts. It even changed my pants and shoes to better match the product. If anything, the virtual try-on tool errs on the side of making me slimmer than I am IRL.

I ended up buying this one.
Credit: Timothy Beck Werth / Mashable

AI-generated image.
Credit: Timothy Beck Werth / Mashable

Yeah, I bought this one, too.
Credit: Timothy Beck Werth / Mashable

AI-generated image.
Credit: Timothy Beck Werth / Mashable
In this photo, Google added a necklace around my neck that I would never wear in real life, and the AI-generated shirt is a bit more slim-cut than it's supposed to be, but generally the overall style is accurate.

I decided this isn't my style.
Credit: Timothy Beck Werth / Mashable

Neither are the imaginary necklace, watch, and matching white sneakers.
Credit: Timothy Beck Werth / Mashable
While the images are generating, you see a message that says: "AI images may include errors. Fit and look won't be exact."
But for an experimental tool, it's surprisingly on point. People have been hoping for a tool like this for decades, and thanks to the age of artificial intelligence, we finally have one.
Of course, not all of the mistakes made by this tool are so flattering…
Google also removed my shirt and imagined my chest hair
Here's where things get interesting. In The Atlantic piece I mentioned earlier, the authors found that if you asked the tool to generate an image of a revealing dress or top, it would sometimes generate or enhance breasts in the original image. That's particularly likely to happen with women's clothing, for reasons that should be obvious.
When I used this tool with a pink midi dress, the results were mortifyingly accurate. I bet that's pretty much exactly what I'd look like wearing that particular low-cut midi dress.
I'll spare you the actual image, but to imagine me in the dress, Google had to digitally remove most of my shirt and picture me with chest hair. Again, I'm surprised by how accurate the results were. Now, when I "tried on" a pink women's sweater, Google did give me some extra padding in the breast area, but I've also been open about the fact that that's not entirely Google's fault in my case. Thankfully, this feature was not available for lingerie.
What can Google do about these problems? I'm not sure. Men have every right to wear cute pink midi dresses, and Google can hardly prohibit users from choosing cross-gender clothing. I wouldn't be surprised if Google eventually removes the tool from any product that shows too much skin. While The Atlantic criticizes Google for altering photos of them when they were underage, they were the ones who uploaded the photos, in violation of Google's own safety policies. And I suspect the offending results would be much the same with virtually any AI image generator.
In a statement to Mashable, a Google spokesperson said, "We have strong protections, including blocking sensitive apparel categories and preventing the upload of images of clearly identifiable minors. As with all image generation, it won't always get it right, and we'll continue to improve the experience in Labs."
Could people abuse the virtual try-on tool to cyberbully their peers or create deepfakes of celebrities? Theoretically, yes. But that's a problem inherent to AI in general, not this specific tool.
In its safety guidelines for this product, Google bans two categories of images, in addition to its standard AI content guidelines:
- "Adult-oriented content, child sexual abuse imagery, non-consensual sexual content, and sexually explicit content."
- "Inappropriate content such as dangerous, derogatory, or shocking."
Again, you can try out this tool at Google Search Labs.