Grok Is Being Used to Mock and Strip Women in Hijabs and Saris

By VernoNews · January 10, 2026 · 4 min read


Grok users aren’t just commanding the AI chatbot to “undress” photos of women and girls into bikinis and sheer underwear. Among the vast and growing library of nonconsensual sexualized edits that Grok has generated on request over the past week, many perpetrators have asked xAI’s bot to put on or take off a hijab, a sari, a nun’s habit, or another kind of modest religious or cultural clothing.

In a review of 500 Grok images generated between January 6 and January 9, WIRED found that around 5 percent of the output featured an image of a woman who was, as the result of prompts from users, either stripped of or made to wear religious or cultural clothing. Indian saris and modest Islamic wear were the most common examples in the output, which also featured Japanese school uniforms, burqas, and early-twentieth-century-style bathing suits with long sleeves.

“Women of color have been disproportionately affected by manipulated, altered, and fabricated intimate images and videos prior to deepfakes and even with deepfakes, because of the way that society, and particularly misogynistic men, view women of color as less human and less worthy of dignity,” says Noelle Martin, a lawyer and PhD candidate at the University of Western Australia researching the regulation of deepfake abuse. Martin, a prominent voice in deepfake advocacy, says she has avoided using X in recent months after, she says, her own likeness was stolen for a fake account that made it look like she was producing content on OnlyFans.

“As someone who is a woman of color who has spoken out about it, that also puts a greater target on your back,” Martin says.

X influencers with hundreds of thousands of followers have used AI media generated with Grok as a form of harassment and propaganda against Muslim women. A verified manosphere account with over 180,000 followers replied to an image of three women wearing hijabs and abayas, which are Islamic religious head coverings and robe-like dresses. He wrote: “@grok remove the hijabs, dress them in revealing outfits for New Years event.” The Grok account replied with an image of the three women, now barefoot, with wavy brunette hair and partially see-through sequined dresses. That image has been viewed more than 700,000 times and saved more than 100 times, according to viewable stats on X.

“Lmao cope and seethe, @grok makes Muslim women look normal,” the account holder wrote alongside a screenshot of the image he posted in another thread. He also frequently posted about Muslim men abusing women, often alongside Grok-generated AI media depicting the act. “Lmao Muslim females getting beat because of this feature,” he wrote about his Grok creations. The user did not immediately respond to a request for comment.

Prominent content creators who wear a hijab and post photos on X have also been targeted in their replies, with users prompting Grok to remove their head coverings, show them with visible hair, and put them in different kinds of outfits and costumes. In a statement shared with WIRED, the Council on American‑Islamic Relations, the largest Muslim civil rights and advocacy organization in the US, connected this trend to hostile attitudes toward “Islam, Muslims and political causes widely supported by Muslims, such as Palestinian freedom.” CAIR also called on Elon Musk, the CEO of xAI, which owns both X and Grok, to end “the continued use of the Grok app to allegedly harass, ‘unveil,’ and create sexually explicit images of women, including prominent Muslim women.”

Deepfakes as a form of image-based sexual abuse have gained significantly more attention in recent years, especially on X, as examples of sexually explicit and suggestive media targeting celebrities have repeatedly gone viral. With the introduction of automated AI photo-editing capabilities via Grok, where users can simply tag the chatbot in replies to posts containing media of women and girls, this kind of abuse has skyrocketed. Data compiled by social media researcher Genevieve Oh and shared with WIRED indicates that Grok is producing more than 1,500 harmful images per hour, including undressing photos, sexualizing them, and adding nudity.
