An elderly man cradles a chihuahua like a newborn. A farmer splashes around in a swimming pool filled with eggs. A runaway bride evades police on a golf cart. An alien chugs beer at a house party.
This is a television commercial. It aired during Game 3 of the NBA Finals on ABC, one of the most expensive advertising slots in America. One person made it in 48 hours for $2,000.
The ad was for Kalshi, a prediction markets platform. It was created entirely with AI. It generated over 3 million views on X alone, sparked a global industry conversation, and outperformed campaigns that cost 200 times as much to produce.
It looked terrible. That was the point.
Something strange is happening in digital advertising. The worst-looking ads on the internet are starting to outperform the best-looking ones.
The industry calls the aesthetic “AI slop.”
Merriam-Webster named it their 2025 Word of the Year. The term originally described the flood of low-quality AI content on social media: bizarre images of Shrimp Jesus, sentimentalised veterans, and grotesque food combinations racking up millions of engagements on Facebook.
Most of that content is garbage. But a small cohort of marketers figured out how to use the same aesthetic with full self-awareness and creative intent, and their ads are converting.
At MHI Media we manage over £1 million a week in Meta ad spend for fashion and DTC brands. We produce 300+ ad variations per month. We test everything. So when AI slop ads started showing up in our feeds, we did what we always do: we tested the format.
This report is everything we’ve learned: the data behind why it works, the psychology of the scroll-stop, and the exact process we use to make them.
WHY MOST AI ADS FAIL (AND THESE DON’T)
Brands getting destroyed by AI backlash right now (Coca-Cola’s Christmas ads, McDonald’s Netherlands’ holiday campaign, Samsung’s Galaxy S26 promotions) all made the same mistake.
They tried to use AI as a cheap substitute for traditional production. They wanted the AI to look real. The audience caught them, and the betrayal triggered outrage.
AI slop ads that convert do the opposite. The AI-ness is the entire point. The ugliness isn’t a bug. It’s the creative strategy. Nobody feels deceived because nobody’s being tricked.
In November 2025, ad testing firm System1 and digital agency Jellyfish tested 18 AI-produced video ads against System1’s database of over 100,000 traditionally produced commercials. The AI ads didn’t just match the benchmarks. They beat them.
The finding was so counterintuitive that the lead researcher admitted he “didn’t have the guts to predict” it beforehand. Ads that viewers strongly recognised as AI-generated didn’t suffer in emotional response. The more obviously AI the aesthetic, the better the ad performed on emotional metrics.
Their theory: AI allows brands to create far bigger, more visually extreme creative than traditional production budgets allow. Extremity drives emotion, whether it’s extremely beautiful or extremely absurd. Emotion drives memory. Memory drives sales.
The caveat is important. When AI is used without creative intent, it fails. At Super Bowl LX in February 2026, 18% of ads featured AI messaging. They averaged just 2.1 Stars. The top performers that night were heartfelt, human-centred stories.
Intentional AI wins. Lazy AI loses. The creative direction matters more than the production method.
THE PSYCHOLOGY OF THE SCROLL-STOP
Consider what a social media feed actually looks like in 2026. The average person scrolls past hundreds of ads per day. Most look the same: clean product shots, aspirational lifestyle imagery, polished UGC with good lighting and a strong hook. DTC advertising conventions have become so standardised that the brain skips them automatically. Same phenomenon as banner blindness, just applied to an entire aesthetic category.
When something shows up that’s visually wrong (oversaturated, uncanny, clearly AI-generated, depicting something the brain can’t immediately categorise) it triggers a pattern break. The scroll stops. The brain needs a moment to process what it’s seeing. That moment is worth more than any amount of production value.
But pattern breaks alone don’t explain why people watch, engage, and share. The deeper mechanism is cognitive dissonance. Research into absurdist humour and brainrot culture shows that when the brain encounters illogical information, it tries to reconcile it and fails. That failure is jarring but deeply memorable. We remember nonsensical stimuli not because they make sense, but precisely because they don’t.
There’s a social layer too. The internet rewards self-awareness. When a brand deliberately makes something weird, and the audience can tell the brand is in on the joke, it creates shared understanding. The ugliness becomes an inside joke. Sharing it signals cultural fluency. The ad becomes a meme voluntarily.
For fashion and DTC brands this is especially powerful. Every DTC brand on Meta has the same clean photography, the same lifestyle content, the same influencer UGC formula. A scrolling consumer can’t tell one brand from another without reading the logo. AI slop breaks that sameness entirely. A hyper-saturated, deliberately uncanny AI video exists in a completely different visual register to everything else in the feed.
OUR EXACT PROCESS FOR MAKING AI SLOP ADS
The process is less chaotic than the output suggests. We’ve systematised the entire thing into a repeatable workflow.
It starts with a reference. The creative team pulls an existing piece of AI-generated content: someone else's AI generation, an animated character, or a stylised version of something real.
The key criteria: it needs to be unmistakably AI. The uncanny quality isn’t a limitation. It’s an aesthetic choice.
(One of our top reference videos for creating AI slop ads is embedded here in the original post.)
This reference goes into Gemini for master image generation. The team pushes lighting to high contrast, cranks saturation, and deliberately amplifies everything that makes it look like AI. This master image establishes the visual world of the entire ad.
The master image then goes to ChatGPT along with the script and brief. We use ChatGPT to design specific prompts for Google’s VEO video generator. The back-and-forth becomes an iterative process: testing prompts, adjusting for issues in VEO’s output, refining until each clip matches the visual style we’ve established.
Each clip gets generated individually in VEO with the Gemini master image attached as a reference frame. This maintains visual consistency throughout. All clips get downloaded and spliced together in Premiere Pro.
The assembled video goes through ElevenLabs for voice changing, giving it a strong, consistent voiceover that contrasts deliberately with the visual chaos. Then we add what the team calls “the dressing”: classic AI slop slide transitions, sound effects, whatever the reference or brief calls for.
Final step: subtitles get generated in Submagic, downloaded as green screen overlays, and composited onto each iteration in Premiere before final export.
At every stage, a human is making creative decisions. Which reference to use. How far to push the style. What’s funny versus what’s just bad. Where to place the product message. How to structure the hook. Creative judgement drives every step.
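Sketched as code, the workflow above is a linear pipeline with human judgement at each step. Every function name below is a hypothetical stand-in for a manual tool step (Gemini, ChatGPT, VEO, Premiere, ElevenLabs, Submagic), not a real API; this is a structural sketch, not an implementation.

```python
# Hypothetical sketch of the AI slop ad pipeline. No real APIs are
# called; each function represents a manual stage of the workflow.

def make_master_image(reference: str) -> str:
    # Reference -> over-saturated, high-contrast "master" image
    # that anchors the visual world of the whole ad.
    return f"master({reference})"

def write_veo_prompts(master: str, script: list[str]) -> list[str]:
    # One video prompt per scripted beat, each tied to the master image.
    return [f"{master} :: {beat}" for beat in script]

def generate_clips(prompts: list[str], takes_per_prompt: int = 15) -> list[str]:
    # Generate many takes per prompt; in practice you keep roughly
    # 1 in 10-20, and a human (not code) picks the best take.
    clips = []
    for i, _prompt in enumerate(prompts):
        takes = [f"clip_{i}_take_{t}" for t in range(takes_per_prompt)]
        clips.append(takes[0])  # stand-in for the human selection step
    return clips

def assemble(clips: list[str]) -> str:
    # Splice, voice, add "the dressing", subtitle, export.
    return " + ".join(clips) + " | voiceover | subtitles"

script = ["hook", "product reveal", "call to action"]
master = make_master_image("reference_video")
prompts = write_veo_prompts(master, script)
clips = generate_clips(prompts)
final = assemble(clips)
print(final)
```

The point of the sketch is the shape: one master image fans out into per-beat prompts, each clip is generated against the same reference frame, and everything converges in a single edit.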
TRY IT YOURSELF: THE 6-STEP AI SLOP AD SOP
You don’t need an agency to test this format. If you’re running paid social and want to see if AI slop works for your brand, you can run a first test with these six steps.
STEP 1: FIND YOUR REFERENCE
Pull an AI slop video or animation that made you stop scrolling. This sets the tone for everything. The more visually absurd, the better. Save it.
STEP 2: BUILD YOUR MASTER IMAGE IN GEMINI
Upload your reference to Gemini and generate a new image in the same style, but for your brand or product. Push the lighting to high contrast. Oversaturate it. Make it unmistakably AI. This image is the visual anchor for your whole ad.
STEP 3: PROMPT ENGINEER WITH CHATGPT
Attach your master image to a ChatGPT conversation. Give it the ad script or concept. Ask it to write specific prompts for VEO that match the visual style. Go back and forth. Iterate until the prompts describe exactly what you want to see on screen.
STEP 4: GENERATE CLIPS IN VEO
Paste your ChatGPT prompts into VEO one by one. Attach the Gemini master image as a reference frame for visual consistency. Generate each clip in your script individually. Expect to generate 10-20x more clips than you’ll actually use.
STEP 5: ASSEMBLE AND VOICE
Cut your best clips together in Premiere Pro (or CapCut). Run the assembled video through ElevenLabs for a consistent, strong voiceover. Add AI slop transitions, sound effects, and anything else that fits the brief.
STEP 6: SUBTITLE AND EXPORT
Upload to Submagic. Select your subtitle style. Download as green screen overlay. Composite onto your video in your editor. Export your iterations and launch.
That’s the whole workflow. The difference between a test that flops and one that performs usually comes down to two things: the quality of your reference (step 1) and how far you’re willing to push the absurdity (step 2). Most brands don’t go far enough. They try to make the AI look “good.” That defeats the purpose.
WANT US TO BUILD THIS FOR YOUR BRAND?
We’ll audit your ad account, show you the exact AI slop formats working right now for brands in your niche, and build a specific creative brief tailored to your products.
30 minutes. Free. No pitch deck.
We do this for fashion and DTC brands spending £20k+ per month on Meta. If that’s you, book a slot below.
MHI Media is a UK-based performance marketing agency founded by Kamal Razzak. We specialise in scaling fashion and DTC brands through Meta advertising. £1M+ in weekly ad spend under management. 50+ brands scaled to 7-8 figures. 300+ ad variations produced per month. AI slop ads are the latest format in our creative testing system, alongside founder ads, UGC-style content, and other performance creative.

