The AI landscape doesn't move in one direction — it lurches. Some techniques leap from experiment to table stakes in a single quarter; others stall against regulatory walls, technical ceilings, or organisational inertia that no amount of hype can dislodge. Knowing which is which is the hard part. The State of Play cuts through the noise with a rigorously maintained index of AI techniques across every major business domain — classified by maturity, evidenced by real-world adoption, and updated daily so you always know where you stand relative to the field. Stop guessing. Start knowing.
A daily newsletter distilling the past two weeks of movement in one or two domains, delivered to your inbox while the index updates in the background.
Each dot marks the weighted maturity of practices within a domain — hover for a brief summary, click for more detail
AI that generates ad creative variants and tests them for performance, iterating toward higher-performing combinations. Includes automated A/B creative testing and variant generation; distinct from image generation in creative media which produces general imagery rather than performance-optimised ads.
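The iterate-toward-winners loop described above is, at its core, a multi-armed bandit over creative variants: serve each variant, observe clicks, and shift traffic toward the variants whose estimated performance is highest. A minimal sketch, using Thompson sampling on click-through outcomes (all variant names and CTR figures here are hypothetical, not drawn from any platform):

```python
import random

class CreativeBandit:
    """Thompson sampling over ad creative variants.

    Each variant keeps a Beta(clicks + 1, misses + 1) posterior over its
    click-through rate; each impression goes to the variant whose sampled
    CTR is highest, so better creatives gradually absorb the traffic.
    """

    def __init__(self, variant_ids):
        self.stats = {v: {"clicks": 0, "impressions": 0} for v in variant_ids}

    def choose(self):
        # Draw a plausible CTR from each variant's posterior; pick the best draw.
        def draw(s):
            return random.betavariate(s["clicks"] + 1,
                                      s["impressions"] - s["clicks"] + 1)
        return max(self.stats, key=lambda v: draw(self.stats[v]))

    def record(self, variant, clicked):
        self.stats[variant]["impressions"] += 1
        self.stats[variant]["clicks"] += int(clicked)

# Simulated test with hypothetical true CTRs: variant "b" is genuinely best.
random.seed(0)
true_ctr = {"a": 0.01, "b": 0.03, "c": 0.015}
bandit = CreativeBandit(true_ctr.keys())
for _ in range(20000):
    v = bandit.choose()
    bandit.record(v, random.random() < true_ctr[v])

# The highest-CTR variant should end up with the bulk of the impressions.
best = max(bandit.stats, key=lambda v: bandit.stats[v]["impressions"])
print(best)
```

Production systems layer budget pacing, creative fatigue decay, and audience segmentation on top of this core loop, but the explore/exploit mechanic is the same.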
Advertising creative generation and testing reached institutional-deployment maturity across major platforms by February 2026, yet effectiveness remains fragmented and contingent on implementation quality. Platform-native solutions dominate: Google Performance Max accounts for 43% of Google Ads budgets, and Meta Advantage+ reaches roughly 4M advertisers. Generative AI has expanded from variant selection to full creative production (video, copy, image generation), with AI-generated video now appearing in 57% of online ads and small brands leading adoption (45% plan AI-built creatives by 2026 vs. 36% of enterprises). The fundamental tension persists: platforms deliver strong ROAS signals and real deployment wins (a 623% purchase increase documented in January 2026, a 40% ROAS lift validated in February 2026), yet 61% of advertisers report no meaningful results from AI creative tools, exposing large execution gaps. Context-dependent performance boundaries sharpen the picture: Advantage+ outperforms at high budgets ($3K+, 23% higher ROAS) but underperforms at low budgets (31% worse), revealing where automation succeeds and where it fails. Peer-reviewed research confirms AI matches average human creativity but cannot surpass elite creative talent. The practice sits at mandatory operational necessity (nearly all mainstream advertisers run automated creative testing), yet predictable business value remains elusive. Adoption barriers persist: only 29.7% of organizations have moved AI creative to production; data quality and system integration challenges block 42% and 41% of organizations respectively; guidance aimed at new advertisers produces failures rather than successes; and consumer-perception risks around AI-generated advertising are mounting. Success increasingly depends on advertiser organizational maturity, budget scale, and implementation capability rather than platform capability alone.
By February 2026, automated creative testing and AI-powered ad generation have achieved institutional deployment across major platforms, with clear leadership consolidation and widening performance divergence. Platform metrics remain strong, with critical boundaries: Google Performance Max, at 43% of all Google Ads budgets, continues to deliver 27% average conversion increases; Meta's Advantage+ has reached 71% advertiser adoption (up from 60% in 2024), with the platform delivering 51B impressions and a 19.9% spend share. Case studies from February 2026 document continued wins: Joybird achieved 40% higher ROAS, a 95% revenue lift, and 52% more clicks vs. Smart Shopping; digital agencies report 623% purchase increases and 7.69 ROAS on Performance Max deployments. Peer-reviewed research (Université de Montréal, January 2026) confirms that AI models (GPT-4, Claude, Gemini) exceed average human creative performance on linguistic tasks, while elite human creativity still leads. Production efficiency is accelerating: hybrid AI-scripted creative workflows achieve 280% output increases and 65% cost reductions at 2.8% CTR. However, execution gaps are crystallizing along newly documented boundaries: a FreeWheel survey shows 61% of advertisers have not seen meaningful results from AI creative tools despite 41% adoption; a Meta Advantage+ analysis of 312 DTC campaigns reveals 23% ROAS outperformance at $3K+ budgets but 31% underperformance below $1K, exposing how context-dependent the automation is; and guidance aimed at new advertisers produces failures (reduced control, wasted spend, poor conversion tracking) rather than successes. Marketer sentiment shows friction: 33% adoption of AI for creative development vs. 43% for other AI applications, indicating fragmentation and implementation drag.
February 2026 data confirms a dual-track marketplace in which advanced practitioners extract genuine incremental revenue while the majority struggle with execution basics, budget thresholds, data quality, and consumer-perception risks from AI-generated advertising. The investment thesis remains sound, but advertiser organizational maturity, budget scale, and implementation excellence, not platform capability, determine the probability of success.
— RedClaw benchmarks from 200+ accounts with $50M+ in managed spend show 15-25% CTR improvements from AI creative adoption; creative quality identified as the primary competitive lever.
— Comprehensive comparison of 10 AI creative testing platforms using synthetic audience methodology; benchmarked 80-95% accuracy, replacing $25k-$100k pretests with 30-minute testing cycles.
— EUR 150,000 budget test comparing AI vs human creative shows AI wins CTR/CPC (2.4% vs 1.9%, 22% savings) but human wins conversion (3.8% vs 2.9%); sector-dependent outcomes.
— Veo generative video integration into Ad Studio; generates 10-second videos from images; voiceover rollout for Performance Max; Demand Gen video enhancements show 16% conversion lift.
— Synthesis of two peer-reviewed studies (500M+ impressions, 105K field tests) showing AI ads achieve higher CTR (0.76% vs 0.65%), but hybrid ads perform worst; disclosure reduces effectiveness by 31.5%.
— Independent academic research on AI-generated vs human-created ad creative. Quasi-experimental study of 4,633 sibling ads (16B impressions) finds AI images outperform when not perceived as AI.
— Google extended AI Max to Shopping campaigns (April 30, 2026) with text customization generating ad copy tailored to shopper queries; expands automation layer across Google ad formats.
— Critical adoption paradox: 91% use AI but only 41% prove ROI (down from 49% in 2025); only 7% embedded AI for measurable outcomes. Data infrastructure identified as dividing line.