The AI landscape doesn't move in one direction — it lurches. Some techniques leap from experiment to table stakes in a single quarter; others stall against regulatory walls, technical ceilings, or organizational inertia that no amount of hype can dislodge. Knowing which is which is the hard part. The State of Play cuts through the noise with a rigorously maintained index of AI techniques across every major business domain — classified by maturity, evidenced by real-world adoption, and updated daily so you always know where you stand relative to the field. Stop guessing. Start knowing.
A daily newsletter distilling the past two weeks of movement in one or two domains — delivered to your inbox while the index updates in the background.
Each dot marks the weighted maturity of practices within a domain — hover for a brief summary, click for more detail
AI that analyzes absence and turnover patterns to identify risk factors and predict future attrition across teams. Includes flight risk scoring and absence pattern detection; distinct from churn prediction in customer ops, which predicts customer rather than employee departure.
Predictive attrition and absence analytics have crossed from experimental to operationally proven — but only at forward-leaning enterprises. Machine learning models now forecast employee departure risk and flag problematic absence patterns with documented accuracy in the 85-95% range, drawing on signals from engagement surveys, tenure data, collaboration metrics, and absence frequency. Organizations deploying these systems report 20-50% reductions in attrition and six-figure annual savings. The technology works. The harder question is whether most organizations are ready to use it. Adoption remains concentrated among data-mature enterprises with dedicated people analytics teams, while mid-market and smaller organizations face compounding barriers: data quality gaps, implementation complexity, and deep unresolved tensions around employee surveillance and algorithmic fairness. The practice's defining challenge is no longer technical feasibility but responsible scaling — bridging the gap between what the models can predict and what organizations should act on.
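The signal mix described above — engagement, tenure, absence frequency — can be illustrated with a minimal flight-risk model. This is a sketch on synthetic data, not any vendor's method; the feature names and coefficients are hypothetical stand-ins for the signal categories mentioned.

```python
# Minimal illustrative flight-risk model on synthetic data.
# All feature names and effect sizes are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
tenure = rng.exponential(4.0, n)       # years at company
engagement = rng.uniform(1, 5, n)      # survey score, 1-5
absence_days = rng.poisson(5, n)       # days absent in the past year

# Synthetic ground truth: low engagement and high absence raise departure odds
logit = 1.5 - 0.9 * engagement + 0.15 * absence_days - 0.1 * tenure
left = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([tenure, engagement, absence_days])
model = LogisticRegression().fit(X, left)

# Score a hypothetical employee: 1 year tenure, engagement 2.0, 12 absence days
risk = model.predict_proba([[1.0, 2.0, 12]])[0, 1]
print(f"flight risk: {risk:.2f}")
```

The output is a ranked risk score, not a verdict — consistent with the "flashlight, not a spotlight" framing practitioners use: the score flags who to talk to, not what to decide.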
The vendor ecosystem for attrition and absence prediction is broad and maturing. SAP SuccessFactors, Workday, Dayforce, Visier, Oracle, ADP, and IBM all offer GA flight-risk and absence forecasting features; the market reached $2.33B in 2026 and is projected to grow to $4.43B by 2030 at a sustained double-digit CAGR. Visier's deployment through its Paycor partnership now covers 2.1 million employees, the largest documented scale. Deployment outcomes are concrete: Experian achieved 2-3% attrition reduction with $8-10M savings over 18 months; Credit Suisse documented $70M annual savings from flight-risk modeling; IBM reports 95% turnover prediction accuracy and 25% reduction in unplanned absenteeism; healthcare organizations have saved $400K through attrition pattern analysis. Manufacturing deployments like Varun Beverages operationalize pattern detection across temporal and demographic dimensions, with 35-50% early-attrition reductions and $650K+ annual savings. U.S. voluntary turnover sits at 23.4% annually, with engagement serving as a primary attrition predictor: teams with low engagement experience 18-43% higher turnover. Organizations are increasing investment: 60% of large enterprises are expected to adopt AI-powered people analytics platforms by 2026.
These results coexist with stubborn adoption barriers and critical accuracy limitations. Realistic production-grade attrition models achieve AUC scores of 0.65-0.80, not the 90%+ accuracy cited in controlled studies — a gap often disguised by overall accuracy metrics that fail to reflect the class-imbalance problem in low-attrition populations. Employment law analysis has documented discrimination liability risks under Title VII, ADA, and ADEA when biased algorithms drive personnel decisions, and a single flawed model can affect thousands of employees. Systematic reviews of ML attrition approaches identify persistent gaps: domain-specific datasets remain sparse, model interpretability remains challenging, and ethical guardrails are inconsistently implemented. Analyses of predictive analytics failures cite poor data availability, algorithmic bias baked into historical training sets, and weak integration with broader business strategy as recurring failure modes. Perhaps most striking is a 2025 paradox: organizations successfully deploying AI frameworks face unintended attrition among AI-savvy employees who recognize enhanced marketability and pursue external opportunities at higher rates. The emerging consensus among practitioners frames these systems as diagnostic tools — a "flashlight, not a spotlight" — requiring human interpretation rather than automated action. Absence pattern analysis methodologies demonstrate promise even where overall sick-day reductions prove statistically uncertain, suggesting that engagement and organizational factors (not just absence metrics) drive meaningful outcomes.
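The class-imbalance point above is easy to demonstrate: when only a small share of employees leave in a given year, a model that predicts "stays" for everyone posts high accuracy while having no ability to rank who actually leaves, which is why AUC is the more honest metric. A small sketch with a hypothetical 10% attrition rate:

```python
# Why raw accuracy misleads on imbalanced attrition data (illustrative only).
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(1)
y_true = (rng.random(1000) < 0.10).astype(int)  # ~10% of employees leave

# Degenerate "model": always predicts that nobody leaves
always_stay = np.zeros(1000, dtype=int)
acc = accuracy_score(y_true, always_stay)       # ~0.90, yet zero skill

# A weakly informative risk score: noisy signal mildly correlated with leaving
scores = y_true * 0.3 + rng.random(1000)
auc = roc_auc_score(y_true, scores)             # lands near the 0.65-0.80 band

print(f"accuracy of 'always stays': {acc:.2f}")
print(f"AUC of noisy risk score:   {auc:.2f}")
```

Accuracy rewards the degenerate model for the base rate; AUC, which measures how well the score ranks leavers above stayers, correctly scores it as useless and gives the noisy-but-informative score partial credit.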
— Multiple named organizations demonstrating production deployment of attrition pattern analysis with specific outcomes and interventions.
— AWS official product announcement of native integration between major cloud platform and attrition analysis platform. Explicitly lists attrition as a key metric (alongside headcount, tenure, requisitions). Signals ecosystem maturity and broad adoption by major infrastructure providers.
— AWS Machine Learning blog tutorial demonstrating production enterprise integration of attrition analysis into agentic AI workflows. Includes concrete examples of attrition risk assessment and organizational health evaluation using Visier's analytics platform.
— XAI framework using GAN and Transformer encoders achieving 92-96.95% accuracy; SHAP analysis identifies JobSatisfaction and Age as top predictors; demonstrates advancing sophistication in addressing class imbalance and model interpretability.
— Critical adoption reality check: only 17% report highly successful AI implementation, 57% have no AI in core systems, 14% actually use AI; reveals implementation gap between vendor promises and organizational execution.
— SHRM talent analytics resource with adoption metrics: 82% of organizations using people analytics apply it to retention/turnover, 72% say it adds most value there. Shows mainstream adoption signal from credible professional body.
— Critical analysis of turnover prediction tool limitations. Work Institute statistic: 77% of turnover is preventable but organizations discover reasons only post-resignation. Argues models fail on input data quality, not math. Proposes adaptive conversations over surveys.
— Direct adoption metrics: 34% of organizations use AI to predict turnover with 75-89% accuracy, delivering 421% ROI on retention; maturity progression shows only 6% at predictive AI stage but with 5.4-8.7x ROI.
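The SHAP-style attribution noted in the evidence above (identifying JobSatisfaction and Age as top predictors) can be approximated without the shap package. Permutation importance is a different but related technique: it measures how much shuffling each feature degrades the model, producing a similar "which features drive the prediction" ranking. A sketch on synthetic data with hypothetical feature names:

```python
# Permutation importance as a lightweight stand-in for SHAP attribution.
# Feature names and effects are illustrative, not taken from any study.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(2)
n = 1500
job_satisfaction = rng.uniform(1, 5, n)
age = rng.uniform(22, 60, n)
commute_minutes = rng.uniform(5, 90, n)  # deliberately uninformative here

# Synthetic attrition driven mostly by satisfaction, a little by age
logit = 2.0 - 1.0 * job_satisfaction - 0.03 * (age - 22)
left = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([job_satisfaction, age, commute_minutes])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, left)

result = permutation_importance(clf, X, left, n_repeats=10, random_state=0)
names = ["job_satisfaction", "age", "commute_minutes"]
ranked = sorted(zip(names, result.importances_mean), key=lambda t: -t[1])
for name, imp in ranked:
    print(f"{name:18s} {imp:.3f}")
```

Interpretability tooling of this kind is what turns a risk score into the diagnostic "flashlight" the practitioner consensus calls for: it tells a manager which factors to discuss, rather than just who to worry about.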