Model Drift
A change in an AI model's behavior or performance over time, often without explicit notification.
Model drift occurs when an AI model's outputs diverge from its original or expected behavior. For enterprises using third-party AI tools, drift can stem from the provider silently updating the underlying model, or from gradual shifts in the data and usage patterns the model encounters.
Drift is particularly challenging because it's often invisible — the tool keeps working, but its outputs subtly change in ways that may affect accuracy, compliance, or risk posture.
A tool that was reviewed and approved six months ago may behave differently today. Continuous monitoring and periodic reassessment are necessary to catch drift before it causes problems.
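One common way to implement the continuous monitoring described above is a canary-prompt check: periodically send a fixed set of prompts to the tool and compare the answers against the responses recorded when the tool was approved. The sketch below assumes a hypothetical `query_model` function standing in for the vendor's API; it is stubbed here (with simulated baseline and drifted answers) so the example is self-contained.

```python
# Minimal sketch of canary-based drift monitoring for a third-party AI tool.
# `query_model` is a hypothetical stand-in for the vendor's API client;
# it is stubbed so the example runs, simulating a model whose behavior
# changed between the approved baseline and today.

def query_model(prompt: str, version: str = "current") -> str:
    baseline_answers = {"Is PII redacted?": "yes", "Refund limit?": "$500"}
    drifted_answers = {"Is PII redacted?": "no", "Refund limit?": "$500"}
    table = baseline_answers if version == "baseline" else drifted_answers
    return table[prompt]

def agreement_rate(prompts: list[str]) -> float:
    """Fraction of canary prompts whose answer matches the approved baseline."""
    matches = sum(
        query_model(p, version="baseline") == query_model(p) for p in prompts
    )
    return matches / len(prompts)

# Illustrative values: the prompt set and threshold are assumptions,
# not prescriptions; tune both to the tool's risk profile.
CANARY_PROMPTS = ["Is PII redacted?", "Refund limit?"]
THRESHOLD = 0.9  # trigger a reassessment if agreement drops below this

rate = agreement_rate(CANARY_PROMPTS)
if rate < THRESHOLD:
    print(f"Possible model drift: agreement {rate:.0%} is below {THRESHOLD:.0%}")
```

In practice the baseline answers would be stored at approval time rather than re-queried, and the check would run on a schedule so that drift surfaces between formal reviews rather than only at the next one.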