    clarier.ai

    Model drift occurs when an AI model's outputs diverge from their original or expected behavior. For enterprises using third-party AI tools, drift typically manifests in two ways:

    • Provider-side drift: The vendor updates, retrains, or replaces the underlying model (e.g., a new GPT version), changing behavior for all users
    • Data drift: The distribution of inputs the model receives changes over time, causing performance degradation on certain types of queries

    Drift is particularly challenging because it's often invisible — the tool keeps working, but its outputs subtly change in ways that may affect accuracy, compliance, or risk posture.
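Data drift of the second kind can be surfaced by comparing the distribution of some input feature (here, query length, an illustrative choice) against a baseline captured at approval time. A minimal sketch using the two-sample Kolmogorov-Smirnov statistic, with the 0.2 alert threshold as an assumed tuning value, not a standard:

```python
import bisect

def ks_statistic(baseline, current):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap
    between the empirical CDFs of the two samples (0 = identical,
    1 = fully separated)."""
    a, b = sorted(baseline), sorted(current)

    def ecdf(sample, x):
        # Fraction of the sample that is <= x.
        return bisect.bisect_right(sample, x) / len(sample)

    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in set(a + b))

# Hypothetical feature: token counts of user queries at approval
# time vs. today. The 0.2 threshold is an assumption to tune.
baseline_lengths = [12, 15, 14, 13, 16, 15, 14]
current_lengths = [25, 28, 24, 27, 26, 25, 29]
drifted = ks_statistic(baseline_lengths, current_lengths) > 0.2
```

Because the tool keeps responding normally either way, a scheduled check like this is often the only signal that the input mix has shifted.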

    Why it matters

    A tool that was reviewed and approved six months ago may behave differently today. Continuous monitoring and periodic reassessment are necessary to catch drift before it causes problems.
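One lightweight form of periodic reassessment is to re-run a fixed set of "canary" prompts against the vendor's model and compare the answers to those recorded at approval time. A minimal sketch, where the prompt set, the recorded answers, and exact-match hashing are all illustrative assumptions:

```python
import hashlib
import json

def fingerprint(responses):
    """Hash a list of model responses to a fixed canary prompt set."""
    blob = json.dumps(responses, sort_keys=True).encode("utf-8")
    return hashlib.sha256(blob).hexdigest()

# Recorded when the tool was reviewed and approved (hypothetical answers).
baseline = fingerprint(["Paris", "4", "No"])

# On each scheduled recheck, re-run the same prompts through the vendor
# API and fingerprint the new answers. Here one answer has changed,
# simulating a provider-side model update.
current = fingerprint(["Paris", "4", "Unable to answer"])

drift_detected = current != baseline
```

Exact-match comparison is brittle for nondeterministic models; in practice teams pin sampling settings where the vendor allows it, or compare answers semantically rather than byte-for-byte. The point is to make provider-side drift visible on a schedule instead of discovering it through an incident.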