AI model drift (also known as model decay) is the phenomenon where a machine learning model’s performance deteriorates over time because the data or relationships it relies on have changed since training.
A model is essentially a snapshot of patterns in historical data. When those patterns evolve—as they almost always do in real-world environments—the model becomes less aligned with reality. As a result, its predictions grow less accurate or reliable.
Types of AI Model Drift
Data drift (input drift)
Occurs when the distribution of input features changes. The model still applies the same logic, but the data it receives looks different from what it was trained on.
Example: A recommendation system trained on past user behavior struggles when user preferences shift due to new trends.
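Data drift can often be caught by comparing the distribution of a feature at training time with its distribution in production. A minimal sketch, assuming we have a reference sample and a recent sample of one numeric feature; it uses the two-sample Kolmogorov-Smirnov statistic (the largest gap between the two empirical CDFs), and all names and data here are illustrative:

```python
import random

def ks_statistic(reference, current):
    """Largest distance between the empirical CDFs of two samples."""
    ref = sorted(reference)
    cur = sorted(current)
    max_gap = 0.0
    for v in sorted(set(ref + cur)):
        cdf_ref = sum(1 for x in ref if x <= v) / len(ref)
        cdf_cur = sum(1 for x in cur if x <= v) / len(cur)
        max_gap = max(max_gap, abs(cdf_ref - cdf_cur))
    return max_gap

# Synthetic example: training data centered at 0, production data shifted.
random.seed(0)
train = [random.gauss(0, 1) for _ in range(500)]
prod = [random.gauss(3, 1) for _ in range(500)]

print(f"KS statistic: {ks_statistic(train, prod):.2f}")
```

A score near 0 means the feature looks like it did at training time; a score near 1 means the distributions barely overlap. In practice, libraries provide this test directly, but the idea is the same.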
Concept drift
Happens when the relationship between inputs and outputs changes. This is often the most impactful type of drift because the underlying “rules” the model learned are no longer valid.
Example: In fraud detection, behaviors that used to indicate fraud may become normal, while new fraud patterns emerge.
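The fraud example can be made concrete with a toy simulation: a fixed decision rule keeps its logic while the true input-to-output relationship changes underneath it. The rule, data, and thresholds below are illustrative assumptions, not a real fraud model:

```python
def learned_rule(amount):
    """'Model' trained on old data, where large transactions meant fraud."""
    return amount > 100

def accuracy(rule, data):
    """Fraction of (amount, is_fraud) pairs the rule labels correctly."""
    return sum(rule(a) == label for a, label in data) / len(data)

# Period 1: ground truth matches what the model learned.
period1 = [(amount, amount > 100) for amount in range(0, 200, 5)]
# Period 2: fraudsters adapt -- now small transactions are the fraudulent ones.
period2 = [(amount, amount < 20) for amount in range(0, 200, 5)]

print(f"accuracy before drift: {accuracy(learned_rule, period1):.2f}")
print(f"accuracy after drift:  {accuracy(learned_rule, period2):.2f}")
```

The model's code never changed, yet its accuracy collapses in period 2 because the "rules" it learned no longer describe reality. This is why concept drift usually demands retraining rather than just data cleaning.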
Label drift
Occurs when the distribution of target labels changes over time, even if input features remain relatively stable.
Example: A sentiment analysis model sees a shift from mostly positive to mostly negative reviews due to external events.
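Because label drift shows up directly in the target distribution, a simple monitor can compare label frequencies across time windows. A minimal sketch, where the review data and the alert threshold are illustrative assumptions:

```python
from collections import Counter

def positive_rate(labels):
    """Fraction of labels equal to 'positive'."""
    return Counter(labels)["positive"] / len(labels)

# Label mix before and after some external event.
before = ["positive"] * 80 + ["negative"] * 20
after = ["positive"] * 30 + ["negative"] * 70

shift = abs(positive_rate(before) - positive_rate(after))
print(f"positive rate moved by {shift:.2f}")
if shift > 0.10:  # alert threshold, an assumption for this sketch
    print("label drift detected")
```

Even though the review texts (the inputs) may look similar, a swing this large in the label mix can degrade any model whose calibration depends on the old class balance.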
Why Model Drift Matters
Unchecked model drift can lead to declining prediction quality, flawed decision-making, and business risk. In sensitive domains such as finance, healthcare, or legal systems, even small drops in accuracy can have significant consequences, including bias, compliance issues, or financial loss.
Causes of Model Drift
- Changes in user behavior or preferences
- Market dynamics or seasonality
- External disruptions such as regulatory changes or global events
- Data pipeline issues (missing values, schema changes, corrupted data)
- Feedback loops where model outputs influence future data
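Several of the causes above, from seasonality to pipeline bugs, can be surfaced by routinely scoring feature distributions against a training-time reference. One widely used score is the Population Stability Index (PSI); below is a pure-Python sketch where the binning scheme, the 1e-4 floor, and the conventional alert thresholds are assumptions of this illustration:

```python
import math
import random

def psi(expected, actual, bins=10):
    """Population Stability Index between a reference and a current sample."""
    lo, hi = min(expected), max(expected)

    def fractions(sample):
        counts = [0] * bins
        for x in sample:
            # Clamp into [lo, hi] so out-of-range production values still count.
            i = min(bins - 1, max(0, int((x - lo) / (hi - lo) * bins)))
            counts[i] += 1
        # Small floor avoids log(0) when a bin is empty.
        return [max(c / len(sample), 1e-4) for c in counts]

    e, a = fractions(expected), fractions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

random.seed(1)
ref = [random.gauss(0, 1) for _ in range(1000)]      # training-time sample
same = [random.gauss(0, 1) for _ in range(1000)]     # stable production data
shifted = [random.gauss(1.5, 1) for _ in range(1000)]  # drifted production data

print(f"PSI, stable data:  {psi(ref, same):.3f}")
print(f"PSI, shifted data: {psi(ref, shifted):.3f}")
```

A common rule of thumb treats PSI below 0.1 as stable and above 0.25 as significant drift worth investigating, though the right thresholds depend on the feature and the domain.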
A useful way to think about model drift is to imagine a model as a rulebook written based on past observations. If the environment changes but the rulebook does not, decisions based on it will gradually become outdated.