Testing Validation Errors: The Most Neglected Test Category
Most API bugs live in input validation. Here's how to test it systematically.
Why validation testing matters
Look at any API bug tracker: the majority of real production incidents trace back to one of three things:
- Input the server wasn't expecting.
- A validation rule that changed silently.
- An error response shape that clients couldn't parse.
All three are validation concerns. And all three are under-tested in most suites, because happy-path tests are more fun to write.
The validation surface
Every input field on every endpoint has a set of rules:
- Type — string, int, bool, date, enum, object, array.
- Presence — required vs optional.
- Format — email, UUID, ISO 8601, regex.
- Range — min/max for numbers, min/max length for strings, min/max items for arrays.
- Enumeration — allowed values for string fields.
- Cross-field — "if country is US, state is required."
- Uniqueness — email must not already exist.
- Ownership — the `user_id` in the body must match the authenticated user.
Every one is a place to fail. Every one needs a test.
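One way to make that surface testable is to capture the rules as data, so test cases can be derived from them mechanically. A minimal sketch — the `FieldRule` class and the rules for `/api/v1/users` are illustrative, not part of any real framework:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FieldRule:
    """Illustrative container for one field's validation surface."""
    name: str
    type: str                       # "string", "int", "bool", ...
    required: bool = True
    fmt: Optional[str] = None       # "email", "uuid", "iso8601", ...
    min_len: Optional[int] = None
    max_len: Optional[int] = None
    allowed: Optional[list] = None  # enum values, if any

# Hypothetical rules for the user-creation endpoint in this lesson
USER_RULES = [
    FieldRule("name", "string", min_len=2, max_len=100),
    FieldRule("email", "string", fmt="email"),
    FieldRule("role", "string", required=False, allowed=["admin", "member"]),
]
```

With rules expressed this way, the matrices later in this lesson become a loop over `USER_RULES` rather than hand-written cases.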
Try some bad inputs
```shell
curl -X POST 'https://demo.totalshiftleft.ai/api/v1/users' \
  -H 'Content-Type: application/json' \
  -d '{"name":"A","email":"not-an-email"}'
```

Expected response:

```json
{
  "success": false,
  "error": "VALIDATION_ERROR",
  "message": "Validation failed",
  "details": [
    { "field": "name", "code": "too_short", "message": "Must be at least 2 characters" },
    { "field": "email", "code": "invalid_format", "message": "Not a valid email address" }
  ]
}
```
Now send an empty body:
```shell
curl -X POST 'https://demo.totalshiftleft.ai/api/v1/users' \
  -H 'Content-Type: application/json' \
  -d '{}'
```

Expected:

```json
{
  "success": false,
  "error": "VALIDATION_ERROR",
  "message": "Validation failed",
  "details": [
    { "field": "name", "code": "required" },
    { "field": "email", "code": "required" }
  ]
}
```
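A minimal reference validator can reproduce both responses above. This is a sketch of the server-side behavior implied by the expected JSON, not ShiftLeft's or any real framework's implementation; the email regex is a deliberately simple stand-in:

```python
import re

def validate_user(body: dict) -> dict:
    """Return the envelope for a user-creation body (illustrative sketch)."""
    details = []
    name = body.get("name")
    if name is None:
        details.append({"field": "name", "code": "required"})
    elif len(name) < 2:
        details.append({"field": "name", "code": "too_short",
                        "message": "Must be at least 2 characters"})
    email = body.get("email")
    if email is None:
        details.append({"field": "email", "code": "required"})
    elif not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        details.append({"field": "email", "code": "invalid_format",
                        "message": "Not a valid email address"})
    if details:
        return {"success": False, "error": "VALIDATION_ERROR",
                "message": "Validation failed", "details": details}
    return {"success": True}
```

Note that it collects every failure before returning, so one request surfaces all field errors at once, the behavior the expected responses above assume.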
The error envelope contract
The shape of the error response is as important as the shape of the success response. Clients parse it to show field-level errors to users. If it's inconsistent, UIs break.
Test every validation error for:
- HTTP status is 400 (or 422 if the API uses "Unprocessable Entity" for validation specifically).
- Top-level error code is a stable string (`VALIDATION_ERROR`, not `"Validation failed."`).
- `details` (or equivalent) is an array of per-field errors.
- Each detail has `field`, `code`, and optionally `message`.
- `field` is a path (e.g., `user.address.postal_code`), not just a flat name.
- `code` is machine-readable (`too_short`, `invalid_format`), not human copy.
Once you've locked this down, every new endpoint should return errors in the same shape. That's the contract.
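One way to enforce the contract is a single reusable shape assertion applied to every validation-error response in the suite. The helper below is a sketch assuming the envelope shown earlier in this lesson:

```python
def assert_error_envelope(status: int, body: dict) -> None:
    """Assert a response matches the validation-error contract (sketch)."""
    assert status in (400, 422), f"expected 400/422, got {status}"
    assert body.get("success") is False
    # Stable machine-readable top-level code, not human copy
    assert body.get("error") == "VALIDATION_ERROR"
    details = body.get("details")
    assert isinstance(details, list) and details, "details must be a non-empty list"
    for d in details:
        assert isinstance(d.get("field"), str)  # path, e.g. user.address.postal_code
        assert isinstance(d.get("code"), str)   # machine-readable, e.g. too_short
        if "message" in d:
            assert isinstance(d["message"], str)
```

Call it after every bad-input request; any endpoint that drifts from the shape fails immediately with a pointed assertion message.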
The validation test matrix
For every string field, test:
- Missing (required check).
- Empty string (`""`) — is this treated as missing or as invalid?
- Whitespace-only (`"   "`) — trimmed? accepted? rejected?
- Under min-length.
- Over max-length.
- Exactly at min and max (boundary).
- Invalid format (e.g., email without `@`).
- Valid format but semantically wrong (a valid-looking but non-existent country code).
- Non-string value (number, null, array) — type coercion or error?
For every number field:
- Missing.
- Zero (often an edge case: "0 is falsy").
- Negative.
- Below min.
- Above max.
- Boundaries (exactly min, exactly max).
- Non-integer where integer expected (e.g., `1.5`).
- Scientific notation (`1e5`) — does it parse?
- String that looks like a number (`"123"`) — coerced or rejected?
For every enum field:
- Valid value.
- Invalid value.
- Wrong case (`"ACTIVE"` when enum is `"active"`).
- Null.
For every array field:
- Empty array.
- Array with min items.
- Array with max items + 1.
- Array with one invalid item.
- Array with duplicates (if uniqueness is required).
For every object field:
- Missing required sub-fields.
- Unknown sub-fields (should they be rejected or ignored?).
- Deeply nested invalid values.
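The string and number matrices above can be derived mechanically from a field's min/max constraints. A sketch — the case names and helper functions are illustrative, not any tool's actual output:

```python
def string_cases(min_len: int, max_len: int) -> dict:
    """Derive boundary test inputs for a string field (sketch)."""
    return {
        "missing": None,                 # convention here: omit the field entirely
        "empty": "",
        "whitespace_only": "   ",
        "under_min": "x" * (min_len - 1),
        "at_min": "x" * min_len,
        "at_max": "x" * max_len,
        "over_max": "x" * (max_len + 1),
        "wrong_type": 123,               # number where a string is expected
    }

def number_cases(minimum: int, maximum: int) -> dict:
    """Derive boundary test inputs for a number field (sketch)."""
    return {
        "missing": None,
        "zero": 0,
        "negative": -1,
        "below_min": minimum - 1,
        "at_min": minimum,
        "at_max": maximum,
        "above_max": maximum + 1,
        "non_integer": minimum + 0.5,
        "stringified": str(minimum),     # "123"-style coercion probe
    }
```

Each generated value becomes one request body, paired with an expected outcome (2xx for the boundary cases, 400 for the rest), which is exactly the matrix written out by hand above.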
Cross-field validation
The most forgotten class. Examples:
- "If country is 'US', state is required."
- "If role is 'admin', department must not be null."
- "end_date must be >= start_date."
Tests:
- All fields valid → 200.
- Country 'US' + missing state → 400, error on `state`.
- Country 'UK' + missing state → 200 (not required).
- end_date before start_date → 400, error on `end_date`.
These rules are often implemented by hand and decay first. Always test them.
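Because these rules are usually hand-written, it helps to test them as a unit, separately from any HTTP layer. The function below is a sketch mirroring the three example rules above; the error codes are illustrative:

```python
def cross_field_errors(body: dict) -> list:
    """Return per-field errors for the example cross-field rules (sketch)."""
    errors = []
    if body.get("country") == "US" and not body.get("state"):
        errors.append({"field": "state", "code": "required_if_us"})
    if body.get("role") == "admin" and body.get("department") is None:
        errors.append({"field": "department", "code": "required_for_admin"})
    start, end = body.get("start_date"), body.get("end_date")
    if start and end and end < start:  # ISO 8601 dates compare lexicographically
        errors.append({"field": "end_date", "code": "before_start_date"})
    return errors
```

Unit tests over this function survive refactors of the HTTP layer, which is where hand-rolled cross-field logic usually decays.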
Validation from the AI angle
Hand-writing this matrix for even a small API (20 endpoints × 10 fields each) is hundreds of test cases. Nobody writes them all. That's why most APIs leak validation bugs.
AI-generated validation tests read the OpenAPI or GraphQL schema, derive the matrix automatically, and fuzz every boundary. ShiftLeft's validation generator produces ~20 test cases per field in a couple of seconds, with assertions on both the happy path and the error-envelope shape.
Common mistakes in validation tests
1. Only testing the happy path. Covered everywhere, but worth repeating.
2. Asserting on human-readable error messages. "Email is invalid" becomes "Email address is invalid" in a copy update and every test fails. Assert on code, not message.
3. Testing one field at a time. What if two fields are invalid simultaneously? Does the server return both errors or short-circuit? Your tests should establish which and assert on it.
4. Ignoring silent coercion. The server accepts "123" for an int field and the tests pass. But what it actually does is silently cast — now "123abc" becomes 123, and the user's typo goes undetected. Test that type-wrong inputs return 400, not 200.
5. Not testing the envelope itself. Spin up a test that hits 10 different endpoints with bad input, and assert they all return the same shape. Shape drift is a real bug.
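Mistake 4 is easy to demonstrate in miniature. The two parsers below are hypothetical, not any framework's API: the lenient one coerces strings the way many servers do by default, while the strict one treats a type-wrong value as a validation error:

```python
def lenient_age(value):
    """Coercing parser: the behavior mistake 4 warns about."""
    return int(value)  # "123" silently becomes 123

def strict_age(value):
    """Strict parser: type-wrong input is rejected outright."""
    # bool is a subclass of int in Python, so reject it explicitly
    if isinstance(value, bool) or not isinstance(value, int):
        raise TypeError(f"expected int, got {type(value).__name__}")
    return value
```

A test suite written against `lenient_age` passes with `"123"` and never notices the coercion; a suite against `strict_age` pins the behavior down, which is what the 400-not-200 assertion does at the HTTP level.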
What's next
Validation covers "what if the client sends something weird." Retries and timeouts cover "what if the network or server misbehaves."
Related lessons
The network is unreliable. Here's how clients should retry, how servers should behave, and how to test both.
Happy paths prove your API works. Negative paths prove it doesn't break. Both matter.
A contract is a promise. Contract testing keeps you honest. Here's how to do it right.
Read more on the blog
API schema validation drift detection is the automated process of comparing live API responses against the documented OpenAPI specification to identify undocumented changes in field types, required properties, response formats, and nested structures. When drift goes undetected, it silently breaks API consumers and causes production integration failures.
REST API testing best practices in 2026 go far beyond basic status code checks — they encompass schema validation, authentication testing, negative and boundary testing, data-driven tests, and automated generation from OpenAPI specs. This guide covers every dimension of effective REST API testing with practical examples.