Product documentation · Updated October 3, 2025
AI-Generated Tests
Use AI-assisted generation to create baseline test cases from your API spec—speed up coverage, add edge cases, and refine results into CI-ready suites.
Overview
AI-generated tests help you get to a usable baseline quickly. The typical flow is:
- Import an OpenAPI/Swagger spec.
- Generate a baseline set of test cases (happy path + key negatives; see the sketch after this list).
- Review, edit, and remove cases so they match your business rules.
- Parameterize auth and data.
- Run in CI as a quality gate.
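To make the flow concrete, here is a minimal sketch of what a generated baseline pair might look like once exported to a pytest suite. The /pets endpoint, the environment variable names, and the requests-based harness are illustrative assumptions, not actual generator output.

```python
import os

import requests

# Assumed environment variable names; substitute your own configuration.
BASE_URL = os.environ["API_BASE_URL"].rstrip("/")
AUTH_HEADERS = {"Authorization": f"Bearer {os.environ['API_TOKEN']}"}


def test_list_pets_happy_path():
    # Happy path: the spec says GET /pets returns 200 with a JSON array.
    resp = requests.get(f"{BASE_URL}/pets", headers=AUTH_HEADERS, timeout=10)
    assert resp.status_code == 200
    assert isinstance(resp.json(), list)


def test_list_pets_rejects_negative_limit():
    # Key negative: an out-of-range query parameter should be rejected.
    resp = requests.get(
        f"{BASE_URL}/pets",
        headers=AUTH_HEADERS,
        params={"limit": -1},
        timeout=10,
    )
    assert resp.status_code == 400
```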
What AI generation is good for
- Fast onboarding for a new API or feature area
- Expanding coverage with edge cases and invalid inputs
- Producing consistent starting assertions (status codes, schema checks)
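As an illustration of those starting assertions, a generator typically derives a response schema from the contract and checks status code, content type, and shape. The jsonschema library and the Pet schema below are assumptions for the sketch, not a description of the product's internals.

```python
import requests
from jsonschema import validate  # third-party: pip install jsonschema

# Response schema derived from the spec's (assumed) Pet component.
PET_LIST_SCHEMA = {
    "type": "array",
    "items": {
        "type": "object",
        "required": ["id", "name"],
        "properties": {
            "id": {"type": "integer"},
            "name": {"type": "string"},
        },
    },
}


def assert_pet_list_response(resp: requests.Response) -> None:
    # Baseline assertions a generator usually emits: status code,
    # content type, and spec-derived schema conformance.
    assert resp.status_code == 200
    assert resp.headers["Content-Type"].startswith("application/json")
    validate(instance=resp.json(), schema=PET_LIST_SCHEMA)
```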
What you still need to provide
AI can’t know your environment or data constraints. For stable runs you still need:
- Base URL and authentication configuration
- A test data strategy
- Environment variables and reusable parameters
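A common way to supply these is a small configuration layer that generated cases import, so environment details never live in the tests themselves. This is a sketch under assumptions: the variable names are placeholders to map onto your environment or CI secret store.

```python
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class ApiConfig:
    """Per-environment settings kept out of the generated test cases."""
    base_url: str
    token: str
    timeout_s: float = 10.0


def load_config() -> ApiConfig:
    # Variable names are illustrative; map them to your own setup.
    return ApiConfig(
        base_url=os.environ["API_BASE_URL"].rstrip("/"),
        token=os.environ["API_TOKEN"],
        timeout_s=float(os.environ.get("API_TIMEOUT_S", "10")),
    )
```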
Related: Configuration fundamentals and Test configuration.
Best practices
- Keep the spec up to date—generation quality depends on contract quality.
- Start with read-only endpoints to validate auth and connectivity (see the sketch after this list).
- Treat AI output as a draft: refine assertions and business logic.
- Run small scopes first, then expand.
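For the last two practices, one way to scope a first run is to tag read-only cases and filter on the tag in CI before enabling the full suite. The marker name, endpoint, and variable names below are assumptions for illustration.

```python
import os

import pytest
import requests

BASE_URL = os.environ["API_BASE_URL"].rstrip("/")  # assumed variable name


@pytest.mark.smoke
def test_read_only_endpoint_confirms_auth_and_connectivity():
    # A read-only call is a safe first check before enabling write cases.
    resp = requests.get(
        f"{BASE_URL}/pets",
        headers={"Authorization": f"Bearer {os.environ['API_TOKEN']}"},
        timeout=10,
    )
    assert resp.status_code == 200
```

Run `pytest -m smoke` for the small first pass, then drop the `-m` filter to expand; registering the marker in pytest.ini keeps runs warning-free.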
Related articles
- Test Case · Product documentation
- Test Configuration · Product documentation
- Test Run · Product documentation
- Test Run Pack · Product documentation
Next steps
- Getting started · Install + connect your spec
- Configuration fundamentals · Stabilize runs
- Initial configuration · Users, licensing, projects
- Release notes · Updates and fixes
Still stuck?
Tell us what you’re trying to accomplish and we’ll point you to the right setup—installation, auth, or CI/CD wiring.