AI-Powered Test Case Generation
Generate comprehensive test cases from requirements, issues, and documentation using AI.
Prerequisites
Before using AI test generation, ensure:
- At least one active LLM integration is configured
- At least one active issue tracking integration (for issue-based generation)
- Project has test case templates configured
- User has appropriate permissions for test case creation
Generation Wizard
The AI test generation wizard guides you through a 4-step process:
Step 1: Select Source
Choose your test generation source:
From Issue:
- Select an existing issue from your integrated tracking system
- Issues are automatically fetched with full context, including descriptions and comments
- Supports Jira, GitHub Issues, Azure DevOps work items
From Document:
- Enter requirements directly into the form
- Provide title, description, and priority
- Ideal for early-stage requirements or internal specifications
Step 2: Select Template
- Choose the test case template to use for generated cases
- All template fields are displayed for review
- Select which fields to populate with AI-generated content
- Required fields are automatically included
- Optional fields can be included or excluded based on your needs
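The field-selection rule above can be sketched as a small filter; this is an illustrative sketch, not the product's actual schema (field keys `required` and `id` are assumptions):

```python
# Hedged sketch of the Step 2 rule: required template fields are always
# populated, optional fields only when the user selects them.
# The field dictionaries here are illustrative, not the real data model.
def fields_to_populate(template_fields, selected_ids):
    """Return the fields the AI should fill, per the rule above."""
    return [f for f in template_fields
            if f["required"] or f["id"] in selected_ids]
```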
Step 3: Configure Generation
Quantity Options:
- Just One: Generate a single, comprehensive test case
- A Couple: Generate 2-3 focused test cases
- A Few: Generate 3-5 test cases covering different scenarios
- Several: Generate 5-8 test cases with good coverage
- Many: Generate 8-12 test cases for thorough testing
- Maximum: Generate comprehensive test suite (12+ cases)
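The quantity options above can be read as count ranges; a minimal sketch, assuming hypothetical keyword names (these are not documented API constants):

```python
# Hypothetical mapping of quantity keywords to test-case count ranges,
# mirroring the options listed above. Keyword spellings are assumptions.
QUANTITY_RANGES = {
    "just_one": (1, 1),
    "a_couple": (2, 3),
    "a_few": (3, 5),
    "several": (5, 8),
    "many": (8, 12),
    "maximum": (12, None),  # 12 or more; no fixed upper bound
}

def count_in_range(quantity, count):
    """Check whether a generated count satisfies the chosen quantity option."""
    low, high = QUANTITY_RANGES[quantity]
    return count >= low and (high is None or count <= high)
```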
Additional Instructions:
- Provide specific guidance for the AI
- Example: "Focus on security testing scenarios"
- Common suggestions available as quick-add buttons:
  - Security testing
  - Edge cases
  - Happy path scenarios
  - Mobile compatibility
  - API testing
  - Accessibility testing
Auto-Generate Tags:
- Enable to automatically create and assign relevant tags
- Tags are generated based on test content and context
- Existing tags are reused when appropriate
Step 4: Review and Import
- Review all generated test cases
- Each case shows:
  - Name and description
  - Generated test steps (if applicable)
  - Populated template fields
  - Generated tags (if enabled)
  - Priority and automation status
- Select specific test cases to import
- Bulk select/deselect options are available
Generation Process
When you click "Generate":
- Context Analysis: The AI analyzes the source material and existing test cases
- Template Processing: Template fields and requirements are processed
- Content Generation: Test cases are generated based on your specifications
- Field Population: Custom fields are populated with relevant content
- Tag Generation: Tags are automatically created (if enabled)
- Quality Validation: Generated content is validated for completeness
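The final "Quality Validation" stage can be sketched as a completeness check; key names (`name`, `fields`) are assumptions for illustration, not the actual schema:

```python
# Minimal sketch of the Quality Validation stage: a generated case passes if
# it has a non-empty name and every required template field is populated.
# Dictionary keys are illustrative assumptions.
def validate_case(case, required_fields):
    """Return a list of problems; an empty list means the case is complete."""
    problems = []
    if not case.get("name", "").strip():
        problems.append("missing name")
    for field in required_fields:
        if not case.get("fields", {}).get(field):
            problems.append("required field '%s' is empty" % field)
    return problems
```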
Generated Content Structure
Test Case Fields
The AI automatically populates:
Core Fields:
- Name: Descriptive, action-oriented test case names
- Description: Detailed test objectives and scope (if template field exists)
- Priority: Inferred from source issue priority or requirement importance
Template Fields:
- Preconditions: Required setup or system state
- Test Data: Sample data needed for execution
- Environment: Target testing environment
- Expected Results: Detailed expected outcomes
- Post-conditions: Expected system state after testing
System Fields:
- Steps: Detailed action/expected result pairs
- Tags: Contextually relevant tags
- Automated: A suggestion of whether the case is a good candidate for automation
- Estimate: Time estimate based on complexity
Test Steps Format
Generated test steps follow a consistent structure:
Step 1: Navigate to the login page
Expected Result: Login form is displayed with username and password fields
Step 2: Enter valid credentials ([email protected] / password123)
Expected Result: Credentials are accepted and validated
Step 3: Click the "Login" button
Expected Result: User is redirected to the dashboard
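The step format above pairs each action with an expected result; one way to represent and render it (key names `action`/`expected` are illustrative, not the product's schema):

```python
# Illustrative representation of the generated step format shown above:
# a list of action / expected-result pairs. Key names are assumptions.
steps = [
    {"action": "Navigate to the login page",
     "expected": "Login form is displayed with username and password fields"},
    {"action": 'Click the "Login" button',
     "expected": "User is redirected to the dashboard"},
]

def render_steps(steps):
    """Render steps in the 'Step N / Expected Result' layout used above."""
    lines = []
    for i, step in enumerate(steps, start=1):
        lines.append("Step %d: %s" % (i, step["action"]))
        lines.append("Expected Result: %s" % step["expected"])
    return "\n".join(lines)
```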
Advanced Features
Context Awareness
The AI considers:
- Existing Test Cases: Avoids duplicating scenarios already covered by your test suite
- Project Domain: Understands your application type and testing needs
- Template Structure: Adapts content to fit your specific template fields
- Issue History: Incorporates comments and updates from linked issues
Field Selection Optimization
- Required Fields: Always populated with essential content
- Optional Fields: Can be selectively included based on your workflow
- Field Types: Content is formatted appropriately for each field type:
  - Rich text fields receive formatted content
  - Dropdown fields receive valid option values
  - Multi-select fields receive appropriate value arrays
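The type-aware formatting described above can be sketched as a coercion step; the field-type names here are illustrative assumptions, not the product's actual types:

```python
# Hedged sketch of per-type field formatting: dropdowns only accept one of
# the template's valid options, multi-selects get an array of valid values,
# and text fields pass through as strings. Type names are assumptions.
def coerce_field(field_type, value, options=None):
    if field_type == "dropdown":
        return value if options and value in options else None
    if field_type == "multiselect":
        return [v for v in (value or []) if options is None or v in options]
    return str(value)  # rich text / plain text
```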
Intelligent Tagging
Auto-generated tags include:
- Functional Areas: Based on the feature being tested (e.g., authentication, payment)
- Test Types: Based on testing approach (e.g., integration, unit, e2e)
- Priorities: Based on issue priority or risk assessment
- Platforms: Based on mentioned platforms or environments
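Reusing existing tags "when appropriate" typically means matching on a normalized form; a minimal sketch, assuming a case-insensitive match (the real matching rule is not documented here):

```python
# Sketch of tag reuse: if a generated tag matches an existing project tag
# case-insensitively, the existing tag is kept; otherwise a new tag is
# created. The matching rule is an assumption for illustration.
def resolve_tags(generated, existing):
    by_lower = {t.lower(): t for t in existing}
    return [by_lower.get(tag.lower(), tag) for tag in generated]
```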
Best Practices
Source Material Quality
- Detailed Issues: More detailed issues produce better test cases
- Clear Requirements: Well-written requirements lead to comprehensive test coverage
- Include Context: Add comments or descriptions that explain business logic
- Specify Constraints: Mention any technical limitations or dependencies
Template Configuration
- Field Naming: Use descriptive field names that clearly indicate their purpose
- Field Types: Choose appropriate field types for different content types
- Required vs Optional: Mark fields as required only if they're truly essential
- Field Ordering: Arrange fields logically in the template
Generation Settings
- Start Small: Begin with fewer test cases and adjust based on quality
- Review Carefully: Always review generated content before importing
- Iterate: Use additional instructions to refine generation
- Tag Strategy: Develop a consistent tagging strategy for your project
Quality Assurance
- Review Generated Steps: Ensure test steps are executable and complete
- Validate Field Content: Check that generated content fits field constraints
- Test Data Verification: Ensure generated test data is appropriate and valid
- Link Verification: Confirm that generated test cases properly link to source issues
Troubleshooting
Common Issues
No AI providers available:
- Verify that at least one LLM integration is configured and active
- Check that the integration is assigned to your project
- Confirm your user has appropriate permissions
Generation fails with timeout:
- Try reducing the quantity of test cases to generate
- Simplify additional instructions
- Check API rate limits for your provider
Poor quality test cases:
- Provide more detailed source material
- Add specific instructions about testing focus
- Review and refine your template field definitions
- Consider using a more capable AI model
Fields not populating correctly:
- Verify field types in your template
- Check field naming and descriptions
- Ensure selected fields are appropriate for AI generation
Error Messages
"No AI model is configured"
- Add an LLM integration in project settings
- Ensure the integration is active and properly configured
"API quota exceeded"
- Your AI provider's usage limits have been reached
- Wait for quota reset or upgrade your plan
- Consider switching to a different provider
"Invalid API configuration"
- Check API keys and credentials
- Verify the model name is correct
- Test the integration connection
Performance Optimization
- Model Selection: Balance quality needs with response time
- Batch Processing: Generate multiple test cases in a single request when possible
- Field Selection: Only populate fields you actually need
- Template Optimization: Streamline templates for AI generation
API Reference
For programmatic access to AI test generation:
Endpoints
LLM Integrations:
GET /api/llm-integrations - List available integrations
POST /api/llm-integrations/test-connection - Test integration
GET /api/llm-integrations/{id}/models - Get available models
Test Generation:
POST /api/llm/generate-test-cases - Generate test cases
POST /api/llm/validate-content - Validate generated content
GET /api/llm/generation-history - Get generation history
Example Request
POST /api/llm/generate-test-cases
{
  "projectId": 123,
  "issue": {
    "key": "PROJ-456",
    "title": "User login functionality",
    "description": "Implement secure user authentication..."
  },
  "template": {
    "id": 789,
    "fields": [...selectedFields]
  },
  "context": {
    "userNotes": "Focus on security testing",
    "existingTestCases": [...],
    "folderContext": 10
  },
  "quantity": "several",
  "autoGenerateTags": true
}
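A request like the one above could be sent from Python's standard library; this is a hedged sketch, and the base URL and any authentication headers are placeholders that depend on your deployment:

```python
import json
import urllib.request

BASE_URL = "https://your-instance.example.com"  # placeholder; use your server

# Build the payload from the example request above (template fields left
# empty here; fill in your selected fields).
payload = {
    "projectId": 123,
    "issue": {
        "key": "PROJ-456",
        "title": "User login functionality",
        "description": "Implement secure user authentication...",
    },
    "template": {"id": 789, "fields": []},
    "context": {"userNotes": "Focus on security testing"},
    "quantity": "several",
    "autoGenerateTags": True,
}

req = urllib.request.Request(
    BASE_URL + "/api/llm/generate-test-cases",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},  # add auth as required
    method="POST",
)
# response = urllib.request.urlopen(req)  # uncomment with real credentials
```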