# WireMock Recording
This guide provides step-by-step instructions for recording and saving API mocks into a WireMock Cloud project using the WireMock CLI and httpyac. You'll learn how to generate HTTP files, configure test scenarios, record API calls, and save comprehensive mocks including error cases.
## Overview
Recording API mocks allows you to capture real API interactions and replay them in your development and testing environments. This process involves:
- Generate HTTP Files: Use GitHub Copilot to create comprehensive REST client files
- Configure Variables: Set up dynamic variables and error scenarios
- Record Sessions: Capture API calls using WireMock CLI proxy recording
- Execute Requests: Run all scenarios using the httpyac CLI
- Save Mocks: Store captured interactions in WireMock Cloud projects
## Prerequisites
Before you begin, ensure you have:
- WireMock CLI installed (`npm install -g @wiremock/cli`)
- httpyac CLI installed (`npm install -g httpyac`)
- WireMock Cloud account with appropriate permissions
- API endpoints to record (accessible from your machine)
- GitHub Copilot enabled in VS Code (for HTTP file generation)
## Installation Steps
### Step 1: Install WireMock CLI
Open your terminal and install the WireMock CLI globally:
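Using the package name listed in the prerequisites:

```shell
# Install the WireMock CLI globally via npm
npm install -g @wiremock/cli
```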
Verify the installation:
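A quick check; this assumes the package exposes a `wiremock` binary on your PATH:

```shell
# Print the installed CLI version
wiremock --version
```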
### Step 2: Install httpyac CLI
Install httpyac CLI globally:
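Again via npm:

```shell
# Install the httpyac CLI globally via npm
npm install -g httpyac
```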
Verify the installation:
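For example:

```shell
# Print the installed httpyac version
httpyac --version
```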
## Creating HTTP Files with Copilot
Use GitHub Copilot to generate comprehensive HTTP files that cover all your API endpoints and scenarios.
### Step 1: Create the HTTP File
Create a new file with a .http or .rest extension:
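For example (the `api-tests.http` filename used throughout this guide is just an illustration):

```shell
# Create an empty REST client file in the project root
touch api-tests.http
```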
### Step 2: Use Copilot to Generate Requests
Use GitHub Copilot to generate comprehensive HTTP requests for your API. Here are effective prompts to get the best results:
⚠️ NOTE The prompt samples below are generalized and shouldn't be pasted into Copilot verbatim; use them as a starting point. Tailor the prompt content to the circumstances surrounding the API you want to record mocks for.
### Sample Prompt 1: Based on Controller Code
When you have access to your API controller code:
Prompt: "Analyze the UserController.cs file and generate a comprehensive .http file that tests all endpoints. Include success cases and error scenarios (400, 401, 403, 404, 500). Use global variables for baseUrl, authToken, and userId. Ensure each request has proper headers and request bodies for POST/PUT operations. Include tests for invalid data, unauthorized access, and non-existent resources."
### Sample Prompt 2: Based on Minimal API Endpoints
For APIs built with minimal API pattern:
Prompt: "Based on the minimal API endpoints defined in Program.cs, create a complete .http file with all CRUD operations. Include global variables (@baseUrl, @authToken) at the top before any ### sections. Generate error test cases for each endpoint including malformed JSON (400), invalid authentication (401/403), missing resources (404), and server errors (500). Add response extraction variables using @name for chaining requests."
### Sample Prompt 3: Based on OpenAPI/Swagger Documentation
When you have OpenAPI specification files:
Prompt: "Using the swagger.json/openapi.yaml file, generate a comprehensive .http file that covers all documented endpoints. Include global variables for configuration and ensure proper variable scoping. For each endpoint, create both success and failure scenarios. Include edge cases like empty request bodies, invalid data types, oversized payloads, and authentication edge cases. Use httpyac syntax with @name annotations for response chaining."
### Sample Prompt 4: Based on Deployed API Documentation
For APIs with accessible Swagger UI:
Prompt: "I have a REST API with Swagger documentation at https://api.example.com/swagger. Create a .http file that tests all endpoints shown in the Swagger UI. Include global variables (@baseUrl = https://api.example.com/api/v1) and authentication tokens. Generate comprehensive test scenarios including: success paths (2xx), client errors (400, 401, 403, 404, 409, 422), and server errors (500, 503). Ensure error scenarios test realistic failure cases like duplicate resources, validation failures, and rate limiting."
### Sample Prompt 5: Complete API Test Suite
For comprehensive coverage:
Prompt: "Create a complete .http file for testing a [describe your API type, e.g., 'e-commerce user management API']. Structure it with global variables at the top, then organized sections for each resource. Include:
- Global variables: @baseUrl, @authToken, @contentType
- Authentication tests (valid/invalid tokens)
- CRUD operations for each resource
- Error scenarios: 400 (bad request), 401 (unauthorized), 403 (forbidden), 404 (not found), 409 (conflict), 422 (validation), 500 (server error)
- Edge cases: empty payloads, malformed JSON, special characters, boundary values
- Response chaining using @name annotations
- Comments explaining each test scenario"
### Enhanced Prompt Template
Use this template and customize it for your specific API:
Prompt: "Generate a comprehensive .http file for [YOUR_API_DESCRIPTION] based on [CONTROLLER_FILE/OPENAPI_FILE/SWAGGER_URL]. Requirements:
Structure:
- Global variables at top (before any ### sections): @baseUrl, @authToken, @userId, @contentType
- Organized sections with ### headers for each resource/endpoint group
Coverage:
- All CRUD operations (GET, POST, PUT, DELETE, PATCH)
- Authentication scenarios (valid/invalid/missing tokens)
- Success responses (200, 201, 204)
- Client errors: 400 (validation), 401 (auth), 403 (forbidden), 404 (not found), 409 (conflict), 422 (unprocessable)
- Server errors: 500 (internal), 502 (bad gateway), 503 (unavailable)
Error Test Cases:
- Invalid JSON syntax
- Missing required fields
- Invalid data types/formats
- Boundary value testing
- Duplicate resource creation
- Operations on non-existent resources
Advanced Features:
- Response extraction with @name annotations
- Variable chaining between requests
- Dynamic data generation
- Proper HTTP headers and content types"
### Tips for Better Copilot Results
When crafting your prompts, include these details for better results:
Context Information:
- Mention the specific file types you're working with (controllers, minimal APIs, OpenAPI specs)
- Include the programming framework (.NET Core, Node.js, Python FastAPI, etc.)
- Specify authentication methods (JWT, API keys, OAuth, etc.)
Structure Requirements:
- Explicitly request global variables before ### sections
- Ask for organized sections by resource or functionality
- Request response chaining using @name annotations
Error Coverage:
- Be specific about error codes you want to test
- Request realistic error scenarios for your domain
- Ask for boundary and edge case testing
Example Context Additions:
- "This is a .NET Core Web API with JWT authentication"
- "The API uses minimal APIs with validation attributes"
- "Authentication is via Bearer tokens with role-based access"
- "The API includes file upload endpoints"
- "Rate limiting is implemented on certain endpoints"
### Example Generated HTTP File Structure
```http
# User Management API Tests
# This file contains comprehensive tests for user management endpoints

# Global variables (accessible to all requests)
@baseUrl = {{$dotenv %baseUrl}}
@authToken = {{$dotenv %authToken}}
@userId = {{$dotenv %userId}}

### Authentication Test
GET {{baseUrl}}/auth/validate
Authorization: Bearer {{authToken}}

### Get All Users (Success)
GET {{baseUrl}}/users
Authorization: Bearer {{authToken}}
Accept: application/json

### Get User by ID (Success)
GET {{baseUrl}}/users/{{userId}}
Authorization: Bearer {{authToken}}
Accept: application/json

### Get User by ID (404 - Not Found)
GET {{baseUrl}}/users/99999
Authorization: Bearer {{authToken}}
Accept: application/json

### Create User (Success)
POST {{baseUrl}}/users
Authorization: Bearer {{authToken}}
Content-Type: application/json

{
  "name": "John Doe",
  "email": "john.doe@example.com",
  "role": "user"
}

### Create User (400 - Bad Request)
POST {{baseUrl}}/users
Authorization: Bearer {{authToken}}
Content-Type: application/json

{
  "name": "",
  "email": "invalid-email"
}

### Create User (403 - Forbidden)
POST {{baseUrl}}/users
Authorization: Bearer invalid-token
Content-Type: application/json

{
  "name": "Jane Doe",
  "email": "jane.doe@example.com"
}

### Server Error Simulation (500)
GET {{baseUrl}}/users/trigger-error
Authorization: Bearer {{authToken}}

### Update User
PUT {{baseUrl}}/users/{{userId}}
Authorization: Bearer {{authToken}}
Content-Type: application/json

{
  "name": "John Updated",
  "email": "john.updated@example.com"
}

### Delete User
DELETE {{baseUrl}}/users/{{userId}}
Authorization: Bearer {{authToken}}
```
## Configuring HTTP Variables and Routes
### Step 1: Set Up Environment Variables
Create a .env file in your project root:
```
# API Configuration
baseUrl=https://api.example.com/v1
authToken=your-api-token-here
userId=123

# Alternative environments
# baseUrl=https://staging-api.example.com/v1
# baseUrl=http://localhost:3000/api/v1
```
### Step 2: Configure Global Variables
Add global variables at the top of your HTTP file (before any ### sections):
```http
# Global Variables (accessible to all requests)
@baseUrl = https://api.example.com/v1
@contentType = application/json
@authToken = your-api-token-here

### Get User Request
# @name getUserResponse
GET {{baseUrl}}/users/1
Authorization: Bearer {{authToken}}

### Create User Request
# @name createUserResponse
POST {{baseUrl}}/users
Content-Type: {{contentType}}
Authorization: Bearer {{authToken}}

{
  "name": "Test User"
}

### Use Response Data in Next Request
# Extract data from previous response (section-scoped variable)
@newUserId = {{createUserResponse.response.body.$.id}}
GET {{baseUrl}}/users/{{newUserId}}
Authorization: Bearer {{authToken}}
```
### Alternative: Section-Scoped Variables
You can also define variables within specific sections for section-specific use:
```http
### User Management Section
# Variables scoped to this section only
@sectionBaseUrl = https://api.example.com/v1/users
@testUserId = 123
GET {{sectionBaseUrl}}/{{testUserId}}

### Admin Section
# Different variables scoped to this section
@adminBaseUrl = https://api.example.com/v1/admin
@adminToken = admin-specific-token
GET {{adminBaseUrl}}/users
Authorization: Bearer {{adminToken}}
```
### Step 3: Add Error Scenarios
Ensure your HTTP file includes comprehensive error testing:
```http
### Error Scenarios

### 400 Bad Request - Invalid JSON
POST {{baseUrl}}/users
Content-Type: application/json

{
  "name": "",
  "email": "not-an-email",
  "age": "invalid-age"
}

### 401 Unauthorized - Missing Token
GET {{baseUrl}}/users
# No Authorization header

### 403 Forbidden - Invalid Token
GET {{baseUrl}}/users
Authorization: Bearer invalid-token-123

### 404 Not Found - Non-existent Resource
GET {{baseUrl}}/users/999999
Authorization: Bearer {{authToken}}

### 429 Rate Limit (if applicable)
# Repeat this request rapidly
GET {{baseUrl}}/users
Authorization: Bearer {{authToken}}

### 500 Server Error - Trigger server error
GET {{baseUrl}}/admin/trigger-error
Authorization: Bearer {{authToken}}
```
## Recording API Calls
### Step 1: Start WireMock CLI Recording
Start the recording session with WireMock CLI. You can either record to a new project or an existing one:
#### Option A: Record to New Project (Interactive)
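A minimal sketch of the invocation; the exact `record` syntax can vary between CLI versions, so confirm with `wiremock record --help`:

```shell
# Start a recording proxy (listens on localhost:8000 by default).
# Exact syntax is an assumption - confirm with: wiremock record --help
wiremock record
```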
This will prompt you to create a new WireMock Cloud project at the end of the session.
#### Option B: Record to Existing Project
First, get your WireMock Cloud project ID from the URL in the WireMock Cloud dashboard (e.g., 33eye3l9):
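Then pass the project ID when starting the recording session (syntax assumed; confirm with `wiremock record --help`):

```shell
# Record directly into the existing project 33eye3l9.
# Exact syntax is an assumption - confirm with: wiremock record --help
wiremock record 33eye3l9
```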
#### Option C: Record with Custom Port
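If port 8000 is already in use, a custom proxy port can typically be supplied. The `--port` flag name here is an assumption, so check `wiremock record --help`:

```shell
# Run the recording proxy on port 9000 instead of the default 8000.
# --port is an assumed flag name - confirm with: wiremock record --help
wiremock record 33eye3l9 --port 9000
```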
### Step 2: Update HTTP File for Recording
Modify your HTTP file to point to the WireMock proxy (default: localhost:8000). Update your global variables:
```http
# Global variables for recording (accessible to all requests)
@baseUrl = http://localhost:8000
@authToken = your-api-token-here
@userId = 123

### Authentication Test
GET {{baseUrl}}/auth/validate
Authorization: Bearer {{authToken}}

### Get All Users (Success)
GET {{baseUrl}}/users
Authorization: Bearer {{authToken}}
Accept: application/json

# Rest of your HTTP requests remain the same...
```
### Step 3: Configure Batch Recording (Optional)
For large recording sessions, configure batch imports:
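A sketch of what this could look like; the `--batch-size` flag name is an assumption, so confirm against `wiremock record --help`:

```shell
# Import captured stubs in batches rather than all at once.
# --batch-size is an assumed flag name - confirm with: wiremock record --help
wiremock record 33eye3l9 --batch-size 50
```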
This will import mocks in batches of 50 requests to prevent memory issues and data loss.
## Executing HTTP Files with httpyac
### Step 1: Execute All Requests
Run all HTTP requests in your file:
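Using httpyac's `send` command with the `--all` flag:

```shell
# Execute every request in the file, top to bottom
httpyac send api-tests.http --all
```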
### Step 2: Execute with Environment Variables
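httpyac supports named environments via `--env`; this example assumes a `production` environment is defined in your `.env` files or `http-client.env.json`:

```shell
# Run all requests with variables from the "production" environment
httpyac send api-tests.http --all --env production
```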
### Step 3: Execute with Verbose Output
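For example:

```shell
# Log full request/response details for each call
httpyac send api-tests.http --all --verbose
```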
### Step 4: Execute with JSON Output
For CI/CD or automated scenarios:
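For example, redirecting the JSON report to a file:

```shell
# Emit machine-readable results for CI/CD pipelines
httpyac send api-tests.http --all --json > results.json
```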
### Step 5: Execute Specific Requests
Execute requests by name or line number:
```shell
# Execute by name
httpyac send api-tests.http --name "Get All Users"

# Execute by line number
httpyac send api-tests.http --line 25
```
### Step 6: Execute with Custom Variables
Override variables at runtime:
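Variables can be supplied with `--var name=value`; the values below are illustrative:

```shell
# Override file-level variables for this run only
httpyac send api-tests.http --all --var baseUrl=http://localhost:8000 --var userId=456
```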
## Saving Mocks to WireMock Cloud
### Step 1: Complete Recording Session
After executing all your HTTP requests through httpyac:
- Return to the WireMock CLI terminal
- Press `Enter` to stop the recording session
- Choose or confirm the destination WireMock Cloud project
### Step 2: Verify Saved Mocks
- Open WireMock Cloud dashboard
- Navigate to your project
- Review captured stubs in the Stubs section
- Verify all scenarios were captured (success and error cases)
### Step 3: Organize and Clean Up Mocks
Review and organize your captured mocks:
- Check Success Scenarios: Ensure all 2xx responses are captured
- Verify Error Scenarios: Confirm 4xx and 5xx responses are recorded
- Review Request Matching: Adjust matchers if needed
- Add Descriptions: Document each stub for clarity
- Remove Duplicates: Clean up redundant mocks
### Example Mock Verification Checklist
- GET /users (200 OK)
- GET /users/123 (200 OK)
- GET /users/999999 (404 Not Found)
- POST /users (201 Created)
- POST /users (400 Bad Request - Invalid Data)
- POST /users (403 Forbidden - Invalid Token)
- PUT /users/123 (200 OK)
- DELETE /users/123 (204 No Content)
- GET /admin/trigger-error (500 Internal Server Error)
## Troubleshooting
### Common Issues and Solutions
#### Issue: Recording Session Not Capturing Requests
Solution: Ensure your HTTP file is pointing to the correct proxy URL:
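During recording, the global base URL must target the proxy rather than the real API (port 8000 is the WireMock CLI default):

```http
# Point requests at the local recording proxy, not the real API
@baseUrl = http://localhost:8000
```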
#### Issue: Authentication Errors During Recording
Solution: Verify your API tokens and credentials:
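One way to sanity-check credentials is to call the real API directly, outside the proxy. This sketch assumes `curl` is available and reuses this guide's example endpoint:

```shell
# Hit the real API directly to confirm the token is accepted
curl -i -H "Authorization: Bearer your-api-token-here" \
  https://api.example.com/v1/auth/validate
```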
#### Issue: httpyac Command Not Found
Solution: Reinstall httpyac globally:
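For example:

```shell
# Remove and reinstall the global package, then verify
npm uninstall -g httpyac
npm install -g httpyac
httpyac --version
```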
#### Issue: SSL Certificate Errors
Solution: Use the --insecure flag with httpyac:
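For example:

```shell
# Skip TLS certificate verification (use only for trusted test targets)
httpyac send api-tests.http --all --insecure
```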
#### Issue: Missing Error Scenarios in Mocks
Solution: Ensure your HTTP file explicitly tests error conditions:
```http
### Force 404 error
GET {{baseUrl}}/nonexistent-endpoint

### Force 400 error with invalid data
POST {{baseUrl}}/users
Content-Type: application/json

{
  "invalid": "data"
}
```
### Debug Recording Issues
- Check WireMock CLI Output: Look for errors in the recording terminal
- Verify Proxy Traffic: Ensure requests are going through `localhost:8000`
- Test Individual Requests: Run single requests to isolate issues
- Check Network Connectivity: Verify the target API is accessible
## Next Steps
After successfully recording your API mocks:
- Host WireMock CLI - Learn how to use recorded mocks in your Aspire applications
- WireMock Documentation - Explore advanced WireMock features
- httpyac Documentation - Master advanced HTTP testing scenarios
- Configure CI/CD Integration - Automate mock recording in your deployment pipeline
- Set up Mock Versioning - Implement version control for your API mocks
### Advanced Recording Scenarios
- Multi-domain Recording: Record from multiple API endpoints simultaneously
- Mutual TLS Recording: Record from APIs requiring client certificates
- Advanced Configuration: Use configuration files for complex recording scenarios
- Non-interactive Recording: Automate recording in CI/CD pipelines
### Best Practices
- Document Your Mocks: Add clear descriptions to each recorded stub
- Regular Updates: Keep your mocks synchronized with API changes
- Test Coverage: Ensure all API scenarios are covered, including edge cases
- Security: Avoid recording sensitive data in production environments
- Monitoring: Track mock usage and performance in your applications