docs/contributing/testing.md
# Testing Guide

## Running Tests

Common workflows:

- Run all tests
- Run with coverage
- Run a specific test file
- Run integration tests with VCR replay (the default, and fast)
- Record or refresh VCR cassettes against the live API (slower)
- Record only the tests you're touching
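A sketch of the commands behind the list above, assuming the project's `uv run pytest` tooling (used in the Test Coverage section below); the test paths and the pytest-recording `--record-mode` flag for the VCR steps are assumptions, not confirmed project conventions:

```shell
# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=fmp_data --cov-report=html

# Run a specific test file (hypothetical path)
uv run pytest tests/unit/test_company.py

# Integration tests replay recorded VCR cassettes by default
uv run pytest tests/integration

# Record or refresh cassettes against the live API
uv run pytest tests/integration --record-mode=rewrite

# Record only the tests you're touching
uv run pytest tests/integration -k "get_profile" --record-mode=rewrite
```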
## Test Structure

```
tests/
├── __init__.py
├── conftest.py      # Shared fixtures
├── unit/            # Unit tests
└── integration/     # Integration tests
```
## Writing Tests

- Fixtures: add shared fixtures to `conftest.py`.
- Test files: create test files with descriptive names, for example:

```python
from unittest.mock import Mock

from fmp_data.company.models import CompanyProfile


def test_get_company_profile(client):
    """Test retrieving a company profile."""
    client.company.client.request = Mock(
        return_value=[CompanyProfile(symbol="AAPL")]
    )
    profile = client.company.get_profile("AAPL")
    assert profile.symbol == "AAPL"
```
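For the fixtures step, a minimal `conftest.py` sketch; the fixture name and payload shape are illustrative, not the project's real fixtures:

```python
# conftest.py -- illustrative shared fixture
import pytest


def make_profile_payload(symbol: str = "AAPL") -> dict:
    """Build a realistic raw profile payload for reuse across tests."""
    return {"symbol": symbol, "companyName": f"{symbol} Inc.", "price": 100.0}


@pytest.fixture
def profile_payload() -> dict:
    """Shared sample payload available to every test in the suite."""
    return make_profile_payload()
```

Keeping the builder function separate from the fixture lets individual tests construct variants (different symbols, missing fields) without duplicating the base payload.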
## Mocking HTTP Requests

Mock the client's internal `request` method in unit tests to avoid real HTTP calls:

```python
from unittest.mock import Mock

from fmp_data.company.models import CompanyProfile


def test_api_call(client):
    # Mock the API response at the client request layer
    client.company.client.request = Mock(
        return_value=[CompanyProfile(symbol="AAPL")]
    )

    # Make the request through the business logic
    result = client.company.get_profile("AAPL")

    # Verify the result
    assert result.symbol == "AAPL"
```
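Beyond checking the return value, `Mock` records how it was called, so a test can also verify the request itself. A self-contained sketch using only the standard library (the `"profile"` endpoint name is a stand-in, not a real fmp_data argument):

```python
from unittest.mock import Mock

# Stand-in for the client's internal request method
request = Mock(return_value=[{"symbol": "AAPL"}])

# Business logic would invoke it like this
result = request("profile", symbol="AAPL")

# Verify both the result and the exact call that produced it
assert result[0]["symbol"] == "AAPL"
request.assert_called_once_with("profile", symbol="AAPL")
```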
## Test Coverage

We maintain high test coverage:

- Minimum coverage: 80%
- Generate a coverage report: `uv run pytest --cov=fmp_data --cov-report=html`
- View the report: `open htmlcov/index.html`
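To enforce the 80% minimum locally rather than only in CI, pytest-cov's `--cov-fail-under` flag makes the run exit non-zero when total coverage drops below the threshold (a suggested invocation, not a documented project command):

```shell
# Fail the test run if total coverage is below 80%
uv run pytest --cov=fmp_data --cov-fail-under=80
```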
## Continuous Integration

Tests run automatically on:

- every pull request
- pushes to the `main` branch
- release creation

Coverage is collected in a dedicated CI job, separate from the test matrix.
## Best Practices

- Test organization:
  - One test file per module
  - Descriptive test names
  - Group related tests in classes
- Test data:
  - Use fixtures for shared data
  - Mock external API calls
  - Use realistic test data
- Assertions:
  - Be specific in assertions
  - Test edge cases
  - Handle exceptions properly
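The assertion guidelines above, sketched with a hypothetical helper (`parse_symbol` is illustrative and not part of fmp_data):

```python
import pytest


def parse_symbol(raw: str) -> str:
    """Normalize a ticker symbol, rejecting empty input (illustrative helper)."""
    if not raw.strip():
        raise ValueError("symbol must be non-empty")
    return raw.strip().upper()


def test_parse_symbol_is_specific():
    # Assert the exact expected value, not just truthiness
    assert parse_symbol(" aapl ") == "AAPL"


def test_parse_symbol_edge_case():
    # Edge case: whitespace-only input should raise, not silently return ""
    with pytest.raises(ValueError):
        parse_symbol("   ")
```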