# Testing Guide for simple_ado

## Overview

This project uses **pure pytest** with a two-tier testing strategy:

1. **Unit tests** (in `tests/unit/`) - fast, isolated tests that mock HTTP requests
2. **Integration tests** (in `tests/integration/`) - tests against a real Azure DevOps instance (optional)

All tests are written in pytest style (no unittest classes).

## Running Tests

### Run All Unit Tests (Default)

```bash
pytest
```

or

```bash
python -m pytest
```

### Run Only Unit Tests (Explicit)

```bash
pytest tests/unit/
```

### Run Integration Tests

Integration tests require Azure DevOps credentials and are skipped by default.

```bash
# Set environment variables first
export SIMPLE_ADO_BASE_TOKEN="your-ado-token"
export SIMPLE_ADO_TENANT="your-tenant"
export SIMPLE_ADO_PROJECT_ID="your-project-id"
export SIMPLE_ADO_REPO_ID="your-repo-id"

# Run with the integration flag
pytest --integration
```

### Run Specific Tests

```bash
# Run a specific test file
pytest tests/unit/test_artifacts.py

# Run a specific test class
pytest tests/unit/test_artifacts.py::TestArtifactsClient

# Run a specific test method
pytest tests/unit/test_artifacts.py::TestArtifactsClient::test_list_packages
```

### Run with Coverage

Coverage reporting requires the `pytest-cov` plugin:

```bash
pytest --cov=simple_ado --cov-report=html
```

Then open `htmlcov/index.html` in a browser to view the coverage report.
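
If you want coverage runs to measure only the package itself, a `.coveragerc` along these lines could be used (a sketch; adjust the paths to this repository's actual layout):

```ini
# .coveragerc -- sketch; section values are assumptions based on this guide
[run]
source = simple_ado

[report]
omit = tests/*
```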

## Test Organization

```
tests/
├── conftest.py                  # Shared fixtures and configuration
├── unit/                        # Unit tests (always run)
│   ├── test_client.py
│   ├── test_artifacts.py
│   ├── test_builds.py
│   └── ...
├── integration/                 # Integration tests (optional)
│   ├── test_integration_legacy.py
│   └── ...
└── fixtures/                    # Mock response data
    ├── builds_list.json
    ├── packages_list.json
    └── ...
```

## Writing Tests

### Unit Tests

Unit tests mock HTTP responses using the `responses` library. All tests are written in **pure pytest style**:

```python
import responses

@responses.activate
def test_something(mock_client, mock_project_id):
    # mock_client and mock_project_id are fixtures from conftest.py

    # Mock the HTTP response
    responses.add(
        responses.GET,
        f"https://{mock_client.http_client.tenant}.visualstudio.com/...",
        json={"data": "value"},
        status=200,
    )

    # Call the method
    result = mock_client.some_method(project_id=mock_project_id)

    # Assert the result
    assert result["data"] == "value"
```

### Integration Tests

Integration tests are pure pytest functions marked with `@pytest.mark.integration`:

```python
import pytest

@pytest.mark.integration
def test_real_api(integration_client, integration_project_id):
    # This only runs with the --integration flag
    result = integration_client.some_method(project_id=integration_project_id)
    assert result is not None
```

## Fixtures

Common fixtures are defined in `conftest.py`:

- `mock_client` - A mock ADO client
- `mock_tenant` - Mock tenant name
- `mock_project_id` - Mock project ID
- `mock_repository_id` - Mock repository ID
- `mock_feed_id` - Mock feed ID
- `load_fixture` - Load JSON fixture files
- `integration_client` - Real ADO client (integration tests only)
- `integration_project_id` - Real project ID (integration tests only)

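As an illustration, `load_fixture` likely amounts to a small JSON loader over `tests/fixtures/`; a minimal sketch (the real signature in `conftest.py` may differ):

```python
import json
from pathlib import Path

# Assumed location of the mock response data shown in the tree above.
FIXTURES_DIR = Path("tests/fixtures")

def load_fixture(name: str, base_dir: Path = FIXTURES_DIR) -> dict:
    """Load <base_dir>/<name>.json and return the parsed payload."""
    return json.loads((base_dir / f"{name}.json").read_text(encoding="utf-8"))
```

A test would then call, for example, `load_fixture("builds_list")` to get the canned response for mocking.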
## Best Practices

1. **Write unit tests first** - They're fast and don't require credentials
2. **Use pure pytest style** - No unittest.TestCase classes
3. **Mock external dependencies** - Use `responses` to mock HTTP calls
4. **Use fixtures** - Reuse common test data via pytest fixtures
5. **Test edge cases** - Include tests for error conditions
6. **Keep tests isolated** - Each test should be independent of the others
7. **Mark destructive tests** - Use `@pytest.mark.destructive` for tests that modify data
8. **Document complex tests** - Add docstrings explaining what's being tested
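
Custom markers such as `integration` and `destructive` should be registered so pytest doesn't warn about unknown marks; a sketch of the relevant `pytest.ini` section (the project may instead register these in `pyproject.toml` or `conftest.py`):

```ini
[pytest]
markers =
    integration: tests that run against a real Azure DevOps instance
    destructive: tests that modify remote data
```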

## Continuous Integration

The CI pipeline runs:

- Unit tests on every commit/PR
- Code coverage reporting
- Integration tests on nightly builds (with credentials)

## Troubleshooting

### Tests fail with "need --integration option to run"

This is expected: integration tests are skipped by default. Pass the `--integration` flag to run them.
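
The skip behaviour is typically wired up in `conftest.py` with a pair of pytest hooks; a minimal sketch of how a flag like this is commonly implemented (the project's actual `conftest.py` may differ):

```python
import pytest

def pytest_addoption(parser):
    # Register the --integration command-line flag.
    parser.addoption(
        "--integration",
        action="store_true",
        default=False,
        help="run integration tests against a real Azure DevOps instance",
    )

def pytest_collection_modifyitems(config, items):
    # Without --integration, attach a skip marker to every test
    # carrying the `integration` mark.
    if config.getoption("--integration"):
        return
    skip_integration = pytest.mark.skip(reason="need --integration option to run")
    for item in items:
        if "integration" in item.keywords:
            item.add_marker(skip_integration)
```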

### Mock responses not matching

Check that the URL passed to `responses.add()` matches exactly the URL the client code generates.

### Fixtures not found

Make sure `conftest.py` is in the `tests/` directory and that the fixtures are properly defined.

### Import errors

Ensure the package is installed in editable (development) mode:

```bash
pip install -e .
```

## Adding New Tests

1. Create a new test file in `tests/unit/` (e.g., `test_newfeature.py`)
2. Import the necessary modules and fixtures
3. Write test cases, using `@responses.activate` to mock HTTP calls
4. Add fixture data to `tests/fixtures/` if needed
5. Run the tests with `pytest tests/unit/test_newfeature.py`