Merged
10 changes: 5 additions & 5 deletions .github/workflows/ci-test.yml
@@ -115,7 +115,7 @@ jobs:

- name: Unit tests (local)
if: matrix.backend == 'local'
- run: pytest -m "not mongo and not sql and not redis and not s3" --cov=cachier --cov-report=term --cov-report=xml:cov.xml
+ run: pytest -m "not mongo and not sql and not redis and not s3" -n auto --cov=cachier --cov-report=term --cov-report=xml:cov.xml

- name: Setup docker (missing on MacOS)
if: runner.os == 'macOS' && matrix.backend == 'mongodb'
@@ -148,7 +148,7 @@ jobs:

- name: Unit tests (DB)
if: matrix.backend == 'mongodb'
- run: pytest -m "mongo" --cov=cachier --cov-report=term --cov-report=xml:cov.xml
+ run: pytest -m "mongo" -n auto --cov=cachier --cov-report=term --cov-report=xml:cov.xml
- name: Speed eval
run: python tests/speed_eval.py

Expand All @@ -169,7 +169,7 @@ jobs:
if: matrix.backend == 'postgres'
env:
SQLALCHEMY_DATABASE_URL: postgresql+psycopg://testuser:testpass@localhost:5432/testdb
- run: pytest -m sql --cov=cachier --cov-report=term --cov-report=xml:cov.xml
+ run: pytest -m sql -n auto --cov=cachier --cov-report=term --cov-report=xml:cov.xml

- name: Start Redis in docker
if: matrix.backend == 'redis'
Expand All @@ -183,11 +183,11 @@ jobs:

- name: Unit tests (Redis)
if: matrix.backend == 'redis'
- run: pytest -m redis --cov=cachier --cov-report=term --cov-report=xml:cov.xml
+ run: pytest -m redis -n auto --cov=cachier --cov-report=term --cov-report=xml:cov.xml

- name: Unit tests (S3)
if: matrix.backend == 's3'
- run: pytest -m s3 --cov=cachier --cov-report=term --cov-report=xml:cov.xml
+ run: pytest -m s3 -n auto --cov=cachier --cov-report=term --cov-report=xml:cov.xml

- name: Upload coverage to Codecov (non PRs)
continue-on-error: true
18 changes: 10 additions & 8 deletions tests/README.md
@@ -339,24 +339,26 @@ The CI pipeline runs a matrix job per backend. Each backend uses the commands be

```bash
# Local backends (memory, pickle, and other non-external tests)
- pytest -m "not mongo and not sql and not redis and not s3"
+ pytest -m "not mongo and not sql and not redis and not s3" -n auto

# MongoDB backend
- pytest -m mongo
+ pytest -m mongo -n auto

# PostgreSQL/SQL backend
- pytest -m sql
+ pytest -m sql -n auto

# Redis backend
- pytest -m redis
+ pytest -m redis -n auto

# S3 backend
- pytest -m s3
+ pytest -m s3 -n auto
```

- Note: local tests do not use `pytest-xdist` (`-n`) in CI. External backends
- (MongoDB, PostgreSQL, Redis, S3) each run in their own isolated matrix job with
- the corresponding Docker service started beforehand.
+ All backends use `pytest-xdist` (`-n auto`) in CI for parallel test execution.
+ Each backend runs in its own isolated matrix job with the corresponding Docker
+ service started beforehand. Per-worker isolation is handled automatically by
+ the fixtures in `conftest.py` (separate cache directories for pickle/maxage
+ tests, separate PostgreSQL schemas for SQL tests).

### Environment Variables

20 changes: 2 additions & 18 deletions tests/conftest.py
@@ -55,16 +55,7 @@ def inject_worker_schema_for_sql_tests(monkeypatch, request):

# Rebuild the URL with updated query parameters
new_query = urlencode(query_params, doseq=True)
- new_url = urlunparse(
-     (
-         parsed.scheme,
-         parsed.netloc,
-         parsed.path,
-         parsed.params,
-         new_query,
-         parsed.fragment
-     )
- )
+ new_url = urlunparse((parsed.scheme, parsed.netloc, parsed.path, parsed.params, new_query, parsed.fragment))

# Override both the environment variable and the module constant
monkeypatch.setenv("SQLALCHEMY_DATABASE_URL", new_url)
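The URL-rewriting step in this hunk can be illustrated in isolation: parse the database URL, adjust its query parameters, and reassemble it with `urlunparse`. The schema naming (`test_<worker_id>`) and the use of the psycopg `options` parameter to set `search_path` are assumptions for this sketch, not necessarily what the real fixture does:

```python
from urllib.parse import parse_qs, urlencode, urlparse, urlunparse


def inject_schema(url: str, worker_id: str) -> str:
    """Rebuild `url` with a per-worker search_path query option.

    The "options" parameter and "test_<worker_id>" schema name are
    illustrative assumptions; the actual conftest fixture chooses
    the real parameter and naming convention.
    """
    parsed = urlparse(url)
    query_params = parse_qs(parsed.query)
    # psycopg forwards "options" to the server, so this sets the schema
    query_params["options"] = [f"-csearch_path=test_{worker_id}"]
    new_query = urlencode(query_params, doseq=True)
    return urlunparse(
        (parsed.scheme, parsed.netloc, parsed.path, parsed.params, new_query, parsed.fragment)
    )
```

`urlencode(..., doseq=True)` is needed because `parse_qs` returns each value as a list; without it the list itself would be stringified into the query.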
@@ -155,14 +146,7 @@ def cleanup_test_schemas(request):
# Rebuild clean URL
clean_query = urlencode(query_params, doseq=True) if query_params else ""
clean_url = urlunparse(
-     (
-         parsed.scheme,
-         parsed.netloc,
-         parsed.path,
-         parsed.params,
-         clean_query,
-         parsed.fragment
-     )
+     (parsed.scheme, parsed.netloc, parsed.path, parsed.params, clean_query, parsed.fragment)
)

engine = create_engine(clean_url)