- dask does not implement `broadcast_shapes`
- torch emits a `RuntimeError`, not a `ValueError`
Pull request overview
Updates the library-specific array-api-tests expectation lists to account for the new and changed `broadcast_shapes` tests in the 2025.12 test suite, reflecting current gaps and behavior in the Dask and PyTorch backends.
Changes:
- Add Dask xfails for `TestBroadcastShapes` (`test_broadcast_shapes`, `test_empty`, `test_error`) for 2025.12.
- Add a PyTorch xfail for `TestBroadcastShapes::test_error` (exception type mismatch vs. the spec) for 2025.12.
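
The behavior these tests check can be sketched with NumPy (not part of this PR, used here only for illustration since its `broadcast_shapes` follows the Array API spec): compatible shapes broadcast element-wise, and incompatible shapes must raise a `ValueError` — the exception type PyTorch currently gets wrong.

```python
import numpy as np

# Compatible shapes: dimensions of size 1 stretch to match.
print(np.broadcast_shapes((3, 1), (1, 4)))  # (3, 4)

# Incompatible shapes: the spec requires a ValueError here,
# which is what TestBroadcastShapes::test_error asserts.
try:
    np.broadcast_shapes((3,), (4,))
except ValueError:
    print("ValueError raised, as the spec requires")
```

PyTorch's equivalent check raises `RuntimeError` instead, hence the xfail below rather than a skip.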
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| torch-xfails.txt | Adds a 2025.12 xfail entry for TestBroadcastShapes::test_error (wrong exception type). |
| dask-xfails.txt | Adds a 2025.12 xfail block for TestBroadcastShapes tests (feature not implemented). |
New entry in torch-xfails.txt:

```
# 2025.12 support

# torch raises a RuntimeError while the spec requires a ValueError
array_api_tests/test_data_type_functions.py::TestBroadcastShapes::test_error
```
The PR title/description says "add skips for broadcast_shapes", but the changes here add entries to the xfails lists. If these tests should actually be skipped (e.g., due to hangs or flakiness), move them to the appropriate *-skips.txt; otherwise, update the PR title/description to reflect that these are expected failures (xfails).