migrate to transformers v5 #2503

Triggered via pull request on February 11, 2026 at 03:16
Status: Cancelled
Total duration: 42m 14s
Artifacts: 14

pr_tests_gpu.yml

on: pull_request
Jobs:
- check_code_quality (39s)
- check_repository_consistency (29s)
- Setup Torch Pipelines CUDA Slow Tests Matrix (55s)
- Matrix: run_examples_tests
- Matrix: torch_cuda_tests
- Matrix: torch_pipelines_cuda_tests

Annotations

5 errors and 1 warning
Errors:
- Torch CUDA Tests (transformers main): Process completed with exit code 1.
- Torch CUDA Tests (transformers main): Process completed with exit code 1.
- Examples PyTorch CUDA tests on Ubuntu (transformers main): Canceling since a higher priority waiting request for Fast GPU Tests on PR-transformers-v5-pr exists
- Examples PyTorch CUDA tests on Ubuntu (transformers main): The operation was canceled.
- Fast GPU Tests on PR: Canceling since a higher priority waiting request for Fast GPU Tests on PR-transformers-v5-pr exists

Warning:
- Examples PyTorch CUDA tests on Ubuntu (transformers main): No files were found with the provided path: reports. No artifacts will be uploaded.

Artifacts

Produced during runtime
Name (Size): Digest

pipeline_controlnet_flux_transformers_main_test_reports (76.5 KB): sha256:4095913e03a804d6bed07ea7103e343f6472ea5e2e1495e477c8e177574bf7a2
pipeline_controlnet_sd3_transformers_main_test_reports (79.6 KB): sha256:8ed02ff25a4a42c4cc2dfbaf36703a6be59ad1e4e9c2fc93964c820e77808bb2
pipeline_controlnet_transformers_main_test_reports (251 KB): sha256:b863060e274e2ac626d0798956fbef6c8c626fda0403d77266226c8bebd213af
pipeline_flux_transformers_main_test_reports (226 KB): sha256:0896bd51118e6e2b20266077106b91db6378c8461f2e5277954a2b1ba0ec6617
pipeline_ip_adapters_transformers_main_test_reports (2.11 KB): sha256:245681cfbf9c92ab8709dd7aadf72e435c9b77e052e94c112918850d9deba7ce
pipeline_stable_diffusion_2_transformers_main_test_reports (84.5 KB): sha256:0acf8a6eac8fee5fa81b224b4382602e4e865b3478c8d2798ee3ccc9333f098b
pipeline_stable_diffusion_3_transformers_main_test_reports (119 KB): sha256:a13845a2ab36031f417ac53cbea0add865afdc485b1e57c294256ab8414c5901
pipeline_stable_diffusion_transformers_main_test_reports (63.4 KB): sha256:40156e4d2b94aea39f9236f81713bd42dc2d843b40e5c8ac37a46c2e79084b52
pipeline_stable_diffusion_xl_transformers_main_test_reports (178 KB): sha256:350110c0196348bfcb65ec8a706d3faf0c73a0712b81de701158c77db3d6537d
test-pipelines.json (247 Bytes): sha256:fa19cd7b3b9f1c63dcd08318c8f631210e50516728b8d0ce653598281bbe3ed9
torch_cuda_test_reports_lora_transformers_main (1.91 MB): sha256:19adc52c9ddd09885e21667854d406de747124c7e2db6766b155ab5bf7f85f18
torch_cuda_test_reports_models_transformers_main (51.4 KB): sha256:83ea5a09052e26144f9deb74cf2ba0f2af5d948cc9594746fc9b24a12fdcba3c
torch_cuda_test_reports_others_transformers_main (7.94 KB): sha256:74d39a0f64a85a4de2e16d4917a0aafcd02e0d49ab783855acbedfa766ad43dd
torch_cuda_test_reports_schedulers_transformers_main (23.1 KB): sha256:3e729437ecd997e94d83f2c0b6ee48a1648847564b53f434260c38143d488ce5
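The digests above can be used to check a downloaded artifact locally. A minimal sketch using only the Python standard library; the file path is a hypothetical local download location, and whether the listed digest covers the raw file or the artifact archive is an assumption to verify against the run page:

```python
import hashlib

def sha256_digest(path: str, chunk_size: int = 8192) -> str:
    """Stream a file through SHA-256 and return a 'sha256:<hex>' string
    in the same format as the artifact listing above."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return "sha256:" + h.hexdigest()

# Hypothetical usage after downloading an artifact from the run:
# sha256_digest("test-pipelines.json")
```

Streaming in chunks keeps memory use constant, which matters for the larger report archives (e.g. the 1.91 MB LoRA report bundle).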