
Fix kernel cache miss and add RDNA configs #874

Triggered via pull request on October 25, 2024, 17:40
Status: Failure
Total duration: 27s

ruff.yml

on: pull_request
Matrix: ruff
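
The cancellation messages below reference a job named "_3_9", which suggests the ruff workflow runs as a matrix over Python versions (a 3.9 job alongside the ruff (3.10), ruff (3.11), and ruff (3.12) jobs listed); once the 3.9 job failed, GitHub's default fail-fast behavior canceled the rest of the matrix. The (104 > 80)-style figures in the annotations imply an 80-column limit in the repository's Ruff configuration, so the failure should be reproducible locally with ruff check --select E501 vllm/attention/ops/triton_flash_attention.py, assuming the same line-length setting is picked up from the repo's config.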

Annotations

53 errors (ten unique E501 annotations, repeated across the matrix jobs, plus three job cancellations)
Ruff (E501): vllm/attention/ops/triton_flash_attention.py#L213
vllm/attention/ops/triton_flash_attention.py:213:81: E501 Line too long (104 > 80)
Ruff (E501): vllm/attention/ops/triton_flash_attention.py#L231
vllm/attention/ops/triton_flash_attention.py:231:81: E501 Line too long (112 > 80)
Ruff (E501): vllm/attention/ops/triton_flash_attention.py#L232
vllm/attention/ops/triton_flash_attention.py:232:81: E501 Line too long (102 > 80)
Ruff (E501): vllm/attention/ops/triton_flash_attention.py#L236
vllm/attention/ops/triton_flash_attention.py:236:81: E501 Line too long (115 > 80)
Ruff (E501): vllm/attention/ops/triton_flash_attention.py#L237
vllm/attention/ops/triton_flash_attention.py:237:81: E501 Line too long (115 > 80)
Ruff (E501): vllm/attention/ops/triton_flash_attention.py#L242
vllm/attention/ops/triton_flash_attention.py:242:81: E501 Line too long (109 > 80)
Ruff (E501): vllm/attention/ops/triton_flash_attention.py#L244
vllm/attention/ops/triton_flash_attention.py:244:81: E501 Line too long (108 > 80)
Ruff (E501): vllm/attention/ops/triton_flash_attention.py#L246
vllm/attention/ops/triton_flash_attention.py:246:81: E501 Line too long (108 > 80)
Ruff (E501): vllm/attention/ops/triton_flash_attention.py#L248
vllm/attention/ops/triton_flash_attention.py:248:81: E501 Line too long (108 > 80)
Ruff (E501): vllm/attention/ops/triton_flash_attention.py#L250
vllm/attention/ops/triton_flash_attention.py:250:81: E501 Line too long (108 > 80)
ruff (3.12)
The job was canceled because "_3_9" failed.
ruff (3.11)
The job was canceled because "_3_9" failed.
ruff (3.10)
The job was canceled because "_3_9" failed.
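
Every annotation tells the same story: E501 (line too long) in vllm/attention/ops/triton_flash_attention.py between source lines 213 and 250, each offending line 102 to 115 characters against an 80-column limit. Given the PR title, these are most plausibly the newly added RDNA autotune configs written one per line, and the fix is mechanical reflowing. A hedged sketch of what that reflow looks like; the config keys and values here are illustrative placeholders, not the PR's actual code:

```python
import triton

# Before: one autotune config per line; at 100+ characters each, every
# such line trips E501 under an 80-column limit (hypothetical example):
#
# configs = [triton.Config({'BLOCK_M': 128, 'BLOCK_N': 64, 'waves_per_eu': 2, 'PRE_LOAD_V': False}, num_stages=1, num_warps=4)]

# After: the same config, reflowed so no line exceeds 80 columns.
configs = [
    triton.Config(
        {
            "BLOCK_M": 128,
            "BLOCK_N": 64,
            "waves_per_eu": 2,
            "PRE_LOAD_V": False,
        },
        num_stages=1,
        num_warps=4,
    ),
]
```

Since the same ten annotations repeat identically in every matrix job, fixing these lines once clears all of the reported errors.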