
Conversation

@lizhenyun01 (Collaborator) commented Jan 8, 2026

Motivation

  • Optimize and refactor attention for the V1 batch path; currently only Decode (D) nodes are supported.
  • Usage: export FD_ATTENTION_BACKEND=DECODE_APPEND_ATTN (see the Python sketch below).
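For reference, a minimal sketch of enabling the backend from Python rather than the shell. It only sets the environment variable named above; the assumption that FD_ATTENTION_BACKEND is read once at engine/worker initialization (and therefore must be set before the engine is created) is mine, not stated in this PR.

```python
import os

# Select the decode append attention backend introduced by this PR.
# Currently only Decode (D) nodes are supported.
# Assumption: FD_ATTENTION_BACKEND is read once during FastDeploy engine/worker
# initialization, so it must be set before any engine or model runner is constructed.
os.environ["FD_ATTENTION_BACKEND"] = "DECODE_APPEND_ATTN"
```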

Modifications

Usage or Command

Accuracy Tests

Checklist

  • Add at least one tag in the PR title.
    • Tag list: [[FDConfig],[APIServer],[Engine], [Scheduler], [PD Disaggregation], [Executor], [Graph Optimization], [Speculative Decoding], [RL], [Models], [Quantization], [Loader], [OP], [KVCache], [DataProcessor], [BugFix], [Docs], [CI], [Optimization], [Feature], [Benchmark], [Others], [XPU], [HPU], [GCU], [DCU], [Iluvatar], [Metax]]
    • You can add new tags based on the PR content, but the semantics must be clear.
  • Format your code and run pre-commit before committing.
  • Add unit tests. If no unit tests are added, explain the reason in this PR.
  • Provide accuracy results.
  • If the PR targets a release branch, make sure it has first been submitted to the develop branch, then cherry-pick it to the release branch with the [Cherry-Pick] PR tag.

@paddle-bot (bot) commented Jan 8, 2026

Thanks for your contribution!

@lizhenyun01 changed the title from "1131 new" to "[Cherry-Pick][Optimization]Decode append attention support(#5767)" Jan 8, 2026
@codecov-commenter commented Jan 8, 2026

Codecov Report

❌ Patch coverage is 35.21127% with 92 lines in your changes missing coverage. Please review.
⚠️ Please upload report for BASE (release/online/20251131@3a94e6f). Learn more about missing BASE report.

Files with missing lines                                  Patch %   Lines
...ayers/attention/decode_append_attention_backend.py     21.90%   81 Missing and 1 partial ⚠️
fastdeploy/platforms/cuda.py                                0.00%   2 Missing and 1 partial ⚠️
fastdeploy/spec_decode/mtp.py                               0.00%   1 Missing and 1 partial ⚠️
fastdeploy/worker/gpu_model_runner.py                       0.00%   1 Missing and 1 partial ⚠️
...cutor/layers/attention/ops/config_for_attention.py     85.71%   0 Missing and 1 partial ⚠️
...or/layers/attention/ops/decode_append_attention.py     88.88%   0 Missing and 1 partial ⚠️
...ers/attention/ops/decoder_write_cache_with_rope.py     88.88%   0 Missing and 1 partial ⚠️
Additional details and impacted files
@@                    Coverage Diff                     @@
##             release/online/20251131    #5954   +/-   ##
==========================================================
  Coverage                           ?   58.40%           
==========================================================
  Files                              ?      324           
  Lines                              ?    39340           
  Branches                           ?     5935           
==========================================================
  Hits                               ?    22977           
  Misses                             ?    14515           
  Partials                           ?     1848           
Flag Coverage Δ
GPU 58.40% <35.21%> (?)

Flags with carried forward coverage won't be shown.

☔ View full report in Codecov by Sentry.