
⚡ Bolt: Optimize rebuild_padding with lazy platform resolution #6486

Open
ZeyuChen wants to merge 7 commits into develop from
bolt/optimize-rebuild-padding-6475191789834059648

Conversation

@ZeyuChen
Member

Motivation

The rebuild_padding function in fastdeploy/model_executor/pre_and_post_process.py is called frequently during inference (potentially once per token or per batch step). The previous implementation performed platform checks (current_platform.is_cuda(), etc.) and imported the corresponding implementation on every call, adding unnecessary overhead to a hot path.

Modifications

  • Introduced a module-level global variable _rebuild_padding_impl to cache the resolved implementation.
  • Implemented a lazy initialization pattern within rebuild_padding. On the first call, it resolves the platform-specific function and assigns it to _rebuild_padding_impl.
  • Created wrapper functions for DCU, GCU, and CPU platforms. These wrappers adapt the 9-argument call signature (used by CUDA, MACA, Iluvatar) to the 6-argument implementation provided by these platforms, ignoring the unused optional arguments.
  • For CUDA, MACA, and Iluvatar, the imported function is assigned directly as the signature matches.

Usage or Command

No changes to usage. The optimization is internal.

Accuracy Tests

  • Verified with a temporary unit test script tests/test_rebuild_padding_opt.py (created for verification and deleted afterwards), which mocked fastdeploy.platforms.current_platform and the underlying ops modules.
  • Confirmed that:
    • CUDA path correctly forwards all 9 arguments.
    • DCU, GCU, and CPU paths correctly forward the first 6 arguments.
    • Unsupported platforms raise RuntimeError.

Checklist

  • I have verified the changes locally.
  • Code style follows the repository standards (checked with flake8 and black).

PR created automatically by Jules for task 6475191789834059648 started by @ZeyuChen

Co-authored-by: ZeyuChen <1371212+ZeyuChen@users.noreply.github.com>
@google-labs-jules
Contributor

👋 Jules, reporting for duty! I'm here to lend a hand with this pull request.

When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down.

I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job!

For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me with @jules. You can find this option in the Pull Request section of your global Jules UI settings. You can always switch back!

New to Jules? Learn more at jules.google/docs.


For security, I will only act on instructions from the user who triggered this task.

@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.

@paddle-bot

paddle-bot bot commented Feb 21, 2026

Thanks for your contribution!
