[Feat] Frozen backbone finetuning #783

@Jaster1999

Search before asking

  • I have searched the RF-DETR issues and found no similar feature requests.

Description

Add the ability to freeze the backbone during finetuning, e.g. a freeze_backbone=True argument in model.train:

model.train(
    dataset_dir="",
    freeze_backbone=True,
)

Use case

Specifically, this would be useful for training very large models on consumer GPU hardware, and also for preventing overfitting on smaller datasets. The logic is that the Objects365 pretraining already produces generalized features for object detection, so we only need to adapt the detection neck and head to the downstream task.

If it works, we could potentially train more generalized models on small datasets (sacrificing some raw metrics for generalization).
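A minimal sketch of how this could work under the hood in plain PyTorch, assuming the model exposes its backbone as a `backbone` submodule (RF-DETR's actual attribute names may differ; `ToyDetector` and `freeze_backbone` here are hypothetical illustrations, not the library's API):

```python
import torch
import torch.nn as nn


class ToyDetector(nn.Module):
    """Stand-in for a detector with a backbone and a detection head."""

    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 8))
        self.head = nn.Linear(8, 2)

    def forward(self, x):
        return self.head(self.backbone(x))


def freeze_backbone(model: nn.Module) -> None:
    """Disable gradients for backbone params so only the neck/head train."""
    for p in model.backbone.parameters():
        p.requires_grad = False
    # Also switch to eval mode so BatchNorm running stats stay fixed.
    model.backbone.eval()


model = ToyDetector()
freeze_backbone(model)

# Only head parameters remain trainable.
trainable = [n for n, p in model.named_parameters() if p.requires_grad]

# Pass only trainable params to the optimizer: frozen weights then cost
# no optimizer state either, which is where most of the memory saving
# on consumer GPUs comes from.
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
```

Besides skipping backbone gradient computation, excluding frozen parameters from the optimizer avoids allocating AdamW's per-parameter moment buffers for them.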

Additional

No response

Are you willing to submit a PR?

  • Yes I'd like to help by submitting a PR!

Metadata


    Labels

enhancement (New feature or request)
