
🚀[FEA]: Make Mlp layer normalization type configurable #1451

@peterdsharpe

Description


Is this a new feature, an improvement, or a change to existing functionality?

Improvement

How would you describe the priority of this feature request?

Low (would be nice)

Please provide a clear description of the problem you would like to solve.

Currently, Mlp is hardcoded to use BatchNorm as its only normalization type. It would be great to extend this to other norms, such as LayerNorm, and possibly to accept user-supplied norms (e.g., those from TransformerEngine).

Stems from discussion here with @coreyjadams @laserkelvin:
#1401 (comment)
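One way this could look: instead of hardcoding BatchNorm, the Mlp constructor could take a `norm_layer` argument accepting any callable that maps a feature count to a module. The sketch below is hypothetical, assuming a PyTorch-style Mlp; the parameter name `norm_layer` and the layer ordering are illustrative, not the library's actual API.

```python
import torch
import torch.nn as nn


class Mlp(nn.Module):
    """Hypothetical Mlp with a configurable normalization layer.

    `norm_layer` is any callable mapping a feature count to an
    nn.Module (e.g. nn.BatchNorm1d, nn.LayerNorm, or a user-supplied
    norm). Pass None to disable normalization entirely.
    """

    def __init__(self, in_features, hidden_features, out_features,
                 norm_layer=nn.BatchNorm1d):
        super().__init__()
        layers = [nn.Linear(in_features, hidden_features)]
        if norm_layer is not None:
            # The caller decides which norm to instantiate here.
            layers.append(norm_layer(hidden_features))
        layers += [nn.ReLU(), nn.Linear(hidden_features, out_features)]
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)


# Swapping in LayerNorm requires no change to the model definition:
model = Mlp(8, 32, 4, norm_layer=nn.LayerNorm)
y = model(torch.randn(16, 8))
```

Because `norm_layer` is just a callable, a user-supplied norm (e.g. a TransformerEngine module with a compatible constructor) could be passed in the same way, or wrapped in a small lambda if its signature differs.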

Describe any alternatives you have considered

No response

Metadata


Assignees

No one assigned

    Labels

    ? - Needs Triage (Need team to review and classify), enhancement (New feature or request)

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
