Conversation

@jianyuh (Member) commented Nov 11, 2025

Summary:
Support FP16 grouped GEMM (a usage sketch follows this summary):

*   `torch.ops.fbgemm.bf16bf16bf16_grouped_stacked` (fprop)
*   `torch.ops.fbgemm.bf16bf16bf16_grouped_grad` (dgrad)
*   `torch.ops.fbgemm.bf16bf16bf16_grouped_wgrad` (wgrad)

Differential Revision: D86718224
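
For orientation, here is a minimal sketch of how the new ops might be called. The signatures shown (stacked activations plus a per-group `m_sizes` tensor, following the stacked-layout convention of FBGEMM's other grouped ops) and the `fbgemm_gpu.experimental.gen_ai` import path are assumptions, not taken from this PR; the op registration in fbgemm_gpu is authoritative.

```python
# Minimal sketch, assuming the stacked grouped-GEMM convention: all groups'
# rows are concatenated along dim 0 of x, and m_sizes gives each group's
# row count. Shapes, signatures, and the import path are assumptions.
import torch
import fbgemm_gpu.experimental.gen_ai  # assumed path that registers the ops

G, K, N = 4, 512, 1024  # number of groups, reduction dim, output dim
m_sizes = torch.tensor([128, 64, 256, 32], dtype=torch.int64, device="cuda")
total_M = int(m_sizes.sum())  # 480 rows across all groups

# FP16 inputs, per this PR's summary (the ops also take BF16).
x = torch.randn(total_M, K, dtype=torch.float16, device="cuda")  # stacked activations
w = torch.randn(G, N, K, dtype=torch.float16, device="cuda")     # per-group weights

# fprop: for each group g, the rows of y belonging to group g = x_g @ w[g].T
y = torch.ops.fbgemm.bf16bf16bf16_grouped_stacked(x, w, m_sizes)

# dgrad and wgrad would use the *_grouped_grad and *_grouped_wgrad ops
# analogously in the backward pass.
```

The appeal of the stacked layout is that ragged per-group batches need no padding to a common M; each group's rows are addressed via `m_sizes` offsets.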

netlify bot commented Nov 11, 2025

Deploy Preview for pytorch-fbgemm-docs ready!

| Name | Link |
|------|------|
| 🔨 Latest commit | f711c72 |
| 🔍 Latest deploy log | https://app.netlify.com/projects/pytorch-fbgemm-docs/deploys/69138d0791f5a0000887d20a |
| 😎 Deploy Preview | https://deploy-preview-5111--pytorch-fbgemm-docs.netlify.app |

meta-cla bot added the `cla signed` label Nov 11, 2025
meta-codesync bot (Contributor) commented Nov 11, 2025

@jianyuh has exported this pull request. If you are a Meta employee, you can view the originating Diff in D86718224.

jianyuh added a commit to jianyuh/FBGEMM that referenced this pull request Nov 11, 2025
Summary:
Pull Request resolved: pytorch#5111

X-link: https://github.com/facebookresearch/FBGEMM/pull/2116

Support FP16 grouped GEMM:

*   `torch.ops.fbgemm.bf16bf16bf16_grouped_stacked` (fprop)
*   `torch.ops.fbgemm.bf16bf16bf16_grouped_grad` (dgrad)
*   `torch.ops.fbgemm.bf16bf16bf16_grouped_wgrad` (wgrad)

Differential Revision: D86718224
jianyuh force-pushed the export-D86718224 branch 2 times, most recently from 3bb5290 to 3d0e56e on November 11, 2025 at 19:18
jianyuh added a commit to jianyuh/FBGEMM that referenced this pull request Nov 11, 2025
Summary:
Pull Request resolved: pytorch#5111

X-link: https://github.com/facebookresearch/FBGEMM/pull/2116

Support FP16 grouped GEMM:

*   `torch.ops.fbgemm.bf16bf16bf16_grouped_stacked` (fprop)
*   `torch.ops.fbgemm.bf16bf16bf16_grouped_grad` (dgrad)
*   `torch.ops.fbgemm.bf16bf16bf16_grouped_wgrad` (wgrad)

Reviewed By: jwfromm

Differential Revision: D86718224
meta-codesync bot (Contributor) commented Nov 12, 2025

This pull request has been merged in 58ae4dd.

