
Conversation

@HaoKang-Timmy (Contributor) commented Oct 7, 2021

  1. Add a Transformer counter to rnn_hooks.py.
  2. Provide evaluate_transformer.py as an example.
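For context, the arithmetic such a counter performs can be sketched in plain Python. This is a rough multiply-accumulate (MAC) estimate built from the standard multi-head-attention and feed-forward formulas; the dimensions in the example call are nn.Transformer's defaults (d_model=512, dim_feedforward=2048), not values taken from this PR:

```python
def mha_macs(seq_len, d_model):
    """Rough MAC count for one multi-head attention block:
    Q/K/V projections, attention scores, weighted sum, output projection."""
    qkv = 3 * seq_len * d_model * d_model     # Q, K, V input projections
    scores = seq_len * seq_len * d_model      # Q @ K^T, summed across heads
    weighted = seq_len * seq_len * d_model    # softmax(scores) @ V
    out = seq_len * d_model * d_model         # output projection
    return qkv + scores + weighted + out


def ffn_macs(seq_len, d_model, d_ff):
    """Two linear layers of the position-wise feed-forward block."""
    return seq_len * d_model * d_ff + seq_len * d_ff * d_model


def encoder_layer_macs(seq_len, d_model, d_ff):
    """One encoder layer = self-attention + feed-forward (norms ignored)."""
    return mha_macs(seq_len, d_model) + ffn_macs(seq_len, d_model, d_ff)


# nn.Transformer defaults: d_model=512, dim_feedforward=2048
print(encoder_layer_macs(10, 512, 2048))  # 31559680
```

A real hook would also account for the decoder's cross-attention and multiply by the layer counts; this sketch only shows the per-layer shape of the computation.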

src = torch.rand((1, 1, 10)) # S,N,x


class Model_transformer(nn.Module):
Owner:

Class name should be CamelCased.

Contributor Author:

fixed

thop/profile.py Outdated
nn.GRU: count_gru,
nn.LSTM: count_lstm,

nn.Transformer: count_Transformer,
Owner:

function name should be lower case.

Contributor Author:

fixed
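The dict fragment above is thop's registry mapping module types to their counting hooks; profiling dispatches on each submodule's type. The dispatch can be illustrated without torch (a minimal sketch with stand-in classes and empty hooks, not thop's actual code):

```python
# Stand-ins for torch.nn module classes.
class GRU: ...
class LSTM: ...
class Transformer: ...


# Stand-ins for the per-module counting hooks.
def count_gru(m, x, y): ...
def count_lstm(m, x, y): ...
def count_transformer(m, x, y): ...


# Registry in the style of thop's register_hooks dict in profile.py.
register_hooks = {
    GRU: count_gru,
    LSTM: count_lstm,
    Transformer: count_transformer,  # the mapping this PR adds
}

# During profiling, the module's type selects its counting hook.
hook = register_hooks[type(Transformer())]
print(hook is count_transformer)  # True
```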

m.total_ops += torch.DoubleTensor([int(total_ops)])


def count_Transformer(m: nn.Transformer, x, y):
Owner:

same issue here.

Contributor Author:

Fixed; also changed its subfunction. Sorry for forgetting to change it after learning CamelCase.
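The hook signature under discussion follows thop's convention: a counting function receives the module, the tuple of forward() inputs, and the outputs, and accumulates into `m.total_ops`. A minimal mock of that flow, runnable without torch (the cost formula here is a placeholder, not the PR's actual count):

```python
from types import SimpleNamespace


def count_transformer(m, x, y):
    """Counting hook in thop's (module, inputs, outputs) style:
    inspect the input shape, then accumulate into m.total_ops."""
    src = x[0]  # thop passes the forward() inputs as a tuple; src is first
    seq_len, batch, d_model = src.shape
    # Placeholder cost model: one d_model x d_model matmul per position.
    m.total_ops += seq_len * batch * d_model * d_model


# Mock module and input standing in for nn.Transformer and a tensor.
m = SimpleNamespace(total_ops=0)
src = SimpleNamespace(shape=(10, 1, 512))
count_transformer(m, (src,), None)
print(m.total_ops)  # 10 * 1 * 512 * 512 = 2621440
```

In the real library the accumulator is a registered buffer updated as `m.total_ops += torch.DoubleTensor([int(total_ops)])`, as shown in the diff above.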

@liminn commented Jan 5, 2022

Why not merge this pull? It is very necessary to calculate the FLOPs and params of a Transformer.

@quancs commented Mar 16, 2023

Is there any update now?
