Conversation

@Quentin-Anthony (Member)

@Kyle1668

ModernBERT‑base

python calc_transformer_flops.py \
  --encoder-only \
  -l 22 \
  -hs 768 \
  --ffn-hidden-size 1152 \
  --num-mlp-linears 2 \
  --vocab-size 50368 \
  --sequence-length 8192 \
  --mlm-ratio 0.15  # sequence length can also be 512, 1024, etc.

ModernBERT‑large

python calc_transformer_flops.py \
  --encoder-only \
  -l 28 \
  -hs 1024 \
  --ffn-hidden-size 2624 \
  --num-mlp-linears 2 \
  --vocab-size 50368 \
  --sequence-length 8192 \
  --mlm-ratio 0.15

Since you're doing inference calculations, make sure you also pass --infer.

